Politique Internationale — You speak of the 21st century as one of technological shock. What exactly does this mean? What new risks and threats are involved?
Asma Mhalla — The technological shock I’m talking about has three main characteristics. The first is an extraordinary capacity for micro-targeting: however small the target, the wave of innovation makes it possible to reach it. We’ll come back to this later. The second characteristic is hyper-speed: everything happens at an accelerated pace, in a system that has become dual. Dual in the sense that new technologies span both the civilian and military spheres, the public and private spheres, the individual and society as a whole, with constant interaction between them. As a result, we are seeing the emergence of complex players with multiple roots and areas of intervention. Take the example of influencers: at the outset they belong to a small circle, with a fairly playful side, before very quickly expanding their reach, with less avowed aims.
As for the third characteristic of the technological shock, it refers to those giants that have become unavoidable, the Big Tech companies. Their rise is all the more impressive for the fact that it continues unabated. These groups had an original vocation: technology and its development. Now they have changed dimension: they have become systemic players, i.e. capable of deploying themselves universally, via numerous business segments, and of influencing major political, social and cultural trends. They are still economic players, but they have established themselves above all as instruments of power and might. We know, for example, that many of the latest technologies are being used on the battlefields of Ukraine and Gaza, to name two of the main theatres of operation today.
P. I. — Let’s start with micro-targeting: what makes you say that the target is reached so easily?
A. M. — Today, technology makes it possible to collect an almost infinite amount of data on individuals. Compare this with the past, when analysis rarely went beyond broad aggregates: we used to think in terms of companies, organisations, population segments... Today, the individual is almost stripped bare: all of his or her personal data is practically out in the open. Those who have access to this data can therefore carry out highly detailed analyses of behaviour. They become almost entomologists.
This micro-targeting would not be so worrying if it remained confined to limited uses such as commercial advertising. It has not: the mass of data is exploited for a multitude of purposes, from defining consumer profiles to influencing political orientations, guiding socio-cultural choices, finding one’s bearings within a community... Micro-targeting shines a light on a host of registers that previously went unnoticed. Everything is screened. I speak of individuals here because this plunge into the intimate is so spectacular, but almost all players, companies and governments in particular, can be probed in the same way.
P. I. — You also insist on hyper-speed. Is it a perpetual race against the clock?
A. M. — A philosopher has perfectly theorised this dictatorship of the instantaneous, which prevents us from taking a step back and undermines the depth of our thinking. Paul Virilio’s work predates the golden age of digital technology, but that in no way detracts from its acuity. On the contrary, it has a visionary aspect, as in these assertions: ‘If we don’t fight our technology, if we give in to it, we will no longer be human beings. Technology is there to be fought with, it can only progress through our struggles, through denouncing its negative aspects.’
Virilio understood very well that a war must be waged against technology, otherwise we become captive to one innovation after another, prisoners of breakthroughs that prevent the slightest critical thinking. Need we be reminded of the incredible speed with which fake news outpaces the news cycle? Or of how disinformation threatens us at every turn, its content skilfully disguised? Virilio anticipated the damage caused by the technological tidal wave: the concept of speed, which underpins his thinking and which he developed extensively, gives us a glimpse of a wide range of dangers.
P. I. — The rise of Big Tech: how do you illustrate the singularity that sets it apart?
A. M. — By becoming hybrid players, these groups have changed their status. Let’s not forget that they were originally private companies, closely linked to tech and capable, as innovations came along, of establishing themselves in markets as varied as industry, health, the media, telecommunications, services... In short, technological know-how has proved a formidable gateway into any given market, via the development potential offered by digital technology.
But that’s not where the hybridity lies: it lies in the fact that these meta-platforms, omnipresent through their activities, now intervene in the political, diplomatic, geopolitical and military spheres, their zone of influence extending into matters of state sovereignty and security. They are becoming key suppliers to the functioning of the state apparatus. Over the last three years, Big Tech has definitively changed its status. There is no shortage of examples: when Elon Musk speaks, his words are received, and not only by the media, as those of a ruler with a direct hand in the affairs of the planet. We are well beyond the stage of the ‘simple’ business leader. Most people know who Elon Musk is and about his declared ambition to revolutionise modes of transport. The technological impact of such personalities is all the greater because they are profoundly changing everyday habits. In 2023, Sam Altman made a diplomatic tour around the world to promote the adoption of ChatGPT, while Peter Thiel has anchored the alt-right within a section of the Silicon Valley elite.
P. I. — From Big Tech to Big States, it’s just a short step...
A. M. — The Big States are those states that not only work closely with Big Tech but also, and above all, make massive use of technological tools to shape their power politics. In this respect, the two models, American and Chinese, are very similar. As far as cooperation is concerned, it would take too long to list all the projects: at regular intervals, we see countries awarding key contracts to Big Tech in transport, digital services or health. At a certain stage, these projects almost become programmes of national importance, as in the UK, for example, with Microsoft.
The risk for ‘us’ – that is, for democratic societies – is the temptation of mass techno-surveillance on the cheap, at the cost of our fundamental freedoms. The desire for control, in an unstable world governed by risk, is there, and the means of surveillance have multiplied thanks to new technologies. It goes without saying that in certain states – the techno-authoritarian states of Russia, China and, in some respects, India – this policy is pushed to the limit, unlike in regions where safeguards still exist, such as here in France. This leads me to say that artificial intelligence as such can quickly become ultra-structuring at the political level if it is used for malicious, bellicose or coercive purposes: to muzzle opinion, to spread false information, or to increase the lethality of conflicts that are now hybrid, at once cyber and kinetic. This is what we have been seeing since the war in Ukraine.
P. I. — Specifically, how did the technological shock enter the military sphere?
A. M. — AI systems for military use are no longer the exception or the subject of experimentation in warfare. The conflicts in Ukraine and Gaza are good examples of this: technology does not replace traditional warfare, but its tools – drones in particular – are a great support to existing methods. Israel, for example, uses special software to carry out ‘surgical strikes’, if that notion of targeted bombing is to be believed at all. Generally speaking, the battlefield concentrates an enormous amount of data, from which combatants can develop strategies.
We are talking here about war in its most visible and violent manifestations, but the war of influence has long relied on digital technologies. It was Washington, for example, that denounced TikTok as a spy in the service of China.
P. I. — Faced with the excesses of the technological tidal wave, some politicians have raised the possibility of regulating the use of screens. What are we to make of the scenario of digital rationing?
A. M. — I’m sceptical about these limitation scenarios which, from a purely technical point of view, are difficult to implement. Likewise, there’s no point in shouting from the rooftops to stigmatise this or that excess. On the other hand, raising awareness around new cognitive reflexes would be very useful: making people understand that digital technology, the Internet and artificial intelligence can easily deceive them, and that they need to be robust enough to fight back against these ‘aggressions’. We return to the work of Paul Virilio: let’s learn to fight so as not to be manipulated. In particular, this means sorting out the tools ourselves, knowing how they work and what they are used for. Not all digital platforms and services are created equal.
P. I. — The exposure of young people to the avalanche of technology is a frequent cause for concern. How can we protect these young people?
A. M. — Battles are rarely lost in advance. It’s an important issue because, in education, digital technology has all but supplanted the written word or, at any rate, is growing at the expense of the written word. One of the current questions is how to prevent digital hegemony from sweeping away more traditional forms of learning. This raises the issue of cognitive hygiene: how to remain vigilant in the face of the flood, and how to preserve the reflexes that enable young people to learn, reflect and discern; in short, to flourish. We talk a lot about moderators, but moderation is sometimes a biased act. You have to rely on your own knowledge to benefit effectively from digital advances.
Of course, you might think that intervening in this area is almost a public service mission, but the task is a tough one: in the background, social inequalities are a reminder that not all young people are equally exposed to the digital tidal wave. Education and the right framework are needed to ensure that young people understand that technology is not the be-all and end-all.
P. I. — The importance of politics in spreading the technological shock is clear. But what about the legal aspect? Can effective barriers be erected on the regulatory front?
A. M. — If we take the case of Europe, it got off to a very late start. We relied mainly on declarations of principle to set the rules and much less on actual regulation. Things are changing: regulations drawn up in Brussels, notably the Digital Services Act and the Digital Markets Act, now set limits to the omnipotence of the major platforms. We are probably only at the beginning of digital legislation, but it is essential to lay down some markers. The difficulty lies in taking a global approach: at the moment, regulation is often too siloed.
P. I. — What can each of us do, in our own small way, to keep up with this technological shock?
A. M. — I encourage people to take the time to ‘cool off’. For example, when we are exposed to a piece of content, we shouldn’t give in to what Bernard Stiegler called ‘impulse capitalism’ and gorge ourselves on information, but instead take a few seconds to question the content’s validity, its source and its intention. The way the world works, we are subjected to a flood of information on every subject, however unimportant. To be able to think properly, to practise this famous cognitive hygiene, it is essential to step back from the ambient ‘heat’. At the very least, this cooling-off can be seen as a form of gymnastics that guards against the dross of technological transmission. But beware: if society economises on these few seconds of reflection, not only the role of the citizen but also the political identity of our democracies could suffer.