Arthur Grimonpont
CeSIA has joined a coalition of international organisations and experts in an open letter to European Commission President Ursula von der Leyen, urging the Commission to resist pressure from the tech industry to weaken the EU's AI legislation. The signatories are calling on the European executive to strengthen safety measures for general-purpose AI models and to increase the resources of the European AI Office.
CeSIA is adding its voice to those of around 15 civil society organisations and 30 international experts, including Nobel laureates Daron Acemoglu and Geoffrey Hinton, calling on the European Commission to prioritise citizen safety over the commercial interests of tech companies.
“Tech industry lobbyists complain of so-called 'regulatory uncertainty' in an attempt to weaken the EU's AI Act. But the primary source of uncertainty is clearly not regulatory: it's technological. Advanced AI systems, subject to fewer technical standards than a toaster, are currently being deployed to hundreds of millions of users without any external oversight. By correcting this dangerous situation and harmonising rules across Member States, the AI Act and its Code of Practice offer not only regulatory certainty but, above all, the urgent and essential protection European citizens need.”
Charbel-Raphaël Segerie, Executive Director of the Centre for AI Safety (CeSIA).
The open letter puts forward three main requests to the European Commission: resist industry pressure to weaken or suspend the application of the AI Act, strengthen safety measures for general-purpose AI models, and increase the resources of the European AI Office.
The publication of this open letter comes amid intense pressure from several tech industry lobbies, some of which have publicly called for the regulation's application to be suspended by invoking a "stop-the-clock" clause, arguing that the implementation guidelines were not ready.
Yet, the exponential progress of AI is accompanied by growing risks. Companies developing cutting-edge AI, such as OpenAI, Anthropic, and Google, have themselves acknowledged that their new generations of models pose increasing threats, particularly concerning critical chemical, biological, radiological, and nuclear (CBRN) risks.