Cédric Villani on the Future of Artificial Intelligence


On Thursday, March 17, 2022, the Office of AUP’s President Celeste M. Schenck hosted the seventh event in its Presidential Lecture Series: a presentation and Q&A by renowned mathematician, researcher and French politician Cédric Villani. The Presidential Lecture Series, titled “Technology and the Human Future,” invites speakers to participate in live online events, so they might engage with both theory and practice in responding to the question of how technology will continue affecting our lives beyond the Covid-19 pandemic. Villani presented on the topic of “AI and Algorithmics: A Sword for Progress or for Self-Destruction?” 

Villani studied mathematics at the École Normale Supérieure in Paris. In 2009, he became head of the Henri Poincaré Institute, one of the oldest research institutes in the world dedicated to mathematics and theoretical physics. In 2010, he won the Fields Medal for his work on Landau damping and the Boltzmann equation. Villani has been a Member of Parliament for Essonne since June 2017. He sits on the Law Commission and chairs the Parliamentary Office for the Evaluation of Scientific and Technological Choices (OPECST) of the National Assembly and Senate of France. 

“There is no good definition of artificial intelligence,” began Villani. He explained that the term is applied to any algorithm that automatically accomplishes tasks hitherto reserved for humans; he cited examples such as language translation and healthcare diagnosis. He highlighted the disconnect between tasks that algorithms find easy and humans find difficult, and vice versa: AI systems are today the best players of games such as chess and Go, yet no computer program can understand and interpret human social relations or successfully drive a car. 

“You have all these paradoxes about artificial intelligence,” explained Villani. “And the field has certainly gone through its ups and downs.” Though recent breakthroughs in neural networks and machine learning have greatly improved the efficiency of algorithms and overhauled sectors such as finance and online advertising, there has also been, as Villani put it, “a dark side to the story.” When AI is poorly or unethically used to prioritize profit over human considerations, it can have disastrous effects; Villani cited examples such as teacher evaluation algorithms that wrongly dismissed talented teachers, or scheduling algorithms that assigned workers unmanageable shifts. 

Villani also discussed the importance of tackling the spread of misinformation through artificial intelligence, emphasizing the role of community monitoring. Algorithms that promote content in news feeds based on engagement may prioritize content that provokes anger or outrage. Villani argued that online communities must therefore take responsibility for confronting disinformation, citing Wikipedia as a success story in this domain. He also discussed the concern that investment in algorithms can divert attention away from other global challenges, such as climate change, inequality and global conflict. “The main obstacle today when it comes to climate change is not about the progress of science, but that societies have to take on this burden and make the changes science demands,” said Villani. “All of this requires more political action and energy.” 

Finally, Villani discussed the critique of AI in terms of resource use. New digital technologies like blockchain consume vast amounts of energy, and there are global shortages of metals such as copper and lithium due to the increasing demand for data storage. “We have to prepare for a world of finite computer science,” said Villani, noting that AI can be a valuable tool for activists as well as lobbyists. After his presentation, Villani took questions from the online audience on topics such as solutions to the problem of disinformation during elections and the use of AI in a defense context. You can watch Villani’s full presentation and Q&A in English in the video below.