AI Is Dangerous, but Not for the Reasons You Think | Sasha Luccioni

AI ethics researcher Sasha Luccioni urges us to shift our focus from hypothetical future hazards of AI to the negative impacts the technology is already having on society.

In this talk, she highlights key areas where AI is already causing harm and offers practical ways to rein in its adverse effects.

AI and Climate Change

AI models, especially larger ones, consume significant amounts of energy and release carbon dioxide, contributing to climate change.

One example is the language model BLOOM, whose training used as much energy as 30 homes consume in a year and emitted 25 tons of carbon dioxide.

Tech companies tend to overlook these environmental costs and rarely disclose them.

AI models can contribute to climate change, infringe copyrights, and discriminate against entire communities. But we need to start tracking its impacts. We need to start being transparent and disclosing them and creating tools so that people understand AI better. – Sasha Luccioni

The Environmental Cost of Larger AI Models

Trends in AI research favor ever-larger models, which consume more resources and therefore have a greater environmental impact.

Using a large language model emits 14 times more carbon than accomplishing the same task with a smaller, more efficient model.

As AI becomes a common feature in devices, the environmental cost escalates.

There’s no single solution for really complex things like bias or copyright or climate change. But by creating tools to measure AI’s impact, we can start getting an idea of how bad they are and start addressing them as we go. – Sasha Luccioni

Mitigating the Environmental Impact of AI

To lessen AI's environmental footprint, tools are being developed that estimate the energy consumption and carbon emissions of AI training code.

One such tool is CodeCarbon, co-developed by Luccioni.

These tools aim to give researchers actionable data about their model choices and to encourage the use of renewable energy sources for AI training.
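
As a rough illustration of how such a tracker works, the sketch below wraps a placeholder training loop with CodeCarbon's EmissionsTracker to estimate the run's energy use and emissions. The train_one_epoch function and the project_name value are illustrative assumptions, not details from the talk.

```python
# A minimal sketch: wrapping a placeholder training loop with CodeCarbon's
# EmissionsTracker to estimate the energy use and CO2 emissions of the run.
from codecarbon import EmissionsTracker


def train_one_epoch():
    # Stand-in for real model training work (hypothetical example).
    return sum(i * i for i in range(10_000_000))


tracker = EmissionsTracker(project_name="example-training-run")
tracker.start()
try:
    for epoch in range(3):
        train_one_epoch()
finally:
    # stop() returns the estimated emissions in kg of CO2-equivalent.
    emissions_kg = tracker.stop()

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```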