Abstract
If “data is the new oil,” then corruption in the data used to train artificial intelligence (AI) constitutes a new form of pollution. Environmental AI has traditionally been discussed in terms of its indirect effects on the environment: the irony of burning power and processor cycles, and generating heat, to produce solutions meant to stop and heal environmental damage. But there is a deeper problem. When environmental AI suggests interventions, its outputs are written onto the landscape. If that landscape is then read as data to retrain AI, there is a risk of model collapse and catastrophic forgetting, as the snake devours its own tail. This article discusses the poor fit between current legal regimes governing AI and the use of AI in the environmental space, and then details the problems of model collapse in the context of environmental AI.
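The feedback loop the abstract describes can be pictured with a toy simulation. The sketch below is not from the article; it is a minimal illustration, under assumed parameters, using a simple Gaussian stand-in for the “model” and NumPy-generated stand-in data, of what tends to happen when a model is repeatedly refit to data produced by its own previous outputs.

```python
# Toy sketch of the abstract's feedback loop (illustrative assumptions only):
# a "model" is fitted to observations, its outputs are written back onto the
# "landscape", and the next generation trains on that altered landscape.
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: stand-in "real" observations of some environmental variable.
data = rng.normal(loc=10.0, scale=4.0, size=200)

for generation in range(12):
    # "Train": fit a simple Gaussian model to whatever is currently observable.
    mu, sigma = data.mean(), data.std()
    if generation % 3 == 0:
        print(f"generation {generation:2d}: mean={mu:5.2f}  std={sigma:4.2f}")
    # "Intervene": the model's outputs replace the landscape, and the next
    # generation reads that landscape as its training data.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

Because each generation can only re-describe the previous model's outputs, information about the original distribution (its rare, tail observations in particular) is not recoverable once lost; this progressive narrowing is the tendency referred to as model collapse.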
Original language | American English |
---|---|
Journal | Oxford Intersections: AI in Society |
DOIs | |
State | Published - Mar 20 2025 |