Generative AI services such as ChatGPT are major emitters of carbon dioxide (CO2) because of their intensive energy usage, but a new study has found that some AI prompts can cause up to 50 times more emissions than others, depending on what is asked.
To answer user queries, these services use tokens. These are words or parts of words that are converted into a sequence of numbers that can be processed by the large language model (LLM). This conversion, along with the other computing the model performs, consumes energy and so produces CO2 emissions.
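The idea can be sketched in a few lines of code. This is a toy illustration only: real LLM tokenizers (such as byte-pair encoding schemes) are far more sophisticated, and the vocabulary and IDs below are invented for the example. It shows just the core notion of splitting text into known word pieces and mapping each piece to an integer ID.

```python
def toy_tokenize(text, vocab):
    """Greedily split text into the longest known pieces and return their IDs.

    `vocab` maps word pieces to integer IDs; unknown characters get ID 0.
    This is a simplified sketch, not the algorithm any real service uses.
    """
    ids = []
    for word in text.lower().split():
        while word:
            # Take the longest vocabulary entry that starts the remaining word,
            # falling back to a single unknown character if nothing matches.
            piece = next(
                (word[:n] for n in range(len(word), 0, -1) if word[:n] in vocab),
                word[:1],
            )
            ids.append(vocab.get(piece, 0))
            word = word[len(piece):]
    return ids

# Hypothetical mini-vocabulary for demonstration.
vocab = {"thank": 1, "you": 2, "please": 3}

print(toy_tokenize("please thank you", vocab))  # [3, 1, 2]
print(toy_tokenize("thanks", vocab))            # [1, 0] — "thank" + unknown "s"
```

Note how "thanks" is split into a known piece plus a leftover character: this is why even small additions to a prompt, such as polite phrases, translate into extra tokens for the model to process.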
OpenAI CEO Sam Altman has admitted that users saying “please” and “thank you” costs the firm “tens of millions of dollars” in the extra energy spent processing the longer queries.
Researchers at the Hochschule München University of Applied Sciences in Germany measured and compared the CO2 emissions of different pre-trained LLMs using a set of standardised questions.
“The environmental impact of questioning trained LLMs is strongly determined by their reasoning approach, with explicit reasoning...