A proof-of-concept AI system could use around 100 times less energy than today’s large language models (LLMs), a team from Tufts University has said.

AI currently consumes massive amounts of energy because training and running LLMs requires thousands of specialised GPUs running continuously in data centres. In the US, it’s estimated that AI systems and data centres used about 415 TWh in 2024, accounting for more than 10% of the country’s total electricity production.

As reported in Science Daily, researchers at Tufts’ School of Engineering claim their proof-of-concept AI system is far more efficient because it relies on a hybrid approach called neuro-symbolic AI. The system combines traditional neural networks with symbolic reasoning, that is, the use of human-readable symbols, rules and logic to solve problems step by step, rather than finding statistical patterns in data as modern LLMs do. This method mirrors how people approach problems by breaking them into steps and categories.
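To make the division of labour concrete, here is a minimal, hypothetical sketch of the neuro-symbolic pattern in Python. It is not the Tufts team’s actual system: the "neural" step is stubbed out with simple thresholds standing in for a trained network, and the symbolic step applies human-readable if-then rules by forward chaining. All function and symbol names are illustrative assumptions.

```python
# Hypothetical sketch of the neuro-symbolic pattern, not the Tufts system:
# a (stubbed) neural component turns raw input into discrete symbols,
# then a rule engine reasons over those symbols.

def neural_perception(pixels: list[float]) -> set[str]:
    """Stand-in for a neural network: maps raw features to symbols.
    A real system would run a trained model here."""
    symbols = set()
    if sum(pixels) / len(pixels) > 0.5:
        symbols.add("bright")
    if max(pixels) - min(pixels) > 0.8:
        symbols.add("high_contrast")
    return symbols

# Symbolic reasoning: human-readable rules, applied repeatedly until
# no new facts can be derived (forward chaining).
RULES = [
    ({"bright", "high_contrast"}, "edge"),
    ({"edge"}, "object_boundary"),
]

def symbolic_reasoning(facts: set[str]) -> set[str]:
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

if __name__ == "__main__":
    raw = [0.95, 0.1, 0.8, 0.7]          # toy "image" features
    facts = neural_perception(raw)       # neural step: data -> symbols
    facts = symbolic_reasoning(facts)    # symbolic step: symbols -> conclusions
    print(facts)  # {'bright', 'high_contrast', 'edge', 'object_boundary'}
```

The energy argument hinges on this split: once the input has been compressed into a handful of symbols, the reasoning step is a few cheap rule lookups rather than billions of arithmetic operations through a giant network.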

The team focused...