There is no guarantee that artificial intelligence needs to match what happens in biology, or even be biologically inspired. The early days of AI research focused far more heavily than today's on building machines that could reason formally about the world around them. The approaches now in vogue instead consist of feeding enormous quantities of data into an equally enormous network of simple arithmetic blocks, in the hope that a training algorithm will help the network figure out some complex, common pattern intuitively.
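
To make that contrast concrete, here is a minimal sketch, assuming NumPy, of the data-driven recipe in miniature: a tiny network of simple arithmetic blocks (matrix multiplications and a squashing nonlinearity, chosen purely for illustration) is shown examples of a pattern, in this case XOR, and a gradient-descent training loop nudges its weights until the pattern emerges, with nothing about the problem reasoned out formally.

```python
# A toy version of the approach described above: a small network of simple
# arithmetic blocks is fitted to a pattern by a training algorithm, rather
# than being told any rules about it. (Illustrative sketch, not any
# particular lab's method.)
import numpy as np

rng = np.random.default_rng(0)

# The pattern to be learned: XOR, standing in for "enormous quantities of data".
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# A tiny network: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(0, 1, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: simple arithmetic blocks composed together.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error with respect to each weight.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # The "training algorithm": nudge every weight a little downhill.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]: the pattern has been "figured out"
```

Scaled up by many orders of magnitude in data, parameters and compute, this fitting loop is essentially the shape of the approach now in vogue.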

“My big question is how do we get machines to learn more like animals and humans? We observe astonishing learning abilities from humans who can figure out how the world works partly by observation, partly by interaction. And it’s much more efficient than what we can reproduce on machines. What is the underlying principle?” Yann LeCun, Meta's chief AI scientist, asked rhetorically in a panel session organised by Nvidia at its Fall GTC conference that included...