At Hot Chips last week (the annual chip-technology symposium held in Silicon Valley), researchers from the University of Wisconsin-Madison and startup SimpleMachines described what they see as a necessary change to the way computing hardware is put together. Like many things at the high end of computing, the driver for this change is a familiar one: artificial intelligence.

Karu Sankaralingam, computer-sciences professor at the University of Wisconsin-Madison, argued in his talk on the Mozart architecture that AI models are getting bloated. Language models in particular are now growing at a rate that outpaces Moore's Law by a factor of ten. In that environment, it is no surprise to find wafer-scale processors, such as those from Cerebras, turning up at the same conference. There is one simple reason why what Stanford University's HAI calls foundation models are so big: they work better than smaller models. Right now, there is not a strong incentive to...