No one seems to have told the AI community about the silicon shortage that has sent memory and graphics-card prices soaring while car manufacturers struggle to find supplies. Because, based on current trends, there is nothing like a neural network for chewing up silicon.
Take Cerebras Systems as an example. The company is now on its second generation of AI processor, built on a design that consumes more or less a full wafer of silicon. This latest processor relies on a separate external unit to feed it the data it needs. Cerebras today lies at the extreme end of the silicon-area scale, but many of the start-ups and systems companies building AI accelerators have taken the view that they need to make them as big as they can.
Simon Knowles, chief technology officer at Bristol-based Graphcore, explained at the Hot Chips 33 conference in late August that his company’s mark-two processor is “as big as a reticle allows”, alluding to the...