September 24, 2019 at 3:40 pm #3210
Microprocessors are shrinking with each new generation. Or so it used to be. Andrew Feldman of Cerebras, a startup in Los Altos, California, has displayed a block of Plexiglas; baked into it is an A4-sized microprocessor.
“It’s the world’s biggest,” he says proudly: 400,000 cores, 18 gigabytes of memory and 1.2trn transistors. That is, respectively, about 78, 3,000 and 57 times more than the largest existing processor from Nvidia, another big chipmaker.
Cerebras – the name echoes cerebrum, the front part of the brain, as well as Cerberus, the giant three-headed dog that guards the entrance to Hades – is leading a shift in semiconductors that was on full display at Hot Chips, an industry gathering at Stanford University, where startups, including Cerebras and Habana, but also such giants as Nvidia and Intel, showed off their new silicon wares in August.
Thanks to Moore’s law, which states that computer power doubles every two years at about the same cost, cramming ever more transistors on standard chips used to be the way to go. But with transistors now the size of dozens of atoms, improvements have become less predictable. And with the spread of artificial intelligence (AI), demand for computing power has grown much faster than Moore’s law—by more than 300,000 times for certain applications between 2012 and 2018, according to some estimates. As a result, chipmakers are now dialling up performance by, among other things, increasing the size of processors that inhale data to train AI services, such as facial recognition.
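The gap the article describes can be checked with simple arithmetic. A minimal sketch (the 300,000-fold figure is the estimate cited above; the rest follows from the doubling rule):

```python
# Moore's law: compute roughly doubles every two years.
# Between 2012 and 2018 that predicts 2**(6/2) = 8x growth,
# against an estimated ~300,000x growth in AI compute demand.
years = 2018 - 2012
moores_law_growth = 2 ** (years / 2)   # doubling every two years -> 8.0
ai_demand_growth = 300_000             # cited estimate for some AI workloads

# How far demand outran the traditional scaling curve:
shortfall = ai_demand_growth / moores_law_growth
print(moores_law_growth)  # 8.0
print(shortfall)          # 37500.0
```

In other words, transistor scaling alone would have delivered about an eightfold improvement over those six years, leaving a gap of more than four orders of magnitude for bigger chips and specialised designs to fill.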
Cerebras has pushed this approach to the limit: its chip is the biggest that can be cut from the largest available wafers. To get there, the firm had to overcome more than one technical hurdle.
One is defects: every wafer has some, so Mr Feldman’s team had to find a way to route around faulty cores. Another is cooling: water pumped through tiny pipes carries away the great heat that cores generate. But this is not all: Cerebras has also built a specialised computer for its new chip that it claims will deliver 150 times more number-crunching power than the best server based on graphics processing units, today’s AI workhorses.
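Cerebras's actual defect-tolerance scheme is proprietary, but the general idea of "routing around" faulty cores can be sketched with a hypothetical remapping table: logical core IDs are assigned only to physical cores that passed testing, so software never sees the bad ones.

```python
# Hypothetical sketch of redundancy mapping (not Cerebras's real design):
# skip known-defective physical cores when assigning logical core IDs.
def build_core_map(n_physical, defective):
    """Map logical core IDs onto the healthy physical cores only."""
    healthy = [c for c in range(n_physical) if c not in defective]
    return {logical: physical for logical, physical in enumerate(healthy)}

# A toy 10-core wafer region with two manufacturing defects:
core_map = build_core_map(10, defective={2, 7})
print(core_map[2])  # logical core 2 lands on healthy physical core 3
```

The same principle, applied across hundreds of thousands of cores plus some spares, lets a wafer with the inevitable scattering of defects still present a fully working processor.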
At Hot Chips, attendees were suitably impressed when Cerebras presented its new processor. But the biggest hurdle for the company may be economic, not technical, says Linley Gwennap of Microprocessor Report, an industry newsletter. The firm has to convince big providers of cloud computing, such as Amazon Web Services, Microsoft Azure and Google Cloud, that it is worth their while to use Cerebras computers instead of machines packed with Nvidia chips – for instance, because they consume less power. Another question is whether other firms with huge demand for computing power, including banks and oil majors, will want to buy such AI supercomputers instead of having their data crunched in a cloud. Don’t be surprised if Cerebras is taken over by a bigger firm, be it another chipmaker or a computer vendor – just like other AI-chip pioneers before it.
From: The Economist