While most computer-chip makers over the past decades have touted the benefits of an ever-shrinking product, a well-funded Silicon Valley company has built what it says is the largest and fastest-ever computer chip, dedicated to AI.
When the five friends who formed Cerebras Systems Inc. decided to start a company, they wanted to build a new computer to address a big problem. They had previously worked on compact, low-power servers for data centers at SeaMicro, which was later acquired by Advanced Micro Devices.
Cerebras was officially formed in 2016, six years before the debut of ChatGPT, but the founders decided back then to focus on the tough computing problem of AI, where the company now competes with industry leader Nvidia Corp., as well as other chip giants and startups.
“We built the largest part ever made, the fastest ever made,” Cerebras co-founder and Chief Executive Andrew Feldman said. “We made a set of trade-offs. It is 100% for AI.”
After starting up in downtown Los Altos, Calif., Cerebras is now based in Sunnyvale, just minutes from its data-center partner Colovore in nearby Santa Clara. The company now has more than five times the office space, along with the loading dock a hardware company needs.
Cerebras has developed what it calls a wafer-scale engine with 2.6 trillion transistors and 850,000 cores, all on a silicon wafer about 8.5 inches wide. The wafers have been shipping since 2020, and are part of a total system designed specifically to process queries and training for artificial intelligence. And they are making inroads as an alternative to Nvidia in the high-performance computing market for AI systems.
“No one has built a chip this big in the history of compute,” Feldman told MarketWatch, as he held up the dinner-plate-sized wafer. “This replaces the CPU and the GPU. This is the compute engine. This replaces everything made by Nvidia and Intel and AMD.” Last year, Cerebras’ invention was inducted into Silicon Valley’s Computer History Museum as the largest computer chip in the world.
Cerebras is still private, but with $720 million in venture funding, it is one of the better-funded hardware and semiconductor startups. Several analysts believe it will be one of a handful of AI chip startups to succeed. In its last funding round, a Series F in 2021, the company said it was valued at $4 billion.
“They have come up with a very unique architecture,” said Jim McGregor, an analyst with Tirias Research. “Because of the way their system is architected, they can handle enormous amounts of data. It’s not the best solution for every application, because it is obviously not cheap. But it is an incredible solution for high-end data sets.” McGregor, quoting Nvidia CEO Jensen Huang, said there will be both multipurpose data centers running AI and specialized AI factories. “I would put [Cerebras] in that second category of AI factory,” he said.
Because AI workloads are so incredibly processing-intensive, Cerebras’ systems are dedicated to them and designed to keep all the processing on the same giant chip. Feldman offered a simple analogy: watching a football game at home with the beer already on hand, versus having to drive to a store mid-game to buy some. “All the communication is here, you are doing very small, short movements,” he said. Just as you don’t need to get in your car for more beer, “you don’t have to go off-chip, you don’t have to wait for all the elements to be in place.”
Feldman declined to say what the company’s revenue is so far, but said it has doubled this year. This summer, Cerebras got a big boost: a contract, initially valued at $100 million, to deliver the first of nine AI supercomputers to G42, a tech conglomerate in the United Arab Emirates. The first of those systems is running live in the Colovore data center in Santa Clara, Calif., which offers white-glove service for customers behind an unassuming office entrance on a back street lined with RV campers, a block from a Silicon Valley Power station. That proximity to the power station has become an important feature for data centers.
“This is the cloud,” Feldman said, standing amid the loud, humming racks and water-cooled servers in a vast windowless room at Colovore. “It’s anything but serene.”
Before the recent deal with G42, Cerebras’ customer list was already an impressive roster of high-performance computing users, including pharmaceutical companies GlaxoSmithKline, which uses the systems to make better predictions in drug discovery, and AstraZeneca, which runs queries on hundreds of thousands of abstracts and research papers. National laboratories, including Argonne National Laboratory, Lawrence Livermore National Laboratory, the Pittsburgh Supercomputing Center and several others, are using the systems to accelerate research, simulate workloads, and develop and test new research ideas.
“If they can keep their trajectory going, they could be one of the companies that survives,” said Pat Moorhead, founder and chief analyst at Moor Insights & Strategy. “Ninety out of 100 companies will go out of business. But for the sole fact that they are driving some pretty impressive revenue, they can establish a niche. They have an architectural advantage.”
As large corporations and small businesses alike rush to adopt AI to save on labor costs with (hopefully) better chatbots, conduct faster research or help do mundane tasks, many have been ramping up spending on their data centers to add the extra computing power that’s needed. Nvidia has been one of the biggest beneficiaries of that trend, with its graphics processing units (GPUs) in huge demand. Analysts estimate Nvidia currently has between 80% and 90% of the market for AI-related chips.
“I like our odds,” said Eric Vishria, a general partner at Benchmark Capital, one of the earliest investors in Cerebras, when asked about Cerebras’ potential to survive and succeed in an environment where some AI chip startups are said to be struggling. “I am not an investor in any of the others. I have no idea how well they are doing, in terms of revenue and actual customer traction,” he said. “Cerebras is well ahead of the bunch as far as I understand it. It’s not easy, and one of the challenges has been that AI is moving and changing so fast.”
Feldman said that the next milestone for the company would, of course, be an initial public offering, but he declined to give any kind of time frame.
“We have raised a huge amount of money and our investors need a return,” he said when asked if the company plans to go public. “We don’t see that as a goal but as a by-product of a successful company. You build a company on enduring technology that changes an industry. That is why you get up every morning and start companies, to build cool things to move the industry.”