Inside a cavernous room this week in a one-story building in Santa Clara, Calif., six-and-a-half-foot-tall machines whirred behind white cabinets. The machines made up a new supercomputer that went live just last month.

The supercomputer, which was unveiled Thursday by Cerebras, a Silicon Valley start-up, was built with the company’s specialized chips, which are designed to power artificial intelligence products. The chips stand out because of their size: about as big as a dinner plate, or 56 times larger than the chips typically used for AI. Each Cerebras chip contains the computing power of hundreds of traditional chips.

Cerebras said it built the supercomputer for G42, an AI company. G42 said it plans to use the supercomputer to create and operate AI products for the Middle East.

“What we’re showing here is that there’s an opportunity to build a very large, dedicated AI supercomputer,” said Andrew Feldman, Cerebras’ chief executive. He added that his startup wanted to “show the world that this work can be done faster, it can be done with less energy, it can be done for a lower cost.”

Demand for computing power and AI chips has soared this year, fueled by a global AI boom. Tech giants like Microsoft, Meta and Google, as well as a myriad of startups, have rushed to launch AI products in recent months after the AI-powered ChatGPT chatbot went viral for the eerily human-like prose it could generate.

But making AI products typically requires significant amounts of computing power and specialized chips, leading to a wild hunt for more of those technologies. In May, Nvidia, the main maker of chips used to power AI systems, said appetite for its products — known as graphics processing units, or GPUs — was so strong that its quarterly sales would be more than 50 percent above Wall Street’s estimates. The forecast sent Nvidia’s market value soaring above $1 trillion.

“For the first time, we’re seeing a huge jump in computing requirements” due to AI technologies, said Ronen Dar, founder of Run:AI, a Tel Aviv-based startup that helps companies develop AI models. That “created a huge demand” for specialized chips, he added, and companies “rushed to secure access” to them.

To get enough AI chips, some of the biggest tech companies – including Google, Amazon, Advanced Micro Devices and Intel – have developed their own alternatives. Startups like Cerebras, Graphcore, Groq and SambaNova have also joined the race, aiming to enter the market that Nvidia has dominated.

Chips are poised to play such a key role in AI that they could shift the balance of power between tech companies and even nations. The Biden administration, for example, recently weighed restrictions on the sale of AI chips to China, and some U.S. officials say China’s AI capabilities could pose a national security threat to the United States by enhancing Beijing’s military and security apparatus.

AI supercomputers have been built before, including by Nvidia. But it’s rare for startups to create them.

Cerebras, which is based in Sunnyvale, California, was founded in 2016 by Mr. Feldman and four other engineers, with the goal of building hardware that accelerates AI development. Over the years, the company has raised $740 million from investors including Sam Altman, who heads the AI lab OpenAI, and venture capital firms like Benchmark. Cerebras is valued at $4.1 billion.

Because the chips typically used to power AI are small—often the size of a postage stamp—hundreds or even thousands of them are needed to process a complicated AI model. In 2019, Cerebras took the wraps off what it claimed was the largest computer chip ever built, and Mr. Feldman said its chips can train AI systems between 100 and 1,000 times faster than existing hardware.

G42, the Abu Dhabi company, started working with Cerebras in 2021. It used Cerebras’ systems in April to train an Arabic version of ChatGPT.

In May, G42 asked Cerebras to build a network of supercomputers in different parts of the world. Talal Al Kaissi, the chief executive of G42 Cloud, a subsidiary of G42, said the cutting-edge technology would allow his company to make chatbots and use AI to analyze genomic and preventive care data.

But the demand for GPUs was so high that it was difficult to get enough to build a supercomputer. Cerebras’ technology was both available and cost-effective, said Mr. Al Kaissi. So Cerebras used its chips to build the supercomputer for G42 in just 10 days, Mr. Feldman said.

“The timescale has been greatly reduced,” said Mr. Al Kaissi.

Over the next year, Cerebras said, it plans to build two more supercomputers for G42 — one in Texas and one in North Carolina — and, after that, six more distributed around the world. It calls this network Condor Galaxy.

Startups, however, are likely to find it difficult to compete against Nvidia, said Chris Manning, a computer scientist at Stanford whose research focuses on AI. That’s because people who build AI models are used to using software that runs on Nvidia’s AI chips, he said.

Other startups have also tried to enter the AI chip market, but many have “effectively failed,” Dr. Manning said.

But Mr. Feldman said he was hopeful. Many AI companies don’t want to be locked into just Nvidia, he said, and there is global demand for other powerful chips like those from Cerebras.

“We hope this will advance AI,” he said.
