
Amazon to Provide Anthropic With Chip Clusters Offering Five Times the Power


(Bloomberg) -- Amazon.com Inc. is ramping up its artificial intelligence offerings, rolling out powerful new chip arrays and a large language model it says can compete with leading rivals.

The Seattle-based company is stringing together hundreds of thousands of its Trainium2 semiconductors into clusters that will make it easier for partner Anthropic to train the large language models required for generative AI and other machine learning tasks. The new arrays will quintuple the startup’s current processing power, Amazon said. 

Amazon Web Services, the cloud services division, began offering its latest chips to customers on Tuesday, the company said at its annual re:Invent conference. 

Andy Jassy, marking his first appearance at the trade show since becoming Amazon’s chief executive officer in 2021, introduced the new models — called Nova. Capable of generating text, images and video, they represent Amazon’s latest effort to compete with OpenAI and other builders of the large language models that power chatbots and other generative AI tools.

AWS, the largest seller of rented computing power, runs many of the servers that other companies rent to train artificial intelligence applications. AWS also makes models built by other companies, including Anthropic’s Claude and Meta Platforms Inc.’s Llama, available to its customers. But the company has yet to produce a large language model widely seen as competitive with OpenAI’s most advanced GPT models.

Amazon-built models released in the last two years, called Titan, were generally smaller in scope. The Nova models, some available now and others arriving next year, include a “multimodal to multimodal” version that can take text, speech, images and video as inputs and generate responses in each of those modes.

Amazon, Jassy said, would continue to both develop its own models and offer those built by others. “We are going to give you the broadest and best functionality you can find anywhere,” he said.

Amazon last month said it was investing an additional $4 billion in Anthropic. As part of the deal, Anthropic said it would use Amazon’s cloud and its chips to develop its most advanced models.

The new chip cluster, called Project Rainier, will contain “significantly more” than 100,000 chips, Gadi Hutt, who works with customers at Amazon’s Annapurna Labs chipmaking unit, said in an interview. Amazon says it expects the cluster to be the world’s largest set of dedicated AI hardware. 

Amazon hopes the chips, the company’s third generation of AI semiconductors, will prove competitive with Nvidia Corp.’s products, offering AWS customers an alternative when developing generative AI products. For most companies, Nvidia’s graphics processing units, which are costly and often in short supply, are the default hardware for such tasks today. 

Amazon says it will offer customers computing power backed by Nvidia’s new Blackwell chip starting early next year.

(Updated with large language model announcement.)

©2024 Bloomberg L.P.