Inside Amazon.com’s chip lab in Austin, Texas, a team of engineers was busy testing a new, highly confidential server design on a Friday afternoon. The server was equipped with Amazon’s AI chips, which are designed to rival those of industry leader Nvidia, Amazon executive Rami Sinno said during a visit to the lab.
Amazon is developing its own processors to reduce its dependence on Nvidia’s costly chips, which power much of the AI cloud business at Amazon Web Services (AWS), the company’s primary growth driver. By creating its own chips, Amazon aims to help customers perform complex calculations and process large data sets more affordably. Competitors Microsoft and Alphabet are pursuing similar strategies.
Sinno, director of engineering for Amazon’s Annapurna Labs, the chip designer the company acquired in 2015 and folded into AWS, noted that customers increasingly seek cheaper alternatives to Nvidia.
Although Amazon’s AI chip development is still in its early stages, its Graviton chip, which handles non-AI computing, has been in development for nearly a decade and is now in its fourth generation. The AI chips, Trainium and Inferentia, are more recent innovations.
David Brown, Vice President of Compute and Networking at AWS, said on Tuesday that Amazon’s chips could offer 40% to 50% better price-performance than Nvidia’s, which could make running a given model roughly half as expensive.
AWS, which accounts for nearly a fifth of Amazon’s overall revenue, saw sales rise 17% to $25 billion in the January-March quarter from a year earlier. AWS holds about a third of the cloud computing market, while Microsoft’s Azure has around 25%.
To handle the surge in activity across its platforms during the recent Prime Day, which generated a record $14.2 billion in sales, Amazon deployed 250,000 Graviton chips and 80,000 of its custom AI chips.