Nvidia's dominance of the AI chip market faces unprecedented challenges from established rivals such as AMD and Amazon, as well as startups like SambaNova Systems and Groq. These competitors are focusing on inference computing, where they offer cost-effective alternatives to Nvidia's hardware. Meta is using AMD's MI300 chip to run its Llama 3.1 405B model, while Amazon's Trainium 2 chip has drawn positive feedback, including from potential customers such as Apple.
Omdia forecasts that data center spending on non-Nvidia chips will rise 49% in 2024, reaching $126 billion. AMD's MI300 is expected to generate more than $5 billion in sales in its first year. Amazon is also investing heavily, allocating $75 billion to AI chips, and claims its Trainium 2 chip delivers 40% better performance per dollar than Nvidia's hardware.
Despite the growing competition, Nvidia CEO Jensen Huang emphasizes the company's strong position in AI software and its inference capabilities.