Race for Generative AI Supremacy: Amazon Invests in AWS Custom Chips

C.T. Carvalho

Amazon is investing heavily in custom chips for generative AI, competing directly with Microsoft and Google: Trainium for training models and Inferentia for accelerating inference. The chips are pitched as an alternative to Nvidia GPUs, particularly for training large language models, as GPUs become harder and more expensive to procure.
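For readers curious how these chips are targeted from code, the usual path is the AWS Neuron SDK, which plugs into PyTorch. The sketch below shows, under some assumptions, ahead-of-time compilation of a toy PyTorch model with the torch_neuronx package so it can run on Inferentia2 or Trainium NeuronCores; the model, input shape, and output filename are illustrative only, and exact behavior depends on the Neuron SDK version.

```python
import torch
import torch_neuronx  # AWS Neuron SDK integration for PyTorch (Inf2/Trn1)

# Toy model standing in for a real generative AI workload (illustrative only).
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 10),
).eval()

example_input = torch.rand(1, 128)

# Ahead-of-time compile the model for NeuronCores. The resulting artifact
# can be loaded with torch.jit.load() and served on Inf2/Trn1 instances.
neuron_model = torch_neuronx.trace(model, example_input)
neuron_model.save("model_neuron.pt")
```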

While Amazon is entering the generative AI race with determination, Microsoft and Google have already moved aggressively to capitalize on the technology's boom. Microsoft rose to prominence by hosting OpenAI's ChatGPT and investing billions in OpenAI itself, while Google launched its own chatbot, Bard, and invested heavily in OpenAI competitor Anthropic.

Amazon is building its differentiation through custom silicon, positioning Inferentia and Trainium as distinct technical alternatives. The chips promise improved performance for AI model training and inference, making them an attractive option for companies building generative AI capabilities.

AWS, as the dominant cloud provider, has a strategic advantage: its installed base and customers' familiarity with the platform could draw those customers to Amazon for generative AI workloads. AWS also offers a growing portfolio of generative AI tools and services, such as AWS HealthScribe and Amazon SageMaker.
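As one concrete, hypothetical example of how that platform familiarity plays out, the SageMaker Python SDK can launch a training job on a Trainium-backed (trn1) instance with only a few lines. Everything below (the training script, IAM role, S3 path, and framework versions) is a placeholder to be checked against current AWS documentation, not a detail from the article.

```python
from sagemaker.pytorch import PyTorch

# Sketch of a SageMaker training job on a Trainium (trn1) instance.
# All values are placeholders; check the AWS docs for Neuron-enabled
# container versions that support trn1 instance types.
estimator = PyTorch(
    entry_point="train.py",                 # hypothetical user training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    instance_type="ml.trn1.32xlarge",       # Trainium-backed SageMaker instance
    instance_count=1,
    framework_version="1.13.1",             # illustrative; verify trn1 support
    py_version="py39",
)

# Placeholder S3 location for the training data channel.
estimator.fit({"training": "s3://my-bucket/training-data/"})
```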

Beyond tooling, Amazon is also rumored to be developing its own large language models, signaling a growing interest in competing directly in generative AI.

While Nvidia's hardware still dominates AI model training, Amazon is looking to establish itself as a force in this fast-growing landscape. With more than 100,000 customers already using machine learning on AWS, Amazon is well positioned to influence the generative AI market and help shape its future.

Source/credit: CNBC / Digital Agro