Google launches new AI chip 'Ironwood' to speed up AI tools and compete with Nvidia
Ironwood: The Ironwood chip handles the data-processing load that arises when users enter queries into AI tools like ChatGPT. In technical circles this is called 'inference computing': chips that perform fast calculations to answer questions or generate other responses in chatbots.

Google's parent company Alphabet has released its seventh-generation artificial intelligence (AI) chip, 'Ironwood'. The company designed the new processor with a single aim: accelerating AI software. Ironwood serves the data-processing demands created when users type queries into AI platforms such as ChatGPT, the workload known as 'inference computing', in which chips perform rapid calculations to answer questions or generate other chatbot responses.
Google's years-long, multibillion-dollar chip effort is one of the few serious challengers to Nvidia's dominance of the AI chip market, opening up new competition in AI processing. Google's Tensor Processing Units (TPUs) are available only to the company's own engineers or through its cloud service, which has given Google an advantage over some rivals in its in-house AI work.
Google previously split its TPU line into two types: one chip for training large AI models, and another for making inference (the runtime use of AI) cheaper and faster. The new Ironwood chip combines both capabilities in a single processor. Ironwood is designed to operate in clusters of 9,216 chips working simultaneously, and it carries more memory than the previous generation, making it better suited to serving AI applications.