Nvidia Licenses Groq's LPU Tech in a Deal Reportedly Worth $20 Billion
Nvidia has struck a licensing deal with AI chip rival Groq, hiring its founder. CNBC reports a $20 billion asset acquisition, which Nvidia denies.
The AI chip titan just embraced its rival. But this isn't a friendly handshake; it's a strategic power play that could reshape the market. In a stunning move, Nvidia has struck a non-exclusive licensing agreement with its AI chip competitor Groq and will hire its founder and other key employees.
A $20 Billion Handshake?
As part of the deal, Groq founder Jonathan Ross, president Sunny Madra, and other staff will join Nvidia. While Nvidia told TechCrunch this is not an acquisition of the company, CNBC reported that Nvidia is acquiring assets from Groq for a staggering $20 billion. If those numbers are accurate, this would be Nvidia's largest purchase ever.
The Power of the LPU
So, why Groq? The startup has been making waves with a different kind of chip called an LPU (Language Processing Unit). Groq has claimed its LPU can run Large Language Models (LLMs) 10 times faster while using just one-tenth the energy of conventional hardware. Ross, who serves as Groq's CEO, is a giant in the field, known for helping invent Google's TPU (Tensor Processing Unit).
Groq's growth has been explosive. The company raised $750 million at a $6.9 billion valuation back in September. It also said its technology now powers the AI apps of more than 2 million developers, a huge jump from about 356,000 last year.