A Blog by Jonathan Low


Dec 27, 2025

In Classic Tech Ploy, Nvidia Licenses New Rival's AI Chip

"Buy the competition" has been one of Silicon Valley's most effective strategies. As the early winners of the dotcom era, and their successors, got so big that they had more money than they knew what to do with, they became able to simply acquire any startup that threatened to challenge them. 

Nvidia's licensing agreement with Groq appears to demonstrate that such flexing of financial muscle is now prevalent in AI. JL

Kate Clark reports in the Wall Street Journal:

Groq’s “language processing unit” chips are built for inference, the everyday process that occurs when consumers or businesses ask AI models to provide answers, make predictions or draw conclusions on new data. The company’s design, with embedded memory, means its chips can be produced and deployed faster and use less power than graphics-processing units, which typically consume a lot of energy and are more necessary for training models. Nvidia is increasingly contending with new entrants including Google and Amazon.com. Some of Nvidia’s big customers such as OpenAI and Meta are starting to design their own custom chips.

Nvidia has forged a licensing deal with the chip startup Groq for its AI-inference technology, the companies said Wednesday, a sign of growing demand for cutting-edge AI chips.

Under the nonexclusive deal, Groq’s chief executive officer and founder, Jonathan Ross, will be joining Nvidia, along with Groq’s president and some of the startup’s staff. The company, founded in 2016, makes chips and software to run artificial-intelligence models.

Groq’s “language processing unit” chips are built for inference, the everyday process that occurs when consumers or businesses ask trained AI models to provide answers, make predictions or draw conclusions on new data. Ross has said that the company’s design, with embedded memory, means its chips can be produced and deployed faster and use less power than graphics-processing units, which typically consume a lot of energy and are more necessary for training models.

Demand for inference is on the rise, and a group of startups including Groq have been working to redesign hardware that addresses that need while better limiting energy consumption.

The deal follows a string of recent AI licensing agreements. Meta Platforms invested $14 billion in Scale AI, a transaction that led the startup’s CEO to join the social-media company to help lead its AI efforts. Last year, Alphabet’s Google agreed to hire top executives from Character.AI while licensing the company’s technology. And Microsoft struck such a deal with a startup, Inflection AI.

Groq was last valued at $6.9 billion in a $750 million September funding round that included the money managers BlackRock and Neuberger Berman as well as Cisco Systems and Samsung. The company has said that its chips are designed, fabricated and assembled in North America using partners including Samsung.

[Photo: Close-up of the GroqNode product by AI chip startup Groq. Groq makes chips and software to run artificial-intelligence models. groq/Reuters]

Ross said on LinkedIn on Wednesday that he was joining Nvidia to help integrate the licensed technology and that the company’s GroqCloud, a service provider that sells its AI processing to software developers in lieu of chips or servers, would continue to operate independently. Groq’s finance chief, Simon Edwards, will become its new CEO.

Ross, who studied under the AI pioneer Yann LeCun, previously worked at Google, where he started developing the processors that would become known as tensor processing units, or TPUs.

Nvidia shares, which are up more than 35% year to date, were little changed in aftermarket trading.

The company has long been the dominant seller of advanced computer chips that power AI, becoming the most valuable company in the world. It has also increased the cadence of its advanced AI chip releases.

It is increasingly contending with new entrants including Google and Amazon.com. Some of Nvidia’s big customers such as OpenAI and Meta are starting to design their own custom chips.
