Nvidia shares dipped on Tuesday after reports surfaced that Meta is in talks to spend billions on Google’s custom AI chips – a move that could signal the first proper challenge to Nvidia’s dominance in the AI hardware market.
The Information reported that Meta is negotiating a deal to use Google’s tensor processing units (TPUs) in its own data centres from 2027. It may also rent TPU capacity through Google Cloud as early as next year.
Meta currently relies heavily on Nvidia’s GPUs, which power most large-scale AI training globally.
Investors reacted swiftly, sending Nvidia down around 4% in US trade.

The drop comes despite Nvidia’s extraordinary rise this year, which pushed the company to a world-leading US$4.2 trillion (A$6.3t) market valuation.
If finalised, the Meta-Google deal could redirect as much as 10% of Nvidia’s annual revenue, according to the report – representing billions in potential chip orders shifting toward Alphabet.
Google has pitched its TPUs as a cheaper, more power-efficient alternative for some AI workloads.
Unlike Nvidia’s general-purpose GPUs, TPUs are purpose-built for machine learning and already underpin services such as Search, YouTube and DeepMind’s models.

Both companies played down the competitive tension.
Google said demand for both its TPUs and Nvidia GPUs is accelerating, while Nvidia issued a statement praising Google’s “great advances in AI” and insisting it remains a generation ahead of rivals.
Meta has not commented publicly, but the social media giant has been under pressure to diversify its supply as hyperscale buyers battle ongoing chip shortages and rising costs.
Analysts say even a partial shift by a buyer of Meta’s size could reset expectations across the industry.
Google has been building momentum after unveiling its largest AI model, Gemini 3, earlier this month and signing a deal to supply up to one million TPUs to Anthropic.
When that agreement was announced, Nvidia swiftly responded with its own multi-billion-dollar investment in Anthropic.
For now, Nvidia remains the market leader, commanding as much as 90% of the AI accelerator space.
