Google and Amazon begin to strike back... The AI chip war shakes NVIDIA's dominance
The long-standing near-duopoly of OpenAI and NVIDIA in the US artificial intelligence market is beginning to show signs of change. With Google and Amazon each launching their latest AI models and self-developed semiconductor chips, the competition for technological dominance is entering a new phase.
The most notable development centers on Google. The company recently unveiled its “Gemini 3” AI model alongside “Ironwood,” its seventh-generation Tensor Processing Unit (TPU). Gemini 3 is reported to surpass OpenAI’s latest GPT-5.1 in benchmark evaluations of reasoning and coding ability. Notably, Google trained the model without NVIDIA’s graphics processors, relying instead on its own chips. This signals Google’s intention to reduce its dependence on NVIDIA GPUs, long synonymous with AI computing, and to build a proprietary ecosystem.
The trend is spreading beyond Google to other major tech companies. Amazon, through its AWS cloud unit, has released its self-developed AI chip “Trainium3,” which is designed to cut AI model training and operating costs by more than half. Its most notable feature is a substantial improvement in energy efficiency, reflecting the reality that the power consumed by AI computing has become a major challenge for the industry.
As Google and Amazon press ahead with these initiatives, the established leaders OpenAI and NVIDIA are feeling the pressure. OpenAI CEO Sam Altman reportedly declared a “code red” internally, urging the company to focus fully on improving ChatGPT’s performance. Meanwhile, rivals such as Anthropic and DeepSeek have also released new models, accelerating the pace of technological competition.
At the same time, although NVIDIA still holds an estimated 80–90% of the AI chip market, efforts to erode its dominance are becoming increasingly apparent. Meta and Anthropic are adopting, or negotiating to procure, Google’s TPUs, while AWS is accelerating its own chip development even as it selectively retains NVIDIA’s interconnect technology in upcoming products. Still, most analysts believe NVIDIA’s market dominance will be hard to shake in the short term.
As competition in high-end AI models and dedicated chips intensifies, the contest is moving beyond simple feature comparisons into a battle for ecosystem dominance between platforms. Given the rapid pace of technological advancement, which company sets the standard and whose chips attract more AI customers may soon reshape the leadership structure of the AI industry.