Nvidia remains dominant in chips for training large AI models, while inference has become a new front in the competition.
Microsoft has announced the launch of its latest chip, the Maia 200, which the company describes as a silicon workhorse ...
The next generation of inference platforms must evolve to address all three layers. The goal is not only to serve models ...
Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
The Chosun Ilbo on MSN
OpenAI seeks inference chips beyond Nvidia's GPUs
Reuters reported on the 2nd (local time) that OpenAI has been dissatisfied with certain performance aspects of Nvidia’s ...
A.I. chip, Maia 200, calling it “the most efficient inference system” the company has ever built. The Satya Nadella-led tech ...
In September 2025, Nvidia and OpenAI announced a letter of intent for Nvidia to invest up to $100 billion in OpenAI’s AI ...
OpenAI is reportedly looking beyond Nvidia for artificial intelligence chips, signalling a potential shift in its hardware ...
OpenAI seeks chip alternatives from AMD and Cerebras while $100 billion Nvidia investment stalls. Both companies dismiss ...
The seed round values the newly formed startup at $800 million.
Nvidia joins Alphabet's CapitalG and IVP to back Baseten. Discover why inference is the next major frontier for NVDA and AI ...