AI: Qualcomm into the AI Chips Breach. RTZ #889
This year has seen remarkable deals at unprecedented scale in this AI Tech Wave, especially led by OpenAI striking AI Chip Infrastructure ‘second-source’ deals with Broadcom and AMD in particular.
This after closing a $100 billion Nvidia investment that will flow back to Nvidia and others in ‘circular’ deal fashion. These deals have already climbed to almost a trillion and a half dollars if one includes the big OpenAI deals with Oracle to build multi-gigawatt AI data centers with the requisite Power.
This again underlines the ‘Frenemies’ environment I’ve highlighted to date for AI chips and software, as the key issue is securing AI Compute Supply at Scale. And it leaves opportunities for other established chip companies that have not yet played a meaningful role on the centralized AI data center chip front.
I’m talking about $200 billion market cap chip company Qualcomm of course, a global leader in AI ‘Snapdragon’ chips for smartphones. As the WSJ notes in “Qualcomm Launches AI Chips to Challenge Nvidia’s Dominance”:
“Qualcomm’s shares rose by as much as 20% after announcing new artificial-intelligence accelerator chips, the AI200 and AI250, to rival Nvidia.”
“The AI200 will start shipping next year and the AI250 in 2027, the company said. Both will be available as stand-alone components or cards that can be added into existing machines.”
With this move, Qualcomm shifts from a focus on CPU and AI chips for local compute on PCs and smartphones, to centralized AI data centers for AI inference computing in particular. This builds on its role as a partner to Microsoft for AI Copilot PCs.
“The move puts Qualcomm, which has so far mostly focused on semiconductors for mobile devices, in direct competition with Nvidia, which has dominated the market for AI chips. Qualcomm joins Intel and Advanced Micro Devices in trying to mount AI-chip competition against the semiconductor giant.”
“In an interview, Durga Malladi, a senior vice president at Qualcomm, said that the processors represented the natural evolution of the company’s product line. The company has developed a strong lineup of device-based chips and now wants to scale up its capabilities for AI data centers. Qualcomm says the AI200 and AI250 have an edge because of their memory capabilities and energy efficiency.”
“We’re bringing customers extremely high memory bandwidth and extremely low power consumption,” Malladi said. “It’s the best of both worlds.”
The move makes sense for Qualcomm, especially as it competes indirectly with Apple, which makes its own Apple Silicon for its global ecosystem of iPhones and computers, and is also expanding its ARM-driven Apple Silicon chips into AI data centers, as I’ve discussed to date. And of course there is Microsoft, with its global AI Copilot PC ramp using Qualcomm chips in particular.
“Qualcomm is trying to break into the growing market for AI software and services. The demand for AI processing power is sparking a modern-day gold rush, with technology and power companies pumping hundreds of billions of dollars into the sector. Large companies, or hyperscalers, such as Amazon.com and Microsoft could easily spend about $3 trillion by 2030 to build and operate data centers for their businesses, according to BlackRock Investment Institute.”
“Earlier this month, OpenAI and chip-designer Advanced Micro Devices announced a multibillion-dollar partnership to collaborate on AI data centers that will run on AMD processors, marking one of the most direct challenges yet to industry leader Nvidia.”
Not to forget Intel, which has its own recent partnership with Nvidia, no less:
“Troubled chip maker Intel has also made recent inroads into data-center computing by working with Nvidia to design its first data-center CPUs, or central processing units—the computer brains that power most servers.”
Qualcomm’s first customer, again, is in the Middle East, with Saudi Arabia as the key sponsor. It’s a pattern we’ve seen with Saudi sponsorship of other AI ASIC chip infrastructure from US chip startups like Groq, Cerebras and others.
“The first customer for the AI200 chips will be Humain, an AI company established by the Kingdom of Saudi Arabia’s Public Investment Fund, Qualcomm said. Humain plans to deploy 200 megawatts worth of the chips next year at Saudi data centers, to be used mainly for inference computing, or the functions that allow AI models to respond to queries.”
“The deal expands upon a partnership between Qualcomm and Humain announced in May during President Trump’s visit to a Saudi-U.S. investment forum in the capital of Riyadh.”
“Humain also announced a partnership with Nvidia at the same event, which involves Humain deploying 500 megawatts of power and purchasing hundreds of thousands of servers powered by Nvidia’s Grace Blackwell chips, its most-advanced semiconductors currently on the market.”
The Qualcomm move solidifies the trend of growing AI chip alternatives to Nvidia, which for now is NOT a competitive issue for that company.
The AI Tech Wave is in its early supply-constraint days, a reality that likely does not change much for a couple of years at least. Stay tuned.
(NOTE: The discussions here are for information purposes only, and not meant as investment advice at any time. Thanks for joining us here)