Qualcomm Makes a Play for AI Chip Market Dominated by Nvidia, AMD
Qualcomm, known for smartphone semiconductors, announced a pair of AI accelerator chips set to hit the market in 2026 and 2027.

Qualcomm wants everyone to know that it’s not chopped liver in the AI economy.
On Monday, the San Diego-based veteran of the semiconductors-for-smartphones game climbed aboard the AI hype train, unveiling a pair of accelerator chips for massive data centers set to hit the market in 2026 and 2027, respectively. The move makes Qualcomm yet another late entrant in a race dominated by Nvidia and AMD.
Qual(comm)ity Concerns
Shares of Qualcomm climbed as much as 22% on Monday, ultimately closing up 11%. And the pop is hardly a surprise. After all, who wouldn’t want to invest in a new picks-and-shovels company in the midst of the tech industry’s multi-trillion-dollar gold rush of AI capex spending? Better yet, the picks and shovels (in this case, chips) need constant replacing: by most estimates, AI chips have an average lifespan of just two to three years. That means Qualcomm could snap up some beefy contracts when AI’s most prominent players replace their Nvidia and AMD hardware.
In the meantime, Qualcomm said Monday it already has one major client lined up: Humain, a start-up launched earlier this year and backed by Saudi Arabia’s Public Investment Fund, which seeks to build an AI-powered alternative to the Windows and macOS operating systems. Humain plans to have about 200 megawatts of data centers powered by Qualcomm chips online by the end of next year.
Still, experts have already flagged the chips’ potential tradeoffs:
- Qualcomm touts its new chips, specifically designed for AI inference (running actual models) rather than AI training, as offering a “generational leap in efficiency and performance,” with each chip housing as many as 768 gigabytes of low-power dynamic random access memory.
- That may sound like a lot, and memory has proven a key constraint for AI chips, but not everyone is sure it’s the right type of memory. Bank of America analysts were skeptical of Qualcomm’s announcement, CNBC’s Kristina Partsinevelos reported Monday, calling them “lower-end chips” and flagging their lack of critical high-bandwidth memory (HBM).
The Chips Are Coming from Inside the House: Be that as it may, Bank of America’s analysts project the new chip line could open up as much as $2 billion in annual revenue for Qualcomm. It’s sorely needed, with longtime smartphone client Apple increasingly adopting its in-house-designed chips to achieve independence from third-party contractors by 2027. Apple was responsible for roughly 20% of Qualcomm’s $39 billion in revenue in 2024. Unfortunately for Qualcomm (and Nvidia and AMD), Big Tech is already starting to get the same idea for AI chips.