The AI chip war is heating up — and Qualcomm (NASDAQ: QCOM) just entered the ring.
The company unveiled two new data center chips — the AI200 and AI250 — designed for AI inference, the workload that generates responses from large language and multimodal models, such as those behind ChatGPT.
The AI200 packs 768 GB of memory per card, while the AI250 introduces a near-memory computing architecture delivering 10x higher effective bandwidth with lower power use — a major leap in AI efficiency.
Both systems feature direct liquid cooling, PCIe for scale-up, Ethernet for scale-out, and confidential computing for secure workloads. The AI200 will launch in 2026, followed by the AI250 in 2027.
This puts Qualcomm in direct competition with Nvidia’s H100/H200, AMD’s MI300X, Intel’s Gaudi, and AI chip efforts from Google and Amazon.
Following the announcement, QCOM stock rose 0.97% premarket to $170.58, as investors bet on Qualcomm’s growing role in the AI data center boom.