On October 27, 2025, Qualcomm unveiled the AI200 and AI250, next-generation AI inference chips set to launch in 2026 and 2027, respectively, accelerating its data center ambitions. Aimed at cost-effective AI deployment, the chips challenge Nvidia’s roughly 80% share of the market with lower total cost of ownership (TCO) and high efficiency for large language models and multimodal AI. The AI200, built on Qualcomm’s Hexagon NPU, supports 768GB of LPDDR memory per PCIe card, while the AI250 introduces near-memory computing for a claimed 10x gain in effective memory bandwidth.
Qualcomm’s strategic moves include acquiring Alphawave for $2.4 billion and partnering with Nvidia on NVLink integration, bolstering its ecosystem. Saudi Arabia’s Humain, which has committed to deploying 200 megawatts of Qualcomm hardware, signals strong early demand. Shares jumped nearly 15% to $182.23 on the news, reflecting market optimism. However, Qualcomm must still overcome Nvidia’s ecosystem lead and Intel’s upcoming Crescent Island to gain traction. With AI inference demand soaring, Qualcomm’s modular, power-efficient chips could disrupt what is projected to be a $200 billion market by 2030.
Qualcomm on Monday unveiled two artificial intelligence chips for data centers, with commercial availability starting next year, as it pushes to diversify beyond smartphones and expand into the fast-growing AI infrastructure market.
AI200: Set to launch in 2026, available as individual chips, PCIe cards, or full liquid-cooled server racks, featuring Qualcomm’s Hexagon Neural Processing Unit (NPU) and supporting up to 768GB of LPDDR memory per card.
AI250: Scheduled for release in 2027, this chip introduces near-memory computing for over 10x higher effective bandwidth and reduced power consumption, enhancing efficiency for large-scale AI inference workloads.
From what I heard, they are on TSMC N3, but it could be N4. Does anyone know for sure? Maybe AI200 is N4 while AI250 is N3? Inquiring minds want to know.