Qualcomm signs major data center customer, expands market reach beyond mobile chips

By Editorial Team · Published May 17, 2026 · 2 min read · Source: Crypto Briefing
The mobile chip giant is returning to data centers after a seven-year absence, betting that AI inference workloads need a different kind of silicon.

Qualcomm has landed a major hyperscale customer for custom data center silicon, marking the company’s most aggressive push into server infrastructure since it abandoned that market entirely in 2018.

Shipments of the new custom chips are expected to begin in December 2026.

From mobile giant to data center contender

Qualcomm launched its Centriq server processors in 2017, positioning ARM-based chips as a power-efficient alternative to Intel’s dominance. By 2018, the entire effort was shuttered.

The AI inference play

Rather than trying to compete head-on with NVIDIA’s training GPUs, Qualcomm is targeting a specific slice of the AI compute stack: inference.

The company is developing custom ASIC-based AI accelerators designed to handle inference workloads at lower power consumption than GPU-heavy alternatives.

Qualcomm signaled this strategic direction in August 2025, and the signing of an unnamed hyperscaler suggests the pitch is landing with at least one major customer.

A crowded and complicated competitive field

NVIDIA remains the undisputed leader, with its GPU architecture serving as the default platform for both training and inference. AMD has gained ground with its MI-series accelerators. Intel continues to push its Gaudi accelerators.

Amazon has built its own Trainium and Inferentia chips for AWS. Google has its TPUs. Microsoft is developing its own Maia accelerators.

Reports also indicate Qualcomm is exploring AI inference hardware deployment in new geographic regions, particularly Latin America.

The December 2026 shipment timeline leaves little room for slippage in custom silicon development. One hyperscale customer is a proof point, not a business: Qualcomm will need to show that its inference architecture can attract multiple large buyers to justify the R&D investment.

Disclosure: This article was edited by Editorial Team. For more information on how we create and review content, see our Editorial Policy.
This article was originally published on Crypto Briefing and is republished here under RSS syndication for informational purposes. All rights and intellectual property remain with the original author. If you are the author and wish to have this article removed, please contact us at [email protected].
