DeepSeek V4 points to growing use of Huawei chips in AI models
DeepSeek’s V4 AI model may run on Huawei chips.
The move reflects a shift to domestic AI chips and software in China.
China’s push to reduce reliance on US technology is showing up in AI infrastructure.
A recent report by The Information, cited by Reuters, said that DeepSeek’s upcoming V4 model may run on Huawei chips rather than the NVIDIA hardware that still powers most large AI systems today. The shift points to a broader change in AI infrastructure.
According to the report, DeepSeek has been adapting parts of its model to work with Huawei’s Ascend chips. At the same time, large Chinese firms such as Alibaba, ByteDance, and Tencent have placed orders for hundreds of thousands of Huawei chips ahead of the V4 release, according to Reuters. The scale of those orders suggests a wider shift in infrastructure planning.
A move away from NVIDIA dominance
For years, companies building large AI models have depended on GPUs from NVIDIA. Its CUDA software stack and hardware, such as the A100 and H100 chips, have set the standard for training models.
See also: OpenAI flags DeepSeek model copying concerns in AI race
DeepSeek appears to be taking a different route. Reuters reported that DeepSeek withheld early access to V4 from US chipmakers such as AMD and NVIDIA, while giving Chinese suppliers such as Huawei more time to tune their software.
Models built for one chip type do not always run well on another. DeepSeek is said to be rewriting parts of V4’s code so the model can run on Huawei’s chips.
How V4 may compare to current frontier models
DeepSeek has already drawn attention with earlier models such as V3 and R1.
Reuters has reported that DeepSeek claimed it built earlier models at a lower cost than many US rivals. Details on V4 remain limited, but it is reportedly being built on domestic chips.
The key difference is how those results are achieved. US models often rely on large clusters of high-end GPUs, which are expensive and energy-intensive. DeepSeek’s approach focuses more on efficiency per unit of compute.
This does not mean Huawei chips match NVIDIA’s best hardware. Reuters reported in 2025 that Huawei and other Chinese chipmakers had struggled for years to match NVIDIA’s top-end chips for training models. Performance in real-world AI systems depends on hardware, software, and data. By adjusting the model to the hardware, DeepSeek may be narrowing that gap.
Why Huawei chips are central to this shift
Huawei has been developing Ascend AI chips, which are already used in some data centres and AI workloads within China. Export controls from the United States have limited access to advanced NVIDIA chips, which has increased demand for local options.
The reported surge in orders from companies such as Alibaba, ByteDance, and Tencent shows that demand is not limited to one or two projects.
It reflects a move to secure supply chains and reduce reliance on foreign hardware.
Huawei’s role goes beyond hardware. Reuters reported that DeepSeek worked with Huawei and Cambricon as it rewrote parts of V4’s code.
A parallel AI stack is taking shape
These changes suggest that two AI ecosystems may be forming. One is centred on US technology, with NVIDIA hardware and software at its core. The other is forming around Chinese companies, with Huawei chips and local software stacks.
See also: DeepSeek proposes a workaround to train bigger AI models with less powerful chips
DeepSeek’s V4 model may become a key test of how viable that second system is. If the model performs well, it could encourage more companies to follow the same path.
At the same time, this shift could affect how AI costs are structured.
Reuters has linked DeepSeek’s earlier low-cost model claims to investor concerns about the high spending levels of some US AI firms.
What this means for the broader AI race
The race to build better AI models is often framed as a contest of model size or benchmark scores. Control over chips, software, and supply chains can shape what is possible, and at what cost.
DeepSeek’s move to Huawei chips does not settle that race, but it adds a new dimension. It shows that high-level AI work is no longer tied to a single hardware path. Different regions may build their own stacks, each with its own trade-offs.
If V4 delivers strong results, it could mark a turning point. It would show that competitive AI systems can be built outside the NVIDIA ecosystem.
AI News is powered by TechForge Media.