The AI infrastructure market just got a clear signal: the CPU layer is back in play.
SiFive announced a $400 million oversubscribed Series G round at a $3.65 billion valuation, with participation from Nvidia and a heavyweight investor list. In a market obsessed with GPUs, that kind of conviction around a CPU IP company is not routine.
The reason is straightforward. SiFive’s bet is RISC-V, an open-standard architecture that gives chipmakers and hyperscalers more room to customize than traditional x86 or licensed ARM designs. Instead of selling finished chips, SiFive licenses CPU and accelerator IP — effectively selling flexibility and time-to-market to companies building AI systems at scale.
That model matters now because AI data centers are becoming heterogeneous by necessity. Running large models in production is no longer just about inference throughput; orchestration, memory movement, and power efficiency are now board-level concerns. That creates space for alternatives that can be tuned for specific workloads instead of forced into one-size-fits-all roadmaps.
SiFive is positioning its IP for modern data center requirements, including compatibility with the major software ecosystems and interconnect standards. If it executes, RISC-V will hold a far stronger competitive position than it did even a short time ago.
Why it matters
This is bigger than one funding round. It reflects a structural shift in AI infrastructure strategy: buyers want optionality, not lock-in. As cloud providers and enterprise AI teams chase lower cost per token and tighter control over hardware roadmaps, open CPU architectures become strategically valuable.
GPUs still dominate headlines, but the CPU layer is quietly becoming a new battleground for AI advantage.
Sources: TechCrunch, SiFive Press Room.