SambaNova Unveils SN50 AI Chip, Secures $350 Million Funding and Intel Partnership

SambaNova Raises $350 Million to Challenge GPUs

New SN50 chip targets cost-efficient AI inference at enterprise scale

2/24/2026
Yassin El Hardouz

SambaNova has unveiled a multi-faceted strategy to redefine the AI infrastructure landscape with its new SN50 chip. The announcement includes a planned strategic collaboration with Intel and a successful $350 million Series E funding round. These developments position the company to challenge GPU dominance by offering a cost-effective solution for the growing demands of agentic AI.


A New Chip for the Agentic AI Era

The newly introduced SN50 AI chip is engineered for the next wave of autonomous AI agents. SambaNova claims performance up to five times faster than competing processors alongside a threefold reduction in total cost of ownership. This efficiency aims to transition AI from an experimental phase into a profitable, production-scale engine for businesses.

At the core of the SN50 is SambaNova's Reconfigurable Data Unit (RDU) architecture, a stark contrast to traditional GPUs. This dataflow approach optimizes the movement of data across the processor, drastically cutting latency and power consumption. The chip's air-cooled design further enhances its efficiency, allowing it to operate within existing data center infrastructure.

Strategic Alliance with Intel

To accelerate market adoption, SambaNova and Intel have entered into a planned multi-year strategic collaboration. The partnership aims to deliver high-performance, cost-efficient AI inference solutions as a compelling alternative to GPU-centric systems. As part of the agreement, Intel plans to make a strategic investment in SambaNova to help scale an Intel-powered AI cloud.

This collaboration will focus on expanding AI cloud capacity and creating integrated infrastructure for heterogeneous data centers. Kevork Kechichian of Intel emphasized the goal of providing customers with more choice and efficient ways to scale AI. Together, the companies aim to unlock a multi-billion-dollar market opportunity in AI inference by combining their respective strengths.

Major Customer Adoption and Funding

Demonstrating immediate market confidence, SoftBank Corp. will be the first customer to deploy the SN50 chip. The technology will be integrated into SoftBank's next-generation AI data centers in Japan. This deployment is set to power low-latency inference services for sovereign and enterprise clients across the Asia-Pacific region.

Hironobu Tamba of SoftBank highlighted the goal of building a world-class AI inference fabric for Japan with superior economics and control. This move deepens the existing relationship between the two companies, positioning SambaNova as the inference backbone for SoftBank's sovereign AI initiatives. It serves as a powerful validation of the SN50's capabilities for large-scale services.

The company's strategic initiatives are supported by an oversubscribed Series E funding round that raised over $350 million. The financing was led by Vista Equity Partners and Cambium Capital, with significant participation from Intel Capital and other new investors. These proceeds will be used to expand SN50 production, scale the SambaCloud platform, and deepen enterprise software integrations.


SambaNova's recent announcements signal a coordinated and aggressive push into the competitive AI hardware market. By combining its innovative SN50 chip, a strategic alliance with industry giant Intel, and substantial new funding, the company is well-equipped for its next growth phase. This integrated approach directly addresses the industry's need for efficient, scalable, and economically viable infrastructure for the burgeoning era of agentic AI.