Nexthop AI Secures $500M in Series B Funding at $4.2B Valuation


The AI infrastructure company also introduced new switching platforms to enhance data center efficiency.

3/15/2026
Othmane Taki

Nexthop AI, a key player in artificial intelligence infrastructure, has successfully closed a $500 million Series B funding round, elevating its valuation to approximately $4.2 billion. Led by Lightspeed Venture Partners, this investment coincides with the launch of new high-performance networking hardware designed for AI workloads. The dual announcement signals strong investor confidence and positions the company to address the escalating demands of modern AI data centers.


Addressing AI Infrastructure Demands

The rapid expansion of AI has created unprecedented challenges for data center networking, where traditional architectures struggle to keep pace. AI clusters require massive, real-time data exchanges between GPUs, a task for which conventional cloud networking is ill-equipped. Nexthop AI's technology is specifically engineered to eliminate these bottlenecks and support the high-volume communication essential for advanced AI models.

This funding arrives as hyperscalers are projected to spend hundreds of billions on AI-related hardware in the coming years. Industry forecasts estimate this spending could reach $650 billion by 2026, creating a substantial market for specialized solutions. The significant investment in Nexthop AI underscores the critical role that foundational networking infrastructure plays in the broader AI ecosystem.

Innovative Hardware and Architecture

In response to these demands, Nexthop AI has introduced a new portfolio of Ethernet switches designed for performance and power efficiency. The lineup includes the low-power NH-4010, the high-density NH-4220, and the NH-5010 deep-buffer switch, all based on Broadcom silicon. These platforms, already shipping to leading hyperscalers, are engineered for seamless migration and rapid deployment within next-generation AI clusters.

A cornerstone of the company's innovation is its new 'Disaggregated Spine' data center architecture. This design strategically separates networking layers into functional tiers to enhance scalability and operational efficiency. Nexthop AI suggests this approach can achieve up to a 30 percent reduction in both energy consumption and infrastructure costs compared to traditional designs.

Commitment to Open Networking

Nexthop AI is also deeply committed to the open networking community, actively contributing to open-source network operating systems such as SONiC and FBOSS. The company has quickly become one of the top 10 global contributors to the SONiC project, a Linux Foundation initiative. This approach gives customers flexibility, allowing them to avoid vendor lock-in and run their preferred network operating systems.

The company's efforts have earned praise from industry leaders, including Dave Maltz of Microsoft's Azure Networking, who commended its contributions and speed of execution. This collaborative strategy, combined with its co-development approach to product design, has been highlighted by analysts as a key differentiator. It positions Nexthop AI to address the efficiency and reliability challenges of future 800G and 1.6T deployments.


With its substantial new funding and a portfolio of innovative hardware, Nexthop AI is strongly positioned to capitalize on the AI infrastructure boom. The company's focus on power efficiency, open standards, and scalable architecture directly addresses the core challenges facing modern data centers. As the industry continues its rapid expansion, Nexthop AI is set to become a foundational pillar in building the high-performance networks that power the future of artificial intelligence.