Nexthop AI has closed an oversubscribed $500 million Series B round, vaulting the AI networking startup to a $4.2 billion valuation and underscoring investor appetite for the infrastructure layer that makes large-scale AI training possible.
The Funding Round
The round was led by Lightspeed Venture Partners, with Andreessen Horowitz joining as a major new investor alongside Altimeter and all existing backers. The oversubscription reflects strong demand — Nexthop reportedly turned away additional capital as interest outstripped the original target.
Founded in 2024 by Anshul Sadana, former chief operating officer of networking giant Arista Networks, Nexthop AI has moved quickly from stealth to market. The company's thesis is straightforward: as AI clusters scale to tens of thousands of GPUs, the network connecting them becomes the critical bottleneck, and legacy enterprise networking gear is not up to the task.
New Products
Alongside the funding, Nexthop launched two significant products. Its new AI data center switching platforms are purpose-built for the traffic patterns of distributed model training, where GPUs must exchange gradient updates at extremely low latency. The company also unveiled a "Disaggregated Spine" architecture aimed at hyperscale and NeoCloud operators, allowing them to scale network capacity independently of compute expansion.
The disaggregated approach is notable because it lets data center operators upgrade networking without ripping out existing infrastructure — a practical consideration as facilities race to keep pace with the explosive growth of AI workloads.
Why Networking Is the Next AI Bottleneck
Most attention in the AI infrastructure space has focused on chips — Nvidia's GPUs, AMD's accelerators, and custom silicon from Google and Amazon. But as training clusters grow beyond 100,000 GPUs, the network fabric becomes the limiting factor. Data must flow between processors at speeds measured in terabits per second, and any congestion or latency directly impacts training efficiency.
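A rough back-of-envelope calculation illustrates the scale of the problem. In data-parallel training, gradients are typically synchronized with a ring all-reduce, where each GPU sends and receives roughly 2(N-1)/N times the gradient payload per step. The figures below (model size, precision, GPU count, link speed) are illustrative assumptions, not numbers from Nexthop or this article:

```python
# Back-of-envelope: per-GPU network traffic for one ring all-reduce step.
# All figures are illustrative assumptions: a 70B-parameter model with
# 16-bit gradients, synchronized across 1,024 GPUs over 400 Gb/s links.
params = 70e9            # model parameters (assumed)
bytes_per_grad = 2       # fp16/bf16 gradient (assumed)
n_gpus = 1024            # cluster size (assumed)

grad_bytes = params * bytes_per_grad
# Ring all-reduce: each GPU transfers ~2*(N-1)/N of the full payload.
per_gpu_traffic = 2 * (n_gpus - 1) / n_gpus * grad_bytes

link_gbps = 400          # assumed per-GPU network link, in Gb/s
seconds = per_gpu_traffic * 8 / (link_gbps * 1e9)

print(f"{per_gpu_traffic / 1e9:.0f} GB moved per GPU per sync step")
print(f"~{seconds:.2f} s at {link_gbps} Gb/s, ignoring overlap and sharding")
```

Even under these simplified assumptions, a single full-model synchronization moves hundreds of gigabytes per GPU, which is why real systems lean on gradient sharding, compression, and compute-communication overlap, and why any congestion in the fabric shows up directly in step time.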
Nexthop is positioning itself as the company that solves this problem. Its hardware is designed from the ground up for the all-to-all communication patterns that dominate AI workloads, rather than the north-south traffic flows that traditional data center switches were built to handle.
Market Context
The raise comes at a time when AI infrastructure spending is surging. Gartner recently forecast worldwide AI spending will reach $2.5 trillion in 2026, with a significant share directed at data center buildouts. For Nexthop, the opportunity is to become the default networking vendor for the new wave of AI-first data centers being constructed by hyperscalers, sovereign AI initiatives, and well-funded startups alike.
With $500 million in fresh capital and a product lineup targeting the fastest-growing segment of enterprise infrastructure, Nexthop AI is betting that the next great platform company in AI will not make chips — it will connect them.