Lightmatter Raises $400M Series D; Quadruples Valuation to $4.4B as Photonics Leader for Next-Gen AI Data Centers | Kisaco Research


Source: Businesswire

Lightmatter, the leader in photonic supercomputing, announced today it has raised a $400 million Series D, valuing the company at $4.4 billion and bringing the total capital raised to date to $850 million. The round was led by new investors advised by T. Rowe Price Associates, Inc. with participation from existing investors, including Fidelity Management & Research Company and GV (Google Ventures). With this financing, Lightmatter will ready Passage™ for mass deployment in partner data centers, enabling the scaling required for sustained AI innovation.

“We’re not just advancing AI infrastructure—we’re reinventing it,” said Lightmatter co-founder and CEO Nick Harris. “With Passage, the world’s fastest photonic engine, we’re setting a new standard for performance and breaking through the barriers that limit AI computing. This funding accelerates our ability to scale, delivering the supercomputers of tomorrow today.”

As frontier AI models expand and training clusters surpass 100,000 XPUs, traditional electronic interconnects are becoming a critical bottleneck: they cannot keep pace with the high-bandwidth, low-latency data movement that scaling AI workloads demands. Lightmatter's Passage technology addresses this challenge by moving data with 3D-stacked photonic chips, dramatically increasing AI cluster bandwidth and performance while reducing power consumption. Passage, the first photonic engine to deliver I/O in 3D, also frees up XPU shoreline (the chip's edge area traditionally consumed by electrical I/O) to support more memory, addressing another critical bottleneck for scaling AI performance. By transforming data movement across AI clusters, Lightmatter enables systems to scale efficiently, unlocking new levels of performance and preparing computing infrastructure for the demands of next-generation AI models.

Read the full article here.