Sunday, February 15, 2026

Photonics and high-speed data movement are the next big AI bottleneck — following copper, power, DRAM, and NAND

The voracious appetite of the generative AI revolution has upended any number of industries in its three-year history. First, it transformed demand for high-end chips, pushing companies like Nvidia to record valuations and straining every part of the manufacturing process to churn out silicon fast enough to meet that need. Then it began to make power grids break and buckle, forcing a rethink of how we deliver energy to data centers. And the data centers themselves are under strain as AI training and inference workloads grow, even driving extra demand for commodities like copper that are integral to their operations.

Those data centers need to respond both to that demand for more capacity and to the challenge of copper shortages, argues Vaysh Kewada, CEO and co-founder of Salience Labs, a silicon-photonics company focused on networking bottlenecks in AI data centers. The ever larger and more intensive AI models continuing to roll out, alongside the shift from chatbots to agentic AI, are pushing the sector towards photonics.

Microsoft data center in Mount Pleasant, Wisconsin

(Image credit: Microsoft)

“We’re targeting the scale up domain of AI data centers, where we’re seeing that they’re increasingly limited by not just the bandwidth, but the latency of predictability, especially as we scale to larger workloads and agentic workloads,” she said in an interview with Tom’s Hardware Premium. For that reason, “there’s a lot of attention at the moment around photonics.”
