Why Optical Infrastructure Is Becoming Core To The Future Of AI

Posted by Kolawole Samuel Adebayo, Contributor


The AI boom has a power problem, and it’s not just about electricity. As generative models balloon in size and data centers race to keep up, the conversation has mostly centered on compute: More GPUs, clusters and chips. But the deeper you look, the clearer it becomes that the real constraint isn’t how fast machines think but how fast data can move between them at the infrastructure layer.

Feeding AI models with high-dimensional data at hyperscale demands infrastructure that can move terabits per second with minimal loss and minimal power draw. Copper-based interconnects — the default wiring of the internet age — are beginning to buckle under that load. That’s why infrastructure builders and investors are placing new bets on silicon photonics: A fiber-based alternative that uses light instead of electricity to move data faster, cooler and more efficiently.

On Tuesday, Teramount, an Israeli silicon photonics startup, announced it had raised $50 million in Series A funding with backing from AMD Ventures, Samsung Catalyst Fund, Koch Disruptive Technologies and Hitachi Ventures. Its proposition is simple but consequential: Make it easier to connect chips using photons instead of electrons and, in doing so, keep AI’s physical systems from becoming its limiting factor.

The Big Problem

Copper has held up surprisingly well for decades. But in today’s AI clusters — where GPUs number in the thousands and training runs can span weeks — traditional wiring starts to look less like infrastructure and more like a bottleneck. Power leakage, thermal output and bandwidth ceilings all compound as systems scale.

Silicon photonics offers a way out. By sending light signals through fiber, these systems reduce energy use, cut heat, and massively expand data throughput. Teramount’s approach centers on detachable, fiber-to-chip connectors, optimized for co-packaged optics — a design architecture that integrates optical components directly with compute silicon.

According to Yole Group, the co-packaged optics market is expected to reach $2.1 billion by 2028, with the broader silicon photonics market projected to grow to $9.65 billion by 2030 — nearly four times its 2023 size.

Tech heavyweights like Nvidia, Intel, AMD and Broadcom are already building toward this future. But serviceability and deployment at scale remain open questions. That’s where companies like Teramount are carving out space, not just by building fast interconnects, but by making them field-ready.

Why This Matters

The timing of Teramount’s raise is no accident. The cost of AI, both in dollars and watts, has been rising consistently over the last two years.

The International Energy Agency estimates that total data center electricity use could jump to 1,000 terawatt-hours by 2026, almost double current levels, driven largely by generative AI systems. For context, that’s roughly the same electricity demand as Japan.

Meanwhile, a recent Reuters Breakingviews column argued that the AI boom is just as much about infrastructure as it is about algorithms. It projects that global data center investments could top $3.7 trillion in the coming years — underscoring the urgent need to slash power consumption while boosting bandwidth.

And here’s the kicker: Most of that power isn’t going into compute. It’s going into moving data — between processors, racks, storage arrays, and memory pools. Without faster, lower-power interconnects, AI’s scale advantage turns into an energy liability.

It’s easy to think the future of AI is all about powerful models that can write code, draft documents, or analyze images. But behind all those smart tools are the physical systems that move data around, manage heat and keep everything running smoothly. If that foundation isn’t strong, the entire system starts to crack.

However, silicon photonics isn’t a plug-and-play solution. It still needs new standards, better packaging techniques and more advanced manufacturing before it becomes mainstream. That’s why it’s taken years to move from lab experiments to real-world products.

But that’s starting to change. Companies like Meta, Microsoft, and Amazon are already using photonic connections in some of their newest AI systems — doing so quietly but with growing consistency.

And when investors start backing the tough, unglamorous parts of AI, like silicon photonics, it’s a strong sign the industry sees this as essential, not optional.

The Takeaway

Teramount’s raise reflects a larger push across the AI infrastructure world, and a bet that infrastructure could be the biggest decider of the winners in this era. Those winners won’t just be the companies that train the smartest models, but the ones that build systems able to handle them at scale, at speed, and without overheating the grid.

“If AI is to evolve from a marketing buzzword into a lasting engine of innovation, its success will depend on infrastructure that’s as intelligent and efficient as the algorithms it supports,” said Teramount CEO Hesham Taha. “In short, the future of AI hinges on rethinking the wiring that holds it all together.”



Forbes
