Data And Energy Are Building Blocks For AI

In brainstorming the creation of new artificial intelligence infrastructure, some common guidelines and concepts apply.
I’ve seen a lot of these plans developed over the last year, as companies embrace one of the biggest transformational technology shifts ever: really, the only one of its kind. Yes, we had the Internet, the cloud, and Moore’s law shrinking hardware, but we never before had technologies able to pass the Turing test the way they can now.
In any case, when you look at all of the moving parts in any AI project, you come up with two fundamental pieces. One is data: the digital traffic required to interact with a neural network. The second is energy: the power it takes to run these systems.
After all, our central nervous systems run on a certain kind of power, too. The data centers of the AI era will require a lot of juice, so you could think of energy as the physical component complementing data, which is the actual content in the system.
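To make this duality concrete, here’s a quick back-of-envelope sketch in Python. Every figure in it is an illustrative assumption on my part, not a measurement from any particular system, but it shows how data traffic, counted in tokens served, translates into energy drawn from the grid:

```python
# Back-of-envelope: energy cost of serving AI "data."
# All figures are illustrative assumptions, not measurements
# from any specific system.

GPU_POWER_W = 700      # assumed draw of one accelerator under load, watts
PUE = 1.3              # assumed power usage effectiveness (facility overhead)
TOKENS_PER_SEC = 1500  # assumed aggregate serving throughput per GPU

# Facility watts per GPU, including cooling and power-delivery overhead.
facility_watts = GPU_POWER_W * PUE

# Joules per token, then kWh per million tokens served.
joules_per_token = facility_watts / TOKENS_PER_SEC
kwh_per_million_tokens = joules_per_token * 1e6 / 3.6e6

print(f"~{joules_per_token:.2f} J/token, "
      f"~{kwh_per_million_tokens:.2f} kWh per million tokens")
```

Under these assumptions, every million tokens of “data” costs a fraction of a kilowatt-hour; multiply by billions of daily interactions and the energy side of the equation gets large fast.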
With this duality in mind, we can explore how companies are moving the ball forward.
Feeding Data Centers in a Green Way
A recent article on the company Supermicro talks about “a holistic approach to AI focusing not just on launching cutting-edge hardware, but on the entire AI stack, from compute to network architecture and energy efficiency.”
Supermicro is, famously, a contractor for the xAI Colossus project, one of the largest data centers of its kind in the world.
I reported on its colossal build early in the game, as Musk continued to double down on the number of GPUs slotted into the project.
In any case, the liquid cooling systems developed at Supermicro are an example of how to apply energy concepts to the AI space.
Read the article to see how Supermicro has innovated on this build.
“These look like standard servers, but the infrastructure needed to cool and power them at this scale doesn’t exist,” says Johnson Eung, a senior growth products manager in AI supercomputing for Supermicro. “So we’ve worked with customers to create this from the ground up, providing clear parameters to ensure it’s done responsibly, safely, and sustainably.”
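A useful yardstick for this kind of work is PUE, or power usage effectiveness: total facility power divided by the power that actually reaches the IT equipment. The sketch below uses illustrative PUE values that are my own assumptions, not Supermicro’s figures, to show what better cooling buys at scale:

```python
# Rough sketch of why cooling efficiency matters at data-center scale.
# PUE (power usage effectiveness) = total facility power / IT power.
# The values below are illustrative assumptions, not Supermicro figures.

IT_LOAD_MW = 100.0   # assumed IT (server) load for a large AI campus
PUE_AIR = 1.5        # assumed typical air-cooled facility
PUE_LIQUID = 1.15    # assumed direct liquid-cooled facility

for label, pue in [("air-cooled", PUE_AIR), ("liquid-cooled", PUE_LIQUID)]:
    total_mw = IT_LOAD_MW * pue
    overhead_mw = total_mw - IT_LOAD_MW
    print(f"{label:13s}: total {total_mw:.0f} MW, "
          f"cooling/overhead {overhead_mw:.0f} MW")

# Annual energy saved by the lower-PUE design, in GWh.
saved_gwh = IT_LOAD_MW * (PUE_AIR - PUE_LIQUID) * 8760 / 1000
print(f"Savings: ~{saved_gwh:.0f} GWh per year at this load")
```

At these assumed numbers, the lower-PUE design saves on the order of 300 GWh a year for a single large campus, which is why cooling is treated as core infrastructure rather than an afterthought.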
Government Input
The Department of Energy is also weighing in on these processes. Citing the scale of AI growth, agency leaders are making recommendations and evaluating how all of this works in the enterprise.
Here’s a list of stakeholders enumerated by the department for collaboration:
· Hyperscalers: Amazon, Google, Meta, Microsoft, OpenAI
· Data center developers/innovators: Blackstone/QTS Data Centers, Digital Realty, Verrus
· Technology providers: Fervo, General Electric, Hitachi, Intel, HPE, Long Duration Energy Storage Council, Nvidia
· Electricity companies: Associated Electric Cooperative, Constellation, Duke Energy, Evergy, NPPD, NextEra, PPL, Portland General, PSEG, Southern Company/Georgia Power, Vistra
· Independent system operators and regional transmission operators: CAISO, MISO, PJM, SPP
· Environmental NGOs: NRDC
· Researchers: Association for Computing Machinery, Brattle, Caltech, Carnegie Mellon, Department of Energy, EPRI, Johns Hopkins, IEEE, LBNL, MIT Lincoln Lab, NYU, UC-Santa Barbara, University of Chicago
You can read the rest of the report here.
Interviews with Founders and Leaders
I got more insight into this kind of architecture in two IIA interviews in April.
I interviewed Sridhar Ramaswamy of Snowflake about how that company maintains a marketplace for data, with collaborative approaches and cross-cloud systems.
He talked about AI as a new medium of interaction, and discussed product philosophy for chatbots and workflows.
“AI has to be easy to use, it has to be efficient, and it needs to be trustworthy,” Ramaswamy said.
He also discussed customer service models.
“When it comes to investments, we don’t require our customers to sign up for AI as a product, to make commitments to buy a certain amount of AI,” Ramaswamy said. “It’s very much a pay as you go kind of model. And the last point … on trustworthiness is something that we keep hammering on, which is that every software product comes with an intuitive understanding of what is right and what is wrong.”
Another interview was with Chase Lochmiller and Nadav Eiron of Crusoe.
Here’s where we talked about energy in the equation.
Discussing “vertically integrated AI infrastructure” and the fundamental role of AI in business, Lochmiller went back to that idea of data use supported by energy:
“AI infrastructure at scale really leads to this convergence of energy and computing,” he said. “You know, the computing infrastructure to run AI at a really meaningful scale just requires tremendous amount of energy. … I think it’s important to put scale into context, because sometimes, if I just start standing up here and, you know, citing really, really big numbers, they just seem sort of meaningless.”
He explained that the company is contemplating a project in Abilene intended to consume 1.2 gigawatts of power, and mentioned how the firm plans to support that system.
“We’ve taken an energy-first approach, which means actually bringing the demand for computing to areas where we can access low-cost, clean, abundant energy, energy we can effectively produce at scale on a cost-effective basis,” he said. “We can deliver computing infrastructure that is powered both cleaner and cheaper than existing infrastructure in large markets like Virginia. The way we effect that is that, for the first time in history, we’re actually seeing clean energy solutions being the cheaper way to produce energy.”
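To put 1.2 gigawatts into context the way Lochmiller suggests, here’s a rough back-of-envelope, where the per-GPU and per-household figures are my own ballpark assumptions rather than Crusoe’s numbers:

```python
# Putting 1.2 GW into context.
# All per-unit figures below are rough illustrative assumptions.

SITE_POWER_GW = 1.2    # the Abilene figure cited above
GPU_ALL_IN_KW = 1.5    # assumed per-GPU draw incl. cooling and networking
US_HOME_AVG_KW = 1.2   # assumed average US household draw

site_kw = SITE_POWER_GW * 1e6
print(f"~{site_kw / GPU_ALL_IN_KW:,.0f} GPUs supportable")
print(f"~{site_kw / US_HOME_AVG_KW:,.0f} average homes' worth of power")
```

Under these assumptions, a single 1.2-gigawatt site could feed on the order of 800,000 GPUs, roughly the continuous draw of a million average American homes.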
So, powering the right data sets with the right energy sources leads to the kind of equation you want when building the AI data centers of the future.
I’ll keep bringing these sorts of perspectives as we brainstorm the optimization of data centers in 2025 and in the years ahead.