Nvidia CEO Reveals AI's Power Crisis: Data Centers Waste 30% of Electricity
By admin | Mar 17, 2026 | 2 min read
Electricity is a fundamental input for artificial intelligence, yet new computational workloads are outstripping data center operators' ability to balance their demands on the power grid, forcing them to curtail operations by as much as 30%. "A significant amount of power is wasted in these AI facilities," Nvidia CEO Jensen Huang said in a keynote at the company's annual GTC conference. "Every watt not utilized translates to lost revenue," his presentation emphasized.
Now, the startup Niv-AI has launched publicly with $12 million in seed financing to address this issue. The company aims to accurately monitor GPU power consumption through novel sensors and create tools for more efficient power management. Founded last year in Tel Aviv by CEO Tomer Timor and CTO Edward Kizis, Niv-AI is supported by investors including Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. The startup has not disclosed its valuation.
When advanced research labs coordinate thousands of GPUs to train and operate sophisticated AI models, rapid, millisecond-level spikes in power demand occur as processors alternate between computing tasks and inter-GPU communication. These surges complicate data centers’ efforts to regulate power drawn from the grid. To prevent shortages, data centers either invest in temporary energy storage to buffer surges or limit GPU usage—both approaches diminish returns on costly hardware investments.
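The economics behind that tradeoff can be sketched in a few lines: grid capacity must be provisioned for the peak of those millisecond spikes, while revenue tracks the average draw. The sketch below is purely illustrative, with hypothetical wattages and phase lengths, and is not a description of any vendor's hardware.

```python
# Illustrative sketch: why bursty GPU power draw forces over-provisioning.
# Wattages and phase lengths below are hypothetical, not measured values.

def gpu_power_trace(n_steps, compute_w=700.0, comm_w=150.0, phase_ms=5):
    """Model a GPU alternating between compute bursts and communication
    lulls, one sample per millisecond."""
    trace = []
    for step in range(n_steps):
        in_compute = (step // phase_ms) % 2 == 0
        trace.append(compute_w if in_compute else comm_w)
    return trace

def provisioning_gap(trace):
    """Grid capacity must cover the peak, but revenue tracks the average."""
    peak = max(trace)
    avg = sum(trace) / len(trace)
    return peak, avg, 1.0 - avg / peak

trace = gpu_power_trace(1000)
peak, avg, unused = provisioning_gap(trace)
print(f"peak={peak:.0f} W  avg={avg:.0f} W  unused headroom={unused:.0%}")
# For this evenly alternating toy trace, roughly 39% of provisioned
# capacity sits idle on average.
```

Buffering the surges with batteries shrinks that gap; capping GPU clocks shrinks the peak but also the useful work, which is the dilemma the article describes.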
“Current data center construction methods are unsustainable,” remarked Lior Handelsman, a partner at Grove Ventures and a Niv board member.
Niv’s initial focus is on diagnostics: the company is installing rack-level sensors that monitor GPU power usage at millisecond precision on its own hardware and with design partners. The objective is to analyze the distinct power patterns of various deep learning workloads and devise strategies that enable data centers to maximize their existing infrastructure.
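A "power signature" of this kind can be thought of as a handful of features summarizing a millisecond-resolution trace. The sketch below is an assumption about what such diagnostics might compute; the feature names and thresholds are invented for illustration and are not Niv's actual methodology.

```python
# Hypothetical power-signature features from a millisecond-resolution
# rack trace. Features and the 90%-of-peak threshold are illustrative
# assumptions, not a real product's diagnostics.

def power_signature(samples_w):
    """Summarize a power trace (watts per millisecond) into simple features."""
    avg = sum(samples_w) / len(samples_w)
    peak = max(samples_w)
    # Duty cycle: fraction of time spent near peak draw. Training jobs
    # with tight compute/communication alternation would score lower
    # than steady-state inference serving.
    near_peak = sum(1 for w in samples_w if w > 0.9 * peak)
    return {
        "avg_w": avg,
        "peak_w": peak,
        "duty_cycle": near_peak / len(samples_w),
    }

# Synthetic bursty trace: 5 ms at 700 W, then 5 ms at 150 W, repeated.
trace = ([700.0] * 5 + [150.0] * 5) * 100
print(power_signature(trace))
```

Comparing such signatures across workloads is one plausible way to decide which jobs can share a rack's provisioned power budget without their peaks coinciding.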
Ultimately, the team plans to develop an AI model using the collected data, training it to forecast and coordinate power loads throughout the data center—acting as a “copilot” for facility engineers. Niv anticipates deploying a functional system in several U.S. data centers within six to eight months.
This solution is particularly timely as large-scale cloud providers encounter challenges in expanding data centers due to land-use constraints and supply chain disruptions. The founders envision their product as an essential “intelligence layer” bridging data centers and the power grid.
“We’re tackling a dual-sided challenge,” they explained. “First, to assist data centers in activating more GPUs and better leveraging the power they already procure. Simultaneously, we aim to establish more sustainable power interactions between data centers and the grid.”