📖 **Source**: TechCrunch
📂 **Category**: AI, Startups, Data Centers, Niv-AI
📌 **What You’ll Learn**:
Electricity is a key raw material for artificial intelligence, but new processors are outpacing data center operators’ ability to manage their relationship with the power grid, forcing them to throttle usage by as much as 30%.
“There’s a lot of wasted power in these AI factories,” Nvidia CEO Jensen Huang said during a keynote at the company’s annual GTC customer conference. “Every unused watt represents a loss of revenue.”
Today, startup Niv-AI emerged from stealth with $12 million in seed funding to tackle this problem: precisely measuring GPU power usage with new sensors and developing tools to manage it more efficiently.
The Tel Aviv-based startup was founded last year by CEO Tomer Temur and CTO Edward Kizis, and is backed by Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. The company declined to share its valuation.
As frontier labs run thousands of GPUs in concert to train and serve advanced models, power demand spikes on millisecond timescales as processors switch between computing and communicating with other GPUs.
These spikes make it difficult for data centers to manage the power they draw from the grid. To avoid being caught short of electricity, data centers either pay for temporary power storage to cover surges or throttle their GPUs. Either approach reduces the return on investment in expensive chips.
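To see why millisecond spikes translate into wasted spend, consider a toy power trace for a rack that alternates between compute bursts and communication lulls. The wattages and phase durations below are invented for illustration, not Niv-AI or Nvidia figures; the point is only that a grid connection sized for the peak sits partly idle on average.

```python
# Illustrative only: a synthetic 1 ms-resolution power trace for one rack.
# COMPUTE_W, COMM_W, and PHASE_MS are assumptions, not measured values.

COMPUTE_W = 700.0   # draw during a compute burst (watts, assumed)
COMM_W = 250.0      # draw while GPUs wait on communication (watts, assumed)
PHASE_MS = 5        # each phase lasts a few milliseconds (assumed)

def synthetic_trace(n_ms: int) -> list[float]:
    """Alternate compute and communication phases every PHASE_MS samples."""
    return [
        COMPUTE_W if (t // PHASE_MS) % 2 == 0 else COMM_W
        for t in range(n_ms)
    ]

trace = synthetic_trace(1000)          # one second of 1 ms samples
peak = max(trace)
mean = sum(trace) / len(trace)
headroom = (peak - mean) / peak        # capacity paid for but unused on average

print(f"peak={peak:.0f} W  mean={mean:.0f} W  wasted headroom={headroom:.0%}")
```

With these assumed numbers the unused headroom works out to roughly 30%, which is the same order as the throttling figure cited above; real workloads have messier profiles, which is exactly what fine-grained sensing is meant to capture.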
“We can’t keep building data centers the way we’re building them now,” said Lior Handelsman, a partner at Grove Ventures and a member of Niv-AI’s board of directors.
The first step in Niv-AI’s roadmap is to understand what’s happening: the company is now deploying rack-level sensors that measure GPU power usage at millisecond resolution on hardware it operates alongside design partners. The goal is to map the power profiles of specific deep learning tasks and develop mitigation techniques that let data centers unlock more of their existing capacity.
From there, the team plans to train an AI model on the data they collect, with the goal of predicting and synchronizing power loads across the data center: a “co-pilot” for data center engineers.
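The article doesn’t describe how such synchronization would work, but one simple version of the idea is a scheduler that staggers job start times so their power bursts don’t stack. The sketch below is purely hypothetical: the greedy algorithm, job shapes, and power cap are all invented for illustration, not Niv-AI’s method.

```python
# Hypothetical sketch of load synchronization: given predicted per-job power
# curves (watts per time slot), pick start offsets so the combined draw
# never exceeds a cap. All shapes and numbers here are assumptions.

def aggregate(jobs: list[tuple[int, list[float]]], horizon: int) -> list[float]:
    """Sum each (start, curve) job's power into a shared timeline."""
    total = [0.0] * horizon
    for start, curve in jobs:
        for i, watts in enumerate(curve):
            if start + i < horizon:
                total[start + i] += watts
    return total

def stagger(curves: list[list[float]], cap: float, horizon: int) -> list[int]:
    """Greedily choose the earliest start for each job that keeps peak <= cap."""
    placed: list[tuple[int, list[float]]] = []
    starts: list[int] = []
    for curve in curves:
        for start in range(horizon):
            trial = aggregate(placed + [(start, curve)], horizon)
            if max(trial) <= cap:
                placed.append((start, curve))
                starts.append(start)
                break
        else:
            raise ValueError("no feasible start under this cap")
    return starts

# Two identical jobs: a short 600 W burst followed by a 200 W tail (assumed).
burst = [600.0] * 3 + [200.0] * 7
starts = stagger([burst, burst], cap=900.0, horizon=20)
print(starts)  # → [0, 3]: the second job waits out the first job's burst
```

Delaying the second job by three slots keeps the combined peak at 800 W instead of 1,200 W, letting both jobs run under a grid connection sized well below the naive worst case.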
Niv-AI expects to have the system operational in a few U.S. data centers within the next six to eight months. It’s an attractive idea as hyperscalers trying to build new data centers face land use and supply chain difficulties. The founders see their final product as the missing “intelligence layer” between data centers and the electrical grid.
“The grid is actually afraid that the data center will consume too much power at a given moment,” Temur told TechCrunch. “The problem we’re looking at has two sides of the same rope. One is trying to help data centers use more GPUs and, hopefully, leverage more of the power they’re already paying for. On the other hand, you can also create more responsible power profiles between the data centers and the grid.”
🕒 **Posted on**: March 18, 2026
