Researchers at Rice University have developed a hybrid system that can increase the amount of electricity recovered from data center waste heat by 60 to 80 percent annually.

The findings are relevant as data centers currently consume hundreds of terawatt-hours of electricity each year, with demand growing alongside the expansion of AI and cloud computing.

The researchers modeled the system’s performance in two US data center hubs and found a 60 percent increase in electricity recovery in Ashburn, Virginia, and an 80 percent increase in Los Angeles.

The system also lowered the projected cost of the recovered electricity. According to the researchers’ models, the cost per unit of electricity dropped by 5.5 percent in Ashburn and 16.5 percent in Los Angeles.
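As a rough illustration of what these percentages imply, the sketch below applies the reported changes to a purely hypothetical baseline facility. The baseline figures (2 GWh of recovered electricity per year and $0.10 per kWh) are assumptions chosen for illustration only and do not come from the study; only the percentage changes are taken from the reported results.

```python
# Illustrative arithmetic only: the baseline figures are hypothetical,
# while the percentage changes are those reported for the study's models.

baseline_recovery_gwh = 2.0   # assumed annual electricity recovered from waste heat (GWh)
baseline_cost_per_kwh = 0.10  # assumed cost of recovered electricity ($/kWh)

scenarios = {
    "Ashburn, VA": {"recovery_gain": 0.60, "cost_drop": 0.055},
    "Los Angeles": {"recovery_gain": 0.80, "cost_drop": 0.165},
}

for city, pct in scenarios.items():
    recovery = baseline_recovery_gwh * (1 + pct["recovery_gain"])
    cost = baseline_cost_per_kwh * (1 - pct["cost_drop"])
    print(f"{city}: ~{recovery:.1f} GWh/yr recovered, ~${cost:.3f}/kWh")
```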

The study also showed that the hybrid system operated at more than 8 percent higher efficiency during peak sunlight hours.
