As renewable energy continues to grow, producers, distributors and customers alike are grappling with the growing pains of increasing variability in the nation's power supply. A key challenge is curtailment: deliberately reducing renewable output below what could have been produced in order to keep supply and demand in balance.
But the huge data centers that do everything from running Google searches to producing your Facebook news feed may offer one solution, researchers say.
In states like California, a rapid buildout of renewables has led to excess supply, especially during peak solar hours. As a result, the California Independent System Operator (CAISO) has often sold excess energy from its grid at a loss rather than undertake costly shutdowns and restarts of natural gas-fired power plants, and renewable generation is often curtailed, leaving less carbon-free power on the grid than would otherwise have been possible.
Seeing the lost opportunities resulting from periods of oversupply, UC-Santa Barbara researchers have proposed one possible solution: migrating data center workloads from geographic areas with low renewable penetration to places with more — especially when the destination is at risk of oversupply — to use more clean energy for data processing. They argue that it would help reduce both curtailment and overall emissions. Nationally, data centers are estimated to account for up to 1.8% of total electricity consumption.
The question is how to better match supply and demand across both space and time, according to Sangwon Suh, professor of industrial ecology at the Bren School of Environmental Science & Management at UC-Santa Barbara and a member of the research team. "When you think about information, it’s at the speed of light, and we already have large-scale infrastructure to transmit information across the continent," Suh told Utility Dive. "This should provide an opportunity."
The researchers zeroed in on the potential for shifting data workloads from Northern Virginia, where many of the nation’s hyperscale data centers are located, to California. PJM, the grid operator serving Northern Virginia, oversees one of the most coal-heavy generation mixes in the country, while California, with some of the highest wind and solar penetrations, suffers the most curtailment on its CAISO-managed grid. Data-load shifts between the regions could be triggered automatically and near-instantly, the UC-Santa Barbara team noted.
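How such an automatic trigger might look is easy to sketch. The snippet below is a minimal, hypothetical illustration rather than anything described in the study: the oversupply feed, the scheduler call, the threshold and the region names are all stand-ins for whatever signals and tooling an operator would actually use.

```python
# Hypothetical sketch of an automatic data-load migration trigger
# (not the study's actual mechanism). The grid-signal feed and the
# job scheduler are stand-ins, represented here by stub functions.

OVERSUPPLY_THRESHOLD_MW = 500.0   # assumed trigger level, for illustration only
MAX_SHIFTABLE_LOAD_MW = 100.0     # assumed flexible data center load, illustration only

def get_caiso_oversupply_mw() -> float:
    """Stub: excess variable renewable energy on the CAISO grid right now (MW)."""
    return 750.0  # made-up sample value

def migrate_batch_jobs(from_region: str, to_region: str, megawatts: float) -> None:
    """Stub: ask the workload scheduler to shift deferrable compute load."""
    print(f"Shifting {megawatts:.0f} MW of flexible load from {from_region} to {to_region}")

def maybe_shift_load() -> None:
    """If CAISO is at risk of curtailment, move flexible jobs toward California."""
    excess_mw = get_caiso_oversupply_mw()
    if excess_mw > OVERSUPPLY_THRESHOLD_MW:
        # Shift no more load than either the oversupply or the flexible capacity allows.
        migrate_batch_jobs("us-east-virginia", "us-west-california",
                           min(excess_mw, MAX_SHIFTABLE_LOAD_MW))

if __name__ == "__main__":
    maybe_shift_load()
```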
The researchers found that migrating loads within the nation's existing data centers "can potentially absorb excess variable renewable energy, reducing both curtailment and GHG emissions at no or negative cost." The data center capacity served by CAISO could cut emissions by 113 to 239 kilotons of carbon dioxide equivalent per year and absorb up to 62% of the excess variable renewable energy, they found.
"Most of the technologies that enable greenhouse gas mitigation take investment and payment," Suh said. "But there are still options that reduce costs while mitigating emissions, and this is one of them."
Tech transparency a key challenge
Scores of tech companies have pledged to green their energy, but the sector is often called "a black box" when it comes to disclosing information about actual energy usage.
Utility Dive reached out to some of the biggest players, including Amazon Web Services, Microsoft Azure, Google Cloud Platform, Oracle, Facebook and Salesforce, about the potential for significant data-load migration. Some, like Facebook and Google, operate their own data centers, while others, like Amazon and Microsoft, provide cloud services to major corporations and so would be shifting loads on behalf of their customers. None agreed to respond on the record.
The data industry's size gives it huge potential for smarter energy use, but its relative newness presents an obstacle to assessing that potential: a lack of transparency, observers say. "Because it’s so new, it doesn’t have the same transparency in terms of its electricity demand or even its geographic footprint compared to other industries," said Arman Shehabi, a research scientist in energy technologies at Lawrence Berkeley National Laboratory. "That’s the irony — the data industry does not provide data."
Software developers and clean-energy advocates like Kerri Devine, engineering leader at Arcadia, a tech platform supporting clean energy, have proposed that software engineers take it upon themselves to help reduce renewable energy curtailment and emissions. Devine calls for engineers to consider whether they can reasonably run processing loads in a service region with higher average renewable energy, and — additionally or alternatively — shift the hours when they run time-agnostic server loads to optimize for climate impact.
But Devine acknowledges that her solution requires working around cloud providers like Amazon, Google and Microsoft, who she says don’t provide real-time tools to make data loads (and thus carbon footprints) more transparent to software engineers. "My argument is in lieu of cloud providers providing those types of tools," she said. "The things we can do that are in our control are designing our systems around the black box that is the providers."
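As a rough illustration of the engineer-side logic Devine describes, the sketch below picks where and when to run a time-agnostic batch job by comparing expected grid carbon intensity across regions and start times. The region names, the sample numbers and the carbon-intensity lookup are assumptions for illustration, not a real cloud-provider API.

```python
# Hypothetical sketch of "carbon-aware" job placement in the spirit of
# Devine's suggestion. The carbon-intensity lookup is a stub with made-up
# numbers; real figures would come from whatever public grid data is available.

SAMPLE_INTENSITY = {  # assumed grams CO2e per kWh, keyed by (region, hours from now)
    ("us-west-1", 0): 210.0, ("us-west-1", 6): 120.0,
    ("us-east-1", 0): 430.0, ("us-east-1", 6): 400.0,
}

def get_carbon_intensity(region: str, hours_ahead: int = 0) -> float:
    """Stub: expected grid carbon intensity for a region, now or N hours out."""
    return SAMPLE_INTENSITY.get((region, hours_ahead), 350.0)

def plan_batch_job(regions: list[str], delay_options: list[int]) -> tuple[str, int]:
    """Pick the (region, delay in hours) pair with the lowest expected carbon intensity."""
    return min(
        ((r, d) for r in regions for d in delay_options),
        key=lambda choice: get_carbon_intensity(choice[0], choice[1]),
    )

# Example: a time-agnostic job that can run now or six hours from now.
region, delay = plan_batch_job(["us-west-1", "us-east-1"], [0, 6])
print(f"Run in {region}, delayed {delay} hours")
```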
Data-load migration is a smart potential solution for curtailment, said Angela Chuang, principal technical leader in customer technologies at the Electric Power Research Institute (EPRI). EPRI highlighted the possibility specifically, along with other potential renewable-power uses to address curtailment such as refrigerated warehouses, in a 2017 study. But Chuang called for multiple large data centers to prove data-load migration’s viability by participating in a well-publicized demonstration, which she believes would reassure fellow tech companies.
One challenge might be that data is moving across different regions, which have different data protection laws, Chuang said. "You’ve got to meet user expectations that you’re routing within [U.S.] jurisdictions that users would agree to."
Citing the reliable performance of IT equipment as a key concern, she said "the risks in doing this need to be addressed." Data companies want to avoid job process delays, ensure responsive Web browsing experiences for users and avoid overheating servers, she said. "We want to see industry movement among data centers," she said. "We need to examine the awareness, the possibility, the feasibility, and the economics [of data-load migration]. Many people don’t want to go first, but they’re willing to go second."
Another logistical challenge to undertaking data-load migration would be changing the way data centers pay for electricity, according to Josh Novacheck, electricity system research engineer at the National Renewable Energy Laboratory (NREL). Shifting loads would require that data centers either be able to access the wholesale electricity market, or reach an agreement with a local utility, he said.
"Some utilities are starting to introduce time-of-use pricing or peak-hour pricing, but it’s not always coupled to what’s happening in real-time with wind and solar," Novacheck said. "So you would have to directly couple those two things — which I certainly think is feasible."
Other solutions
Some argue that renewable energy curtailment isn’t as big a problem as it might seem, calling it a missed opportunity or a near-term issue that will correct itself in the long run but that can be addressed in the meantime in a number of ways well beyond data-load migration.
Better transmission is one solution. The proportion of renewable power in total installed U.S. capacity is expected to double from 15% in 2018 to 30% by 2030, reaching 442.8 GW, according to GlobalData. With some of the highest-producing wind and solar resources located far from demand centers, that growth will require new or upgraded transmission lines to reach customers.
Transmitting that power across distances and so-called seams, something NREL has studied, would require policy shifts, but will be necessary, Novacheck said. "If we’re transitioning from 10% wind and solar today to 30% over the next few decades, some transmission is going to be required."
Other hopes are pinned on advances in battery storage, which would allow renewable energy from high-production periods to be used later, when demand is higher. Short-term storage could address the "California duck curve," particularly allowing solar produced during the day to be used just a few hours later, starting around sunset; long-duration storage could hold the power for days or weeks.
Another option is producing green hydrogen with a wind- or solar-powered electrolyzer that splits water (H2O) to make hydrogen (H2) gas. The process makes renewable hydrogen (RH2) gas more expensive than the wind or solar used to create it, but it also holds advantages that are attracting growing attention. Among them: Green hydrogen can generate zero-emissions electricity in turbines or fuel cells, be stored at higher energy densities and lower weights than batteries for long-duration storage, and be used in high-heat industrial processes.
Researchers — including Suh of the UC-Santa Barbara study — called for a mix of solutions.
"Do we store the energy or come up with a demand-response system? It’s going to be both, and data centers are one opportunity there," Shehabi said. "I anticipate the best technologies will rise to the top, whether it’s batteries or hydrogen or something we’re not thinking of yet."
Both Chuang and Novacheck called for more research into demand-response solutions — and ways to tap into that from both the technical and policy sides, including time-of-use pricing. "We need to learn how to incentivize demand flexibility," Novacheck said. "Money talks, and there’s money on the table — we just have to make sure everyone knows how to get to that table."