Demand-side management (DSM) has long been a tool for electric system management. In short, when power demands start mounting (think of a hot summer afternoon as air conditioners kick on), the call goes out to registered major users to reduce their demand (from turning off lights to shutting down production lines). What once required extensive negotiation and pre-arrangement to set up, and phone calls to on-site staff to execute, an increasingly ‘smart’ grid and the IoT (internet of things) now make possible at near the speed of light: shaving demand to save money (for multiple parties) and reduce brownout/blackout risks.
The reverse side of DSM has been price arbitrage: move usage to when power is cheap and thus flatten demand. This could range from homes running dishwashers in the middle of the night if they are on a “time of use” rate structure to large users making ice in off-peak hours for use in cooling when demand is higher. Thus, time shifting is double DSM: decreasing peak demand while increasing off-peak demand.
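To make the arbitrage concrete, here is a minimal sketch in Python (the rates and dishwasher load are hypothetical, not any actual utility tariff) of what shifting a single flexible load to off-peak hours saves:

```python
# Illustrative only: hypothetical time-of-use rates, not a real tariff.
PEAK_RATE = 0.30      # $/kWh, e.g., late afternoon
OFF_PEAK_RATE = 0.10  # $/kWh, e.g., overnight

def cycle_cost(energy_kwh: float, rate: float) -> float:
    """Cost of running a flexible load (e.g., a dishwasher cycle) at a given rate."""
    return energy_kwh * rate

dishwasher_kwh = 1.5  # assumed energy per cycle
peak = cycle_cost(dishwasher_kwh, PEAK_RATE)
off_peak = cycle_cost(dishwasher_kwh, OFF_PEAK_RATE)
print(f"Peak: ${peak:.2f}, off-peak: ${off_peak:.2f}, savings: ${peak - off_peak:.2f}")
```

Multiply that small per-cycle saving across millions of households (or one very large industrial ice-making plant) and the demand-flattening value becomes real money.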
Google is now taking a leap forward with (what we can call) double DSM (squared) to both clean up its data centers’ electricity use and lower its operating costs. The basic concept:
- Google will track electricity prices and carbon footprints in real time
- It will shift compute demand to server farms that can leverage lower-cost, lower-carbon electrons created when high renewable production exceeds demand.
Now, this will first be a time-shifting exercise done at each Google facility:
our hyperscale (meaning very large) data centers [will] shift the timing of many compute tasks to when low-carbon power sources, like wind and solar, are most plentiful. … Shifting the timing of non-urgent compute tasks—like creating new filter features on Google Photos, YouTube video processing, or adding new words to Google Translate—helps reduce the electrical grid’s carbon footprint, getting us closer to 24×7 carbon-free energy.
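In scheduler terms, the idea in that quote is straightforward: rank the hours of the day by forecast grid carbon intensity and place deferrable work in the cleanest ones. The sketch below is illustrative only (Google has not published its algorithm; the forecast numbers are made up):

```python
from typing import Dict, List

def schedule_flexible_tasks(
    carbon_forecast: Dict[int, float],  # hour of day -> forecast gCO2/kWh (assumed input)
    task_hours_needed: int,             # hours of deferrable compute to place
) -> List[int]:
    """Pick the hours with the lowest forecast carbon intensity for deferrable work.

    A greedy illustrative sketch, not Google's actual carbon-intelligent platform.
    """
    ranked = sorted(carbon_forecast, key=carbon_forecast.get)
    return sorted(ranked[:task_hours_needed])

# Hypothetical day-ahead forecast: solar pushes intensity down midday, wind overnight.
forecast = {0: 320, 3: 280, 6: 350, 9: 240, 12: 180, 15: 200, 18: 420, 21: 380}
print(schedule_flexible_tasks(forecast, task_hours_needed=3))  # -> [9, 12, 15]
```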
But Google does see a path toward double DSM (squared) power demand shifting — across time and across location.
first version of this carbon-intelligent computing platform focuses on shifting tasks to different times of the day, within the same data center. But, it’s also possible to move flexible compute tasks between different data centers, so that more work is completed when and where doing so is more environmentally friendly. Our plan for the future is to shift load in both time and location to maximize the reduction in grid-level CO2 emissions.
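Extending the same sketch across locations means ranking every (data center, hour) slot rather than just hours at one site. Again, this is a hedged illustration with invented data center names and forecasts; a production system would also weigh capacity, network cost, and latency:

```python
from typing import Dict, List, Tuple

# Hypothetical forecasts per data center (hour -> gCO2/kWh); names are made up.
forecasts: Dict[str, Dict[int, float]] = {
    "dc_iowa":    {9: 300, 12: 150, 15: 170},  # midday solar/wind surplus
    "dc_finland": {9: 120, 12: 140, 15: 260},  # morning wind surplus
}

def place_flexible_work(slots_needed: int) -> List[Tuple[str, int]]:
    """Greedy sketch: rank every (data center, hour) slot by forecast carbon
    intensity and take the cleanest ones."""
    all_slots = [
        (intensity, dc, hour)
        for dc, by_hour in forecasts.items()
        for hour, intensity in by_hour.items()
    ]
    all_slots.sort()
    return [(dc, hour) for _, dc, hour in all_slots[:slots_needed]]

print(place_flexible_work(3))
# -> [('dc_finland', 9), ('dc_finland', 12), ('dc_iowa', 12)]
```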
One of the challenges for renewable energy systems has been “stranded assets”: production that can’t find a viable end use due to constraints in the grid or intermittent production poorly matched with demand. Responses have included storage systems (from pumped-hydro storage to, increasingly, batteries, along with moves toward green hydrogen production). Google’s “carbon-intelligent computing” is an exciting next step in addressing the ‘stranded asset’ challenge and maximizing value creation from clean electrons.
NOTE: This is particularly interesting to see as, for a while, I was an (informal, friendly) advisor to a pre-revenue start-up (that sadly didn’t move to commercialization) with a similar concept: to schedule and shift data center usage around the country to minimize data center electricity costs and computing carbon footprints. That startup had twists in its approach that differ from Google’s but, well, it doesn’t have the $billions, massive(ly competent) staff, and footprint of Google.