For the last three years, Alphabet, Google’s parent company, has been funding a project aimed at making the power grid work effectively with both renewable and legacy power generation sources; in other words, making it suitable to meet 21st-century requirements.
The work is being done by a team within X Lab, a skunkworks operation within Alphabet that focuses on breakthrough technologies with the potential to create disruptive, sustainable, and transformative businesses. Why transformative? Because the projects X Lab works on aim to deliver global solutions to current and anticipated human challenges.
X Lab’s work is aimed at using artificial intelligence (AI) to create a comprehensive visualization of the entire power grid, leading to better planning and more effective operations. Astro Teller, who heads up X Lab, has brought on Audrey Zibelman, former CEO of AEMO, the Australian Energy Market Operator, to lead this particular endeavour. Zibelman brings expertise in decarbonizing the generation and delivery of electricity. But she will also focus on grid stability, something that was sorely lacking when Texas was hit by a polar vortex in February this year, leading to massive power disruptions across the state.
The subject of modernizing the grid is one that is near and dear to my heart. Back when I was doing telecommunications and network consulting in the late 1990s, I began studying how the power grid could become a data transmission medium. I envisioned data streaming along with electricity to everywhere the grid went. Instead of an Internet cable, the power cord would provide not only the energy to run your computer but also all the data you received and transmitted. There were a number of challenges to this concept. Probably the biggest was handling step-down transformers at the last mile: how could data avoid being scrambled or destroyed when it passed, along with the electricity, through a transformer?
The technology I proposed was an AI solution plus centralized network monitoring that could oversee both power and data delivery. It meant deploying sensors containing software neural agents (pieces of AI code) that would establish baseline performance at every critical point throughout the grid and learn all of its behaviours. In time, the neural agents would anticipate when performance was off-nominal and compensate. That was what I conceived of back then. Alas, the technology was still too early in development to meet what I had in mind, but it made me cognizant of what could be done in the future.
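To make that notion concrete, here is a minimal sketch of the kind of baseline-learning agent I had in mind, written with today's tools. The sensor name, window size, and threshold are my own assumptions for illustration, not anything X Lab has published.

```python
from collections import deque
from statistics import mean, stdev

class GridSensorAgent:
    """Toy 'neural agent': learns a baseline for one grid measurement
    (e.g. line frequency or voltage) and flags off-nominal readings."""

    def __init__(self, name, window=500, tolerance=3.0):
        self.name = name                      # e.g. "substation-12/frequency-hz" (hypothetical)
        self.history = deque(maxlen=window)   # rolling window used as the learned baseline
        self.tolerance = tolerance            # how many standard deviations counts as anomalous

    def observe(self, reading):
        """Ingest one reading; return True if it departs from the learned baseline."""
        anomalous = False
        if len(self.history) >= 30:           # wait for enough data before judging
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) > self.tolerance * sigma:
                anomalous = True
        self.history.append(reading)
        return anomalous

# Usage: one agent per critical point on the grid.
agent = GridSensorAgent("substation-12/frequency-hz")
for hz in [60.01, 59.99, 60.02, 60.0] * 20 + [58.7]:   # last reading is off-nominal
    if agent.observe(hz):
        print(f"{agent.name}: reading {hz} is off baseline, compensate or alert")
```

A real deployment would use learned models rather than a rolling average, but the shape of the idea is the same: learn nominal behaviour locally, then react when the grid drifts away from it.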
Enter Alphabet and its X Lab electric grid moonshot headed up by Teller. He writes in an April blog posting, “The electric grid is an engineering marvel — a vast and complex system that keeps our lights on and powers all of the devices that are now essential to modern life. Designed more than a century ago for electricity to flow in one direction from fossil-fueled plants to cities and towns, it wasn’t built for what the modern world is asking of it.”
Why is it incompatible with 21st-century needs?
- It is not set up to manage and direct intermittent power from renewable wind, solar, tidal and wave energy sources without major tweaking.
- It was not designed for a low carbon economy.
- It was not built to take on the energy demands of consumers as they shift to electric vehicles.
- It was not set up to manage surpluses from standalone energy sources, such as rooftop solar, feeding back into it (a toy example of these reversing flows follows this list).
- It has no visual map of how energy flows through it in real time, a map that every operator could view, whether managing a power plant or feeding the system from a rooftop solar array.
- It doesn’t come with a universal, standard set of tools shared by all operators.
- It is not capable of integrating data and energy flows, a hindrance in the age of the Internet of Things (IoT).
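To see why several of those points matter, here is a toy illustration of one neighbourhood feeder where rooftop solar and EV charging flip the direction and steepness of the flow the grid has to handle. All the numbers are invented for illustration.

```python
# Toy hourly figures (my own assumptions) for one neighbourhood feeder, in kW.
household_demand = [320, 300, 340, 420, 520, 610]   # midday through early evening
ev_charging      = [  0,  20,  40,  90, 180, 240]   # EVs plugging in after work
rooftop_solar    = [410, 430, 380, 250,  90,   0]   # solar output fading toward sunset

for hour, (load, ev, solar) in enumerate(
        zip(household_demand, ev_charging, rooftop_solar), start=12):
    net = load + ev - solar   # positive: feeder imports from the grid; negative: it exports
    direction = "importing from grid" if net >= 0 else "EXPORTING to grid"
    print(f"{hour}:00  net {net:+5d} kW  ({direction})")
```

Run it and the feeder swings from exporting power at midday to a steep import ramp at dinnertime. That two-way, fast-changing behaviour is exactly what the original one-way grid never anticipated.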
Teller wants to reimagine the grid by introducing machine learning, advanced computing tools, and AI. He wants to create a virtual grid that would be an accurate representation of the real one, a virtual grid that could be used as a simulated testing ground (I sketch the idea in code after the list below). The questions he raises include:
- Is it possible and useful to create a virtual electric grid so detailed it can understand where all the power is coming from and where it is all going in real time?
- Is it possible to forecast the weather accurately enough to know when and where the sun will be shining and the wind will be blowing, to maximize renewable energy sources at all times?
- Is it possible to create tools to rapidly predict and simulate what might happen on the grid whether in the next few nanoseconds or decades from now?
- Is it possible to make information about the grid, past, present, and possible future, useful to everyone involved in building, planning, updating and managing it?
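I can only guess at what X Lab will actually build, but the core idea of a virtual grid used as a simulated testing ground can be sketched very simply. The sources, capacities, and what-if scenario below are entirely invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    capacity_mw: float      # maximum rated output
    available: float = 1.0  # fraction available right now (weather, outages)

def simulate(sources, demand_mw):
    """Sum what the modelled sources can deliver and compare it against demand."""
    supply = sum(s.capacity_mw * s.available for s in sources)
    return supply, supply - demand_mw

# A toy virtual grid: names and numbers are hypothetical.
grid = [
    Source("gas-plant-a", 500),
    Source("wind-farm-b", 300, available=0.6),       # forecast: 60% of rated output
    Source("rooftop-solar-aggregate", 150, available=0.8),
]

# Base case versus a what-if scenario in which the wind dies down tonight.
supply, margin = simulate(grid, demand_mw=700)
print(f"Base case: {supply:.0f} MW available, margin {margin:+.0f} MW")

grid[1].available = 0.05                              # wind forecast collapses
supply, margin = simulate(grid, demand_mw=700)
print(f"Low-wind case: {supply:.0f} MW available, margin {margin:+.0f} MW")
```

Scale that toy model up to millions of nodes, feed it live sensor data and accurate weather forecasts, and you have something like the tool Teller is describing: a place to test decisions over the next few nanoseconds or the next few decades before committing them to the real grid.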
With the exception of predicting the weather, I believe all of the challenges Teller poses can be answered satisfactorily. And I still dream of a grid that can serve not just as an energy highway but also as an information highway, something I hope the folks at X Lab can try to execute. The latter would mean data sharing the power lines, coming directly to your home through the same power cords you use to plug in all your appliances, computers, lights, and more. Imagine what that could conceivably do for smart homes.