The following is a guest post written by Carol Stimmel. Stimmel is the founder and managing director of Manifest Mind, a clean energy advisory firm.
Have you ever bought a new car and then noticed that nearly everyone on the road seems to drive exactly the same one? Mine is a bright red Jeep Wrangler, and trust me, they are everywhere. This “that’s weird” effect is actually called the Baader-Meinhof Phenomenon, or frequency illusion: something that has recently come to one’s attention suddenly seems to reappear with improbably high (and slightly creepy) frequency. It should not have surprised me, then, as I mused about the possible collapse of the electric power industry, that seemingly every trend aligned to prove my hypothesis: every regulatory shift, every new business model, even the Duck’s Belly. Obviously, my confirmation bias is fully intact.
Here’s the logic of this alarming theory —
When reliable supplies of cheap electricity increase, people have access to an abundance of electrons at very low cost. Electricity as a marketable product therefore becomes less valuable, simply because it is in great supply. Without a shift in current strategies, this creates uncertainty about the future economic value of electricity generation. Further, as rooftop solar becomes ever easier to deploy and cost-effective storage becomes more widely available, there will be little financial value in participating in the macrogrid at all. Under business as usual, we will rapidly approach a state where demand for bulk transmission of electricity diminishes to negligible, and conceivably nonexistent, levels. At that point, there is very little value in linking energy demand and supply to price.
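To put some numbers on that logic (entirely hypothetical ones, chosen only to show the direction of the trend), here is a minimal sketch of how the load the macrogrid still serves, and the revenue attached to it, shrinks as self-generation and storage displace grid purchases:

```python
# Hypothetical illustration only: how net macrogrid demand and revenue shrink as
# behind-the-meter solar and storage penetration grows. Every figure is invented.

TOTAL_DEMAND_GWH = 100.0      # annual demand of an imaginary service territory
PRICE_PER_KWH = 0.12          # what the grid charges per delivered kWh (assumed)

def net_grid_demand(penetration: float) -> float:
    """GWh the macrogrid still serves when `penetration` (0..1) of load
    is met by rooftop solar and behind-the-meter storage."""
    return TOTAL_DEMAND_GWH * (1.0 - penetration)

for penetration in (0.0, 0.25, 0.50, 0.75, 0.95):
    remaining = net_grid_demand(penetration)
    revenue = remaining * 1e6 * PRICE_PER_KWH   # 1 GWh = 1,000,000 kWh
    print(f"penetration {penetration:4.0%} -> grid serves {remaining:5.1f} GWh, "
          f"revenue ~${revenue:,.0f}")
```

And because the grid’s fixed costs do not shrink along with the load, the squeeze on the business model is worse than this linear picture suggests.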
That’s how I arrived at my somewhat dramatic conclusion of utter collapse — collapse, that is, except to the extent that the U.S. taxpayer is willing to bail the industry out or prop it up through extended subsidy. As you might imagine, this theory has been difficult to discuss. While many people in the industry fully acknowledge its possibility, they have a day job they want to keep. Frankly, we’ve been dancing around this reality in the industry for years. Few seem willing to acknowledge this scenario, at least publicly (okay, I talked about it in the very first chapter of my book about the smart grid, but I didn’t make it up). I admit I have felt a little like the rejected Billy Mitchell, who predicted that one day Pearl Harbor would be attacked. But here is what I believe to be the truth of the matter — there is plenty of evidence that, without major policy shifts coupled with rapid technological change, the failure of the utility business model is completely predictable. Mounting activity in the market shows that this reality is already emerging. Vendors are announcing aggressive strategies that show they are ready to make a full end-run around utilities, including new infrastructure, internet-based energy products, and services that reach the energy customer through the back door. Need I point any further than recent developments in the oil and gas industry? There we see a frantic rolling back of production, trying to shut off the faucet until the sink can drain. Advances in fracking, especially, helped put an end to “peak oil,” just as rooftop solar is doing to the “energy crisis.”
The following SolarCity chart is brilliant marketing and an easy case to make to the energy consumer. So easy that no one at SolarCity even felt compelled to scale the graph. It’s brilliant because it capitalizes on the fact that electricity is fungible. If the electrons are carried to the house on solar panels strapped to the back of a donkey, no one will care — as long as the power is there all the time, it’s reliable, it’s of good quality, and there is someone who can fix it when it breaks. In fact, the economics are so powerful in some jurisdictions that thousands of customers are backlogged awaiting interconnection.
I asked for help to dispel my nightmare vision. So, the team at Manifest Mind discussed this problem with each other, analysts, solutions providers, policy wonks, an economist, and even the press. Mostly, we’re told that the transactive framework is the answer; maybe it is. There are some very seasoned proponents of this framework developed by the GridWise Architecture Council, and we certainly aren’t prepared to discard their years of experience and vision. Yet, the Council invites us to participate in the process, asking readers to review and debate the material to “understand whether this Transactive Energy Framework is realizing its objectives and how it can be refined to become an instrument of value.” In that spirit, we wish to raise a few questions about the model’s assumptions in the hopes of furthering insight:
- The framework assumes that the utility will be able to access and control, through economic signaling, a massive system of distributed assets that does not exist today. That is not just an operational change; it also binds the financial elements of the system to discrete behaviors in the physical grid. The undertaking seems impossible to effect in the time frame that current-day realities demand (to say nothing of the policy changes required), and it depends on near-total consumer acceptance, which is even more daunting.
- Given the examples of possible uses of this new system, such as “prices-to-device,” the framework seems to be inherently based on optimizing the value of the electricity itself that is delivered to an end use (how much is it worth to you?). If the value of the electron continues to drop, as theorized, the center won’t hold. Is it too extreme to say that this sounds like a shakedown?
- And the really troubling one? As the energy network grows (more generating units and more nodes demanding electricity) and generation sources become pervasive, the complexity of the interactions will increase in exactly the same manner as in any distributed computing environment; the sketch after this list gives a sense of how quickly that complexity compounds. It is not clear that the framework fully accounts for this issue, or for the alternative possibility of a mass customer exodus in which the grid becomes little more than an emergency backup source for prosumers. In the best case (for the utility), the complexity itself becomes the limiting factor to scale, potentially crushing the system under its own success. In the worst case, the system is simply redundant.
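To give a feel for how quickly that complexity compounds, here is a small sketch of my own (it is not part of the GridWise framework) counting the potential pairwise interactions in a fully meshed transactive system: with N participants, the node-to-node relationships grow on the order of N(N-1)/2, the same quadratic blow-up that plagues any fully meshed distributed computing environment.

```python
# Illustration only: potential pairwise interactions among N transacting nodes.
# A fully meshed market of prosumers grows quadratically; any workable framework
# has to prune, aggregate, or abstract these relationships to stay tractable.

def pairwise_interactions(nodes: int) -> int:
    """Number of distinct node-to-node relationships in a fully meshed system."""
    return nodes * (nodes - 1) // 2

for nodes in (10, 100, 1_000, 10_000, 100_000):
    print(f"{nodes:>7} nodes -> {pairwise_interactions(nodes):>14,} potential interactions")
```

Aggregation and hierarchy can tame that growth, but only if the framework is explicit about where the aggregation happens, which is precisely the question being raised.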
In a system that requires always-on reliability but must adapt to new operational and economic realities, it is important that the industry and government quickly work to conceive and implement new grid management strategies. Unfortunately, there are some severe limitations in the familiar command-and-control approach that should not be overlooked. There are ways to guarantee reliability under a different paradigm: If utilities instead work to deploy a system of management that focuses on optimizing interactions instead of transactions, it could scale without external intervention. An interaction network (enabled by autonomic computing theory) focuses on various aspects and conditions (or features) within the ecosystem rather than transactional parameters and rules. It does this not by managing a network of nodes, but by ensuring successful relationships between all nodes based on goals. Such a system provides the ability to achieve both operational and economic success at a price point that is acceptable to both customers and the grid operators themselves. At its core, an autonomic approach stabilizes the energy delivery system in much the same way as a financial exchange handles the policy of transactions, but not the instruments of the transaction (cash or checks), which are instead handled intrinsically by the system.
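As a rough illustration of the distinction being drawn here, and nothing more, the sketch below manages the relationships between nodes against stated goals rather than pricing individual transactions. The node names, goals, and thresholds are all hypothetical; this is not a specification of any autonomic platform.

```python
# Hypothetical sketch: goal-based management of relationships between grid nodes,
# rather than per-transaction price optimization. All names and numbers are invented.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    supply_kw: float   # generation the node can offer right now
    demand_kw: float   # load the node needs served right now

@dataclass
class Relationship:
    supplier: Node
    consumer: Node
    coverage_goal: float  # fraction of the consumer's demand this link should cover

def goal_satisfied(rel: Relationship) -> bool:
    """A relationship 'succeeds' when the supplier can cover the share of the
    consumer's demand that the goal assigns to it."""
    required_kw = rel.consumer.demand_kw * rel.coverage_goal
    return rel.supplier.supply_kw >= required_kw

rooftop = Node("rooftop_array", supply_kw=4.0, demand_kw=0.0)
feeder = Node("feeder_substation", supply_kw=500.0, demand_kw=0.0)
home = Node("residential_load", supply_kw=0.0, demand_kw=6.0)

relationships = [
    Relationship(rooftop, home, coverage_goal=0.5),  # self-supply half the load
    Relationship(feeder, home, coverage_goal=0.5),   # the grid backstops the rest
]

for rel in relationships:
    status = "meets goal" if goal_satisfied(rel) else "needs adjustment"
    print(f"{rel.supplier.name} -> {rel.consumer.name}: {status}")
```

The unit of management is the relationship and whether it is meeting its goal; the instruments of the exchange are left to be settled intrinsically by the system, just as the financial-exchange analogy suggests.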
At Manifest Mind, the team has been working on a methodology for such a system that would be a useful extension to what has been proposed by GridWise. We describe this extension (with apologies to my marketing friends) as “an autonomic system that addresses the emergent problems of high-penetration DERs, including those that impact grid reliability, flexibility, and market constructs.” In short, we are developing a model to monitor sensor data, analyze generation sources for availability, use subroutines and logic to plan for their most economic use, and then effect that plan based on knowledge of the current state of the overall ecosystem. In that manner, it adapts to the needs of the grid and its participants despite unanticipated shifts in generation and demand.
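That monitor-analyze-plan-execute cycle is the familiar heart of autonomic computing, and a bare-bones version of it looks something like the sketch below. To be clear, this is not the Manifest Mind model itself; the function names, the static readings, and the simplistic merit-order logic are placeholders for the telemetry, forecasting, and planning subroutines a real implementation would use.

```python
# Bare-bones monitor-analyze-plan-execute loop in the spirit of autonomic computing.
# Placeholder logic only; a real system would plug live telemetry, forecasting, and
# dispatch engines into each step.

from typing import Dict, List

def monitor() -> Dict[str, float]:
    """Collect current sensor readings (stubbed with static values here)."""
    return {"solar_kw": 320.0, "wind_kw": 110.0, "gas_kw": 600.0, "load_kw": 740.0}

def analyze(readings: Dict[str, float]) -> List[str]:
    """Rank the generation sources that are currently available."""
    merit_order = ["solar_kw", "wind_kw", "gas_kw"]  # cheapest first (an assumption)
    return [source for source in merit_order if readings[source] > 0.0]

def plan(readings: Dict[str, float], sources: List[str]) -> Dict[str, float]:
    """Allocate load to the cheapest available sources until demand is met."""
    remaining = readings["load_kw"]
    dispatch: Dict[str, float] = {}
    for source in sources:
        take = min(readings[source], remaining)
        dispatch[source] = take
        remaining -= take
        if remaining <= 0.0:
            break
    return dispatch

def execute(dispatch: Dict[str, float]) -> None:
    """Effect the plan (here, just report it)."""
    for source, kw in dispatch.items():
        print(f"dispatch {kw:6.1f} kW from {source}")

if __name__ == "__main__":
    # A real controller would run this cycle continuously against live data.
    readings = monitor()
    available = analyze(readings)
    execute(plan(readings, available))
```

Because each pass re-plans against whatever state the monitor reports, the loop absorbs unanticipated shifts in generation and demand rather than following a fixed schedule.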
There is a scenario in which, as energy customers exit the grid under the auspices of self-generation and community solar projects, the sustainable growth of the macrogrid falters, operational difficulties increase, and utility insolvency becomes a reality. The simple fact is that if a customer can acquire electricity reliably and at low cost from their own or other sources of generation, then the business of electricity will necessarily become less about a commodity product and more about managing the services related to that product. This can only be achieved if the grid operator is able to see and manage the system of relationships among consumption nodes, dispersed generating units, and distribution in a manner that lets them provide meaningful, value-added services. This concept is not completely new to the industry, but it is unfortunately not fully embraced, as the concepts defined in the transactive energy framework seem to indicate. The future operation of the grid must move beyond a sole focus on moving electrons in favor of managing to the goals and purposes of the distributed energy network for the benefit of society. To do otherwise is to believe that the value of the grid will continue to be extracted in the same way as it has for decades, which may be the perpetuation of a dangerous illusion in exchange for the comfort of business-as-usual.