Christopher Hailstone is a veteran of the energy sector, recognized for his ability to navigate the complex intersection of utility finance and grid resilience. As California faces escalating climate risks and shifting energy demands, his insights into regulatory reform and infrastructure modernization provide a vital roadmap for the industry’s stability. In this discussion, we explore the financial stakes of wildfire mitigation, the surge of data center interest, and the delicate balance between aggressive expansion and fiscal discipline in an era of high borrowing costs.
The conversation centers on the critical need for legislative reform to protect investor confidence and the massive $73-billion five-year capital plan currently at play. We examine the multi-billion dollar liabilities from historical fire events and the rigorous process of cost recovery through state regulators. Additionally, the dialogue covers the technical shift toward undergrounding thousands of miles of lines and the strategic vetting of data center projects to ensure they provide a net benefit to existing ratepayers.
Legislation for wildfire reform is currently under debate to ensure investor stability. How would failing to pass a comprehensive package by late summer impact a utility’s long-term infrastructure investment strategy, and what specific regulatory changes are most critical for securing institutional pension and retirement funds?
The ticking clock toward the end of August creates a palpable tension for any utility trying to manage a massive $73-billion five-year capital plan. If the California legislature fails to pass an acceptable reform package by that deadline, it forces a complete re-evaluation of where every dollar is allocated, potentially stalling life-saving infrastructure projects. We are looking for a holistic package that goes beyond simple utility fixes; it needs to include changes to state building codes to harden entire communities against the threat of fire. At the end of the day, institutional investors need to feel a sense of absolute stability before they are willing to commit their clients’ pension and retirement funds to California’s grid. Without these protections, the cost of capital rises, and the financial foundation required to modernize the grid begins to crumble under the weight of uncertainty.
Financial liabilities from past fire events can reach billions of dollars, often exceeding available insurance coverage. When seeking recovery for billions in mitigation and catastrophic losses, what challenges arise during the regulatory approval process, and how can utilities bridge the gap between recorded liabilities and recoverable amounts?
When you look at the sheer scale of the liabilities—$1.325 billion for the 2019 Kincade fire and a staggering $2.15 billion for the 2021 Dixie fire—the weight on the balance sheet is incredibly heavy. The primary challenge in the regulatory process is proving that every dollar spent was “just and reasonable,” a standard that can leave significant gaps between what a utility spends and what the commission allows it to recover. For instance, a utility might request $2.18 billion for recovery only to see the commission approve $1.9 billion, leaving a shortfall that must be addressed through rehearings or internal cost-cutting. Utilities often have to lean on mechanisms like the California Wildfire Fund (from which $674 million had been drawn as of late 2025), but even those funds come with the risk of required reimbursement if regulators find the claims lacking. Bridging that gap requires meticulous recording of every mitigation action to ensure that shareholders aren’t left holding the bag for catastrophic events that exceeded all applicable insurance policies.
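The arithmetic behind that recovery gap can be sketched in a few lines. This is purely an illustration using the figures cited above; the variable names are hypothetical and this is not an official accounting model.

```python
# Illustrative sketch of the cost-recovery gap described in the discussion.
# Figures come from the interview; everything else is an assumption.

requested_recovery = 2.18e9    # sought from the commission, in dollars
approved_recovery = 1.90e9     # deemed "just and reasonable"
wildfire_fund_draws = 0.674e9  # California Wildfire Fund draws to date

shortfall = requested_recovery - approved_recovery
print(f"Unrecovered shortfall: ${shortfall / 1e9:.2f}B")

# Fund draws can be clawed back if regulators later find claims lacking,
# so a worst-case exposure layers the draws on top of the shortfall.
worst_case = shortfall + wildfire_fund_draws
print(f"Worst-case exposure: ${worst_case / 1e9:.2f}B")
```

The point of the sketch is that the shortfall is not the whole story: fund draws carry contingent reimbursement risk, so they belong in any worst-case view of the balance sheet.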
Undergrounding thousands of miles of power lines is a massive undertaking intended to reduce ignitions. Beyond physical hardening, how do high-definition AI cameras and weather stations integrate into daily operations, and what specific metrics are used to prove these investments are effectively reducing the risk of large-scale fires?
There is a certain peace of mind that comes with knowing 1,240 miles of lines are already safely buried, a move that has already yielded over $100 million in cumulative avoided costs. However, physical hardening is only half the battle; we’ve integrated over 1,600 weather stations and 700 high-definition cameras with AI capability to create a real-time digital shield over the landscape. These AI systems act as silent sentinels, scanning for smoke and heat signatures that the human eye might miss, allowing for rapid dispatch before a spark becomes a crisis. The ultimate metric for success is the “10-acre fire”—we are specifically tracking and reporting a significant reduction in ignitions that lead to fires burning more than 10 acres. By focusing 80% of a $73 billion spending plan on transmission and distribution, we are seeing the direct correlation between these high-tech tools and a safer, more reliable network.
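The “10-acre fire” KPI described above reduces to a simple threshold rate. The function name and data shape below are assumptions for illustration, not a reporting standard.

```python
# A minimal sketch of the "10-acre fire" metric: the share of tracked
# ignitions that grow past the 10-acre threshold cited in the discussion.

def large_fire_rate(ignition_acreages: list) -> float:
    """Fraction of tracked ignitions that burned more than 10 acres."""
    THRESHOLD_ACRES = 10.0
    if not ignition_acreages:
        return 0.0
    large = sum(1 for acres in ignition_acreages if acres > THRESHOLD_ACRES)
    return large / len(ignition_acreages)

# Example: four ignitions, one of which exceeded 10 acres.
print(large_fire_rate([0.5, 2.0, 3.0, 45.0]))  # 0.25
```

Tracking this rate year over year is what lets a utility claim that cameras, weather stations, and undergrounding are actually reducing large-fire risk rather than just ignition counts.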
Large load demand from data centers is projected to grow significantly by 2030, yet many projects drop out during the engineering phase. How do you distinguish viable developments from speculative ones, and what specific criteria must a project meet to ensure this new load actually reduces rates for existing customers?
The drop in the overall data center pipeline from 7.3 GW to 5.4 GW tells a story of rigorous vetting and the harsh reality of engineering challenges. To distinguish viable projects, we focus on those that have successfully transitioned into the final engineering phase, which currently accounts for about 4.6 GW of our potential load. We are strictly committed to only adding load that is definitively rate-reducing, meaning the new revenue must outweigh the cost of service to provide a net benefit to all customers. Projects must survive a comprehensive cluster study and prove they can integrate into the grid without requiring traditional, expensive upgrades that would burden the average homeowner. This selective approach ensures that the 1.8 GW we expect to have online by 2030 is composed of stable, high-value partners rather than speculative ventures that might vanish mid-construction.
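The “rate-reducing” screen described above boils down to a net-benefit comparison. This is a hedged sketch: the function and the dollar figures are hypothetical, standing in for the detailed cost-of-service studies a real cluster review would use.

```python
# Hypothetical sketch of the rate-reducing screen: a new large load
# qualifies only if the revenue it brings in exceeds the incremental
# cost of serving it, leaving a net benefit for existing customers.

def is_rate_reducing(annual_revenue_m: float, annual_cost_to_serve_m: float) -> bool:
    """True if the new load produces a net benefit (figures in $M/year)."""
    return annual_revenue_m > annual_cost_to_serve_m

print(is_rate_reducing(120.0, 90.0))  # revenue exceeds cost: passes the screen
print(is_rate_reducing(80.0, 90.0))   # would burden existing ratepayers
```

In practice the cost side includes any grid upgrades the project triggers, which is why projects that demand traditional, expensive upgrades fail the screen even with substantial revenue.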
With tens of gigawatts of new capacity being added to the grid, the focus often shifts to resource utilization. How do you balance the need for aggressive grid expansion with the financial necessity of maintaining high credit ratings and avoiding additional debt while borrowing costs remain elevated?
We have already successfully integrated 33 GW of capacity into the California grid, with another 22 GW under contract for the next four years, but this growth must be tempered by fiscal reality. Because borrowing costs are high and stock prices are relatively low, we have to be incredibly disciplined; any new project we add to our $73 billion plan essentially means something else must be removed. Maintaining a high credit rating is non-negotiable because it dictates our ability to fund future hardening and expansion without spiraling into unmanageable debt. We are currently leaning into the fact that our grid is underutilized during certain periods due to low air conditioning demand, which allows us to add new load without always needing to build new, expensive generation. It is a delicate “one-in, one-out” strategy that prioritizes the financial health of the company while still meeting the state’s aggressive energy goals.
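The “one-in, one-out” discipline can be sketched as a capped-budget rule. Only the $73 billion total comes from the discussion; the project names, costs, and drop priorities below are invented for illustration.

```python
# Hypothetical sketch of "one-in, one-out": a new project enters a capped
# capital plan only if lower-priority spend comes out, so the total never
# exceeds the cap. Figures are in billions of dollars.

CAP_B = 73.0  # five-year capital plan cap (from the discussion)

def add_project(plan: dict, name: str, cost_b: float, drop_order: list) -> dict:
    """Add a project, removing items in drop_order until the plan fits."""
    plan = dict(plan)
    plan[name] = cost_b
    queue = list(drop_order)
    while sum(plan.values()) > CAP_B and queue:
        plan.pop(queue.pop(0), None)  # drop the next lowest-priority item
    if sum(plan.values()) > CAP_B:
        raise ValueError("project does not fit within the capital cap")
    return plan

plan = {"undergrounding": 40.0, "distribution_hardening": 20.0, "discretionary": 13.0}
plan = add_project(plan, "new_transmission", 10.0, drop_order=["discretionary"])
print(plan)  # discretionary spend dropped to keep the plan under $73B
```

The design choice worth noting is the hard failure when nothing droppable remains: under elevated borrowing costs, the alternative of simply raising the cap with new debt is exactly what the strategy rules out.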
Developers are increasingly looking for sites outside traditional high-tech hubs to support their power needs. What infrastructure challenges do these rural or non-traditional locations present, and what steps are required to convince large-scale developers that the grid has the capacity to support their business expansion?
The shift away from the saturated Bay Area into more rural or non-traditional locations presents a unique set of logistical hurdles, primarily regarding existing transmission capacity and local infrastructure readiness. Convincing a developer who thinks California is “tapped out” requires showing them the hard data from our new cluster study, which has already attracted over 10 GW of pre-application interest. We have to actively market the fact that we have 33 GW of existing capacity and are adding more, debunking the myth that the state cannot support large-scale industrial growth. By providing clear timelines for preliminary engineering and demonstrating how our underutilized grid can handle high-density loads, we are effectively signaling that California is open for business. It’s a process of education as much as it is engineering, proving that the infrastructure exists if you know where to look.
What is your forecast for wildfire-resilient grid development?
My forecast is that we will see an era of unprecedented physical and digital hardening, characterized by the completion of 11,000 miles of hardened lines by 2037. This will be supported by a steady rhythm of rate adjustments, likely ranging between 2.8% and 5% annually through 2030, to fund these essential safety upgrades. As the results of the 2027 general rate case become clear, we will see a utility model that is far less reactive to fire seasons and far more proactive in its technological deployment. The integration of AI and massive undergrounding efforts will eventually transition from “emergency mitigation” to the standard operating procedure for any utility operating in a high-risk climate. Ultimately, the grid of the future will be defined by its ability to disappear—either underground or behind a wall of automated sensors—effectively decoupling the delivery of power from the risk of catastrophe.
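The cumulative effect of that 2.8% to 5% annual band is worth making explicit, since compounding pushes the five-year total above a naive multiple of the headline rate. This is a sketch of the arithmetic only, not a rate forecast; the five-year horizon is an assumption based on the “through 2030” framing above.

```python
# Illustrative compounding of the 2.8%-5% annual rate-adjustment band
# over five years. Arithmetic only, not a forecast.

LOW, HIGH, YEARS = 0.028, 0.05, 5

cumulative_low = (1 + LOW) ** YEARS - 1
cumulative_high = (1 + HIGH) ** YEARS - 1
print(f"Cumulative increase: {cumulative_low:.1%} to {cumulative_high:.1%}")
# At the top of the band, the compounded five-year effect exceeds
# five times the annual headline (about 27.6% vs. a naive 25%).
```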
