Just like your kitchen, your regulatory business needs the occasional new tool.
So Sous Vide Me.
Time spent in the kitchen is very rewarding, but it’s easy for the home cook to become complacent. One critical way to maintain interest and excitement is to explore the use of wildly new kitchen tools, such as a sous vide cooker, which can take the home cook on an exploration to learn whole new techniques, recipes, tricks and hacks.
AI — Now Well Beyond Search
The energy industry, along with many others, is at that inflection point with the new and rapidly evolving suite of AI tools. Like a sous vide device, these tools are novel, but even a little exposure to them reveals their enormous power.
My favorite use case is to give a generative AI tool a list of stuff in the cupboard and a command to come up with a dish, like a salad dressing, based on those ingredients.
These tools create value because public large language models, like ChatGPT, Bard and Copilot, are trained on the mass of knowledge already contained in internet-available resources, knowledge that is otherwise hard to analyse. The tools decompose the human question, craft a query that gets to an answer more quickly than general search can, and compose complex answers.
In this way, the tools can generate reliable and useful answers to questions that cannot even be asked with best-in-class search tools. This is search on steroids: strapped to a rocket, travelling at light speed, and navigating itself to the destination.
But is that all that there is to it? Can these tools move beyond reasoned search to reasoning?
Energy Industry Meets AI
Many working domains of the energy sector, particularly the capital and regulatory functions, are well situated to work with these tools, thanks to several features of the sector.
A Rich Data Environment
Like internet resources, regulatory data is highly fragmented, complex, and text heavy. Any operating asset, such as a pipeline, power plant, or transmission line, is covered by a rich blanket of on-going rulings, briefs, dockets, statements of intent, rate applications, waivers, rate reviews, adjudications, complaints, new compliance requirements, and reviews.
Specialized Terminology
The procedural rules and regulations, compliance methods, timetables, texts, documents, applications and forms are written by technical professionals (engineers and lawyers) to be legally sound, unbiased, and resistant to misinterpretation; the result, however, is arcane and hard to grasp.
Specialist Preparers
Preparing the actual submissions and documents for an action, or reacting to a third party's submission, has been the domain of equally highly skilled and knowledgeable professionals across a matching range of disciplines. Those tasked with reviewing, analysing, challenging, and approving submissions are equally matched in capability.
Enormous Scale
The submissions themselves, along with all the supporting documents, briefs, correspondence, datasets, and calculations, can run to tens or even hundreds of thousands of pages of content.
The volume of content generated vastly exceeds the capacity of any one organization to fully absorb within a single jurisdiction, let alone across multiple jurisdictions, geographies, asset types, and energy classes.
Dynamic Content
To add to the challenge, the energy industry capital landscape, and its corresponding regulatory environment, are not static. As new infrastructure is built, as new energy products like hydrogen are adopted, or as demand for existing products wanes, the rules evolve and adapt, the economic fundamentals that once underpinned specific investments are threatened, and new opportunities emerge.
Over time, the submissions, from new greenfield investments through to the oldest of brownfield assets, gather an equally rich and constantly growing set of comments, feedback, challenges, notices and more, from a huge range of stakeholders (customers, land owners, tribal communities, competitors, intervenors, associations, and activists).
Policy Changes
The public policy landscape, particularly related to energy transition drivers, brings its own pressures. For example, the provisions of the Inflation Reduction Act (IRA) are adding significant new strain on regulatory processes by unlocking a huge inflow of money and submissions for new infrastructure. New energy products are coming quickly (carbon, hydrogen, geothermal, renewables), impacting cross-commodity supply and demand dynamics and thereby price. Incumbent energy transporters and shippers react to market forces with rate adjustments, incentives and fees, service changes, complaints, and more.
This huge and growing body of regulatory content represents a very valuable new resource for those involved in the energy sector, in much the same way that the internet proved to be key to training the large language models to create generative AI tools.
The AI Prize
As I see it, there is an ample prize at the intersection of a highly attractive, data-rich regulatory landscape, the high business value locked in that content, regulatory processes that are currently slow, fragmented and unpredictable, and clever new tools that decouple expertise, the work to be done, and the available time.
There are two leverage points for energy market participants in applying these new tools.
Business Value Capture
The first is to extract that business value: the value already exists, can be applied to existing brownfield energy assets as well as to new greenfield investments, and is not contingent on third parties.
An example of business value is a review and analysis of the protests, feedback, and comments received by a regulator for a given asset development, to determine the sentiment, narratives, sources, and relationships between commenters. Such analysis can help a proponent assess the receptivity to a similar proposal or concept, and can factor into feasibility assessments as well as planning for permits and siting.
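To make the idea concrete, here is a minimal sketch of a comment-sentiment tally in Python. The keyword lexicons and sample comments are hypothetical placeholders; a real analysis would use a trained language model rather than keyword matching, but the shape of the output, a sentiment profile per docket, is the same.

```python
from collections import Counter

# Hypothetical keyword lexicons, purely for illustration.
# A production system would use a trained sentiment model.
SUPPORT_TERMS = {"support", "benefit", "jobs", "reliable"}
OPPOSE_TERMS = {"oppose", "protest", "harm", "risk", "contamination"}

def classify_comment(text: str) -> str:
    """Label a docket comment as supportive, opposed, or neutral."""
    words = set(text.lower().split())
    support = len(words & SUPPORT_TERMS)
    oppose = len(words & OPPOSE_TERMS)
    if support > oppose:
        return "supportive"
    if oppose > support:
        return "opposed"
    return "neutral"

def sentiment_profile(comments: list[str]) -> Counter:
    """Aggregate sentiment across all comments filed on a docket."""
    return Counter(classify_comment(c) for c in comments)

# Invented example comments for a hypothetical docket.
comments = [
    "We support the pipeline for the jobs it will bring.",
    "I protest this project because the contamination risk is too high.",
    "Please clarify the construction schedule.",
]
profile = sentiment_profile(comments)
```

The profile can then be broken down further by commenter type (land owner, intervenor, association) to surface the narratives and relationships the text above describes.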
Another example would be the analysis of regulatory cost reports, focusing on those assets with a similar profile, for the purposes of improving the accuracy of cost estimates for project development. Carefully vetted benchmark cost data are instrumental in helping a developer realistically assess project economics.
Once the limitations imposed by time and available resources are removed, analysts can then pose an unlimited number and range of queries against the data, in parallel. Most importantly, analysts will be able to explore entirely new lines of enquiry, to develop new lines of reasoning in areas that have simply been out of reach.
Content Creation
The second leverage point is in using these tools to create net new content (documents, briefs, rulings), which would apply mostly to greenfield investments, and may require collaboration with third parties.
Creating the submission documents for any given need (assembling the right data, building the analytics, drafting the copy, creating the indexes, and completing all the other document construction tasks that humans currently do) can be considerably enhanced by these tools. More submissions may be assembled in parallel, and tested to confirm which are likely to be successful.
Preparers will be able to leverage the enormous body of prior rulings, briefs, rate applications, and waivers by using AI tools to sift through the content and retrieve the successful precedents. AI tools may be able to suggest possible angles of argument, useful analytics, and superior presentation approaches based on previously successful proposals.
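As an illustration of that retrieval step, the sketch below ranks prior filings by their textual similarity to a draft application. It uses a crude bag-of-words cosine similarity, and the docket numbers and filing texts are invented for demonstration; production systems would rely on semantic embeddings rather than word counts.

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Crude bag-of-words term counts; real systems would use embeddings."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def most_similar(query: str, filings: dict[str, str], top_n: int = 3) -> list[str]:
    """Rank prior filings by similarity to a draft application."""
    qv = vectorize(query)
    ranked = sorted(filings, key=lambda k: cosine(qv, vectorize(filings[k])),
                    reverse=True)
    return ranked[:top_n]

# Hypothetical prior filings, keyed by invented docket numbers.
filings = {
    "RP-2019-41": "rate increase application for natural gas pipeline transportation",
    "CP-2020-07": "certificate application for new hydrogen storage facility",
}
query = "draft rate increase application for gas pipeline"
```

A preparer could then pull only the top-ranked precedents into an AI tool's context, rather than the entire archive.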
The tools may even compose the questions to be asked. In time, asset owners and regulators will create vast, complex dialogues of questions, executed in the first few minutes after a submission is received. The tools will be able to rank responses for human review. The gains in speed and quality will be impressive, as will the pace of approval or disapproval. Those preparing submissions and applications should prepare accordingly.
Making AI Actually Work
Like the first trial of a sous vide machine, our first attempts to use ChatGPT have been tasty but a bit crude. To use these AI tools on something as mission critical as preparing a complex rate application, or assembling a relevant benchmark dataset, really calls out for adult supervision.
Ingest
The first step will be to ingest the regulatory content, using language models that can interpret that content, to create the relevant datasets. This is not as simple as it sounds, given the volumes and types of sources. It would be impractical for an owner of a single asset, or an owner of multiple but very different assets, to invest here, as the datasets created would have such limited usage. It would be both faster and lower cost to collaborate with a business, such as Arbo, that has already completed such an investment.
Of course, unless the right model is fed the right data, the model is likely to return poor results (or hallucinations). Expert judgement from specialist firms can help assure that the data inputs and outputs are reliable, valuable and relevant.
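A simplified illustration of that ingest step: the sketch below splits a filing into word-bounded chunks while preserving docket and page provenance, so that any AI-generated answer can be traced back to its source document. The structure is an assumption for illustration; real pipelines handle many more document types and metadata fields.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """One retrievable unit of a regulatory filing, with its provenance."""
    docket: str
    page: int
    text: str

def chunk_filing(docket: str, pages: list[str], max_words: int = 200) -> list[Chunk]:
    """Split each page into word-bounded chunks, keeping docket and page
    metadata so answers can always be traced back to a source."""
    chunks = []
    for page_no, page in enumerate(pages, start=1):
        words = page.split()
        for i in range(0, len(words), max_words):
            chunks.append(Chunk(docket, page_no, " ".join(words[i:i + max_words])))
    return chunks
```

Carrying the docket and page numbers through to the model's answers is one practical guard against the hallucination problem described above: an unsourced claim is immediately suspect.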
Query and Analyze
Next, the data sets must be properly queried and manipulated to derive reasoned responses. As with the public generative AI models, the quality of the analysis and outputs very much depends on the nature of the question asked, and how it is presented. Users will need to apply expert judgement in framing any planned analysis, as well as in evaluating the quality of the response. For example, a benchmarking analysis of a specific project against possible peers will require very careful selection of a relevant peer set. A recommendation to a developer to proceed based on that analysis will demand expert reasoning. For the moment, this is the domain of the human analyst.
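The peer-selection caution above can be illustrated with a toy example. The project records, field names, and screening thresholds below are all invented; the point is that the analyst, not the tool, decides which filters make a peer set defensible before any benchmark statistic is trusted.

```python
# Hypothetical peer-screening step for a cost benchmarking analysis.
# Field names and figures are illustrative, not real filing data.
projects = [
    {"name": "Alpha Line", "type": "pipeline", "miles": 120, "cost_m": 480},
    {"name": "Beta Line", "type": "pipeline", "miles": 95, "cost_m": 420},
    {"name": "Gamma Grid", "type": "transmission", "miles": 60, "cost_m": 300},
    {"name": "Delta Line", "type": "pipeline", "miles": 400, "cost_m": 2100},
]

def select_peers(candidates, asset_type, miles, tolerance=0.5):
    """Keep only projects of the same asset type whose length falls
    within +/- tolerance of the subject project's length."""
    lo, hi = miles * (1 - tolerance), miles * (1 + tolerance)
    return [p for p in candidates
            if p["type"] == asset_type and lo <= p["miles"] <= hi]

def median_cost_per_mile(peers):
    """Median unit cost across the screened peer set."""
    costs = sorted(p["cost_m"] / p["miles"] for p in peers)
    mid = len(costs) // 2
    return costs[mid] if len(costs) % 2 else (costs[mid - 1] + costs[mid]) / 2
```

Here a 100-mile pipeline would be benchmarked only against Alpha Line and Beta Line; Delta Line is excluded despite being a pipeline, because a 400-mile project has very different economics. That exclusion is exactly the kind of expert judgement the tools cannot yet supply.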
Adapt for Enterprise
Finally, asset owners and operators will need to adapt their use of analytics, organizational approach, work metrics, and staffing models to reflect how these tools can transform intellectual work.
For a parallel example from engineering, consider this article on making engineering data sweat.
Conclusions
In the same way that a new sous vide appliance can transform the kitchen experience, modern analytic tools offer the same transformative possibilities for regulatory data. However, the stakes are much higher, and getting it wrong can be disastrous. Project risks actually go up, as these tools can impart a false sense of security. Your pro move is to work with a top tier regulatory analytics firm, such as Arbo, that brings data and expertise in energy industry commercial, project, regulatory, policy and legal domains covering complex network asset infrastructure, including power lines and grids, generation facilities, and pipeline systems.
Artwork is by Geoffrey Cann, and cranked out on an iPad using Procreate.
Latest Podcast
In this podcast, I speak with David Reinhart of Visionaize on digital twin technology. Digital twin solutions are rapidly advancing to a point where the model will be running tightly coupled with the actual asset it models, using live data. It’s not a stretch to see how the digital twin can become the day-to-day supervisor of the actual asset. This kind of work even applies to brownfield assets that predate the internet, mobility, cloud computing, and other modern inventions.
Geoffrey Cann writes about, speaks to and teaches the energy industry about digital innovation. For more about Geoffrey Cann, click here.