Co-authored by Wes Geisenberger (VP, Sustainability & ESG), Rob Allen (SVP, Ecosystem Acceleration), Jonathan Rackoff (VP, Head of Global Policy), Paul Madsen (Head of Identity), Hania Othman (Director Sustainable Impact Europe/Africa), and Mariquita de Boissiere.
In Part I of the SIF’s five-article investment thesis series, we showcased the real-world climate actions being delivered on the ground by projects building on Hedera’s carbon-neutral, enterprise-grade public DLT network – specifically, how web3 is being leveraged to bring unparalleled levels of auditability to the historically opaque domain of climate finance. In this piece, the second of our series, we dig into how the Guardian, a powerful Hedera-native Policy Workflow Engine (PWE) application funded and closely supported by the SIF, is catalyzing an unprecedented effort to digitize and open-source the environmental methodologies used to create nature-based assets from natural capital inputs. Join us as we explore what the Guardian and other DLT innovations mean for raising climate ambition through accelerated learning and collaboration.
The mission of The HBAR Foundation’s Sustainable Impact Fund (SIF) is to leverage advances in distributed ledger technology (DLT) to bring the balance of the planet to the public ledger. At the heart of this project is our belief in the necessity of forging a single, evidence-based source of truth on climate, one that is grounded in open and transparent data that is accessible to and discoverable by all, comprehensively auditable, and immutably recorded. Corporate interests, government regulators, environmental activists, citizens of both rich and poor countries – all are entitled to their own separate policy viewpoints. But we must forge consensus on the informational underpinnings of those views.
Humanity has developed an addiction to patterns of extractive and consumptive resource use that, like most addictions, cannot be sustained forever without progressively worsening harm. This is neither judgment nor condescension. We at the SIF drive cars, fly in planes, live and work in buildings made from concrete, and heat our homes with gas. We are technology optimists who believe in the power of innovation, not advocates for the wholesale abandonment of modern life. But the rapidly increasing scale and scope of climate-fueled severe weather events are a wake-up call. We can no longer dismiss the social and ecological costs of natural resource use and environmental damage as externalities. Methodologies now exist that, when applied in conjunction with rapidly advancing DLT-based tools, finally allow us to calculate and disseminate trusted information about these externalities – and to do so on a truly global scale. What has for so long been omitted from countries’ and corporations’ accounting ledgers can now be seen, weighed, and measured by all.
This is a core part of the SIF’s mission: to give the global economy the necessary tools to perform searching inventories of itself in the climate and environmental contexts. The real costs to society of GHG emissions and other forms of ecological damage, both direct and indirect, must be identifiable, calculable, and attributable. Conversely, it must be possible to assign economic value to the global commons, from ecosystems, to carbon sinks, to biodiversity, to all of the natural processes that sustain our collective welfare, as well as the labor of indigenous communities on the ground whose efforts can preserve and protect these resources.
Making climate finance auditable is the first step in achieving this vision. Yes, removing barriers that prevent the public from “following the money” is key to holding financial actors to account. This is how we ensure that investments reach the projects around the world driving real, quantifiable emissions reductions. But true auditability, as discussed in our last piece, is about more than balance sheets and the flow of finance. It requires consensus on processes and procedures, on roles and rulesets, and on the math and the methodologies. A tall order, but one that is for the first time within reach, thanks to newly developed web3 tooling from SIF grantees that enables account-to-account traceability of tokens, visibility into the actions and data submissions tied to each role, and insight into the specific data each actor uses to create assets.
Currently, the process from submission to the approval of a project methodology is slow, often taking several years and delaying implementation. It is also error-prone and expensive, with costs running into the tens or even hundreds of thousands of dollars for project developers. With the Voluntary Carbon Market (VCM) surpassing the two billion dollar mark in 2022 and estimated to increase one hundredfold between now and 2050, both the quality of project methodologies and the speed with which they are vetted and approved must step up by orders of magnitude.
Hedera’s Guardian makes it possible to keep pace with these demands. By lowering barriers to entry for project developers and compressing the timeframes for processing methodologies, the Guardian gives projects a means to incorporate richer, broader, and more complex data sets than previously possible. It also paves the way for unfettered collaboration, learning, and iteration, creating the basis for empirical knowledge structures that can match problems as complex and urgent as those that define the climate crisis.
Disintermediating Access to Markets
Fundamentally, the auditability of any specific project is made possible through the timestamped, immutable capture and quantification of real-world events on-chain. This process is sustained via a suite of digital Monitoring, Reporting and Verification (dMRV) tools, as well as through verifiable credentials (VCs) and decentralized identifiers (DIDs) that tie all roles and actors together with the flow of data and assets across one single chain of trust, or ‘trust chain’.
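The ‘trust chain’ idea can be illustrated as a hash-linked sequence of credential records, each issued by a DID-identified actor and pointing back at the record before it. The sketch below is a minimal illustration of that linking pattern, not the Guardian’s actual data model; the DIDs, roles, and field names are hypothetical.

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """Deterministic hash of a credential record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append_credential(chain: list, issuer_did: str, claim: dict) -> list:
    """Append a credential, linking it to the hash of the previous record."""
    prev = record_hash(chain[-1]) if chain else None
    chain.append({"issuer": issuer_did, "claim": claim, "prev": prev})
    return chain

def verify_chain(chain: list) -> bool:
    """Walk the chain and confirm every back-link matches its predecessor."""
    return all(
        chain[i]["prev"] == record_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

# Hypothetical roles tied together across one chain of trust.
chain: list = []
append_credential(chain, "did:example:registry", {"role": "Standard Registry"})
append_credential(chain, "did:example:developer", {"role": "Project Developer", "report": "Q1 monitoring"})
append_credential(chain, "did:example:verifier", {"role": "Verifier", "approved": True})
print(verify_chain(chain))  # True
```

Because each record commits to the hash of the one before it, altering any earlier submission breaks every subsequent link, which is what makes the full history auditable end to end.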
If traditional, “boots on the ground” Monitoring, Reporting and Verification formed the backbone of legacy environmental asset creation, dMRV underpins its digitization. Minting tokens on open-source public distributed ledgers makes the intangible – namely, ecological benefit claims associated with carbon emissions reductions (or other ecosystem processes) – tangible. Drawing on a range of Internet of Things (IoT) devices, remote sensor and satellite data, machine learning, AI, and third-party oracles, dMRV establishes granular visibility, down to the metric ton of carbon dioxide equivalent (mtCO2e). This data can complement manual data where automation is not possible or desired. Ultimately, however, the methodology used determines what kind of data is required and how reporting is conducted. Once captured, this data is stored immutably, in real time, on the public ledger, ensuring that assets are fully traceable and trackable from origin to retirement.
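To make the mtCO2e unit concrete, here is a minimal sketch of how raw sensor readings for different greenhouse gases might be aggregated into a single CO2-equivalent figure. The gas list and reading values are invented for illustration; the GWP-100 factors shown are the IPCC AR5 values (CO2 = 1, CH4 = 28, N2O = 265).

```python
def to_mtco2e(readings, gwp):
    """Aggregate per-gas readings (in metric tonnes) into tonnes of
    CO2-equivalent using Global Warming Potential (GWP) factors."""
    return sum(amount * gwp[gas] for gas, amount in readings)

# Hypothetical (gas, tonnes) readings collected from IoT sensors.
readings = [("co2", 120.0), ("ch4", 2.5), ("n2o", 0.1)]
# GWP-100 factors from IPCC AR5.
gwp = {"co2": 1.0, "ch4": 28.0, "n2o": 265.0}

print(round(to_mtco2e(readings, gwp), 1))  # 216.5
```

A real methodology would specify exactly which gases, baselines, and factors apply; the point is only that the unit of account is computed, not asserted.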
When it comes to methodologies, ‘digitization’ allows policies to be encoded into standardized workflows. As a consequence, it becomes easy to reference and compare the underlying methodologies and Quality Standards that inform a given asset – a function that is particularly valuable for increasing investor confidence in the context of otherwise intangible ecological assets.
Furthermore, not only are the ‘rules’ that determine a methodology fully auditable on the public ledger – down to each assigned role, actor, or function – but those functions are also automated. In other words, tokenization of a real-world asset requires no human intervention and can occur only if all criteria set out in the policy workflow are fulfilled according to the code underlying the smart contract.
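The “mint only if every criterion is satisfied” gate can be sketched as follows. The criteria names and thresholds here are hypothetical stand-ins for whatever a given policy workflow actually encodes; the point is that minting is a pure function of the submitted data and the policy, with no manual override path.

```python
def can_mint(submission: dict, policy: dict) -> bool:
    """A token may be minted only if every criterion set out in the
    policy workflow is satisfied -- all checks, or no mint."""
    checks = [
        submission.get("verifier_approved") is True,   # verifier signed off
        bool(submission.get("mrv_records")),           # dMRV data is present
        submission.get("mtco2e", 0) >= policy["min_mtco2e"],  # meets tonnage floor
    ]
    return all(checks)

policy = {"min_mtco2e": 1.0}  # hypothetical policy threshold
ok = {"verifier_approved": True, "mrv_records": ["r1"], "mtco2e": 42.0}
bad = {"verifier_approved": False, "mrv_records": ["r1"], "mtco2e": 42.0}

print(can_mint(ok, policy), can_mint(bad, policy))  # True False
```

In the Guardian these rules live in the policy itself, so auditing the policy is equivalent to auditing every asset it ever mints.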
The bottlenecks and barriers mentioned earlier mean that, while the VCM has roots dating back to the late 1990s, the number and scope of project methodologies remain limited. Verra, one of the two largest carbon registries in the world, has amassed a total of forty-nine project methodologies since its launch in 2007. At this rate, and under current (analogue) conditions, the prospect of creating workflows that identify, create, and retire high-quality carbon credits or forwards from a diverse range of sources, contexts, and geographies – before the door closes definitively on a 1.5-degree world – appears remote.
Dismantling the proprietary interests that have kept methodologies in the possession of a small concentration of gatekeeping organizations involves leveraging simultaneous processes of convergence and divergence: on the one hand, standardizing project methodologies for seamless communication between stakeholders while, on the other, running policy workflows on open-source software (OSS) to optimize for collaboration over time.
Closing the Information Gap to Close the Finance Gap
Together with the InterWork Alliance (IWA) from the Global Blockchain Business Council (GBBC), Hedera contributed earlier this year to the creation of Token Taxonomy standards designed to encourage interoperability and cooperation across the multitude of industries and sectors engaged with carbon markets. This taxonomy informs the way in which project methodologies, each with their corresponding roles and functions, are mapped out in the Guardian. It is this mapping, also known as schema or common data modeling, that forms the basis for the unprecedented degree of insight DLT affords into and across project value chains; and that ultimately makes it possible to hold market actors accountable. For project developers, many of whom are based in the Global South, such accountability is key to ensuring that they are fairly compensated for their emissions-saving labor. With IWA standards as a foundation to build upon, open-source projects allow developers, from wherever they are based in the world, to access methodologies and adapt them into policies that reflect their unique context.
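To give a feel for what schema or common data modeling means in practice, here is a minimal sketch of validating a monitoring report against a shared data model. The field names are illustrative only, loosely in the spirit of IWA-style common data models rather than copied from any published taxonomy (VM0007 is a real Verra methodology identifier, used here purely as sample data).

```python
# Hypothetical common data model for a monitoring report.
REPORT_SCHEMA = {
    "project_id": str,
    "period": str,
    "mtco2e": float,
    "methodology": str,
}

def validate(report: dict, schema: dict) -> list:
    """Return a list of problems; an empty list means the report conforms."""
    problems = [f"missing field: {k}" for k in schema if k not in report]
    problems += [
        f"wrong type for {k}: expected {t.__name__}"
        for k, t in schema.items()
        if k in report and not isinstance(report[k], t)
    ]
    return problems

report = {
    "project_id": "prj-001",
    "period": "2023-Q1",
    "mtco2e": 10.5,
    "methodology": "VM0007",
}
print(validate(report, REPORT_SCHEMA))  # []
```

When every participant submits data in the same shape, reports from different projects, registries, and geographies become directly comparable, which is what makes cross-value-chain insight possible.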
Scaling through decentralization necessarily means adopting a design-thinking-like approach in which perfectionist, top-down, static solutions are shelved in favor of outcomes-centric, bottom-up, dynamic feedback loops in which ‘failure’ is quickly internalized and reframed as a learning experience. Applying such open learning principles to carbon markets – an industry notoriously slow to change – means embracing ways of working that are more collaborative and iterative. To put it another way, it demands that processes run more like software than institutions.
Within this iterative, feedback-centric model, methodologies are leveraged to build reputations over time, thereby incentivizing continuous improvement. And by design, the Guardian is set up for ease of use. Coupled with the friction-reducing attributes of the Hedera ecosystem, it will operate as an engine of empirical knowledge. With policies defined in JSON, a highly accessible, lightweight data format, it takes a developer an average of two weeks to onboard a chosen methodology. Programmers can focus on what they do best, rather than spending months building out an initial platform. This project has only just begun – at full speed, we anticipate onboarding over 300 methodologies a year – but even in its infancy, our accessibility-first approach has already enabled the Guardian to become the largest repository of open-source digital policy workflows in the world.
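Since Guardian policies are authored as JSON documents, a policy is essentially structured data: declared roles plus an ordered, role-gated sequence of steps. The shape below is a simplified, hypothetical illustration of that idea, expressed as a Python dict; it is not the actual Guardian policy schema, and the step and role names are invented.

```python
# A simplified, hypothetical policy document. The real Guardian schema
# differs, but the general shape -- declared roles plus an ordered,
# role-gated workflow -- is the idea.
policy = {
    "name": "Example Reforestation Methodology",
    "roles": ["Standard Registry", "Project Developer", "Verifier"],
    "workflow": [
        {"step": "register_project", "role": "Project Developer"},
        {"step": "submit_monitoring_report", "role": "Project Developer"},
        {"step": "verify_report", "role": "Verifier"},
        {"step": "mint_credit", "role": "Standard Registry"},
    ],
}

def undeclared_roles(policy: dict) -> set:
    """Roles referenced by workflow steps but never declared for the policy."""
    return {step["role"] for step in policy["workflow"]} - set(policy["roles"])

print(undeclared_roles(policy))  # set()
```

Because the whole workflow is data rather than bespoke platform code, checks like this one can be run automatically on any submitted policy, which is part of what compresses onboarding from months to weeks.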
It is clear that native tooling on Hedera offers project developers not just a means to create individual policies, but industrial-scale workflows. Global technology solutions company KrypC is one example of a leading-edge platform harnessing these building capabilities. Drawing on the Guardian’s IWA standards and the rapidly expanding library of open-source policies, the company is onboarding a new wave of climate-conscious businesses while guaranteeing the high quality of the associated carbon projects reaching markets. Working on both the supply and demand sides of the VCM, KrypC is leveraging DLT to eradicate issues such as double-counting and human error in reporting, while paving the way for smaller organizations from previously underserved regions of the world to participate in, and equitably benefit from, carbon markets.
The process of bringing the balance sheet of the planet to the public ledger is setting the scene for a borderless digital commons that is accessible internationally at the SME level. Opening access to climate financing instruments, by digitizing and open-sourcing methodologies, therefore also has a vital role to play in helping drive bottom-up engagement in climate action as well as bringing much-needed visibility into country-by-country progress on climate commitments. Equity, accessibility and broader inclusion in emissions reductions worldwide are key to unlocking higher climate ambition.
Networks and processes run on DLT facilitate a level of borderless, social cooperation that would, until recently, have been unthinkable. Open-source systems built, from the ground up, on DLT and sustained through dMRV processes create the conditions for trust that is disintermediated and scalable.
It is this new, peer-to-peer, open-source access to workflows that not only provides confidence in a market that has been called into question over the decades; it also promises to raise ambition and diversify the organizations responsible for verifying and granting credentials. But that is a topic for our next piece.
Next Up: Scaling Validation & Verification
Once auditability is enabled pursuant to a digitized, open-source methodology, natural assets must still be validated and verified. In Part III, we examine the systemic bottlenecks currently preventing the legacy processes for validation and verification from scaling, and explore how DLT tooling may be useful in overcoming those blockers.
To keep the conversation going, join the over 6,600 entrepreneurs and professionals connected on LinkedIn and be part of the future being built on the Hedera Hashgraph.