Is it now time for a new version of the Uptime Institute Tier Classification for data center uptime availability, one that reflects a different business requirement: pointing toward "Climate Positive"?
Until a minute ago, “sustainability” was where it was at — where IoT edge-to-cloud infrastructure thought leaders and business leaders (rarely the same people, but every once in a while…) were pointed.
COP26/Glasgow and the resultant just-published report have now (or should have) disabused us of any notion that "sustainability" would be sufficient on its own merits, given the scale and velocity at which climate change is barreling down upon us, with all of its attendant and mounting (by the day, it seems) risks and costs. The categories of climate-altering eco-destruction are sufficiently ginormous (an economic term not to be confused with its engineering corollary, humongous).
If only there were an internationally recognized declaration of the costs per metric ton of greenhouse gases (GHG). That might be a good place to start.
WHAT? You thought there were no cost estimates for the three main GHGs: carbon dioxide, methane, and nitrous oxide? Whoa! And, further, that there were no ISO standards for this?
Overview of the Standards Council of Canada (SCC)
Accreditation Program for Greenhouse Gas Validation/Verification Bodies
As Canada’s national accreditation body, the SCC is the only Canadian organization offering internationally recognized accreditation for greenhouse gas (GHG) validation and verification. The program is for organizations providing third-party validation or verification services for the reduction and removal of GHGs.
“Requirements for SCC’s accreditation program for greenhouse gas validation and verification bodies are defined in ISO 14065:2013. A new version of that standard, ISO/CD 14065, is under development…
“Validation and verification bodies worldwide seek SCC accreditation as proof that their work is in line with the most current national and international standards and regulations.”
Here’s the economic value of the “social costs” applied to GHG by the SCC:
For carbon dioxide, the social cost of releasing a metric ton is $51.
Methane costs $1,500 per metric ton.
Nitrous oxide rolls in at a whopping $18,000 per metric ton.
As this is Canadian-writ, we’ll assume this is Hudson Bay Dollars at whatever today’s exchange rate is.
OK, now we can no longer pretend we don’t have an ISO-level, agreed-upon understanding of the per-ton costs across all three Scopes of the GHG Protocol. This is the valorization of GHG.
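As a back-of-the-envelope illustration of that valorization, here is a minimal sketch. The per-ton figures are the ones quoted above; the facility inventory is entirely made up for illustration.

```python
# Social costs per metric ton, as quoted above (currency as published).
SOCIAL_COST_PER_TON = {
    "CO2": 51,       # carbon dioxide
    "CH4": 1_500,    # methane
    "N2O": 18_000,   # nitrous oxide
}

def valorize(emissions_tons: dict) -> float:
    """Total social cost of a GHG inventory, in the published currency."""
    return sum(SOCIAL_COST_PER_TON[gas] * tons
               for gas, tons in emissions_tons.items())

# Hypothetical annual inventory for a single facility (illustrative only).
inventory = {"CO2": 12_000, "CH4": 3, "N2O": 0.5}
print(valorize(inventory))  # 12_000*51 + 3*1_500 + 0.5*18_000 = 625_500.0
```

The point of the exercise: once the per-ton prices exist, turning a verified emissions inventory into a dollar figure on a balance sheet is trivial arithmetic. The hard part is the verified inventory.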
In a perfect world, the creators of GHG would be assessed to offset the people and planet costs of GHG and be required, as a matter of verifiable, auditable accounting, to apply Generally Accepted Accounting Principles that include GHG.
Just this week (on March 21), US Securities and Exchange Commission (SEC) Chair Gary Gensler released a draft proposal that would require publicly traded companies to disclose climate risk, for transparency to investors and regulators.
(Our good friends at nZero are one of only three real-time carbon/GHG monitoring and reporting companies mentioned in the report.) Scopes 1 and 2 of GHG from operational energy use are relatively “easy” to measure and calculate.
Scope 3, embodied GHG, the emissions "baked into" the full supply chain, is where the heavy-lifting complexities can seem daunting, if not overwhelming.
That, however, doesn’t absolve us of starting now as an ethical professional matter for designers and builders. Or for developers, owners, and their capital markets partners.
In the data center sense, a colo owner/operator is by definition a supply-chain services provider to its tenant companies. Its tenants are responsible for the colo operator’s GHGs, whether the tenant is a cloud services provider operating from single- or multi-tenant leased facilities or an enterprise IT shop that farms out its data center services to colo operators. That means SLAs are now, and will increasingly be, written to include GHG disclosure requirements and standards.
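What might GHG terms in a colo SLA actually look like? A minimal sketch follows; every field name and value here is hypothetical, offered only to suggest the kinds of commitments such a clause could carry alongside the familiar availability terms.

```python
# Hypothetical GHG-disclosure terms a colo SLA might carry alongside
# uptime terms (field names and values are illustrative only).
ghg_sla_terms = {
    "scopes_reported": [1, 2, 3],
    "reporting_interval_days": 90,
    "verification": "third-party, ISO 14065-accredited body",
    "disclosure_format": "per-tenant allocation, metric tons CO2e",
    "remedy_for_missed_report": "service credit, as negotiated",
}

for term, value in ghg_sla_terms.items():
    print(f"{term}: {value}")
```

The structural point is that tenants inherit their colo’s emissions as Scope 3, so the allocation has to be per-tenant and third-party verified, not a single facility-wide number.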
What would a Regenerative Tier Classification System look like?
With a nod to our old friends at The Uptime Institute, now appears to be the right time for a new Tier Classification System for the entire digital infrastructure continuum, the foundation of the global digital transformation economy, the Industry 4.0 IoT-edge-to-cloud moment across all societal and economic sectors.
The harbors, ports, market-hub cities, and financial centers of the Fourth Industrial Revolution are its telecom internet exchange hubs. Its primary trading routes are no longer shipping lanes and air traffic routes but broadband subsea fiber optic cables and low-earth-orbit satellite networks.
The Regenerative Flying-Cloud Classification System
During the still-early years of the First Industrial Revolution, before coal-fired steam engines ruled the seas, the clipper ship Flying Cloud, built in East Boston, set the record for the ’round-Cape-Horn passage from New York to San Francisco in 1854: 89 days and eight hours. That record held for 135 years, broken only in 1989. The Flying Cloud was built to move goods and would-be miners to meet the demand of the California Gold Rush. (Worthy of note: Flying Cloud’s navigator was Eleanor Creesy.)
According to Verizon’s network latency charts from February of this year, a packet of data averaged 29.234 milliseconds for its round-trip journey across the continental US. (Flying Cloud really needs to be the name of Boston’s next new hyperscale facility. C’mon, Markley Group, step it up! I would have given that name to the BOS IX facility at 1 Summer Street.) Famed retailer Filene’s Basement may be long gone from Boston’s Downtown Crossing, but its replacement by the Boston Internet Exchange (BOS IX) is part of the new world order, in both old Boston’s core and the mill town of Lowell, a place practically synonymous with the First Industrial Revolution. That and, oh yeah, child labor, too!
What most clearly marks the entry into a new economic era is science, engineering, and technology that didn’t previously exist and now do. In aggregate, their formulae transform value creation from one state into entirely new, previously un-envision-able states.
Introducing the Flying Cloud Regenerative Tier Classification System for Regenerative Digital Infrastructure
Tier I = 100% Net Neutral, 2030 (2-nines equivalency). What isn’t net-zero is offset (or "inset" on-prem): accurate, validated, verifiable GHG emissions across Scopes 1, 2, and 3; GHG costs reported in triple-bottom-line accounting or equivalent; certified GHG offsets or on-prem "insets."
Tier II = 50% Net Neutral, 50% Net-Zero, 2030 (3-nines equivalency). All of Tier I, with a maximum of 50% offsets/insets and a required minimum of 50% internal Net-Zero; a corporate policy and governance mandate pledging achievement by 2030; an annual GHG reporting pledge.
Tier III = 100% Net-Zero plus Circularity, 2040 (4-nines equivalency). There are those who say this is not possible, but our counter is that if it is declared a first-order "Basis of Design" principle at the point of design, things will show up that may make it possible. That may stand the entire "Design Thinking" practice on its head: design every effort to shrink GHG out of all building materials and processes.
Tier IV = 100% Net-Zero plus 50% Regenerative, Climate Positive, 2050 (5-nines equivalency). Achieving Net-Zero is laudable and admirable, worthy of 4-nines equivalency. However, Net-Zero does nothing about reclaiming or regenerating beyond creating the opportunity for such to occur. So Tier IV in our metaphor reflects the moment when the business imperative isn’t just 7x24xforever uptime availability but its equivalent as a moral, ethical, self-regulatory demand. Just as in the original TUI Tier Classification System, where a Tier IV data center cost something on the order of 50 percent more than Tier III, the Flying Cloud Regenerative Tier IV should require a doubling of Net-Zero as its contribution to Climate Positive.
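To make the proposed thresholds concrete, here is one possible sketch of the four tiers as data, with a toy classification rule. The tier names, percentages, target years, and nines-equivalencies come from the descriptions above; the field names and the classification function itself are hypothetical, and the rule assumes any remainder below 100% internal net-zero is covered by certified offsets/insets.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    net_zero_pct: int      # minimum share of emissions eliminated internally
    regenerative_pct: int  # climate-positive contribution beyond net-zero
    target_year: int
    nines: int             # uptime-style "nines equivalency"

# The proposed Flying Cloud Regenerative tiers, as described above.
TIERS = [
    Tier("Tier I",   0,   0,  2030, 2),   # 100% net neutral via offsets/insets
    Tier("Tier II",  50,  0,  2030, 3),   # at most 50% offsets/insets
    Tier("Tier III", 100, 0,  2040, 4),   # 100% net-zero plus circularity
    Tier("Tier IV",  100, 50, 2050, 5),   # net-zero plus 50% regenerative
]

def classify(net_zero_pct: float, regenerative_pct: float) -> str:
    """Return the highest tier a facility meets (hypothetical rule)."""
    best = "Unclassified"
    for tier in TIERS:
        if (net_zero_pct >= tier.net_zero_pct
                and regenerative_pct >= tier.regenerative_pct):
            best = tier.name
    return best

print(classify(100, 50))  # Tier IV
print(classify(60, 0))    # Tier II
```

Note the deliberate echo of the original uptime tiers: each step up is a monotonic tightening of requirements, so a facility’s tier is simply the highest rung whose thresholds it clears.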
Is this a game worthy of our playing? Is there anything here? This is what I call the convergence of digital transformation with climate change.
Bill Hewlett famously said: “You cannot manage what you cannot measure… And what gets measured gets done.”
So let’s start by measuring as a part of Basis-of-Design principles. Then, let’s agree to verifiably and certifiably measure, benchmark, and make visible to all stakeholders.
That’ll be hard enough. Look how long it took the data center industry to adopt, benchmark, and manage PUE. (And just look at how we misinterpreted and mismanaged that!) The TUI Tier Classification System is not prescriptive. Merely descriptive of the desired end state. We may have a need, now, to be a bit diagnostic.
A Master Class series, anyone? A Design Charrette?