Cloud Computing’s Age of Constraint Is Here
Just flip on Netflix, and you can easily find a few dozen dystopian visions of a climate-ravaged future. As the standard storyline goes, the end of this century will see temperatures rise to ecologically disastrous heights, impacting billions of lives and changing living conditions forever. We may have to live deep underground, avoiding a dust-blown, irradiated surface, surviving in units that might require giant oxygen-CO2 exchange and temperature-control machines just to keep the human race alive.
Those life-giving machines, by the way, are likely to all be running on the cloud. Sadly, it’s the same cloud-computing environment that, in this dystopic vision, may have accelerated our current serious climate change problem into an irreversible catastrophe. Of course, the very channel showing you all this dystopia, Netflix itself, is a cloud-based service whose power-gulping servers are adding to a very wide and deep carbon footprint.
So how did we get pointed in this direction in the first place? It’s useful to think of the business cycle that got us here in terms of abundance and constraint.
Every innovation begins with an age of abundance, when resources and growth seem unlimited and there is a gold rush toward building whatever is the moneymaker of the day.
We’ve seen this in nearly every industry, with every technology. Take the automobile as a prime example. When gas distribution finally became ubiquitous and a car’s price point came down far enough to be affordable to a middle-class consumer, no one gave much thought to whether we should limit burning so much gasoline (and leaded gasoline at that). Later, when gas became scarcer and more expensive, we embraced the fuel-efficiency trends of the 1980s (until we didn’t and decided large SUVs were back in style). Today — even with a recognition of the climate change problem, with transportation accounting for roughly a quarter of it — we are only reluctantly moving into an age of constraint, when reducing the carbon footprint isn’t just a “nice-to-have,” but could be the only way to save us from living underground.
Cloud computing is currently seeing a similar cycle of abundance and constraint. When all of this started, we thought cloud computing had solved the capacity problems of the earlier age of the data center, when capacity expansion meant buying expensive physical server farms.
The cloud makes it all so easy. When we have a capacity problem, why not just buy more? Increase the CPU. Expand memory and storage ad infinitum. Boost the bandwidth whenever the business requires it, and with the click of a button more capacity is provisioned and ready the next day.
Sounds good. But are you designing your underground bunker yet? There’s a real consequence to an infinite expansion. Whenever we scale up we use more electricity, much of it still generated from fossil fuel sources. When we decide to make more chips, we need more precious and semi-precious materials, in turn, requiring more mining operations that come with their own serious environmental impacts. Assembling, packaging and transporting all the extra materials and hardware with diesel or gas-burning vehicles adds to the carbon footprint even more.
You see where this resource spiral is going. Cloud computing, far from being our savior, could be the straw that breaks the climate change camel’s back.
We don’t have to keep that spiral going. In the next phase of capacity planning, we can make real environmental impact calculations about every business expansion decision. Up to now, many companies’ environmental considerations have been dismissed as “greenwashing,” but there are executives out there who understand that accelerating climate change is bad on a moral and future business level.
In the days of on-premises computing, smart IT executives decided that delaying hardware in favor of optimizing software was a good strategy. Why can’t we do the same in the cloud? If we resurrect that kind of optimization strategy — whether that means clean-coding standards that run more efficiently or finding significant optimization in our cloud configurations — we could save, based on my own estimates and experience, between 5% and 20% of the cloud-computing carbon footprint. Taken on a global scale, even a small percentage reduction can make a huge difference.
The first step is to use real cloud-to-carbon calculations and algorithms that paint a full picture of the carbon footprint attached to every cloud capacity decision. Without standardized, robust data on the true carbon cost of an expansion decision, there is no way for an executive or a board to even consider it as part of corporate strategy.
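The shape of such a calculation is straightforward, even if the inputs are hard to pin down. The sketch below is a minimal illustration, not a standard methodology: the function name and every numeric figure (power draw, PUE, grid carbon intensity) are hypothetical placeholders, and a real estimate would use measured utilization, the provider’s published Power Usage Effectiveness, and regional grid-intensity data.

```python
def estimate_carbon_kg(avg_power_watts: float,
                       hours: float,
                       pue: float,
                       grid_intensity_kg_per_kwh: float) -> float:
    """Rough CO2e estimate for a cloud workload.

    avg_power_watts: average server power attributable to the workload
    hours: wall-clock runtime of the workload
    pue: data-center Power Usage Effectiveness (facility overhead factor)
    grid_intensity_kg_per_kwh: kg CO2e emitted per kWh on that region's grid
    """
    # Convert watts to kilowatt-hours, then apply the facility overhead.
    energy_kwh = (avg_power_watts / 1000.0) * hours * pue
    # Multiply energy by the grid's carbon intensity to get emissions.
    return energy_kwh * grid_intensity_kg_per_kwh

# Placeholder example: a 300 W instance running for 24 hours in a
# facility with a PUE of 1.2, on a grid emitting 0.4 kg CO2e/kWh.
footprint = estimate_carbon_kg(300, 24, 1.2, 0.4)
print(f"{footprint:.2f} kg CO2e")  # prints "3.46 kg CO2e"
```

Even this crude model makes the trade-off visible: the same workload emits very different amounts of carbon depending on which region’s grid powers it, which is exactly the kind of input an expansion decision should include.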
There are already efforts underway by Google and other cloud giants — they offer a carbon footprint dashboard for businesses to measure their cloud usage impact. All three of the major cloud providers are spending money and resources to color themselves green, but carbon tracking must eventually move beyond sustainability marketing messages and emissions reporting in annual reports to become a priority business goal. The next time we use Netflix to peer into a dark future, we should ask ourselves what the streaming giant is doing to offset the carbon footprint of billions of streams every year.
Speed-to-market will always be a priority, but there is room for electricity-saving cloud optimization, too. If we are serious about saving the planet, IT executives will have to be part of a solution that keeps us living above ground. We enjoy a bit of Netflix dystopia for entertainment, but let’s not make those scenarios real for our grandchildren.
This article originally appeared on forbes.com.
Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, delivering Middleware Management, Monitoring, Tracking, and Analytics that detect anomalies, accelerate decisions, answer business-centric questions, and provide actionable guidance for decision-makers. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, and many more.
The Nastel i2M Platform provides:
- Secure self-service configuration management with auditing for governance & compliance
- Message management for Application Development, Test, & Support
- Real-time performance monitoring, alerting, and remediation
- Business transaction tracking and IT message tracing
- AIOps and APM
- Automation for CI/CD DevOps
- Analytics for root cause analysis & Management Information (MI)
- Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics