Interoperability a long way off as enterprises target multicloud
Clouds remain segmented, leaving businesses little recourse for how to best navigate complexity.
The multicloud movement comes in many flavors. Whether intentional or accidental, an enterprise can blend a kaleidoscope of infrastructure services with its existing software tools. IaaS requirements can tag along with SaaS adoption, creating a multicloud environment before technology teams have even considered the sprawl.
This dynamic plays out in higher education. Miami University in Ohio follows a need-based multicloud strategy, using Amazon Web Services as its primary provider, Microsoft 365 with Active Directory (AD) in Azure and Google services for education apps, calendar and some research computing, according to David Seidl, VP of information technology and CIO at Miami University.
With multiple services providers, “you need to do enough multicloud to make sure you have authentication [that] works amongst these three different clouds,” Seidl said.
Whether foreseen or not, multicloud is becoming the standard enterprise computing strategy. More than one-third of IT decision-makers operate in a multicloud framework, a figure expected to exceed 60% within three years, according to a Nutanix-sponsored Vanson Bourne survey of 1,700 IT decision-makers.
Deployment is even more prevalent in large enterprises, where more than half of organizations use multicloud, a number that is expected to grow to 80% within three years, the research finds.
But the challenge for companies is navigating a tech stack where interoperability remains an unrealized dream. Clouds remain segmented, leaving businesses little recourse for navigating the complexity. Instead, the attainable aspiration is to choose a main provider and integrate where needed.
Seidl’s main goal is to “get really good at our primary provider,” he said. During a technology transition, the campus still has to deliver must-have services, so “OK” execution takes precedence over expert deployment in the name of continuity.
“We have to be OK at AD, we have to be OK at Office 365 to be able to maintain our servers,” Seidl said. The university is also looking for opportunities to gain cost savings or effectiveness in another cloud, such as Azure.
A strategy for interoperability
Simplicity is largely absent in multicloud deployments, and nine in 10 Nutanix respondents say they need simpler management tools to succeed. The greatest challenges lie in managing security, data integration and cost.
“The original rationale for multicloud was to take advantage of the benefits” of each of the three major cloud providers, said Next Pathway CEO Chetan Mathur. Multicloud could also allow organizations to avoid vendor lock-in.
Businesses want the flexibility to choose where workloads run, and possibly to drive down costs, he said. Outside of containers, however, there are no clear mechanisms for easily porting data from one cloud to another.
“To my mind, everything is shifting to Kubernetes,” said Daniel Herndon, director of cloud services at Laserfiche. “If you read about multicloud management platforms, they’re all based around standardization of Kubernetes.”
To manage the breadth of assets, companies need a singular software image and management console, Herndon said. Otherwise, businesses are left to understand how to work across clouds without a holistic view of what’s happening.
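Herndon’s point about standardizing on Kubernetes can be illustrated with a provider-neutral manifest. The sketch below uses hypothetical names (the `example-api` deployment and image are illustrative, not from any vendor mentioned in this article) and deliberately avoids provider-specific annotations, so the same file could be applied unchanged to a managed Kubernetes cluster on AWS, Azure, or Google Cloud:

```yaml
# Hypothetical stateless service, written against the portable
# Kubernetes apps/v1 API with no cloud-specific annotations.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-api            # hypothetical workload name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: example-api
  template:
    metadata:
      labels:
        app: example-api
    spec:
      containers:
        - name: api
          image: registry.example.com/example-api:1.0   # hypothetical image
          ports:
            - containerPort: 8080
          resources:           # explicit requests keep scheduling
            requests:          # behavior comparable across clusters
              cpu: 100m
              memory: 128Mi
```

The portability has limits, though: load balancers, storage classes, and identity integrations still differ per provider, which is where the lock-in this article describes tends to reappear.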
The ideal cloud deployment for the majority of IT experts is hybrid multicloud, where an organization can work across private and public clouds with interoperability built in. But the vast majority of Nutanix respondents say they lack the IT skills to meet business demands.
Without architecting for a hyper-mobile infrastructure stack built on containers, many organizations will end up earmarking a primary provider to handle the bulk of their workloads. As seen in strategies like Wells Fargo’s, companies can use a secondary provider for specialized workloads, capitalizing on one cloud’s expertise over another.
As Forrester notes, many enterprises are choosing a multicloud strategy to run the “right workload on the right vendor,” without too much dependency on a single provider.
The advice for CIOs is to plan and architect infrastructure for the long term and avoid chasing one shiny object, said Mathur.
This article originally appeared on ciodive.com.
Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, delivering middleware management, monitoring, tracking, and analytics to detect anomalies, accelerate decisions, and enable customers to innovate constantly, answer business-centric questions, and provide actionable guidance for decision-makers. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, and many more.
The Nastel i2M Platform provides:
- Secure self-service configuration management with auditing for governance & compliance
- Message management for Application Development, Test, & Support
- Real-time performance monitoring, alerting, and remediation
- Business transaction tracking and IT message tracing
- AIOps and APM
- Automation for CI/CD DevOps
- Analytics for root cause analysis & Management Information (MI)
- Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics