
As companies reimagine the IT landscape, the question is increasingly less about cloud vs. edge deployment and more about determining which workloads run where as part of an optimized hybrid cloud strategy.

While cloud deployment is rising, edge computing has also been thrust into the spotlight by the need to shift computing resources closer to where data is generated, where it can drive more real-time decision-making. HPE broadly defines the edge as the place where people, places, things, and data intersect. How you apply edge resources will vary based on specific business needs.

IDC predicts there will be an estimated 55.7 billion connected internet of things (IoT) devices on the planet by 2025. [1] At the same time, more than half of enterprise-managed data is expected to be created and processed outside the data center or cloud. [2]

Given the expanding footprint, how should companies determine which workloads to migrate to cloud or deploy at the edge? We took that question to the CIO Experts Network, a community of IT professionals and technology industry influencers, to get their perspective on how to align workloads to the right compute model.

“More data, faster [processing], and safer workloads needed in smart buildings, hospitals, government, manufacturing, and education are going to point towards edge computing options, especially as edge infrastructure costs reduce and managed service options increase,” says Isaac Sacolick (@nyike), president of StarCIO and author of “Digital Trailblazer.”

Real-time considerations

Making the right determination starts with asking the right questions.

Companies need to consider how much data is generated at a specific physical location, determine the velocity of real-time data processing required, and study an array of other factors such as whether localized machine learning is part of the mix. It’s also important to take into account whether the company has the on-staff resources and expertise to monitor security and operational risks, debug systems, and optimize edge hardware, notes Tom Allen, founder of the AI Journal.

Workloads that generate large amounts of data requiring quick processing are good candidates for edge while batch processing tasks are more efficient in centralized data centers, notes Helen Yu (@YuHelenYu), founder and CEO of Tigan Advisory Corp.

“The workloads that belong on the edge are the ones which give customers faster response times, greater privacy, and lower costs,” added Tennisha Martin (@misstennisha), executive director and chairwoman at BlackGirlsHack. “This looks different depending on the sector and the industry vertical. It can be imaging or IoT devices in a hospital setting or point-of-sale systems in a retail setting. The determination is made by asking where seconds are critical to the way the company does business.”

In addition to latency and data volume concerns, other factors to consider in edge deployment are criticality of services, sensitivity to service failures, and workload security requirements, according to Elitsa Krumova (@Eli_Krumova), a global thought leader and tech influencer. The number of users and their physical location is another determinant, especially when real-time network app resources are required for secure workloads involving AI, machine learning, or real-time CRM apps, adds Adam Stein (@apstein2), principal at APS Marketing. “Processing of always-changing, unique user-derived information so close to the people and systems required is ideal,” Stein says. “This helps both zero trust and SASE services like SD-WAN or CASB push security and access close to users.”

To balance the tradeoffs between edge and cloud, it’s important to assess the characteristics of the workload through a variety of different lenses. “Assuming architecture, security, DevOps suitability, and cost are equal, edge wins against centralized cloud when response rate, mobility, or remote connectivity is of [the] essence,” says Michael Bertha, vice president at Metis Strategy.
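The criteria the experts describe — latency sensitivity, data volume at the site, localized machine learning, batch tolerance — can be sketched as a simple placement heuristic. This is an illustrative sketch only; all thresholds and field names are hypothetical, not an HPE tool or methodology:

```python
# Illustrative edge-vs-cloud placement heuristic based on the criteria above.
# Thresholds and attribute names are hypothetical sketch values.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: float   # tightest acceptable response time
    daily_data_gb: float    # data generated at the physical site per day
    needs_local_ml: bool    # is localized machine learning part of the mix?
    batch_oriented: bool    # tolerant of deferred, bulk processing?

def place(w: Workload) -> str:
    """Return a suggested deployment target for a workload."""
    if w.batch_oriented and not w.needs_local_ml:
        return "cloud"   # batch tasks run more efficiently in centralized data centers
    if w.max_latency_ms < 50 or w.needs_local_ml:
        return "edge"    # where seconds (or milliseconds) are critical to the business
    if w.daily_data_gb > 500:
        return "edge"    # data gravity: too costly to backhaul everything
    return "cloud"

pos = Workload("point-of-sale", max_latency_ms=20, daily_data_gb=5,
               needs_local_ml=False, batch_oriented=False)
print(place(pos))  # edge
```

In practice the inputs would come from an application inventory and monitoring data, and the thresholds would be tuned per industry vertical, as Martin and Yu note above.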

Given the back-and-forth swings between centralized and distributed computing models over the years, the need of the hour isn’t necessarily to choose one compute model over the other but rather to deploy a flexible and scalable combination of the two. “The most optimal workloads are those where one part lives on the cloud and the other on the edge, enabling simultaneous, task-based and real-time processing of data,” says Dipti Parmar (@dipTparmar), chief strategist at Dipti Parmar Consulting and co-founder at 99Stairs. “These include workloads that are mission critical, process large amounts of data in real-time, or require low latency, including quality control, gaming, CCTV, state management, load balancing, web services, medical procedures, and banking.”

Best practices for deployment

As companies go about aligning workloads to edge or cloud, there are a number of best practices that can help streamline deployment. Among them:

  • Understand data gravity requirements. Use cases requiring fast access to data for decision making lean towards edge deployment. As part of this exercise, evaluate how quickly you need to process data, identify the performance requirements for data, and consider the latency demands – especially if there are services that require immediate access to a data set, cautions Alex Farr (@AlexFarr_IT), CTO at Christie Group.
  • Inventory your workloads. Workloads that don’t require heavy power or large storage capacities are great candidates for edge computing. “It’s not because the edge can’t handle larger workloads—rather these business cases are low risk for the company and make a wise first step when exploring the potential of edge computing,” advises Peter Nichol (@PeterBNichol), CTO at OROCA Innovations.
  • Business strategy alignment is key. Edge computing can enhance financial transactions by improving timeliness and availability or enabling real-time responsiveness for health care monitoring or industrial control systems. It all comes down to choosing a deployment model that advances innovative ways of working or new digital strategies. “Edge computing devices are now proven and business-critical options,” says Kieran Gilmurray (@KieranGilmurray), CEO at Digital Automation and Robotics Limited. “You must decide how to deploy edge devices in a way that best suits your particular business strategy.”
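Nichol’s advice above — start with low-risk, lightweight workloads when exploring edge — can be sketched as a simple inventory pass. The field names and thresholds below are hypothetical, chosen only to illustrate the filtering step:

```python
# Illustrative workload-inventory pass, following the "start low-risk" advice.
# All field names and thresholds are hypothetical sketch values.
workloads = [
    {"name": "store CCTV analytics",  "cpu_cores": 4,  "storage_tb": 0.5, "business_risk": "low"},
    {"name": "core banking ledger",   "cpu_cores": 64, "storage_tb": 40,  "business_risk": "high"},
    {"name": "shop-floor sensor agg", "cpu_cores": 2,  "storage_tb": 0.2, "business_risk": "low"},
]

def first_edge_candidates(inventory, max_cores=8, max_storage_tb=1.0):
    """Pick low-risk, light workloads as a wise first step into edge computing."""
    return [w["name"] for w in inventory
            if w["cpu_cores"] <= max_cores
            and w["storage_tb"] <= max_storage_tb
            and w["business_risk"] == "low"]

print(first_edge_candidates(workloads))
# ['store CCTV analytics', 'shop-floor sensor agg']
```

A real inventory would also capture the latency, data-gravity, and security attributes discussed earlier, but the principle is the same: filter for candidates where the downside of a first edge deployment is small.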

To learn more about how the HPE GreenLake Platform can boost your edge deployments, click here.

[1] IDC Worldwide Global DataSphere Forecast, 2021–2025: The World Keeps Creating More Data—Now, What Do We Do with It All? March 2021, David Reinsel, John Rydning, John Gantz

[2] Predicts 2022: The Distributed Enterprise Drives Computing to the Edge, October 20, 2021, Gartner, Bob Gill and Tom Bittman, ID G00757917
