Edge Computing is Thriving in the Cloud Era

BrandPost By Denis Vilfort, Al Madden
Apr 06, 2022
Artificial Intelligence, Edge Computing, IT Leadership

Today’s edge technology is not just bolstering profits, but also helping reduce risk and improve products, services, and customer experience.


Lately, a great deal of hope has been attached to edge computing. The industry is buzzing with bold claims such as “the edge will eat the cloud” and predictions that real-time automation will spread across healthcare, retail, and manufacturing.

Experts agree that edge computing will play a key role in the digital transformation of almost every business. But progress has been slow, and legacy perceptions have held companies back from fully leveraging the edge for real-time decision-making and resource allocation. To understand how and why this is happening, let’s look back at the first wave of edge computing and what has transpired since then.

The first wave of edge computing: Internet of Things (IoT)

For most industries, the idea of the edge has been tightly associated with the first wave of the Internet of Things (IoT). At the time, much of the focus centered around collecting data from small sensors affixed to everything and then transporting that data to a central location – like the cloud or main data center.

These data flows then had to be correlated through what is commonly referred to as sensor fusion. At the time, sensor economics, battery life, and limited coverage often resulted in data streams that were sparse and low fidelity. In addition, retrofitting existing equipment with sensors was often cost prohibitive: while the sensors themselves were inexpensive, installation was time consuming and required trained personnel. Finally, the expertise needed to make sense of fused sensor data was scarce, locked in the heads of specialists scattered across organizations. Together, these factors slowed IoT adoption.

Additionally, security concerns cooled wholesale adoption of IoT. The math is simple: thousands of connected devices across multiple locations add up to a large and often unknown exposure. As the potential risk outweighed the unproven benefits, many companies felt it prudent to take a wait-and-see attitude.

Moving beyond IoT 1.0

It is now becoming clear that the edge is less about IoT devices themselves and more about making real-time decisions across operations spanning distributed sites and geographies. In IT, and increasingly in industrial settings, we refer to these distributed data sources as the edge. We refer to decision-making at all those locations outside the data center or cloud as edge computing.

The edge is everywhere we are — everywhere we live, everywhere we work, everywhere human activity takes place. Sparse sensor coverage has been addressed with newer, more flexible sensors. New assets and technology come with a wide array of integrated sensors. And sensors are now often augmented with high-resolution, high-fidelity imaging such as X-ray equipment and lidar.

The combination of additional sensor data, imaging technology, and the need to correlate it all generates many megabytes of data per second. To drive results from these vast data flows, compute firepower is now being deployed close to where the data is generated.

The reason is simple: there is not enough bandwidth, and too much latency, between the edge location and the cloud. The data at the edge matters most in the short term. Instead of being processed and analyzed later in the cloud, it can now be analyzed and used at the edge in real time. To reach the next level of efficiency and operational excellence, computing must take place at the edge.

This is not to say that the cloud does not matter. The cloud still has a role to play in edge computing because it’s a great place from which to deploy capabilities to the edge and manage them across all locations. For example, the cloud provides access to apps and data from other locations, as well as to remote experts who manage systems, data, and apps across the globe. In addition, the cloud can be used to analyze large data sets spanning multiple locations, show trends over time, and generate predictive analytics models.

So, the edge is about making sense of large data streams across a vast number of geo-dispersed locations. One must adopt this new perception of the edge to truly understand what is now possible with edge computing.  

Today: Real-time edge analytics

What can be done at the edge today is staggering compared to a few years ago. Instead of the edge being limited to a few sensors, data can now be generated from copious numbers of sensors and cameras. That data is then analyzed at the edge with computers that are thousands of times more powerful than they were just two decades ago — all at reasonable costs.

High core-count CPUs and GPUs, along with high-throughput networking and high-resolution cameras, are now readily available, allowing real-time edge analytics to become a reality. Deploying real-time analytics at the edge (where the business activity takes place) helps companies understand their operations and respond immediately. With this knowledge, many operations can be further automated, increasing productivity and reducing loss.

Let’s consider a few examples of today’s real-time edge analytics:

  • Supermarket fraud prevention

Many supermarkets now use some form of self-checkout, and unfortunately, they are also seeing increased fraud. A nefarious shopper can swap the bar code of a cheaper item onto a more expensive product, thereby paying less. To detect this type of fraud, stores are now using high-powered cameras that compare the product scanned, and its weight, with what they are supposed to be. These cameras are relatively inexpensive, yet they generate a tremendous amount of data. By moving computing to the edge, the data can be analyzed instantly, so stores can detect fraud in real time instead of after the “customer” has left the parking lot.
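
As an illustration only, here is a minimal sketch of the kind of weight-versus-barcode check an edge node at a self-checkout kiosk might run. The product catalog, tolerances, and flagging logic are assumptions for the example, not a description of any particular retailer’s system.

    # Minimal sketch of a weight-based scan check at a self-checkout edge node.
    # The catalog entries, tolerances, and scale input below are illustrative assumptions.

    PRODUCT_CATALOG = {
        "0123456789012": {"name": "bananas (1 kg bag)", "expected_weight_g": 1000.0, "tolerance_g": 75.0},
        "0987654321098": {"name": "rib-eye steak", "expected_weight_g": 450.0, "tolerance_g": 60.0},
    }

    def check_scan(barcode: str, measured_weight_g: float) -> bool:
        """Return True if the measured weight matches the scanned product's expected range."""
        item = PRODUCT_CATALOG.get(barcode)
        if item is None:
            return False  # unknown barcode: hold for attendant review
        return abs(measured_weight_g - item["expected_weight_g"]) <= item["tolerance_g"]

    # A steak run across the scanner with a banana barcode fails the check immediately,
    # so the kiosk can flag the transaction before the shopper leaves the store.
    if not check_scan("0123456789012", measured_weight_g=430.0):
        print("Weight mismatch - flag for attendant review")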

  • Food production monitoring

Today, a manufacturing plant can be equipped with scores of cameras and sensors at each step of the manufacturing process. Real-time analysis and AI-driven inference can reveal in milliseconds, or even microseconds, whether something is wrong or the process is drifting. Maybe a camera reveals that too much sugar is being added or too few toppings cover an item. With cameras and real-time analysis, production lines can be tuned to stop the drift, or even stopped if repairs are required – without causing catastrophic losses.
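
As a simplified illustration, the sketch below shows one way an edge application might flag drift in a line measurement, such as grams of sugar dispensed per item, by comparing a rolling mean against a setpoint. The setpoint, window size, and alert threshold are assumptions for the example.

    # Minimal sketch of a drift check on a production-line measurement
    # (for example, grams of sugar dispensed per item).
    # The setpoint, window size, and alert threshold are illustrative assumptions.

    from collections import deque

    class DriftMonitor:
        def __init__(self, setpoint: float, tolerance: float, window: int = 50):
            self.setpoint = setpoint    # target value for the measurement
            self.tolerance = tolerance  # allowed deviation of the rolling mean
            self.readings = deque(maxlen=window)

        def add_reading(self, value: float) -> bool:
            """Record a new sensor- or camera-derived measurement; return True if the line is drifting."""
            self.readings.append(value)
            rolling_mean = sum(self.readings) / len(self.readings)
            return abs(rolling_mean - self.setpoint) > self.tolerance

    # Example: target 12.0 g of sugar per item; alert if the rolling mean drifts by more than 0.5 g.
    monitor = DriftMonitor(setpoint=12.0, tolerance=0.5)
    for grams in (12.1, 12.3, 12.6, 12.8, 13.1):
        if monitor.add_reading(grams):
            print("Drift detected - adjust the dispenser or pause the line")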

  • AI-driven edge computing for healthcare

In healthcare, infrared and X-ray cameras have been game changing because they provide high resolution and deliver images rapidly to technicians and physicians. With such high resolution, AI can now filter, assess, and diagnose abnormalities before the images reach a doctor for confirmation. By deploying AI-driven edge computing, doctors save time because they don’t have to rely on sending data to the cloud to get a diagnosis. An oncologist checking whether a patient has lung cancer can apply real-time AI filters to images of the patient’s lungs, getting a quick and accurate diagnosis and greatly reducing the anxiety of a patient waiting to hear back.

  • Autonomous vehicles powered by analytics

Autonomous vehicles are possible today because relatively inexpensive, widely available cameras offer 360-degree stereoscopic vision. Analytics also enable precise image recognition, so the computer can decipher the difference between a tumbleweed and the neighbor’s cat – and decide whether it’s time to brake or steer around the obstacle to ensure safety. The affordability, availability, and miniaturization of high-powered GPUs and CPUs enable the real-time pattern recognition and vector planning that serve as the driving intelligence of autonomous vehicles. For autonomous vehicles to be successful, they must have enough data and processing power to make intelligent decisions fast enough to apply corrective action. That is possible only with today’s edge technology.
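
To make that timing constraint concrete, here is a minimal sketch of the arithmetic behind “fast enough”: whether a vehicle can stop before an obstacle once processing latency is accounted for. The speeds, latency values, and deceleration figure are illustrative assumptions, not parameters from any real planning stack.

    # Minimal sketch of the timing math behind "decide fast enough to apply corrective action":
    # can the vehicle stop before an obstacle once perception/processing latency is included?
    # All numbers are illustrative assumptions.

    def can_stop_in_time(speed_mps: float,
                         obstacle_distance_m: float,
                         processing_latency_s: float,
                         max_decel_mps2: float = 6.0) -> bool:
        """Return True if reaction distance plus braking distance fits before the obstacle."""
        reaction_distance = speed_mps * processing_latency_s       # distance covered while "thinking"
        braking_distance = speed_mps ** 2 / (2 * max_decel_mps2)   # v^2 / (2a)
        return reaction_distance + braking_distance <= obstacle_distance_m

    # At 25 m/s (about 90 km/h) with an obstacle 60 m ahead, a 50 ms edge decision
    # leaves enough margin, while a 500 ms round trip to a distant cloud does not.
    print(can_stop_in_time(25.0, 60.0, processing_latency_s=0.05))  # True
    print(can_stop_in_time(25.0, 60.0, processing_latency_s=0.50))  # False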

Distributed architecture in practice

When extremely powerful computing is deployed at the edge, companies can better optimize operations without worrying about delays or lost connectivity to the cloud. Everything is now distributed across edge locations, so issues are addressed in real time, even with only sporadic connectivity.

We’ve come a long way since the first wave of edge technology. Thanks to technological advances at the edge, companies are now taking a more holistic view of their operations. Today’s edge technology is not just helping companies bolster profits; it’s also helping them reduce risk and improve products, services, and the experiences of the people who engage with them.

To learn more about how data can be analyzed and used at the edge in real time, check out the website Intelligent Edge: Edge computing solutions for data driven operations. To understand what happens at the edge, at the core, and in between, read this blog on how HPE Ezmeral Data Fabric provides a modern data infrastructure that empowers data-driven decision making at the edge.

____________________________________

About Al Madden

Al Madden is involved in all things Edge. With degrees in chemistry and marketing, he is committed to finding the best ways to put technology to work. Whether in environmental monitoring, power distribution, semiconductors, or IT, Al now focuses mostly on making tech consumable, understandable, and usable through marketing and content strategy.

About Denis Vilfort

Denis Vilfort is director of PAN-HPE Marketing. A strategic thinker with a unique combination of sales and marketing experience and an in-depth understanding of technology, Denis focuses on helping customers solve technology challenges. He is a thought leader who not only thinks outside the box but helps define new ones by asking better questions.