
EDGE COMPUTING TECHNOLOGY

Gartner defines edge computing as “a part of a distributed computing topology in which information processing is located close to the edge—where things and people produce or consume that information.”

At its basic level, edge computing brings computation and data storage closer to the devices where data is being gathered, rather than relying on a central location that can be thousands of miles away. This is done so that data, especially real-time data, does not suffer latency issues that can affect an application’s performance. In addition, companies can save money by having the processing done locally, reducing the amount of data that needs to be processed in a centralized or cloud-based location.

Edge computing was developed in response to the exponential growth of IoT devices, which connect to the internet either to receive information from the cloud or to deliver data back to it. Many IoT devices generate enormous amounts of data in the course of their operation.

Think about devices that monitor manufacturing equipment on a factory floor or an internet-connected video camera that sends live footage from a remote office. While a single device producing data can transmit it across a network quite easily, problems arise when the number of devices transmitting data at the same time grows. Instead of one video camera transmitting live footage, multiply that by hundreds or thousands of devices. Not only will quality suffer due to latency, but the costs in bandwidth can be tremendous.
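
To make the scaling problem concrete, here is a rough back-of-the-envelope calculation in Python. The camera count and per-stream bitrate are illustrative assumptions, not figures from the article:

    # Rough, illustrative estimate of the uplink bandwidth needed when many
    # cameras stream raw footage to a central site. All numbers are assumptions.

    cameras = 1000          # assumed number of internet-connected cameras
    bitrate_mbps = 4.0      # assumed bitrate of one 1080p stream, in Mbit/s

    total_mbps = cameras * bitrate_mbps
    total_gbps = total_mbps / 1000

    print(f"Aggregate uplink needed: {total_mbps:,.0f} Mbit/s (~{total_gbps:.1f} Gbit/s)")
    # 1,000 cameras at 4 Mbit/s each is roughly 4 Gbit/s of sustained traffic
    # headed to a central site, before any protocol overhead or retransmissions.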

Edge-computing hardware and services help solve this problem by being a local source of processing and storage for many of these systems. An edge gateway, for example, can process data from an edge device, and then send only the relevant data back through the cloud, reducing bandwidth needs. Or it can send data back to the edge device in the case of real-time application needs.
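
As a minimal sketch of that idea (the cloud endpoint, alert threshold, and sensor readings below are hypothetical, not from the article or any particular product), a gateway might summarize a window of raw readings locally and forward only the small summary payload upstream:

    # Minimal sketch of an edge gateway that processes readings locally and
    # forwards only the relevant summary to the cloud. The endpoint and
    # threshold are illustrative assumptions.

    import json
    import statistics
    import urllib.request

    CLOUD_ENDPOINT = "https://example.com/ingest"   # hypothetical cloud API
    ALERT_THRESHOLD = 85.0                          # hypothetical temperature limit (deg C)

    def summarize(readings):
        """Reduce a window of raw readings to a small summary payload."""
        return {
            "count": len(readings),
            "mean": statistics.mean(readings),
            "max": max(readings),
            "alert": max(readings) > ALERT_THRESHOLD,
        }

    def forward_to_cloud(summary):
        """Send only the summary, not the raw stream, upstream."""
        data = json.dumps(summary).encode("utf-8")
        req = urllib.request.Request(
            CLOUD_ENDPOINT, data=data, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status

    if __name__ == "__main__":
        window = [72.1, 73.4, 71.9, 88.2, 74.0]   # raw readings gathered at the edge
        summary = summarize(window)
        print(json.dumps(summary))
        # forward_to_cloud(summary)  # a few hundred bytes leave the site,
        #                            # instead of every raw sample

The same structure works in reverse: when a real-time decision is needed, the gateway can act on the data locally and respond to the edge device without waiting on a cloud round trip.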

These edge devices can include many different things, such as an IoT sensor, an employee’s notebook computer, their latest smartphone, the security camera or even the internet-connected microwave oven in the office break room. Edge gateways themselves are considered edge devices within an edge-computing infrastructure.

Edge computing is transforming the way data is being handled, processed, and delivered from millions of devices around the world. The explosive growth of internet-connected devices along with new applications that require real-time computing power continues to drive edge-computing systems.

Faster networking technologies, such as 5G wireless, are allowing edge-computing systems to accelerate the creation or support of real-time applications such as video processing and analytics, self-driving cars, artificial intelligence and robotics, to name a few.

Edge computing means different things to different players. But one thing is constant: Location matters.

Edge computing enables autonomous mining equipment to react to unexpected conditions a mile below the surface, even when disconnected from a network. When a hotel guest places a food order from a mobile phone and wants to have it delivered poolside, edge computing makes it possible to steer servers to the guest’s lounge chair.

Sensors, smart devices, and mobile users are proliferating across all industries. Enterprises are investing in edge deployments to handle growing amounts of decentralized data that need to be processed in place. When low latency is essential, edge setups take the delay out of moving data to a data center or public cloud for processing.

Organizations can build and run their edge infrastructure in-house or hand it off to a vendor. There are advantages and challenges to either option, and the right edge-computing choice for any particular organization depends on its individual needs, budget and staffing, among other factors. Here are some considerations:

Challenges of in-house edge computing

The IT-centric approach to edge keeps ownership of edge devices in-house and is likely to appeal to businesses with either strict legal requirements about where their data can be at any given time – a healthcare provider would be a good example – or a low level of institutional comfort with putting that data in the hands of third parties, such as utility and manufacturing companies.

Handling things in-house can be challenging, however. For one thing, according to Christian Renaud, IoT practice director for 451 Research, many IT shops lack the requisite expertise to handle an edge deployment on their own.

“We run into a few use cases where the internal IT team can’t handle the edge infrastructure, so handing it off to a vendor makes a lot of sense,” he said. “The challenge is that, with production systems, that’s a whole different ballgame [than IT], so there’s a pretty strict set of requirements in terms of what the OT vendors will let run on other people’s networks.”

The lack of common standards in edge compute limits customers’ ability to build their edge infrastructure using multiple vendors. An organization might not be able to use one vendor’s sensors without also buying its edge compute modules or networking gear, since they’re all part of the same offering.

Forrester vice president and principal analyst Brian Hopkins contrasts edge with cloud computing, where interoperability, open frameworks and containerization make these concerns all but irrelevant.

While the early goal of edge computing was to address the bandwidth costs of moving IoT-generated data over long distances, the rise of real-time applications that need processing at the edge is driving the technology ahead.

(https://www.networkworld.com/article/3224893/what-is-edge-computing-and-how-it-s-changing-the-network.html)

 

If you have any comments, please reach out to us at info@proso.ai