Have you ever had a package get lost just as it was about to reach its destination? With the holidays upon us, many of us are ordering and sending packages in the hope that they will arrive in time. Getting them there on time requires a network of postal or courier sorting centers that can route packages efficiently.

From a text message to the complex code that operates a self-driving vehicle, data is, in a way, just like a package. To get data “packets” from point A to point B, it is crucial to avoid bottlenecks. That matters more than ever, because we are sending record volumes of data: in the minute it takes to read this article, more than 18 million text messages and 187 million emails will have been sent. Without an efficient network to handle the request, your last text message won’t travel any more efficiently than a holiday package shipped to the wrong sorting center.

This is why edge data centers were born. They store and process a local copy of the data you are likely to need, so your request can be handled nearby instead of traversing the entire network. By bringing data and processing closer to your location, you get faster responses, which means faster load times for video or higher-resolution TV streams. This is more important than ever, as machine learning, artificial intelligence, virtual reality, telemedicine, intelligent transportation, and more drive the demand for data.
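
To make the idea concrete, here is a minimal sketch of what “storing a local copy” means in practice: serve a request from a nearby cache when possible, and only go back to the distant origin data center on a miss. The names (EdgeCache, origin_fetch) are purely illustrative; no real CDN or edge platform works exactly this way.

```python
# Minimal sketch of an edge cache: answer requests from a local copy when
# possible, and only fall back to the faraway origin data center on a miss.

class EdgeCache:
    def __init__(self, fetch_from_origin):
        self._store = {}                        # locally cached copies of content
        self._fetch_from_origin = fetch_from_origin

    def get(self, key):
        if key in self._store:                  # cache hit: short, local round trip
            return self._store[key]
        content = self._fetch_from_origin(key)  # cache miss: long round trip to the origin
        self._store[key] = content              # keep a local copy for the next request
        return content


def origin_fetch(key):
    # Placeholder for a request to a distant core data center.
    return f"content for {key}"


edge = EdgeCache(origin_fetch)
edge.get("video-1234")   # first request travels all the way to the origin
edge.get("video-1234")   # later requests are served from the nearby edge copy
```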

All of these emerging applications require the ultra-low latency that edge computing provides. Latency is the round-trip time for a request to reach the data center, be processed, and return the necessary information (video, a text response, and so on). Historically, network capacity and technology have determined latency – imagine the difference in internet speed between dial-up and fiber broadband.
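
You can get a rough feel for round-trip time yourself by timing a small web request, as in the sketch below. The URL is just a placeholder, and the number you see includes server processing time as well as network travel, so treat it as an approximation rather than a precise measurement.

```python
# Rough sketch: time a small HTTPS request to approximate round-trip latency.
import time
import urllib.request

def measure_rtt(url="https://www.example.com", attempts=3):
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=5) as response:
            response.read(1)                    # wait until the first byte comes back
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples)                         # best case roughly reflects network round-trip time

print(f"approx. round-trip time: {measure_rtt():.1f} ms")
```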

Cutting to the Edge: Why Data is Getting Closer to You

To understand why edge computing is so valuable, let’s look at a case study: cloud gaming, where games run on data center servers rather than on your home PC or console. The game is streamed live to your TV, and your control inputs are sent back to the data center in real time so you can steer your character. This lets users without a high-end console or PC play the latest games – but it requires incredibly low latency, otherwise the game will lag and stutter.

As we begin to adopt new technologies such as cloud gaming, existing network capacity and technology are insufficient to support large-scale usage. To make these technologies possible, service providers are moving data centers closer to the “edge” of the network. Data then travels a shorter distance before being processed, which reduces latency to the level needed to support these emerging applications.
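
A back-of-the-envelope calculation shows why distance alone matters: light in optical fiber travels at roughly 200,000 km/s, so every kilometer of fiber adds propagation delay in each direction. The distances in the sketch below are illustrative, not measurements of any particular network, but they show how much a shorter path trims off the round trip before any processing even begins.

```python
# Back-of-the-envelope propagation delay: light in fiber covers roughly
# 200,000 km per second, i.e. about 200 km per millisecond.
FIBER_SPEED_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    # Out and back, ignoring routing, queuing, and server processing time.
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

for label, km in [("distant core data center", 1000), ("nearby edge data center", 50)]:
    print(f"{label:>25}: ~{round_trip_ms(km):.2f} ms propagation delay")
```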

Edge computing isn’t just for these emerging low-latency applications. It can also play an important role in mainstream data-intensive applications.

When you click to watch an online video, the request traverses the network: from your phone to a wireless base station, through fiber optic cables, and finally to the data center that holds the video you requested. Transmitting that information across the network costs electricity, and the farther the data travels, the more it costs. With sustainability a top priority for operators, reducing electricity usage is both financially and socially responsible. Edge data centers are a critical component of delivering exceptional service while keeping costs at acceptable levels.

The drivers for edge computing are clear: support emerging applications that demand lower latency, keep costs in check despite the huge volumes of data we generate every day, and do both sustainably. While the emerging applications are exciting, the practical case for edge computing lies in the cost-effectiveness it delivers for data center operators. Edge computing is both a necessary part of a sustainable digital future and a key enabler of the technologies to come. Just as logistics networks are investing in their distribution infrastructure to meet future demand efficiently, communications networks need to invest in edge infrastructure to do the same.

To keep up with these demands, data center operators are looking for innovations. Corning has answered that call with our CleanAdvantage™ cleaning technology and ultra-high-density solutions that save time and maximize capacity. Corning’s data center solutions are uniquely positioned for your data center deployment, edge or otherwise.
