
Quick words with OpenStack Foundation: Edge Computing

Fri 21 Sep 2018

The Stack catches up with Ildikó Vancsa, ecosystem technical lead at the OpenStack Foundation, on how open infrastructure is playing an ever-greater role in edge computing

Rapid development, deployment and iteration of software is increasingly emerging as a sustainable means of achieving competitive advantage for many businesses.

Open infrastructure and the development tools that support it can spur this innovation further, allowing organizations to share the work of creating these tools across a broad community of users. This not only reduces risk but also gives organizations access to new technologies they can use to continually develop and deploy software that keeps users happy.

When looking specifically at edge computing, much of the infrastructure necessary to support the technology is shared across many users and applications.


Edge use cases often involve a large number of edge sites and heterogeneous environments in which different operators and vendors are sharing the same network. In these instances, interoperability is crucial.

Open infrastructure enables developers of edge applications to share knowledge and experiences to build interoperable systems based on common standards and processes. This leads to better performance, greater security and lower deployment risk for everyone.

Transforming the cloud paradigm  

While edge computing is all about bringing the workloads and data closer to the end user, in most cases it still relies on a central data center cloud in its overall architecture.

When use cases dictate an environment of widely distributed edge sites, edge computing can help minimize the effect of latency and limited or unreliable network connections on applications.

The end-to-end architecture of edge computing use cases can include sites operating in extreme circumstances and sites with limitations on space, power and physical access. Edge computing will require those who work on open infrastructure to integrate the technologies used in traditional data centers, such as virtualization, containers and bare-metal solutions, to address the challenges of these extreme situations.

Beyond the software solutions used in cloud environments, edge computing is also driving innovation in the hardware industry, challenging hardware vendors to meet the special requirements of edge use cases for performance, low power consumption and robustness to harsh environmental conditions.

While autonomous vehicles are the most commonly cited latency-critical edge use case, there are numerous other potential use cases to watch, including retail, virtual assistants, CDNs for high-definition data, facial recognition and image processing, smart cities, autonomous site operations (mining, drilling, transportation) and many more.

Tags: edge, feature, OpenStack