
AI at the edge – challenges and opportunities

Thu 22 Aug 2019 | Carmine Rimi

There exists no set model for the successful implementation of AI at the edge – it is instead a case of flexibility, writes Canonical’s Carmine Rimi

As individual technologies, both artificial intelligence (AI) and edge computing have grown in stature over recent years. The connected home has brought a vast array of sensor-led solutions to the fore – remote-controlled heating, lighting and entertainment – while many laptops and tablets already possess the AI necessary to convert hand-written notes into text, creating instant designs on the screen.

While the convergence of the two is still in its infancy, together they hold the potential to revolutionise the lives of consumers and businesses alike. But it is not a marriage without its challenges, especially when it comes to practical implementation.

Laying the foundations

AI at the edge is not a distant dream. Unlocking a phone with your face, asking a smart assistant what the weather will be like tomorrow, or automatically adjusting a camera to take the perfect low-light shot are all examples of AI in our pockets, right at the edge.

These use cases remain small scale, however, and relate to individuals rather than a technology that empowers groups or organisations across a variety of mutually reinforcing applications. The challenge is getting the infrastructure right from the outset and supporting the growing complexity at the extremities, because more network capacity locally means greater density at the edge.

The right mix of compute, AI accelerators, storage and networking will allow AI at the edge to thrive and evolve. Similarly, drawing on the capacity of a core data centre to offload passive edge applications will help streamline operations even further.

Although the edge is where the innovation appears to happen – where people actually interact with technology – balancing the workload between the edge and the cloud is the key to successful implementation. Get these foundations right, and the potential of AI at the edge can begin to be realised.

Harmonious infrastructure is the difference between one-off examples of edge-based AI and much more expansive opportunities that connect larger patterns of data. By pooling data from multiple devices, AI at the edge will offer intelligence greater than any individual data point can provide.

Think about mapping as an example of collective smartness in operation: traffic congestion alerts, speed traps and car sensors are all individual points of information that feed into a larger whole, from which smarter solutions can then be developed.

There is, therefore, no set model for the successful implementation of AI at the edge – it is instead a case of flexibility. But one thing that is certain is the need to prioritise security as the use cases multiply.

“When privacy combines with flexible infrastructure, AI at the edge will deliver innovation at a much greater scale”

Protecting the edge

Without trust and a comprehensive set of security measures, AI at the edge will never truly take off. The challenge is that privacy remains a double-edged sword.

On the one hand, processing data locally offers inherent benefits because the data remains in the desired sovereign area and does not traverse the network to the core. In other words, the data is physically domiciled at all times.

On the flip side, keeping data local means more locations to protect and secure simultaneously, with increased physical access opening the door to different kinds of threat. A greater physical presence at the edge could, for example, increase the likelihood of Denial of Service (DoS) attacks that leave individual machines or networks unavailable.

To combat this threat, backup solutions that circumvent local edge failures may be needed. However, by removing the constant back and forth of data between the cloud and the edge, privacy will be enhanced beyond its current capacity, especially where individual consumers are concerned, because personal information remains in the hands of the user at the edge. And when privacy combines with flexible infrastructure, AI at the edge will deliver innovation at a much greater scale.

Truly smart solutions

There are two primary advantages of embedding intelligence at the edge: lower latency and reduced network traffic to the core data centre. Both are critical for the real-time systems that power the likes of autonomous vehicles (AVs) or industrial robots.

They illustrate the convergence of the real world with the digital, and the need to act on data immediately in order for applications to perform as smoothly and naturally as possible. Without AI at the edge, both AVs and robots will remain in the camp of primitive connected technology, rather than intelligent systems that are able to learn and adapt automatically.
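To make those two advantages concrete, here is a minimal sketch in Python. All of the figures and names are illustrative assumptions rather than measurements; the point is simply the shape of the trade-off between sending raw data to the core and inferring on the device.

```python
# Illustrative sketch only: the figures below are assumptions, not benchmarks.
# It compares the two paths for handling a single camera frame.

FRAME_BYTES = 2_000_000          # raw sensor frame (~2 MB), assumed
RESULT_BYTES = 200               # compact inference result, assumed
NETWORK_ROUND_TRIP_MS = 80       # edge-to-core round trip, assumed
EDGE_INFERENCE_MS = 15           # on-device accelerator, assumed
CORE_INFERENCE_MS = 5            # larger model in the data centre, assumed


def core_path():
    """Send the raw frame to the core, infer there, return the result."""
    latency_ms = NETWORK_ROUND_TRIP_MS + CORE_INFERENCE_MS
    bytes_to_core = FRAME_BYTES
    return latency_ms, bytes_to_core


def edge_path():
    """Infer on the device; only the small result ever leaves the edge."""
    latency_ms = EDGE_INFERENCE_MS
    bytes_to_core = RESULT_BYTES
    return latency_ms, bytes_to_core


if __name__ == "__main__":
    for name, path in (("core", core_path), ("edge", edge_path)):
        latency, traffic = path()
        print(f"{name} path: {latency} ms end-to-end, {traffic} bytes sent to the core")
```

Even with generous assumptions for the network, the edge path wins on both counts: the response arrives well within a real-time control loop, and only a few hundred bytes ever cross the network.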

Closer to home, one of the biggest opportunities lies in the real-time contextual synthesis of the edge domain. The home becomes a combination of data streams from a whole host of independent devices.

Combined in the right way, they help to tell a larger story, and from that story intelligence can be derived. A simple example could be motion sensors, facial recognition screens and kitchen appliances all working together to produce an intuitive environment, where the home is one step ahead of the owner to help lighten the workload.
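A hypothetical sketch of that kind of fusion is below. The device names and the rule are invented purely for illustration; the idea is that a conclusion emerges from combining streams which none of the individual devices could reach on its own.

```python
# Hypothetical sketch: fusing independent home data streams into one context.
# Device names and the single rule are invented for illustration only.

from dataclasses import dataclass


@dataclass
class Reading:
    device: str
    value: str


def infer_context(readings: list[Reading]) -> str:
    """Derive a simple household context no single device could provide."""
    seen = {r.device: r.value for r in readings}
    if seen.get("motion_sensor") == "kitchen" and seen.get("face_camera") == "owner":
        if seen.get("oven") == "cold":
            return "owner in kitchen, oven cold: suggest preheating"
    return "no action"


print(infer_context([
    Reading("motion_sensor", "kitchen"),
    Reading("face_camera", "owner"),
    Reading("oven", "cold"),
]))
```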

The Internet of Things is transitioning from a predominantly centralised, hub-and-spoke, cloud-based offering into a more distributed and intelligent edge setup. By overcoming the initial challenges of privacy and infrastructure, businesses will be able to innovate more quickly and consumers will better understand the role AI plays in their daily lives.

The opportunities are rich, from industry all the way to the home. Software allows the world to be smarter, and AI at the edge will allow inference to be done exactly where the application is – in true real time.

Experts featured:

Carmine Rimi

AI Product Manager
Canonical
