Enabling AI with edge computing and HCI
Thu 26 Mar 2020 | Phil White
There is a constant stream of innovation happening in storage technology, and the hyperconverged infrastructure (HCI) market is leading the way
According to one market report, the HCI market is expected to be worth $17.1 billion by 2023. This projected growth can be put down to the myriad advantages that HCI offers: single-pane-of-glass management, reduced rackspace and power consumption (and therefore greener data centres), and improved disaster recovery capabilities, to list a few.
Logically, the next step in HCI's evolution has been its move to the edge of the network. As demand grows for data-hungry workloads such as artificial intelligence (AI), it's not surprising that enterprises are looking to edge computing and HCI to capture data from the very start of their projects. By combining edge computing with HCI, businesses can enable their AI tools to make more intelligent decisions.
Let’s get digital
With the days of pen and paper behind us, digitalisation has become a necessity across industries. As a result, we are creating a tonne of data, which of course needs to be stored somewhere. More often than not, this data is stored on-site at the edge of a network – not your traditional data centre architecture.
One key benefit of edge computing is that it takes up far less space than traditional storage hardware. Deployed at the edge of the network, the infrastructure can not only handle and compile data but also compress large volumes of it so that it can be transferred easily into the cloud or to a centralised data centre at another site. Data can therefore be handled and reviewed close to where it was created, rather than transmitted further away. This is why edge computing is often used by distributed enterprises such as fast-food restaurants, supermarkets, and petrol stations, as well as in industrial settings such as mines and solar energy plants.
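The "compress at the edge, then transfer" step can be sketched in a few lines of Python. This is a minimal illustration using the standard library's gzip; the telemetry shape and field names (a petrol-station pump feed) are invented for the example:

```python
import gzip
import json

def batch_for_upload(readings):
    """Serialise a batch of edge readings and gzip-compress it,
    so less bandwidth is needed to reach the central site."""
    raw = json.dumps(readings).encode("utf-8")
    return gzip.compress(raw)

def restore_batch(payload):
    """Reverse the process on the data-centre side."""
    return json.loads(gzip.decompress(payload).decode("utf-8"))

# Hypothetical telemetry from a petrol-station edge node.
readings = [{"pump": 3, "litres": 41.2, "ts": 1585216800 + i}
            for i in range(1000)]
payload = batch_for_upload(readings)

# Round trip is lossless, and the repetitive JSON compresses well.
assert restore_batch(payload) == readings
assert len(payload) < len(json.dumps(readings).encode("utf-8"))
```

Structured telemetry with repeated keys compresses heavily, which is exactly why batching and compressing at the edge before upload saves so much bandwidth.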
Data collected at the edge of the network is not always used to its full capacity. AI, for example, albeit still at the beginning of its journey, requires vast resources to develop and train its models. With edge computing, however, the data can move freely into the cloud, where it can be analysed and the AI models trained before the results are pushed back to the edge. The data centre or the cloud remains the best place to generate these models.
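The train-centrally, deploy-to-edge loop described above can be sketched as follows. The workload (a one-variable linear model fitted by least squares) and the data are purely illustrative stand-ins for real model training:

```python
import pickle

def train_central(samples):
    """Fit a one-variable linear model in the 'cloud' by ordinary
    least squares; only the two coefficients need shipping back."""
    n = len(samples)
    sx = sum(x for x, _ in samples)
    sy = sum(y for _, y in samples)
    sxx = sum(x * x for x, _ in samples)
    sxy = sum(x * y for x, y in samples)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return {"slope": slope, "intercept": intercept}

def export_model(model):
    """Serialise the trained model for pushing back to edge nodes."""
    return pickle.dumps(model)

def edge_predict(blob, x):
    """Run the shipped model locally at the edge: no cloud round trip."""
    m = pickle.loads(blob)
    return m["slope"] * x + m["intercept"]

# Hypothetical data: ambient temperature vs chiller power draw.
samples = [(t, 2.0 * t + 5.0) for t in range(20)]
blob = export_model(train_central(samples))
print(edge_predict(blob, 25.0))  # exactly linear data -> 55.0
```

The design point is that the heavy step (training over the full dataset) runs where resources are plentiful, while the artefact sent back to the edge is tiny: just the coefficients.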
Take the silicon chip company Cerebras, which dedicates its work to accelerating deep learning. It recently introduced its "Wafer Scale Engine", purpose-built for deep learning. The new chip is incredibly fast and 56 times bigger than the largest graphics processing unit. The flip side of that size, however, is a power draw so high that most edge deployments could not accommodate it.
That said, there is still hope: businesses can consolidate edge computing tasks using hyperconverged infrastructure, enabling them to build and make the most of data lakes. With the data in a data lake, companies can analyse it across all their applications, and machine learning can unveil new insights from that shared data across diverse applications and devices.
HCI has also made edge computing much easier to use by combining servers, storage, and networking in one box, without the configuration and networking issues it previously faced. On top of this, the platform can provide integrated management for large numbers of edge devices spread across different parts of the country, with various forms of networks and interfaces, which undoubtedly decreases operational expenses.
Breaking through AI’s glass ceiling
The surge in use cases for things like smart home devices, self-driving cars, and wearable technology means that AI is already prevalent in our everyday lives. According to Gartner, AI will continue to flourish, with 80% of smart devices containing on-device AI capabilities by 2022.
However, AI's data collection does run into a problem: most of the technology powering it relies heavily on the cloud, and so can only reach a conclusion based on the data it can access there. The result is a delayed response, because data must first travel to the cloud before heading back to the device. For technologies like self-driving cars, which require instantaneous decision-making, any lag could have serious consequences.
Here, edge computing has one up on the cloud and the potential to take AI to the next level. Any data an AI application needs can reside close to the device, increasing the speed with which it can be accessed and processed. Devices that depend on a data connection benefit the most, because they cannot always reach the cloud, which requires bandwidth and network availability.
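The latency argument can be made concrete with a toy simulation. The round-trip figures below are assumptions for illustration, not measurements of any real deployment:

```python
import time

CLOUD_RTT_S = 0.08   # assumed ~80 ms network round trip to a cloud region
EDGE_RTT_S = 0.002   # assumed ~2 ms hop to an on-site edge node

def infer(rtt_s):
    """Simulate one decision: a network hop plus a fixed inference cost."""
    time.sleep(rtt_s)    # stand-in for the network round trip
    time.sleep(0.001)    # stand-in for model execution
    return "brake"       # e.g. a self-driving-car decision

start = time.perf_counter()
infer(CLOUD_RTT_S)
cloud_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
infer(EDGE_RTT_S)
edge_ms = (time.perf_counter() - start) * 1000

print(f"cloud: ~{cloud_ms:.0f} ms, edge: ~{edge_ms:.0f} ms")
```

Even with identical model execution time, the decision arrives an order of magnitude sooner when the data and model sit next to the device, which is the whole case for edge inference in latency-critical applications.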
Another advantage of combining edge computing with HCI for AI is that it requires far less storage space. HCI's best operational feature is that the technology can function within a much smaller hardware footprint. It will soon be commonplace to see companies launching highly available HCI edge compute clusters no bigger than a cup of tea.
If AI is to truly succeed, it will need HCI and edge computing working side by side, allowing it to function on its own merit with minimal support. AI will then be able to make the most of deep learning and improve its ability to make better decisions.
Cloud advances have already made AI accessible to the vast majority. But it is the marriage of HCI and edge computing that will give AI the means to surge into new territories, offering companies more intelligent and efficient ways of solving problems.