Infrastructure at the Edge
Thu 30 Nov 2017 | Thomas Kovanic
IIOT, enterprise and the data centre
The need to minimise latency in critical data paths, in systems that require real-time responses, has swung the computing pendulum back towards local systems, in what we today call edge computing. The further the IT industry distances us from our data, the slower our response to any action request will be. Keeping real-time actionable data as close as possible to the data source or the data user therefore gives a system the best opportunity to respond effectively and, in some cases, avert a catastrophe.
Organisations’ capability to generate, use and store data has driven us into the zettabyte (ZB) era, and estimates suggest that global IP traffic will reach 2.3ZB by 2020. The reaction to this flood of data has been to migrate expensive hardware and software to centralised off-site data centres, housing the total corporate IT platform, applications and valuable data. However, every step away from the data source introduces a gateway, switch or other delay in the ability to act in real time. Data centre operators, even those running a company’s own data centres, frequently reorganise data locations to optimise their capacity loading, with the result that our data could end up anywhere across the globe, introducing higher latency into the communication link.
The key advantage of the Internet, and therefore of cloud computing, is resilience: the global network’s capability to reroute data around a network problem and deliver it to its destination. It is that rerouting that can exaggerate the problems of latency and network jitter. Additionally, data packets may be damaged en route, which introduces reassembly latency and possible errors at the end-point, requiring data to be resent and exacerbating the situation.
As a response, edge computing is akin to older-style corporate computing, where the data centre is in-house, but this does not mean that it is a throwback to a computing dark age. In fact, Gartner’s “Hype Cycle for Emerging Technologies, 2017” states that ‘edge computing is on the verge of becoming an innovation trigger’, and includes edge computing on the list of key platform-enabling technologies to track. Today’s edge computing returns the compute resource to a location close to the data source to provide real-time processing.
Of course, the vast majority of data consumed is largely impervious to the delay and overall latency that is a side effect of the data centre’s location. We have all observed delays while watching live TV, on a business Skype call, or in any number of other small incidents of data delay over the network, which are irritating but not costly. However, there is a growing number of real-time requirements, and nowhere more so than in the Industrial Internet of Things (IIoT) environment, where a company’s ability to respond within a defined timescale is imperative.
It is in this IIoT environment that Gartner’s ‘innovation triggers’ are going to have a massive impact. Segregating time-sensitive networks that have real-time response requirements within an edge network, away from off-premises corporate networks, will safeguard an organisation’s capability to collect, process and respond to IIoT data far more efficiently.
Networked manufacturing, such as an industrial automation system, provides increasingly high-value data for monitoring control systems across the factory floor. Automated manufacturing operates with fewer human interactions and an increased reliance on sensors monitoring systems and processes for faults or errors. Sensors across the system constantly relay data to the control system, which monitors the data stream and analyses it for anomalies. When errors are detected, the system responds with control signals, which may stop the manufacturing process or correct the fault within an approved timescale, and then monitors the change to ensure a satisfactory outcome. Halting the process is an extreme case and may involve human intervention before the process can be restarted, but it may be required to prevent a costly manufacturing fault in an expensive end-product. Innovations will not necessarily remove the need for human interaction, but they will change the skill sets of the technicians on the floor: fewer machine operators and more network specialists may be required to implement the new generation of factory automation.
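The detect-and-respond loop described above can be sketched in a few lines. This is a hypothetical illustration, not a real control system: the threshold values, the escalation rule (correct a minor excursion, halt on a severe one) and the sensor readings are all assumptions made for the example.

```python
import time

# Illustrative limits; real values would come from the process specification.
TEMP_LIMIT_C = 85.0      # anomaly threshold
HALT_FACTOR = 1.1        # readings this far over the limit force a halt

def control_loop(samples):
    """Scan a stream of sensor samples and return (index, action, elapsed_s)
    for the first anomaly found, or None if every sample is within limits."""
    for i, value in enumerate(samples):
        start = time.monotonic()
        if value > TEMP_LIMIT_C:
            # Severe excursions halt the process; minor ones trigger a
            # corrective control signal within the approved timescale.
            action = "halt" if value > TEMP_LIMIT_C * HALT_FACTOR else "correct"
            elapsed = time.monotonic() - start
            return (i, action, elapsed)
    return None

print(control_loop([70.1, 71.5, 96.0]))  # third sample exceeds the limit
```

Keeping this loop on an edge node next to the sensors means the decision latency is dominated by the check itself, not by a round trip to a remote data centre.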
Essential in the development of edge computing capabilities is the choice of architecture and technologies used to implement the solution.
- Adopting a spine-leaf architecture, which uses two layers of switches rather than the traditional three-layer design (see diagram 1), reduces complexity and improves responsiveness.
- Introducing low-latency infrastructure media. As data becomes more valuable, fibre-optic cabling is becoming increasingly attractive due to its low latency, high bandwidth and longer cable runs. Fibre is also highly effective in the manufacturing environment because it is impervious to electrical and mechanical interference.
- Using lower-latency equipment. All equipment introduces latency, so adopting switches, routers and servers with the required performance characteristics can improve the system's overall responsiveness.
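The cumulative effect of the points above can be made concrete with a simple latency budget: summing per-hop delays shows why a two-tier spine-leaf path beats a three-tier path, and why any path that crosses a WAN to an off-premises data centre dwarfs both. All figures below are illustrative assumptions, not measurements or vendor specifications.

```python
# Assumed one-way latency contributions per hop, in microseconds.
HOP_LATENCY_US = {
    "access_switch": 5.0,
    "aggregation_switch": 5.0,
    "core_switch": 5.0,
    "leaf_switch": 3.0,
    "spine_switch": 3.0,
    "wan_to_cloud": 20000.0,  # the off-premises leg dominates everything else
}

def path_latency_us(hops):
    """Total latency for a list of hops, in microseconds."""
    return sum(HOP_LATENCY_US[h] for h in hops)

# Server-to-server path through a traditional three-layer network.
three_tier = ["access_switch", "aggregation_switch", "core_switch",
              "aggregation_switch", "access_switch"]
# The same traffic through a two-layer spine-leaf fabric.
spine_leaf = ["leaf_switch", "spine_switch", "leaf_switch"]
# Edge fabric plus a WAN hop to a remote data centre.
cloud_path = spine_leaf + ["wan_to_cloud"]

print(path_latency_us(three_tier))  # 25.0
print(path_latency_us(spine_leaf))  # 9.0
print(path_latency_us(cloud_path))  # 20009.0
```

Even with generous switch figures, the comparison makes the design argument: trimming a switching tier helps, but keeping time-critical traffic off the WAN entirely is what preserves real-time response.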
Effective edge computing requires a design that identifies the real-time response applications, including mission-critical systems and IIoT, and deploys a solution that provides the architecture for them, whilst ensuring interaction with the wider non-time-critical data held in corporate data centres or in the cloud. As the volume of data from industrial automation increases, the network and processing capabilities required to optimise the factory floor become ever more important. Real-time systems at the edge can generate real operational benefit, while allowing the organisation to use the cloud as a repository where continuously updated data can be processed, analysed, stored and used for trend analysis and other business requirements. Edge computing is not an end in itself; it offers real-time capability in a global environment where data holds an increasing amount of value. However, it is the ability to react effectively to that data that makes it priceless.