
The final frontier? How to overcome the age-old remote office IT headache

Mon 6 Apr 2020 | Ezat Dayeh


In many businesses, from retailers and chain restaurants to health services and government offices, the local remote branch is the prime client interaction point: the place where the business and the customer meet face to face.

As a result, it is where initial client satisfaction is formed and the loyalty journey gains momentum. But, as these sites are often located away from core facilities for the sake of customer convenience, they typically don’t have extensive IT staffing and therefore lack the technical resources of core offices.

Yet they do require advanced client service applications with local processing capabilities. If every branch office needs to have its IT configured and deployed separately and in person, the costs quickly become prohibitive. As a result, remote office edge computing becomes a trade-off between cost on one side and deployment delays, slow service roll-out, and inconsistent customer satisfaction on the other.

Solving the remote office branch IT challenge is now a core tenet of business success for distributed organisations. Rising IT costs, evolving security challenges, and changing business models are driving the need to deliver advanced and efficient IT to remote sites that have neither an onsite IT administrator nor in-office data centre storage. As always, this must be achieved without putting further strain on already overstretched IT budgets.

However, current approaches that stretch data centre technology to support remote offices and branches have largely fallen short. Cloud and SaaS adoption has helped, but not all IT requirements can be met via the cloud, and to some degree it is simply a patch for legacy edge computing assets.

Existing approaches also fall short on local backup, recovery, and cybersecurity, and more broadly on data management requirements such as data mobility for replication, archival, and dev/test. A new strategy is needed: one that lowers the total cost of ownership and brings the same capabilities as the core IT system and data centre to the edge, transforming the economics, speed, and responsiveness of your edge IT systems.

New customer demands require new IT capabilities

Businesses with remote or branch offices that are critical to customer satisfaction — and therefore business success — struggle to find solutions that enable them to scale IT appropriately and manage remote data more effectively. There are solutions, but few make the IT infrastructure simpler, more secure and easier to manage.

In reality, the opposite is true. The available solutions are point products that require more dedicated technical resources, not fewer, and their deployment and ongoing operations add to the complexity rather than reducing it. For many, it has been a case of putting up with stretching data centre technology to fit the edge, or trying to keep legacy on-premises servers updated: a no-win situation.

Today there are three primary locations where data is being utilised and collected: the core (traditional and cloud data centres, typically in large offices), the edge (enterprise-hardened legacy infrastructure in branch offices and smaller sites that can’t warrant on-site support), and the endpoints (PCs, mobile devices, and Internet of Things sensors and devices). The esteemed analysts at IDC call the collective of all this data the Global Datasphere, and it is experiencing tremendous growth. IDC predicts[1] that the Global Datasphere will grow by a factor of five, from 33 Zettabytes (ZB) in 2018 to 175 ZB by 2025.

 

Real-time is the new normal

In contrast to core offices, where IT systems have on-site technical support, branch offices have none and so add to the burden on overstretched IT teams. As requirements for advanced edge applications increase, new approaches are needed to reduce this burden and stop IT from becoming a bottleneck to innovation at the edge. Setting up IT systems for a new branch or office must be possible in hours, not days.

Customers expect to access products and services wherever they are, over whatever connection they have, and on any device. They want data in the moment, on the go, and personalised. This places greater demand on both the edge and the core IT systems to be able to deliver the precise data consumers require, often in real-time.

IDC predicts that, due to the infusion of data into our business workflows and personal lives, nearly 30 percent of the Global Datasphere will be real-time by 2025[2]. Enterprises looking to provide a superior customer experience and grow market share must have data infrastructures that can meet this growth in real-time data.

IDC also expects[3] edge storage to see significant growth by 2024, as latency-sensitive services and applications proliferate throughout our world.

Data must not become a burden

Today, even data created, stored and managed through core IT systems ends up in fragmented silos. Add remote office IT into the mix, and you have silos of silos. The challenge is real. Companies are beginning to understand the value of putting data to work to deliver insights, but they don’t understand well enough how to do this in an efficient, secure, cost-effective way.

Right now, each system primarily creates its own data, and each manages its own data environment. These environments rarely integrate. Backed-up data exists purely as an insurance policy that cannot generate value; it ends up as dark data, unable to provide value to the business, simply stored for a recovery that may never be needed.

Businesses of all sizes and maturity share one common problem: an increasing level of data fragmentation, with data spread across a myriad of locations, infrastructure silos, and management systems that prevent the organisation from fully utilising its value.

A recent survey[4] conducted by Vanson Bourne found that 35 percent of respondents use six or more different solutions to manage all their non-mission-critical data operations, and over 10 percent of organisations use 11 or more. That is startling: each solution carries its own licence costs, and none of them work with the others. The net result? Data has become a heavy burden when it should be serving as a value-add and a competitive asset.

Wanted: a way to manage rapidly growing data across remote office locations

Use cases that rely on edge computing are growing rapidly: according to Gartner[5], 30 percent of all enterprise workloads will reside at the edge by 2025. This underscores the need for a unified approach that manages and protects data generated or stored at the edge in the same manner as enterprise data located centrally or managed in public cloud environments.

Companies need a solution that addresses administrative, security, latency, and bandwidth concerns at remote locations while also meeting the data recoverability requirements and compliance mandates of corporate headquarters.

Administrators need an easy-to-use way to run management operations centrally, such as creating simple policies for backup and recovery across multiple branch locations, as well as branch replication policies that ensure a copy of the data is available from the data centre or cloud in case of local failures.
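As a rough illustration of what such a centrally defined policy could look like, here is a minimal Python sketch modelling one backup-and-replication policy applied to many branches. The class, field names, and example values are assumptions made purely for illustration; they are not any specific product’s API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BranchProtectionPolicy:
    """Hypothetical policy: defined once at the core, applied to every listed branch."""
    name: str
    backup_schedule_cron: str        # when local backups run at each branch
    local_retention_days: int        # how long copies are kept at the branch
    replication_target: str          # core data centre or cloud destination
    replica_retention_days: int      # how long the off-site copy is kept
    encrypt_in_flight: bool = True   # protect data crossing the WAN
    branches: List[str] = field(default_factory=list)

# One definition covers every remote site; onboarding a new branch means adding
# a list entry, not configuring and deploying IT on-site.
standard_policy = BranchProtectionPolicy(
    name="branch-standard",
    backup_schedule_cron="0 1 * * *",              # nightly at 01:00 local time
    local_retention_days=14,
    replication_target="core-dc-or-cloud-bucket",  # hypothetical target name
    replica_retention_days=90,
    branches=["store-012", "store-047", "clinic-003"],
)
```

The point of a model like this is that policy lives at the core and is pushed outwards, so a new site can be protected in hours rather than requiring a separate in-person deployment.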

That kind of platform would help to enable high data resilience and would seamlessly replicate or archive data to the core or the public cloud. It could also significantly reduce the data footprint and the WAN bandwidth used by remote sites, through global, variable-length, sliding-window deduplication and compression. And, last but not least, it could be a big step towards an integrated cybersecurity approach, with an immutable file system, data encryption, and anti-ransomware capabilities.
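To make the deduplication point concrete, the sketch below illustrates the general technique behind variable-length, sliding-window (content-defined) chunking: a rolling hash over a small byte window decides where chunks end, so a small edit in a large branch backup produces only a handful of new chunks, and only those need to be stored or sent over the WAN. The parameters and helper names are illustrative assumptions; this is a generic demonstration of the technique, not the platform’s actual implementation.

```python
import hashlib
import os

# Illustrative chunker parameters; real systems tune these for their workloads.
WINDOW = 48                 # bytes in the rolling-hash window
MASK = (1 << 13) - 1        # boundary pattern -> roughly 8 KiB average chunks
MIN_CHUNK = 2 * 1024        # never cut before 2 KiB
MAX_CHUNK = 64 * 1024       # always cut by 64 KiB
PRIME = 257
MOD = (1 << 61) - 1
POW = pow(PRIME, WINDOW - 1, MOD)   # lets us drop the oldest byte in O(1)


def chunks(data: bytes):
    """Yield variable-length chunks whose boundaries depend on content, not offset."""
    start = 0
    h = 0
    for i in range(len(data)):
        if i - start >= WINDOW:
            # Slide the window: remove the byte that just fell out of it.
            h = (h - data[i - WINDOW] * POW) % MOD
        h = (h * PRIME + data[i]) % MOD
        size = i - start + 1
        if (size >= MIN_CHUNK and (h & MASK) == 0) or size >= MAX_CHUNK:
            yield data[start:i + 1]
            start = i + 1
            h = 0
    if start < len(data):
        yield data[start:]          # trailing partial chunk


def dedupe(blobs):
    """Store each distinct chunk once; report logical vs. physically stored bytes."""
    store = {}                      # sha256 digest -> chunk bytes
    logical = physical = 0
    for blob in blobs:
        for c in chunks(blob):
            logical += len(c)
            digest = hashlib.sha256(c).digest()
            if digest not in store:
                store[digest] = c
                physical += len(c)
    return store, logical, physical


if __name__ == "__main__":
    # A branch backup, then a second backup with one small edit in the middle:
    # only the chunks around the edit are new, so very little would cross the WAN.
    first_backup = os.urandom(400_000)
    second_backup = first_backup[:200_000] + b"small local change" + first_backup[200_000:]
    _, logical, physical = dedupe([first_backup, second_backup])
    print(f"logical {logical} B, stored {physical} B, saved {1 - physical / logical:.0%}")
```

Because boundaries are derived from the content itself rather than fixed offsets, the chunk stream resynchronises shortly after the edit, which is what keeps repeated branch backups so cheap to replicate.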

Conclusion: Take a global approach to your data

It is increasingly important for companies to take a global approach to their data, both to provide a low-latency, better customer experience and to address regulatory and compliance pressures that require operators to locate data in the regions where their customers are. Whether in their own data centres or via cloud providers, companies need to consider which data needs to be located as close as possible to their customers and where in the network it should sit (core vs. edge, cloud vs. own data centres).

Forward-thinking businesses are using a modern, platform-based approach to data management that enables them to manage on-premises, cloud, and edge environments at the same time, with similar policies. This, in turn, enables them to extract real value from all of their data, including data at remote and branch office sites.

Intelligent data is being sought to drive our businesses and lives in real time and on the go. Often, data simply doesn’t have time to travel from an endpoint to the core and back when it is informing real-time decisions. The enterprise edge helps to bridge this gap. Whether taking on data analytics or simply storing analysed and intelligent data, the edge will play an increasing role in enabling a real-time world.

If you have an expansive IT estate, you need to ask yourself how much you value the ability to easily manage all your non-latency-sensitive data and applications across core on-premises, cloud, and edge infrastructure, and what the ROI could be for your business if you enabled a more efficient edge IT capability. Your current Frankenstein’s monster of legacy IT might have served you until now, but can you afford not to modernise?

Experts featured:

Ezat Dayeh

Senior Engineer Manager UK&I
Cohesity

Tags:

data management edge computing Remote Working