Escaping legacy backup constraints in the data centre
Mon 17 Jul 2017 | Frank Barker
Frank Barker, CEO at Bacula Systems, discusses why data centres and MSPs are demanding greater scalability, performance and reliability from their backup solutions…
More than ever, market trends in IT infrastructure are about finding high-value solutions that drive efficiency and reliability. Most businesses, particularly those where IT is the business, face fierce competition, and data center services and costs need to be carefully managed.
In the world of data backup and data recovery, users also need to be able to achieve a wide range of business objectives including policy compliance, reduced backup window time, faster data recovery, data deduplication, reduced data loss and even detection of storage problems, usually in a heterogeneous world.
Solutions need to be highly secure, easy to use, flexible and customizable, and customer needs continue to vary widely. Some customers may demand a specific capability such as single file restores from within their virtual machine (VM) infrastructure, or the ability to send selected backup data to a cloud, while others may require late data inclusion technology to ensure that last-minute data is captured while a database or application backup is in progress. The combination of services that must be delivered can be extensive.
All of these complex requirements regularly apply to backup and recovery across physical, virtual and cloud environments, scaling from a few servers to a customer’s entire infrastructure and often to entire data centers.
Legacy backup services – all change
Many larger data centers struggle to meet these requirements cost-effectively with legacy data backup services because of the inflexibility of the underlying products on which those services are based. This, together with high charges, often for software capabilities that are no longer useful or necessary, drives frustration and the desire for change.
To add fuel to the fire, a number of the traditional data backup vendors are undergoing M&A transitions. These changes usually result in the shuffling of product portfolios and people, the selling of product lines and the adoption of business practices aimed at financing the M&A transactions by extracting more money from the customer base while simultaneously reducing costs by slowing or even stopping technical development and support.
Customers such as Sky in the UK and NASA in the U.S., for example, have recently updated their backup model after finding other offerings needlessly bulky, expensive and inflexible. They were looking for a commercial model that fitted what has become the standard approach to data backup: on-site combined with off-site and cloud. Being charged by data volume no longer made sense. Instead, they needed a flexible, service-based model that could be easily understood.
A new approach like this is particularly attractive to MSPs. The sector is undergoing a high degree of organic growth coupled with significant consolidation. MSP is a fairly broad term with many sub-sections, including IaaS, PaaS and data hosting companies. All are looking for data backup models that match their customers’ diverse and dynamic requirements as they face numerous challenges such as demanding SLAs, broad technical requirements and quota management.
Facing the pressure of competition, MSPs need robust service delivery and agile technical development while also managing their bottom line. Technical capabilities make all the difference: accessing a REST API to develop a user front-end quickly, enabling client-initiated backup, or mixing and matching GUI and command-line interfaces. Of course, avoiding being taken for a ride by the backup vendor over data volume is also a strong driver.
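To illustrate the kind of integration an MSP front-end might build, here is a minimal sketch of assembling a client-initiated backup job request against a REST backup API. The endpoint path, host name and job fields are all hypothetical, chosen for illustration only; they are not taken from any specific vendor’s API.

```python
import json

# Hypothetical API base URL -- illustrative only.
API_BASE = "https://backup.example.com/api/v1"

def build_backup_request(client: str, fileset: str, level: str = "Incremental") -> dict:
    """Assemble a client-initiated backup job request for a REST backup API.

    The field names (fileset, level) are assumptions for this sketch.
    """
    if level not in ("Full", "Incremental", "Differential"):
        raise ValueError(f"unknown backup level: {level}")
    return {
        "url": f"{API_BASE}/clients/{client}/jobs",
        "body": json.dumps({"fileset": fileset, "level": level}),
    }

# A web front-end or scheduled task could POST this with any HTTP client:
req = build_backup_request("web-01", "HomeDirs", level="Full")
print(req["url"])  # → https://backup.example.com/api/v1/clients/web-01/jobs
```

Because the request is just HTTP and JSON, the same call can be driven from a browser dashboard, a cron job on the client, or a CLI wrapper, which is precisely the mix-and-match flexibility described above.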
Many businesses now incorporate backing up data to the cloud, and within the cloud, as an integral strategy to their existing on-site and off-site practices. This gives them another option in backup safety and economics, and it can also help as part of a disaster recovery (DR) strategy.
As with many things, these use cases depend on specific needs and it often comes down to convenience, reliability and economics.
Bacula takes a customer-centric approach to backing up data to and within the cloud. To overcome the usual bandwidth limitations and restrictions, its cloud solution incorporates a buffering system so that cloud backups do not block system resources while waiting for bandwidth.
In addition, native cloud integration means that in the event of a recovery only the specific data that needs to be recovered from the cloud is restored. Individually and together these can save a company a huge amount of resources and money, as well as providing control to the user over what and how much data to put into (and take from) the cloud, and when.
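The buffering idea can be sketched in a few lines: backup chunks land in a local spool at disk speed while a background worker drains them to the cloud at whatever rate the network allows. This is a conceptual illustration only; the class and method names are invented for the sketch and do not reflect Bacula’s actual implementation.

```python
import queue
import threading

class CloudSpool:
    """Accept backup chunks immediately; upload them in the background
    so the backup job never blocks on cloud bandwidth.

    Conceptual sketch -- not any vendor's real spooling code.
    """

    def __init__(self, upload_fn):
        self._q = queue.Queue()
        self._upload = upload_fn
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def write(self, chunk: bytes) -> None:
        self._q.put(chunk)        # returns at local-spool speed

    def _drain(self) -> None:
        while True:
            chunk = self._q.get()
            if chunk is None:     # sentinel: no more chunks
                break
            self._upload(chunk)   # slow network transfer happens here

    def close(self) -> None:
        self._q.put(None)
        self._worker.join()       # wait for remaining uploads to finish

uploaded = []
spool = CloudSpool(uploaded.append)   # stand-in for a real cloud upload
for part in (b"block-1", b"block-2"):
    spool.write(part)                 # backup proceeds without waiting
spool.close()
print(len(uploaded))                  # → 2
```

The same decoupling is what lets the user control what goes to the cloud and when: the spool can be drained immediately, on a schedule, or only for selected data.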
Backup technologies like these are becoming increasingly important as companies need to be able to predict their data backup and recovery costs.
The future of backup
The backup services market will develop over the coming years both in basic importance and in its range of services. There are some who predict that backup and backup services are no longer necessary in today’s world. I’ll wait until they have their first data loss, and look forward to working with them!
The reality is that backup is and will remain pervasive. Many large MSPs and software companies now have a DevOps function to quickly develop services for their customers. Backup is almost certainly one of their high-margin, high-value services and has become a de facto standard.
Extensions such as monitoring tools are now almost always developed with an agile approach. As DevOps departments grow, they need tools that support that agility.
Meanwhile, data volume continues to increase rapidly in nearly all IT environments, and IT managers need to plan accordingly. Data deduplication continues to be important. Easy scalability means more than just the technical ability to scale; it also has to be affordable. Traditional pricing models, and the traditional suppliers who depend on them to finance their bloated and inefficient organizations, just won’t survive over the long haul.
Backup vendors tend to jump on topical issues, as we have seen recently with ransomware. However, backup software should already offer options to protect companies against ransomware and similar threats; guarding against this kind of eventuality is simply basic functionality of good backup software and the policies around it. The same applies to the current discussions around the upcoming GDPR regulation in Europe.
Going forward, large data centers will increasingly seek better value from their backup and recovery solutions and, in the case of MSPs, competition driven by consolidation is going to get tougher.
The scale of expenditures and the ability to predict current operational costs and control future costs will drive closer CFO involvement in operations. This, in turn, will drive service delivery cost reviews, including even crucial high-margin services like backup.
In many cases, there are real opportunities for these businesses and departments to move away from legacy vendors and become more efficient and robust, although some have become victims of vendor lock-in, or are unaware of viable alternatives.
Bacula Systems is the leading Enterprise Open Core network backup and restore software company, combining Bacula’s enterprise-class open standards software with first-class support and professional services. The company is known in the industry for its commitment to safe, secure and reliable backup solutions.