Storage outsourcing renaissance
Storage service provision is back on the IT agenda, as bandwidth, management tools, and market forces come together to create a compelling proposition for overstretched IT departments, reports E&T.
The management of enterprise data is an ever-increasing burden to IT operations, and one of the biggest challenges on the IT agenda. Standard thinking characterises data as the product of computerised business environments and applications, but this now represents just one slice of a rapidly expanding 'digital universe' that encompasses a wide variety of formats.
Consider the data generated by technologies that have emerged in the last decade - such as RFID, vehicle telemetry, and mobile call-logs - plus the huge volume of transactions and billing activity that now takes place entirely online.
This, and a confluence of other recent developments, mean it may be time to reappraise storage outsourcing options, with as much as 50 per cent of enterprise data likely to be handed over to accredited third-party storage management specialists.
Such a prospect represents a significant change in attitude. Since the dot-com collapses of 2001 and 2002, consigning data across private or public networks to the care of a third-party storage provider acquired something of a tarnished reputation. Many CIOs endorsed the mantra that "data is your company's most valuable asset - it should not be put at risk". Retaining ownership of data also meant that it could be mined to yield useful statistical intelligence about the business, and correlated, using smart tools, to reveal hidden revenue opportunities.
It sounded convincing, but this notion did not take into account the data deluge that buoyant economies generate. Many commercial sectors, such as retail, found that they didn't need to mine data for these hidden revenue opportunities; they had their work cut out coping with the demands of core business profitability. Meanwhile, they were still hoarding huge volumes of data that were growing at an exponential rate, quarter-on-quarter, year-on-year.
"In the recent economic downturn data growth levels have continued to rise," says Carla Arend, programme manager for storage, software and service research at analyst IDC. "This is something that just hasn't happened before. Usually recessions result in a decline in the amount of data generated." This anomaly has created some significant headaches that IT strategists just hadn't planned for, she reports.
The initial way of addressing the rising data tide was to throw more storage hardware at the problem. "Throughout the 2000s storage devices became cheaper and easier to manage, so they were increasingly added to the infrastructure to contain data growth in the short term," Arend adds. "However, over the last two years this has changed. Corporate data centres are reaching a physical limit in terms of how much more hardware they can accommodate. Also, because users have periodically expanded, buying storage devices piecemeal according to the best deals available at the time, they've now ended up with heterogeneous networks of storage devices that are proving very onerous to manage." This rather messy situation is one of the reasons why handing the whole storage caboodle to someone else to sort out is becoming more attractive.
Indeed, we are entering a "second wave of outsourcing", according to Phil Evans, UK strategy manager at data management specialist i365.com. He believes this is the result of multiple factors that are reinforcing each other toward a so-called 'outsourcing renaissance'. "SMEs, for example, now need secure enterprise-class back-up and storage levels for a number of reasons, primarily the advent of regulatory compliance mandates. This class of operation is, however, beyond their budgets in terms of capital expenditure on equipment and IT staff resources. Outsourcing their storage needs offers an affordable solution to this dilemma."
Even in medium-to-large organisations with larger IT departments, effective storage management has been "a burden on the IT staff, preventing them from addressing more pressing operational tasks", Evans believes. "Outsourcing storage frees them to concentrate on more critical issues."
In addition, argues IDC's Carla Arend, the economic squeeze has caused CIOs to revisit a business conundrum that has been out of vogue for a decade or more: what is the true nature of our core business - and to what extent are the demands that IT operations make on resources detracting from our core expertise?
"This sort of thinking goes in cycles," Arend says. "We've seen outsourcing wave in the past, and it looks like we are possibly entering a new one."
There have always been companies offering online backup and storage of one kind or another, but the model is changing. Bandwidth constraints once made online backup and storage a slow process, even if an organisation had expensive dedicated high-speed circuits at its disposal. "The time taken uploading data from the customer to the outsourcer's servers was less of an issue than the time taken to restore that data in an unplanned recovery situation," explains Evans. "Now compliance mandates that restore times are faster."
The increased availability of high-capacity broadband circuits has helped this, but with even the fastest circuits open, backing up the many terabytes of data generated by a medium-to-large business within a 24-hour period is a challenge.
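A rough back-of-the-envelope calculation shows why even fast circuits struggle with a daily full backup. The 10-terabyte volume and 1Gbit/s circuit below are illustrative assumptions, not figures from the providers quoted here:

```python
def transfer_hours(data_tb: float, link_gbps: float, efficiency: float = 1.0) -> float:
    """Hours needed to move data_tb terabytes over a link_gbps circuit.

    efficiency models protocol overhead (1.0 = an ideal link)."""
    bits = data_tb * 1e12 * 8                      # decimal terabytes -> bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 3600

# A full backup of 10 TB over a dedicated 1 Gbit/s circuit:
print(round(transfer_hours(10, 1.0), 1))   # ~22.2 hours - barely inside a 24-hour window
```

Any protocol overhead or contention pushes the figure past 24 hours, which is why incremental transfers and deduplication, rather than raw bandwidth alone, made the model practical for larger data volumes.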
"The outsourced storage model existed, but the technology wasn't ready for it," explains Tony Reid, services director for Hitachi Data Systems UK. "Issues around bandwidth availability and costs, latency, and cost made it unfeasible for enterprise-sized data volumes. Only small businesses were in a position to benefit."
At the end of the 1990s the mindset was that outsourced backup and storage should capture all of the data that an organisation generated in one fell swoop - including critical and non-critical volumes, intermediate and duplicate copies, and even data earmarked for deletion. However, in recent years IT managers have discovered smarter approaches to classifying their data, and applying a tiered hierarchy of importance to it.
"Organisations are looking at these massive amounts of data, and they are assigning status to it," says i365.com's Evans. "The most valuable data assets they will still be unwilling to store outside the owned enterprise IT infrastructure… but some of the less critical data - data that may be even years old, and is not critical to the day-to-day running of the business, but must be retained for reference purposes - could be assigned to an outsource service."
He also suggests that having to examine the nature of enterprise data for storage purposes has, inadvertently, provided illuminating insights for many organisations into the efficiency of their IT systems, and the ways in which data outputs help or hinder critical operations.
The categories of data assigned to outsourced storage are not necessarily fixed. As Hitachi Data Systems' Reid points out, organisations may trial a new application using a hosted model for both the application and storage; however, if the application proves successful, and its operations acquire critical importance, then the decision might be made to bring both it, and storage of the data it generates, in-house.
"There's always a concern about critical structured data being housed outside of the enterprise," believes Reid, "but there has been a big shift of attitude toward outsourcing storage of unstructured data. And this is the type of data that is growing at three-to-four times the rate of the more important stuff."
Without contradicting Reid's view, Carla Arend notes that there will be instances where an organisation might outsource storage of primary data, and that is when it has outsourced the primary application itself: "In the end it's less to do with possession of data than the speed at which it can be restored and accessed in a disaster recovery scenario."
Service level agreements
Another factor increasing confidence in storage outsourcing is the service assurance offered by the outsource service providers, and the changing nature of service level agreement (SLA) models.
"In the past there was a tendency for SLA negotiations to drag-on over a period of time, so that by the time they were eventually signed off the customer's requirement had changed, and the process had to be revisited," says Arend. "Now both parties are more pragmatic in their expectations. SLAs have to reflect the regulatory stipulations that enterprises must comply with", and now that the outsource providers are also subject to regulation, such as Statement on Auditing Standards No 70 in the US (commonly abbreviated as SAS 70), and the UK's AAF 01/06, which provides extra reassurance to the outsource option.
From the service providers' perspective, this is a market worth investing in. Hitachi Data Systems' Reid admits that the storage hardware provider is looking into the potential of entering the outsourced storage sector at some future point. "Our products are found in the 'engine room' of storage provision, and we know a lot about what the market needs."
There's no evidence that escalating data volumes will flatten out any time soon - Phil Evans estimates customer data growth rates of around 20 per cent per annum - although technologies like deduplication may slow the rate of increase. This growth also means that, starting with a basic value proposition - "we will store your data, back it up for long-term preservation if required, and commit to restoring it in accordance with SLAs" - storage outsourcers are in a position to offer value-added services.
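Deduplication, mentioned above as a brake on data growth, works by storing identical blocks only once and keeping references to the original copy. A minimal sketch using content hashing (the block contents are made up for illustration):

```python
import hashlib

def dedup_ratio(blocks: list[bytes]) -> float:
    """Fraction of logical blocks that need physical storage after
    content-hash deduplication (identical blocks are stored once)."""
    unique = {hashlib.sha256(b).hexdigest() for b in blocks}
    return len(unique) / len(blocks)

# Ten logical blocks, but only three distinct contents:
blocks = [b"alpha", b"beta", b"alpha", b"gamma"] + [b"alpha"] * 6
print(dedup_ratio(blocks))  # 0.3 - 70 per cent of the raw volume never hits disk
```

Backup streams are highly repetitive (successive full backups share most of their content), which is why deduplication can reduce what actually crosses the wire to an outsourcer far more than it reduces the logical data an organisation holds.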