Claranet | Data Management White Paper

Improving Data Management for Infrastructure Transformation

2000s: The SAN & Virtualisation

By the 2000s, networking and IT use was the norm. IT had become a consumable product of every business, and consumers were 'touching' suppliers through technology in a way that had never existed before.

• Applications, data and business systems were being exposed to external users, introducing external data security threats.
• Data storage capacity requirements increased dramatically.
• Regulatory and statutory compliance requirements increased.
• Cost and risk were going through the roof, growing unabated.
• IT departments wrestled some control back with the advent of Storage Area Networks (SANs) and virtualisation.

Businesses recognised the huge inefficiencies of decentralised spending and the IT resources deployed alongside it. The business wanted to aggregate economies but retain agility. SANs and virtualisation helped, but IT departments were by now so divorced from a full understanding of the business's needs and its application of IT that a 'one class fits all' approach was largely adopted.

• Data and compute were re-centralised into a more controlled datacentre environment, with centralised spending and a distributed 'cross-charge' model.
• Data was stored on expensive centralised storage arrays designed predominantly to serve the highest demand rather than a range of demands.
• Some control of operational process and policy was reclaimed.
• Some data security and data protection risks were reduced.
• There was limited understanding of the varied use cases for data and their different IOPS, availability, recovery point and recovery time requirements, resulting in unnecessary costs.
• Business units were still largely in charge of application and database development, causing copy data to soar.

Today: The Cloud

Today, terabytes have become petabytes. The idea of classes of infrastructure service for different applications is starting to take hold, but it is far from mature.

• Very few organisations have got a grip on the legacy environment they created.
• Data services to the business remain poorly aligned, and the service delivery model is still very immature.
• The cloud has introduced a whole new set of 'shadow IT' temptations for the business, and with it a whole new set of risks.
• Both IT and the business have ignored their compliance responsibilities, each unconsciously pointing the finger at the other.
• The business doesn't understand how things are stored, or where.
• The business expects IT to make the best business decisions with respect to cost and risk. However, the IT department has no idea what the data is, what it contains, what is important and what is not, nor what impact any regulatory or statutory compliance requirement might have on the business's ability to present that data.
• The IT department simply stores everything forever, just in case, inadvertently putting the business at greater risk (if the data exists, you must be able to find it).
• Traditional data storage approaches burden the business with enormous unnecessary cost.
• 'Copy data' introduces a lack of control over data consumption and poor service alignment.

Over the last thirty years, we've created a number of challenges that need solving before organisations embark on infrastructure transformation projects. Increasingly demanding compliance requirements, coupled with rising data storage costs, leave us with a problem that only effective data management can solve.
