Why designing your data storage architecture is important

19 December, 2016

Miguel Vega is Vice President Systems Hardware at IBM Middle East and Africa.

Businesses should start paying more attention to data storage as part of their data strategy. As I see it, the analytical tools we use to extract insights from data, the new natural resource, get most of the love and seem directly responsible for driving business value.

But while they do have an important role to play, they should not get all the credit.

Data is the new basis of competitive advantage, but it is meaningless if you cannot access or analyse it in a timely manner. So, here are three ways your organisation’s data storage system can also actively boost business performance. If you are not getting these benefits, it is time you took a closer look at where your data lives.

#1 Turbo-charged functionality

The data storage system is not a passive element of your wider strategy. The functionality of today’s platforms makes them an active contributor to the process of analysing information and deciding what to do next. Virtualisation, a part of software defined storage, is one of the most significant features helping to make this happen.

Software defined storage works by pooling numerous physical storage networks together, so these systems and the data they hold can be managed from one central place. Backup and recovery become faster and easier as a result, and you gain a single, unified management layer for your data lakes.
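To make the pooling idea concrete, here is a minimal, hypothetical sketch of a virtualisation layer that presents several physical arrays as one logical pool. The class and array names are illustrative only; a real software defined storage product does this at far greater scale and complexity.

```python
# Hypothetical sketch: several physical arrays managed as one logical pool,
# the core idea behind storage virtualisation in software defined storage.

class PhysicalArray:
    def __init__(self, name, capacity_tb):
        self.name = name
        self.capacity_tb = capacity_tb
        self.used_tb = 0.0

class VirtualStoragePool:
    """Presents many physical arrays as a single management layer."""

    def __init__(self, arrays):
        self.arrays = arrays

    @property
    def total_capacity_tb(self):
        return sum(a.capacity_tb for a in self.arrays)

    @property
    def free_tb(self):
        return sum(a.capacity_tb - a.used_tb for a in self.arrays)

    def provision(self, size_tb):
        """Place a volume on whichever array has the most free space."""
        target = max(self.arrays, key=lambda a: a.capacity_tb - a.used_tb)
        if target.capacity_tb - target.used_tb < size_tb:
            raise RuntimeError("pool exhausted")
        target.used_tb += size_tb
        return target.name

pool = VirtualStoragePool([PhysicalArray("san-a", 100), PhysicalArray("nas-b", 50)])
print(pool.total_capacity_tb)  # 150: both arrays appear as one pool
print(pool.provision(40))      # san-a: placed on the array with most free space
```

The application only ever talks to the pool; which physical box actually holds the data becomes an internal detail, which is what makes unified backup and management possible.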

An all-flash array is another key feature of a modern high performance storage system. The technology uses silicon memory chips to store data for writing and reading, and is considered an upgrade from the mechanical hard disk drive.

All flash has been shown to reduce both latency and seek time by orders of magnitude, from millisecond to microsecond latency, meaning faster data access and analysis.
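Some rough, back-of-the-envelope arithmetic shows why that order-of-magnitude shift matters. The latency figures below are illustrative assumptions, not measurements; real numbers vary by device and workload.

```python
# Illustrative arithmetic only; actual latencies vary by device and workload.
hdd_latency_s = 5e-3      # assume ~5 ms per random read on a mechanical disk
flash_latency_s = 100e-6  # assume ~100 microseconds per random read on flash

serial_reads_hdd = 1 / hdd_latency_s      # serial random reads per second
serial_reads_flash = 1 / flash_latency_s

print(f"HDD:   {serial_reads_hdd:,.0f} serial random reads per second")
print(f"Flash: {serial_reads_flash:,.0f} serial random reads per second")
print(f"Speed-up: {hdd_latency_s / flash_latency_s:.0f}x")
```

Under these assumptions a single serial stream of random reads goes from roughly 200 operations per second on disk to 10,000 on flash, a 50x difference before any parallelism is considered.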

#2 Simple, seamless scalability

Software defined storage solutions provide open standard access via POSIX, NFS, SMB/CIFS, Hadoop HDFS, object storage and OpenStack Swift, enabling seamless data connectivity to applications across private, hybrid or public clouds.

Next, straightforward scalability allows your storage infrastructure to grow as fast or as slowly as demand does. The most effective systems provide vertical scalability by allowing expansion enclosures to be added dynamically, which minimises downtime and the related financial losses.

For big data and analytics workloads that demand near-unlimited scalability, software defined storage lets you quickly build out and scale up your storage infrastructure to be even faster, more efficient and more secure. Software defined storage also makes scalability possible when you need to combine an external storage source with your primary system.

When an external disk system is virtualised, its storage capacity inherits the same functionality and usability as your main system. This is even more reason to build your strategy around the best infrastructure available and scale toward zettabyte-class data volumes.

#3 Continuous high availability

A data storage system that drives real business value must offer continuous high availability, without downtime or any single point of failure. The value here is tied to brand equity; in other words, the value of customers’ perception of your brand.

Customers may lose trust and choose not to return if downtime is a common occurrence, which erodes your revenue while strengthening your competitors. Storage systems prevent downtime in many ways.

For example, automated path failover support recognises a failed command and resends it down an alternate data path, so the application can keep running. There is also redundant hardware, which provides a backup of vital system components should the primary fail. Alongside this, hot-swappable components enable you to replace failed parts without shutting down the entire system.
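The failover logic described above can be sketched in a few lines. Everything here is hypothetical (the path names, the `PathError` exception and the `send_over_path` stand-in are not a real storage API); the point is only the shape of the retry-down-an-alternate-path pattern.

```python
# Hypothetical sketch of automated path failover: a failed command is
# resent down an alternate data path so the application keeps running.
# PathError, send_over_path and the path names are illustrative only.

class PathError(Exception):
    pass

def send_over_path(path, command):
    """Stand-in for an I/O driver call; fails on paths marked ':down'."""
    if path.endswith(":down"):
        raise PathError(f"path {path} failed")
    return f"{command} completed via {path}"

def send_with_failover(paths, command):
    """Try each path in order, falling back to the next on failure."""
    last_error = None
    for path in paths:
        try:
            return send_over_path(path, command)
        except PathError as err:
            last_error = err  # record the failure and try the alternate path
    raise RuntimeError("all paths failed") from last_error

result = send_with_failover(["fc0:down", "fc1:up"], "READ block 42")
print(result)  # READ block 42 completed via fc1:up
```

Because the retry happens inside the I/O layer, the application issuing the read never sees the failed path at all; it simply receives its data a little later than usual.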

I hope the above clearly demonstrates just how beneficial a dynamic data storage system is, especially as businesses are now in competition to do more with the data they hold. Whether you already have a coherent data strategy or plan to build one, it would be wise to make storage a priority, rather than a secondary feature.

After all, data is a precious commodity in the new cognitive and cloud era, so why would you want to waste it? Disk based storage systems are not your only option.


Using flash, moving to software defined storage, and ensuring high availability are important aspects of designing an enterprise data storage architecture, according to Miguel Vega at IBM. This blog may have been edited for style and conciseness.
