How File Data Storage Tiering Is Making Your 2015 Better

Do you remember a time when you walked into class and you were the only one who had done the reading? It feels pretty good to have that leg up on everyone else. And you were probably the teacher’s favorite that day, too. Getting ahead of your file data problem in 2015 offers the same kind of edge.

When you think about the big, high-level IT concerns, your first thought probably isn’t the lowly file. You know: a Word document or an MP3. Why would you jump there first? Because unstructured data (i.e., files) is the largest area of data growth on the planet right now, and storage tiering is how you get ahead of it.

eWeek reports in its article “Five Trends That Will Drive Data Storage” that, “According to IDC, more than 90 percent of the world's data is unstructured. This means we're moving from terabyte to petabyte to exabyte levels of data and higher. In 2012, the amount of global data reached roughly 2.7 zettabytes—and IDC predicts that number will double again by 2015 and continue to double every two years after that.” That is literally exponential growth.

A lot has changed in information technology in the past 10 years, and that change has driven a massive increase in new files being created, handled, shared, and stored. Ultimately, these files are left to sit on the expensive disk you keep buying to make sure people can access the files they need, when they need them.

And that’s the real problem with the way most companies deal with their storage: instant access, incremental backups, snapshots, and still more backups are only needed for files that are changing, not for the static, old, ever-growing backlog of files that people have already forgotten about. That backlog is what drives the cost of storage management up even as the cost per GB of physical devices goes down. Over the life of your hardware, operational expenses run 7 to 12 times the cost of acquisition; on $100,000 of disk, that is $700,000 to $1.2 million spent just keeping it running.

Deleting the files simply won’t do. Even though 98% of these files will NEVER be touched again once they reach the “inactive” phase of their lives (90 days without being touched), the other 2% WILL be needed (though two-thirds of the time, only for reference). Worse still, should there ever be a legal reason to produce old files, you had better hope you still have them and can find them fast. This means you need to hold on to these files, but you don’t need to constantly back them up, and constant backup is where the real cost of storage infrastructure lies.
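How would you even find the inactive files in the first place? As a rough, hypothetical illustration (a minimal sketch, not DefendX’s method), the script below walks a directory tree and flags files whose last access time is more than 90 days old. The root path and threshold are placeholder assumptions.

    import os
    import time

    INACTIVE_DAYS = 90                       # the 90-day threshold cited above
    ROOT = "/mnt/fileshare"                  # hypothetical share to scan
    CUTOFF = time.time() - INACTIVE_DAYS * 86400

    def find_inactive(root):
        """Yield paths whose last access time is older than the cutoff."""
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    if os.stat(path).st_atime < CUTOFF:
                        yield path
                except OSError:
                    continue                 # unreadable or vanished; skip it

    if __name__ == "__main__":
        for path in find_inactive(ROOT):
            print(path)

One caveat: many filesystems mount with relatime or noatime, so last-access times can be stale or frozen; purpose-built tiering products rely on dedicated metadata scanning rather than a script like this.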

Moving the files to a new place (somewhere they won’t be constantly backed up) seems like a simple solution. And it is, though perhaps deceptively so. According to Robert Glass in his book Facts and Fallacies of Software Engineering, "For every 25 percent increase in problem complexity, there is a 100 percent increase in complexity of the software solution." This has been dubbed Glass’ Law, and it holds true. (Read as a scaling law, it implies that solution complexity grows roughly with the cube of problem complexity, since 1.25³ ≈ 2.)

Moving old file data, while retaining the ability for people to get to it when they need it, is a problem that has simply been beyond the scope of traditional methods. It stumped the industry for years because of the complexity a real solution entails.

This is why solutions that aren’t purpose-built for the task of explicitly tiering unstructured (file) data get it wrong. It’s an intricate, looming problem that needs to be addressed directly.

For instance, a solution that doesn’t maintain a catalog of tiered files (i.e., a searchable database containing each file’s metadata and extended metadata) can lose files with no way to find them short of searching everywhere, a time-consuming and expensive process. A solution also needs to meet the changing requirements of the environment, whether that means the kind of stubs left behind after a file is moved, the option to tier without stubs at all, or the policies that govern which files move in the first place, and where they go. A sketch of what such a catalog might record follows.
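To make the catalog idea concrete, here is a minimal sketch of what such a database might record, using SQLite from Python’s standard library. The table and column names are illustrative assumptions, not DefendX’s actual schema.

    import sqlite3

    # One row per tiered file, so any file can be located by its original
    # path or its metadata without crawling every storage tier.
    conn = sqlite3.connect("tier_catalog.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS tiered_files (
            id            INTEGER PRIMARY KEY,
            original_path TEXT NOT NULL,     -- where the stub (if any) lives
            tier_location TEXT NOT NULL,     -- where the data actually went
            size_bytes    INTEGER,
            owner         TEXT,
            moved_at      TEXT,              -- ISO-8601 timestamp of the move
            extended_meta TEXT               -- JSON blob of extended metadata
        )
    """)
    conn.execute(
        "CREATE INDEX IF NOT EXISTS idx_original_path "
        "ON tiered_files (original_path)"
    )
    conn.commit()

With an index on the original path (and on whatever metadata people actually search by), “where did that file go?” becomes a quick query instead of a crawl of every tier, which is exactly what legal discovery demands.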

The most important thing to realize about storage tiering is that a solution already exists. While many get it wrong, there are viable options on the market today. And when implemented well, tiering solves the file data problem: it allows seamless access to old files when they’re needed, without constantly backing those files up. It makes file discovery for legal purposes easy, and it can integrate retention and deletion policies to maximize efficiency. It doesn’t limit you to your own datacenter, either: it has the flexibility to move files out to the cloud if desired.

Implementing a tiered storage strategy in 2015 could make a huge difference in your storage expenditures. One of our clients, a multinational automotive manufacturer, projected a first-year cost reduction of 80%. Can you think of better uses for that kind of money?

This is the opportunity you have right now. And you can have it with DefendX Software Mobility.

What is it? Put simply, DefendX Software Mobility is a policy-based file movement system with a searchable catalog of all of your tiered files. When it’s time to bring a file back to an end user, it grabs the file and presents it.
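To give a feel for what “policy-based file movement” means, here is a deliberately simplified sketch of a tiering pass and a recall: move a matching file to a cheaper tier, record it in a catalog, and leave a stub behind. Every name here is a hypothetical placeholder; DefendX Software Mobility’s actual mechanics differ.

    import os
    import shutil

    STUB_MARKER = "TIERED:"                  # hypothetical stub format

    def tier_file(path, tier_dir, catalog):
        """Move one file to the cheaper tier, catalog it, and leave a stub."""
        dest = os.path.join(tier_dir, os.path.basename(path))
        shutil.move(path, dest)              # relocate the real data
        catalog[path] = dest                 # remember where it went
        with open(path, "w") as stub:        # a tiny stub keeps the namespace intact
            stub.write(STUB_MARKER + dest)

    def recall_file(path, catalog):
        """Bring a tiered file back when an end user needs it."""
        dest = catalog.pop(path)
        os.remove(path)                      # drop the stub...
        shutil.move(dest, path)              # ...and restore the original data

Combined with an inactivity scan like the one sketched earlier, a nightly pass could simply call tier_file() on every path the scan yields, with policy rules (age, size, owner, file type) deciding what qualifies and where it goes.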

Request a Demo
