It all comes down to creating a disciplined way of thinking about file data storage.
You may want to keep all of your information at your fingertips while a project is active, so you create a simple naming scheme and folder structure. Then you begin filing data into each of those folders and subfolders. You might even hoard files. A backup and tiering strategy may cross your mind, but in reality you ignore it and push it off as something to be done later, reasoning that cost is relative and you still have room on your SSD … and so it goes. It is easy to forget how much the company invests in data infrastructure and what it costs to expand that infrastructure in response to user demand.
In this way, we make file data storage as automated and invisible to users as we can. But we also need to instill a sense of ownership. Without one, end users will abuse the process and rely solely on automated rules to move data to less expensive storage, and over time that costs more money.
NTP Software’s CEO recorded a YouTube video that explains things very succinctly. His discussion puts file data storage in perspective for the average user. It is an excellent primer for educating employees and clients alike about the value and use of a file data storage system. It answers the question, “Why should we care?”
As he explains, we are facing three primary challenges in the storage of data:
Growth – the volume of data, the backup capacity it demands, and the infrastructure needed to house it will all increase.
Cost – the need for capacity will grow, but the price of additional capacity is not falling as fast as demand is rising, so total storage costs will climb.
Technology – the need for new storage technology will drive up research and development costs.
This is what we are facing in the way of data growth: in the next five years, the amount of data we have to manage will more than triple, to roughly 35 zettabytes, and individuals and corporations will generate about 80% of that information.
If you consider the typical automated tiering architecture at most corporations, we are talking about a mix of solid state drives (SSDs), advanced-technology HDDs, and high-capacity HDDs. Tier 1 is primary-access flash storage, where live data for current projects lives. Tier 2 holds recent file data plus active stub files that point to large files moved to higher-numbered, less costly tiers. Tier 3 and above hold dormant file data and inactive stub files. The higher the tier number, the longer the access time.
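The tier placement described above can be sketched as a simple age-based policy. This is a minimal illustration only; the thresholds, tier labels, and the last-access criterion are assumptions for the sake of the example, not NTP Software's actual rules.

```python
from datetime import datetime, timedelta

# Hypothetical age thresholds (days since last access) for each tier.
# Real tiering products apply richer policies (file size, type, owner, quotas).
TIER_THRESHOLDS = [
    (30, "Tier 1: SSD flash (live project data)"),
    (180, "Tier 2: performance HDD (recent data, active stubs)"),
]
DORMANT_TIER = "Tier 3+: high-capacity HDD (dormant data, inactive stubs)"


def tier_for(last_access: datetime, now: datetime) -> str:
    """Pick a storage tier based on how long ago the file was last accessed."""
    age_days = (now - last_access).days
    for max_age_days, tier in TIER_THRESHOLDS:
        if age_days <= max_age_days:
            return tier
    return DORMANT_TIER


now = datetime(2024, 1, 1)
print(tier_for(now - timedelta(days=5), now))    # recently touched -> Tier 1
print(tier_for(now - timedelta(days=400), now))  # dormant -> Tier 3+
```

In a production system this decision runs continuously in the background, which is exactly what makes tiering invisible to users.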
The cost of storage will rise, as I said earlier, and this graphic shows by how much.
Notice that the overall cost rises more slowly in the later years. This reflects a moderate drop in hardware prices: the cost per gigabyte is expected to decrease by 3% to 15%.
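A back-of-the-envelope calculation shows why total spend still climbs even as the price per gigabyte falls. The specific numbers here are assumptions chosen for illustration: capacity triples over five years (per the growth figure above) and the price per gigabyte falls 10% per year, the middle of the 3%–15% range.

```python
# Assumed inputs: capacity triples over five years, $/GB falls 10%/year.
capacity_growth = 3.0 ** (1 / 5)  # annual capacity multiplier
price_decline = 0.90              # annual $/GB multiplier

spend = 1.0  # normalized year-0 storage spend
for year in range(5):
    spend *= capacity_growth * price_decline

print(f"Year-5 spend vs. year 0: {spend:.2f}x")  # ≈ 1.77x
```

Under these assumptions, spending still rises by roughly 77% over five years, because demand growth outruns the hardware price decline.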
NTP Software has a solution that simplifies these storage challenges. It is an automated, policy- and user-driven solution that makes managing data secure and straightforward. DefendX Software VFM™ gives you important features and benefits that go beyond ordinary file migration; several of them are patent-protected and therefore available nowhere else. Click here to learn more about NTP Software VFM.