You’ve gotten into the habit of simply adding more servers and hard drives whenever a drive approaches the magic 85%-full mark. Early in our involvement with file data storage, that was the extent of the analysis used to decide when a new storage investment was needed. There was no practical intelligence about which file types were in the environment, or about who was storing data by the gigabyte and why. The metrics simply weren’t being collected, and what analysis was done was haphazard, with no performance parameters in mind. In short, we were flying blind: storing data that didn’t need to be stored, losing data when hard drives crashed from exceeding their safe capacity limits, and repeatedly begging the CFO for unplanned expenditures to add hardware to the storage infrastructure. IT as a cost center took on the real meaning of a money pit, with no ROI to show for the effort.
If you're new to file archiving, you probably aren't familiar with the term "stub." A stub is a file that points to a file, similar to the concept of a shortcut on a desktop. Instead of pointing to an active file, a stub points to an archived file that has been moved to a new location.
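The stub concept described above can be sketched in a few lines of code. Real tiering products implement stubs at the filesystem level (for example, as reparse points or sparse placeholder files) so that applications can open them transparently; the function below, `archive_with_stub`, is only a simplified illustration of the idea, with all names chosen for this example.

```python
import json
import shutil
from pathlib import Path

def archive_with_stub(path: str, archive_dir: str) -> Path:
    """Move a file to archive_dir and leave a small stub in its place.

    Conceptual sketch only: the stub here is a tiny JSON file recording
    where the archived data now lives, standing in for the transparent
    pointer a real archiving product would create.
    """
    src = Path(path)
    dest_dir = Path(archive_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name

    # Move the actual data to the archive location.
    shutil.move(str(src), str(dest))

    # Replace the original file with a stub that points to the archive.
    stub = {"archived_to": str(dest), "size": dest.stat().st_size}
    src.write_text(json.dumps(stub))
    return dest
```

Opening the stub yields the pointer rather than the data, which is exactly why recall (rehydration) logic is a critical feature to ask about when evaluating tiering products.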
We all know that unstructured file data is growing out of control and needs to be managed. But what is the best approach? It really comes down to understanding the numbers behind the file data so you can make the right decision for your storage environment.
It’s difficult for an administrator to keep control of network storage when space begins to dwindle. Users treat storage as an unlimited supply for all of their files, including videos, music, and personal executables.
While companies vary greatly in their approach to storage management, most, out of necessity, have some formal policies in place. Today’s disks are large and relatively inexpensive, but at any given time there is a finite amount of space to be had. Unlike electricity or bandwidth from the Internet, network storage is not renewed over time: once it is consumed, it stays consumed until something is deleted or moved.
One of the topics we’re often asked about is cloud archiving or tiering to the cloud. Enterprises want to know how it can help them and what are the key advantages and challenges. Today, we “clear the fog” on cloud archiving.
File tiering is new to most organizations, so few people are experts in the technology. In fact, many of us are unsure even of which questions to ask. It turns out there are several very important ones: a few features can make the difference between long-term success and a nightmare that gets worse every year.
COVID-19 has caused a shift toward cost-cutting among companies in industries across the board. According to a Gartner, Inc. survey of 317 CFOs and finance leaders, 62% are planning cuts; while 38% hope to avoid cuts in 2020, 18% are planning to downsize budgets in all categories. File archiving has always been an essential part of business; in this climate, it has quickly become a necessity, positioned at the front of the line for "must-have projects."
BOSTON, July 1, 2020 /PRNewswire/ -- DefendX is now seamlessly combined with Scality's industry-leading software-defined object storage portfolio.
This software combination reduces the cost of traditional data storage by 70% or more. Customers can identify aged unstructured data and set policies to transfer this data into the cost-effective Scality RING. The solution delivers an instant return on investment and provides limitless scale with cloud-like economics.
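The core of the policy described above is identifying aged, inactive files. A minimal sketch of that scan is shown below; the one-year threshold, the function name `find_aged_files`, and the use of last-access time are all assumptions for illustration, not details of the DefendX/Scality product, which tracks access patterns far more robustly and handles the actual transfer into the RING.

```python
import time
from pathlib import Path

# Hypothetical policy: files untouched for a year are candidates for tiering.
AGE_THRESHOLD_DAYS = 365

def find_aged_files(root: str, days: int = AGE_THRESHOLD_DAYS):
    """Yield files under root whose last-access time exceeds the threshold.

    Simplified stand-in for a tiering policy engine: it walks the tree
    and compares each file's atime against a cutoff.
    """
    cutoff = time.time() - days * 86400
    for p in Path(root).rglob("*"):
        if p.is_file() and p.stat().st_atime < cutoff:
            yield p
```

Note that many systems mount filesystems with relaxed access-time updates (e.g. `relatime`), so production tools typically combine several signals (modification time, ownership, file type) rather than relying on atime alone.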
Boston, February 5, 2020 — Joe Cutroneo, CEO of DefendX Software, a worldwide leader in the management and control of unstructured file data, has been accepted into the Forbes Technology Council, an invitation-only community for world-class CIOs, CTOs and technology executives.