In industries with some of the biggest data storage challenges — broadcast, media, retail, geospatial, and many others — many companies don’t refer to their data as “data” anymore. They refer to it as “content,” “assets,” or “information.”
These companies recognize that their employees or clients can use this digital information to add value to their businesses: to gain insights that grow revenue, speed time to market, sharpen competitive advantage, and improve margins. This shift points to a broader trend of treating storage as a way to enable the business, rather than just as an expense.
Many of our customers are among these leading-edge companies, and here are some key lessons drawn from their experiences:
Tip #1: Build storage solutions around the "workflow" of the data or information. It sounds simple, especially since the concept of a data workflow applies to almost any application. A typical workflow is the ability to ingest data quickly without losing information, share it (often globally), and preserve it according to business-line needs. Yet many companies still take a one-size-fits-all approach to their data. At a large retail company, for example, the videos and images used for advertising are often stored and backed up with the same tools as a large database or Exchange servers. A better approach is to look at each data type individually and design storage solutions around its workflow and the needs of its users.
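As a rough sketch of this per-workflow thinking (the data types, policy fields, and retention values below are illustrative assumptions, not taken from any specific deployment), the design exercise can start with an explicit mapping from each data type to its own ingest, sharing, and retention requirements, instead of one default backup policy for everything:

```python
# Illustrative sketch: map each data type to its own workflow policy
# rather than applying a single one-size-fits-all backup scheme.
# All data types, tiers, and retention values here are hypothetical.

WORKFLOW_POLICIES = {
    "advertising_video": {"ingest": "high-throughput", "share": "global", "retain_years": 10, "tier": "object"},
    "transactional_db":  {"ingest": "low-latency",     "share": "local",  "retain_years": 7,  "tier": "block"},
    "email":             {"ingest": "continuous",      "share": "local",  "retain_years": 5,  "tier": "nas"},
}

def policy_for(data_type: str) -> dict:
    """Return the storage policy for a data type, or a conservative default."""
    default = {"ingest": "standard", "share": "local", "retain_years": 7, "tier": "nas"}
    return WORKFLOW_POLICIES.get(data_type, default)
```

The point of making the mapping explicit is that each data type's policy can then be reviewed and changed with its business owners, rather than being buried in a shared backup configuration.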
Tip #2: Storing more data for longer periods (without breaking the bank) enables users to get more value out of that data. Research data, such as the output of a genome sequencing project, offers a great example of this principle. When research is completed, it is sometimes not immediately clear how valuable the results will be; years later, however, the work may lead to an important discovery. Storage planning needs to account for this unpredictable value. Geospatial content offers another example: a satellite image captured today may help support a mission or an initiative 10 years later. Again, the value of data over time is unpredictable.
There is only one way to store more data for longer without breaking the bank, and that is with tiered storage. The majority of digital content, now and for the foreseeable future, is "unstructured": big files that do not lend themselves to typical data reduction tools such as deduplication or compression. Most of this content needs to remain available and accessible, but may not be touched for days or even years at a time. Using fast storage such as flash for production work, and low-cost, low-power storage such as tape to archive the massive amounts of unstructured content, is generally the right approach. Other storage technologies, including NAS (network-attached storage), block storage, object storage, and cloud storage, can play a role in most of these workflows as well. In fact, we expect the cloud to be part of the majority of these tiered workflows in the future.
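A simple way to express the tiering idea above is an age-based placement rule. The thresholds and tier names below are illustrative assumptions; real policies would also weigh file size, business line, and retrieval-time requirements:

```python
def choose_tier(days_since_last_access: int) -> str:
    """Pick a storage tier by access recency (hypothetical thresholds)."""
    if days_since_last_access <= 30:
        return "flash"   # active production work needs fast storage
    if days_since_last_access <= 365:
        return "object"  # nearline content, still directly accessible
    return "tape"        # cold archive: low cost, low power
```

Run periodically against an asset catalog, a rule like this keeps recently used content on fast media while the long tail migrates to cheaper tiers.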
Tip #3: Users can only create value from data they can access. Once you have designed a storage solution around the workflow of your digital assets, and a tiered solution that lets the business store more data, the next step is to make sure users (film editors, designers, marketing managers, scientists, analysts) can get to that data when they need it, without going through IT. Technologies now exist to share data across sites, even globally, and to store it in open, easily accessible formats. Object-based cloud storage and open formats such as LTFS for tape help facilitate this sharing.
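Part of what makes LTFS useful for self-service access is that it presents tape contents as a standard filesystem, so archived assets can be browsed with ordinary file APIs. As a minimal sketch (the mount point and filenames are hypothetical), a user could locate archived assets with nothing more than a glob:

```python
from pathlib import Path

def find_assets(root: Path, pattern: str) -> list[str]:
    """List archived assets under a mount point matching a glob pattern.

    With LTFS, `root` could be a tape mount (e.g. a hypothetical
    /mnt/ltfs); no proprietary backup tooling is needed to read it.
    """
    return sorted(p.name for p in root.glob(pattern))
```

Because the format is open and self-describing, the same files remain readable even if the software that wrote them is long gone.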
Eric Bassier is Director of Data Center Product Marketing at Quantum. His writing has appeared in Data Center Post, Data Center Knowledge and Wired.