Last week, I blogged about the new “Data Age” where everyone and everything is connected and posed the following question:
Does location really matter anymore?
This week, let’s continue with our macro to micro look at connectivity, data and location. We left off with the question “how do you optimize data storage, protection and access?”
Location of the Data on Servers and Media
If budget and space weren’t an issue, you would have multiple copies of data in the cloud and on super-fast, on-prem SAN or NAS with some form of local redundancy. Unfortunately, this utopian approach would break the budget of just about any organization. There are numerous ways to try to reduce costs, and they all take one of two approaches: reducing the total amount of data or tiering data to a cost-effective solution.
I would argue that for most organizations (particularly those that are content-driven), reducing the amount of data is not a priority or really even an option. So let’s focus on the second option: tiering data to a cost-effective solution.
How Do You Define “Cost-Effective Solution?”
Historically, for storage, cost-effectiveness was a function of price (both to purchase and maintain) and performance (i.e., speed). However, for storage in the data age, the definition of performance needs to be further augmented to include accessibility.
The price, performance and accessibility requirements for your specific workload must be balanced so your organization can function efficiently and deliver to the end user or consumer. To achieve this balance, you must be able to continue to leverage the advancements of the underlying infrastructure, such as server efficiency and hardware density, in perpetuity.
The Importance of Intelligent Data Management
Whatever storage solution you select should enable the efficient and automated movement of data within the storage system to leverage the continued efficiency gains of modern hardware and the tiering of data to other storage (whether that be on-prem, off-prem or hybrid). Ideally, this would be done as an integrated function of the storage itself to reduce your costs, rather than paying for a separate data mover or hierarchical storage management (HSM) application. Watch out for solutions that require forklift upgrades or lock data into a specific physical location; they will ultimately hinder your ability to leverage the value of your data.
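To make the idea of automated tiering concrete, here is a minimal sketch of an access-age-based tiering policy. The tier names, thresholds and `DataObject` structure are all hypothetical illustrations, not the behavior of any particular product; real systems would also weigh factors like object size, access frequency and retrieval cost.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataObject:
    name: str
    last_access: datetime
    tier: str = "nvme"  # assume new data lands on the fast tier

def target_tier(obj: DataObject, now: datetime) -> str:
    """Pick a tier from how recently the object was accessed.

    Thresholds are illustrative placeholders.
    """
    age = now - obj.last_access
    if age < timedelta(days=30):
        return "nvme"            # hot: keep on fast, expensive storage
    if age < timedelta(days=365):
        return "object-onprem"   # warm: cheaper on-prem object tier
    return "cloud-archive"       # cold: lowest-cost tier, still online

def rebalance(objects: list[DataObject], now: datetime) -> list[tuple[str, str, str]]:
    """Move each object to its target tier; everything stays accessible."""
    moves = []
    for obj in objects:
        dest = target_tier(obj, now)
        if dest != obj.tier:
            moves.append((obj.name, obj.tier, dest))
            obj.tier = dest
    return moves
```

Run periodically, a policy like this keeps hot data on fast media while cold data drifts to cheaper tiers, which is the kind of work you want the storage system to do for you rather than a bolt-on HSM application.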
3 Considerations for Data Storage in the New Data Age
As the end of 2020 comes in sight and we start to plan for 2021, many organizations are recognizing that their notion of location in relation to data will change. To summarize, the new considerations are:
- All data must be online, searchable and accessible for end-users and applications
- Where data resides must meet the requirements for accessibility within budget
- The selected solution should leverage the advancements of the underlying infrastructure
If these issues resonate with you, please register for our October Brews & Bytes webcast titled Storing Data: When Does Dense Make Sense? Brews & Bytes ep 11. Eric Dey, Caringo Head of Product, and Scott Hamilton, WD Sr. Director of Product Management and Marketing, will be joining me. With more than 36% of the world’s data residing on WD products and Caringo’s 15 years pioneering object-based data storage and intelligent data management for content access, delivery and archive, it is sure to be a lively and interesting discussion!