The resolution at which we view content has increased at a staggering rate in recent years. Whether on the big screen or the small one, a smartphone or a tablet, the picture is clearer than ever. Technology improvements across the board, from cameras to playback devices, have made this possible. The storage capacity required to manage, manipulate, and play back content has also increased, but it’s not just about rising resolutions. Consider the raw uncompressed content that is ingested, the need for several editors or artists to access content at once, and the additional copies that must be stored onsite and offsite for backup and disaster recovery (DR). The result is an environment that can be tedious and expensive to manage and support.
In post-production, infrastructure purchases are generally tied to a project. The need to remain cost-competitive leads many post-production facilities to settle for heterogeneous, siloed storage environments, each of which must be learned, managed, and supported by engineering. This model quickly becomes outdated and does not scale well. Some larger organizations have adopted a single-vendor or tiered approach, but they often have to move to a larger system as capacity or performance maxes out. When they do, the software is essentially purchased again and the hardware is swapped out to improve performance and scalability. As a result, it has become difficult for post-production facilities to hold onto content for long-term preservation and re-use. And while clients expect you to keep content past the end of a project, they often aren’t willing to pay to maintain that active archive.
Post-production facilities of all sizes want to provide active archive access with automated, hassle-free data protection built in. Many are also trying to enable streaming and OTT (over-the-top) use cases. These requirements, combined with the deficiencies of traditional storage technologies, are leading IT decision makers at many post-production houses to take a closer look at object storage.
Reasons for the interest in object storage solutions include:
- Ability to change protection schemes from replicas to erasure coding
- Optimization of performance or storage footprint based upon production needs
- Direct access via HTTP with streaming enabled through range read
- Support for many users and many parallel sessions
- Built-in content management
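The range-read point above is worth unpacking: because objects are served over plain HTTP, a player can request just a byte window of a file and the server answers with 206 Partial Content, which is what makes seek-and-stream playback possible without downloading the whole asset. A minimal sketch in Python, assuming only a generic HTTP-accessible object URL (the URL below is a placeholder, not a real endpoint):

```python
import urllib.request

def range_header(start, end):
    # Build the standard HTTP Range header for bytes [start, end] inclusive
    return {"Range": f"bytes={start}-{end}"}

def fetch_range(url, start, end):
    """Fetch one byte window of an object; a range-aware server
    replies 206 Partial Content with only the requested slice."""
    req = urllib.request.Request(url, headers=range_header(start, end))
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example (placeholder URL): grab the first 1 MiB of a mezzanine file
# data = fetch_range("http://storage.example.com/bucket/clip.mov", 0, 1024 * 1024 - 1)
```

A playback client simply issues these requests for successive windows as the viewer scrubs or streams, so no special protocol or download step sits between the archive and the screen.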
Best-of-breed object storage solutions enable pay-as-you-grow scaling on any mix of standard servers, which can reduce overall storage TCO by as much as 75% compared to traditional storage arrays.
If you are interested in learning more, check out our webcast Real Life Use Cases for Object Storage, M&E on Thursday, March 16 at 10 am PT/1 pm ET with Caringo VP of Marketing Adrian Herrera and VP of Product Tony Barbagallo.
You can also read how NEP The Netherlands is using object storage to power their content delivery needs.