30 years ago to the month, CERN computer scientist Tim Berners-Lee published a proposal for what would become the World Wide Web. It took another two years for the first web page to be published by CERN. After that, things took off. A new platform for mass communication using text and images over Hypertext Transfer Protocol (HTTP) was created. Fast forward 30 years and the same protocol and platform are being used to deliver video at an astounding rate. This arguably started with YouTube (2005) but didn’t take off until the entire ecosystem of streaming services, in-home streaming players, broadband and mobile devices caught up (around 2010).
We are now all conditioned to expect the delivery of video—regardless of our location—in an “on-demand” fashion. However, many of the underlying workflows, and the applications and infrastructure that enable them, don’t natively interface with HTTP. So how do you enable HTTP in existing workflows?
HTTP Streamlines On-Demand
Media players aside, the most important part of offering content on demand is the ability to deliver content over HTTP. That’s what Content Delivery Networks (CDNs) enable, often with edge devices that are closer to the point of consumption for faster delivery to a very large audience. You can also deliver content over HTTP with a few layers of technology including a load balancer, web server and network attached storage. Or, you can implement an object storage solution that wraps load balancing, web serving and storage all into one platform.
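As a minimal illustration of the simplest of those stacks (a web server fronting storage), Python’s standard library can map HTTP GET requests onto files in a storage directory. The directory and file names here are hypothetical, stand-ins for real media assets:

```python
import http.server
import os
import tempfile
import threading
import urllib.request

# Hypothetical storage directory holding one sample "asset"
media_dir = tempfile.mkdtemp()
with open(os.path.join(media_dir, "clip.txt"), "w") as f:
    f.write("sample media payload")

# A basic web server: translates HTTP GET paths into reads from storage
def handler(*args, **kwargs):
    return http.server.SimpleHTTPRequestHandler(*args, directory=media_dir, **kwargs)

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any HTTP client (browser, player, CDN edge) can now fetch the content
port = server.server_address[1]
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/clip.txt").read()
print(body.decode())
server.shutdown()
```

A CDN or object storage platform layers load balancing, caching and scale on top of exactly this request/response pattern.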
Most Applications Were Designed for File Systems
Most applications used in the creation and management of content were developed to support file systems and their protocols (NFS and SMB), not HTTP. This makes sense given that file systems and file system protocols are the way we have all edited and collaborated on files for the past few decades. (Learn more about the differences between File Storage vs. Block Storage vs. Object Storage.)
The good news is that a lot of applications are now supporting cloud-based services and their protocols (all based on HTTP), the most popular being Amazon’s S3 protocol. That doesn’t mean that you can stream content directly from these applications. It simply means that you can now output content created from these applications to Amazon S3 or another storage platform that supports the S3 protocol.
Content is Being Reused, Archive Must Adapt
One of the major hurdles content-driven organizations face is the ongoing need to reuse content. Tape is still the primary method of long-term preservation because, on a straight $/GB comparison, it’s the most economical of all storage media. But in our new on-demand world, what organizations save in storage costs they lose in increased business and opportunity costs. Accessing content stored on tape is more complex than you might think, as it is usually a manual process that requires significant time and effort. Project files may span multiple tapes, and even if only a few seconds of content are needed, the entire file must be recalled. A solution that is starting to gain traction is using object storage as a tier of storage between the high-performance tier and the archive tier. With object storage, you can also deliver content within your internal network or via a private URL the same way you can deliver content to the public over HTTP.
This is only a high-level view of the challenges faced when enabling on-demand. Every organization’s environment and requirements are different. The good news is that most application developers are integrating S3 support and there are a number of companies that you can call for help (including Caringo!).
In this LightBoard video, Storage Switzerland’s Lead Analyst George Crump and Caringo CEO Tony Barbagallo discuss a real-world customer architecture to show how Swarm object storage can be used to simplify the environment.