Historically in the High-Performance Computing (HPC) space, storage was evaluated on cost versus performance. Most of the budget was spent on high-performance storage because it meant faster time to results. Once data sets were rendered, modeled, or otherwise analyzed, the data either stayed on the high-performance tier or was painstakingly moved to some form of archival media (like tape). This worked for many years, but three trends are now driving new storage challenges in HPC—increasing data set sizes, visibility, and end-user expectations.
Increasing Data Sets
It isn’t a surprise to anyone that file sizes are increasing, driven by higher resolutions, more sensors, more devices, or any number of other reasons. This, in turn, increases overall capacity requirements, but data growth alone isn’t the challenge. The issues we hear most often associated with data growth involve protecting the data sets, providing multi-protocol access, and consolidating data sets from disparate servers across a campus or across far-flung locations.
Visibility
For many in the HPC space, collaboration goes hand in hand with achieving results or discovery. In our discussions with researchers, higher education institutions, and labs, we hear that it’s about more than just providing secure access. They now need visibility not only into the actual data sets (i.e., search and custom metadata) but also into who is accessing what and how many resources they are consuming.
End-User Expectations
You can thank the “cloud” for this one. Grad students and new IT professionals have grown up with cloud-based access, have studied web services, and are used to RESTful interfaces. Many under the age of 30 don’t even know what tape is. They expect to make a few API calls, grab the data they need, and go.
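That “few API calls” workflow can be sketched in a few lines of Python. This is a minimal illustration, not a Caringo-specific recipe: the endpoint, bucket, and object names are hypothetical, it assumes the object is readable anonymously (path-style addressing), and a real deployment would typically use signed requests via an SDK such as boto3.

```python
from urllib.parse import quote
from urllib.request import urlopen  # for the actual GET request

def object_url(endpoint: str, bucket: str, key: str) -> str:
    """Build a path-style URL for an object behind an S3-compatible endpoint."""
    # quote() percent-encodes the key but leaves '/' separators intact
    return f"{endpoint.rstrip('/')}/{bucket}/{quote(key)}"

# Hypothetical endpoint and names — substitute your own deployment's values.
url = object_url("https://storage.example.edu", "genomics", "runs/2017/sample-042.bam")

# One GET and the data set is local:
# data = urlopen(url).read()
```

From the end user’s perspective, that is the whole interaction: no mount points, no tape recalls, just an HTTP request against a well-known interface.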
Even if these new requirements stood alone, each would be a challenge; but all three are hitting at once. There will always be a need for high-performance storage in HPC use cases; however, all of these requirements can now be satisfied with a combination of object storage and cloud services. At Caringo, we are seeing that many in the HPC space are buying less high-performance storage and investing in innovative solutions that plug seamlessly into legacy and new RESTful interfaces (like S3), enabling consolidation and collaboration in a service-oriented approach. If you are interested in learning how to meet these expectations and getting a demo, I’d like to encourage you to:
- Visit us at booth 1001 at SC17 in Denver next week (contact us if you need a free expo pass).
- Watch our video demos and then sign up on Caringo Connect to request a free full-featured 10TB Dev Edition.
- See if you qualify for a free 100 TB license for new HPC or Higher Education customers.
And, of course, you can always reach out to us and we would be happy to answer any questions you may have.