If you grow up in Texas, you know that “Big D” means Dallas—the legendary city where the series of the same name and the movie Urban Cowboy were filmed. Though famous for oil barons, cowboys and honky-tonk bars, Dallas is so much more. It is one of the fastest-growing cities in the country and part of the burgeoning DFW (Dallas/Fort Worth) metropolitan area.
The DFW area is home to Fortune 500 companies, diverse industries (including information technology (IT), defense, financial services, telecommunications and transportation), private and public universities, and professional sports teams (Texas Rangers, Dallas Cowboys and Dallas Mavericks). Want more? Dallas is a mecca for shopping and fine dining as well as the home of the State Fair of Texas, and from November 12–15, it will be the home of SuperComputing ’18 (SC18).
SC18 marks the 30th anniversary of the SuperComputing Conference Series. The infrastructure of high-performance computing (HPC) and its community have grown exponentially since the conference originated in 1988. At Caringo, we continue to see the trends our VP of Marketing Adrian “AJ” Herrera wrote about in last year’s blog leading up to the SC Conference, including growing data sets, demands for visibility and rising end-user expectations.
Another Big D—Big Data
So, let’s talk about a different Big D—Big Data. File sizes and data sets continue to skyrocket, particularly for research institutions and laboratories as well as for those who deal with digital video files. The concerns of IT Pros and researchers are not limited to size, volume and protection; they also include how to manage and efficiently distribute data. Environments now need to support multi-protocol access (POSIX and RESTful) for users and applications while eliminating those pesky old data silos.
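To make the multi-protocol idea concrete, here is a minimal, self-contained sketch (not a Caringo API) in which the same bytes are reachable both through an ordinary POSIX file path and through a RESTful HTTP GET. A throwaway local HTTP server stands in for an object store’s RESTful interface; in a real deployment, an S3-compatible endpoint would serve the object.

```python
# Conceptual sketch: one data set, two access protocols (POSIX + RESTful).
# The local HTTP server is a stand-in for an object store's REST endpoint.
import http.server
import os
import tempfile
import threading
import urllib.request
from functools import partial

# POSIX access: write an "object" as an ordinary file.
data_dir = tempfile.mkdtemp()
with open(os.path.join(data_dir, "dataset.bin"), "wb") as f:
    f.write(b"simulated research data")

# RESTful access: serve the same directory over HTTP on an ephemeral port.
handler = partial(http.server.SimpleHTTPRequestHandler, directory=data_dir)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/dataset.bin") as resp:
    via_rest = resp.read()
server.shutdown()

# Read the same data back through the POSIX path.
with open(os.path.join(data_dir, "dataset.bin"), "rb") as f:
    via_posix = f.read()

# Both protocols see identical bytes -- no data silo between them.
assert via_posix == via_rest
```

The point of the sketch is simply that when both protocols front the same storage, users and applications can pick whichever interface fits their workflow without duplicating data into a second silo.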
Seeing is Believing
More than just access, IT and Storage Admins need visibility into the data. They need to be able to see who is accessing what and know precisely which resources those users are consuming.
The Times They Are A-Changin’
To borrow the iconic lyrics of the Bob Dylan tune, the times they are a-changin’. In today’s on-demand distributed research environments, waiting minutes, hours or even days to access data is untenable. Nor is it acceptable to be unable to easily distribute data or to know which resources specific applications and end users are consuming. This has led many organizations in the HPC space to conclude that they need to expand their storage infrastructure (beyond parallel file systems and tape) with object storage—whether by building a private cloud storage service, tiering data to a public cloud or adopting a hybrid cloud solution.
Swarm Hassle-Free, Limitless Object Storage makes it easy to do just that. It handles concurrent requests in parallel, yielding the full throughput potential of every drive in the system and rivaling parallel file systems for read-intensive workflows. To learn more:
- Watch our Swarm 10 webinar on demand to hear about our latest updates to the Caringo product line.
- Visit us during SC18 expo hours at booth 4035 (contact us if you need a free expo pass or want to set up an appointment with an object storage expert).
- Check out our video demos and then sign up on Caringo Connect to request a free, full-featured 10TB Swarm Dev Edition.
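The parallel request handling mentioned above can be illustrated with a small conceptual sketch. This is not Caringo’s implementation—the drive names and the `read_from_drive` helper are hypothetical—it just shows the general idea that fanning independent reads out across drives lets aggregate throughput scale with drive count rather than being bottlenecked on one device.

```python
# Conceptual sketch (not Caringo Swarm internals): independent read
# requests fanned out across drives so they are serviced in parallel.
from concurrent.futures import ThreadPoolExecutor

# Hypothetical "drives", each holding different objects.
drives = {
    "drive-1": {"obj-a": b"alpha"},
    "drive-2": {"obj-b": b"bravo"},
    "drive-3": {"obj-c": b"charlie"},
}

def read_from_drive(drive_id, obj_id):
    """Simulate one drive servicing one read request."""
    return drives[drive_id][obj_id]

requests = [("drive-1", "obj-a"), ("drive-2", "obj-b"), ("drive-3", "obj-c")]

# Issue all reads concurrently; because each request hits a different
# drive, total wall time approaches the slowest single read rather
# than the sum of all reads.
with ThreadPoolExecutor(max_workers=len(requests)) as pool:
    results = list(pool.map(lambda r: read_from_drive(*r), requests))
```

With real spindles or SSDs instead of in-memory dictionaries, this fan-out pattern is what lets an object store approach the combined throughput of all drives for read-heavy workloads.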
If you are at SC18, don’t forget to join us for happy hour Tuesday and Wednesday at 2 pm in booth 4035. Co-sponsored by our partner Boston Limited, the happy hour lets you quench your thirst for object storage knowledge and beer with a “Dallas Blonde” (American Ale) from Deep Ellum Brewing Company.