Workflows that involve big files and big storage capacities are difficult to move to distributed or remote processing. Beyond the challenge of transferring large files to various sites, you also need to store, protect and access those files economically. In this episode of Brews and Bytes, Robert Browne from Signiant joins Caringo’s TW Cook and host Adrian J Herrera to discuss these questions:
- What are the problems you need to look out for when transferring large files locally or globally?
- What are the issues you may encounter storing hundreds of terabytes of files?
- How about hundreds of petabytes?
- Is there such a thing as files and storage being too big for remote workflows?