Storage Wars: The Search for Inexpensive Yet Reliable File Storage
January 31, 2013
By Bob Hayes, Artbeats Director of Technologies
Here at Artbeats, our storage needs are a little different from what might be customary for a video production house. We don't do much long-form editing, and we don't need 450MB/s to several editing bays. What we do need is massive, reliable online storage available across a LAN via AFP, NFS, and SMB. Easy enough to do, but it can get very expensive very quickly if you aren't careful.
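If you want a quick way to confirm those three services are actually answering across the LAN, here's a minimal Python sketch. The hostname is a placeholder and the ports are simply the standard defaults for AFP, NFS, and SMB, so adjust both to match your own setup.

```python
# Quick reachability check for the three file-sharing services we care about.
# "nas.example.lan" is a placeholder hostname; swap in your own device.
import socket

SERVICES = {
    "AFP": 548,    # Apple Filing Protocol
    "NFS": 2049,   # Network File System
    "SMB": 445,    # SMB/CIFS over TCP
}

def check_services(host: str, timeout: float = 3.0) -> None:
    for name, port in SERVICES.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                print(f"{name} ({port}): reachable")
        except OSError as exc:
            print(f"{name} ({port}): NOT reachable ({exc})")

if __name__ == "__main__":
    check_services("nas.example.lan")
```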
The safety and availability of your data, and your backup strategy, are insurance, pure and simple. And just like house or car insurance, they require a cost-benefit analysis to decide what you need and how much money you should spend. In an industry like finance or health care, where the security and availability requirements for the data, along with government regulation, are all very demanding, you will probably spend a lot more money up front ensuring the reliability of that data and the speed with which you can restore it in the event of loss. If you lost critical data, how long could you withstand its unavailability? Nobody wants to deal with lengthy downtimes, especially when it's revenue-generating data. Sometimes, though, the front-loaded money you must spend for the peace of mind of knowing you can restore your data in a matter of minutes or hours (at worst), versus days or even weeks, outweighs the relatively small possibility of that catastrophic event ever occurring. This is the kind of analysis IT managers make all the time, and we place bets on the outcome, exactly like insurance company actuaries do.
At Artbeats, we recently had a situation where we simply needed more server space for data that was important, but not absolutely mission-critical. Essentially, if this data disappeared one day, it was already duplicated elsewhere and we would be able to recover to where we were within a matter of days. Perhaps not the ideal situation, but one where we decided we could spend less money than we might in other circumstances. In the storage arena, there are high-end RAIDs with lots of great features and mind-boggling prices, there are lower-end solutions that may or may not do the trick, and then there's a good selection in the middle: massive storage for a relatively moderate (but still fairly expensive) price. We decided on a lower-end solution, one of the "lowest" we've ever gone with. I'm not going to mention specific manufacturers for a variety of reasons, but a little Googling on your part should turn up the brands I'm talking about.
We thought we knew right away what we wanted, but after some research we changed our minds about our initial choice of vendor. We ended up with a box that is quite different from anything we've used in the past. It's a self-contained network-attached storage (NAS) device that also supports iSCSI, complete with built-in server daemons for a variety of purposes, most of which we'll never use. (We just wanted file servers.) This box cost us about $1000. I had some misgivings because it's not rack-mountable and has very limited redundancy for uptime, not to mention it's made with a lot more thin plastic than I would usually like. But I have to say, I'm impressed with what we got for the price. It has eight SATA drive bays that support drives up to 4TB, a very smart RAID system, and it's expandable to eighteen bays. Plus, it's quiet. Scary quiet. If it weren't for all the blinkenlights, you wouldn't even know it was running just by looking at it. Network throughput is sufficient for simultaneous After Effects renders from four or five machines, with no noticeable slowdowns due to network bottlenecks. That's all we wanted of it.
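If you want to put a rough number on that kind of throughput for your own setup, here's a simple Python sketch that times a large sequential write to the mounted share. The mount point, test size, and temp filename are all placeholders, and client-side caching can still flatter the result even with the fsync, so treat it as a ballpark figure rather than a benchmark.

```python
# Rough write-throughput check against a mounted share.
# MOUNT_POINT is a placeholder path; point it at wherever the NAS is mounted.
import os
import time

MOUNT_POINT = "/Volumes/nas-share"   # placeholder; e.g. /mnt/nas on Linux
TEST_SIZE_MB = 1024                  # write 1 GB of throwaway data
CHUNK = b"\0" * (1024 * 1024)        # 1 MB chunk

def write_throughput(path: str) -> float:
    test_file = os.path.join(path, "throughput_test.tmp")
    start = time.time()
    with open(test_file, "wb") as f:
        for _ in range(TEST_SIZE_MB):
            f.write(CHUNK)
        f.flush()
        os.fsync(f.fileno())         # make sure the data actually hit the server
    elapsed = time.time() - start
    os.remove(test_file)
    return TEST_SIZE_MB / elapsed    # MB/s

if __name__ == "__main__":
    print(f"Write throughput: {write_throughput(MOUNT_POINT):.1f} MB/s")
```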
We populated ours with eight 3TB drives, some of the cheapest out there ($140 each). So, for about $2200, we got 18.75TB of usable storage space in a hybrid RAID-5 setup that will withstand the failure of one hard drive at a time. We keep a cold spare on hand, and that's enough insurance in this case. It took us about an hour to completely configure it, plus an overnight RAID build. I haven't done anything to it, or even looked at it, since then, and it's been six months. Go ahead and price out 18TB of storage from an enterprise-level vendor (usually one whose name is two or three initials) and you'll see why we went with this particular item. For this job, it was a great way to go.
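To put numbers on it, here's a quick back-of-the-envelope sketch in Python. The single-parity-drive assumption is mine (the box actually uses its own hybrid RAID scheme), and the gap between the naive 21TB figure and the 18.75TB we actually see is presumably down to TB-versus-TiB reporting plus filesystem overhead.

```python
# Back-of-the-envelope capacity and cost math for this build.
# Assumptions: ~$1000 chassis, eight 3TB drives at $140 each, and a
# RAID-5-style layout that gives up one drive's worth of space to parity.
DRIVE_TB = 3            # marketing terabytes (10**12 bytes)
DRIVE_COUNT = 8
DRIVE_PRICE = 140
CHASSIS_PRICE = 1000

raw_tb = DRIVE_TB * DRIVE_COUNT                # 24 TB raw
usable_tb = raw_tb - DRIVE_TB                  # 21 TB after single-drive parity
usable_tib = usable_tb * 10**12 / 2**40        # ~19.1 TiB as an OS would report it
build_cost = CHASSIS_PRICE + DRIVE_PRICE * DRIVE_COUNT   # ~$2120, i.e. "about $2200"

print(f"Raw capacity:        {raw_tb} TB")
print(f"Usable (pre-FS):     {usable_tb} TB (~{usable_tib:.1f} TiB)")
print(f"Build cost:          ${build_cost} (plus $140 for the cold spare)")
print(f"Cost per usable TB:  ${build_cost / usable_tb:.0f}")
```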