The State of Storage in 2012—to the Cloud!


Slalom Consultant Derek Martin is an accomplished Microsoft systems developer and integrator, experienced in developing and deploying SharePoint and CRM solutions, integrating line of business applications, and leveraging existing infrastructure investments.

There is a great article in Information Week this week about the 2012 State of Storage that I wanted to comment on. If you don't have a subscription, that's okay. The basic premise is that solid-state drive (SSD) costs are really starting to drop, and the idea of enterprises using SSD storage area networks (SANs), or higher numbers of SSDs in existing SAN technology, is starting to gain traction.

This is certainly true and will continue to provide great performance improvements where input/output operations per second (IOPS) are needed. It's telling that to this day, many enterprises, including the five that I work closely with, are stuck in the traditional models of storage. This isn't their fault: these kinds of sea changes take time, and there are obvious risks to upending a trusted SAN solution. But the writing is on the wall: traditional massive storage arrays, for both performance applications and archival/compliance/storage requirements, are going to look very different in a few short years.

  1. Three-tiered storage is not going away. Local SANs are always going to be needed for certain applications. Particularly in this day of massive business intelligence needs, the faster those IOPS, the better, and SAN solutions built exclusively with SSDs at Tier 1 are going to quickly become the norm.
  2. Structured and unstructured data, to paraphrase from the Information Week article (above), are now neck and neck as the leading growth sources in Tier 1. At Tier 2 and Tier 3, however, the data is heavily unstructured and getting more so daily. This calls for a different paradigm, because of the massive 'gunk' growing into the environment: data that you wish you didn't have to keep, but you do. So instead of spending millions on Tier 2 and Tier 3, let's consider alternatives.
  3. Here it comes…wait for it….almost there….TO THE CLOUD! Ahhhh–I feel better:

The cloud is a perfect repository for unstructured data, or data that has long retention policies around it. It must be understood, however, that the security and integrity of your data isn't negotiable. Wherever your data sits, it must be safe, verifiable, and auditable. But these constraints do not preclude the use of the cloud. Much to the contrary, they call for the cloud. Let me explain.

The 'cloud' isn't all about applications and development; it is also about infrastructure, and Microsoft's cloud has some really robust infrastructure. There are tools and technologies on the market today that can take Tier 2 and Tier 3 (and even Tier 1, if you wanna get really crazy!) into Azure without significant changes to your infrastructure. One such example is a tool I've been learning about lately from StorSimple: 50 TB of cloud data for $50,000. That's pretty cost-effective infrastructure!

Costs will keep going down. In the past six months, the cost of doing business in Azure has dropped three times. The reason? Every time Microsoft hits another one of their scalability targets, they can (and do) reduce the prices for everyone. I've never seen a company do that before. Pretty impressive.

Security goes up. Your data in the cloud can be more secure than on-premises. Yes, I said it. Products like StorSimple have attained HIPAA compliance certifications (and that's saying something!). The Azure data centers also carry varying degrees of security certifications depending on the services used, including FERPA, ITAR, SAS 70, etc. When was the last time your data center got all of those?

Integrity and availability are critical. Can you prove that in the event of a data center loss, your data is safe? Again, the Azure data centers can. Your data, encrypted both in flight and at rest depending on your solution, is stored in at least three separate data centers. I suspect you can't do that for $0.12/GB/month on your own. A tool like StorSimple can also be attractive because the technology it uses behind the scenes can make your data immediately accessible from your secondary data center in the event the primary is lost, and you don't have to pay for that unless you need it. Not too shabby.
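To put that rate in perspective, here's a quick back-of-the-envelope sketch. The $0.12/GB/month figure comes from the quoted Azure pricing above (which already includes the triple replication); the 50 TB dataset size mirrors the StorSimple example. Treat both numbers as 2012-era assumptions, not current pricing:

```python
# Back-of-the-envelope monthly storage cost at the quoted cloud rate.
# Assumptions (from the post, circa 2012): $0.12 per GB per month,
# replicated across three data centers at no extra charge.

RATE_PER_GB_MONTH = 0.12  # USD/GB/month, as quoted above
GB_PER_TB = 1024          # binary terabytes

def monthly_cost(terabytes):
    """Monthly storage cost in USD for a dataset of the given size."""
    return terabytes * GB_PER_TB * RATE_PER_GB_MONTH

# The 50 TB Tier 2/3 archive from the StorSimple example:
cost = monthly_cost(50)
print(f"50 TB at $0.12/GB/month = ${cost:,.0f}/month")  # $6,144/month
```

Compare that to the capital and operating cost of standing up three geographically separate data centers yourself, and the math speaks for itself.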

The Information Week survey is saying the same thing: consider the cloud carefully for Tier 2 and Tier 3, because there are options for you. You can have enterprise data in the cloud that costs less, is more secure, is verifiable, and can plug directly into your existing infrastructure, obviating the need for Tier 2 and Tier 3 to be located on-premises in some cases.

It’s a good time to be in the cloud!
