By 2010, the amount of data added annually to computer systems worldwide will swell to 988 exabytes, according to a 2007 EMC-sponsored report from IDC. A 2008 update to the report found that in 2007 the digital universe measured 281 billion gigabytes — 281 exabytes — 10 percent more than analysts had predicted a year earlier.
While roughly 70 percent of the data generated in the digital universe is created by individuals, enterprises are responsible for the security, privacy, reliability and compliance of 85 percent of it.
“Storage content has been growing at explosive rates, primarily driven by the fact that companies are creating more data daily as their means of communicating internally and externally [have] become richer … Then you think about all the consumer trends with rich media,” Kyle Fitze, director of marketing for storage platforms at HP StorageWorks, told TechNewsWorld.
With millions of gigabytes of data passing through their networks, businesses that move ahead with a planned storage upgrade can sometimes realize significant cost savings. Convincing anyone to spend on new technology right now, however, is a tough proposition: To say this is a difficult environment for business is a considerable understatement. With many organizations, both large and small, watching profits and revenues decline, companies are looking for ways to cut spending.
Who Wants to Spend?
Businesses are facing a challenging environment in which the goal is not simply to spend fewer dollars, but to make smart decisions that balance the near-term need to cut costs against the long-term objective of maintaining and growing as the economy heads toward recovery.
“2009 is looking tough. Companies are stretching their dollars, trying to do more. The bottom line is that budgets are stretched. Budgets are cut, and you have organizations that still have to maintain and, in some cases, support growth to do even more. So it’s not only stretch your dollars to do what you were doing before. It’s also stretch your dollar to do more and support and enable business sustainment and enhance service,” Greg Schulz, founder and senior analyst at StorageIO, told TechNewsWorld.
“It’s going to be tough, and many organizations will be looking for ways to make their dollars do more and to do more with less. In some cases, what that means is looking at what’s discretionary, what’s mandatory and what is required. It’s the same with technology — what is it that you need to have, what is it that you would like to have, what is it that you must have to run your business? There are some nice new features, there are some nice new functionalities that would be nice to have, that could help you out. Do you really need them this year, or is that something you could push off until next year?” he continued.
When it comes to trimming budgets, one of the first areas companies target for cuts is infrastructure investments. However, while holding off on upgrading employee desktops or laptops might make sense, putting off an upgrade to an enterprise storage system could eventually cost more money than it would save.
“Things that are bread and butter, things that are essential to keep the business running, are where the focus and the dollars have to go. In other words, basic performance, basic availability, basic capacity, as well as anything that could help them from an efficiency or an economic standpoint, and even an energy efficiency perspective,” Schulz explained.
“There are two approaches. One is you can go into conservation saving mode and not do anything. The other is that you can actually spend money to save money. You can spend x amount of money and get a big return,” said Schulz.
Updating storage hardware could give enterprises more capacity, better performance, greater energy efficiency, extra features and functionality, and lower maintenance and operating costs. The return on the initial outlay could therefore be significant.
All Kinds of Boxes
Companies have several different options for storing data. A storage area network (SAN) architecture enables a company to connect remote storage devices such as disk arrays, tape libraries and optical jukeboxes to servers so that the devices appear to the operating system to be locally attached.
Network attached storage (NAS), unlike SAN, uses file-based protocols including NFS (network file system) and SMB/CIFS (server message block/common Internet file system) in instances in which the storage is clearly remote and computers request a portion of an abstract file rather than a disk block.
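To make that distinction concrete, here is a minimal Python sketch of the two access models. The mount point /mnt/nas, the file path and the device /dev/sdb are hypothetical, and in practice a SAN LUN would carry a file system rather than be read raw:

    # NAS: file-based access. The client asks the filer for part of a file;
    # the NFS or SMB protocol translates that request to disk blocks remotely.
    with open("/mnt/nas/reports/q4.csv", "rb") as f:
        f.seek(4096)          # an offset within an abstract file
        chunk = f.read(4096)  # the filer decides which disk blocks this maps to

    # SAN: block-based access. The LUN shows up as a local disk, and the host's
    # own file system (or application) addresses raw blocks directly.
    with open("/dev/sdb", "rb") as disk:
        disk.seek(8 * 512)      # address a specific 512-byte block
        block = disk.read(512)  # the host, not the array, owns the layout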
Among the technologies many vendors claim are vital for enterprises are thin provisioning and deduplication.
Thin provisioning, used in large-scale centralized computer disk storage systems such as SANs and storage virtualization systems, allows IT departments to allocate space to servers on a just-enough and just-in-time basis.
“On the high end, thin provisioning is for some companies a matter of control. Companies believe that if they go toward thin provisioning, it could stimulate growth without control. It is in the early stages of adoption,” Natalya Yezhkova, an IDC analyst, told TechNewsWorld.
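A rough sketch of the idea, not any vendor's implementation: the volume advertises a large logical size to the server but draws physical extents from a shared pool only when a region is first written. The ThinVolume class, extent size and pool structure below are all illustrative:

    EXTENT = 1024 * 1024  # 1 MiB extents (illustrative granularity)

    class ThinVolume:
        def __init__(self, logical_size, pool):
            self.logical_size = logical_size  # what the server is told it has
            self.pool = pool                  # shared count of real free extents
            self.mapping = {}                 # logical extent -> physical extent

        def write(self, offset, data):
            extent = offset // EXTENT
            if extent not in self.mapping:  # first touch: allocate now
                if self.pool["free"] == 0:  # the risk thin provisioning carries
                    raise IOError("backing pool exhausted")
                self.pool["free"] -= 1
                self.mapping[extent] = self.pool["next"]
                self.pool["next"] += 1
            # ... write data to the mapped physical extent ...

    pool = {"free": 100, "next": 0}                          # 100 MiB of real disk
    vol = ThinVolume(logical_size=1024 * EXTENT, pool=pool)  # advertises 1 GiB
    vol.write(0, b"hello")
    print(len(vol.mapping), "extent(s) actually allocated")  # -> 1

The upside is that unused logical capacity costs nothing; the downside, as the quote suggests, is that many volumes drawing on one pool can grow without anyone noticing until the pool runs dry.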
Deduplication technology can increase effective disk capacity by up to 50 times, reduce storage costs and improve mission-critical capabilities such as disaster recovery.
“Vendors are looking at how to move the technology from a secondary storage solution to a primary storage solution and expand its application. In the archival and backup world, it’s all deduplication. It started as an expensive solution for larger companies but there are solutions now for mid-sized companies,” she noted.
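The mechanism behind those savings is straightforward to sketch. The toy example below assumes fixed-size chunks and SHA-256 fingerprints; real products typically use variable-size chunking and more elaborate indexes:

    import hashlib

    CHUNK = 4096
    store = {}  # fingerprint -> chunk data, written only once

    def dedup_write(data):
        recipe = []  # the "file" is stored as a list of fingerprints
        for i in range(0, len(data), CHUNK):
            chunk = data[i:i + CHUNK]
            fp = hashlib.sha256(chunk).hexdigest()
            if fp not in store:  # only never-seen content consumes disk
                store[fp] = chunk
            recipe.append(fp)
        return recipe

    # Backing up identical data twice stores its chunks only once, which is
    # where the large savings on repetitive backups come from.
    dedup_write(b"A" * 40960)
    dedup_write(b"A" * 40960)
    print(len(store), "unique chunk(s) stored")  # -> 1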
Then, of course, there is cloud computing, in which dynamically scalable and frequently virtualized resources are used via the Internet. It enables enterprises to store massive amounts of data with real-time access capabilities at a fraction of the cost of more traditional infrastructures.
Need for Speed
“For high performance, in environments where time is money and you need to do as much work in the shortest amount of time possible, you’re looking at the newer Fibre Channel drives and the newer SAS (serial attached SCSI) drives,” Schulz said.
Fibre Channel (FC) storage is a gigabit-speed network technology. Used principally in the supercomputing field, FC has become the standard connection type for SAN enterprise storage networks. SAS is a data transfer technology that moves data to and from computer storage devices such as hard drives and tape drives.
These drives — SAS and Fibre Channel — offer larger capacities, so companies can store more data on each drive, and higher performance, so fewer drives are needed to do the same work.
“Fibre Channel drives are better for multi-tiering applications that have a high IO (input/output) requirement. But for other applications, like email archiving, a Fibre Channel drive is just excessive and more expensive than a SATA drive,” Yezhkova pointed out.
Businesses that need to push the speed boundaries will want to look at solid-state storage, both RAM-based and flash-based.
“Think of it as your tier zero, where absolutely time is money,” Schulz stated.
In the Cloud
Businesses that need to consolidate data and require larger capacities — 1 terabyte or more of storage — are looking for systems with intelligent power management.
“We’ve seen a trend toward virtualized server environments with the growth of VMware and the entrance of Microsoft and Citrix in that space. We expect that explosion to continue, where companies are pooling and virtualizing their servers to gain efficiencies. That creates specific challenges in storage. They have much higher workloads on the average server and need to figure out how to provision those new applications in a virtual machine on a physical server and all the storage for that application. Then they have to figure out how to back it up and protect their data. That is driving a reinvigoration of the virtualized storage space,” Fitze explained.
Cloud-based solutions are best suited to companies willing to have their storage outsourced. According to an IDC survey, most companies scrutinize these online options closely because of concerns about data security and control over data flow.
“It will take several years before these types of services are truly mainstream. Small and midsized businesses are driving the trend because they don’t have much storage expertise, and the online services in the cloud give them the storage they need without having to build up an on-premises infrastructure,” Yezhkova said.
“For 2009, it’s the storage cloud that seems to be of big interest. It’s not specific products, but the concept is driving the market, and companies are responding with products,” she stated.
Universal storage, which will start to evolve in 2009, combines deduplication, accelerator storage and networking blocks, and includes products such as the IBM X Series, according to Yezhkova.
This corresponds almost exactly with what we’re seeing in the market. The old adage "nobody ever got fired for buying (three-letter company)" doesn’t hold true anymore. The first storage budget item that got cut at big businesses was the next quarterly purchase of $30/GB high-end SAN, but new data still needs a place to go, because the rate at which it is being generated isn’t slowing!
At Permabit we’ve seen a huge surge in interest in areas like financial services, where money used to be almost free. Through a commodity hardware architecture and advanced software in our Permabit Enterprise Archive, we can store their critical long-term data with reliability far better than their high-end SANs, at a cost less than offloading it to tape.
One area I’d like to comment on is deduplication. At the backup level, dedupe can see savings of up to 50x or more, but that’s generally based on doing it wrong — backing up the same thing time after time. Right now, data is being written to primary storage at $30 to $50/GB and then backed up at an aggregate total cost of maybe $5 to $10/GB more. Deduplicating VTL or backup can shave a few dollars off this, at best.
Instead, much of that data can be moved to an archive tier at $3 to $5/GB, with an effective cost even lower after deduplication. Effective replication can eliminate the need for backup entirely. This path saves much more money, though it isn’t as seamless as merely optimizing backup. With today's economic pressures, any business would be remiss not to look at deploying an effective archive tier.
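For anyone who wants the arithmetic spelled out, here is a back-of-the-envelope comparison using midpoints of the per-GB figures above; the 1 TB data size, the $3/GB dedupe savings and the assumption that the data can live on the archive tier are all illustrative:

    data_gb = 1000  # 1 TB of long-term data (illustrative)

    primary_cost = 40.0 * data_gb  # primary storage at ~$40/GB (midpoint)
    backup_cost = 7.5 * data_gb    # backup on top at ~$7.50/GB (midpoint)

    # Option 1: deduplicate the backup. That shaves, at best, a few $/GB
    # off the backup cost alone; $3/GB is an assumed figure.
    dedupe_savings = 3.0 * data_gb

    # Option 2: move the data to an archive tier at ~$4/GB, replicated,
    # eliminating both its primary footprint and its backup cost.
    archive_cost = 4.0 * data_gb
    archive_savings = (primary_cost + backup_cost) - archive_cost

    print(f"dedupe the backup: ~${dedupe_savings:,.0f} saved per TB")   # ~$3,000
    print(f"archive the data:  ~${archive_savings:,.0f} saved per TB")  # ~$43,500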
Regards,
Jered Floyd
CTO, Permabit