Ok, so I swiped that from Samuel Taylor Coleridge's poem "The Rime of the Ancient Mariner" ("Water, water, every where, Nor any drop to drink"), in which sailors aboard a ship were without water to drink despite being surrounded by a sea of it.

By the same token, companies have successfully figured out how to capture as much data as they could ever need. The problem is interpreting the data to mean something.

One industry trying to cope with this issue is the legal industry, where until recently the fight in the ediscovery solution suite was pretty much among small niche players.

As many should know by now, storage giant EMC took a bite of the ediscovery market with the acquisition of Kazeon. EMC said the acquisition gives it an "end-to-end, in-house e-discovery and litigation readiness solution".

This is bad news for Kazeon's competitors: Iron Mountain, Autonomy, Clearwell Systems and StoredIQ. The last especially, because it is an OEM partner to EMC.

Another competitor that just might have to start shopping for an OEM partner in this space is NetApp, which partnered with Kazeon in October 2005.

Is there a pattern here?


On May 20, 2009, NetApp offered to buy Data Domain (DD) for $25 per DD share. It all seemed a natural fit. NetApp needed a robust data deduplication solution that fit well with its target of SMEs (ok, more the mid-market than the small enterprises, but definitions of what constitutes an SME vary by country or region). NetApp had acquired Alacritus in April 2005 for its virtual tape library technology; speculation abounds that this will be thrown out in favor of DD's technology.

For its part, Data Domain needed the global reach that NetApp had to offer.

Greg Cornfield noted that this would give EMC (with its acquisition of Avamar in 2007 for $165M) a run for its money (http://www.searchstorageasia.com/content/netapp-buy-data-domain).

On June 1, EMC upped the ante, with EMC CEO Joe Tucci holding informal talks with Data Domain CEO Frank Slootman, followed by a letter of offer. EMC's $30-per-share cash offer is certainly more appealing than NetApp's cash-plus-stock offer.

But the story doesn't end there. As with any romance novel, there is a twist in the plot, and a counter-offer by NetApp followed the EMC bid.

Why the interest in Data Domain? DD's main claim to fame is its leadership position in the data deduplication business. NetApp desperately needs it to enhance its offering. EMC could do well to add the strength of DD's technology to its portfolio; EMC's current offering is a combination of OEMed Quantum technology and the Avamar acquisition.

Gartner estimates that about 30% of companies have deployed some form of data deduplication. Deduplication allows companies to reduce data storage requirements by ratios of up to 30:1. The dollar savings appeal to almost everyone with a growing storage farm, regardless of prevailing market conditions.
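Deduplication products differ wildly in the details, but the core idea is simple: fingerprint each block of data and store every unique block only once. Here is a minimal sketch of that idea, not any vendor's actual implementation; the fixed 4 KiB block size and the `dedup_ratio` helper are my own illustrative assumptions (real products typically use variable-length chunking and far more sophisticated indexing):

```python
import hashlib

def dedup_ratio(data: bytes, block_size: int = 4096) -> float:
    """Split data into fixed-size blocks, keep only unique blocks
    (identified by SHA-256 fingerprint), and report the reduction ratio."""
    store = {}  # fingerprint -> block; the deduplicated "storage pool"
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        store[hashlib.sha256(block).hexdigest()] = block
    stored = sum(len(b) for b in store.values())
    return len(data) / stored if stored else 1.0

# 100 identical 4 KiB blocks collapse to a single stored copy
print(dedup_ratio(b"x" * 4096 * 100))  # 100.0
```

Backup data, full of near-identical nightly copies, is exactly the kind of workload where ratios like 30:1 become plausible; random, unique data gains almost nothing.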

Gigaom’s Stacey Higginbotham thinks EMC will fight back. EMC is doing this to protect its turf and stop a rival from eating into its market.

Ryan Hutchinson of Lazard Capital believes that EMC's move is defensive as well as offensive. EMC has the cash to buy Data Domain at a price that would certainly hurt NetApp (even if the latter is the final winner). EMC has a cash hoard of $9B (cash plus long-term investments) compared to NetApp's $2.6 billion (cash and short-term investments).

At any rate, the discussion appears to be moot at this point. Data Domain and NetApp have published a new press release indicating that they have "entered into a revised acquisition agreement" valued at $1.9B, or $30 per share in cash and stock.

Is the deal over? Outside of approvals and due diligence, maybe. Would EMC counter back? It certainly has the money. Broadpoint AmTech analyst Brian Marshall seems to think EMC can still come back to the table with a higher bid, if only to keep the competition from getting its hands on DD's technology.

What's weird about all this is that NetApp's share price is about US$18, compared to around US$32 for Data Domain. For the life of me, I can never figure out how stock valuations work.

On February 15, 2008, NetApp announced the release of its StoreVault S550 platform, an entry-level device targeted at the midmarket and a direct response to Dell (AX4, OEMed from EMC), HP (StorageWorks MSA 2000) and IBM (DS4200).

Fast forward to less than a year later, when NetApp quietly made it known that it was discontinuing the S550 platform in favor of the FAS2000 family. So did NetApp actually launch two families of products targeting the same market segment? It does look like it, but my understanding is that the FAS2000 was originally targeted at the larger end of the SMB spectrum, whereas the S550 was meant for those just above the SOHO segment with a capacity requirement of 12 TB, and preferably already NetApp customers.

Am I trying to beat up NetApp for killing a product line a year after it was launched? The S550 was an upgrade to the S500, launched in June 2006. And no, I am not berating NetApp for making a 'wise' decision. Why would you create two product families that target the same market segment? (Truthfully, a lot of products in the consumer space do just that. Try counting the number of Sony MP3 player models out there and you start to wonder if the marketers are in cahoots with product development to come up with new and ingenious ways of squeezing more money from unwitting customers.)

Back to NetApp. According to the vendor, the S550 will continue to be sold until June 2009 and the product itself will be supported until 2012. But why would anyone want to buy a product that is close to end-of-life? You don't! And if you did buy one recently, it's likely because the channel reseller didn't want to let you in on the end-of-life secret; what you don't know won't hurt them! I certainly wouldn't buy a product that has been unofficially announced as coming to an end. Why? Spare parts will be more expensive, and support queries will not get prioritized as fewer people remain familiar with a dead product.

End-of-life is a fact of life for any product. Every product, good and bad, will eventually see its lifecycle grind to a halt. You just don't want to be the one buying at the tail end of an end-of-life. Trust me, you don't want that! It's like buying a brand new server only to be told a month later that that particular model has been phased out and a new one is now available: cheaper, better, faster. Been there, done that!

Vendors have a responsibility to tell customers (every customer, not just the most important ones) that certain products are coming to an end. They can offer specials to entice potential customers to buy the 'dying' product, but that should be a conscious decision on the part of the buyer. Channel partners should take an oath to protect the interests of their customers before the company bottom line. Because at the end of the day, the customer is trusting you, Mr Channel and Mr Vendor, to know every truth there is about a product they would like to consider as worthy of their hard-earned cash and investment.

If the customer can’t trust you, then you don’t deserve to sell them anything!

Three months ago I had an opportunity to meet an ex-SUN storage evangelist, Robert Nieboer, who spoke eloquently about the concept of Open Storage. According to Nieboer, Open Storage means buying standard, commodity products like controllers and hard disks and installing an open-source operating system that includes storage resource management. The result is a storage subsystem that will do pretty much everything a similarly configured storage solution from IBM, EMC, HP, HDS, NetApp or Fujitsu does, but at a fraction of the cost: 10%, to be precise.

Certainly, at 10% of the cost of a conventional storage solution from the mainstream storage vendors, you'd wonder why there is no mad rush to jump on the open storage bandwagon.

I took the opportunity to broach the concept of open storage with three CIOs. The responses were nearly unanimous: "no, thank you!" Why?

Michael Leung, CIO for China Construction Bank (Hong Kong branch), notes that banks are probably among the most conservative businesses on the planet. Highly regulated, all their activities are monitored and audited. Every piece of technology the bank uses has likely got the auditor's seal of approval. Any change to the type of infrastructure would warrant long meetings with auditors, supported by lengthy documentation. It's ok to pay through the nose as long as CCB complies with set standards and regulations.

For Raymond Ngai, Head of IT Infrastructure at the Hong Kong Jockey Club (HKJC), as for Leung, meeting government regulatory requirements is just as important as keeping data secure 24x7x365. The Jockey Club sits on 35TB of data housed across the main data center and the disaster recovery (DR) site. Like CCB, the Jockey Club likes to stay with the old and reliable (I prefer to call it predictable).

One head of IT who bucked the perception was Thomas Lee, Computer Operations Manager at HACTL (Hongkong Air Cargo Terminal Ltd). Lee says HACTL is constantly on the lookout for IT solutions that deliver significant ROI. Certainly, if you can save 90% of your storage budget, that warrants the label of significant ROI. But bear in mind that HACTL's business isn't as highly regulated as CCB's or HKJC's.

Apart from SUN, none of the other major storage vendors in Asia (EMC, IBM, HP, NetApp and Dell) has an open storage story to tell. Why should they? Such a story could potentially cannibalize their "open but proprietary" storage offerings. SUN is not invulnerable either. IDC ranks SUN fifth in storage sales, with double-digit growth, like everyone else. Much of this growth is attributable to SUN's storage business derived from OEM partnerships with Hitachi, LSI and DotHill.

Certainly, SUN's proposition that storage should not cost as much as it does today has its merits. The question CIOs and business managers need to ask is whether the technology is mature enough today to warrant taking the plunge.

The likely answer is ‘no.’

It will probably take a few more years before open storage becomes a viable solution for mission-critical applications. As with Linux and the open systems platform, early adopters can take the opportunity to test the solution on development platforms or on applications that live on Tier 3 storage.

Today, the jury is still out; open storage is not ready for prime time. But as with the proverbial Gartner Hype Cycle, the technology will reach a point where it becomes mature enough to consider. Until then, stick with the tried and tested, albeit expensive and proprietary, solutions available from everyone else.

For more on Open Storage, check out the following sites:
SUN Open Storage
DotHill Open Storage
SGI Open Storage