Recently, I have been having more and more discussions about software-defined storage: how it is reshaping the landscape of both enterprise and commodity storage, and the need to decouple hardware from software.
Is it really a surprise to anyone that software is eating our lives? Just look at Uber and Tesla. The former holds no actual inventory yet is worth an estimated $50 billion. The latter uses software and real-time updates to give its owners an iPhone-like experience on four wheels. The constant updates and real-time notifications are great, but if you stop and think about it, the genius comes from decoupling the software from the hardware. In both cases, the hardware (the car) is decoupled from the software (the car's operating system).
In the world of enterprise storage, the process of procuring additional capacity or performance is fairly straightforward, but it is not exactly on-demand, unlike so much of the rest of our economy. In short, legacy infrastructure has had a 1:1 pairing, or bonding, between the hardware and the software. In legacy environments it wasn't possible to say, "Let's pool all of our resources from 'n' storage arrays in order to gain economies of scale, resiliency, availability and elasticity." Today the game has certainly changed, and I would consider it changed for the better. If Tesla and Uber can put the intelligence of their platforms into software, why can't traditional storage vendors?
More and more companies are buying only what they need, following the Walmart logistics method of "JIT," or just-in-time. Companies are also looking at extracting that same software and running it atop commodity hardware, since we have already seen that the intelligence of the environment persists in the software, not the hardware. A case could be made that all-flash drives alone can improve the performance of an application such as a database or a real-time ingestion pipeline such as Amazon Kinesis. This trend has had such an effect that a new crop of infrastructure has emerged, known as software-defined or hyper-converged infrastructure, in which the legacy hardware components such as networking, compute and storage are virtualized, or abstracted from the hardware, giving the tenant elastic, scalable and consumable pools of resources to serve to his or her users.
Since it is easier to scale software than hardware, the question has to be asked: "At what point does hardware itself become commoditized and the intelligence of software start to ascend?" I believe it already has, and that it will continue to happen at an even faster pace than in years past. Why? Because as the price of datacenter components falls and more and more companies treat software as a core competency, the hardware can be outsourced to the cheapest bidder while the software becomes the valued IP.
How have you leveraged software-defined "X" to gain a competitive advantage?