For most of the short history of enterprise-class technology, companies have spent their IT budgets buying, provisioning, and maintaining customized solutions running on corporate-owned infrastructure and networks.
The problems with this model are many and costly: over-provisioning, orphaned software licenses, inflexible architectures, under-utilization, cybersecurity issues, zombie servers, the dearth of qualified IT talent, siloed solutions, aging systems that do not meet the demands of the 21st century … the list goes on and on.
The common theme running throughout is a lack of IT cost transparency: most companies have no idea what they are actually getting for their IT dollars. Return on investment (ROI) and total cost of ownership (TCO) calculations remain educated best guesses. Very few in IT leadership can draw a direct line between a single technology and its financial benefit to the organization. As you will see, it is precisely this ROI/TCO trap that has given such rapid rise to the cloud.
The Status Woe
The root of the ROI/TCO trap is the status quo – specifically, the belief that owning and operating technology is the only way to wring the most value from dollars spent. But because the all-in cost of this approach is foggy at best, these assumptions are now being seen for just what they are: assumptions.
This is not new thinking. When Nicholas Carr penned his now-famous Harvard Business Review article, “IT Doesn’t Matter,” his premise that IT is a commodity that brings no tangible business value to an organization sparked a firestorm of criticism. “How could this be?!?” incredulous IT people demanded. “Surely we add value! We must … right?”
That was in 2003, when most companies of any size had invested a hefty percentage of revenue in lots and lots of software, servers, PCs, and networks. In time, however, Carr’s view came to be accepted as fundamentally sound. Three years later, Amazon launched its Web Services and the modern cloud era was born – not because a cloud service had been launched, but because customers flocked to it. Within a year, 180,000 developers had signed up to use the service.
The Trend Line Is Moving In One Direction
If Carr had been wrong, and owning and operating technology really did provide significant business advantage, neither the cloud nor its precursor technology, server virtualization, would have made such deep inroads into corporate IT strategy and thinking. Ever since the word “cloud” entered the IT lexicon, PC sales have flattened or declined, cloud computing in all its forms has grown at a double-digit CAGR, and companies have embarked en masse on data center consolidation and application rationalization efforts.
Companies are shedding owned infrastructure and software as fast as they can. Every year, organizations replace more legacy applications and rely more heavily on the cloud. In 2015, enterprise organizations ran an average of 18 cloud applications. By 2017, that number will nearly triple, to 52.
A more powerful way to look at the ROI/TCO trap is a simple thought experiment: If you were founding a startup today, would you buy, own, and operate all the technology you needed to open your doors, or get it from the cloud? What do you think your investors would want you to do?
We’ve come a long way from the days when technology was just a way to automate existing processes and cut headcount. Cutting-edge businesses the world over are now using technology to create industry-changing products and services, roll them out to new markets faster than ever before, and expand market share by interacting with customers on their terms – all while lowering the unit cost of IT to the organization.
For a conversation about how to exploit the advantages of the cloud to lower your risk and drive business outcomes – and to learn more about how AiNET can help simplify and streamline your IT infrastructure – get in touch.