The volatility of postseason basketball provides a good opportunity to talk about another kind of madness: inefficient stewardship and mismanagement of data centers. If missed free throws and turnovers are the key ingredients of March Madness upsets, then failure to take advantage of data center virtualization is the secret sauce behind many of the biggest issues in these facilities.
Virtualization has been around for decades, but only with the rise of cloud computing has it become a widely known concept and a multibillion-dollar commercial market. IT research firm Gartner estimated that the server virtualization market grew 5.7 percent from 2015 to 2016, reaching nearly $6 billion in value.
Even though it is a staple of modern IT, virtualization is not universally well understood, and this knowledge gap can keep technical teams from virtualizing key assets such as servers, desktops, and data centers. Organizations cannot afford to miss such opportunities now that software-defined data centers are becoming central to business operations.
Cloud computing has long been a background news story for many people: something they have heard of and may have a general sense of, but do not really understand in depth. However, the Amazon Web Services outage in late February 2017 likely alerted many internet users to just how deeply a vast number of applications and websites depend on cloud-based infrastructure.
Big data has been an IT buzzword for years, but what does it really mean? Perhaps the simplest definition is that it encompasses the concepts and practices involved in handling information at vast scale. Think of what it is like to manage a database of insurance information or a collection of electronic health records: The scope of such tasks requires expertise in combing through large data sets, as well as familiarity with specific technical tools such as Apache Hadoop and Amazon Web Services.
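To make the idea concrete, the map-and-reduce pattern that Hadoop popularized can be sketched in a few lines of plain Python. This is a toy illustration only, not Hadoop itself: the records and phase names are hypothetical, and a real cluster would distribute the map and reduce steps across many machines.

```python
from collections import Counter
from functools import reduce

# Toy records standing in for a large dataset (hypothetical sample data).
records = [
    "claim approved", "claim denied", "claim approved",
    "claim pending", "claim approved", "claim denied",
]

def map_phase(record):
    # Emit a (key, 1) count for each record, as a Hadoop mapper would.
    return Counter({record: 1})

def reduce_phase(partial_a, partial_b):
    # Merge partial counts, as a Hadoop reducer would.
    return partial_a + partial_b

# Combine all per-record counts into one aggregate tally.
totals = reduce(reduce_phase, (map_phase(r) for r in records), Counter())
print(totals.most_common(1)[0])  # the most frequent record and its count
```

The point of the pattern is that each map step looks at only one record and each reduce step merges only two partial results, so the same logic scales from a six-item list on a laptop to billions of records spread across a cluster.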