Before we discuss trends in virtualization or look at its early beginnings, let us consider broadly what it means to virtualize. “Virtualisation is the configuration of servers or clients which results in the division of resources into multiple, isolated execution environments, by applying one or more concepts or technologies to reduce costs and enhance flexibility associated with the acquisition, implementation, management, expansion, and recovery of critical business systems.” (Olzak et al., 2010).
Virtualization arose from growing demand for IT services: businesses and organizations wanted a more consolidated way to manage and monitor their systems, provision new servers and services as needed, maintain high availability for mission-critical applications and, in the same breath, increase the return on IT investments. This radical evolution began in the 1960s, when IT experts researched methods of logically dividing the resources of a mainframe computer between different applications. This is evident in the study conducted by Prangchumpol et al. (2015), which states, “As business and or organizations seek to improve their infrastructure and access to data; they are forced to consider cost and the most appropriate use of resources.” The authors also state that “…virtualization can be used to simultaneously manage heterogeneous workloads of different applications.”
Since then, virtualization has also spurred the growth of Big Data and computer security. We are living in a virtual world, as claimed in the study by Turner et al. (2014), which also asserts that we are creating more data in this decade than in the previous one. This wave of data stems from the fact that more and more users are entering the online world through smartphones and other smart devices that collect and store information at a rapid rate: Nest home devices, digital camera systems, automated systems, and intelligent, learning machines, otherwise termed the Internet of Things (IoT).
In light of this growth in data and computer security, as explained by Jeong et al. (2015), “many companies have managed to reduce costs and enhance the efficiency of their servers by adopting a virtual desktop infrastructure (VDI) which is classified into private cloud computing.”
In my line of work, we have embarked on the path of virtualization. We are a growing company with locations in the US and two (2) locations in the Caribbean, and we are looking to expand even further. About four (4) weeks ago, days before I started my training here, I attended a webinar with a consultant on VMware and its many attributes. I was given an insight into the world of virtualization and all its benefits, from being able to test disaster recovery in the back end to vMotion (which enables migration of a running virtual machine from one physical server to another with no downtime).
In conclusion, virtualization allows for more efficient use of the host machine’s resources, from datastores and memory to CPU. This in turn has spurred the growth of cloud computing (which gave rise to social media and the enormous amount of data being generated as we speak) and satisfies the need for a more robust infrastructure as demand grows.
Every new technology generally brings risks and some drawbacks, and virtualization is no exception: it can introduce security risks and hardware compatibility issues. Areas of concern include antivirus and malware protection, proper risk assessment and planning, change management, isolating production from development environments, and more.
Bigelow (2010) describes compatibility issues he faced after virtualizing a legacy system: “Hardware incompatibility issues are among the most common virtualization problems. Even though virtualization does a good job of abstracting workloads from the underlying hardware, any software that is hardware-dependent can lead to virtualization problems.” He goes on to advise that before virtualizing any physical machine, full consideration should be given to ensuring that the new environment reflects the old as closely as possible. This in turn narrows the window of incompatibility.
Olzak, T., Boomer, J., Keefer R. M., Sabovik, J. (2010) Microsoft virtualization: master Microsoft server, desktop, application, and presentation virtualization, 1st edn., Amsterdam: Syngress.
Prangchumpol, D., Sophatsathit, P., Lursinsap, C., Tantasanawong, P. (2015) ‘Resource Allocation with Exponential Model Prediction for Server Virtualization’, Academic Journal, 13(5), p. 385 [Online]. Available at: http://eds.b.ebscohost.com.liverpool.idm.oclc.org/eds/pdfviewer/pdfviewer?sid=66a5454d-7cd9-47f8-baa8-51206cf7fe23%40sessionmgr102&vid=0&hid=127 (Accessed: March 12, 2016).
Turner, V., Gantz, J. F., Reinsel, D., Minton, S. (2014) The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things. [Online]. Available at: http://idcdocserv.com/1678 (Accessed: March 12, 2016).
Jeong, D. et al. (2015) ‘Investigation Methodology of a Virtual Desktop Infrastructure for IoT’, Journal of Applied Mathematics, 2015, pp. 1–10 [Online]. Available at: http://eds.a.ebscohost.com.liverpool.idm.oclc.org/eds/pdfviewer/pdfviewer?vid=4&sid=60bea189-efeb-487f-b9ba-6c5fae28de2d@sessionmgr4005&hid=4111 (Accessed: March 12, 2016).
Bigelow, S. J. (2010) Hardware compatibility issues lead to virtualization problems [Online]. Available at: http://searchservervirtualization.techtarget.com/feature/hardware-compatibility-issues-lead-to-virtualization-problems (Accessed: March 12, 2016).