
Virtualization Classification and Applications

Virtualization in computing is the process of representing a set of computing resources, or their logical combination, which gives any advantages over the original configuration. This is a new virtual view of resources not limited to implementation, geographic location, or physical configuration of components. Typically, virtualized resources include computing power and data storage.

A catalog of virtualization technologies and projects is available on TAdviser.


A good example of virtualization is symmetric multiprocessor computer architectures that use more than one processor. Operating systems are typically configured so that multiple processors appear as a single processing module. As a result, software applications can be written for one logical (virtual) computing module, which is much simpler than working with a large number of different processor configurations.

The concept of a virtual environment (in the original, "virtualization engine") is a new area of virtualization that provides an overall, holistic picture of the entire network infrastructure using aggregation techniques.

TAdviser: Russian virtualization. Review of 15 developers of domestic products

In December 2022, the TAdviser analytical center studied who is present on the Russian virtualization market, analyzed the functional and integration capabilities of products, and assessed the experience of the development companies. The project partner was Softline, which does not own any of the players covered in the study and, as a leading provider of digital transformation solutions and services, is interested in promoting each of them.

IDC: Classification of stages of development of virtualization technologies

  • Virtualization 1.0 was characterized by primary, basic tasks such as encapsulating resources in virtual machines, allocating resources to physical machines, and dynamically consolidating resources into shared pools.
  • At the subsequent Virtualization 2.0 stage, now current in developed countries, priority shifted to tasks related to improving performance: increasing mean time between failures, improving availability, disaster recovery, load balancing, and managing virtual clients.
  • The future Virtualization 3.0 stage involves the creation of automated data centers, the delivery of various services, and orientation toward established sets of rules. This classification reflects the dynamics in America and Western Europe, where since 2009 the number of virtual servers has exceeded the number of physical ones.

Virtualization (Global)

White Paper: Virtualization (Global Market)

Gartner predicts the decline of the virtualization market

The global server virtualization market (x86 architecture) will reach $5.6 billion in 2016, an increase of 5.7% over the previous year, Gartner predicts. Despite the generally positive dynamics, the number of software licenses purchased in this area has begun to decline for the first time. Market growth now comes only from services. This suggests that the software segment is approaching the peak of its development[1].

"Over the past few years, the server virtualization market has matured. In many organizations, more than 75% of servers are virtual, which indicates a high level of saturation," said Michael Warrilow, research director at Gartner.

According to analysts, the attitude towards virtualization among organizations of various sizes differs more than ever. The popularity of virtualization among companies with larger IT budgets in 2014-2015 remained at the same level. Such companies continue to use virtualization actively, and saturation is approaching in this segment. Among organizations with smaller IT budgets, virtualization is expected to decline in popularity in the next two years (until the end of 2017). This trend is already observed.

"Physicalization"

According to Gartner's observations, companies are increasingly resorting to so-called "physicalization" - running servers without any virtualization software. Gartner expects that by the end of 2017, more than 20% of companies will have less than a third of the operating systems on their x86 servers virtualized. For comparison, in 2015 there were half as many such organizations.

Analysts note that companies have different reasons for abandoning virtualization. Today, customers have new options: they can use a software-defined infrastructure or hyperconverged integrated systems. The advent of such options forces virtualization vendors to act more energetically: expand the out-of-the-box functionality of their solutions, simplify interaction with their products, and reduce the payback period for customers.

Hyperconverged Integrated Systems

In early May 2016, Gartner released a forecast for hyperconverged integrated systems. According to analysts, in 2016 this segment will grow by 79% compared to 2015 to almost $2 billion and within five years will reach the mainstream stage.

In the coming years, the hyperconverged integrated systems segment will show the highest growth rates compared to any other integrated systems. By the end of 2019, it will grow to about $5 billion and will occupy 24% of the integrated systems market, Gartner predicts, noting that the growth of this direction will lead to cannibalization of other market segments.

Analysts define hyperconverged integrated systems (HCIS) as hardware and software platforms that combine software-defined compute modules, software-defined storage, standard accompanying equipment, and a common management console.

Types of virtualization

Virtualization is a generic term covering the abstraction of resources across many aspects of computing. Some of the most common examples of virtualization are listed below.

Paravirtualization

Paravirtualization is a virtualization technique in which the guest operating system is prepared for execution in a virtualized environment by slightly modifying its kernel. Instead of using resources such as the memory page table directly, the operating system interacts with the hypervisor through a guest API that it provides. The virtualization code is localized directly in the operating system. The drawback of this method is that the guest operating system must be modified for the hypervisor, which is possible only if the guest OS source code is open and its license permits modification. At the same time, paravirtualization offers performance close to that of a real, non-virtualized system, as well as the ability to run different operating systems simultaneously, as with full virtualization.
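The contrast between trap-and-emulate full virtualization and a paravirtualized hypercall API can be sketched conceptually. The Python model below is a toy illustration, not a real hypervisor; all class and method names (`Hypervisor`, `hypercall_map_page`, and so on) are illustrative assumptions:

```python
# Toy model: full virtualization must trap and emulate privileged
# instructions from an unmodified guest, while a paravirtualized guest
# kernel calls the hypervisor's API explicitly.

class Hypervisor:
    """Owns the real page-table state on behalf of all guests."""
    def __init__(self):
        self.page_tables = {}          # guest_id -> {virtual_page: physical_frame}

    def hypercall_map_page(self, guest_id, vpage, frame):
        # Paravirtualized path: the guest asks explicitly, so no
        # trap-and-emulate machinery is needed.
        self.page_tables.setdefault(guest_id, {})[vpage] = frame
        return True

    def trap_privileged_instruction(self, guest_id, instruction, vpage, frame):
        # Full-virtualization path: the guest ran a privileged instruction
        # unmodified; the hypervisor must decode and emulate it (slower).
        if instruction == "mov_cr3_update":
            self.page_tables.setdefault(guest_id, {})[vpage] = frame
            return True
        raise NotImplementedError(instruction)

class ParavirtGuestKernel:
    """A guest kernel whose memory-management code was modified for the hypervisor."""
    def __init__(self, guest_id, hv):
        self.guest_id, self.hv = guest_id, hv

    def map_page(self, vpage, frame):
        # Instead of writing to the page table directly, call the guest API.
        return self.hv.hypercall_map_page(self.guest_id, vpage, frame)

hv = Hypervisor()
guest = ParavirtGuestKernel("vm1", hv)
guest.map_page(0x1000, 0x9F000)
print(hv.page_tables["vm1"][0x1000])   # the hypervisor holds the real mapping
```

The point of the sketch is only the shape of the interaction: in both paths the hypervisor owns the real state, but the paravirtualized path avoids the decoding step entirely.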

Infrastructure Virtualization

By infrastructure virtualization we mean the creation of an IT infrastructure that does not depend on the hardware: for example, when the service we need runs on a guest virtual machine and it does not particularly matter to us which physical server it resides on.

Virtualization of servers, desktops, applications - there are many methods for creating such an independent infrastructure. In this case, multiple virtual or "guest" machines are hosted on the same physical or host server by special software called "hypervisor."

Modern virtualization systems, in particular VMware and Citrix XenServer, mostly work on the bare-metal principle, that is, they are installed directly on the hardware.

Example

The virtual system is built not on a bare-metal hypervisor but on a combination of the CentOS 5.2 Linux operating system and VMware Server, on a server based on the Intel SR1500PAL platform: 2 Intel Xeon 3.2/1/800 processors, 4 GB RAM, 2 x 36 GB HDD in RAID1 and 4 x 146 GB HDD in RAID10 with a total capacity of 292 GB. The host runs four virtual machines:
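The usable capacity figures in this configuration follow from simple RAID arithmetic: RAID1 mirrors disks (usable space equals one disk), while RAID10 stripes mirrored pairs (usable space is half the raw total). A minimal sketch, with function names of our own invention:

```python
# Usable-capacity arithmetic for the storage layout above: RAID1 mirrors
# (usable = one disk), RAID10 stripes mirrored pairs (usable = half the
# raw capacity). Plain arithmetic, no storage API involved.

def raid1_capacity(disk_gb: int, disks: int = 2) -> int:
    assert disks >= 2, "RAID1 needs at least two disks"
    return disk_gb                      # every extra disk is a mirror copy

def raid10_capacity(disk_gb: int, disks: int) -> int:
    assert disks >= 4 and disks % 2 == 0, "RAID10 needs an even number >= 4"
    return disk_gb * disks // 2         # half the raw space goes to mirrors

system_volume = raid1_capacity(36, disks=2)    # 2 x 36 GB in RAID1
data_volume = raid10_capacity(146, disks=4)    # 4 x 146 GB in RAID10
print(system_volume, data_volume)              # 36 292 (matches the 292 GB above)
```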

Server Virtualization

Gartner analysts note: "Virtualization of the x86 server infrastructure acts as a starting point for two of the most important modern industry trends - infrastructure modernization and cloud computing." Moreover, "it is fundamentally changing the way businesses deploy, manage, and deliver IT." The researchers are convinced that "virtualization of the x86 server infrastructure is now a key area of information technology development, making the strategic evolution of the server toward virtualization and cloud computing more obvious to the IT directors of large enterprises."


OS-level virtualization

Resource Virtualization

  • Partitioning is the division of a single, usually sufficiently large resource (for example, disk space or network bandwidth) into a number of smaller, easier-to-use resources of the same type.

An example implementation of resource partitioning is the Crossbow project, which allows several virtual network interfaces to be created on top of a single physical interface.

  • Aggregation is the combination of multiple resources into a larger resource or a resource pool. For example, symmetric multiprocessor systems combine multiple processors; RAID and disk managers combine multiple disks into a single large logical disk; RAID and network equipment combine multiple channels so that they appear as a single broadband channel. At the meta level, computer clusters do all of the above. Sometimes this also includes network file systems abstracted from the data stores on which they are built, for example VMware VMFS, Solaris ZFS, NetApp WAFL.
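The partitioning idea above - carving one large resource into smaller resources of the same type, as Crossbow does with virtual network interfaces - can be sketched as follows. This is a toy Python model; the class and interface names are illustrative and do not reflect any real Crossbow API:

```python
# Toy model of resource partitioning: one physical NIC's bandwidth is
# carved into virtual interfaces, each with its own cap, and the sum of
# caps is never allowed to exceed the physical link.

class PhysicalNIC:
    def __init__(self, name: str, bandwidth_mbps: int):
        self.name = name
        self.bandwidth_mbps = bandwidth_mbps
        self.vnics = {}                 # vnic name -> reserved bandwidth

    def reserved(self) -> int:
        return sum(self.vnics.values())

    def create_vnic(self, name: str, cap_mbps: int) -> None:
        # Refuse to over-commit the physical link.
        if self.reserved() + cap_mbps > self.bandwidth_mbps:
            raise ValueError(f"only {self.bandwidth_mbps - self.reserved()} Mbps left")
        self.vnics[name] = cap_mbps

nic = PhysicalNIC("e1000g0", bandwidth_mbps=1000)
nic.create_vnic("vnic0", 400)           # e.g. for a web-server zone
nic.create_vnic("vnic1", 400)           # e.g. for a database zone
print(nic.reserved())                   # 800 of 1000 Mbps partitioned out
```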

Application Virtualization

  • Application virtualization provides a working environment for a locally running application that uses local resources. The virtualized application runs in a small virtual environment that includes the registry keys, files, and other components needed to launch and run it. This virtual environment acts as a layer between the application and the operating system, which avoids conflicts between applications. Examples of application virtualization include systems such as SoftGrid and Thinstall.
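The layering described above - a private environment of registry keys and files sitting between the application and the operating system - can be sketched as a simple overlay. A minimal illustration in which a dictionary stands in for the registry; real products such as SoftGrid and Thinstall implement this at the filesystem and registry driver level:

```python
# Conceptual sketch of application virtualization: each application sees
# a private overlay on top of the shared OS state, so two applications
# needing conflicting versions of the same key can coexist.

class VirtualEnvironment:
    def __init__(self, os_registry: dict, app_overrides: dict):
        self.os_registry = os_registry        # shared OS state, read-only from here
        self.overlay = dict(app_overrides)    # private per-application layer

    def read(self, key: str):
        # The overlay wins; reads fall through to the real OS otherwise.
        return self.overlay.get(key, self.os_registry.get(key))

    def write(self, key: str, value) -> None:
        # Writes never touch the shared OS state.
        self.overlay[key] = value

os_registry = {"HKLM/Shared/runtime.dll": "v1.0"}
app_a = VirtualEnvironment(os_registry, {"HKLM/Shared/runtime.dll": "v2.0"})
app_b = VirtualEnvironment(os_registry, {})

print(app_a.read("HKLM/Shared/runtime.dll"))   # v2.0 - app A's private version
print(app_b.read("HKLM/Shared/runtime.dll"))   # v1.0 - the untouched OS value
```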

Virtualization Application Areas

  • Server consolidation: virtual machines are often used to consolidate the workloads of several physical machines onto fewer, more powerful ones as virtual machines.
  • Test laboratories and training: because virtual machines are easy to deploy, they are often used to build test stands and for training in new products and technologies.
  • Distribution of pre-installed software: many software developers create ready-made virtual machine images with pre-installed products and provide them on a free or commercial basis. Such services are provided, for example, by VMware (VMTN) or Parallels Software (PTN).

Virtualization provides many conveniences and benefits, especially on "heavy" systems where hardware partitions are used. Running multiple virtual servers on a single physical server significantly improves its utilization. Moreover, on current multi-core processors with special instruction sets, performance losses are minimal or absent altogether. As a result, the average company needs one server instead of four. The savings are obvious: reduced costs for equipment, electricity, and floor space.

Everything related to Internet/intranet services can be virtualized; these tasks do not require significant resources. File and print servers, as well as servers supporting collaboration systems, are more significant for the business, but even here virtualization does not pose any special problems. However, for database servers, starting from a certain amount of stored data, virtual servers are poorly suited. The main reason is the existing limits on the amount of resources that can be allocated to a virtual machine. Therefore, BI systems, as well as highly critical applications such as ERP or CRM, are not recommended for deployment on virtual servers.

In addition, the benefits of virtualization appear only after a certain critical mass of physical servers is reached. The total cost of a system using virtualization, including software licenses, can be higher than that of several physical servers, each solving a single task, and it is not always possible to recoup this difference through a lower cost of ownership. Furthermore, the physical server becomes a single point of failure: its breakdown will affect all virtual servers installed on it. According to Gartner analysts, virtualization is applicable to only 40% of all servers.

Virtualization in SMBs

Complex IT operations are not unique to large organizations. SMBs also face significant heterogeneity in IT operations. At the same time, they have a much smaller number of employees. Despite the fact that about 80% of all organizations have physical servers, 34% of all organizations simultaneously manage physical and virtual servers, as well as operations in cloud environments, the so-called "triple service" in computer environments. Among organizations that manage virtual infrastructure, 54% have two or more different hypervisors. At the same time, the range of names for these hypervisors is much wider than VMware vSphere and Microsoft Hyper-V. 67% of organizations with virtual servers use at least one hypervisor that is not a product of these two market leaders. 42% of organizations store at least some of their data in the cloud, and of those organizations that back up data remotely for disaster recovery reasons, 65% store at least some of this data in cloud storage. Clearly, administrators need storage solutions that cover all three storage methods: physical, virtual, and cloud.

In May 2014, Acronis sponsored IDC's global cross-industry survey of small and medium-sized organizations (up to 1,000 employees) to explore the ever-changing data protection and disaster recovery needs of these organizations. The respondents were IT employees responsible for purchasing decisions and for the overall management of groups that deal with data protection or influence data protection procurement decisions. The total sample size was 401.

Common challenges administrators face when developing backup solutions for these hybrid (physical, virtual, and cloud) environments include controlling the complexity and cost of solutions, and moving data and systems between physical, virtual, and cloud environments. As economic considerations dictate further reductions in IT personnel, administrative functions are gradually shifting to distributed workload IT professionals - especially in small and medium-sized organizations. Distributed load specialists typically have the skills to manage servers and software applications, but they are less familiar with storage management procedures, while they increasingly have to take on this function.

More than 70% of the organizations surveyed use several backup and data storage software applications. Administrators try to match data protection solutions to software applications based on their characteristics, primarily considering specialization, ease of use, and cost as the main purchase criteria. Over 40% of organizations have purchased a separate backup storage product designed specifically for virtual environments. The ease of use of a backup storage solution is of particular importance as distributed-load professionals assume greater responsibility for managing software applications, including functions they are not typically familiar with, such as backup storage and disaster recovery.

As organizations become increasingly dependent on IT services, administrators focus on minimizing downtime. IT departments more than ever track critical metrics such as the cost of downtime, and managing availability in full compliance with a guaranteed service level agreement is becoming common practice. Among the small and medium-sized organizations surveyed, about 60% estimate the cost of downtime of their most critical application at 20 to 100 thousand US dollars. The recovery point objective (RPO) for approximately 85% of them is less than one hour, while the recovery time objective (RTO) for 78% is less than four hours. The recovery point objective determines the acceptable amount of data that can be lost in the event of a failure; the recovery time objective determines the time required to restore normal operation after a software application failure.
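The RPO and RTO definitions above reduce to simple arithmetic. A minimal sketch with illustrative figures (the function names and numbers are our own, chosen only to match the survey's ranges):

```python
# RPO: with periodic backups, the worst-case data loss is one full
# backup interval (a failure just before the next backup).
def worst_case_data_loss_minutes(backup_interval_minutes: int) -> int:
    return backup_interval_minutes

# RTO: the time to restore normal operation; the business impact is
# that time multiplied by the per-hour cost of downtime.
def outage_cost(rto_hours, cost_per_hour_usd):
    return rto_hours * cost_per_hour_usd

print(worst_case_data_loss_minutes(60))   # a 60-minute RPO with hourly backups
print(outage_cost(4, 20_000))             # $80,000 for a 4-hour RTO at $20k/hour
```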

Virtual Environment Information Security

Despite the rapid growth of the virtual solutions market, the security of the virtual environment has not yet drawn the attention of chief information officers, let alone company management. Meanwhile, attackers who gain control of the physical host gain access to the many business-critical applications running on its virtual machines. The cost of an information security incident therefore rises sharply.

When implementing virtualization projects, a number of organizational threats arise. First, the ease with which virtual machines are created leads to unregistered accounts, unnecessary services, and errors in configuring the virtual infrastructure. Additional problems arise with auditing the infrastructure, checking it for compliance with the company's internal and external information security policies, and managing updates. The complexity of infrastructure management grows, which reduces the quality of the management process and the overall level of information security. Besides organizational threats, external threats also arise: the likelihood increases of infected guest machines appearing in the virtual environment, of hypervisor compromise, and of a number of other threats inherent in a virtual environment.

The virtual environment allows you to safely, quickly, and efficiently test business application updates, makes it easy to create malware traps (honeypots) that mimic the company's real IT environment, and allows other information security measures to be implemented.


Virtualization technology introduces an additional component - the hypervisor, which does not execute third-party code and controls the operation of applications in all virtual environments. Checking for potential threats through the hypervisor is more reliable, since malicious code cannot resist it. Attacks on the hypervisor itself are hampered by the fact that it does not execute user code and all drivers are supplied by the virtualization platform developer. At the same time, according to Lisachev, the more compact the hypervisor, the more secure it is. The VMware hypervisor, he said, is "an order of magnitude more compact than competing ones," so attacking it is even harder than attacking Hyper-V or Xen, which run inside a host operating system where, in principle, third-party applications can be launched. Thus, virtualization serves as an additional layer of protection against attempts to capture applications or user data.

In addition, the virtualization platform can be used to solve certain information security problems. In particular, in a virtual environment, software updates can be tested to check the stability of their operation before they reach the production system. It is also easy to deploy trap servers that look very attractive to hackers but contain no real data. By controlling such a trap through the hypervisor, an administrator can identify targeted attacks and collect information to catch the criminals. Incident investigation is also simplified in a virtual environment, because it is harder for an attacker to reach the system logs and "clean them up." In addition, complete images of virtual systems can be saved in a backup for subsequent close analysis by forensic experts. All this gives VMware representatives reason to argue that a virtual environment can be better protected than an information system that does not use virtualization.

Virtualization technology integrates applications and data into a single environment. Until recently, data and applications were protected in different ways: data with backup systems, programs with antivirus tools; with the transition to virtualization, the methods of protecting programs and data should converge. Technologies are already emerging for virus scanning inside a powered-off virtual machine - such a mechanism is implemented, in particular, in Trend Micro Deep Security. In addition, malware has begun to be dealt with by restoring the virtual environment from a clean, verified backup. This convergence of application and data protection techniques will only expand as virtualization technologies spread.

However, for virtual environments to be safer than physical ones, you need to take advantage of the capabilities provided by virtualization technology. It is not enough simply to transfer the security system built for the physical infrastructure onto the virtualization platform; you need to actively implement the additional functionality described above: control viruses arriving from outside the virtual environment, deploy honeypot servers, investigate incidents using virtual machine backups, and so on. By applying these methods of protecting virtual applications, an enterprise will have a better chance of maintaining control over its information.

Virtualization in Russia

According to Microsoft estimates, the Russian virtualization market grew by more than 100% in 2010. "Judging by the number of requests and projects implemented on the basis of our solutions, the market has approximately doubled over the year," says Vasily Malanin, Microsoft HPC and Virtualization Product Manager. "The main reasons for the overall growth of the market are its recovery from the crisis, as well as the growing interest in virtualization technology from customers."

The key players in the Russian virtualization market in 2010 remained Microsoft, VMware, Citrix. For example, Microsoft estimates its market share in the last fiscal year, which ended in July 2010 for the company, at about 40%, based on the number of physical servers on which their virtualization solutions are used.

Most often, Russian enterprises use virtualization technologies in relation to databases, e-mail and ERP applications. Moreover, according to the survey, the majority of all services running in a virtual environment are critical for business. Such data contains a study by Kaspersky Lab on the analysis of the current state of development of virtualization technologies in Russia.

According to Kaspersky Lab experts, virtualization technologies are now at their peak and have reached a level of maturity at which companies transfer business-critical applications to the cloud and store and process confidential data on virtual machines. The study also showed that Russia is not inferior to Europe and the United States in how long virtualization technologies have been in use: on average, Russian companies have used them for almost 2 years, which matches the global figure.

Despite the widespread use of virtualization, more than two-thirds of the surveyed IT professionals in Russia said they had only basic, though sufficient, knowledge in this area to perform their daily duties. Only 13% could call themselves real experts; for comparison, in the United States this figure reaches 38%.

Virtual machines in Russia are not protected effectively enough

Only 11% of Russian companies use specialized tools to protect their virtual servers. This is evidenced by the results of a survey conducted by Kaspersky Lab among IT specialists in Russia. As it turned out, over half of domestic enterprises prefer to protect virtual machines with the same antivirus solutions they use for physical computers. Moreover, 73% of organizations admitted that they currently apply a single security policy to physical and virtual environments, and only a quarter of them consider this a temporary measure until a separate policy is developed for the virtual infrastructure.

With this approach, experts say, companies can negate one of the main advantages of virtualization, namely the efficient use of hardware resources. Unlike specialized solutions, traditional security tools require a copy of the antivirus engine and signature databases to be installed on each virtual machine. This can cause effects such as simultaneous "heavy" scans and updates, which can significantly slow down the host server and even lead to its emergency shutdown.
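The overhead argument above can be illustrated with back-of-the-envelope arithmetic: an agent-based design duplicates the scan engine and signature database inside every virtual machine, while an agentless design keeps one shared copy on the host. The per-copy sizes below are illustrative assumptions, not measured product figures:

```python
# Why agent-per-VM antivirus wastes a consolidated host's resources:
# every VM carries its own engine and signature database, while an
# agentless design keeps one shared security appliance.

def agent_based_overhead_mb(vms: int, engine_mb: int = 150, signatures_mb: int = 300) -> int:
    return vms * (engine_mb + signatures_mb)   # one full copy inside each VM

def agentless_overhead_mb(vms: int, engine_mb: int = 150, signatures_mb: int = 300) -> int:
    return engine_mb + signatures_mb           # one shared copy for all VMs

print(agent_based_overhead_mb(20))   # 9000 MB duplicated across 20 VMs
print(agentless_overhead_mb(20))     # 450 MB regardless of VM count
```

The same duplication applies to CPU and I/O when all agents start a scan or an update at once, which is the "heavy scan" effect described above.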

"The reason for this situation lies primarily in the lack of business awareness regarding the protection of virtual environments. Despite the growing popularity of virtualization technologies, many companies treat IT security problems with a certain carelessness. There is also an underestimation of risks - many IT specialists have a misconception about the supposedly higher security of virtual infrastructure - and low awareness among companies of specialized security solutions. The latter include, in particular, the recently introduced Kaspersky Security for Virtualization product, developed with the specifics of virtualization in mind," comments Vladimir Udalov, Head of Corporate Products in Emerging Markets at Kaspersky Lab.

At the same time, experts expect the popularity of specialized solutions to grow. The overwhelming majority of respondents (74%) named the ability to provide protection without compromising virtual machine performance as the most important characteristic of an antivirus product. This balance is achievable only with solutions adapted for protecting virtual environments that do not require installing an antivirus agent on each machine.


Notes