Project

GeoGrid (Stulz CyberAir)

Customers: GeoGrid

Product: Stulz CyberAir

Project date: 2017/01 - 2018/12

2018: Organization of the Air Conditioning System

GeoGrid Center LLC has three offices; the head office is located in a mansion on Malaya Kaluga Street.

GeoGrid Center LLC carried out repairs in the building that involved not only reconstruction but also restoration. After the repairs were completed, the building housed the company's management, the data center, production management, and the technical services that monitor the operation of the engineering systems. However, under the terms of the agreement with the Department of Cultural Heritage of Moscow, the owner did not have the right to rebuild the mansion or redevelop the premises. This also affected the placement of the data center in the building.

"The operation of a data center requires complex engineering support, which, among other things, means the use of numerous and rather massive pieces of equipment. At the same time, the multi-core computing machines, whose uninterrupted operation must be ensured around the clock, consume a very large amount of electricity (with the prospect of further growth in consumption). Therefore, the computer halls of the data processing center were placed in the basement rooms without using the main areas of the house," said GeoGrid Center LLC.

This required non-standard solutions. For example, the roof of the building was not only unsuited to mounting equipment on it (the house is a tower with a sloping roof), but was itself an object of cultural heritage. Therefore, together with HTS specialists, it was decided to install the Guentner dry coolers at ground level, or more precisely, to recess them into the ground, since nothing may be installed in front of the facades above the lower edge of the ground-floor windows.

"In approving this solution, we took into account that it would reduce the capacity of the dry coolers by 25%. However, this step was justified, because computing power is not located in all of the racks: one rack holds switching equipment, one holds the equipment of the media systems and the security system, three are allocated to the internal IT infrastructure, and four to the data storage and backup systems owing to the large volume of processed information," says Konstantin Kotelnikov, General Director of IT-Breeze. "For this configuration, the loss of 25% of the cooling capacity was not critical: our calculations showed that we would still fit into an N+1 redundant configuration. Practice has shown that the simultaneous load of 10 racks of design servers and 10 racks of the remaining equipment generates 140 kW of heat."
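As an illustration of the sizing logic described above, the following minimal Python sketch checks whether an N+1 group of coolers, derated by 25% because of the recessed installation, still covers a 140 kW heat load. The per-unit nominal capacity and the unit count are hypothetical placeholders, not figures from the project.

    # Illustrative capacity check, assuming hypothetical per-unit figures.
    # Grounded in the article only insofar as: 25% derating, N+1 redundancy,
    # and a measured heat load of 140 kW.

    def n_plus_one_capacity_ok(units: int, nominal_kw: float,
                               derate: float, heat_load_kw: float) -> bool:
        """True if the group still covers the load with one unit out of service."""
        usable_per_unit = nominal_kw * (1.0 - derate)   # capacity after derating
        usable_total = (units - 1) * usable_per_unit    # N+1: one unit held as spare
        return usable_total >= heat_load_kw

    # Hypothetical example: 3 dry coolers of 100 kW nominal capacity each.
    print(n_plus_one_capacity_ok(units=3, nominal_kw=100.0,
                                 derate=0.25, heat_load_kw=140.0))  # True: 150 kW >= 140 kW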

Another difficulty was related to the configuration of the room, which imposed restrictions on the placement of the air conditioners. Stulz CyberAir 3 precision air conditioners were chosen for the project, as IT-Breeze had positive experience using this vendor's solutions in collaboration with HTS. Two groups of three air conditioners had to be installed in the room, and the groups turned out to be perpendicular to each other, so the air flows under the raised floor crossed at right angles. To avoid overheating, two racks were loaded with switching and security equipment, since experience has shown that these run the coolest.

"The optimal solution for this room configuration was the CyberAir 3 GES series precision air conditioners with indirect free cooling and inverter compressors from STULZ, of which we are the official distributor," said Alexey Kolokolov, manager at HTS. "This option makes it possible to minimize operating costs for electricity and to extend the service life of the main equipment, because in the Moscow climate, with such design parameters, indirect free cooling is used about 70% of the time. The main advantage of this solution is that the air conditioners can operate in four different modes, adjusting to the temperature conditions outside the building. One of them is extended free cooling, which allows the start of the most energy-consuming component, the compressor, to be postponed."
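To make the four-mode idea above more concrete, the sketch below selects a cooling mode from the outdoor temperature. Only "extended free cooling" is named in the article; the other mode names and all switchover temperatures are hypothetical placeholders, since the real control logic lives in the unit controller and depends on its design parameters.

    # Illustrative mode selection for an indirect free-cooling precision unit.
    # Thresholds and most mode names are hypothetical, not STULZ set points.

    def select_mode(outdoor_c: float) -> str:
        if outdoor_c <= 5.0:
            return "full free cooling"      # outdoor air removes the heat on its own
        if outdoor_c <= 15.0:
            return "extended free cooling"  # compressor start is postponed
        if outdoor_c <= 22.0:
            return "mixed mode"             # free cooling plus partial compressor duty
        return "compressor only"            # full mechanical cooling

    for t in (-10.0, 10.0, 18.0, 30.0):
        print(t, "->", select_mode(t))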

To solve its data processing tasks, the customer needed about 10 thousand physical cores. The design power consumption of the data center is 200 kW (20 racks at 10 kW each). Another 150 kW was allocated for the engineering infrastructure. The entire infrastructure is configured with N+1 redundancy, which provides a high level of fault tolerance. The data center uses the classic equipment layout with a cold aisle, and security systems were installed: gas fire suppression, video surveillance, and physical access control to the premises.
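The power budget in the previous paragraph can be restated as a short worked calculation. Only the figures quoted in the article are used (20 racks, 10 kW per rack, 150 kW for engineering infrastructure, 140 kW of measured heat); everything else is straightforward arithmetic.

    # Worked restatement of the published power budget.
    racks = 20
    kw_per_rack = 10
    it_design_kw = racks * kw_per_rack                 # 200 kW design IT load
    engineering_kw = 150                               # allocated to engineering infrastructure
    total_design_kw = it_design_kw + engineering_kw    # 350 kW total design draw

    measured_heat_kw = 140                             # heat actually generated under load
    it_utilization = measured_heat_kw / it_design_kw   # 0.70 of the design IT load

    print(it_design_kw, total_design_kw, round(it_utilization, 2))  # 200 350 0.7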

Image: STULZ CyberAir 3 air conditioners
Image: Guentner dry coolers