Google Data Centers
The Internet corporation built its network of data centers practically without hardware switches and routers from the well-known global brands. Instead, Google, in cooperation with Asian manufacturers of open servers, developed software-controlled telecom equipment, which not only saved money but also made the network far more flexible.
|Technology:||Cloud Computing, IaaS (Infrastructure as a Service), SDN (Software-Defined Networking), Data Centers|
- The main article about the company: Google
- Google's relations with governments and intelligence agencies around the world
- History of Google
- Acquisitions and asset sales of Google
- Employees and work at Google
- Financial performance of Google
- Google stock
- Google Russia
- Google offices
2020: Construction of a $2 billion data center in Poland
At the end of June 2020 it became known that Google is building a $2 billion data center in Poland for its cloud services. It is the largest infrastructure investment of this kind in the country, said Magdalena Dziewguc, director of business development for Google Cloud in Poland and Central and Eastern Europe.
Poland positions itself as the region's technology hub, and not without reason: in May 2020 Microsoft likewise announced plans to invest $1 billion in a Polish data center. Deputy Prime Minister of Poland Jadwiga Emilewicz noted that, by her estimate, Google may invest between $1.5 and $2 billion in the new project.
Google's plans to launch a new cloud region in Warsaw became known in the fall of 2019. At the time, Google Cloud CEO Thomas Kurian wrote in the company blog that the choice fell on Warsaw because "Poland is in a period of rapid growth, accelerating its digital transformation, and has already become an international software engineering hub".
The new data center is expected to serve Poland and the countries of Central and Eastern Europe. The new cloud region will have three zones to protect against service interruptions and will launch with a set of key Google products such as Compute Engine, App Engine, Google Kubernetes Engine, Cloud Bigtable, Cloud Spanner and BigQuery.
The new data center will join Google's network of 20 cloud regions and 61 availability zones serving Google Cloud customers (as of the end of June 2020). Google's partner in the project is Domestic Cloud Provider, a venture of a Polish bank and the state-owned investment firm the Polish Development Fund. The Google facility is scheduled to open at the beginning of 2021.
€3 billion investment in European data centers
On September 20, 2019 Google announced a €3 billion investment in the development of its European data centers. The money is to be spent over two years, company CEO Sundar Pichai said at a press conference in Helsinki.
|"This is fantastic news for Finland," said the country's Prime Minister Antti Rinne, who also took part in the press conference.|
According to him, of the €3 billion named by Google, €600 million is earmarked for a new data center in the Finnish city of Hamina. With this investment, total spending on the Hamina site since 2009 will reach €2 billion, bringing the region some 4,300 jobs a year, Rinne said.
From 2007 to September 2019, Google invested more than €15 billion in European Internet infrastructure.
The company builds its data centers using renewable energy sources. In September 2019 Google concluded the biggest renewable energy purchase in its history, nearly half of which will be generated in Europe through the launch of 10 new energy projects.
|"A consequence of the signed agreements will be investments of more than €1 billion in new energy infrastructure in the EU, from a new wind energy project in Belgium to five solar projects in Denmark and two wind projects in Sweden," Sundar Pichai said.|
In total, 18 new agreements covering 1,600 MW of energy were signed. They will increase Google's portfolio of wind and solar deals by more than 40%, to 5,500 MW.
Earlier in September 2019 Google announced plans to build a data center in Tainan, its second in Taiwan after the launch of a facility in Changhua in 2013.
Google invests €600 million in a data center on the Russian border
Google already has a data center in Hamina, converted from a paper mill that the Stora Enso company sold to the American corporation in 2009. Google spent about €800 million transforming the plant into a data center.
Google calls this facility one of the most technically advanced and energy efficient in its network. It is cooled with seawater from the Gulf of Finland, which has reduced its energy consumption.
The new Hamina data center will likewise be located on the grounds of the former paper mill, which earlier belonged to the Summa company.
Google's total investment in Hamina will thus reach €1.4 billion. Jari Gustafsson, Permanent Secretary of the Ministry of Economic Affairs and Employment, called the investment "good news and evidence of a stable and competitive environment". According to him, the funds allocated by Google strengthen Finland's digital environment.
The head of the Hamina municipality, Hannu Muhonen, also welcomed Google's investment plans, noting that they make the region more attractive overall.
|"Demand for Google services grows every day, and we are building data center infrastructure to meet it. This in turn has a positive impact on the Finnish economy, including through the creation of jobs," said Antti Jarvinen, head of Google's Finnish office.|
Google increased the performance of its wind farms by 20%
On February 27, 2019 it became known that researchers at the DeepMind laboratory had trained a neural network to predict the output of wind turbines 36 hours in advance, based on historical performance data and the weather in a given area. According to the researchers, the algorithm improved the performance of Google's wind farms by 20 percent.
Wind farms are among the most practical and widely used forms of alternative energy generation. Their operation depends heavily on wind speed: a turbine's rotor must sit at a height with a sufficiently high average wind speed (about 4.5 meters per second). Yet wind patterns at a given site can be hard to forecast reliably, which can reduce a farm's efficiency.
Developers at DeepMind (part of the Alphabet holding) proposed solving this problem with an algorithm that reliably predicts, from available data, the energy a wind farm in a given area will generate. They trained a neural network on data about local wind strength, weather, and the energy produced by particular turbines.
With it, the algorithm can predict a farm's output 36 hours ahead. That leaves enough time to estimate generation and costs, and to take measures that save production resources. The algorithm, which the researchers say has been in testing since 2018, has already helped Google improve the performance of its wind farms by 20 percent; the company plans to keep refining it as more data accumulates.
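The forecasting setup can be sketched in miniature. Everything below (the synthetic training data, the cubic power curve, the least-squares fit) is an illustrative assumption; DeepMind's actual system is a neural network trained on far richer weather and turbine telemetry:

```python
import numpy as np

# Minimal sketch of the forecasting idea, not DeepMind's actual model.
# We fit a regression mapping wind speed to expected turbine output,
# then query it at the 36-hour-ahead forecast value.

rng = np.random.default_rng(seed=0)

wind_speed = rng.uniform(3.0, 12.0, size=500)  # m/s, synthetic history
# Toy ground truth: output grows roughly with the cube of wind speed
output = 0.5 * wind_speed ** 3 + rng.normal(0.0, 5.0, size=500)

# Ordinary least squares on polynomial wind-speed features
features = np.column_stack(
    [wind_speed, wind_speed ** 2, wind_speed ** 3, np.ones_like(wind_speed)]
)
coef, *_ = np.linalg.lstsq(features, output, rcond=None)

def predict_output(v):
    """Predicted output (arbitrary units) for a forecast wind speed v in m/s."""
    return float(np.array([v, v ** 2, v ** 3, 1.0]) @ coef)

# Given tomorrow's weather forecast, stronger wind means much more power
print(predict_output(8.0) > predict_output(4.5))  # True
```

The value of such a forecast is less the number itself than the lead time: 36 hours is enough to schedule deliveries and commit generation to the grid in advance.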
Google fully automated the cooling of its data centers
In August 2018 it became known that Google had fully automated the cooling of its data centers. Thanks to this, the company was able to cut energy consumption by 30%, reports DeepMind, the Google-owned startup that developed the new control system.
DeepMind's engineers created an algorithm that, using readings from several thousand sensors, evaluates the cooling system every five minutes. It is built on a neural network trained on two years of collected data.
The technology predicts the most efficient operating parameters for data center equipment, maintaining a stable temperature at the lowest possible energy consumption.
Google achieved energy savings of almost 30% within two months of deploying DeepMind's system. Although cooling is now fully automated, specialists continue to monitor it to verify that the technology operates stably and to minimize the risks of overheating or artificial intelligence errors.
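The shape of such a control loop can be sketched roughly as follows. The sensor names, the toy PUE model and the safety limits here are illustrative assumptions, not Google's or DeepMind's actual system:

```python
# Rough sketch of a predict-and-select cooling control cycle: score
# candidate setpoints with a model, keep only the safe ones, apply the
# most energy-efficient. All names and numbers are illustrative.

SAFE_TEMP_RANGE = (18.0, 27.0)  # °C; enforced regardless of model output

def predict_pue(setpoint_c, sensor_readings):
    """Toy stand-in for the learned model: colder setpoints cost more energy."""
    load_penalty = sensor_readings["it_load_kw"] / 10000.0
    cooling_penalty = 0.02 * max(0.0, 26.0 - setpoint_c)
    return 1.1 + cooling_penalty + load_penalty

def choose_setpoint(sensor_readings, candidates):
    """One five-minute cycle: filter unsafe setpoints, pick the lowest PUE."""
    safe = [c for c in candidates if SAFE_TEMP_RANGE[0] <= c <= SAFE_TEMP_RANGE[1]]
    if not safe:
        return 22.0  # conservative fallback, flagged for the human operators
    return min(safe, key=lambda c: predict_pue(c, sensor_readings))

readings = {"it_load_kw": 850.0}  # would come from thousands of sensors
print(choose_setpoint(readings, candidates=[16.0, 20.0, 24.0, 26.0, 30.0]))  # 26.0
```

The safety clamp mirrors the article's point: even a fully automated system runs inside limits that humans define and monitor.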
Google believes the algorithm can also be applied in other areas with high energy consumption.
DeepMind began applying machine learning to reduce energy consumption in 2016. Using a neural network, the company made its data centers 15% more energy efficient and cut the energy used for cooling by 40%.
|"It was amazing to see the AI learn to take advantage of winter conditions and produce colder water, which in turn reduces the energy needed for cooling. Fixed rules do not improve over time, but AI does," noted Dan Fuenffinger, an operator at one of Google's data centers.|
Nearly half of the CO2 emissions from the Internet's operation are attributed to Google
As the Quartz publication reported on May 7, 2018, Google's website, which processes about 3.5 billion queries a day, generates about 40% of the carbon emissions of the entire Internet. Although the Internet is "virtual", it in fact relies on millions of physical servers in data centers around the world, connected by submarine cables, switches and routers. Most of the power for this infrastructure is produced by burning fuel, which releases carbon dioxide into the air. One study suggests that emissions from Internet activity are comparable to those generated by the global aviation industry.
The researcher Joana Moll created the data visualization CO2GLE. The service uses Internet traffic data from 2015 and is based on the assumption that Google.com processes on average about 47,000 requests every second, corresponding to an estimated 500 kg of CO2 emissions per second, or roughly 0.01 kg per request. According to Moll, the figures are approximate.
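The figures above can be cross-checked with a quick back-of-the-envelope calculation (approximate, as Moll herself notes):

```python
# Cross-checking the CO2GLE estimate cited above.
requests_per_second = 47_000
co2_kg_per_second = 500.0

co2_per_request = co2_kg_per_second / requests_per_second
print(round(co2_per_request, 4))  # 0.0106 kg, i.e. roughly 0.01 kg per request

# Scaled to the ~3.5 billion queries per day mentioned in the article:
daily_tonnes = 3.5e9 * co2_per_request / 1000.0
print(round(daily_tonnes))  # ~37234 tonnes of CO2 per day under these assumptions
```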
When the publication shared the data with Google, the company did not dispute the calculations. A Google representative also said that providing one month of services to one user creates approximately the same amount of greenhouse gas emissions as driving a car one mile, or about 360.7 grams of CO2, the publication calculated.
Moll's research focused on Google, but other websites also contribute to carbon emissions. Facebook, for example, reported that its data centers and business operations produced 718,000 metric tons of CO2 emissions in 2016, comparable to the annual CO2 footprint of the electricity used by about 77,500 American homes.
Google leads in the use of renewable energy
In 2017 Google became the largest corporate buyer of renewable energy in the USA. The Internet giant purchased enough clean energy to cover the needs of all its data centers and divisions worldwide.
In 2017 Google bought 3 gigawatts of energy from solar and wind farms and other renewable facilities. As a result, the company led American corporations in clean-energy purchases; Amazon and Apple took second and third place, The Financial Times reports, citing data from Bloomberg New Energy Finance.
|"Investments in renewable energy sources make sense for our business," Palmer said, pointing to the falling cost of wind and solar power. "These are long-term contracts at fixed prices, and in some markets clean energy costs the same as, or even less than, conventionally generated electricity."|
According to Palmer, since she joined Google six years ago the corporation's energy needs have grown at double-digit rates every year, and that dynamic is likely to continue.
Because data centers are connected to the power grid, they generally cannot be supplied with renewable electricity directly. For this reason Google signs clean-energy supply contracts and sells the unused surplus on local markets.
Now that Google has reached its goal of a 100% switch to renewable sources, the company will concentrate on getting clean energy delivered to its data centers more directly. To that end it intends to increase investment in energy storage and to lobby for policy changes in the electricity market, Palmer added.
Google fully switches to solar and wind energy
The company has become the world's largest corporate buyer of renewable energy, reaching a total capacity of 3 GW. Google's total investment in clean energy has reached $3.5 billion, Electrek wrote in November 2017.
Google is officially moving to exclusive use of solar and wind energy. The company signed contracts with three wind farms: Avangrid in South Dakota, EDF in Iowa and GRDA in Oklahoma, with a total capacity of 535 MW. Google's facilities worldwide will now consume 3 GW of renewable energy.
The company's total investment in energy has reached $3.5 billion, two thirds of it in US facilities. This interest in "clean" sources is driven above all by the 60-80% fall in the cost of solar and wind energy in recent years.
Google signed its first such agreement, with a 114 MW wind farm in Iowa, in 2010. By November 2016 the company was already participating in 20 renewable energy projects, and in December 2016 it pledged to switch fully to solar and wind power. Google is now the world's largest corporate buyer of renewable energy.
2016: How Google built its worldwide data center network without Cisco and HP equipment
Google runs one of the largest computer networks in the world, stretching from the USA to Finland and Taiwan. With its help the company aims to deliver fast Internet services (Google search, Google Maps, YouTube, etc.) to a huge number of people in different countries. The irony is that Google built this large-scale network without companies such as Cisco, Dell, HP and IBM, which supply the equipment for many of the world's computer networks, the Wired publication wrote on January 27, 2016.
Over the past 15 years Google's network has grown so large that the company could no longer do without a cheaper and more effective way of building it. Traditional equipment is too complex and expensive, and it is also rather difficult to manage.
For this reason Google, in cooperation with various companies from Asia and elsewhere, took up the production of its own flexible products, including servers and network switches. These devices run Google's own software, which can easily be configured for specific tasks, something impossible with traditional solutions.
This was very much in Google's style, but the idea found application far beyond the company, Wired notes. Facebook, whose audience had grown to hundreds of millions of people, also built its own equipment, and then opened access to its technologies under the auspices of the non-profit Open Compute Project initiative. Other players can use and improve them, and mass production drives prices down. Apple, Microsoft, Rackspace and Goldman Sachs did the same. This new class of equipment is not just another Google creation but the only realistic way to build the largest online services.
On January 27, 2016 the Open Compute Project initiative was joined by the large telecom operators AT&T, Verizon, Deutsche Telekom and SK Telecom. Within a subproject focused on the telecom business, they will study open servers and network equipment with the aim of increasing efficiency and reducing costs. In Russia, the РДП.ру company follows a similar approach.
|"Everyone is looking for the same synergy and flexibility. The accumulation of experience and the sharing of knowledge will flow both ways," says Gagan Puranik, director of infrastructure planning at Verizon, speaking of his company and the others that have already joined Facebook's open hardware experiment.|
The telecom business is one of the largest buyers of IT equipment in the world. Operators' interest in the Open Compute Project initiative will enable the next big step away from traditional data center equipment toward the flexible, programmable equipment pioneered by Google and Facebook. This, of course, undermines the business of companies such as Cisco and HP, Wired writes.
|"Telecom companies understand that the benefit they get from working with traditional suppliers keeps shrinking. Open Compute is a natural way to fight that. It helps separate more open and flexible equipment supply ecosystems from the incumbent suppliers' way of doing complex business," says JR Rivers, who once developed equipment at Cisco and helped Google build its switches, and who now (as of early 2016; a TAdviser comment) sells network software similar to what he created at Google to large customers, including telecoms.|
AT&T has long spoken of plans to virtualize 75% of its network by 2020. In other words, the company is moving toward the Google model, in which the logical structure of the network resides not in the hardware but in the software. This approach is called software-defined networking (SDN), and it means AT&T aims to rid itself of obsolete equipment.
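The SDN idea, forwarding logic held as software data rather than baked into fixed-function hardware, can be illustrated with a toy match-action flow table. The field names and actions below are illustrative, not any vendor's API; real deployments use protocols such as OpenFlow:

```python
# Toy illustration of the SDN model: forwarding rules live in software
# as plain data, and a "switch" simply looks them up per packet.

flow_table = [
    # (match fields, action); rules are evaluated in order, first match wins
    ({"dst_prefix": "10.0.1.", "proto": "tcp"}, "forward:port2"),
    ({"dst_prefix": "10.0.", "proto": None}, "forward:port1"),  # None = any
]

def apply_flow_table(packet):
    """Return the action for a packet, or "drop" if no rule matches."""
    for match, action in flow_table:
        if not packet["dst"].startswith(match["dst_prefix"]):
            continue
        if match["proto"] is not None and packet["proto"] != match["proto"]:
            continue
        return action
    return "drop"

print(apply_flow_table({"dst": "10.0.1.7", "proto": "tcp"}))     # forward:port2
print(apply_flow_table({"dst": "10.0.9.3", "proto": "udp"}))     # forward:port1
print(apply_flow_table({"dst": "192.168.1.1", "proto": "tcp"}))  # drop
```

A controller can change the network's behavior simply by rewriting `flow_table`, which is the flexibility the article attributes to the Google and AT&T approach.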
According to media reports, AT&T has already developed some network equipment to support this transition. In June 2015 the company published specifications for a number of its designs within Open Compute Project, although at the time it was not yet a participant in the project.
As JR Rivers notes, telecom companies cannot adopt the latest technologies as quickly as players in the Internet industry do.
|"These guys move more slowly than any of the web companies. I don't think there will be dramatic changes next year," he says.|
Still, these companies are developing this area, and doing so publicly. What does Facebook get out of it? It is not a telecom. At first glance it may seem that the social network gains nothing from AT&T or Verizon mastering SDN. But there is an indirect benefit: the Facebook service runs over networks managed by these telecom companies, so what is good for them is also good for Facebook, Wired concludes.
At the same time, Facebook has long been studying ways to provide Internet access where there is none, using unmanned aerial vehicles and satellites.
|"We are focused on connecting the whole world to the Internet, but we do not intend to do it alone," notes Jason Taylor, Facebook's head of infrastructure.|
2011: Launch of a data center in Oklahoma and plans to build one in Taiwan
The Oklahoma data center cost Google $600 million and was put into operation in the fall of 2011. More than 100 employees work there, ensuring the smooth operation of a number of services: Gmail, Google Maps, Google Search, Google Plus. The company intended to spend another $700 million on expanding capacity; this sum covers not only the construction of IT infrastructure but also the development of the adjacent territory, which will house a cafe, a gym, office space and even a game room.
Google will spend another $300 million to build a data center in Changhua County, Taiwan, to support its growing expansion into Asian markets. At this facility the company intends to trial for the first time a new energy-saving technology that shifts equipment cooling to nighttime hours and reuses previously stored thermal energy. The data center is expected to create 25 jobs. As Chien Lee-feng, managing director of Google Taiwan, noted, in the long term the facility should become a hub for cloud computing and serve as a link for traffic to the Asian region.
1997: Google's first server
The case of Google's first server was built from Lego parts, as Brin and Page were short of money. The hardware was housed on the Stanford campus. At first the search engine lived at google.stanford.edu; the google.com domain was registered on September 15, 1997.
- ↑ Google to invest up to $2 billion in Polish data center, paper says
- ↑ Google Unveils $3 Bn European Data Centre Expansion Amid Biggest Renewable Energy Purchase In Corporate History
- ↑ Google to invest 600 million euros in Finnish data center
- ↑ DeepMind's neural network predicted wind turbine output a day in advance
- ↑ Safety-first AI for autonomous data center cooling and industrial control
- ↑ Nearly half of the Internet's CO2 emissions are attributed to Google
- ↑ Alphabet becomes biggest corporate renewable energy buyer in US
- ↑ Electrek: Google completely switches to solar and wind energy
- ↑ Telecoms Look Past Cisco and HP to Open Source Hardware