
IBM to build an analytics system for the world's largest telescope

Customers: Square Kilometre Array

Science and education

Contractors: IBM


Project date: 2012/04 - 2024/04
IBM is working on new data management and analytics technologies for the world's largest radio telescope, the Square Kilometre Array (SKA).

The new telescope will not begin operating until 2024, and it will generate so much data that even the computers of the future will struggle to process it, the company predicts. "This is a research project whose goal is to create a system capable of processing hundreds of exabytes of data daily," said Ton Engbersen, the project director from IBM.

The Netherlands has awarded IBM and the Netherlands Institute for Radio Astronomy (ASTRON) a five-year grant of $43.6 million to cover the cost of building such a system.

Work on the SKA is being carried out by a consortium of 20 government agencies. It will be not only the largest but also the most sensitive radio telescope in the world. It may allow scientists to answer the question of how exactly the Big Bang occurred some 13 billion years ago. The SKA will consist of 3,000 small antennas, each producing a continuous stream of data.


Once the SKA is running at full capacity, it will produce more than 1 exabyte of data per day (1 exabyte = 1 billion gigabytes). By IBM's estimates, that will exceed the entire daily traffic of the World Wide Web. The data will be read off the telescope, located in Australia or South Africa, then aggregated and sent to scientists around the world. Processing will come down to collecting the data packets from each individual antenna.
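To put those figures in perspective, here is a quick back-of-the-envelope calculation in Python; the even split of the load across the 3,000 antennas is a simplifying assumption made purely for illustration, not a figure from IBM:

```python
# Rough scale of the SKA data stream, using the figures quoted above.
EXABYTE_GB = 1_000_000_000      # 1 exabyte = 1 billion gigabytes
DAILY_OUTPUT_EB = 1             # "more than 1 exabyte of data a day"
NUM_ANTENNAS = 3_000            # small antennas, each streaming continuously

daily_gb = DAILY_OUTPUT_EB * EXABYTE_GB

# Assumption (illustrative only): the load splits evenly across antennas.
per_antenna_gb_day = daily_gb / NUM_ANTENNAS
per_antenna_gbit_s = per_antenna_gb_day * 8 / 86_400  # seconds per day

print(f"Total: {daily_gb:,.0f} GB/day")
print(f"Per antenna: {per_antenna_gb_day:,.0f} GB/day (~{per_antenna_gbit_s:.0f} Gbit/s)")
```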

According to Engbersen, the scientists intend to use the most advanced equipment available for data processing, but in designing the system they will still have to go beyond the limits of existing technology.

It has also not yet been decided whether the data will flow into a single data processing center (DPC) or be spread across several of them. Another problem to solve is supplying power to such a large amount of equipment. The data transmission algorithms will also have to be tuned to match the hardware configuration.
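Purely as an illustration of the packet collection mentioned above, a grouping step like the following would sit at the heart of such a pipeline; every name here is hypothetical, and, as the article notes, the real SKA architecture is still undecided:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class AntennaPacket:
    antenna_id: int    # which of the ~3,000 antennas sent the packet
    timestamp: float   # observation time of the sample
    payload: bytes     # raw signal data

def group_by_time(packets):
    """Collect packets from every antenna that observed the same moment,
    so their signals can later be combined. (Hypothetical sketch only;
    a real radio-astronomy correlator is vastly more complex.)"""
    groups = defaultdict(list)
    for packet in packets:
        groups[packet.timestamp].append(packet)
    return groups
```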

As a result, the high-performance system will be able to process 300 to 1,500 petabytes of data a year. That is tens of times more than the world's largest scientific data generator, the Large Hadron Collider, produces (15 petabytes annually).
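The "tens of times" comparison follows from a simple division of the annual volumes quoted above:

```python
LHC_PB_PER_YEAR = 15              # Large Hadron Collider output
SKA_PB_PER_YEAR = (300, 1_500)    # projected SKA range

for ska in SKA_PB_PER_YEAR:
    print(f"{ska} PB/year = {ska / LHC_PB_PER_YEAR:.0f}x the LHC")
# -> 20x at the low end, 100x at the high end
```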