Developers: IBM
Date of premiere: 2016/06/22
Technology: SDN (software-defined networking), data center (DPC) technologies
IBM Spectrum Computing is a software-defined infrastructure solution.
On June 22, 2016, IBM announced a line of software-defined infrastructure solutions enhanced with cognitive capabilities and released IBM Spectrum Computing. The system is designed to speed up access to results from data-driven and analytics applications.
The intelligent resource and workload management software in IBM Spectrum Computing helps organizations simplify the extraction of valuable insights from large data sets and accelerate high-performance analytics and machine learning. The technology can be applied across industries: scientists can use it to sequence genomes in the search for cancer treatments, engineers to design Formula One race cars, and bank employees to personalize financial services and attract customers.
IBM Spectrum Computing overview (2016)
The IBM Spectrum Computing platform includes a new cognitive, resource-aware scheduling policy that helps increase utilization of existing computing resources and optimize management costs. At the same time, it accelerates results in high-performance computing, big data analytics, new-generation applications, and open-source environments such as Hadoop and Apache Spark.
IBM Spectrum Computing allows organizations to consolidate data center infrastructure and distribute resources across on-premises, cloud, or hybrid environments. The platform comprises three software products.
The software helps deliver fast, predictable analytical results:
- IBM Spectrum Conductor, developed to accelerate data analysis, works with cloud applications and open-source environments, reducing time to results and distributing resources among increasingly demanding applications. It also protects and manages data across its entire lifecycle.
- IBM Spectrum Conductor with Spark integrates Apache Spark, the open-source big data analytics platform, simplifying Spark deployment and accelerating data processing results by up to 60% (see the sketch after this list).
- IBM Spectrum LSF is comprehensive workload management software that accelerates research and design. Its flexible, user-friendly interface helps organizations speed up research and design by up to 150 times while controlling costs through advanced resource allocation and efficient utilization.
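To illustrate the kind of workload Spectrum Conductor with Spark is meant to schedule, the sketch below is an ordinary Apache Spark job written with the standard PySpark API. The application name, file path, and column names are hypothetical, and nothing in the code is specific to IBM's products: Conductor manages the cluster resources underneath rather than changing the Spark programming model.

```python
# Minimal Apache Spark job of the sort a Spectrum Conductor cluster would run.
# Assumes PySpark is installed; "events.parquet" and its columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Create (or reuse) a Spark session; the resource manager behind it could be
# standalone Spark, YARN, or a platform such as Spectrum Conductor with Spark.
spark = SparkSession.builder.appName("events-summary").getOrCreate()

# Load a columnar dataset and compute a simple per-category aggregate.
events = spark.read.parquet("events.parquet")
summary = (events
           .groupBy("category")
           .agg(F.count(F.lit(1)).alias("events"),
                F.avg("latency_ms").alias("avg_latency_ms")))

summary.show()
spark.stop()
```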
IBM Spectrum Conductor was created over two years of collaboration between IBM developers and customers focused on next-generation analytics. It manages multiple applications simultaneously, allocating resources so that results are delivered quickly. Highly efficient multi-tenant scheduling enables secure and confidential sharing of data and resources.
IBM intends to open access to a key IBM Spectrum Conductor component to further expand the use of Apache Spark among scientists and data processing developers.
"Today data is generated at a rate that outpaces people's ability to absorb it and analyze it for business insights. At the heart of cognitive infrastructure is the need for high-performance analytics of both structured and unstructured data. IBM Spectrum Computing will allow organizations to adopt new technologies faster and achieve higher, more predictable performance," said Bernard Spang, vice president, IBM Software Defined Infrastructure.