
Oracle Data Integrator for Big Data

Product
Base system (platform): Oracle Data Integrator 12c
Developer: Oracle
Release date: April 14, 2015
Technologies: Big Data, Data Mining, DBMS, Application development tools, Data center technologies

Oracle Data Integrator for Big Data is a tool for integrating Hadoop and NoSQL technologies with relational databases and enterprise applications.

On April 14, 2015, Oracle announced Oracle Data Integrator for Big Data, a component of its Big Data strategy.

The product is designed to make Big Data more accessible. It complements the previously announced Oracle Big Data portfolio and advances Oracle's vision of combining Hadoop and NoSQL technologies with relational databases, with secure deployment in any model: public cloud, private cloud, or on-premises infrastructure.

Oracle Data Integrator for Big Data helps customers move quickly from data to decisions, streamline development for Hadoop, increase data transparency, and improve data governance across the organization. The solution gives companies access to diverse data types in corporate and cloud sources, helps improve the performance of processing growing data volumes and the quality of data used for business decisions, and supports regulatory compliance. Oracle Data Integrator for Big Data makes efficient use of customers' existing Hadoop cluster investments, without requiring the installation of proprietary software or a separate server.

Oracle Data Integrator for Big Data, Oracle Data Integrator 12c, Oracle GoldenGate for Big Data, and Oracle GoldenGate 12c are part of the Oracle data integration product portfolio, which provides data delivery, data federation, metadata management, data quality assurance, bulk data movement, and real-time replication.