
Why does the implementation of IT systems fail to deliver the expected effect? Look at the root

As IT landscapes evolve and the number and variety of IT systems grow, data quality problems draw ever more criticism and discontent from end users. CIOs weigh possible solutions, but the market offers only point solutions for specialized tasks. In this article, prepared especially for TAdviser, expert Alexey Eremyashev offers a comprehensive view of the problem of low-quality data and a strategy for managing data quality.


"He who has the best information achieves the greatest success."

Benjamin Disraeli, British statesman

The era of smart devices is growing into an era of smart systems. While providing personal services such as activity tracking, route planning and optimal shopping baskets, smart devices pass this information on to smart systems, and the entire information environment adapts to individual preferences.

At the St. Petersburg International Economic Forum in 2017, much was said about the role of modern information technologies. In one of the plenary sessions, Russian President Vladimir Putin clearly stated the state's priorities for creating and developing a digital economy:

"We are capable of achieving leadership in a number of areas of the new economy, above all the digital one. Russian IT companies are certainly globally competitive. Either we lead this process, or we become ever more dependent."

Business must accept this challenge and learn to work by the new rules. To learn means to understand the trends, to see new opportunities and to use them in good time.

Global challenges drive rapid development in many spheres of society. They lead to a series of discoveries, to the creation of advanced technologies, and to the arrival of new products and services on the market.

According to statistics from The Standish Group, only about 30% of IT projects in Europe are completed successfully. For small projects (under $1 million) the share is considerably higher (over 70%) than for large ones (10%).

Although real statistics are hard to collect, experts confirm that the situation in the former Soviet Union is no better. As a rule, projects run late, part of the declared functionality does not work or works in a semi-manual mode, and missing features are finished later under the guise of system development.

What makes the situation critical is not so much the failure to meet project constraints as the failure to achieve the returns on IT investment promised when the project was approved. One can argue about how hard it is to define and measure the effects of implementing new IT systems, but the absence of post-investment control leads to business objectives being replaced by the formal fulfillment of requirements. As a result, the efficiency of IT systems is significantly lower than it could be with a competent approach.

Many factors contribute to this situation, and each of them deserves fundamental research and its own root-cause analysis. This article addresses one key aspect of IT efficiency: the quality of the data being used.

Data quality management as an independent discipline

Digital data underpin all information technologies. Global digitalization produces enormous volumes of data, and it is impossible to talk about the development of IT without understanding the nature of data and the technologies for working with it.

According to IDC forecasts, by 2020 the digital universe will reach 40 zettabytes, 57 times more than the number of grains of sand on all the beaches of the planet.

Among experts one often hears Clive Humby's phrase, which has become something of a meme: "Data is the new oil of the 21st century." The expression captures one of the key problems of the modern IT industry. Like oil, data is a raw material for producing a new technological good: knowledge. Information technologies should not merely collect and store data but increase its usefulness, extracting information from it and turning that information into knowledge. In other words, current trends show the need to move from a "raw-materials economy" that generates huge volumes of barely useful data to complex analytical and intelligent systems. And while a user working on reports by hand could correct flaws in the data, such systems require high-quality primary digital data. False data leads to wrong decisions.

In established practice, the field that unites technologies and solutions for increasing the usefulness (quality) of data is called data quality management. It should be distinguished from data analysis technologies such as data mining, machine learning and predictive analytics. Data analysis aims to create qualitatively new information, giving us knowledge that is not directly contained in the source data. Data quality management ensures the usefulness of the digital data itself, the value of every machine bit.

In Russia the IT market is still maturing and is rather poorly supplied with the complementary services aimed at raising the overall effectiveness of IT systems. Data quality management is one such supporting service, aimed at increasing the return on investment.

As IT landscapes develop and the number and variety of IT systems grow, data quality problems become the subject of mounting criticism and discontent among end users. CIOs weigh possible solutions, but the market can offer only point solutions for specialized tasks.

To understand the technology and the prospects of data quality management, it helps to look at the history of information systems and the evolution of the solutions offered.

Background

Data management has been an issue since the very beginning of computing. As data volumes and heterogeneity grew, the question arose of systematizing and standardizing approaches to structuring and storing data.

Analysts at IBM were among the first to develop the data management field. In 1968 the concept of the database (DB) was proposed, which predetermined further development. In 1970 Edgar Frank Codd formulated the relational data model, which became the basis for relational databases and for a standard language of programmatic data control, SEQUEL, later renamed SQL. The relational data model and SQL remain popular in database design and implementation to this day. These scientific advances allowed IBM to create the first industrial relational database, IBM System R.

Relational database theory established the important principle of independence between the physical storage layer, the logical model, and the data access interfaces. The normalization methodology, based on the rules of "normal forms", made it possible to achieve the required consistency of data through the internal mechanisms of the database management system (DBMS).
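
To make the mechanism concrete, here is a minimal sketch of DBMS-enforced consistency using Python's built-in sqlite3 module; the tables and values are illustrative assumptions, not taken from any system discussed in this article. The developer declares the logical model, and the DBMS itself rejects data that violates it:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # enable referential integrity

    # A normalized pair of tables: a reference book of customers, and orders.
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL UNIQUE)")
    conn.execute("""CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        amount REAL NOT NULL CHECK (amount > 0))""")

    conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
    conn.execute("INSERT INTO orders VALUES (1, 1, 99.5)")

    # The DBMS itself rejects inconsistent data: this order references
    # a customer that does not exist, so the declared foreign key fails.
    try:
        conn.execute("INSERT INTO orders VALUES (2, 42, 10.0)")
    except sqlite3.IntegrityError as e:
        print("rejected by the DBMS:", e)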

It seemed that one more step and most large companies would have a single corporate database holding all of the organization's production data. Time passed, database functionality grew, new software products appeared, but the unified corporate database never materialized. Why? To answer that question we need to dive deeper into history and look more closely at the evolution of information technologies.

At the dawn of computing, in 1946, a group of scientists at the University of Pennsylvania led by John von Neumann proposed a fundamentally new architecture for using computer memory: the list of program instructions is stored in the same area as the data being processed. This meant that program logic no longer had to be implemented physically with "jumpers" and, most revolutionary of all, a program could modify its own instruction sequence based on the results of its calculations. Once the work was published, engineers immediately appreciated the genius of the idea, and the first two generations of computers were built on the von Neumann architecture.

Although program and processed data now shared one area of memory in a computer, their nature and the processes that produce them remained different. Program data, which defines the logic of the program, is created during development. Processed data (corporate, user data) is created and handled while the program runs.

Computers improved, programs grew more complex, databases appeared, yet the approach of physically storing these different kinds of data in one area of memory stayed the same. The main reason for preserving the strict separation of data was the separation of the roles of developer and user of a software product: it had to be guaranteed that user data could not break the execution of the program.

Of course, this classification has its nuances. Depending on the implementation, identical data can be interpreted differently in different software packages. Historically, as software developed further, one more data type split off from user data: "settings", which let the user influence program logic within preconfigured limits. For complex systems, whole professions emerged around configuring specialized software.

The key to solving the data quality management problem is to use fundamentally different approaches for each of the three data types. While the quality of program data lies entirely within the programmer's control, ensuring the quality of user data is a much harder task. It requires comprehensive measures, including organizational ones, on the user's side.

Market estimates

2019: Gartner Magic Quadrant

At the end of March 2019, the analyst firm Gartner released its Magic Quadrant for data quality software.

At the time the Magic Quadrant was published (March 27, 2019), Gartner's most recent estimate of data quality tool sales covered 2017. In that period the market reached $1.61 billion, up 11.6% on 2016.


According to the analysts, data quality solutions have become highly important for companies' digital transformation, especially for those adopting emerging technologies such as automation, machine learning, cloud computing and business-focused workflows.

Client demand for such software centers on four points: audit, governance, data variety and query latency. Vendors of data quality software give preference to the following areas: analytics, intelligent data processing, solution deployment and pricing.

In Gartner's terminology, "data quality" refers to the processes and technologies for identifying, understanding and correcting flaws in data in support of effective decision-making and of information flows within operational business processes. Ready-to-use tools typically include such important functions as profiling and parsing of textual information, standardization, cleansing, matching, enrichment and monitoring of data.
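
As an illustration of two of those functions, standardization and matching, here is a minimal sketch; the normalization rules and sample names are invented for the example and do not reflect any vendor's actual logic:

    import re

    def standardize(name):
        """Normalize a company name: case, punctuation, legal-form suffixes."""
        name = re.sub(r"[.,]", " ", name.strip().lower())
        name = re.sub(r"\b(llc|ltd|inc|gmbh)\b", "", name)
        return re.sub(r"\s+", " ", name).strip()

    def is_match(a, b):
        """Treat two records as one real-world entity if they standardize identically."""
        return standardize(a) == standardize(b)

    print(is_match("ACME, Ltd.", "Acme Ltd"))    # True: likely duplicates
    print(is_match("Acme Ltd", "Acme Trading"))  # False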

Gartner Magic Quadrant for data quality solutions

As of 2019, Gartner's experts named the following companies the leaders of this market:

SAP develops data quality solutions such as SAP Smart Data Quality, SAP Information Steward, SAP Data Services and SAP Data Hub. At the time the report was compiled, these products had 14 thousand customers.

Among SAP's main strengths the researchers listed fast development of new features and innovations, strong community support, and the broad applicability of its products in different scenarios, including Big Data, analytics and integration.

Criticism of SAP concerns the difficulty of integrating its products with third-party tools, high prices and a complicated licensing scheme. In addition, some customers point to the need to improve the user interface of SAP solutions to make business data visualization clearer.

Oracle's main data quality product is Oracle Enterprise Data Quality, used by 550 customers (as of March 2019). Most of them praise the vendor for the product's applicability across different areas, for data profiling, and for a strong brand. The main complaints about Oracle concern inflated prices, a low level of product support, poorly prepared documentation, and the company's insufficient attention to SaaS and cloud deployment models.

IBM is noted for its deep understanding of the market, sound strategy, timely innovation and affordable prices. At the same time, the company lags behind competitors when it comes to product updates or migration to IBM software from competing solutions. Customers also mention problems with interactive visualization of results and a low level of technical support. Even so, by March 2019 IBM had built a base of 2,500 customers using its IBM InfoSphere Information Server for Data Quality.[1]

The problem of user data quality

Consider the problem of user data in a database. The principles embodied in the concept of relational databases were enthusiastically adopted by software developers. Databases are used everywhere to store all three data types: internal system (program) information, settings, and user data.

For system data, everything works well. The layer independence principles allow the program's developer to change the data structure without diving into the details of its physical storage, while the rules and constraints of the relational data model implemented in the DBMS ensure its integrity.

One example of an exception is Microsoft's Excel. Even though it maintains neither data integrity nor a real implementation of the independence principle for user data, its ease of use brought MS Excel fantastic success.

With user data, things are different. The user works through program interfaces, without direct access to the organization of the data storage structures, while the software product itself does not follow the rules defined in the data management manifestos for databases. Most programs do not support the principle of independence between the structure of user data and the functions of the program.

The end user sees corporate data through the logic of program interfaces and preconfigured reports. Any need to change the data structure, and sometimes even the mere appearance of new values, forces the user to turn to a programmer to modify the program code or the system settings.

Violation of the layer independence principles became one of the main sources of user data quality problems in modern information systems.

The emergence of ERP systems as an attempt to overcome patchwork automation

The quality problem becomes much harder when there are many software products. The development and active use of information technologies inevitably increases the number of programs a company is forced to run. This situation acquired its own name: "patchwork automation".

Within a single software product, the data management problem can be solved by adapting user data to the program's structure and implementing specialized checks that monitor data integrity.

But when there are many programs, each with a database of its own structure, the data structure cannot be adapted to every one of them, and it is extremely difficult to implement data transfer mechanisms that would preserve data integrity across the integrated IT systems.

The consequences: repeated entry of the same information, inconsistencies when working across different information systems, high transaction costs for data cleansing.

The effects: declining customer trust in information technologies and in the soundness of IT investments.

To resolve these difficulties, market experts proposed globalizing information systems. For business process automation, in 1990 analysts at Gartner settled on the concept of a system class called Enterprise Resource Planning (ERP)1. Its essence is a unified information system on a global, scalable, universal platform (one program). The assumption was that it could automate all internal enterprise management processes and would run on a single database, ensuring end-to-end integrity of corporate data.

At the end of the 20th century, ERP was presented as the universal alternative to patchwork automation. High-tech platforms arrived from famous IT market leaders: Oracle, SAP (a company founded by former IBM employees), Microsoft. The very high maturity and scale of these solutions was the qualitative advantage of the offered products.

The culture of software development was advancing rapidly, and only very large corporations with powerful research and analytical centers could afford complex development technologies. The programmer market was only taking shape, and small IT companies could not build a high-quality production base.

Dashed hopes

Figuratively speaking, the logic of evolution knows no compromises. Global information platforms, like giant dinosaurs, turned out to be an intermediate link in the evolution of information systems. As enterprise business processes grew more complex and information technologies kept accelerating, the complexity of universal software packages continuously increased. Chasing maximum coverage of automated processes, the functionality of global systems became catastrophically huge (for example, the number of tables in SAP ERP exceeds 1.5 million), and the volume of data processed by the programs doubles every two to three years2.

Making changes to these monster systems became ever harder, more expensive and slower. As a result, in the early 2000s specialized solutions began to squeeze the IT giants, and more and more adjacent systems appeared in the IT landscape.

To keep up with the market, ERP manufacturers shifted their efforts from in-house development to acquiring segment-leading companies and integrating them into their platforms. A wave of consolidation swept the market.

As confirmation, the 2008 Gartner Magic Quadrant for BI systems shows products from Cognos and Business Objects among the leaders. A year later those products were already part of the solution lines of IBM and SAP, respectively.

But information technologies developed faster and faster, driving further complication of solutions. Whatever efforts the ERP producers made, whatever their managers thought up, they could no longer break the emerging trends:

  • Declining quality of IT system implementations caused by the heterogeneity of the customer market.
  • Difficulty of migrating to new product versions: the volume of customizations and modifications of standard functionality grows with the number of users over a system's life.
  • Long implementation cycles, so that the delivered result no longer matches constantly changing business needs.
  • Substantial lead time for bringing promising new information technologies into the product platform.
  • Growth in the number of third-party corporate systems in the IT landscape and geometric growth of the required integration flows.

As a result, the positive effects of ERP implementations are more than offset by the negative ones. Enterprises slide back into "patchwork automation", and data quality again becomes the users' main problem.

Awareness of importance

"The existing IT landscape works. But it is an antique. We need not a Florentine mosaic but LEGO, so that things are simple and convenient."

Alice Melnikova, former head of Sberbank Technologies, head of the department of financial technologies, projects and process organization of the Central Bank

The retreat from the idea of global systems became clearly visible in the second decade of the 21st century. It shows in the positioning of the major players: SAP, in particular, no longer wants to be associated with ERP systems and is shifting its marketing focus to IT innovation.

On the other hand, the management of companies that actively develop IT in support of the business began to realize that quality issues cannot be solved simply by stating requirements at the procurement stage and then waiting for results with folded arms. Solving the problems of IT asset efficiency and corporate data quality lies in the company's internal management processes. New centralized processes must be implemented to address these tasks.

This is easy to confirm from the IT strategies of leading companies: their development programs include many service projects, such as building a corporate data model and developing monitoring systems. The popularity of search queries about data quality has grown tens of times in recent years3.

Unlike other IT services4, data quality management as a service has spread only recently. For example, American institutes have dealt with IT architecture management for more than 20 years; as a result, the TOGAF, FEA and Gartner approaches were created and widely adopted worldwide. In the field of data quality, one can find only a number of specialized solutions for managing transactional reference books5. Managing master and reference data is an important task, but it is only one of many. Data accuracy, availability and consistency across different IT systems require a comprehensive approach. Serious scientific and analytical work is needed to cover every aspect of data quality management at each stage of the data lifecycle.

The problem of system integration

The issue of quality assurance in integration processes deserves separate mention. At the technology level there are plenty of standardized program interfaces for transferring the most varied data: documents, events, metadata (data about data structure) and many other types. Large integration platforms support many of them, which makes it effortless to configure connections and transfer data.

On the other hand, the absence of standards defining strict rules of data management in software products and algorithms for preparing data for transfer leads to incompatibility at the level of logical structures. In other words, you will be able to transfer the data, but the receiving system will not be able to interpret it correctly and automatically.

As a result, the required data structure and format is implemented in the integration interface separately for every integration point. Why are such developments so complex, and why do they constantly "break"? As noted above, each system has its own logical data structure with its own reference books. During integration, data must be transformed from one set of reference books into another set with a different list of values. That requires working out a conversion algorithm not just at the level of reference book fields, but for each value of the target reference book.

In turn, working out an algorithm at the level of reference book values requires knowledge of the subject area and of the specific client's business. Ordinary software developers cannot solve this task alone. As a result, a whole team of specialists formalizes the algorithms, painstakingly programs, tests and carefully verifies all the data.

Now imagine that the functionality or the values of a key reference book change in the source system. In practice it is not always possible even to predict where the error will appear. Manual data cleansing, error identification, algorithm rework, deployment and testing have to be carried out. Integration support turns into a permanent and expensive process with low trust in the results.
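
A minimal sketch of the value-level mapping described above, with hypothetical systems and currency reference books; the point is that the mapping must cover every value of the target reference book, and an unmapped new value is exactly how integrations "break":

    # Mapping worked out per value, not per field: the source system keys
    # currencies by letter code, the target system by numeric code.
    SOURCE_TO_TARGET = {"RUB": "643", "USD": "840", "EUR": "978"}

    def translate(record):
        value = record["currency"]
        if value not in SOURCE_TO_TARGET:
            # A new value appeared in the source reference book: the data can
            # still be transferred, but cannot be interpreted automatically.
            raise ValueError(f"unmapped reference value: {value!r}")
        return {**record, "currency": SOURCE_TO_TARGET[value]}

    print(translate({"id": 1, "currency": "USD"}))
    try:
        translate({"id": 2, "currency": "GBP"})
    except ValueError as e:
        print("integration broke:", e)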

Standards for data quality management would make this process more manageable and effective through a systematic, proactive approach. As in any quality system, the goal of data quality management is to resolve issues before they turn into data defects, where the price of an error becomes very high.

Competent quality management avoids the need to re-enter primary data, lowers the costs of supporting information systems and their integration links, and increases the efficiency and reliability of corporate data. And of course, standardization of data quality management should become the foundation for transforming the IT landscape from a "mosaic" into "LEGO".

How to manage data quality in practice

As of 2017, the discipline of data quality management is in a global trend and is being actively developed by specialists worldwide. Among the best-known experts are David Loshin, Philip Russom and Martin Oberhofer, whose works have already gained wide popularity.

The basic understanding of and approaches to data quality management have been worked out rather well by the scientific community. New practices are now being studied and developed to account for recent IT innovations, and new solutions are being actively explored and piloted in "pioneer" companies.

Data quality is defined as a generalized concept of data usefulness, formalized in a certain set of criteria. For corporate information management systems, the following six criteria are conventionally used: relevance, accuracy, consistency, timeliness, availability and interpretability6. For each criterion, a set of key performance indicators (KPIs) is defined, and the practices that improve them are studied.
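
As a sketch of how such KPIs might be computed, here is a toy example for three of the six criteria (accuracy, availability, timeliness); the field names, thresholds and sample records are assumptions made for illustration only:

    from datetime import date

    records = [
        {"tin": "7707083893", "updated": date(2017, 5, 1)},
        {"tin": "",           "updated": date(2016, 1, 10)},
        {"tin": "77070",      "updated": date(2017, 6, 2)},
    ]

    def kpi(checks):
        checks = list(checks)
        return round(100.0 * sum(checks) / len(checks), 1)

    today = date(2017, 7, 1)
    accuracy     = kpi(r["tin"].isdigit() and len(r["tin"]) == 10 for r in records)
    availability = kpi(bool(r["tin"]) for r in records)
    timeliness   = kpi((today - r["updated"]).days < 180 for r in records)

    print(f"accuracy {accuracy}%, availability {availability}%, timeliness {timeliness}%")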

The list of practices that form the basis of data quality management is rather long and keeps growing. Because the field is new and uniform standards are still lacking, many practices have no specialized software, and IT platforms lack the necessary functionality. Implementing these practices is largely organizational in nature and requires significant modifications to IT systems. But that is a matter of time.

The practices are expressed as sets of requirements for how the company's existing processes should operate in order to ensure particular aspects of data quality.

Here are examples of practices that together would allow building a full-fledged data quality management process:

  • requirements for IT architecture;
  • requirements for methodological support of IT processes;
  • requirements for training processes on quality assurance in IT systems;
  • functional requirements for IT systems to ensure data quality;
  • requirements for testing and for the pilot data volumes used;
  • requirements for transferring a system into operation;
  • requirements for IT system support processes;
  • requirements for change management processes;
  • requirements for the users' information space;
  • requirements for maintaining data descriptions (standardization);
  • requirements for data profiling and monitoring;
  • requirements for managing integration interaction;
  • requirements for controlling intersystem data integrity (see the sketch after this list);
  • requirements for managing responsibility and approval processes.
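
As a sketch of the intersystem integrity practice flagged in the list, the check below reconciles two systems by record count and an order-independent checksum; the datasets and the incident reaction are hypothetical:

    import hashlib

    def fingerprint(rows):
        """Record count plus an order-independent digest of a dataset."""
        digest = 0
        for row in rows:
            h = hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
            digest ^= int(h, 16)
        return len(rows), digest

    erp_rows = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
    dwh_rows = [{"id": 1, "amount": 100}, {"id": 2, "amount": 999}]  # drifted copy

    if fingerprint(erp_rows) != fingerprint(dwh_rows):
        print("intersystem integrity check failed: raise a data incident")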

Implementation

Data quality management does not have to be approached in a revolutionary way. It is recommended to implement the practices iteratively (for example, using an Agile methodology). Select the most critical data areas together with the related IT systems and processes, define target values for the quality criteria, and pick the most effective practices. An individual action plan is developed for each practice and its effectiveness is monitored. The accumulated experience is then replicated to other data areas or extended to new criteria and practices.

According to The Data Warehouse Institute (TDWI), it regularly encounters organizations that apply only one method, and sometimes to only one data set or subject area. Most data quality initiatives should expand the range of technologies, data sets and subject areas they cover.

It is important that data management in a company be organized centrally and, ideally, defined at the level of the company's IT strategy. The principles and approaches to data quality management are fixed in a dedicated document, the data quality memorandum. Two factors matter when developing the memorandum.

First, data quality management cannot be treated as an isolated, self-sufficient process. The provisions of the memorandum should be interconnected with other IT management processes, which should be adapted accordingly.

Second, the memorandum is not a statute to be written once and left on the shelf. The quality management process should run constantly and improve continuously.

Responsibility for data

Responsibility for data deserves special attention. Like budget management, data quality management is an end-to-end process within the enterprise's operational processes. By analogy with centers of financial responsibility, an institute of data owners (data stewards) should be created. Their task is to support data quality management and data security processes within their zone of responsibility. Importantly, the area of responsibility is defined only by the nature of the data and does not depend on the IT systems in which the data is used.

In turn, the chief information officer (CIO) is responsible for the quality management process as a whole. The CIO's duty is to keep the procedures running within the IT service, control the service level, and promote the effectiveness of the practices in use.

Company management should understand that data quality cannot be achieved by the IT service alone. The IT service's task is to perform a service function for the business and to answer for the tasks assigned to it. IT specialists cannot fully define on their own the data structure, its processing logic, the rules for filling reference books, or the reporting algorithms. The customers of IT systems are the company's divisions, which are interested in improving the efficiency of their own work. They should demand high-quality, convenient IT tools from the IT service and adapt their work to use those tools competently.

Prospects for the development of data quality management

TDWI specialists have formulated the main directions in which the discipline of data quality management is developing today. Understanding them makes it possible to correctly assess the prospects of the field and to set development priorities.

Development of new practices and technologies for ensuring data quality

As noted above, the information era constantly produces new trends and directions in information technology. Accordingly, they require the continuous development of new practices and the adaptation of existing ones.

Ensuring data quality in real time

According to TDWI research, ensuring data quality in real time is the second most popular, rapidly developing discipline after master data management. Philip Russom recommends giving it high priority in order to preserve data consistency between systems.
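
One possible shape of such a real-time control, sketched under the assumption of message-style integration: each incoming record is validated before it is persisted, and defects are quarantined for data stewards instead of spreading between systems:

    REQUIRED_FIELDS = ("id", "customer", "amount")  # an illustrative contract

    def on_message(record, store, quarantine):
        problems = [f for f in REQUIRED_FIELDS if not record.get(f)]
        if problems or not isinstance(record.get("amount"), (int, float)):
            quarantine.append((record, problems))  # defer to data stewards
        else:
            store.append(record)                   # clean data flows through

    store, quarantine = [], []
    on_message({"id": 1, "customer": "Acme", "amount": 10.5}, store, quarantine)
    on_message({"id": 2, "customer": "", "amount": None}, store, quarantine)
    print(len(store), "accepted,", len(quarantine), "quarantined")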

Alignment with other disciplines

Data quality management is built on top of an already established information environment. Quality management practices should be implemented in the context of already mature disciplines such as IT architecture management and project management. For example, a system's project documentation should include a section describing its automated controls, and the system maintenance regulations should include monitoring of automated user data check reports.

In-depth profiling

Data profiling is often performed superficially and separately for each reference book. Competent profiling, performed against a logical data model and analyzing the dependencies between reference books, can reveal more complex vulnerabilities in data quality. Profiling should be mandatory for data that participates in integration flows between systems.
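
A minimal sketch of what model-aware profiling adds over per-book checks: instead of profiling each reference book in isolation, it validates a dependency between two books (here, hypothetical employee and department books):

    employees   = [{"id": 1, "name": "Ivanov"}, {"id": 2, "name": "Petrova"}]
    departments = [
        {"id": 10, "manager_id": 1},
        {"id": 11, "manager_id": 7},  # refers to a manager no book contains
    ]

    # Per-book profiling would pass both books; only the cross-book check
    # based on the logical model reveals the dangling reference.
    known_ids = {e["id"] for e in employees}
    orphans = [d for d in departments if d["manager_id"] not in known_ids]
    print("departments with unknown managers:", orphans)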

Development of specialized software

Many first-generation data quality solutions were written in-house. Such solutions demonstrate the demand for automating data quality tasks, but home-grown developments, as a rule, do not meet the requirements of industrial solutions. Their practical value is low because they cannot be replicated across the IT landscapes of different companies, and their potential for functional growth is in most cases limited by the lack of a scalable architecture.

The development of data quality management in Russia

Information technologies in Russia are developing actively. Thanks to the growth of the digital economy and our main resource, talented young people, the country can make a qualitative breakthrough and compete with world leaders.

Milgrom and Roberts argued that the economic efficiency of IT is determined not only and not so much by investment in IT as by changes in the complementary services connected with IT assets. Investment in assets together with the development of areas designed to increase IT efficiency produces a multiplicative effect.

We can see the enthusiasm and the results of individual companies that have decided to tackle this task in their IT landscapes. Within IT services there are individual experts ready to develop the new direction and to work out the technology of providing and delivering the service.

This certainly helps build a base of practice and creates a foundation for further development. But the isolated actions of market participants do not allow the accumulated experience to be consolidated, processed and passed on. Successful solutions created under market conditions are guarded by companies as competitive advantages. There is no institutional basis capable of gathering this knowledge for fundamental analysis, technology development and the formation of uniform standards. Without such a basis we cannot hope for significant progress or effective solutions in this field.

Information technologies develop at enormous speed. The ambitious task of moving into the ranks of IT leaders can be accomplished only by the joint efforts of government, business, and the scientific and educational community. A set of comprehensive measures is needed to create a common ecosystem (a good example is the community built around the standards of the PMI project management institute):

  • Formation of scientific institutes that research promising IT directions and regularly update standards.
  • Development of an experimental base built on public-sector companies.
  • Dedicated educational disciplines for programmers and IT consultants, with rapid updating of the curriculum.
  • Creation of standards for IT management and a school of IT managers.
  • Development of IT certifications (devoted to IT management practice and not limited to security issues and working conditions).
  • State-level support through legislative initiatives that create an incubator environment protected from the threats of international corporations.

In the USA, state institutes and the powerful analytical centers of large companies work on this subject. It would be naive to believe that the West will share its best practices. Innovative technologies reach developing countries with a noticeable delay: that delay is what secures the technological leadership of the corporate IT giants, allowing them to calmly organize their production processes, take the leading market positions and skim off the cream.

We should compete with the West in the speed of developing our scientific base, production process technologies, quality and standards. Standards should be updated regularly, and the environment should adapt to them as quickly as possible.

In IT, with its high speed of change and short asset lifecycles, the first-mover effect is decisive. Unlike the market for goods, in the IT market the cost of copying is negligibly small, and the leader takes all.

2020: The Russian Center of Analytical Systems proposes making data quality the basis of digital transformation

On November 27, 2020 it became known that the Russian Center of Analytical Systems (TsAS), which specializes in development, implementation and consulting in data analysis and digital business transformation, presented a digital transformation concept, "Data literacy and data quality: the two main trends of digital transformation in public administration" (Alexey Mamonov, CEO of TsAS).

The real transition to a data-driven management model is badly stalled because of two main problems: officials' insufficient literacy in working with data, and the low quality of the data held in state information systems. According to TsAS and other analytical sources, in 2019 nearly five thousand data-driven transactions were performed per person, and this figure grows steadily. Yet only 24% of the people obliged to make decisions based on accurate information can work with data competently (can "read", analyze and use it as the necessary basis for their daily work).

An equally important condition for moving toward a digital economy is data quality (the degree to which data is fit for use). It is a key indicator for the possibility of creating "digital twins", copies of real-world processes or objects in information systems, which many experts consider the foundation of the digital economy. Inaccurate, incomplete and outdated source data distorts such models so badly that a large-scale transition to data-driven management becomes impossible. For example, even the most authoritative state registers still carry a considerable share of "dirty" data: as of November 2020, the open data of the Federal Tax Service contains records on 39,000 enterprises that ceased operations earlier than they were registered, and records where several dozen enterprises share one TIN; according to the open data of the Moscow government, the city has registered buildings with an area exceeding 4 million sq m as well as dwellings of only 1 sq m. Moreover, a considerable share of state information systems contain data that is hard to call anything other than "digital garbage". This situation makes it impossible to take well-reasoned decisions based on data, even though state information systems and the open data they publish are meant to be a single, reliable and maximally trustworthy source of information for the analytical systems of the state and of private companies.
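
The kinds of defects cited above are catchable with simple sanity rules. A minimal sketch, with invented sample records: a liquidation date must not precede registration, and one TIN should not be shared by many companies:

    from collections import Counter
    from datetime import date

    companies = [
        {"tin": "7707083893", "registered": date(2010, 3, 1), "closed": date(2008, 1, 1)},
        {"tin": "7707083893", "registered": date(2012, 5, 9), "closed": None},
    ]

    # Rule 1: an enterprise that ceased operations before it was registered.
    bad_dates = [c for c in companies
                 if c["closed"] is not None and c["closed"] < c["registered"]]

    # Rule 2: suspiciously many enterprises sharing one TIN.
    per_tin = Counter(c["tin"] for c in companies)
    shared = {tin: n for tin, n in per_tin.items() if n > 1}

    print("impossible dates:", len(bad_dates), "| shared TINs:", shared)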

Correcting the situation has to start with creating a complete data management strategy in organizations and spreading data literacy. When creating an information system, its customer and operator need to answer the main strategic question: what value will this system have in several years from the point of view of the data it accumulates? Otherwise there is no point in spending time and money on such a system.

Data literacy is an activity whose value is growing in organizations around the world. It covers four main staff skills: the ability to read data (to understand what particular indicators mean); command of analytical tools for working with data (for example, BI systems); understanding how to analyze the data; and, finally, the ability to justify decisions using data. In the summer of 2020 the dataliteracy.ru project, part of the international thedataliteracy.org project, was launched in Russia to help organizations raise their employees' data literacy as quickly as possible.

According to Alexey Mamonov, from the standpoint of data-driven management the ideal situation is when all the data needed for decision-making, regardless of its physical location, is organized in a single directory available for analysis according to access rights. The tool for this should be a fast BI system that enables independent data exploration (for example, building dashboards without involving IT specialists) and supports natural language queries, including voice. If a manager who knows the current agenda can also see an exact picture of events (the "digital twin"), the decision made becomes the most sound, or even obvious.

A proposal for business

In the modern world, becoming a leader requires shedding the rudimentary habit of judging results by low cost and precision of execution. Common sense, efficiency (result/cost) and the speed of adaptation to new conditions come to the forefront.

Companies thinking seriously about the efficiency of their own IT are advised not to concentrate on point solutions but to look critically at their internal IT services and bring them up to date with modern trends:

Resilience to change

The world changes, and with it the organization's processes and IT. In these realities it is useless to draw up requirements based on an ideal "to-be" picture and painfully await the arrival of happiness. The IT development strategy should be built on the effective adaptation of IT to changes in business processes. What matters is the resilience of systems and of the entire IT landscape to change, and business continuity.

Mosaic architecture on LEGO principles

An effective IT landscape is a set of highly effective, mutually integrated specialized solutions together with a number of centralized systems: an integration bus, a centralized corporate data warehouse, centralized master and reference data, an IT systems monitoring system, and a data quality control system. This requires abandoning mono-systems and isolated stationary solutions, and standardizing system requirements for data management and integration flows.

Data quality management

The development of IT architecture management presupposes a transition to business architecture management. Integrated design and management of business processes, data and the IT landscape will deliver the maximum effect. At the same time, data management should not degenerate into a set of rules and checks. What is needed is a full-fledged corporate data model and a data quality management system built on top of it. This will solve a host of problems with integration and with preserving the history of data under permanent change.

1. This article views the data quality problem through the lens of corporate information systems that automate a company's processes. Systems of other classes have their own specifics; including them would make the material excessively complex.
2. Estimates of the growth of system complexity can be found in the works of A.A. Bossom and V.M. Ilman.
3. According to the Wordstat service from Yandex.
4. A list of IT services and their application can be found in the ITIL and COBIT methodologies.
5. Reference books whose elements are created as a result of standard user operations (transactions) in a system.
6. For other data types, such as Big Data or document archives, the set of criteria and management approaches may differ somewhat.

Read Also

How data became the raw material of the 21st century

Notes