Software-Defined Storage (SDS)
The distinctive feature of SDS (software-defined storage) systems is that their numerous service functions (deduplication, replication, snapshots, backups, and so on) are managed on the basis of business-oriented policies.
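As a minimal sketch of what "business-oriented policies" might look like in practice, the hypothetical code below maps a volume's business tier to a set of enabled storage services; the class names, tiers, and service names are illustrative assumptions, not any specific SDS product's API.

```python
# Hypothetical sketch: policy-driven storage management in an SDS system.
# Business rules (per data tier) decide which service functions
# (replication, snapshots, deduplication, backup) a volume receives,
# instead of configuring each storage device individually.

from dataclasses import dataclass, field

@dataclass
class Volume:
    name: str
    tier: str                           # e.g. "critical", "standard", "archive"
    services: set = field(default_factory=set)

# Business-oriented policies: tier -> storage services to enable
POLICIES = {
    "critical": {"replication", "snapshots", "backup"},
    "standard": {"snapshots", "deduplication"},
    "archive":  {"deduplication", "backup"},
}

def apply_policy(volume: Volume) -> Volume:
    """Enable the service functions dictated by the volume's business tier."""
    volume.services = set(POLICIES.get(volume.tier, set()))
    return volume

vol = apply_policy(Volume("finance-db", tier="critical"))
print(sorted(vol.services))  # ['backup', 'replication', 'snapshots']
```

The point of the sketch is that the policy table, not the hardware, is the unit of administration: changing a volume's tier changes its services without touching any device.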
Thanks to the broad adoption of virtualization, software-based infrastructure management is decoupled from the hardware. This provides high operational flexibility and simplifies scaling. But there are also downsides: locating the right server in a cluster becomes considerably harder, and finding the physical location of a required data fragment is not easy. IDC believes that this will eventually lead to a fully autonomous software stack that takes complete responsibility for all the functional tasks of a storage system on existing hardware.
The value of the information accumulated by companies is constantly increasing. It is not just that all operating activities are based on it: thanks to Big Data analytics, it is not only management intuition but also the completeness of the information available for decision-making that becomes the key factor in choosing a company's future development path, notes Charles Foley, senior vice president of Talon, on the Data Center Knowledge portal [1].
At the same time, whoever is first to identify the right opportunities and start moving forward will hardly remain alone for long: the others will not miss their chance either, modeling their moves on the accumulated data. As a result, exponential growth in the volume of data accumulated by companies is already being observed everywhere.
However, this does not yet mean that companies have learned to use that data. Information is still stored in scattered form. A considerable part of the data does, of course, end up in centralized storage. But a large volume of it, while of undoubted interest for future analysis, is still often scattered "across the barns": on employees' computers and even on their mobile devices, Foley writes.
Fragmentation is accompanied by a phenomenon known as "shadow IT". Individual employees, and sometimes whole departments or branches of a company, "take" part of the information resources into their own hands, making them unavailable to others. This takes the form of a private set of servers, storage systems, and recovery systems operating independently of the company's general IT infrastructure. Cases of sensitive data being moved to public clouds or saved on USB drives are quite common. The ability to reliably control the safety and availability of data declines, and sometimes simply gets out of hand, Foley complains.
Foley believes that the growing problem of data fragmentation is inevitable and arises, figuratively speaking, in accordance with the second law of thermodynamics, which in general terms can be formulated as follows: "In any system, chaos inevitably grows as the system develops."
For example, natural fragmentation arises as a business expands and builds out a branch network. At each new site, its own segment of IT infrastructure is created, optimized for specific local conditions. Installation and maintenance of the network, servers, and storage systems are geared first of all toward reliable internal access. Interaction with the other sites of the enterprise's IT infrastructure is often difficult under real operating conditions: supporting it requires additional resources, whose allocation is therefore sometimes postponed indefinitely, Foley notes.
But when communications are disrupted (access is broken), part of the important corporate data ends up isolated. Some users cannot access it even when the need is great, and this can adversely affect the decisions being made. The problem can affect managers at any level, and the company's future becomes dependent on how completely the available data can be retrieved. In such cases, decision-making has to fall back on intuition. In a mature market with high risks, the price of those decisions can be very high, Foley warns.
Growing fragmentation, in his view, can also give rise to a number of other, no less vital, problems:
- Preserving data integrity. When data arrives from different places, integrity-control mechanisms must be applied rigorously. If several authoritative copies of shared data exist, problems can arise.
- Reduced data security. When local, unauthorized users or services gain access to data and its backups, there is a risk each time of losing control over the collected information, and even of its partial or complete loss.
- Inefficient use of IT resources. Servers and storage systems dedicated to customized applications are often underloaded. Yet each such resource demands special attention and has its own specific requirements. As a result, capital and operating expenses grow, and specialized sections of the company's IT infrastructure require individual support.
- Inefficient use of labor resources. Data fragmentation forces a company to expand its staff or engage external outsourcing firms in order to keep all its data storage sections in working order.
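The integrity problem described above (several "authoritative" copies of the same data drifting apart) can be sketched with a simple checksum comparison; this is an illustrative example with made-up copy names, not a description of any particular product's integrity mechanism.

```python
# Illustrative sketch: detecting divergent replicas of the same data
# by comparing checksums and treating the most common version as the
# reference copy. All names below are hypothetical.

import hashlib

def checksum(data: bytes) -> str:
    """Return a SHA-256 digest identifying a copy's content."""
    return hashlib.sha256(data).hexdigest()

def find_divergent_copies(copies: dict) -> list:
    """Return names of copies whose checksum differs from the majority."""
    sums = {name: checksum(data) for name, data in copies.items()}
    # The most frequent checksum is taken as the reference version.
    reference = max(set(sums.values()), key=list(sums.values()).count)
    return sorted(name for name, s in sums.items() if s != reference)

copies = {
    "hq-server": b"Q3 sales report v2",
    "branch-a":  b"Q3 sales report v2",
    "usb-stick": b"Q3 sales report v1",   # stale, locally edited copy
}
print(find_divergent_copies(copies))  # ['usb-stick']
```

In a fragmented infrastructure the hard part is not the comparison itself but reliably collecting the copies to compare, which is exactly what shadow IT and isolated branch segments prevent.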
See Also
- Network-attached storage (NAS)
- Storage area network (SAN)
- Direct-attached storage (DAS)
- Catalog of storage products and projects
- Data backup