Translated by
2019/09/27 17:19:33

Software 2.0: a new approach to software development that could make computers smarter

The Software 2.0 paradigm is a software development approach capable of producing a qualitative breakthrough in computing. Although the idea at its core is not new, it still remains only a hypothesis. In this material prepared for TAdviser, journalist Leonid Chernyak describes the premises and history of this approach.


Processors as a factor in the stagnation of computer progress

For all of its more than seventy years of history, computing has developed rapidly but without notable qualitative changes. It is no accident that the main engine of progress is usually said to be Moore's law, the periodic doubling of the density of transistors on a silicon substrate. This law is nothing other than an acknowledgment that, in all these years, progress has been driven mainly by improvements in semiconductor technology.

Progress has barely touched the von Neumann architecture, the foundation of modern computing. Half a century ago a few thousand vacuum-tube triodes were enough to implement it; modern processors consist of several billion transistors, yet for all their performance they remain, as they did many years ago, capable only of "blindly" executing a sequence of pre-prepared machine instructions called a program. And a computer, however sophisticated, remains a trivial executor, a close relative of the simplest programmable automata such as the barrel organ or the automated belfry, the carillon.

The few attempts to change this situation significantly, including clockless and quantum processors, have been limited ideologically to transforming the processors themselves: for all their apparent novelty, they preserve the old paradigm and leave the computer the same programmable device. Recently, however, the first signs of possible change have appeared, along with the hope that the computer can be made smarter, that is, capable of learning.

A revolutionary idea: how to make computers smarter

This hope was offered by Andrej Karpathy, the author of a surprising and promising idea, though he himself apparently did not fully realize how revolutionary it was. Karpathy limits his vision of the future to a new approach to software development, and a utilitarian "programmer's" view of things prevails in it. But if we step back from the automation of development and take a broader look, a completely different picture of what he proposes emerges.

Karpathy's idea can be interpreted as follows: if a trainable model is built into the computer, the computer itself can be made capable of learning, and thus brought out of its status as a dumb program automaton.

Andrej Karpathy

In 2017 Andrej Karpathy, previously a research scientist at Stanford University, took up the position of head of AI at Tesla, responsible for the software behind computer vision and autonomous driving. The rapid career leap of an academic scientist far from industry, and an enthusiast of speed-solving the Rubik's Cube, was preceded by the resignation of the entire previous leadership of this division following a series of automobile incidents.

The appointment of a scientist to a managerial position clearly demonstrates Tesla management's understanding of the need to take the creation of a driverless car more seriously than before.

A few months later Karpathy published a rather unexpected post entitled "Software 2.0"[1]. He later elaborated on it in a half-hour talk[2] at the Train AI 2018 conference (San Francisco, May 2018).

In the post he argued that the current programming paradigm, which after the emergence of Software 2.0 should be called Software 1.0, has remained unchanged for 70 years, despite the enormous number of new hardware and software technologies that have appeared over that time. Its essence is that a strictly deterministic solution is first described by a person in a programming language, then compiled (less often, interpreted) into machine code and executed.
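The Software 1.0 pipeline described here can be illustrated with a deliberately simple sketch; the rule and its thresholds below are invented for illustration, not taken from Karpathy's post:

```python
# Software 1.0 style: the human encodes the decision logic explicitly.
# Every rule and threshold below is written by hand, in advance.

def classify_temperature(celsius: float) -> str:
    """Deterministic, hand-written rule: the same input always gives the same output."""
    if celsius < 0:
        return "freezing"
    elif celsius < 15:
        return "cold"
    elif celsius < 25:
        return "mild"
    else:
        return "hot"

print(classify_temperature(-5))   # freezing
print(classify_temperature(20))   # mild
```

The program is nothing more than a fixed sequence of instructions: change the requirements, and a person must rewrite the rule by hand.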

The current paradigm arose together with the von Neumann architecture, which is based on storing data and programs in a single shared memory. It is a perfect fit for computational tasks, for which it was in fact created, and for other work with exact data. The need for an alternative did not arise until computing collided with the complexities of the surrounding world. In most cases, data from real life are fuzzy, so neither high measurement accuracy nor an exact solution is required. But the "sensor revolution" and the later problem of Big Data changed the circumstances considerably.

From hypothesis to practice

The Software 2.0 paradigm aimed at such goals has not yet been implemented and is unlikely to be implemented in the near future. Rather, it can be considered a hypothesis, or a request addressed to the future. Before presenting it, it should be noted that its background is not new. The American mathematician of Azerbaijani origin Lotfi Zadeh, author of the theory of fuzzy sets, was one of the first to realize, back in the mid-1960s, the need to adapt mathematical methods to the complexities of the real world.

The concept of a fuzzy set he proposed is an attempt to formalize vague data mathematically in order to build mathematical models. At its core is the idea that the elements making up a set defined by a shared property can possess that property to different degrees, and therefore belong to the set to different degrees.
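Zadeh's idea can be sketched in a few lines. The "tall person" membership function below is a standard textbook illustration, with thresholds chosen purely for the example:

```python
# A fuzzy set assigns each element a degree of membership in [0, 1],
# instead of the crisp yes/no of classical set theory.

def tall_membership(height_cm: float) -> float:
    """Degree to which a person of the given height belongs to the fuzzy set 'tall'.
    Heights up to 160 cm are not tall (0.0), from 190 cm fully tall (1.0),
    with a linear ramp in between (thresholds chosen for illustration)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

print(tall_membership(155))  # 0.0 — clearly outside the set
print(tall_membership(175))  # 0.5 — belongs "to a degree"
print(tall_membership(195))  # 1.0 — fully inside
```

A classical set would force a yes/no answer at some arbitrary cutoff; the fuzzy set captures the gradation that real-world data actually have.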

In the 1980s Zadeh's book was published in the USSR and attracted enormous interest, especially in application to fields such as geophysics and meteorology, where measurements exist but the data are generally fuzzy, and what is needed is not so much quantitative as qualitative decisions. However, for all their attractiveness, Zadeh's ideas could not be implemented by means of Software 1.0.

Implementing the Software 2.0 paradigm could become the first practical step toward working with fuzzy data. If we cannot algorithmize a task and write a program because the initial data are vague, it is worth supplying the computer with some initial knowledge, something like libraries of subroutines and functional modules combined with a mechanism for choosing what is needed from the library.

One can then enter into a dialogue with the computer: feed data to its input, observe the computer's reaction, adjust the input data, and evaluate the results again. Over the course of this iterative process, the necessary code gradually emerges inside the computer.
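This iterative dialogue can be illustrated with a toy sketch, in which the machine recovers the rule y = 2x + 1 from examples instead of being given the rule as code; the model and training details here are my own minimal illustration, not from the article:

```python
# Instead of writing the rule y = 2*x + 1 by hand, we show the machine
# input/output examples and let an iterative process find the parameters.

data = [(x, 2 * x + 1) for x in range(-5, 6)]  # examples, not code

w, b = 0.0, 0.0          # the "program" starts out knowing nothing
lr = 0.01                # learning rate

for step in range(2000):                 # iterate: predict, compare, adjust
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y            # observe the machine's reaction
        grad_w += 2 * err * x
        grad_b += 2 * err
    w -= lr * grad_w / len(data)         # adjust and try again
    b -= lr * grad_b / len(data)

print(round(w, 2), round(b, 2))  # the learned parameters approach 2 and 1
```

The "necessary code" that emerges is not program text but a set of parameters, which is exactly the form Software 2.0 artifacts take.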

As envisioned by Karpathy, the Software 2.0 development procedure should resemble pair programming, in which two people work on the code simultaneously at one workstation. One of them (the driver) writes the code, while the other (the observer) monitors the process and concentrates on strategy; periodically the two swap roles. In the Software 2.0 development process there will also be a pair, but the driver will be the computer, with a machine learning mechanism installed on it, and the person will take the role of the observer.

Karpathy proposes handing the generation of code over entirely to the machine, while assigning the person the management of the human-machine development process. The goal of Software 2.0 is to build a model that can generate code: it learns which code, according to the given rules, should be produced to obtain particular results. From the programmer's point of view, in Software 2.0 development the activities of writing and debugging program text give way to working with data and to the iterative process of training a model based on neural networks.

For such an approach one can use the well-known abbreviation AI, but decode it as Augmented Intelligence, meaning the machine's capability to serve as an intellectual assistant to a person. The most important component of such an AI approach is an iterative mode of operation with constant testing, what is known as test-driven development. The person writes the task and the criteria for assessing it, and the machine searches for a solution method and presents the result. In this process the developer is freed from routine work and can focus on the essence of the task being solved, so he should be a qualified specialist in the application domain.
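This test-driven division of labour can be sketched as follows: the human contributes only the acceptance checks, and the machine searches for something that satisfies them. The brute-force random search used here is my own simplification standing in for a real learning procedure:

```python
import random

# The human writes only the task and its acceptance criteria.
def acceptance_tests(f) -> bool:
    # We want a function behaving like f(x) = 2*x + 1.
    return f(0) == 1 and f(1) == 3 and f(4) == 9

# The machine searches for a solution instead of being handed one.
random.seed(0)
solution = None
while solution is None:
    a = random.randint(-10, 10)            # candidate slope
    b = random.randint(-10, 10)            # candidate intercept
    candidate = lambda x, a=a, b=b: a * x + b
    if acceptance_tests(candidate):        # constant testing drives the search
        solution = (a, b)

print(solution)  # a slope/intercept pair passing all the checks
```

The developer's expertise goes into formulating the criteria well; how the passing candidate is found is the machine's concern.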

Under present conditions, Software 2.0 methods will be used to create code for von Neumann machines; there is no other choice. But this approach offers the most interesting prospects for the emerging neuromorphic processors, which have elementary capabilities for actions more complex than executing a preset program.

See Also

Programming methodology

Machine learning