Mapping Natural Language Instructions to Mobile UI Action Sequences

Product
Developers: Google
Release date: July 2020
Technology: Tablet computers and smartphones; application development tools

2020: Source code released

On July 10, 2020, Google presented a new AI system that allows users to control portable devices, in particular smartphones, using natural language. The innovation, named Mapping Natural Language Instructions to Mobile UI Action Sequences, is useful first of all to visually impaired people.

Details of how the AI system works are presented in a paper at the 2020 conference of the Association for Computational Linguistics (ACL). The researchers propose a method of controlling smartphones with natural language. The command base provided for the AI enables effective interaction across several devices: it processes the user's request, predicts the sequence of actions in the application, and identifies the screens and interactive elements needed to move from one screen to the next.
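The pipeline described above can be sketched in two stages: split the instruction into action phrases, then ground each phrase to an interactive element on the current screen. This is a minimal illustrative sketch, not Google's actual model; all names (`parse_phrases`, `ground`, the element ids) are hypothetical, and word overlap stands in for the learned grounding step.

```python
# Hypothetical two-stage sketch of instruction-to-action mapping:
# stage 1 extracts (operation, object phrase) tuples from the instruction,
# stage 2 grounds each object phrase to a UI element by word overlap.

from dataclasses import dataclass


@dataclass
class Action:
    operation: str   # e.g. "open" or "click"
    target: str      # id of the matched UI element


def parse_phrases(instruction: str) -> list[tuple[str, str]]:
    """Naive stand-in for phrase extraction: split multi-step
    instructions on 'then'; first word is the operation."""
    steps = [s.strip() for s in instruction.lower().split("then") if s.strip()]
    return [(s.split()[0], " ".join(s.split()[1:])) for s in steps]


def ground(phrase: str, screen: dict[str, str]) -> str:
    """Naive stand-in for grounding: pick the element whose visible
    label shares the most words with the object phrase."""
    words = set(phrase.split())
    return max(screen, key=lambda eid: len(words & set(screen[eid].split())))


def map_instruction(instruction: str, screen: dict[str, str]) -> list[Action]:
    return [Action(op, ground(obj, screen))
            for op, obj in parse_phrases(instruction)]


if __name__ == "__main__":
    # Hypothetical screen: element id -> visible label.
    screen = {"btn_settings": "settings", "btn_wifi": "wi fi networks"}
    for a in map_instruction("open settings then click wi fi", screen):
        print(a.operation, a.target)
```

In the real system both stages are learned Transformer models and the screen representation covers many element attributes, but the overall flow (phrase extraction, then grounding against the current screen) is the same.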

Using the AI, the developers have already created three instruction datasets that can be used for multi-step tasks on smartphones. In addition, the project already has about 300 thousand one-step commands for the user interface. They are expected to work on practically all Android devices.

The source code of the AI system for controlling a smartphone with natural language has been released

In testing, the technology successfully translated the user's natural speech into actions with an accuracy of 89.21%. However, when commands were made more complex, or when artificial commands were generated, efficiency dropped considerably (to 70.59%). The company's developers are confident that performance will improve in the future.

All of the developers' work will be openly available on GitHub. Google's researchers welcome professional help from colleagues, as they believe this project can be a first step toward solving the problem of controlling devices with natural language.[1][2]

Notes

  1. Google’s AI tool lets users trigger mobile app actions with natural language instructions
  2. Mapping Natural Language Instructions to Mobile UI Action Sequences on GitHub