
Webpractik has developed a platform for the RVC artificial intelligence development competition

Customers: Russian Venture Company (RVC)

Moscow; Financial services, investments and auditing

Contractors: Webpractik
Product: IT outsourcing projects

Project date: 2020/05 - 2020/09

2020: Creating an AI Development Platform

On March 18, 2021, Webpractik announced the creation of a platform for the Up Great ABM//READ artificial intelligence competition. The customer was the Russian Venture Company (RVC).

The technical assignment was received in May 2020. The Markup service for specialists was launched as early as July, and platform testing began in September.

The platform has to compare how the software complex analyzes texts with the analysis performed by specialist teachers. To start testing on the platform, three things are therefore needed: texts, teachers, and tools for analyzing the texts.

The browser-based Markup service helps specialists highlight semantic blocks in a text and leave uniform comments describing the error, checking them against the Classifier.
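
The article does not give the platform's data format, but an annotation of the kind described above might look roughly like this (all field names and the error class are illustrative assumptions, not the platform's actual schema):

# One markup annotation: a highlighted semantic block plus a uniform comment.
annotation = {
    "essay_id": "essay-001",                 # hypothetical identifier of the checked text
    "span": {"start": 120, "end": 164},      # character offsets of the highlighted block
    "error_class": "logic/contradiction",    # example value taken from the Classifier
    "comment": "The statement contradicts the thesis in the first paragraph.",
    "annotator": "teacher-17",               # specialist who left the comment
}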

The Participant Software Complex (PKU) includes a console utility and two data packages.

Using the PKU, the participant can set up a session between a local device and the platform at any time through the API. On request, the API returns files with unmarked essays. The participant's algorithm then marks them up on the local device and transmits them back. The parser package collects the markup and converts it into a machine-readable JSON structure.
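
A minimal sketch of that session flow in Python, assuming a hypothetical REST API; the base URL, endpoint paths, and response fields are invented for illustration and are not the platform's documented interface:

import requests

BASE_URL = "https://platform.example.org/api"   # hypothetical platform address

def run_session(token: str, mark_up) -> None:
    headers = {"Authorization": f"Bearer {token}"}

    # 1. Open a session between the local device and the platform.
    session = requests.post(f"{BASE_URL}/sessions", headers=headers).json()

    # 2. Request files with unmarked essays.
    essays = requests.get(
        f"{BASE_URL}/sessions/{session['id']}/essays", headers=headers
    ).json()

    # 3. Mark the essays up locally and transmit the markup back.
    for essay in essays:
        markup = mark_up(essay["text"])          # the participant's own algorithm
        requests.post(
            f"{BASE_URL}/sessions/{session['id']}/markup",
            json={"essay_id": essay["id"], "annotations": markup},
            headers=headers,
        )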

The session data and the results of checking the marked-up files are displayed in the Participant's personal account. The user can download the report and logs and see in the visual interface how their markup variant differs from the Expert's.

When marked-up files arrive on the platform, an automatic solution verification system is triggered; it analyzes the submitted markups and compares them with the Experts' markups. This is the algorithmic qualification.
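
The article does not describe the scoring rules, so the sketch below only illustrates the general idea of comparing a participant's markup with the Experts' markup; exact matching of spans and error classes is an assumption:

def compare_markups(participant, expert):
    """Return precision/recall of matched (span, error_class) pairs."""
    p_set = {(a["span"]["start"], a["span"]["end"], a["error_class"]) for a in participant}
    e_set = {(a["span"]["start"], a["span"]["end"], a["error_class"]) for a in expert}
    matched = len(p_set & e_set)
    precision = matched / len(p_set) if p_set else 0.0
    recall = matched / len(e_set) if e_set else 0.0
    return precision, recall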

In addition, the system checks the state of the connected PKU every 4 hours and evaluates its readiness for operation. This process is the technical qualification, which simulates the final competitions.
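
A readiness check of this kind could be sketched as a simple poller, assuming the PKU exposes some health endpoint; the URL, interval handling, and success criterion are hypothetical:

import time
import requests

CHECK_INTERVAL = 4 * 60 * 60  # 4 hours, as described in the article

def poll_pku(health_url: str) -> None:
    while True:
        try:
            ready = requests.get(health_url, timeout=30).status_code == 200
        except requests.RequestException:
            ready = False
        print(f"PKU ready: {ready}")
        time.sleep(CHECK_INTERVAL)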

To reduce the load on the solution verification system, an auxiliary tool, the Validator, was developed. It automatically checks the submitted files for logical errors in the markup and forwards a bug report to the Participant.
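
The concrete validation rules are not listed in the article; the sketch below shows the kind of logical checks such a Validator could run on a submitted markup file (the rules and report format are assumptions):

def validate(annotations, text_length, known_classes):
    errors = []
    for i, a in enumerate(annotations):
        span = a["span"]
        if not 0 <= span["start"] < span["end"] <= text_length:
            errors.append(f"annotation {i}: span is outside the text")
        if a["error_class"] not in known_classes:
            errors.append(f"annotation {i}: unknown error class {a['error_class']!r}")
    return errors  # an empty list means the file passed validation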

For the final tests on November 9, 2020, a separate service was developed: the Solutions Comparison Program (AKP). It is a worker that reads tasks from server queues and writes the results to datasets stored in the fault-tolerant Yandex Object Storage cloud. Session logs are sent to the ELK stack.
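
A hedged sketch of such a worker, assuming a generic task queue and the S3-compatible interface of Yandex Object Storage; the queue object, bucket name, and object keys are illustrative, not the service's real configuration:

import json
import boto3

# Yandex Object Storage is S3-compatible, so a standard S3 client can be pointed at it.
s3 = boto3.client("s3", endpoint_url="https://storage.yandexcloud.net")

def worker_loop(queue, compare):
    while True:
        task = queue.get()   # blocks until a comparison task arrives from the server queue
        result = compare(task["participant_markup"], task["expert_markup"])
        s3.put_object(
            Bucket="akp-results",                         # hypothetical bucket
            Key=f"sessions/{task['session_id']}.json",    # hypothetical key layout
            Body=json.dumps(result).encode("utf-8"),
        )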

During the development of the platform, the following components were created:

  • Markup service,
  • Validator service,
  • Solution verification system,
  • Participant's personal account module,
  • Solution Comparison Program.

The platform ended up with two personal account systems with different architecture and functionality: one for the Editors, with a visual interface but no software package, and one for the Participants, with both.

Tests on the platform will last until 2022.