
OLMo (Open Language Model)

Product
Developer: Allen Institute for AI (Ai2)
System premiere date: November 2024
Industry: Information technology

History

2024: Product Announcement

On November 26, 2024, the Allen Institute for AI (Ai2) introduced OLMo 2, a fully open large language model (second-generation Open Language Model). Among other languages, the model supports Russian.

OLMo 2 is a family of AI models developed from start to finish on open and publicly accessible training data. Versions with 7 billion and 13 billion parameters have been released. OLMo 2 is said to outperform Llama 3.1 from Meta (recognized as an extremist organization; its activities are banned in the Russian Federation) and other open-source models. In particular, OLMo 2 7B is ahead of Llama 3.1 8B on academic English-language benchmarks, and OLMo 2 13B outperforms Qwen 2.5 7B.

An open neural network, OLMo 2 with 13 billion parameters, has been launched; it supports Russian

The second-generation model builds on the original OLMo. The Ai2 team took a two-stage approach to training: first, the model was pretrained on a large dataset of 3.9 trillion tokens; then the developers refined it on high-quality data drawn from academic materials, mathematics textbooks, and similar sources. The team paid particular attention to training stability.

A freely accessible chatbot based on OLMo 2 13B has been launched: it can communicate in Russian and generate both text and code. Its knowledge base, as noted, covers a wide range of topics, from science and history to practical advice and problem solving. The developers also highlight its ability to learn and adapt to new demands, which allows the AI to become more useful over time. The OLMo 2 family of models, along with the datasets they were trained on, can be downloaded from the Ai2 website and used for commercial purposes.[1]

Notes