Reading thoughts

2024

An 8 mm mind-reading chip that converts thoughts into text with 91% accuracy has been introduced and is already being implanted in people

In late August 2024, researchers at EPFL unveiled a next-generation miniature chip interface capable of converting thoughts into text. The new MiBMI technology, which is already being implanted, not only increases the efficiency and scalability of brain-machine interfaces but also paves the way for practical, fully implantable devices that can significantly improve the quality of life of patients with conditions such as amyotrophic lateral sclerosis and spinal cord injuries, the developers say. Read more here.

A working technology has been created that allows people to communicate with the power of thought

In mid-July 2024, a working technology was introduced that allows people to communicate with the power of thought. According to researchers at Tel Aviv University and the Tel Aviv Sourasky Medical Center, the findings offer hope for those unable to speak due to a variety of medical conditions such as ALS, brainstem stroke or brain injury.

As part of the experiment, an Israeli patient with epilepsy was able to "pronounce" two syllables using only the power of thought. At the first stage of the experiment, the researchers asked the patient, who had electrodes implanted in his brain, to say two syllables aloud: /a/ and /e/. They recorded his brain activity as he uttered those sounds. Using deep learning and machine learning, the researchers trained artificial intelligence models to identify specific brain cells whose electrical activity indicates an intention to pronounce these syllables. Once the computer had learned to recognize the patterns of electrical activity associated with the two syllables, the patient was asked to imagine saying /a/ and /e/. The computer converted the captured electrical signals and played back the pre-recorded sounds /a/ or /e/, respectively.
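
The article does not include the researchers' code; the Python sketch below only illustrates the general pipeline described above under stated assumptions: a classifier is trained on activity recorded while the patient speaks /a/ and /e/, then applied to imagined-speech trials to choose which pre-recorded sound to play back. The data, array shapes and sound file names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Firing-rate features: one row per spoken-syllable trial (hypothetical data standing in
# for the recorded electrode activity); label 0 -> /a/, label 1 -> /e/.
rng = np.random.default_rng(0)
spoken_features = rng.poisson(lam=5.0, size=(200, 64)).astype(float)
spoken_labels = rng.integers(0, 2, size=200)

# Train the classifier on activity recorded while the patient actually speaks the syllables.
clf = LogisticRegression(max_iter=1000).fit(spoken_features, spoken_labels)

def decode_imagined(trial_activity, sounds=("a.wav", "e.wav")):
    """Classify one imagined-speech trial and return the pre-recorded sound to play back."""
    syllable = int(clf.predict(trial_activity.reshape(1, -1))[0])
    return sounds[syllable]  # actual audio playback is outside this sketch

print(decode_imagined(spoken_features[0]))
```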

A working technology is presented that allows people to communicate with the power of thought

"In this experiment, for the first time in history, we were able to connect elements of oral speech with the activity of individual brain cells," said lead researcher Dr. Ariel Tankus of the Tel Aviv University Faculty of Medicine. "This allowed us to distinguish between the electrical signals that represent the sounds /a/ and /e/. At the moment, our research covers only two building blocks of speech, two syllables. Of course, our goal is to achieve complete reproduction of imagined speech, but even two distinct syllables can allow a completely paralyzed person to make it clear to others whether he means yes or no.[1]"

The first neurointerface for reading "mental speech" has been created and successfully applied

Scientists at the University of Geneva have invented brain implants capable of decoding 'internal speech': the implant can identify words that people say to themselves, without lip movements or sounds.

While the technology is at an early stage - at the moment it works with just a few words rather than phrases or sentences - it could find clinical application in the future. Previous neurointerfaces reached processing speeds of 62-78 words per minute, but they required words to be at least partially pronounced or mouthed. The University of Geneva development is the first example of recording these signals from neurons in real time.

University of Geneva scientists invent brain implants capable of decoding 'internal speech'

The technology would be particularly useful for people who no longer have any means of communication, for example those with locked-in syndrome, says study co-author Sarah Wandelt.

Researchers implanted arrays of tiny electrodes in the brains of two people with spinal cord injuries. They placed the devices in the supramarginal gyrus (SMG), an area of the brain that had not previously been used to decode speech with a brain-computer interface (BCI).

Two weeks after the microelectrode arrays were implanted, the researchers began collecting data. They trained the BCI on six words (battlefield, cowboy, python, spoon, swimming and phone) and two meaningless pseudowords (nifzig and bindip). Over three days, the participants imagined pronouncing the words shown on a screen. The interface then combined measurements of the participants' brain activity with a computer model to predict their internal speech in real time.
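
As a rough illustration of the real-time setup described above, the hedged sketch below trains an eight-class decoder (six words plus two pseudowords) offline and then applies it to a stream of feature windows. The feature dimensions, the choice of classifier and the simulated data are assumptions for illustration, not the model used in the study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# The eight trained classes: six words and two pseudowords, as listed in the article.
WORDS = ["battlefield", "cowboy", "python", "spoon", "swimming", "phone", "nifzig", "bindip"]

# Offline training on imagined-speech trials (hypothetical feature matrix: trials x electrode features).
rng = np.random.default_rng(1)
train_X = rng.normal(size=(400, 96))
train_y = rng.integers(0, len(WORDS), size=400)
decoder = LinearDiscriminantAnalysis().fit(train_X, train_y)

def decode_stream(feature_windows):
    """Yield a predicted word for each incoming window of SMG activity (simulated real-time loop)."""
    for window in feature_windows:
        yield WORDS[int(decoder.predict(window.reshape(1, -1))[0])]

for word in decode_stream(rng.normal(size=(3, 96))):
    print(word)
```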

While the study is a big step forward in decoding human thoughts, the researchers believe clinical application of the development is still a long way off. In particular, the scientists note that no one fully understands how internal speech works: it is not known whether the brain represents internal speech phonetically or semantically. For further progress, the scientists will need to increase the number of words the interface can work with.[2]

2023

A neural network has been presented that can analyze the neural activity of the cerebral cortex and reconstruct human speech from it

On October 11, 2023, American researchers from New York University announced the development of a neural network that can analyze the neural activity of the cerebral cortex and reconstruct human speech from it. The system also recreates the person's own voice, restoring the ability to speak to people who have lost the ability to communicate with others for whatever reason.

Scientists note that human speech is a very complex process. It involves not only precise control of a large number of oral, laryngeal and respiratory muscles, but also an auditory feedback loop for adjusting speech. These actions require the coordinated work of several neural networks in the brain. Understanding the mechanisms of this feedforward and feedback control is necessary to create new systems that allow, for example, paralyzed patients to maintain communication with the outside world.

A neural network has been developed that can analyze the neural activity of the cerebral cortex and reconstruct human speech from it

In the new work, the researchers successfully unraveled the complex brain processes that occur during speech production. They applied a special deep learning architecture trained on electrocorticographic recordings of the activity of the human cortex. The AI model distinguishes between causal (using current and past neural signals to decode speech), anti-causal (current and future neural signals) and non-causal (a combination of both) temporal convolutions, making it possible to reconstruct speech.
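
The distinction between the three convolution types can be made concrete with a short sketch. The code below is not the authors' architecture; it only shows, under assumed channel counts and kernel size, how causal, anti-causal and non-causal temporal convolutions differ in which time samples they are allowed to see.

```python
import torch
import torch.nn.functional as F

def temporal_conv(x, weight, mode="causal"):
    """x: (batch, channels, time); weight: (out_ch, in_ch, k). Pads so output length equals input length."""
    k = weight.shape[-1]
    if mode == "causal":          # output at time t sees samples t-k+1 .. t (past and current)
        x = F.pad(x, (k - 1, 0))
    elif mode == "anticausal":    # output at time t sees samples t .. t+k-1 (current and future)
        x = F.pad(x, (0, k - 1))
    elif mode == "noncausal":     # output at time t sees a window centred on t (both directions)
        x = F.pad(x, ((k - 1) // 2, k // 2))
    return F.conv1d(x, weight)

neural = torch.randn(1, 16, 200)   # 16 cortical channels, 200 time steps (hypothetical recording)
w = torch.randn(32, 16, 5)         # 32 output features, kernel spanning 5 samples
for mode in ("causal", "anticausal", "noncausal"):
    print(mode, temporal_conv(neural, w, mode).shape)   # all produce (1, 32, 200)
```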

Using the new neural network, the experts developed a system capable of reading brain activity and converting it directly into speech. Moreover, the technology makes it possible to largely recreate the patient's natural voice as it was before its loss, using only a small set of audio recordings. Thus, people who have lost the ability to speak can regain the gift of speech.[3]

Artificial intelligence can now create videos directly from people's brains

On May 19, 2023, researchers from the National University of Singapore and the Chinese University of Hong Kong announced the development of technology that allows the generation of video materials based on information about brain activity. The project was named MinD-Video. Read more here.

A neurointerface has been developed to control robots with the power of thought

On March 20, 2023, Australian scientists from the University of Technology Sydney announced the development of a biosensor system that makes it possible to control robots and various automated systems through the power of thought.

We are talking about an improved brain-computer interface. When creating the wearable sensors, the experts used advanced graphene material in combination with silicon, which solved the problems of corrosion, durability and skin contact resistance. Such sensors are resistant to adverse environmental conditions, so they can be used in manufacturing, aerospace and other areas.

Ghost Robotics Robot

The sensors are placed at the back of the user's head to record waves from the visual cortex of the brain. For control, an augmented reality module displays flickering white squares. By concentrating on a particular square, the operator can control robotic devices: the brain waves are captured by the biosensor, and a decoder converts the signals into control commands.
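
The article does not state which decoding algorithm the interface uses. A common approach for flicker-based interfaces of this kind is to pick the command whose flicker frequency dominates the spectrum of the visual-cortex signal; the sketch below illustrates only that idea, and the flicker frequencies, sampling rate and square-to-command mapping are assumptions.

```python
import numpy as np

FS = 256                                   # sampling rate in Hz (assumption)
FLICKER_HZ = {7.0: "forward", 9.0: "left", 11.0: "right", 13.0: "stop"}  # square -> command (hypothetical)

def decode_command(eeg_window):
    """Pick the command whose flicker frequency has the most spectral power in the occipital signal."""
    spectrum = np.abs(np.fft.rfft(eeg_window))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in FLICKER_HZ}
    return FLICKER_HZ[max(powers, key=powers.get)]

# Simulated 2-second window dominated by the 9 Hz square -> should decode "left"
t = np.arange(2 * FS) / FS
window = np.sin(2 * np.pi * 9.0 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
print(decode_command(window))
```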

Tests with the four-legged Ghost Robotics robot showed that the new brain-computer interface allows the machine to be controlled hands-free with an accuracy of 94%. The technology makes it possible to issue at least nine commands in two seconds by the power of thought. The researchers believe their solution could find applications in a wide variety of fields. For example, people with limited mobility will be able to control a wheelchair or prostheses through this interface. The system can also be in demand in the defense and healthcare industries, in particular when deploying telemedicine platforms. The technology is capable of replacing voice and gesture controls, as well as traditional keyboards, manipulators and touch displays.[4]

2022

A neuroimplant has been created that reads words directly from people's thoughts

In mid-November 2022, scientists developed a way to read words directly from the brain. Brain implants can translate internal speech into external signals, allowing people to communicate who have paralysis or other medical conditions that deprive them of the ability to speak or type.

The new findings from the two studies, presented on November 13, 2022, at the annual meeting of the Society for Neuroscience, provide further evidence of the extraordinary potential of brain implants to restore lost communication, says neuroscientist and neurocritical care physician Leigh Hochberg.

A neuroimplant has been created that reads words directly from people's thoughts

As of November 2022, some people who need help communicating can use devices that require small movements, such as shifting their eye gaze. Not everyone can perform such tasks. The new research therefore focused on internal speech, which requires only mental effort from a person.

"Our device predicts internal speech directly, allowing the patient to simply focus on saying a word in their head and convert it into text. It can be much easier and more intuitive than if a patient had to spell out or mouth words," said Sarah Wandelt, a neuroscientist at Caltech.

Neural signals associated with words are recorded by electrodes implanted in the brain. These signals can then be translated into text, which can be voiced by speech-generating computer programs.

Wandelt and her colleagues were able to predict exactly which of eight words a person paralyzed from the neck down was thinking of. This person spoke two languages, and the researchers were able to recognize both English and Spanish words. The electrodes picked up nerve cell signals in the posterior parietal cortex, a brain region involved in speech and hand movements. A brain implant installed there could eventually also be used to control devices that perform tasks typically done by the hand, Wandelt said.[5]

fMRI: a non-invasive method of reading thoughts successfully used for the first time

In mid-October 2022, scientists reported that they had developed a method that uses functional magnetic resonance imaging recordings of the brain to reconstruct continuous language. The findings are the next step in the search for better brain-computer interfaces, which are being developed as assistive technology for those unable to speak or type. It is claimed to be the first successful application of a non-invasive method of reading thoughts. Read more here.

Technology developed to control a computer with the power of thought

Scientists at Southern Federal University (SFU) have announced the development of technology for controlling a computer with the power of thought. The system is a software package that allows paralyzed people to control a computer and communicate. A special device reads brain activity and links to a bionic prosthesis or other external device. Read more here.

Chinese cobots taught to "read" the thoughts of workers on the assembly line to help with work

In early January 2022, it became known about the creation in China of an industrial robot that can read human thoughts with 96% accuracy. As well as tracking a worker's brain waves, the robot collects electrical signals from muscles as people work together to assemble a complex product. Read more here.

2021

Paralysed neuroimplant patient sends Twitter message by force of thought for the first time

On December 22, 2021, the neuroimplant company Synchron announced that Philip O'Keefe, one of the patients with a neural interface implanted in his brain, had sent messages on Twitter. O'Keefe is the first person to successfully post a message on social media by thought using a brain implant. Read more here.

Start of sales of a $50,000 smart helmet for reading thoughts

In mid-June 2021, the American startup Kernel announced the start of sales of thought-reading helmets that can analyze neuron activity. The device costs $50,000. Read more here.

2020: Computer has learned to read a person's mind almost unmistakably

In early April 2020, the University of California, San Francisco presented an artificial intelligence system that, according to the researchers, can read human thoughts. During the experiment, the system showed high accuracy, with an error rate of only 3%.

The scientists tested the algorithm on four women who had electrodes implanted in their brains to monitor epileptic seizures. The participants in the experiment read words aloud, and the scientists fed their brain activity to the AI so that it could detect the patterns associated with each particular word.

The University of California, San Francisco unveils an artificial intelligence that researchers say can read the human mind

Only phrases that were repeated by at least three different participants were selected for the neural network. As a result, the scientists had at their disposal about 50 phrases consisting of 250 unique words. While the participants read, the implanted electrodes recorded their brain activity. The recordings were fed into a program that transformed them into sets of numbers, and the neural network then tried to reconstruct the originally read text from the numbers it received.

The participants repeated each of the 30-50 sentences twice. The nerve cells' response to the first reading of a phrase was used to train the system, and the response to the second reading to test its skills.
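
A much-simplified sketch of this train/test protocol is given below: the neural response to the first reading of each sentence trains the model, and the response to the second reading tests it. The real system reconstructed text with a neural network; here a plain nearest-neighbour classifier over whole sentences stands in for it, and all data are simulated.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

N_SENTENCES, N_FEATURES = 50, 300          # roughly 50 phrases, feature size is hypothetical
rng = np.random.default_rng(3)
sentence_templates = rng.normal(size=(N_SENTENCES, N_FEATURES))

# Two noisy "readings" of every sentence: repetition 1 -> training, repetition 2 -> testing
rep1 = sentence_templates + 0.3 * rng.normal(size=sentence_templates.shape)
rep2 = sentence_templates + 0.3 * rng.normal(size=sentence_templates.shape)
sentence_ids = np.arange(N_SENTENCES)

decoder = KNeighborsClassifier(n_neighbors=1).fit(rep1, sentence_ids)
accuracy = decoder.score(rep2, sentence_ids)
print(f"decoded {accuracy:.0%} of held-out readings")   # the error rate is 1 - accuracy
```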

According to the researchers, the average vocabulary of an English speaker is 20 thousand words, so the artificial intelligence is still far from understanding everyday speech. In addition, each additional word increases the number of candidate guesses, which reduces the overall accuracy of decoding brain activity. Nevertheless, the scientists said their system had reached previously unprecedented heights in reading human thoughts.[6]

Notes