| Developers: | Massachusetts Institute of Technology (MIT), UC Berkeley |
| System premiere date: | September 2025 |
| Industries: | Electrical engineering and microelectronics |
| Technology: | Robotics |
2025: Product Announcement
In September 2025, researchers from MIT's Improbable AI Lab and the University of California, Berkeley (UC Berkeley) unveiled Dexop, a passive hand exoskeleton. The device is designed to collect data from which robots learn to grasp objects.
The exoskeleton is physically coupled to the robot hand and faithfully transmits the user's movements to the machine. Thanks to pressure sensors at the robotic fingertips, the system captures not only the movements but also the grip force. A Dexop-trained robot screwed in a light bulb in 11 seconds, versus 86 seconds when trained with traditional techniques.
Traditional robot training methods rely on digital simulations and joystick teleoperation: the devices record movements and learn to reproduce them. Such training takes a long time, and the resulting datasets contain a great deal of noise and inaccuracy.
It is also difficult for machines to determine the optimal gripping force for an object: a robot must hold it firmly enough not to drop it, yet without damaging it. Calibrating grip force remains a major technical challenge for developers of robotic systems, as the sketch below illustrates.
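The trade-off can be expressed as a simple clamp on the commanded fingertip force. The sketch below is purely illustrative; the function name and the thresholds are placeholders, not values published for Dexop.

```python
def clamp_grip_force(desired_n: float,
                     min_hold_n: float = 2.0,
                     max_safe_n: float = 8.0) -> float:
    """Keep a commanded fingertip force between a holding minimum
    and a damage limit (illustrative thresholds, not Dexop values)."""
    return max(min_hold_n, min(desired_n, max_safe_n))

print(clamp_grip_force(1.2))   # 2.0 -- too weak, raised to the hold minimum
print(clamp_grip_force(12.0))  # 8.0 -- too strong, capped at the safe limit
```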
The Dexop exoskeleton addresses these problems through a direct physical link to the robot hand. When the user bends an index finger, the robot repeats the motion exactly, and synchronization happens in real time with no noticeable delay.
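In software terms, this mirroring amounts to a tight loop that reads the exoskeleton's joint angles and commands the same angles on the robot hand. The sketch below is a minimal illustration with placeholder classes and fixed readings so it runs on its own; it is not Dexop's published interface.

```python
import time

class ExoskeletonReader:
    """Stand-in for the exoskeleton's joint encoders (hypothetical interface)."""
    def read_joint_angles(self) -> list[float]:
        # A real device would return live encoder values from the finger linkages.
        return [0.10, 0.25, 0.40, 0.15, 0.05]

class RobotHand:
    """Stand-in for the coupled robot hand (hypothetical interface)."""
    def set_joint_angles(self, angles: list[float]) -> None:
        print("commanding joints:", [round(a, 2) for a in angles])

exo, hand = ExoskeletonReader(), RobotHand()
for _ in range(3):                                   # control loop; real systems run at high rates
    hand.set_joint_angles(exo.read_joint_angles())   # mirror the operator's fingers
    time.sleep(0.01)
```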
Pressure sensors at the robotic fingertips measure the force applied to objects, so the system records not only motion trajectories but also how force is applied over time. This more complete data collection improves the quality of the training sets.
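A demonstration recorded this way pairs each motion sample with the corresponding force readings. A minimal sketch of such a record, with hypothetical field names rather than Dexop's actual schema, might look like this:

```python
from dataclasses import dataclass

@dataclass
class DemoFrame:
    """One sample of a demonstration: finger positions plus how hard they press.
    Field names are illustrative, not Dexop's published data format."""
    t: float                       # time since the start of the demo, seconds
    joint_angles: list[float]      # finger joint positions, radians
    fingertip_forces: list[float]  # pressure-sensor readings, newtons

demo = [
    DemoFrame(t=0.00, joint_angles=[0.1, 0.2, 0.3], fingertip_forces=[0.0, 0.0, 0.0]),
    DemoFrame(t=0.01, joint_angles=[0.2, 0.3, 0.4], fingertip_forces=[0.5, 0.4, 0.6]),
]
peak_force = max(max(f.fingertip_forces) for f in demo)
print(f"{len(demo)} frames recorded, peak fingertip force {peak_force:.1f} N")
```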
The resulting datasets contain fewer artifacts and inaccuracies than those produced by traditional methods. Cleaner data speeds up machine learning, and robots master complex object manipulations faster.[1]
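The benefit of cleaner data can be illustrated with imitation learning in its simplest form: fit a policy to the recorded state-action pairs, where lower noise in the demonstrations yields a lower imitation error. The sketch below uses synthetic data and a linear least-squares policy purely for illustration; it does not reproduce the training pipeline used with Dexop.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical demonstration data: sensed states (joint angles + fingertip forces)
# paired with the operator's next joint command. Real Dexop recordings would replace this.
states = rng.normal(size=(500, 6))              # 500 samples, 6 sensed values each
true_policy = rng.normal(size=(6, 3))
actions = states @ true_policy + 0.01 * rng.normal(size=(500, 3))  # low-noise demos

# Behavior cloning in its simplest form: fit a policy mapping states to actions.
weights, *_ = np.linalg.lstsq(states, actions, rcond=None)
error = np.mean((states @ weights - actions) ** 2)
print(f"mean squared imitation error: {error:.5f}")
```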
