If you’ve ever flown a drone, you’ll know that it can take a while to get used to a joystick-style controller. MIT scientists have developed what they claim is a more intuitive control system, one that reads the operator’s muscle signals.
Known as Conduct-A-Bot, the experimental setup uses multiple electromyography (EMG) sensors – which detect the electrical activity of muscles – along with motion sensors, all worn on the biceps, triceps and forearm of the user’s right arm. Working together, the sensors detect muscle activity and arm movements, relaying that data to a hard-wired microprocessor.
Machine-learning algorithms are then used to identify various arm actions, each of which has been programmed to correspond to a specific command. Those commands are in turn transmitted wirelessly to a Parrot Bebop 2 quadcopter, which responds accordingly.
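The article doesn’t specify which classifier the MIT team used, but the general idea – mapping a window of sensor readings to a gesture label – can be sketched with a simple nearest-centroid classifier. All gesture names and feature values below are hypothetical stand-ins, not details from the research:

```python
import math

# Hypothetical per-gesture feature centroids:
# (mean EMG amplitude, wrist rotation rate).
# These values are illustrative only, not from the paper.
CENTROIDS = {
    "stiffen_arm": (0.9, 0.0),
    "clench_fist": (0.7, 0.1),
    "rotate_fist": (0.4, 1.0),
    "wave":        (0.3, 0.3),
}

def classify(features):
    """Return the gesture whose centroid lies nearest to the feature vector."""
    return min(CENTROIDS, key=lambda g: math.dist(features, CENTROIDS[g]))

# High EMG amplitude with little wrist rotation is closest to "stiffen_arm".
print(classify((0.85, 0.05)))
```

A real system would of course extract many more features per sensor and train the decision boundaries from labeled examples, but the lookup-by-similarity structure is the same.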
In the current setup, stiffening the upper arm stops the drone; clenching the fist moves it forward; rotating the fist clockwise or counterclockwise causes it to spin; and waving the hand up, down, left or right moves it vertically or sideways. In a recent test in which the drone was flown through hoops, the Bebop responded correctly to 82 percent of more than 1,500 such gestures – a figure that should improve as the system is developed further.
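The gesture vocabulary described above amounts to a lookup from recognized gestures to drone commands. A minimal sketch of that dispatch might look as follows – the gesture labels and command strings are hypothetical, chosen only to mirror the behaviors the article lists:

```python
# Hypothetical mapping from recognized gestures to drone commands,
# following the behaviors described in the article.
GESTURE_COMMANDS = {
    "stiffen_arm":     "stop",
    "clench_fist":     "move_forward",
    "rotate_fist_cw":  "spin_clockwise",
    "rotate_fist_ccw": "spin_counterclockwise",
    "wave_up":         "move_up",
    "wave_down":       "move_down",
    "wave_left":       "move_left",
    "wave_right":      "move_right",
}

def command_for(gesture):
    """Translate a recognized gesture into a drone command; hover if unrecognized."""
    return GESTURE_COMMANDS.get(gesture, "hover")

print(command_for("clench_fist"))  # a recognized gesture
print(command_for("shrug"))        # an unrecognized one
```

Defaulting to a hover on unrecognized input is a sensible fail-safe for a flying robot, though the article does not say how Conduct-A-Bot actually handles ambiguous gestures.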
Any drone model could conceivably be used, and it is in fact envisioned that the technology could eventually find use in applications such as the control of assistive robots by elderly or physically disabled users.
“This system moves a step closer to letting us work seamlessly with robots so they can become more effective and intelligent tools for everyday tasks,” says graduate student Joseph DelPreto, lead author of a paper on the research. “As such collaborations continue to become more accessible and pervasive, the possibilities for synergistic benefits continue to grow.”