The Myo Drone is based on my research on Neuromuscular Adaptations to Human Computer Interfaces at the Bretl Research Group at the University of Illinois in 2015. The work was motivated by my interest in human-computer interfaces and their application to quadcopters. To conduct the research, I set out to understand neuromuscular adaptations across a range of applications, to study human-computer interfaces broadly, to investigate current issues affecting construction workers, and to design pilot-in-the-loop controllers for the Parrot AR Drone and the AscTec Hummingbird quadcopter.
Through my research, I surveyed applications of neuromuscular technologies, from adaptive robot co-workers and exercise training to robotic vehicles and the drivers of semi-autonomous cars. This body of work emphasizes the importance of increased control and stability in human-robot interaction. I then dove into the principles behind human-computer interfaces: an early focus on users and their tasks, empirical measurement of results throughout the process, and an overall iterative design. One recent device that provides a seamless experience along with neuromuscular feedback is the Myo.
The Myo armband uses proprietary EMG sensors to measure the electrical activity of the forearm muscles and detect poses made by the hand. It also includes a 9-axis IMU to track motion, orientation, and rotation.
It relies on electromyography, a medical technique that evaluates and records the electrical activity produced by skeletal muscles. The Myo uses this signal as the input to the controller.
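To make this concrete, here is a minimal sketch of the data the armband produces. The names (`Pose`, `ImuSample`, `MyoListener`) are hypothetical; the real Myo SDK delivers comparable pose and orientation callbacks, but the interface below is only illustrative.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Pose(Enum):
    """The built-in hand poses the Myo classifies from its EMG channels."""
    REST = auto()
    FIST = auto()
    FINGERS_SPREAD = auto()
    WAVE_IN = auto()
    WAVE_OUT = auto()

@dataclass
class ImuSample:
    """One reading from the armband's 9-axis IMU."""
    quaternion: tuple     # orientation as (w, x, y, z)
    gyroscope: tuple      # angular rate about x, y, z
    accelerometer: tuple  # acceleration along x, y, z

class MyoListener:
    """Hypothetical callback interface; the real SDK registers a listener
    that receives pose and orientation events like these."""

    def __init__(self):
        self.latest_pose = Pose.REST

    def on_pose(self, pose: Pose) -> None:
        # Remember the most recent classified pose for the control loop.
        self.latest_pose = pose

    def on_imu(self, sample: ImuSample) -> None:
        # Orientation data; unused by the gesture-only controller.
        pass
```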
My research focused on using this technology to control the AscTec Hummingbird, allowing operators to fly a quadcopter intuitively. In this project, I programmed the Myo to communicate with the quadcopter through a desktop-based GUI. Due to time constraints, I was not able to control the AscTec Hummingbird, but I was able to control the Parrot AR Drone.
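The desktop application talks to the Parrot AR Drone over Wi-Fi using Parrot's AT-command protocol: short ASCII commands sent over UDP, with float arguments encoded as the 32-bit integers that share their bit pattern. The sketch below is an assumed, minimal version of that link; the IP address, port, and AT*REF constants are recalled from the AR Drone developer guide and should be verified against the SDK version in use.

```python
import socket
import struct

DRONE_IP = "192.168.1.1"   # default AR Drone access-point address
AT_PORT = 5556             # UDP port for AT commands

def _f2i(value: float) -> int:
    """AT commands encode floats as the signed 32-bit int with the same bits."""
    return struct.unpack("<i", struct.pack("<f", value))[0]

class ARDroneLink:
    """Minimal AT-command sender for the Parrot AR Drone (illustrative only)."""

    def __init__(self):
        self.seq = 1
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def _send(self, command: str) -> None:
        self.sock.sendto(command.encode("ascii"), (DRONE_IP, AT_PORT))
        self.seq += 1  # sequence numbers must increase with each command

    def takeoff(self) -> None:
        self._send(f"AT*REF={self.seq},290718208\r")  # takeoff bit set

    def land(self) -> None:
        self._send(f"AT*REF={self.seq},290717696\r")

    def move(self, roll=0.0, pitch=0.0, gaz=0.0, yaw=0.0) -> None:
        """Progressive command; each argument is a normalized value in [-1, 1]."""
        self._send("AT*PCMD={},1,{},{},{},{}\r".format(
            self.seq, _f2i(roll), _f2i(pitch), _f2i(gaz), _f2i(yaw)))
```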
Using the Myo armband, I was able to control the quadcopter with hand gestures. I had control over the yaw and pitch of the quad, while the roll and altitude were held constant. Waving my hand left or right caused the quadcopter to yaw left or right, a closed fist caused it to pitch backward, and spreading my fingers caused it to pitch forward. With this kind of control, construction workers could precisely navigate tight spaces and areas where autonomous control alone is not enough. Future research will use the IMU to control the quadcopter by simply moving the arm, instead of discrete gestures.
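Building on the two sketches above, the gesture mapping reduces to a small lookup plus a fixed-rate loop. The gains and signs below are assumptions rather than the exact values used: they follow the AR Drone convention that negative pitch tilts the nose down (forward), and which wave direction maps to which yaw depends on the arm wearing the band.

```python
import time

def pose_to_setpoint(pose: Pose):
    """Map a Myo pose to a (yaw, pitch) setpoint; roll and altitude stay fixed."""
    if pose == Pose.WAVE_OUT:        # wave one way -> yaw in that direction
        return 0.5, 0.0
    if pose == Pose.WAVE_IN:         # wave the other way -> yaw back
        return -0.5, 0.0
    if pose == Pose.FIST:            # closed fist -> pitch backward
        return 0.0, 0.5
    if pose == Pose.FINGERS_SPREAD:  # open hand -> pitch forward
        return 0.0, -0.5
    return 0.0, 0.0                  # rest -> hover in place

def fly(listener: MyoListener, drone: ARDroneLink, rate_hz: float = 10.0) -> None:
    """Forward the most recent gesture to the drone at a fixed rate."""
    drone.takeoff()
    try:
        while True:
            yaw, pitch = pose_to_setpoint(listener.latest_pose)
            # Roll and vertical speed (gaz) are held at zero, as in the experiment.
            drone.move(roll=0.0, pitch=pitch, gaz=0.0, yaw=yaw)
            time.sleep(1.0 / rate_hz)
    except KeyboardInterrupt:
        drone.land()
```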
References:
Performance and Neuromuscular Adaptations
Neuromuscular Adaptations to Training
Understanding Neuromuscular Adaptations in Human-Robot Physical Interaction
A Framework for Analysing Driver Interactions with Semi-Autonomous Vehicles
A Hybrid Human-Computer Autonomous Vehicle Architecture
Neuromuscular Adaptations to Human Computer Interfaces, Jay Mulakala