During the summer of 2011, an eight-person research team composed of undergrads from Purdue, The Ohio State University, the University of Alabama, and elsewhere worked on a small quad-rotor helicopter that could autonomously detect and follow human targets from a set distance. This project was part of the Summer at the Edge internship program at the Air Force Research Lab's Tec^Edge facility. For human tracking, we used a modified Kinect, Microsoft's infrared depth camera for the Xbox console.
End-of-summer video for SATE project DragonFire.
-Canon 50mm f/1.4
-Manfrotto Triman tripod
If you’d like to learn more about the “how,” then you should check out Christopher Sneeder Design and Arts.
The Kinect captures a 640×480 image with depth data for each pixel using an infrared laser projector and sensor. We stripped one down to its bare components to reduce weight and suspended it between the custom landing skids underneath our quad-rotor. The data was processed onboard by a small PandaBoard Linux computer. The Kinect ships with skeleton-tracking software that detects human motion for Xbox games, but this was too processor-intensive for the PandaBoard. Instead, we developed a lighter algorithm that scans for human forms while disregarding unnecessary variables like arm position. This allowed the quad-rotor to detect and track the 3D coordinates of the nearest human.
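Our actual detection code isn't reproduced here, but the core "find the nearest person and get their 3D position" step can be sketched as below. The camera intrinsics (`FX`, `FY`, `CX`, `CY`), the valid depth range, and the 300 mm segmentation band are all assumed illustrative values, and the simple depth-band threshold is a crude stand-in for our human-form scan:

```python
import numpy as np

# Hypothetical Kinect depth-camera intrinsics (assumed, not calibrated).
FX, FY = 580.0, 580.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point

def nearest_target(depth_mm, min_mm=500, max_mm=4000):
    """Estimate the 3D centroid of the nearest in-range region.

    depth_mm: 480x640 array of depth readings in millimetres (0 = no data).
    Returns (X, Y, Z) in metres in the camera frame, or None.
    """
    valid = (depth_mm > min_mm) & (depth_mm < max_mm)
    if not valid.any():
        return None
    # Keep pixels within a 300 mm band of the closest valid reading --
    # a rough proxy for segmenting the nearest body from the background.
    z_near = depth_mm[valid].min()
    mask = valid & (depth_mm < z_near + 300)
    vs, us = np.nonzero(mask)           # pixel rows and columns in the blob
    z = depth_mm[mask].mean() / 1000.0  # mean depth, converted to metres
    u, v = us.mean(), vs.mean()         # pixel centroid of the blob
    # Back-project the centroid through a pinhole model to 3D coordinates.
    return ((u - CX) * z / FX, (v - CY) * z / FY, z)
```

The returned offset from the camera axis is what a tracking controller would drive toward a fixed setpoint to hold the target at a set distance.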
We also researched the problem of maintaining translational stability while not actively tracking a target. Solving it would greatly increase the autonomous capability of the vehicle by allowing it to hover in place and await commands. Most quad-rotors carry an IMU that keeps them level even in windy conditions, but these minute corrections cause the vehicle to drift. Our team experimented with methods to counteract this drift. One possibility was to take a double integral of the IMU's accelerometer data to produce position coordinates, a technique known as dead reckoning. However, small errors in the acceleration data compound into much larger position errors once integrated twice. Even after upgrading to a better IMU, our position coordinates began to drift after only a few seconds of flight, and the vehicle would crash trying to correct. We also tested visual techniques like horizon tracking, which yielded somewhat better results.
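The failure mode of dead reckoning is easy to demonstrate. The sketch below double-integrates simulated accelerometer noise for a vehicle that is actually holding perfectly still; the noise level and sample rate are assumed figures, not measurements from our IMU:

```python
import numpy as np

# A hovering vehicle has zero true acceleration, so any reported
# acceleration is pure sensor noise (0.05 m/s^2 std is an assumed value).
rng = np.random.default_rng(0)
dt = 0.01                               # 100 Hz IMU sample rate (assumed)
t = np.arange(0, 10, dt)                # ten seconds of simulated hover
accel = rng.normal(0.0, 0.05, t.size)   # noisy readings, true value is 0

vel = np.cumsum(accel) * dt             # first integration  -> velocity
pos = np.cumsum(vel) * dt               # second integration -> position

# Velocity error grows like a random walk, and integrating it again
# makes the position error grow roughly as t^(3/2) -- so even small
# accelerometer noise drifts the estimate off the hover point fast.
print(f"estimated drift after 10 s: {abs(pos[-1]):.3f} m")
```

Each pass through the integrator amplifies the accumulated error, which matches what we saw in flight: a position estimate that looked fine for a few seconds and then diverged.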
DragonFire was one of the first Summer at the Edge projects to experiment with 3D printing as a prototyping technique. Our team used a Dimension uPrint provided by the Oakwood High School engineering department, where I was a student at the time. This allowed us to quickly design and fabricate custom structural components like the Kinect mount and landing skids. After our successful tests, the program purchased two printers of its own.