Artificial intelligence is disruptive, easily as disruptive as drones (if not more so), so when you combine the two you really get something amazing. And that's exactly what happened, thanks to funding from Google and two years of hard work by a NASA JPL team: a self-piloting racing drone. Once it was built, the team took their project to the next level with their "testing methodology," too. What exactly was the methodology? Well, it wasn't so much a what as a who; specifically, Ken Loo, a Google engineer and Drone Racing League pilot.
In a California warehouse in October, quadcopter drones zoomed and buzzed, racing through an obstacle course of black-and-white checkered arches. On one team: drones guided by software and AI, the work of a team from NASA’s Jet Propulsion Laboratory. On the other: a drone steered by a human professional—Ken Loo, a Google engineer and Drone Racing League pilot.
The official results? Score one for flesh and blood. The human-piloted drone completed the course faster, on average flying the laps more than two seconds quicker than the software-powered craft.
While this might seem like something out of a sci-fi movie, if JPL's new technology goes mainstream, you can expect it to become commonplace. Implausible? Consider the computing power behind this AI feat: the JPL drone is powered by a Qualcomm Snapdragon Flight board, which in layman's terms means it has roughly the processing power of a smartphone. How far did they take it?
The team built three custom drones (dubbed Batman, Joker and Nightwing) and developed the complex algorithms the drones needed to fly at high speeds while avoiding obstacles. These algorithms were integrated with Google’s Tango technology, which JPL also worked on.
The drones were built to racing specifications and could easily go as fast as 80 mph (129 kph) in a straight line.
Considering those limited computing resources, the software is doing some impressive things. Consider how it navigates:
Like a racecar driver learning a course, the drone needs to know the best lines to take to get where it’s going quickly. “We either hand-carry the drone around the course, or we manually fly it,” Reid says, “so we can teach the drone where the race track is.”
But that’s just the beginning. From there, the team figures out the best route for the drone to take by modeling it on computers. That process allows the humans to participate and make sure that the path is actually a safe one that keeps their pricey drone in one piece. In other words, for this competition, the drone wasn’t figuring out the best way to fly all on its own—people were involved. In that sense, it wasn’t a true, independent artificial intelligence system like those that automatically power, for example, language translation on Facebook.
Once the drone is programmed with the route, it's off to the races. Reid stresses that while the route planning actually took place offboard the drone, in the future it could happen using just the drone's onboard computer.
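To make the "crunching numbers offline" idea concrete, here is a minimal sketch of offboard route-time estimation, not JPL's actual code: given waypoints recorded while hand-carrying or manually flying the drone around the course, it estimates the lap time under simple speed and acceleration caps. The waypoints and limits are made-up illustrative values.

```python
import math

# Hypothetical waypoints (meters) recorded while hand-carrying the drone
# around the course; not the real JPL track.
waypoints = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]

MAX_SPEED = 35.0  # m/s; illustrative cap, not JPL's actual limit
MAX_ACCEL = 8.0   # m/s^2; illustrative

def segment_time(p0, p1, v_max, a_max):
    """Minimum time to traverse a straight segment starting and ending at
    rest, under a simple accelerate-then-brake (bang-bang) profile."""
    d = math.dist(p0, p1)
    # Distance consumed accelerating to v_max and braking back to rest.
    d_ramp = v_max ** 2 / a_max
    if d >= d_ramp:
        return 2 * v_max / a_max + (d - d_ramp) / v_max
    # Triangular profile: the segment is too short to ever reach v_max.
    return 2 * math.sqrt(d / a_max)

lap_time = sum(segment_time(a, b, MAX_SPEED, MAX_ACCEL)
               for a, b in zip(waypoints, waypoints[1:]))
print(f"Planned lap time: {lap_time:.2f} s")
```

An offline planner would repeat this kind of evaluation over many candidate paths, which is why the optimization can take hours even though each flight takes seconds.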
In my opinion, the real magic isn't how it navigates, but rather how it improves algorithmically. The AI lets it learn, just like a real pilot, but without all the test flights.
Loo learned quickly by flying the course multiple times, Reid says. But the NASA team did things differently. “We only need to fly once, and then we can sit there for a few hours crunching numbers to get better,” he says. Interestingly, that optimization process—using algorithms to figure out the best route—took a lot of time.
“The human pilot has to learn by flying—whereas we can record it, and learn without even flying the drone,” Reid says.
Had the NASA team had more time that day to run the software and figure out the best route to take around the course, the resulting race times could have been different—the AI drone might have beaten the human.
So what's its biggest performance limitation today?
The biggest performance limitation for fast indoor flight comes from the shutter speed of the onboard cameras that are used to track the drone’s motion—flying too fast while too close to the ground, or rolling or pitching too quickly can cause the image to blur and the drone to become lost. We addressed this in two ways: First, by using two wide field-of-view cameras—by pointing one forwards and the other downwards, the >250-degree field-of-view allows the drone to always see the horizon. Second, we adjusted trajectories to cap rotation rates and speed-to-height ratio.
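The speed-to-height cap follows from simple optics: for a downward-facing camera, ground-plane image motion scales with speed divided by height, so blur during one exposure can be bounded by capping that ratio. Below is an illustrative sketch of the idea; the focal length, exposure time, and blur budget are assumed values, not JPL's actual parameters.

```python
# Illustrative speed cap from a motion-blur budget (assumed parameters).
FOCAL_PX = 300.0    # assumed camera focal length, in pixels
EXPOSURE_S = 0.005  # assumed shutter/exposure time, in seconds
MAX_BLUR_PX = 2.0   # acceptable image blur during one exposure

def max_speed(height_m):
    """For a downward camera, optical flow of the ground is roughly
    FOCAL_PX * speed / height (pixels per second). Cap speed so the
    blur accumulated during one exposure stays under MAX_BLUR_PX."""
    return MAX_BLUR_PX * height_m / (FOCAL_PX * EXPOSURE_S)

for h in (0.5, 1.0, 2.0, 5.0):
    print(f"height {h:4.1f} m -> max speed {max_speed(h):5.2f} m/s")
```

The relationship is linear: halving the altitude halves the allowable speed, which is exactly why flying fast and low is the hardest case for camera-based tracking.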
I for one am excited to watch the continued research and work being done by JPL, and by others like the University of Pennsylvania and DARPA. Drones and their components continue to become smaller, more efficient, and more capable, which is driving a growing amount of research into autonomous, and in some cases AI-driven, flight.
As exciting as this is, it's important to recognize that there will be a human element in commercial drone operation for years and years to come. Skill requirements change over time, and it's important to stay abreast of those changes. At Drone Universities, you can learn the skills you need, hands-on, from industry experts. We offer a variety of courses, including FAA Part 107 training. Learn at a drone school you can trust; learn from Drone Universities.
Want to see this drone in action? Then check out the video below:
Drone Race: Human vs. Machine
JPL engineers put together a drone race to find which is faster – a drone operated by a human or one operated by artificial intelligence. The race capped two years of research into drone autonomy funded by Google.