Researchers at Queen’s University in Ontario, Canada, have applied quadcopter technology in a novel way. If you’ve seen Big Hero 6, then you are already familiar with the idea of swarming microbots. While we aren’t there yet, the Canadian research team has created “a fleet of flying 3-D pixels” that may just be our first step in that direction.
Think of it as a sort of macro test run for future nanobot swarms. Developed at Queen’s University’s Human Media Lab, the BitDrones system is being billed as a first step toward creating “interactive self-levitating programmable matter.”
The BitDrones system actually incorporates three different kinds of 3-D pixels, or “voxels.” The cubic ShapeDrones feature a light mesh and 3D-printed geometric frame around the miniature quadcopter. PixelDrones carry an onboard LED and small dot matrix screen. DisplayDrones are fitted with a curved and lightweight touchscreen monitor.
If you’d like to see the BitDrones in action, check out this video showing what the Human Media Lab researchers have named the “Real Reality interface.”
About the Human Media Lab
The Human Media Lab (HML) at Queen’s University is one of Canada’s premier Human-Computer Interaction (HCI) laboratories. Inventions include ubiquitous eye tracking sensors; eye tracking TVs and cellphones; PaperPhone, the world’s first flexible phone; PaperTab, the world’s first flexible iPad; and TeleHuman, the world’s first pseudo-holographic teleconferencing system. HML is directed by Dr. Roel Vertegaal, Professor of HCI at Queen’s University’s School of Computing. Working with him are a number of graduate and undergraduate students with computing, design, psychology and engineering backgrounds.