An interactive swarm of flying 3D pixels (voxels) is set to revolutionize the way people interact with virtual reality. The system, called BitDrones, lets users explore virtual 3D information by interacting with physical, self-levitating building blocks: swarms of nano quadcopters acting as programmable matter that can change its 3D shape on command. Applications include real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.
“BitDrones brings flying programmable matter, such as that featured in the futuristic Disney movie Big Hero 6, closer to reality,” says Dr. Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”
Dr. Vertegaal’s team at the Human Media Lab created three types of BitDrones, each representing a self-levitating display of a distinct resolution. “PixelDrones” are equipped with one LED and a small dot-matrix display. “ShapeDrones” are augmented with a lightweight mesh and a 3D-printed geometric frame, and serve as building blocks for complex 3D models. “DisplayDrones” are fitted with a curved, flexible, high-resolution touchscreen, a forward-facing video camera and an Android smartphone board. All three BitDrone types carry reflective markers, allowing each drone to be individually tracked and positioned in real time via motion-capture technology. The system also tracks the user’s hand motion and touch, allowing users to manipulate the voxels in space.
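To make the tracking pipeline concrete, the sketch below shows one plausible way to represent the three drone types and fold motion-capture marker data into their positions. This is a minimal illustration in Python, not the lab’s actual code; the class, function and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BitDrone:
    """Hypothetical record for one tracked voxel in the swarm."""
    drone_id: int                          # matches its reflective-marker ID
    kind: str                              # "pixel", "shape", or "display"
    position: tuple = (0.0, 0.0, 0.0)      # (x, y, z) in metres, from mocap

def update_from_mocap(drones, mocap_frame):
    """Overwrite each drone's position with the latest tracked marker pose.

    mocap_frame is assumed to map marker ID -> (x, y, z); drones whose
    markers were occluded in this frame simply keep their last position.
    """
    for drone in drones:
        if drone.drone_id in mocap_frame:
            drone.position = mocap_frame[drone.drone_id]

# Example frame from a hypothetical motion-capture feed.
frame = {1: (0.20, 1.50, 0.30), 2: (0.25, 1.50, 0.30)}
swarm = [BitDrone(1, "pixel"), BitDrone(2, "shape"), BitDrone(3, "display")]
update_from_mocap(swarm, frame)
```

The same marker-to-position lookup would apply to the user’s tracked hand, which is how touch and swipe gestures can be detected relative to individual drones.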
In one scenario, users could physically explore a file folder by touching the folder’s associated PixelDrone. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it, and files in this wheel are browsed by physically swiping drones to the left or right (see the sketch below). Users could also manipulate ShapeDrones as building blocks for a real-time 3D model. Finally, the system allows remote telepresence: a remote user can appear locally through a DisplayDrone running Skype. The DisplayDrone can automatically track and replicate the remote user’s head movements, letting the remote user virtually inspect a location and making it easier for the local user to understand the remote user’s actions.
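The swipe-to-browse interaction amounts to laying the file drones out on a circle and rotating that circle in proportion to the hand’s horizontal displacement. The following sketch assumes hypothetical geometry (a 0.5 m wheel radius, a fixed gain from hand motion to rotation) and is only an illustration of the idea, not the published implementation.

```python
import math

def layout_wheel(num_files, radius=0.5, center=(0.0, 1.2, 0.0), offset=0.0):
    """Target positions for PixelDrones arranged in a horizontal wheel.

    offset rotates the whole wheel (radians); swiping changes offset so
    files scroll left or right past the user.
    """
    cx, cy, cz = center
    targets = []
    for i in range(num_files):
        angle = offset + 2.0 * math.pi * i / num_files
        targets.append((cx + radius * math.cos(angle),
                        cy,                                # constant height
                        cz + radius * math.sin(angle)))
    return targets

def on_swipe(offset, hand_dx, gain=2.0):
    """Map a horizontal hand displacement (metres) to a wheel rotation."""
    return offset + gain * hand_dx

# Browsing five files: a 10 cm swipe to the right rotates the wheel,
# and each drone is then flown to its new target position.
offset = on_swipe(0.0, hand_dx=0.10)
targets = layout_wheel(5, offset=offset)
```

Each drone would then be commanded to fly to its entry in targets, so the whole wheel appears to rotate under the user’s hand.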
While the system currently supports only dozens of comparatively large drones, measuring 2.5 to 5 inches across, the team at the Human Media Lab is working to scale it up to support thousands. These future drones would measure no more than half an inch, allowing the system to render more seamless, high-resolution programmable matter.