In the world of robotic vision, motion is the engineer’s friend. The relationship between a robot (usually a laser-based vision system) and the environment through which it moves is handled by a process known as SLAM – Simultaneous Localisation and Mapping. SLAM uses your location to help map an environment as you pass through it. For the Melbourne-based research organisation CSIRO (the Commonwealth Scientific and Industrial Research Organisation), the natural motion of a user wearing a backpack has put a project named Zebedee, after the jack-in-the-box from the film Doogal (based on The Magic Roundabout children’s show), a hop away from commercial success. The user is so integral to SLAM that he or she becomes part of the sensory process – also referred to as passive actuation – once the backpack equipment is turned on and the handheld device is lifted up.
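The feedback loop at the heart of SLAM – your position estimate helps place new observations on the map, and the map in turn corrects your position estimate – can be sketched in a toy one-dimensional example. Everything here (the function name `slam_step`, the numbers, the nearest-landmark matching rule) is invented for illustration and is not CSIRO’s algorithm:

```python
# Toy 1-D illustration of the SLAM idea: each new range reading is
# matched against the map built so far to correct the dead-reckoned
# position, and the corrected position is then used to extend the map.

def slam_step(est_pos, odometry, reading, landmarks):
    """One simplified SLAM update.

    est_pos   -- current position estimate (metres along a corridor)
    odometry  -- motion reported since the last step (drifts over time)
    reading   -- measured distance to the nearest known landmark ahead
    landmarks -- the map: landmark positions discovered so far
    """
    predicted = est_pos + odometry               # localisation: dead reckoning
    ahead = [l for l in landmarks if l > predicted]
    if ahead:                                    # mapping corrects localisation
        predicted = min(ahead) - reading
    return predicted

# Odometry says we moved 1.2 m, but the range reading to the landmark
# at 2.0 m implies we are really at 1.1 m -- the map wins:
pos = slam_step(est_pos=0.0, odometry=1.2, reading=0.9, landmarks=[2.0])

# The corrected position then places a newly seen landmark on the map:
landmarks = [2.0, pos + 3.0]   # a new landmark observed 3.0 m ahead
```

A real system such as Zebedee does this in three dimensions with thousands of laser returns per scan, but the principle of mutual correction is the same.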
Zebedee consists of a small LiDAR system, an Inertial Measurement Unit (IMU), a spring, a camera with a trigger for taking shots, and a USB cable relaying information to a netbook in the backpack. By incorporating the IMU, the system constantly monitors all six degrees of freedom of movement (heave, sway, surge, roll, pitch and yaw) along with the laser’s position in space as it captures information. This process differs from more traditional forms of mapping because the system exploits movement rather than trying to correct for it. In more everyday settings, IMUs are common features in aeroplanes, used to monitor and inform pilots of movement while in flight.
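Those six degrees of freedom amount to a pose with three translations and three rotations, advanced continuously from the IMU’s readings. The sketch below is a minimal illustration of that bookkeeping; the class name, the plain Euler integration, and the sample rates are all assumptions for clarity, not CSIRO’s actual filtering:

```python
from dataclasses import dataclass

# Illustrative 6-degree-of-freedom pose of the kind Zebedee's IMU tracks:
# three translations (surge, sway, heave) and three rotations
# (roll, pitch, yaw).

@dataclass
class Pose6DoF:
    x: float = 0.0      # surge: forward/backward, metres
    y: float = 0.0      # sway: side to side, metres
    z: float = 0.0      # heave: up/down, metres
    roll: float = 0.0   # rotation about x, radians
    pitch: float = 0.0  # rotation about y, radians
    yaw: float = 0.0    # rotation about z, radians

def integrate(pose, vel, rates, dt):
    """Advance the pose by one IMU sample of length dt seconds.

    vel   -- (vx, vy, vz) linear velocity in metres/second
    rates -- (roll_rate, pitch_rate, yaw_rate) in radians/second
    """
    return Pose6DoF(
        pose.x + vel[0] * dt,
        pose.y + vel[1] * dt,
        pose.z + vel[2] * dt,
        pose.roll + rates[0] * dt,
        pose.pitch + rates[1] * dt,
        pose.yaw + rates[2] * dt,
    )

# Walking forward at 1.4 m/s while the spring nods the scanner head
# at 0.5 rad/s, sampled for one second at 100 Hz:
p = Pose6DoF()
for _ in range(100):
    p = integrate(p, (1.4, 0.0, 0.0), (0.0, 0.5, 0.0), 0.01)
```

The point is that the spring’s nodding shows up as a changing pitch rather than as noise to be filtered out, which is exactly why Zebedee can use the user’s motion instead of fighting it.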
Zebedee was developed with perception in mind. In this context, perception is framed within the world of robotic engineering and the desire to understand the processes involved in navigating, mapping and interpreting a built environment, much as the human brain adapts sensory information to establish position every time a person moves through a scene. For the user, this influence from the world of robotics also signals a move to more kinaesthetic forms of information capture, made possible by present-day central processing unit capabilities.
Elliot Duff, Principal Research Scientist at CSIRO, highlights the immediate application advantages of this handheld mapping marvel: ‘Because you’re mobile you can go through quite confined spaces – up and down stairs’. Drawing humour from the feature that led to the prototype’s namesake, he continues, describing how not every user is ready for such an up-close-and-personal motion capture device: ‘Design engineers had real problems with the spring. We called it Zebedee and they called it Mr Floppy’.
Whimsical name squabbles aside, the device is already user-friendly in its current form, capable of 2-cm resolution (depending on environmental features) and faster-than-real-time processing on most current devices. These features are still being refined, and according to CSIRO there is much more spring in Zebedee’s future steps. This includes a wrist-worn real-time user interface, greater incorporation of camera information into the 3D workflow, and live internet streaming where possible.