Handheld Augmented Reality

Augmented Reality Anywhere and Anytime



 

Handling Pure Camera Rotation in Keyframe-Based SLAM

Teaser

Rotation-only camera movements are handled by tracking and mapping local panorama maps registered within a global 3D map. The information contained in the panorama maps is also used for 3D reconstruction.

Summary


Handling degenerate rotation-only camera motion is a challenge for keyframe-based simultaneous localization and mapping (SLAM) with six degrees of freedom (6DOF). Existing systems usually filter out the corresponding keyframe candidates, resulting in mapping starvation and tracking failure. We propose to employ these otherwise discarded keyframes to build up local panorama maps that are registered in the 3D map. Thus, the system is able to maintain tracking during rotation-only camera motion. Additionally, we actively associate panoramic and 3D map data for improved 3D mapping through the triangulation of additional 3D map features.
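The core idea can be illustrated with a minimal sketch (hypothetical type and function names; not the actual implementation): a keyframe candidate that a conventional 6DOF mapper would discard for lack of parallax instead extends, or starts, a local panorama map that is anchored in the 3D map at the current camera pose.

    #include <vector>

    // Hypothetical minimal types; the real system's data structures differ.
    struct Pose6DOF { double R[9]; double t[3]; };          // rotation + translation
    struct Frame    { /* image pyramid, features, ... */ };

    struct PanoramaMap {                                    // local 3DOF panorama map
        Pose6DOF reference;                                 // 6DOF anchor pose in the 3D map
        std::vector<Frame> keyframes;                       // rotation-only keyframes
    };

    struct Map3D {
        std::vector<Frame>       keyframes6dof;             // regular 6DOF keyframes
        std::vector<PanoramaMap> panoramaMaps;              // registered local panorama maps
    };

    // Instead of dropping a low-parallax candidate, grow or create a panorama map.
    void processKeyframeCandidate(Map3D& map, const Frame& candidate,
                                  const Pose6DOF& currentPose,
                                  bool sufficientParallax, bool panoramaActive)
    {
        if (sufficientParallax) {
            map.keyframes6dof.push_back(candidate);         // normal 6DOF mapping
        } else if (panoramaActive) {
            map.panoramaMaps.back().keyframes.push_back(candidate);
        } else {
            PanoramaMap pano;
            pano.reference = currentPose;                   // anchor the new panorama map
            pano.keyframes.push_back(candidate);
            map.panoramaMaps.push_back(pano);
        }
    }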

 


Figure: After initialization, the system starts to operate in full 6DOF mapping mode (1). If pure rotational camera motion is detected, the system switches to 3DOF mapping mode and creates a new panorama map (2). 6DOF measurements move the system back to full 6DOF operation (3b). In case of tracking failure, relocalization always recovers a full 6DOF pose (3a).
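Read as code, the mode switching in the figure could look roughly like the following sketch (hypothetical names; the transition conditions are those labelled (1), (2), (3a) and (3b) above):

    // Sketch of the mode switching shown in the figure (hypothetical API).
    enum class MappingMode { Full6DOF, Panorama3DOF, Relocalizing };

    MappingMode nextMode(MappingMode mode, bool trackingLost,
                         bool pureRotationDetected, bool have6DOFMeasurements)
    {
        if (trackingLost)                    // (3a) relocalization recovers a full 6DOF pose
            return MappingMode::Relocalizing;

        switch (mode) {
        case MappingMode::Full6DOF:          // (1) default mode after initialization
            return pureRotationDetected ? MappingMode::Panorama3DOF  // (2) new panorama map
                                        : MappingMode::Full6DOF;
        case MappingMode::Panorama3DOF:
            return have6DOFMeasurements ? MappingMode::Full6DOF      // (3b) back to full 6DOF
                                        : MappingMode::Panorama3DOF;
        case MappingMode::Relocalizing:      // successful relocalization yields a 6DOF pose
            return MappingMode::Full6DOF;
        }
        return mode;
    }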

 


Figure: The same three non-consecutive frames as processed by (a) 6DOF SLAM and (b) hybrid SLAM (our approach). 6DOF SLAM discards low-parallax candidate keyframes, resulting in tracking failure due to a lack of new finite map features (rendered in red). Hybrid SLAM detects the pure-rotation camera motion, creates a local panorama map, and continues camera tracking from infinite map features (rendered in cyan).
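A common way to detect such low-parallax situations (not necessarily the exact criterion used in our system) is to compare the viewing rays of matched features in the world frame: under pure rotation the rays are nearly parallel, so the median parallax angle stays close to zero. A sketch using Eigen:

    #include <Eigen/Dense>
    #include <algorithm>
    #include <cmath>
    #include <vector>

    // One feature match between two frames: unit bearing vectors in camera coordinates.
    struct BearingMatch { Eigen::Vector3d ray1, ray2; };

    // Parallax angle (radians) of a match, given camera-to-world rotations R1 and R2.
    // Under pure rotation the world-frame viewing directions coincide and the angle is ~0.
    double parallaxAngle(const BearingMatch& m,
                         const Eigen::Matrix3d& R1, const Eigen::Matrix3d& R2)
    {
        const Eigen::Vector3d d1 = (R1 * m.ray1).normalized();
        const Eigen::Vector3d d2 = (R2 * m.ray2).normalized();
        const double c = std::max(-1.0, std::min(1.0, d1.dot(d2)));
        return std::acos(c);
    }

    // Rotation-only test: median parallax below a small threshold (value is illustrative).
    bool isLowParallax(const std::vector<BearingMatch>& matches,
                       const Eigen::Matrix3d& R1, const Eigen::Matrix3d& R2,
                       double thresholdRad = 0.026 /* ~1.5 degrees */)
    {
        if (matches.empty()) return true;
        std::vector<double> angles;
        angles.reserve(matches.size());
        for (const auto& m : matches)
            angles.push_back(parallaxAngle(m, R1, R2));
        std::nth_element(angles.begin(), angles.begin() + angles.size() / 2, angles.end());
        return angles[angles.size() / 2] < thresholdRad;
    }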

 


Figure: Hybrid SLAM handling a pure-rotation camera movement in real time on a mobile phone.

 


Figure: Panoramas generated from keyframes of two local panorama maps.

 


Figure: Relationships between keyframes and features in our hybrid map representation, depicted in two stages. In stage (a), 6DOF keyframes observe finite map features. Local panorama maps are registered in the 3D map via reference panorama keyframes (green) that have both finite and infinite feature observations, while the remaining dependent panorama keyframes (dark blue) observe infinite features only. In stage (b), infinite features are triangulated from corresponding observations matched between additional 6DOF keyframes and/or localized panorama keyframes from different local panorama maps. Note that the additional features enable the localization of further dependent panorama keyframes.
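To make stage (b) concrete, the following is a small sketch of upgrading an infinite, bearing-only feature to a finite 3D point once a corresponding observation exists in a second localized keyframe. It uses simple midpoint triangulation, which is an illustrative choice rather than the method of the paper:

    #include <Eigen/Dense>

    // Midpoint triangulation of a point seen from two localized keyframes.
    // c1, c2: camera centers in world coordinates; d1, d2: unit viewing rays in world
    // coordinates (e.g. an infinite panorama feature and its matched observation).
    // Returns false if the rays are near-parallel, i.e. there is too little parallax.
    bool triangulateMidpoint(const Eigen::Vector3d& c1, const Eigen::Vector3d& d1,
                             const Eigen::Vector3d& c2, const Eigen::Vector3d& d2,
                             Eigen::Vector3d& X)
    {
        const Eigen::Vector3d r = c1 - c2;
        const double b = d1.dot(d2);
        const double denom = 1.0 - b * b;            // ~0 when the rays are parallel
        if (denom < 1e-6)
            return false;
        const double p = d1.dot(r);
        const double q = d2.dot(r);
        const double t = (q - p * b) / denom;        // depth along ray 2
        const double s = t * b - p;                  // depth along ray 1
        if (s <= 0.0 || t <= 0.0)                    // point must lie in front of both cameras
            return false;
        X = 0.5 * ((c1 + s * d1) + (c2 + t * d2));   // midpoint of the closest segment
        return true;
    }

Once such a feature has been upgraded to a finite 3D point, it can provide additional 2D-3D correspondences, which is what allows further dependent panorama keyframes to be localized, as noted above.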

 

Video



Publication


Christian Pirchheim, Dieter Schmalstieg, Gerhard Reitmayr
Handling Pure Camera Rotation in Keyframe-Based SLAM
To appear in: IEEE International Symposium on Mixed and Augmented Reality, 2013.

Abstract | PDF

 

Copyright © 2014 Graz University of Technology