Handheld Augmented Reality

Augmented Reality Anywhere and Anytime


The Handheld Augmented Reality Project is supported by the following institutions: Qualcomm, Christian Doppler Forschungsgesellschaft, and Graz University of Technology.


 


 

Publications

 

This page lists publications of team members from the Handheld Augmented Reality project.

 

2013


 

Exploring Distant Objects with Augmented Reality

Authors: Markus Tatzgern, Raphael Grasset, Eduardo Veas, Denis Kalkofen, Hartmut Seichter, Dieter Schmalstieg

Details: To appear in Joint Virtual Reality Conference, 2013.

Augmented reality (AR) enables users to retrieve additional information about real-world objects and locations. Exploring such location-based information in AR requires physical movement to different viewpoints, which may be tiring and even infeasible when viewpoints are out of reach. In this paper, we present object-centric exploration techniques for handheld AR that allow users to access information freely, using a virtual copy metaphor to explore large real-world objects. We evaluated our interfaces in controlled conditions and collected first experiences in a real-world pilot study. Based on our findings, we put forward design recommendations that should be considered by future generations of location-based AR browsers, 3D tourist guides, and situated urban planning applications.

 

Handling Pure Camera Rotation in Keyframe-Based SLAM

Authors: Christian Pirchheim, Dieter Schmalstieg, Gerhard Reitmayr

Details: IEEE International Symposium on Mixed and Augmented Reality, 2013.

Handling degenerate rotation-only camera motion is a challenge for keyframe-based simultaneous localization and mapping with six degrees of freedom. Existing systems usually filter out corresponding keyframe candidates, resulting in mapping starvation and tracking failure. We propose to employ these otherwise discarded keyframes to build up local panorama maps registered in the 3D map. Thus, the system is able to maintain tracking during rotational camera motions. Additionally, we seek to actively associate panoramic and 3D map data for improved 3D mapping through the triangulation of more new 3D map features. We demonstrate the efficacy of our approach in several evaluations that show how the combined system handles rotation-only camera motion while creating larger and denser maps compared to a standard SLAM system.

 

Dynamic Compact Visualizations for Augmented Reality

Authors: Markus Tatzgern, Denis Kalkofen, Dieter Schmalstieg

Details: IEEE Virtual Reality Conference (VR), 16-23 March 2013

In Augmented Reality (AR), careless augmentations can easily lead to information overflow. Especially on small screen devices, only a limited amount of information can be displayed comprehensively. Compact visualization filters data by reducing redundancies and creating a layout of the remaining information. Previously, this approach was applied to create static compact explosion diagrams. In this paper, we extend the approach to annotations, which are a major source of information in AR, and create compact layouts of annotations and annotated explosion diagrams. We present methods to transfer compact visualizations to dynamic AR settings and achieve interactive frame rates even on limited-resource hardware, such as mobile phones. Moreover, we create temporally coherent and scene-aware layouts.

 

2012


 

Full 6DOF Pose Estimation from Geo-Located Images

Authors: Clemens Arth, Gerhard Reitmayr, Dieter Schmalstieg

Details: Asian Conference on Computer Vision (ACCV), 5-9 Nov. 2012

Estimating the external calibration - the pose - of a camera with respect to its environment is a fundamental task in Computer Vision (CV). In this paper, we propose a novel method for estimating the unknown 6DOF pose of a camera with known intrinsic parameters from epipolar geometry only. For a set of geo-located reference images, we assume the camera position - but not the orientation - to be known. We estimate epipolar geometry between the image of the query camera and the individual reference images using image features. Epipolar geometry inherently contains information about the relative positioning of the query camera with respect to each of the reference cameras, giving rise to a set of relative pose estimates. Combining the set of pose estimates and the positions of the reference cameras in a robust manner allows us to estimate a full 6DOF pose for the query camera. We evaluate our algorithm on different datasets of real imagery in indoor and outdoor environments. Since our pose estimation method does not rely on an explicit reconstruction of the scene, our approach exposes several significant advantages over existing algorithms from the area of pose estimation.
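The paper's robust combination step is not detailed here, but the core geometric idea, that each reference camera with known position contributes a bearing toward the query camera, can be sketched as a least-squares intersection of 3D rays. This is an illustrative simplification, not the authors' estimator; all names are invented for the sketch:

```python
import numpy as np

def intersect_rays(origins, directions):
    """Least-squares 3D point minimizing the distance to a set of rays.

    origins: (N, 3) ray origins (known reference camera positions)
    directions: (N, 3) ray directions (bearings toward the query camera)
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)
```

With two or more non-parallel rays the normal equations are well conditioned; a practical system would wrap this in a robust scheme (e.g. RANSAC over subsets of reference cameras) as the paper's "robust manner" suggests.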

 

Image-Driven View Management for Augmented Reality Browsers

Authors: Raphael Grasset, Tobias Langlotz, Denis Kalkofen, Markus Tatzgern and Dieter Schmalstieg

Details: International Symposium on Mixed and Augmented Reality (ISMAR), 5-8 Nov. 2012

In this paper, we introduce a novel view management technique for placing labels in Augmented Reality systems. A common issue in many Augmented Reality applications is the absence of knowledge of the real environment, limiting the efficient representation and optimal layout of the digital information augmented onto the real world. To overcome this problem, we introduce an image-based approach, which combines a visual saliency algorithm with edge analysis to identify potentially important image regions and geometric constraints for placing labels. Our proposed solution also includes adaptive rendering techniques that allow a designer to control the appearance of depth cues. We describe the results obtained from a user study considering different scenarios, which we performed for validating our approach. Our technique will provide special benefits to Augmented Reality browsers that usually lack scene knowledge, but also to many other applications in the domain of Augmented Reality such as cultural heritage and maintenance applications.

 

Real-time Photometric Registration from Arbitrary Geometry

Authors: Lukas Gruber, Thomas Richter-Trummer, Dieter Schmalstieg

Details: International Symposium on Mixed and Augmented Reality (ISMAR), 5-8 Nov. 2012

Visually coherent rendering for augmented reality is concerned with seamlessly blending the virtual world and the real world in real-time. One challenge in achieving this is the correct handling of lighting. We are interested in applying real-world light to virtual objects and in computing the interaction of light between virtual and real. This implies the measurement of the real-world lighting, also known as photometric registration. So far, photometric registration has mainly been done through capturing images with artificial light probes, such as mirror balls or planar markers, or by using high dynamic range cameras with fish-eye lenses. In this paper, we present a novel non-invasive system, using arbitrary scene geometry as a light probe for photometric registration, and a general AR rendering pipeline supporting real-time global illumination techniques. Based on state-of-the-art real-time geometric reconstruction, we show how to robustly extract data for photometric registration to compute a realistic representation of the real-world diffuse lighting. Our approach estimates the light from observations of the reconstructed model and is based on spherical harmonics, enabling plausible illumination, such as soft shadows, in a mixed virtual-real rendering pipeline.
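Diffuse lighting in such a pipeline is commonly represented in a low-order spherical harmonics (SH) basis. As a sketch of the general technique (not the authors' implementation), fitting 2nd-order SH coefficients from observed surface normals and intensities reduces to linear least squares:

```python
import numpy as np

def sh_basis(n):
    """Evaluate the 9 real spherical harmonics basis functions (bands 0-2)
    for a unit normal n = (x, y, z)."""
    x, y, z = n
    return np.array([
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ])

def fit_sh(normals, intensities):
    """Least-squares SH coefficients from (normal, observed intensity) pairs."""
    B = np.stack([sh_basis(n) for n in normals])
    coeffs, *_ = np.linalg.lstsq(B, intensities, rcond=None)
    return coeffs
```

Once the nine coefficients are known, irradiance for any surface normal is a dot product with the basis, which is cheap enough for real-time shading of virtual objects.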

 

A GPGPU Accelerated Descriptor for Mobile Devices

Authors: Robert Hofmann, Hartmut Seichter and Gerhard Reitmayr

Details: International Symposium on Mixed and Augmented Reality (ISMAR), 5-8 Nov. 2012

We present a modified upright SURF feature descriptor for mobile phone GPUs. Our implementation, called uSURF-ES, is multiple times faster than a comparable CPU variant on the same device. Our results prove the feasibility of modern mobile graphics accelerators for GPGPU tasks, especially for the detection phase in natural feature tracking used in Augmented Reality applications.

 

AR Record&Replay: Situated Compositing of Video Content in Mobile Augmented Reality

Authors: Tobias Langlotz, Mathäus Zingerle, Raphael Grasset, Hannes Kaufmann, Gerhard Reitmayr

Details: ACM OzCHI 2012, 26-30 Nov. 2012

In this paper we present a novel approach to record and replay video content composited in-situ with a live view of the real environment. Our real-time technique works on mobile phones and uses a panorama-based tracker to create a visually seamless and spatially registered overlay of video content. We apply a temporal foreground-background segmentation of video footage and show how the segmented information can be precisely registered in real-time in the camera view of a mobile phone. We describe the user interface and the video post effects implemented in our prototype, and demonstrate our approach with a skateboard training application. Our technique can also be used with online video material and supports the creation of augmented situated documentaries.

 

Exploiting Sensors on Mobile Phones to Improve Wide-Area Localization

Authors: Clemens Arth, Alessandro Mulloni, Dieter Schmalstieg

Details: International Conference on Pattern Recognition (ICPR), 11-15 Nov. 2012

In this paper, we discuss how the sensors available in modern smartphones can improve 6-degree-of-freedom (6DOF) localization in wide-area environments. In our research, we focus on phones as a platform for large-scale Augmented Reality (AR) applications. Thus, our aim is to estimate the position and orientation of the device accurately and fast: it is unrealistic to assume that users are willing to wait tens of seconds before they can interact with the application. We propose supplementing vision methods with sensor readings from the compass and accelerometer available in most modern smartphones. We evaluate this approach on a large-scale reconstruction of the city center of Graz, Austria. Our results show that our approach improves both accuracy and localization time, in comparison to an existing localization approach based solely on vision. We finally conclude our paper with a real-world validation of the approach on an iPhone 4S.

 

Experiences with the Impact of Tracking Technology in Mobile Augmented Reality Evaluations

Authors: Alessandro Mulloni, Jens Grubert, Hartmut Seichter, Tobias Langlotz, Raphael Grasset, Gerhard Reitmayr, Dieter Schmalstieg

Details: MobileHCI 2012, MobiVis workshop

In this paper, we discuss the impact of tracking technology on user studies of mobile augmented reality applications. We present findings from several of our previous publications in the field, discussing how tracking technology can impact, influence and compromise experimental results. Lessons learned from our experience show that suitable tracking technology is a key requirement and a fundamental factor in the user experience of the application. Tracking technology should therefore be considered not only during implementation but also as a factor in the design and evaluation phases.

 

Indoor Navigation with Mixed Reality World-in-Miniature Views and Sparse Localization on Mobile Devices

Authors: Alessandro Mulloni, Hartmut Seichter, Dieter Schmalstieg

Details: Advanced Visual Interfaces (AVI), May 2012

We present the design of an interface that provides continuous navigational support for indoor scenarios where localization is only available at sparse, discrete locations (info points). Our interface combines turn-by-turn instructions with a World-in-Miniature (WIM). In a previous study, we showed that using an Augmented Reality WIM at info points, and turn-by-turn instructions elsewhere, is a valid support for navigation inside an unknown building. In particular, we highlighted that users value the WIM as a tool for monitoring their location in the building. In this work, we focus on using the WIM continuously, not only at info points, to support navigation. We adapt the WIM views to the quality of localization by transitioning within Mixed Reality: we use Augmented Reality to provide an overview of the whole path at info points and Virtual Reality to communicate the next instruction when localization is not available. Our results from a new user study validate our interface design and show that users exploit not only turn-by-turn instructions but also the WIM throughout the path, to navigate with our interface. This paper provides insight on how a low-infrastructure indoor solution can support human navigational abilities effectively.

 

Anywhere Interfaces Using Handheld Augmented Reality

Authors: Michael Gervautz and Dieter Schmalstieg

Details: IEEE Computer, Volume 45, Issue 7, pp. 26-31, August 2012

An investigation of the technology and human factors that drive augmented reality research describes recent developments in handheld AR, concentrating on localization, tracking, interaction, and visualization. A related video showing several real-world examples that illustrate the vast potential and important applications of augmented reality can be seen here: http://youtu.be/ol371rIyUFY.

 

Augmented Reality Browser Survey

Authors: Jens Grubert, Tobias Langlotz, Raphael Grasset

Details: Technical Report

While mobile Augmented Reality browsers have become commercially successful, real-world usage behavior with Augmented Reality browsers is still widely unexplored. We present findings of an online survey with 77 participants and an analysis of app store feedback for popular Augmented Reality browsers. Among other findings, we found that while the usage of Augmented Reality browsers is often driven by their novelty factor, a substantial number of long-term users exists. The analysis of quantitative and qualitative data showed that missing content or context is a major reason for abandoning the use of Augmented Reality browsers.

 

360° Panoramic Overviews for Location-Based Services

Authors: Alessandro Mulloni, Hartmut Seichter, Andreas Dünser, Patrick Baudisch, Dieter Schmalstieg

Details: CHI 2012

We investigate 360° panoramas as overviews to support users in the task of locating objects in the surrounding environment. Panoramas are typically visualized as rectangular photographs, but this does not provide clear cues for physical directions in the environment. In this paper, we conduct a series of studies with three different shapes: Frontal, Top-Down and Bird’s Eye; the last two shapes are chosen because they provide a clearer representation of the spatial mapping between panorama and environment. Our results show that good readability of the panorama is most important and that a clear representation of the spatial mapping plays a secondary role. This paper is the first to provide understanding on how users exploit 360° panoramic overviews to locate objects in the surrounding environment and how different design factors can affect user performance.

 

2011


 

Embedded Virtual Views for Augmented Reality Navigation

Authors: Markus Tatzgern, Denis Kalkofen, Raphael Grasset, Dieter Schmalstieg

Details: Workshop on Visualization in Mixed Reality Environments at IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2011

In this paper, we present virtual embedded views used for turn-based pedestrian navigation in Augmented Reality (AR). Embedded views allow users to see around occluding structures and at the same time seamlessly integrate the augmented navigation aid into the otherwise occluded view. Users get a preview of upcoming route changes without the need to consult an additional map view. We compare embedded views to other methods for revealing occluded navigation aids. We demonstrate that the technique is more screen-space efficient than ghosting, a typical x-ray vision technique, and may better facilitate the mental linking of information than a mirror.

 

User Experiences with Augmented Reality Aided Navigation on Phones

Authors: Alessandro Mulloni, Hartmut Seichter, Dieter Schmalstieg

Details: Poster at ISMAR 2011

We investigate user experiences when using augmented reality (AR) as a new aid to navigation. We integrate AR with other more common interfaces into a handheld navigation system, and we conduct an exploratory study to see where and how people exploit AR. Based on previous work on augmented photographs, we hypothesize that AR is used more to support wayfinding at static locations when users approach a road intersection. In partial contrast to this hypothesis, our results from a user evaluation hint that users will expect to use the system while walking. Further, our results also show that AR is usually exploited shortly before and after road intersections, suggesting that tracking support will be mostly needed in proximity of road intersections.

 

Homography-Based Planar Mapping and Tracking for Mobile Phones

Authors: Christian Pirchheim, Gerhard Reitmayr

Details: IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2011

We present a real-time camera pose tracking and mapping system which uses the assumption of a planar scene to implement a highly efficient mapping algorithm. Our light-weight mapping approach is based on keyframes and plane-induced homographies between them. We solve the planar reconstruction problem of estimating the keyframe poses with an efficient image rectification algorithm. Camera pose tracking uses the continuously extended and refined planar point maps and delivers robustly estimated 6DOF poses. We compare system and method with bundle adjustment and monocular SLAM on synthetic and indoor image sequences. We demonstrate large savings in computational effort compared to the monocular SLAM system while the reduction in accuracy remains acceptable.
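The plane-induced homographies that such a keyframe-based planar mapper relies on are classically estimated from point correspondences with the direct linear transform (DLT). A minimal, unnormalized sketch of that standard technique follows (point normalization and robust outlier rejection, which any practical tracker needs, are omitted):

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate H with dst ~ H * src (homogeneous, up to scale).

    src, dst: (N, 2) arrays of corresponding points, N >= 4.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence yields two linear constraints on the 9 entries of H.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    _, _, Vt = np.linalg.svd(A)       # null vector = last right singular vector
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]                # fix the arbitrary scale
```

Between two keyframes observing the same plane, such a homography fully explains the point motion, which is what makes the paper's light-weight, bundle-adjustment-free mapping possible.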

 

Rapid Scene Reconstruction on Mobile Phones from Panoramic Images

Authors: Qi Pan, Clemens Arth, Gerhard Reitmayr, Ed Rosten, Tom Drummond

Details: International Symposium on Mixed and Augmented Reality (ISMAR), 26-29 Oct. 2011

Rapid 3D reconstruction of environments has become an active research topic due to the importance of 3D models in a huge number of applications, be it in Augmented Reality (AR), architecture or other commercial areas. In this paper we present a novel system that allows for the generation of a coarse 3D model of the environment within several seconds on mobile smartphones. By using a very fast and flexible algorithm a set of panoramic images is captured to form the basis of wide field-of-view images required for reliable and robust reconstruction. A cheap on-line space carving approach based on Delaunay triangulation is employed to obtain dense, polygonal, textured representations. The use of an intuitive method to capture these images, as well as the efficiency of the reconstruction approach allows for an application on recent mobile phone hardware, giving visually pleasing results almost instantly.

 

Real-Time Self-Localization from Panoramic Images on Mobile Devices

Authors: Clemens Arth, Manfred Klopschitz, Gerhard Reitmayr, Dieter Schmalstieg

Details: International Symposium on Mixed and Augmented Reality (ISMAR), 26-29 Oct. 2011

Self-localization in large environments is a vital task for accurately registered information visualization in outdoor Augmented Reality (AR) applications. In this work, we present a system for self-localization on mobile phones using a GPS prior and an online-generated panoramic view of the user's environment. The approach is suitable for executing entirely on current generation mobile devices, such as smartphones. Parallel execution of online incremental panorama generation and accurate 6DOF pose estimation using 3D point reconstructions allows for real-time self-localization and registration in large-scale environments. The power of our approach is demonstrated in several experimental evaluations.

 

Enhancing Handheld Navigation Systems with Augmented Reality

Authors: Alessandro Mulloni, Hartmut Seichter, Dieter Schmalstieg

Details: MobileHCI 2011, Workshop on Mobile Augmented Reality

We investigate the role of augmented reality (AR) as a new kind of handheld interface to enhance navigation. We integrate AR with other more common interfaces into a handheld navigation system, and we conduct an exploratory study to see where and how people exploit the AR interface. Based on previous work on augmented photographs, we hypothesize that AR is more useful as a support for wayfinding at static locations just before road intersections. In partial contradiction with our hypotheses, our results show that AR is used mostly while walking, usually shortly before and after road intersections. Our results help inform both the design of AR interfaces and the development of tracking technologies.

 

Handheld Augmented Reality Indoor Navigation with Activity-Based Instructions

Authors: Alessandro Mulloni, Hartmut Seichter, Dieter Schmalstieg

Details: MobileHCI 2011

We present a novel design of an augmented reality interface to support indoor navigation. We combine activity-based instructions with sparse 3D localisation at selected info points in the building. Based on localisation accuracy and the users’ activities, such as walking or standing still, the interface adapts the visualisation by changing the density and quality of information shown. It provides a smooth flow of the navigational activities with minimal user input. We refine and validate our design through user involvement in pilot studies. We finally present the results of a comparative study conducted to validate the effectiveness of our design and to explore how the presence of info points affects users’ performance on indoor navigation tasks. The results of this study validate our design and show an improvement in task performance when info points are present, which act as confirmation points and provide an overview of the task.

 

Navigation Techniques in Augmented and Mixed Reality: Crossing the Virtuality Continuum

Authors: Raphael Grasset, Alessandro Mulloni, Mark Billinghurst, Dieter Schmalstieg

Details: Chapter of the Handbook of Augmented Reality Technologies and Applications, Springer.

We introduce some of the benefits, issues and challenges around AR navigation by presenting previous work in this area, and proposing a general navigation framework, addressing future challenges and research topics. The work we present also applies to the broader field of Mixed Reality, where real and virtual information are mixed without a precise definition of which space (virtual or real) is augmented and which space is augmenting. We first present the general concept of human navigation and location through space. Then we describe our general model of AR navigation before illustrating related work derived from our model. We classify related work by the viewpoint of the user on the spatial information, either considering AR as a primary or a secondary source of spatial information.

 

Robust detection and tracking of annotations for outdoor augmented reality browsing

Authors: Tobias Langlotz, Claus Degendorfer, Alessandro Mulloni, Gerhard Schall, Gerhard Reitmayr, Dieter Schmalstieg

Details: Accepted for Computers & Graphics special issue on mobile Augmented Reality

A common goal of outdoor augmented reality (AR) is the presentation of annotations that are registered to anchor points in the real world. We present an enhanced approach for registering and tracking such anchor points, which is suitable for current generation mobile phones and can also successfully deal with the wide variety of viewing conditions encountered in real life outdoor use. The approach is based on on-the-fly generation of panoramic images by sweeping the camera over the scene. The panoramas are then used for stable orientation tracking, while the user is performing only rotational movements. This basic approach is improved by several new techniques for the re-detection and tracking of anchor points. For the re-detection, specifically after temporal variations, we first compute a panoramic image with extended dynamic range, which can better represent varying illumination conditions. The panorama is then searched for known anchor points, while orientation tracking continues uninterrupted. We then use information from an internal orientation sensor to prime an active search scheme for the anchor points, which improves matching results. Finally, global consistency is enhanced by statistical estimation of a global rotation that minimizes the overall position error of anchor points when transforming them from the source panorama in which they were created to the current view represented by a new panorama. Once the anchor points are re-detected, we track the user's movement using a novel 3-degree-of-freedom orientation tracking approach that combines vision tracking with the absolute orientation from inertial and magnetic sensors. We tested our system using an AR campus guide as an example application and provide detailed results for our approach using an off-the-shelf smartphone. Results show that the re-detection rate is improved by a factor of 2 compared to previous work and reaches almost 90% for a wide variety of test cases while still keeping the ability to run at interactive frame rates.
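Estimating a single rotation that minimizes the overall position error of corresponding anchor directions between two panoramas is an instance of the orthogonal Procrustes problem. A sketch of the standard SVD (Kabsch) solution, offered as an illustration of this class of estimator rather than the authors' exact statistical method:

```python
import numpy as np

def best_rotation(p, q):
    """Rotation R minimizing sum ||R p_i - q_i||^2.

    p, q: (N, 3) corresponding direction vectors (e.g. anchor-point
    bearings in the source and current panorama).
    """
    H = p.T @ q                         # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    return Vt.T @ D @ U.T
```

Applying the recovered rotation to all anchors at once distributes the residual error globally instead of letting per-anchor drift accumulate.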

 

Collaborative use of mobile augmented reality with paper maps

Authors: Ann Morrison, Alessandro Mulloni, Saija Lemmelä, Antti Oulasvirta, Giulio Jacucci,
Peter Peltonen, Dieter Schmalstieg, Holger Regenbrecht

Details: Accepted for Computers & Graphics special issue on mobile Augmented Reality

The popularity of augmented reality (AR) applications on mobile devices is increasing, but there is as yet little research on use in real settings. We review data from two pioneering field trials where MapLens, a magic lens that augments paper-based city maps, was used in small-group collaborative tasks. The first study compared MapLens to a digital version akin to Google Maps; the second looked at using one shared mobile device versus using multiple devices. The studies find place-making and the use of artefacts to communicate and establish common ground to be predominant modes of interaction in AR-mediated collaboration, with users working on tasks together even when the tasks did not require them to.

 

Multi-Perspective Compact Explosion Diagrams

Authors: Markus Tatzgern, Denis Kalkofen, and Dieter Schmalstieg

Details: Computers and Graphics, Vol. 35, Issue 1, Feb. 2011

This article presents a system to automatically generate compact explosion diagrams. Inspired by handmade illustrations, our approach reduces the complexity of an explosion diagram by rendering an exploded view only for a suitable subset of the assemblies of an object. However, the exploded views are chosen so that they allow inferring the remaining unexploded assemblies of the entire 3D model. In particular, our approach demonstrates the assembly of a set of identical groups of parts by presenting an exploded view only for a single representative. In order to identify the representatives, our system automatically searches for recurring subassemblies. It selects representatives depending on a quality evaluation of their potential exploded view. Our system takes into account visibility information of both the exploded view of a potential representative as well as visibility information of the remaining unexploded assemblies. This allows rendering a balanced compact explosion diagram, consisting of a clear presentation of the exploded representatives as well as the unexploded remaining assemblies. Since representatives may interfere with one another, our system furthermore optimizes combinations of representatives. The optimization process also generates good views on the explosion diagram. Labels are added to the explosion diagram to increase the visibility of small or occluded parts. Throughout this article, we show a number of examples, which all have been rendered from unmodified 3D CAD data.

 

Rapid Reconstruction of Small Objects on Mobile Phones

Authors: Andreas Hartl, Lukas Gruber, Clemens Arth, Stefan Hauswiesner and Dieter Schmalstieg

Details: Accepted for Proceedings of the Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2011 Embedded Computer Vision Workshop (ECVW), pp. 20-27

Augmented Reality (AR) on mobile phones has recently been receiving more and more attention, becoming a popular research topic and an important commercial field. In this paper we present a lightweight method to create coarse 3D models of small-scale objects. The goal is to give users the possibility to create and maintain AR content themselves, without the need for expensive tools and complex interaction. Our algorithm is based on shape-from-silhouette using voxel carving and runs on modern smartphone hardware. 3D models of certain object groups can be generated interactively and instantly. The current reconstruction is visualized continuously using image-based rendering methods to inform the user about the current model quality. Given a suitably accurate model, it can further be used for any means of AR and can easily be shared with other users. The usability of our approach is evaluated using modern smartphone hardware (see Figure 1). The advantages are demonstrated using a set of models for playing a board game.
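The shape-from-silhouette principle behind voxel carving keeps only the voxels that project inside every observed silhouette. A toy sketch of that core loop, with a caller-supplied projection function (the phone implementation additionally handles real camera calibration, segmentation and acceleration structures, all omitted here):

```python
import numpy as np

def carve(voxels, silhouettes, project):
    """Keep voxels whose projection lies inside every binary silhouette.

    voxels: (N, 3) candidate voxel centers
    silhouettes: list of 2D boolean masks of shape (H, W)
    project: function (view_index, (N, 3) points) -> (N, 2) pixel coords (x, y)
    """
    keep = np.ones(len(voxels), dtype=bool)
    for i, sil in enumerate(silhouettes):
        xy = np.round(project(i, voxels)).astype(int)
        h, w = sil.shape
        inside = (xy[:, 0] >= 0) & (xy[:, 0] < w) & (xy[:, 1] >= 0) & (xy[:, 1] < h)
        hit = np.zeros(len(voxels), dtype=bool)
        hit[inside] = sil[xy[inside, 1], xy[inside, 0]]
        keep &= hit                      # carve away anything outside any view
    return voxels[keep]
```

Each additional view can only remove material, so the surviving voxels converge toward the visual hull of the object as more silhouettes are added.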

 

2010


 

Instant Segmentation and Feature Extraction for Recognition of Simple Objects on Mobile Phones

Authors: Andreas Hartl, Clemens Arth, Dieter Schmalstieg

Details: International Workshop on Mobile Vision (held in conjunction with CVPR 2010) (IWMV'10)

Object detection and recognition algorithms are an integral part of many modern image processing systems employing Computer Vision (CV) techniques. In this paper we describe our work in the area of segmentation and recognition of simple objects in mobile phone imagery. Given an image of several objects on a structured background, we show how these objects can be segmented efficiently and how features can be extracted efficiently for further object recognition and classification. We demonstrate the usefulness of the presented algorithms on a set of test cases, and we show that they can be used for instant object segmentation and recognition in a real-world application on ordinary off-the-shelf smartphones.

 

Color harmonization for Augmented Reality

Authors: Lukas Gruber, Denis Kalkofen, and Dieter Schmalstieg

Details: Poster in Proceedings of Int. Symposium on Mixed and Augmented Reality 2010 (ISMAR'10)

In this paper we discuss color harmonization for Augmented Reality. Color harmonization is a technique used to adjust the combination of colors in order to follow aesthetic guidelines. We implemented a system which is able to harmonize the combination of colors in video-based AR systems. The presented approach is able to re-color virtual and real-world items, achieving overall more visually pleasant results. In order to allow the preservation of certain colors in an AR composition, we furthermore introduce the concept of constrained color harmonization.

 

The City of Sights: Design, construction, and measurement of an Augmented Reality stage set

Authors: Lukas Gruber, Steffen Gauglitz, Jonathan Ventura, Stefanie Zollmann, Manuel Huber, Michael Schlegel, Gudrun Klinker, Dieter Schmalstieg, Tobias Hoellerer

Details: In Proceedings of Int. Symposium on Mixed and Augmented Reality 2010 (ISMAR'10)

We describe the design and implementation of a physical and virtual model of an imaginary urban scene, the "City of Sights", that can serve as a backdrop or "stage" for a variety of Augmented Reality (AR) research. We argue that the AR research community would benefit from such a standard model dataset which can be used for evaluation of such AR topics as tracking systems, modeling, spatial AR, rendering tests, collaborative AR and user interface design. By openly sharing the digital blueprints and assembly instructions for our models, we allow the proposed set to be physically replicable by anyone and permit customization and experimental changes to the stage design which enable comprehensive exploration of algorithms and methods. Furthermore we provide an accompanying rich dataset consisting of video sequences under varying conditions with ground truth camera pose. We employed three different ground truth acquisition methods to support a broad range of use cases. The goal of our design is to enable and improve the replicability and evaluation of future augmented reality research.

 

Compact Explosion Diagrams

Authors: Markus Tatzgern, Denis Kalkofen, and Dieter Schmalstieg

Details: In Proc. of the 8th International Symposium on Non-Photorealistic Animation and Rendering 2010 (NPAR'10)

This paper presents a system to automatically generate compact explosion diagrams. Inspired by handmade illustrations, our approach reduces the complexity of an explosion diagram by rendering an exploded view only for a subset of the assemblies of an object. However, the exploded views are chosen so that they allow inference of the remaining unexploded assemblies of the entire 3D model. In particular, our approach demonstrates the assembly of a set of identical groups of parts, by presenting an exploded view only for a single representative. In order to identify the representatives, our system automatically searches for recurring subassemblies. It selects representatives depending on a quality evaluation of their potential exploded view. Our system takes into account visibility information of both the exploded view of a potential representative as well as visibility information of the remaining unexploded assemblies. This allows rendering a balanced compact explosion diagram, consisting of a clear presentation of the exploded representatives as well as the unexploded remaining assemblies. Since representatives may interfere with one another, our system furthermore optimizes combinations of representatives. Throughout this paper we show a number of examples, which have all been rendered from unmodified 3D CAD models.

 

Zooming Interfaces for Augmented Reality Browsers

Authors: Alessandro Mulloni, Andreas Dünser, Dieter Schmalstieg

Details: In proceedings of Mobile HCI 2010

Augmented Reality combines real world and virtual information in interactive visualizations. Since phones started integrating GPS, compass and accelerometer, several Augmented Reality browsers for phones have hit the market. These are applications that access large amounts of geo-referenced information from online sources and present it at corresponding physical locations, superimposed onto a live video stream. However, Augmented Reality is constrained by the camera’s field of view and restricted to first-person views, limiting the amount of overview that users can gain. We present two zooming interfaces that compensate for these constraints by enabling users to smoothly zoom between the Augmented Reality view and (1) an egocentric panoramic view of 360°, and (2) an exocentric top-down view. We present the results from two studies that show how in most search tasks our zooming interfaces are faster and require less panning than an overlay-based tool, scaling better as the amount of information grows.

 

Techniques for View Transition in Multi-Camera Outdoor Environments

Authors: Eduardo Veas, Alessandro Mulloni, Ernst Kruijff, Holger Regenbrecht, Dieter Schmalstieg

Details: In proceedings of Graphics Interface 2010

Environment monitoring using multiple observation cameras is increasingly popular. Different techniques exist to visualize the incoming video streams, but only a few evaluations are available to identify the most suitable one for a given task and context. This article compares three techniques for browsing video feeds from cameras that are located around the user in an unstructured manner. The techniques allow mobile users to gain extra information about the surroundings, the objects and the actors in the environment by observing a site from different perspectives. The techniques relate local and remote cameras topologically, via a tunnel, or via a bird’s-eye viewpoint. Their common goal is to enhance the spatial awareness of the viewer without relying on a model or previous knowledge of the environment. We introduce several factors of spatial awareness inherent to multi-camera systems, and present a comparative evaluation of the proposed techniques with respect to spatial understanding and workload.

 

Sketching up the world: In-situ authoring for mobile Augmented Reality

Authors: Tobias Langlotz, Stefan Mooslechner, Stefanie Zollmann, Claus Degendorfer, Dieter Schmalstieg

Details: Accepted for the 2010 International Workshop on Smartphone Applications and Services (9-11 December, Gwangju, South Korea)

We present a novel system allowing in-situ content creation for mobile Augmented Reality in unprepared environments. The system targets smartphones and therefore allows spontaneous authoring while in place. We describe two different scenarios, which depend on the size of the working volume and consequently use different tracking techniques. A natural-feature-based approach for planar targets is used for small working spaces, whereas for larger working volumes a panorama-based rotation tracker is deployed. Both are integrated into one system, allowing the user to create content with the same interaction techniques, applying a set of simple yet powerful modeling functions. The resulting AR content can be shared with other users via a dedicated content server or kept in a private inventory for later use.

 

Online Creation of Panoramic Augmented Reality Annotations on Mobile Phones

Authors: Tobias Langlotz, Daniel Wagner, Alessandro Mulloni, Dieter Schmalstieg

Details: Accepted for IEEE Pervasive Computing

We present a novel approach for creating and exploring annotations in place using mobile phones. The system can be used in large-scale indoor and outdoor scenarios and offers an accurate mapping of annotations to physical objects. It uses drift-free orientation tracking based on panoramic images, which can be initialized using data from a GPS sensor. Given the current position and view direction, we show how annotations can be accurately mapped to the correct objects, even as the user's position varies. Possible applications range from Augmented Reality browsers to pedestrian navigation.

 

International Symposium on Ubiquitous Virtual Reality 2009

Authors: Wonwoo Lee, Antonio Krüger, Bruce H. Thomas, Daniel Wagner, Kari Pulli, Woontack Woo

Details: IEEE Pervasive Computing, vol. 9, no. 2, pp. 78-80, Apr.-June 2010

We report on the current state and future direction of ubiquitous VR, based on presentations and discussions during the 2009 International Symposium on Ubiquitous Virtual Reality. At the symposium, enabling technologies and applications for realizing ubiquitous VR on mobile devices were discussed, and several approaches aimed at improving ubiquitous VR applications with pervasive and ubiquitous computing technologies were proposed.

 

Augmented Reality 2.0

Authors: Dieter Schmalstieg, Tobias Langlotz, Mark Billinghurst

Details: Virtual Realities, Dagstuhl seminar proceedings (eds. Sabine Coquillart, Guido Brunnett, Greg Welch), 2010

Augmented Reality (AR) was first demonstrated in the 1960s, but only recently have technologies emerged that can be used to easily deploy AR applications to many users. Camera-equipped cell phones with significant processing power and graphics abilities provide an inexpensive and versatile platform for AR applications, while the social networking technology of Web 2.0 provides a large-scale infrastructure for collaboratively producing and distributing geo-referenced AR content. This combination of widely used mobile hardware and Web 2.0 software allows the development of a new type of AR platform that can be used on a global scale. In this paper we describe the Augmented Reality 2.0 concept and present existing work on mobile AR and web technologies that could be used to create AR 2.0 applications.

 

Optimization of Target Objects for Natural Feature Tracking

Authors: Lukas Gruber, Stefanie Zollmann, Daniel Wagner, Tobias Hoellerer and Dieter Schmalstieg

Details: Proceedings of IEEE International Conference on Pattern Recognition 2010 (ICPR 2010)

This paper investigates possible physical alterations of tracking targets to obtain improved 6DoF pose detection for a camera observing the known targets. We explore the influence of several texture characteristics on the pose detection, by simulating a large number of different target objects and camera poses. Based on statistical observations, we rank the importance of characteristics such as texturedness and feature distribution for a specific implementation of a 6DoF tracking technique. These findings allow informed modification strategies for improving the tracking target objects themselves, in the common case of man-made targets, as for example used in advertising. This fundamentally differs from and complements the traditional approach of leaving the targets unchanged while trying to optimize the tracking algorithms and parameters.

 

Real-time Panoramic Mapping and Tracking on Mobile Phones

Authors: Daniel Wagner, Alessandro Mulloni, Tobias Langlotz, Dieter Schmalstieg

Details: Proceedings of IEEE Virtual Reality Conference 2010 (VR'10)

We present a novel method for the real-time creation and tracking of panoramic maps on mobile phones. The maps generated with this technique are visually appealing, very accurate and allow drift-free rotation tracking. This method runs on mobile phones at 30Hz and has applications in the creation of panoramic images for offline browsing, for visual enhancements through environment mapping and for outdoor Augmented Reality on mobile phones.
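For illustration, mapping a view ray into a cylindrical panorama, a common parameterization for such rotation-tracked panoramic maps, can be sketched as follows. The cylindrical model, pitch clamp and function name are assumptions for the sketch, not necessarily the paper's choices:

```python
import math

def ray_to_panorama(direction, pano_w, pano_h, max_pitch=math.pi / 4):
    """Map a unit view ray (x, y, z) to cylindrical panorama pixel coords.

    Azimuth wraps 360 degrees across pano_w; pitch spans +/-max_pitch
    over pano_h (an assumed vertical field of view).
    """
    x, y, z = direction
    azimuth = math.atan2(x, z)                     # angle around the vertical axis
    pitch = math.asin(max(-1.0, min(1.0, y)))      # elevation of the ray
    u = (azimuth / (2 * math.pi) + 0.5) * pano_w   # horizontal panorama column
    v = (0.5 - pitch / (2 * max_pitch)) * pano_h   # vertical panorama row
    return u, v
```

With such a mapping, each tracked camera frame can be warped into a fixed panorama coordinate system, which is what makes the rotation tracking drift-free relative to the accumulated map.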

 

Real Time Detection and Tracking for Augmented Reality on Mobile Phones

Authors: Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Tom Drummond, Dieter Schmalstieg

Details: In IEEE Transactions on Visualization and Computer Graphics May/June 2010 (vol. 16 no. 3)

In this paper, we present three techniques for 6DOF natural feature tracking in real time on mobile phones. We achieve interactive frame rates of up to 30 Hz for natural feature tracking from textured planar targets on current-generation phones. We use an approach based on heavily modified state-of-the-art feature descriptors, namely SIFT and Ferns, plus a template-matching-based tracker.

 

2009


 

Towards Wide Area Localization on Mobile Phones

Authors: Clemens Arth, Daniel Wagner, Manfred Klopschitz, Arnold Irschara, Dieter Schmalstieg

Details: In Proceedings of Int. Symposium on Mixed and Augmented Reality 2009 (ISMAR'09)

We present a fast and memory-efficient method for localizing a mobile user’s 6DOF pose from a single camera image. Our approach registers a view with respect to a sparse 3D point reconstruction. The 3D point dataset is partitioned into pieces based on visibility constraints and occlusion culling, making it scalable and efficient to handle. Starting from a coarse guess, our system only considers features that can be seen from the user’s position. Our method is resource-efficient, usually requiring only a few megabytes of memory, thereby making it feasible to run on low-end devices such as mobile phones. At the same time it is fast enough to give instant results on this device class.

 

Multiple Target Detection and Tracking with Guaranteed Framerates on Mobile Phones

Authors: Daniel Wagner, Dieter Schmalstieg, Horst Bischof

Details: In Proceedings of Int. Symposium on Mixed and Augmented Reality 2009 (ISMAR'09)

In this paper we present a novel method for real-time pose estimation and tracking on low-end devices such as mobile phones. The presented system can track multiple known targets in real time and simultaneously detect new targets for tracking. We present a method to automatically and dynamically balance the quality of detection and tracking to adapt to a variable time budget and ensure a constant frame rate. Results from real data of a mobile phone Augmented Reality system demonstrate the efficiency and robustness of the described approach. The system can track six planar targets simultaneously on a mobile phone at a frame rate of 23 fps.
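The adaptive balancing described above can be sketched as a per-frame scheduler that prioritizes tracking of known targets and gives detection whatever budget remains. The interfaces, the budget figure, and the incremental-detection assumption below are illustrative, not the paper's code:

```python
import time

FRAME_BUDGET_MS = 1000.0 / 23.0   # target ~23 fps, as reported in the abstract

def process_frame(frame, trackers, detector):
    """Spend the frame budget on tracking first, then detection (sketch)."""
    start = time.perf_counter()
    for t in trackers:            # tracking known targets has priority
        t.track(frame)
    remaining = FRAME_BUDGET_MS - (time.perf_counter() - start) * 1000.0
    if remaining > 0:
        # Detection is assumed to be incremental: it resumes where it
        # stopped in the previous frame and yields once the budget is spent.
        detector.detect(frame, budget_ms=remaining)
```

The key design point is that detection work is spread across frames instead of blocking any single frame, which is what keeps the frame rate constant as targets appear and disappear.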

 

Global Pose Estimation using Multi-Sensor Fusion for Outdoor Augmented Reality

Authors: Gerhard Schall, Daniel Wagner, Gerhard Reitmayr, Manfred Wieser, Elise Teichmann, Dieter Schmalstieg, Bernhard Hofmann-Wellenhof

Details: In Proceedings of Int. Symposium on Mixed and Augmented Reality 2009 (ISMAR'09)

Outdoor Augmented Reality typically requires tracking in unprepared environments. For global registration, GPS is currently the best sensing technology, but its precision and update rate are not sufficient for high-quality tracking. We present a system that uses Kalman filtering to fuse DGPS/RTK measurements with barometric heights, and an inertial measurement unit with gyroscopes, magnetometers and accelerometers to improve the transient behavior. Typically, inertial sensors are subject to drift, and magnetometer measurements are distorted by electromagnetic fields in the environment. To compensate, we additionally apply a visual orientation tracker that is drift-free thanks to online mapping of the unknown environment. This tracker allows correcting the distortions of the 3-axis magnetic compass, which increases the robustness and accuracy of the pose estimates. We present results of applying this approach in an industrial application scenario.
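The fusion principle behind such Kalman filtering can be illustrated with a scalar measurement update, where each sensor's reading is weighted by its noise variance. The height values and variances below are made-up numbers for illustration, not from the paper:

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update: fuse the current estimate
    (mean x, variance p) with a measurement z of variance r."""
    k = p / (p + r)          # Kalman gain: trust in the new measurement
    x_new = x + k * (z - x)  # correct the estimate toward the measurement
    p_new = (1.0 - k) * p    # fused variance is always reduced
    return x_new, p_new

# Fuse a noisy GPS height with a more precise barometric height
# (illustrative values: prior 400 m, GPS variance 100, barometer variance 4).
x, p = 400.0, 25.0
x, p = kalman_update(x, p, 410.0, 100.0)  # GPS height measurement
x, p = kalman_update(x, p, 402.0, 4.0)    # barometric height measurement
```

Each update shrinks the variance, so the low-noise barometric reading dominates the fused height while the GPS measurement still anchors it globally.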

 

Making Augmented Reality Practical on Mobile Phones, Part 2

Authors: Daniel Wagner, Dieter Schmalstieg

Details: IEEE Computer Graphics and Applications, Vol. 29, No. 4 July/Aug. 2009

In part 1, we introduced a software environment for augmented reality (AR) on mobile phones, discussed development and debugging strategies, and showed how to execute several tasks of a common AR system in parallel on a mobile device. Here, we discuss how to overcome the most severe limitations, such as memory, rendering speed, and computational power. We analyze in detail where an optimized mobile phone AR application spends most of its processing time and give an outlook on what to expect in the next few years.

 

History and Future of Tracking for Mobile Phone Augmented Reality

Authors: Daniel Wagner, Dieter Schmalstieg

Details: International Symposium on Ubiquitous Virtual Reality (ISUVR 2009)

We present an overview of the history of tracking for mobile phone Augmented Reality, covering popular approaches based on marker tracking, natural feature tracking, and offloading to nearby servers. We then outline likely future work.

 

Making Augmented Reality Practical on Mobile Phones, Part 1

Authors: Daniel Wagner, Dieter Schmalstieg

Details: IEEE Computer Graphics and Applications, Vol. 29, No. 3, May/June 2009

In the past few years, mobile phones have become an increasingly attractive platform for augmented reality (AR). According to Gartner, 1.22 billion mobile phones were sold in 2008. Some forecasts estimate that this number will rise to 1.8 billion units in 2012, of which 800 million are expected to be smartphones. Although not all these devices are open for custom software development, the trend toward open software environments for mobile phones seems inevitable.
In 2003, we started an AR framework for mobile phones. We intended its first generation as primarily a proof of feasibility. The second generation was an attempt to port a fully featured PC-based AR framework, Studierstube (http://studierstube.org), to a phone platform. You can port existing applications and make them run on mobile phones. However, as we had to painfully experience ourselves, this approach typically produces slow, bloated, and unstable software. Optimally using phones’ scarce resources requires different algorithms and architectural decisions than for PCs, leading to a complete reengineering of an existing solution.
So, for the third generation, Studierstube ES, we largely abandoned compatibility requirements and added new elements to the design, such as an asymmetric client-server technique, that are specific to mobile devices. In this first installment of our two-part tale of Studierstube ES and what we’ve learned along the way, we describe the mobile phone platform’s restrictions and how our software architecture allows fast development of mobile phone AR applications.

 

Indoor Positioning and Navigation with Camera Phones

Authors: Alessandro Mulloni, Daniel Wagner, István Barakonyi, Dieter Schmalstieg

Details: IEEE Pervasive Computing, Vol 8, Nr 2, pp. 22-31

This low-cost indoor navigation system runs on off-the-shelf camera phones. More than 2,000 users at four different large-scale events have already used it. The system uses built-in cameras to determine user location in real time by detecting unobtrusive fiduciary markers. The required infrastructure is limited to paper markers and static digital maps, and common devices are used, facilitating quick deployment in new environments. The authors have studied the application quantitatively in a controlled environment and qualitatively during deployment at four large international events. According to test users, marker-based navigation is easier to use than conventional mobile digital maps. Moreover, the users' location awareness in navigation tasks improved. Experiences drawn from questionnaires, usage log data, and user interviews further highlight the benefits of this approach.

 

An Authoring Tool for Mobile Phone AR Environments

Authors: Yuan Wang, Tobias Langlotz, Mark Billinghurst, Tim Bell

Details: Proceedings of New Zealand Computer Science Research Student Conference (NZCSRSC '09)

This paper describes an authoring tool for mobile phone Augmented Reality (AR) applications. This work is based on earlier work at the HIT Lab NZ on a tool for authoring PC-based AR applications called ComposAR. In this paper, we describe modifications to ComposAR that allow end-users to prototype mobile AR applications on a PC, and mobile player software that allows the prototype applications to be delivered on a mobile phone. In this way, end-users with little programming experience can develop simple mobile AR applications.

 

Augmented Reality on Mobile Phones

Authors: Daniel Wagner, Lukas Gruber, Dieter Schmalstieg

Details: Chapter in the ShaderX7 book, edited by Wolfgang Engel

Our motivation comes from research on a novel kind of user interface called augmented reality (AR). The real world as viewed with a camera is augmented with virtual objects, which are spatially registered in the scene. Probably the most widely recognized example is the Eye of Judgment, a computer/board game for Sony's PlayStation 3 that puts virtual game characters on playing cards.
Recently, mobile phones with cameras have become attractive as inexpensive AR devices. Mobile phones have a market penetration of 100% in many industrial countries now, and their graphics performance is constantly increasing, primarily driven by the desire for compelling mobile games. Unfortunately, the phone market is highly fragmented in terms of platforms and capabilities, so that delivering a graphics application to a wide user base is a very tedious task.
What is interesting about our work to develop AR experiences on mobile phones is that until recently, AR was considered too demanding to be executed on mobile phones due to the requirement to simultaneously perform video acquisition, image processing, and graphics synthesis in real-time. We have recently shown that compelling AR applications are possible on phones. The solutions we have developed are not limited to AR, but can be applied to any high-performance graphics/video application.
The purpose of this article is to provide some insight into how to approach performance-sensitive issues in mobile phone development, and how to obtain sufficient platform independence to allow dissemination to a reasonably large user base. First, we describe the special needs and techniques of AR applications. The article continues with issues and solutions when developing for Symbian and Windows Mobile, OpenGL ES and Direct3D, software and hardware rendering, and scene-graph as well as video processing issues.

 

2008


 

Pose Tracking from Natural Features on Mobile Phones

Authors: Daniel Wagner, Gerhard Reitmayr, Alessandro Mulloni, Tom Drummond, Dieter Schmalstieg

Details: The 7th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2008)
Best Paper Award

Abstract: In this paper we present two techniques for natural feature tracking in real-time on mobile phones. We achieve interactive frame rates of up to 20Hz for natural feature tracking from textured planar targets on current-generation phones. We use an approach based on heavily modified state-of-the-art feature descriptors, namely SIFT and Ferns. While SIFT is known to be a strong, but computationally expensive feature descriptor, Ferns classification is fast, but requires large amounts of memory. This renders both original designs unsuitable for mobile phones. We give detailed descriptions on how we modified both approaches to make them suitable for mobile phones. We present evaluations on robustness and performance on various devices and finally discuss their appropriateness for Augmented Reality applications.

 

Robust and Unobtrusive Marker Tracking on Mobile Phones

Authors: Daniel Wagner, Tobias Langlotz, Dieter Schmalstieg

Details: The 7th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2008)

Abstract: Marker tracking revolutionized Augmented Reality about a decade ago. However, this revolution came at the expense of visual clutter. In this paper, we propose several new marker techniques which are less obtrusive than the usual black-and-white squares. Furthermore, we report methods that allow tracking beyond the visibility of these markers, further improving robustness. All presented techniques are implemented in a single tracking library, are highly efficient in their memory and CPU usage, and run at interactive frame rates on mobile phones.

 

Mobility and Social Interaction as Core Gameplay Elements in Multi-Player Augmented Reality

Authors: Alessandro Mulloni, Daniel Wagner, Dieter Schmalstieg

Details: 3rd International Conference on Digital Interactive Media in Entertainment and Arts (DIMEA 2008)

Abstract: In this paper, we present an Augmented Reality game that strongly exploits mobility and social interaction between players as core gameplay elements. We have implemented this game on handheld devices and conducted a qualitative user study, investigating the level of mobility and social involvement of players. We discuss the results from this user study, describing the problems experienced by players and how we tackled them.

 

Mobile Phones as a Platform for Augmented Reality

Authors: Dieter Schmalstieg, Daniel Wagner

Details: Proceedings of the IEEE VR 2008 Workshop on Software Engineering and Architectures for Realtime Interactive Systems (Reno, NV, USA), pp. 43-44, March 2008

Abstract: Handheld Augmented Reality (AR), running self-contained on handheld computers and smartphones, can leverage an extremely large potential user base of existing devices and users who know how to operate them. In this paper we report on a platform for collaborative handheld AR applications, which employs efficient techniques from embedded development to push the limits of AR applications in terms of physical size, number of users and content intensity.

 

 

2007


 

Design Aspects of Handheld Augmented Reality Games

Authors: Daniel Wagner, Dieter Schmalstieg

Details: Pervasive Gaming Applications, Volume 2, pp. 159-178, Shaker Publishing, December 2007

Abstract: Handhelds such as mobile phones and PDAs are common today. Developing games for these devices has become a huge and profitable market. In this article we discuss design aspects that are specific to the still mostly unexplored area of Augmented Reality games on handheld devices.

 

Experiences with Handheld Augmented Reality

Authors: Dieter Schmalstieg, Daniel Wagner

Details: The Sixth IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2007), pp. 3-15, November 2007

Abstract: In this paper, we present Studierstube ES, a framework for the development of handheld Augmented Reality. The applications run self-contained on handheld computers and smartphones with Windows CE. A detailed description of the performance-critical tracking and rendering components is given. We also report on the implementation of a client-server architecture for multi-user applications, and on a game engine for location-based museum games built on top of this infrastructure. Details on two games that were created, permanently deployed and evaluated in two Austrian museums illustrate the practical value of the framework and the lessons learned from using it.

 

Handheld Augmented Reality

Author: Daniel Wagner

Details: PhD Thesis, October 2007

Abstract: Augmented Reality (AR) aims at developing new user interfaces. Although research has produced a large number of application prototypes and AR frameworks in the last 20 years, no project has yet been practical enough to create a mass market success.
There are many reasons for this. Traditionally, AR researchers have primarily created prototypes that aim to solve engineering problems such as maintenance, or new interfaces for complex environments such as head-up displays for navigation and battlefield systems. Most researchers still see AR as a basic research area. Developing easy-to-use, practical applications, such as for home users, is therefore usually not a goal. Another problem with many Augmented Reality systems is the highly complex hardware setup, often including expensive commercial sensors, input devices and output devices. These devices are often bulky and fragile, since they were never meant to be operated by untrained users.
Research at the Vienna University of Technology and the Graz University of Technology has aimed at moving Augmented Reality to a mass-market. Instead of specialized and expensive hardware, this project targets low cost mobile devices, namely mobile phones. In contrast to traditional AR hardware, people already own these devices and know how to operate them. Recently, processing capabilities of mobile phones have reached a level that makes these devices capable of running standalone AR applications and renders them ideal candidates for mass marketed Augmented Reality solutions.
This thesis presents a framework that for the first time allows the creation of practical AR applications on end-user-owned devices. The software runs on a broad range of devices and has been used for several applications, some of them commercial. To prove the applicability of the new platform, the author of this thesis performed evaluations with untrained users in real-life environments such as museum exhibitions and conferences.

 

Managing Complex Augmented Reality Models

Authors: Dieter Schmalstieg , Gerhard Schall, Daniel Wagner, István Barakonyi, Gerhard Reitmayr, Joseph Newman, Florian Ledermann

Details: IEEE Computer Graphics and Applications, pp. 32-41, July/August 2007

Abstract: Mobile augmented reality requires georeferenced data to present world-registered overlays. To cover a wide area and all artifacts and activities, a database containing this information must be created, stored, maintained, delivered, and finally used by the client application. We present a data model and a family of techniques to address these needs.

 

A collaborative and location-aware application based on augmented reality for mobile devices

Author: Alessandro Mulloni

Details: Master's Thesis, July 2007

Abstract: The purpose of this thesis project is to design and implement a multi-user collaborative Augmented Reality game, based on the Studierstube ES framework, that exploits the ability of users to move inside the environment. If the gameplay is compelling, game players tend to be more forgiving than users of professional applications when hardware or software issues occur; games are therefore an ideal class of applications for research projects focused on developing new technological solutions. The overall goal of this project is to develop a case-study application proving the feasibility of merging Computer Supported Cooperative Work, location awareness and Augmented Reality in a system based on standalone handheld platforms that exploits user mobility in an environment. The proposed solution can be partially or fully reused for non-entertainment applications: the system developed for user orientation in the physical game environment could be used for indoor navigation, and the whole system could be used to coordinate groups of workers in a cooperative context within a large environment (e.g. workers on a building site).

 

Muddleware for Prototyping Mixed Reality Multiuser Games

Authors: Daniel Wagner, Dieter Schmalstieg

Details: Proceedings of IEEE Virtual Reality 2007 (VR2007), March 2007

Abstract: We present Muddleware, a communication platform designed for mixed reality multi-user games on mobile, lightweight clients. An approach inspired by Tuplespaces, which decouples sender and receiver, is used to address the requirements of a potentially large number of mobile clients. A hierarchical database built on XML technology allows convenient prototyping and simple yet powerful queries. Server-side extensions address persistence and autonomous behaviors through hierarchical state machines. The architecture has been tested with a number of multi-user games and is also used for non-entertainment applications.

 

ARToolKitPlus for Pose Tracking on Mobile Devices

Authors: Daniel Wagner, Dieter Schmalstieg

Details: Proceedings of 12th Computer Vision Winter Workshop (CVWW'07), February 2007

Abstract: In this paper we present ARToolKitPlus, a successor to the popular ARToolKit pose tracking library. ARToolKitPlus has been optimized and extended for use on mobile devices such as smartphones, PDAs and Ultra Mobile PCs (UMPCs). We explain the specific requirements of pose tracking on mobile devices and how we met them. To prove its applicability, we performed an extensive benchmark series on a broad range of off-the-shelf handhelds.

 

2006


 

Handheld AR for Collaborative Edutainment

Authors: Daniel Wagner, Dieter Schmalstieg, Mark Billinghurst

Details: Proceedings of 16th International Conference on Artificial Reality and Telexistence (ICAT), 2006

Abstract: Handheld Augmented Reality (AR) is expected to provide ergonomic, intuitive user interfaces for untrained users. Yet no comparative study has evaluated these assumptions against more traditional user interfaces for an education task. In this paper we compare the suitability of a handheld AR arts-history learning game against more traditional variants. We present results from a user study that demonstrate not only the effectiveness of AR for untrained users but also its fun-factor and suitability in environments such as public museums. Based on these results we provide design guidelines that can inform the design of future collaborative handheld AR applications.

 

How Real Should Virtual Characters Be?

Authors: Daniel Wagner, Mark Billinghurst, Dieter Schmalstieg

Details: ACM Conference on Advances in Computer Entertainment Technology 2006 (ACE 2006), June 2006

Abstract: In recent years 3D virtual characters have become more common in desktop interfaces, particularly in gaming and entertainment applications. In this paper we describe how augmented reality (AR) technology can be used to bring virtual characters into the real world and compare AR characters to other types of virtual characters. We have developed a handheld AR educational application in which a virtual character teaches users about art history. We present results from a user study that explores how realistic the character needs to be for it to be an effective and engaging educational tool and if augmented reality offers benefits for this type of application.

 

Handheld Augmented Reality Displays

Authors: Daniel Wagner, Dieter Schmalstieg

Details: Proceedings of IEEE Virtual Reality 2006, March 2006

Abstract: Augmented Reality (AR) can naturally complement mobile computing on wearable devices by providing an intuitive interface to a three-dimensional information space embedded within physical reality. However, existing AR systems like MARS or Tinmith, which require a user to wear a notebook computer in a backpack and a head-mounted display (HMD), are expensive, fragile and inconvenient to wear. Thin-client approaches using a Tablet PC or Personal Digital Assistant (PDA) merely as a portable display require a dedicated server infrastructure and limit mobility. We believe there is a need for an unconstrained, infrastructure-independent AR display to fill the gap in situations where traditional backpack systems are too costly and cumbersome, but thin-client implementations exhibit inadequate deployability, scalability or interactive behavior. Particular examples include sporadic use over lengthy time spans, in between which devices must be stowed away, mixed indoor/outdoor use in wide-area environments, and massively multi-user application scenarios. This has motivated us to develop a state-of-the-art AR framework targeting lightweight handheld displays.

 

2005


 

Towards Massively Multi-User Augmented Reality on Handheld Devices

Authors: Daniel Wagner, Thomas Pintaric, Florian Ledermann, Dieter Schmalstieg

Details: Proceedings of Third International Conference on Pervasive Computing, Pervasive 2005, May 2005

Abstract: Augmented Reality (AR) can naturally complement mobile computing on wearable devices by providing an intuitive interface to a three-dimensional information space embedded within physical reality. Unfortunately, current wearable AR systems are relatively complex, expensive, fragile and heavy, rendering them unfit for large-scale deployment involving untrained users outside constrained laboratory environments. Consequently, the scale of collaborative multi-user experiments has not yet exceeded a handful of participants. In this paper, we present a system architecture for interactive, infrastructure-independent multi-user AR applications running on off-the-shelf handheld devices. We implemented a four-user interactive game installation as an evaluation setup to encourage playful engagement of participants in a cooperative task. Over the course of five weeks, more than five thousand visitors from a wide range of professional and socio-demographic backgrounds interacted with our system at four different locations.

 

A Handheld Augmented Reality Museum Guide

Authors: Dieter Schmalstieg, Daniel Wagner

Details: Proceedings of IADIS International Conference on Mobile Learning 2005 (ML2005), June 2005

Abstract: We report on an ongoing project with the aim of developing an electronic tour guide for museums based on a self-contained, inexpensive PDA that delivers a fully interactive 3D Augmented Reality (AR) experience to a group of visitors. Previously, it was believed that AR was technically too demanding to provide services for museum visitors. Clearly, regular PCs or notebooks are not feasible for everyday use, and PDAs were believed to be not powerful enough. In the past two years, we have developed a platform for AR on unmodified commercial PocketPC devices, which performs real-time optical fiducial tracking and 3D graphics video overlays at interactive frame rates (typically ~20Hz overall). No server infrastructure is required, except for mediating wireless communication among a group of users and for database access.

 

2003


 

Augmented Reality Kanji Learning

Authors: Daniel Wagner, István Barakonyi

Details: Proceedings of the 2nd IEEE/ACM Symposium on Mixed and Augmented Reality (ISMAR 2003), October 2003

Abstract: ARToolKit programmers are familiar with the kanji symbols supplied with the distribution. Most of them do not know what these kanji symbols mean. We propose a piece of educational software that uses collaborative Augmented Reality (AR) to teach users the meaning of kanji symbols. The application is laid out as a two-player Augmented Reality computer game. The novelty of our approach is that we do not use regular workstations or laptops to host the AR application. Instead we use fully autonomous PDAs, running the application together with an optical marker-based tracking module, which makes this application not only available to a broad audience but also fully mobile.

 

First Steps Towards Handheld Augmented Reality

Authors: Daniel Wagner, Dieter Schmalstieg

Details: Proceedings of the 7th International Conference on Wearable Computers (ISWC 2003), October 2003

Abstract: In this paper we describe the first stand-alone Augmented Reality (AR) system with self-tracking running on an unmodified personal digital assistant (PDA) with a commercial camera. The project exploits the ready availability of consumer devices with a minimal need for infrastructure. The application provides the user with a three-dimensional augmented view of the environment. Our system achieves good overlay registration accuracy by using a popular marker-based tracking toolkit (ARToolKit), which runs directly on the PDA. We introduce an optional client/server architecture that is based on wireless networking and is able to dynamically and transparently offload the tracking task in order to provide better performance in select areas. The hardware/software framework is modular and can be easily combined with many elements of an existing AR framework. As a demonstration of its effectiveness, we present a 3D navigation application that guides a user through an unknown building to a chosen location.
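The dynamic offloading policy described in the abstract above can be sketched as a simple decision rule. The paper's actual policy and thresholds are not reproduced here; the function name, parameters and target frame rate below are illustrative assumptions:

```python
def choose_tracker(local_ms_per_frame, server_reachable, target_fps=10):
    """Pick where marker tracking runs for the next frames.

    Offload to the server only when the handheld cannot sustain the
    target frame rate on its own AND a server is reachable over the
    wireless network; otherwise stay self-contained on the PDA.
    """
    budget_ms = 1000.0 / target_fps
    if server_reachable and local_ms_per_frame > budget_ms:
        return "server"
    return "local"
```

The key property this captures is graceful degradation: losing the network connection never breaks tracking, it only reduces performance to what the device achieves locally.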

 

ARToolKit on the PocketPC Platform

Authors: Daniel Wagner, Dieter Schmalstieg

Details: Augmented Reality Toolkit Workshop, October 2003

Abstract: In this paper we describe the port of ARToolKit onto the PocketPC platform including optimizations that led to a three-fold speedup over the native cross-compiled version. The ported ARToolKit module was successfully used in the Handheld AR project.
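A major source of the speedup on PocketPC devices was avoiding floating-point arithmetic, since the CPUs of that era had no FPU and emulated it in software. The sketch below shows the general 16.16 fixed-point technique; ARToolKit's actual macros and chosen precision are not reproduced here, so treat the names and shift width as illustrative assumptions:

```python
# 16.16 fixed-point helpers: 16 integer bits, 16 fractional bits.
FX_SHIFT = 16

def to_fx(x):
    # Convert a float to fixed-point representation.
    return int(round(x * (1 << FX_SHIFT)))

def fx_mul(a, b):
    # Multiply two fixed-point numbers; the product carries twice the
    # fractional bits, so shift once to restore the 16.16 format.
    return (a * b) >> FX_SHIFT

def to_float(a):
    # Convert back to float (for display only; the hot loop stays integer).
    return a / (1 << FX_SHIFT)
```

On an FPU-less ARM core, such integer-only inner loops for image sampling and pose refinement are where a several-fold speedup over emulated floating point typically comes from.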

 

 

 

website maintained by Tobias Langlotz
last updated on 2011-08-02

copyright (c) 2011 Graz University of Technology