This paper presents experimental results for a mobile mixed reality interface designed for the geovisualization of realistic 3D urban environments. The interface allows dynamic switching between three visualization domains (a virtual reality, an augmented reality, and a mixed reality interface) to obtain the best possible representation for visual exploration. In each domain, four types of geovisualization and navigation aids can be superimposed, including geo-referenced 3D maps, 2D digital maps, spatial 3D sound, and 3D/2D textual annotations. Interaction is performed using keyboard, mouse, menus, and tangible interfaces. To gather user requirements for urban and virtual navigation and to assess the effectiveness of the mobile interface, a two-stage evaluation was performed.
Aslib Proceedings, 2007
The motivation for this research is the emergence of mobile information systems where information is disseminated to mobile individuals via handheld devices. A key distinction between mobile and desktop computing is the significance of the relationship between the spatial location of an individual and the spatial location associated with information accessed by that individual. Given a set of spatially referenced documents retrieved from a mobile information system, this set can be presented using alternative interfaces of which two presently dominate: textual lists and graphical two-dimensional maps. The purpose of this paper is to explore how mixed reality interfaces can be used for the presentation of information on mobile devices.
ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences
In this paper, we assume that augmented reality (AR) and mixed reality (MR) are relevant contexts for 3D urban geovisualization, especially to support the design of urban spaces. We propose an in situ MR application that could help urban designers by providing tools to interactively remove or replace buildings in situ. This use case requires advances over existing geovisualization methods. We highlight the need to adapt and extend existing 3D geovisualization pipelines to meet the specific requirements of AR/MR applications, in particular for data rendering and interaction. To reach this goal, we focus on and implement four elementary in situ and ex situ AR/MR experiments. Each experiment addresses a specific subproblem, i.e. scale modification, pose estimation, matching the realism of the scene and the urban project, and mixing real and virtual elements through portals, while proposing occlusion handling, rendering, and interaction techniques to solve it.
ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
This paper addresses the usability of map displays for finding given POI addresses in a popular urban area. LOD1 3D representations of city buildings are presented in a 2.5D map for a pedestrian navigation test. This 3D map display is evaluated against a familiar 2D map system on the test participants' smartphones; 16 participants took part in the field test. A typical walking model of a search task, focused solely on finding a given building address, was chosen as the wayfinding model for the field test. Three navigation processes, i.e. self-orientation, spatial knowledge acquisition, and navigation decisions during the search task, were evaluated for each participant. Usability measures of the 3D map-based display versus the 2D map-based display for pedestrian navigation were collected from the participants' mobile devices. In addition, participants' activity, in terms of acceleration and orientation data, was used to support the analysis of their patterns and trends. As the test app is also intended to support smart city applications, its ability to let users report complaints was also assessed. Most participants agreed with the statements in the questionnaire, which was organized into three sections covering participants' interaction, their responses during the navigation processes, and crowdsensing. The results suggest that 3D map-based pedestrian navigation is more usable for finding a given building address in the central tourist area of a city.
Technology|Architecture + Design, 2019
As a consequence of inexpensive sensors, ubiquitous tracking devices, and abundant digital storage, by 2019 smarter cities have become platforms for increasing levels of operational monitoring and data capture. Various government agencies and private interests have generated a tremendous database of urban content, which, despite its availability, remains comprehensively inaccessible. How can new emerging MR technologies relate urban big data back to the physical and spatial conditions of the cities to which they relate? This paper examines data-integrated virtual urban environments, identifies their audience, determines their mechatronic configurations, and reviews strengths and limitations of use before presenting an experimental MR technology interface for translating urban data into accessible information.
ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Three-dimensional (3D) visualisations of geospatial data have become very popular in recent years, and various applications and tools are based on interactive 3D geovisualisations. However, the user aspects of these 3D geovisualisations are not yet fully understood. While several studies have examined how users work with 3D geovisualisations, only a few focus directly on interactive 3D geovisualisations and employ usability research methods such as screen logging. This method enables the objective recording of movement in 3D virtual environments and of user interactions in general. We therefore created a web-based research tool, the 3D Movement and Interaction Recorder (3DmoveR), based on the user-logging method combined with a digital questionnaire and practical spatial tasks. The design and implementation of this tool follow the spiral model, and its current version is 2.0. It is implemented using open web technologies such as PHP, JavaScript, and the Three.js library. After building the tool, we verified it through load testing and a simple pilot test of its accessibility. We then describe the first deployment of 3DmoveR 2.0 in a real user study. Future modifications and applications of 3DmoveR 2.0 are discussed in the conclusion, with attention paid to future deployment in user testing outside controlled (laboratory) conditions.
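The screen-logging method described above amounts to timestamping the camera pose each frame and serializing the trajectory for later analysis. A minimal sketch of such a logger follows; all names here (PoseLogger, the sample fields) are illustrative assumptions, not 3DmoveR's actual API.

```javascript
// Minimal camera-pose logger in the spirit of the screen-logging method.
// In a real Three.js app, record() would be called in the render loop with
// camera.position and camera.rotation each frame.
class PoseLogger {
  constructor() {
    this.samples = [];
  }

  // Record one sample: position as [x, y, z], orientation as yaw/pitch
  // in degrees, timestamp in milliseconds since the session started.
  record(position, orientation, timestamp) {
    this.samples.push({
      t: timestamp,
      position: position.slice(),      // copy so later mutation cannot alter the log
      orientation: { ...orientation },
    });
  }

  // Serialize the trajectory, e.g. for POSTing to a server-side (PHP) endpoint.
  toJSON() {
    return JSON.stringify(this.samples);
  }
}

// Usage: two frames of a user moving slightly forward while turning.
const poseLog = new PoseLogger();
poseLog.record([0, 1.6, 5.0], { yaw: 0, pitch: -10 }, 0);
poseLog.record([0.2, 1.6, 4.8], { yaw: 5, pitch: -10 }, 16);
console.log(poseLog.samples.length); // 2
```

The per-sample copy matters: logging a live reference to the camera's position vector would make every stored sample point at the same, constantly mutating object.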
Ninth International …, 2005
This paper presents the first prototype of an interactive visualisation framework specifically designed for presenting geographical information in both indoor and outdoor environments. The input to our system consists of ESRI Shapefiles representing 3D building geometry and land-use attributes. Participants can visualise real-time 3D reconstructions of geographical information through two visualisation clients: a mobile VR interface and a tangible AR interface. To demonstrate the functionality of our system, an educational application designed for university students is illustrated with some initial results. Finally, our conclusions as well as future work are presented.
Proceedings of the 1st …, 2006
In this paper we propose a specific mobile system architecture for navigation in urban environments. The aim of this work is to evaluate how virtual and augmented reality interfaces can provide location- and orientation-based services using different technologies. The virtual reality interface relies entirely on sensors to detect the location and orientation of the user, while the augmented reality interface uses computer vision techniques to capture patterns from the real environment. The knowledge obtained from evaluating the virtual reality experience has been incorporated into the augmented reality interface. Some initial results from our experimental augmented reality navigation are presented.
… : Teleoperators & Virtual …, 2002
In this paper we describe two explorations in the use of hybrid user interfaces for collaborative geographic data visualization. Our first interface combines three technologies: Augmented Reality (AR), immersive Virtual Reality, and computer-vision-based hand and object tracking. Wearing a lightweight display with an attached camera, users can look at a real map and see three-dimensional virtual terrain models overlaid on it. From this AR interface they can fly in and experience the model immersively, or use free-hand gestures or physical markers to change the data representation. Building on this work, our second interface explores alternative interaction techniques, including a zoomable user interface, paddle interactions, and pen annotations. We describe the system hardware and software, and the implications for GIS and spatial science applications.
Journal of Virtual Reality …, 2006
In this paper, we propose a specific mobile-device-based system architecture for navigation in urban environments. The aim of this work is to assess how virtual and augmented reality interface paradigms can provide enhanced location-based services using real-time techniques in the context of these two different technologies. The virtual reality interface is based on a faithful graphical representation of the localities of interest, coupled with sensory information on the user's location and orientation, while the augmented reality interface uses computer vision techniques to capture patterns from the real environment and overlay additional way-finding information, aligned with the real imagery, in real time. The knowledge obtained from evaluating the virtual reality navigational experience has informed the design of the augmented reality interface. Initial results from user testing of the experimental augmented reality navigation system are presented.
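A core computation behind sensor-based way-finding interfaces like those described above is turning a GPS fix, a target location, and the device heading into the angle of a guidance arrow. The sketch below shows this calculation under stated assumptions; the function names are illustrative, not taken from the papers' systems.

```javascript
// Compute where a way-finding arrow should point, given a GPS fix for the
// user, a target location, and the device compass heading (all in degrees).
function toRadians(deg) { return deg * Math.PI / 180; }
function toDegrees(rad) { return rad * 180 / Math.PI; }

// Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
// normalized to 0..360 degrees (0 = north, 90 = east).
function bearingTo(lat1, lon1, lat2, lon2) {
  const phi1 = toRadians(lat1);
  const phi2 = toRadians(lat2);
  const dLambda = toRadians(lon2 - lon1);
  const y = Math.sin(dLambda) * Math.cos(phi2);
  const x = Math.cos(phi1) * Math.sin(phi2) -
            Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLambda);
  return (toDegrees(Math.atan2(y, x)) + 360) % 360;
}

// Signed turn angle in -180..180: how far the user should rotate from the
// current device heading to face the target. This is the arrow angle.
function turnAngle(deviceHeading, targetBearing) {
  return ((targetBearing - deviceHeading + 540) % 360) - 180;
}

// Usage: a target due east of the user, with the device facing north,
// yields a 90-degree right turn.
console.log(turnAngle(0, bearingTo(0, 0, 0, 1)));
```

In a live interface the inputs would come from the Geolocation API and the device orientation sensors; the same turn angle can drive either a 2D arrow or an AR overlay aligned with the camera image.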
Cartographic Perspectives
Various widely available applications such as Google Earth have made interactive 3D visualizations of spatial data popular. While several studies have focused on how users perform when interacting with these 3D visualizations, it has not been common to record their virtual movements in 3D environments or their interactions with 3D maps. We therefore created and tested a new web-based research tool: a 3D Movement and Interaction Recorder (3DmoveR). Its design incorporates findings from the latest 3D visualization research and is built upon an iterative requirements analysis. It is implemented using open web technologies such as PHP, JavaScript, and the X3DOM library. The main goal of the tool is to record camera position and orientation during a user's movement within a virtual 3D scene, together with other aspects of their interaction. After building the tool, we performed an experiment to demonstrate its capabilities. This experiment revealed differences between laypersons and expe...