Posted on 2019-06-10, 16:26. Authored by Ke Huo.
With the rapid proliferation of mobile computing devices and high-speed networks, large amounts of digital information and intelligence from the surrounding environment have been introduced into our everyday lives. However, much of this context and content remains textual and two-dimensional. To access digital content spontaneously, augmented reality (AR) has become a promising surrogate for bridging the physical with the digital world. Thanks to vast improvements in personal computing devices, AR technologies are emerging in realistic scenarios. Commercially available software development kits (SDKs) and hardware platforms have begun bringing AR applications to a large population.
At a broader level, this thesis focuses on investigating suitable interaction metaphors for evolving AR. In particular, this work leverages the spatial awareness of AR environments to enable spatially-aware interactions. This work explores (i) spatial inputs around AR devices using the local spatial relationship between the AR devices and the scene, and (ii) spatial interactions within the surrounding environment exploiting the global spatial relationships among multiple users, as well as between the users and the environment. In this work, I mainly study four spatially-aware AR interactions: (i) 3D tangible interactions, by directly mapping input to the continuous and discrete volume around the device; (ii) 2D touch input in a 3D context, by projecting the screen input into the real world; (iii) location-aware interactions, which use the locations of real and virtual objects in the AR scene as spatial references; and (iv) collaborative interactions referring to a commonly shared AR scene. This work further develops the enabling techniques, including magnetic-sensing-based 3D tracking of tangible devices relative to a handheld AR device, a projection-based 3D sketching technique for in-situ AR content creation, a localization method for spatially mapping smart devices into the AR scene, and a registration approach for resolving the transformations between multiple SLAM AR devices. Moreover, I build systems enabling pervasive AR experiences. Primarily, I develop applications for increasing the flexibility of AR content manipulation, creation, and authoring; intuitively interacting with the smart environment; and spontaneously collaborating within a co-located AR scene.
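The registration of multiple SLAM AR devices into a shared coordinate frame can be illustrated with a minimal sketch. Assuming the devices observe a common set of corresponding 3D anchor points, each expressed in its own SLAM frame, the rigid transform between the two frames can be recovered with the standard Kabsch/Umeyama method (the function name and point-correspondence setup below are illustrative assumptions, not the thesis's actual pipeline):

```python
import numpy as np

def register_frames(points_a, points_b):
    """Estimate the rigid transform (R, t) that maps points expressed
    in device A's SLAM frame onto the same points in device B's frame,
    given N corresponding 3D observations (Kabsch/Umeyama method)."""
    A = np.asarray(points_a, dtype=float)   # N x 3, frame A
    B = np.asarray(points_b, dtype=float)   # N x 3, frame B
    ca, cb = A.mean(axis=0), B.mean(axis=0)  # centroids
    H = (A - ca).T @ (B - cb)                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal rotation
    t = cb - R @ ca                          # translation: b = R a + t
    return R, t
```

In practice, such correspondences would come from shared visual anchors or fiducials, and a robust estimator (e.g., RANSAC over this solver) would reject outlier matches before the final fit.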
The main body of this research has contributed to multiple ongoing collaborative projects. I briefly discuss the key results and visions from these projects, including (i) autonomous robotic exploration and mapping of smart environments, where the spatial relationship between the robot and the smart devices is resolved, and (ii) human-robot interaction in AR, where spatial intelligence can be seamlessly exchanged between the human and the robot. Further, I suggest future research projects leveraging three critical features of AR, namely situatedness, mobility, and the capability to support spatial collaboration.