Purdue University Graduate School
PhD_final_thesis_Yuanzhi_9-21.pdf (16.98 MB)

Exploring Novel Human Smart-thing Interaction through Augmented Reality Framework Design

thesis
posted on 2020-12-16, 15:25, authored by Yuanzhi Cao
We have never felt so connected with our surrounding social and physical environment, thanks to increasingly ubiquitous mobile computing devices and rapidly developing high-speed networks. These technologies transform everyday objects into smart-things and give us access to a large amount of digital information and intelligence closely tied to physical reality. To bridge the gap between the digital interface and the physical smart-thing, Augmented Reality (AR) has become a promising medium that allows users to visually link digital content to its physical target with spatial and contextual awareness. Thanks to vast improvements in personal computing devices, AR technologies are emerging in popular real-world scenarios, empowered by commercially available software development kits (SDKs) and hardware platforms, which makes it easier for human users to interact with the surrounding smart-things.

Within the scope of this thesis, we are interested in smart-things that can physically interact with the real world, such as machines, robots, and IoT devices. Our overarching goal is to create a better experience for users interacting with these smart-things, one that is visual, spatial, contextual, and embodied, and we pursue this goal through novel augmented reality system workflow and framework design.

This thesis is based on our four published conference papers (Ani-Bot, V.Ra, GhostAR, and the AvaTutAR study), which are described in Chapters 3-6, respectively. On a broader level, the work in this thesis focuses on exploring spatially situated visual programming techniques for human smart-thing interaction. In particular, we combine the contextual awareness of the AR environment with the interactivity of physical smart-things. We explore (1) spatial and visual input techniques and modalities that let users intuitively interact with physical smart-things through interaction and interface design, and (2) the ecology of human smart-thing interaction through system workflow design that leverages the contextual awareness provided by the AR interface. In this thesis, we mainly study the following spatially aware AR interactions with our completed work: (i) Ani-Bot demonstrates Mixed-Reality (MR) interaction for tangible modular robotics through a Head-Mounted Device (HMD) with mid-air gestures, (ii) V.Ra describes spatially situated visual programming for Robot-IoT task planning, (iii) GhostAR presents a time-space editor for Human-Robot Collaborative (HRC) task authoring, and (iv) the AvaTutAR study presents an exploratory study that provides valuable design guidance for future AR avatar-based tutoring systems.
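To make the idea of a spatially situated Robot-IoT task plan concrete, the sketch below shows one hypothetical way such a plan could be represented: an ordered sequence of navigation waypoints in the shared AR-SLAM coordinate frame, interleaved with IoT interaction actions. The class names, fields, and device identifiers are illustrative assumptions for this sketch, not V.Ra's actual data model or API.

```python
# Hypothetical representation of a spatially situated Robot-IoT task plan.
# All names and devices here are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple, Union


@dataclass
class Navigate:
    waypoint: Tuple[float, float, float]  # target position in the AR-SLAM map frame


@dataclass
class IoTAction:
    device_id: str   # e.g. "lamp-01", an IoT endpoint registered in the AR scene
    command: str     # e.g. "turn_on", "open", "start"


TaskPlan = List[Union[Navigate, IoTAction]]

# Example plan a user might author visually in AR and then hand to a mobile robot:
plan: TaskPlan = [
    Navigate(waypoint=(1.2, 0.0, 3.4)),
    IoTAction(device_id="lamp-01", command="turn_on"),
    Navigate(waypoint=(4.0, 0.0, 1.1)),
    IoTAction(device_id="vacuum-02", command="start"),
]
```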

We further develop the enabling techniques, including: a modular robotics kit with assembly awareness and the corresponding MR features for the major phases of its lifecycle; a lightweight and coherent ecosystem design that enables spatial and visual programming as well as IoT interaction and navigation task execution with a single AR-SLAM mobile device; and a novel HRC task authoring workflow that uses robot programming by human demonstration within the AR scene, with avatar reference and motion mapping via dynamic time warping (DTW). Overall, we design system workflows and develop applications that increase the flexibility of AR content manipulation, creation, and authoring, and that support intuitive, visual, and pervasive interaction with the smart environment.
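The motion-mapping step above relies on dynamic time warping (DTW) to align a recorded human demonstration with the motion that a robot or avatar replays. The following is a minimal, generic DTW alignment sketch in Python; the trajectory shapes, the Euclidean distance metric, and the function name are illustrative assumptions rather than the thesis's implementation.

```python
# Minimal dynamic time warping (DTW) sketch for aligning two motion trajectories,
# e.g. a human demonstration and a replayed avatar/robot motion (illustrative only).
import numpy as np


def dtw_align(demo, replay):
    """Return the accumulated cost matrix and the optimal alignment path
    between two trajectories of shape (n, d) and (m, d)."""
    n, m = len(demo), len(replay)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(demo[i - 1] - replay[j - 1])  # pointwise distance
            cost[i, j] = d + min(cost[i - 1, j],       # insertion
                                 cost[i, j - 1],       # deletion
                                 cost[i - 1, j - 1])   # match
    # Backtrack from (n, m) to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return cost[1:, 1:], path[::-1]


# Example: align two hand trajectories sampled at different rates.
demo = np.column_stack([np.linspace(0, 1, 50), np.sin(np.linspace(0, 3, 50))])
replay = np.column_stack([np.linspace(0, 1, 80), np.sin(np.linspace(0, 3, 80))])
_, alignment = dtw_align(demo, replay)
```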

Based on our completed projects, I conclude this thesis by summarizing the overall contributions of my Ph.D. work and briefly offering my humble vision for the future of AR.

Funding

FW-HTF: Collaborative Research: Pre-Skilling Workers, Understanding Labor Force Implications and Designing Future Factory Human-Robot Workflows Using a Physical Simulation Platform

Directorate for Education & Human Resources

NRI: Towards Dexterous Micromanipulation and Assembly

Directorate for Computer & Information Science & Engineering

PFI:BIC MAKERPAD: Cognitively Intuitive Shape-Modeling and Design Interface enabling a Distributed Personalized Fabrication Network.

Directorate for Technology, Innovation and Partnerships

Convergence Accelerator Phase I (RAISE): Skill-LeARn: Affordable Augmented Reality Platform for Scaling Up Manufacturing Workforce, Skilling, a

Directorate for Technology, Innovation and Partnerships

EAGER/Collaborative Research/Cyber Manufacturing: Cyber-Manufacturing Systems for Open Product Realization

Directorate for Engineering

History

Degree Type

  • Doctor of Philosophy

Department

  • Mechanical Engineering

Campus location

  • West Lafayette

Advisor/Supervisor/Committee Chair

Karthik Ramani

Additional Committee Member 2

David Cappelleri

Additional Committee Member 3

Alex Quinn

Additional Committee Member 4

Shreyas Sen
