Research
We are a human-computer interaction research lab contributing new methods, tools, and technologies for creating novel interactive systems.
We have a history of building and studying next-generation user interfaces involving touch, gesture, speech, and multi-modal and multi-device interactions, and now virtual, augmented, and mixed reality. When studying these interfaces, our focus is on improving both designers’ and end-users’ experience through novel ways of creating and using them.
We thank our sponsors for supporting the research below.
Current Research Topics
Our research group is currently pursuing four major research themes:
- Future of Work and Collaboration in XR: In our latest systems and experiments, we study how to best enable collaboration in mixed reality with AI agents, and explore the benefits and limitations of using LLMs to guide, facilitate, and enrich the conversation and communication among human users.
- Democratizing XR development: In a series of novel systems, we explore different ways of enabling designers and domain experts to create new AR/VR experiences without significant training in 3D programming, animation, or modeling.
- Instructional XR experiences: As part of our involvement in the U-M-wide XR initiative, we investigate new techniques and tools for creating immersive instructional materials that use XR technology to facilitate training and learning.
- Safe and accessible XR applications: One of the main challenges with innovating in the XR space is balancing usability, accessibility, and safety. We develop new methods and tools that help designers form a critical mindset and understand their increased responsibility when designing spatial user experiences, making XR more accessible and safe at every design stage rather than as an afterthought.
Recent Work
Here, we summarize recent contributions to ACM CHI, the premier conference in the human-computer interaction field:
- Enhancing lecture materials with immersive content: With Paper Trail, we designed and studied an immersive authoring system that allows instructors to create AR-powered lecture materials from paper-based resources, enhancing them with virtual and interactive content in AR so that students can better visualize and understand complex lecture topics. We designed Paper Trail to support both hand-held and head-worn AR; our studies with high-school instructors showed the benefits of using hand-held AR to create and capture immersive lecture materials and head-worn AR to visualize and interact with them.
- Virtual production of immersive lectures: With XRStudio, we created a virtual production and live streaming system for immersive instructional experiences. XRStudio equips instructors interested in using VR for teaching with the tools and infrastructure to record or live stream lectures in VR. Students can access these VR lectures with or without VR devices using different output channels, including mixed reality capture, 3D desktop, mobile AR, and fully immersive VR.
- Mixed reality user studies: With MRAT, we created the Mixed Reality Analytics Toolkit, designed to support user evaluations on AR/VR devices. We elicited a diverse set of requirements and presented general concepts and techniques to collect, pre-process, and visualize mixed reality user data.
- Collaborative immersive authoring: With XRDirector, we created a role-based, multi-user immersive authoring system that adapts roles familiar from filmmaking to coordinate multiple AR/VR designers, and we characterized the issues related to spatial coordination.
- Key barriers to entry for novice XR creators: We conducted an interview study with 21 participants, elaborating on their practices and challenges in using current XR technologies and tools, and explored opportunities for better end-user programming support.
Earlier Research
Over its first three years at U-M, the lab worked on cross-device interfaces, multi-modal interaction, and AR/VR. Our systems research contributions include ProtoAR [CHI’18] and 360proto [CHI’19] for rapid prototyping of AR/VR interfaces using cross-device authoring, 360Anywhere [EICS’18] for 360-video-based collaboration using mixed reality, and GestureWiz [CHI’18] for Wizard of Oz and crowdsourcing approaches to gesture design and recognition tasks. Using the knowledge gained from building these systems, the lab has also developed conceptual frameworks and new ways of thinking about the design of future interfaces through contributions such as What is Mixed Reality? [CHI’19 Best Paper Honorable Mention], The Trouble with AR/VR Authoring Tools [ISMAR’18 Adj.], and Playing the Tricky Game of Toolkits Research [CHI’17 HCI.Tools Workshop], as well as studies on creating cross-device AR experiences resulting in XD-AR [EICS’18 Best Paper Award] and user-driven design principles for gesture representations [CHI’18].
We keep an archive of our previous research here.
Major Publications (since starting the lab at U-M in 2016)
For the full list of publications, please check Google Scholar and DBLP.
- Paper Trail: An Immersive Authoring System for Augmented Reality Instructional Experiences. CHI’22.
  S. Rajaram, M. Nebeling
- XRStudio: A Virtual Production and Live Streaming System for Immersive Instructional Experiences. CHI’21.
  M. Nebeling, S. Rajaram, L. Wu, Y. Cheng, J. Herskovitz
- XRDirector: A Role-Based Collaborative Immersive Authoring System. CHI’20.
  M. Nebeling, K. Lewis, Y-C. Chang, L. Zhu, M. Chung, P. Wang, J. Nebeling
- MRAT: The Mixed Reality Analytics Toolkit. CHI’20. Best Paper Award.
  M. Nebeling, M. Speicher, X. Wang, S. Rajaram, B.D. Hall, Z. Xie, A.R.E. Raistrick, M. Aebersold, E.G. Happ, J. Wang, Y. Sun, L. Zhang, L. Ramsier, R. Kulkarni
- Creating Augmented and Virtual Reality Applications: Current Practices, Challenges, and Opportunities. CHI’20. Best Paper Award.
  N. Ashtari, A. Bunt, J. McGrenere, M. Nebeling, P.K. Chilana
- iGYM: An Interactive Floor Projection System for Inclusive Exergame Environments. CHI PLAY’19. Best Paper Award.
  R. Graf, P. Benawri, A.E. Whitesall, D. Carichner, Z. Li, M. Nebeling, H.S. Kim
- What is Mixed Reality? CHI’19. Best Paper Honorable Mention.
  M. Speicher, B.D. Hall, M. Nebeling
- 360proto: Making Interactive Virtual Reality & Augmented Reality Prototypes from Paper. CHI’19.
  M. Nebeling, K. Madier
- ProtoAR: Rapid Physical-Digital Prototyping of Mobile Augmented Reality Applications. CHI’18.
  M. Nebeling, J. Nebeling, A. Yu, R. Rumble
- GestureWiz: A Human-Powered Gesture Design Environment for User Interface Prototypes. CHI’18.
  M. Speicher, M. Nebeling
- User-Driven Design Principles for Gesture Representations. CHI’18.
  E. McAweeney, H. Zhang, M. Nebeling
- XD-AR: Challenges and Opportunities in Cross-Device Augmented Reality Application Development. EICS’18. Best Paper Award.
  M. Speicher, B.D. Hall, A. Yu, B. Zhang, H. Zhang, J. Nebeling, M. Nebeling
- 360Anywhere: Mobile Ad-hoc Collaboration in Any Environment using 360 Video and Augmented Reality. EICS’18.
  M. Speicher, J. Cao, A. Yu, H. Zhang, M. Nebeling
- XDBrowser 2.0: Semi-Automatic Generation of Cross-Device Interfaces. CHI’17.
  M. Nebeling