XRTC Leadership
Fernando De la Torre
Associate Research Professor
Robotics Institute | School of Computer Science
Research interests: computer vision and machine learning. In particular, his work focuses on applications to human health, augmented reality, and virtual reality, and on methods that focus on the data (not the model).
Prof. De la Torre directs the Human Sensing Laboratory.
Kris Kitani
Associate Research Professor
Robotics Institute | School of Computer Science
Research interests: computer vision, machine learning and human-computer interaction. In particular, his research interests lie at the intersection of first-person vision, human activity modeling and inverse reinforcement learning.
Prof. Kitani directs the Cognitive Assistance Lab.
David Lindlbauer
Assistant Professor
Human-Computer Interaction Institute | School of Computer Science
Research interests: understanding how humans perceive and interact with digital information, and building technology that goes beyond flat displays to advance our capabilities when interacting with the virtual world. To achieve this, he creates and studies enabling technologies and computational approaches that control when, where, and how virtual content is displayed to increase the usability of Mixed Reality interfaces.
Prof. Lindlbauer directs the Augmented Perception Lab.
Associated Faculty
Using VR for language and cultural learning as well as storytelling and content production in XR.
Justin Chan's research focuses on building intelligent mobile and embedded systems for computational health and large-scale environmental sensing.
Research focuses on immersive environments and interactive techniques, performance capture, and expanded animation.
Develops practical and deployable XR experiences, with a focus on helping users take care of their health and on human-AI interaction.
Leveraging motion capture for movement analysis and biomechanics.
Creates new and delightful sensing and interface technologies for human-computer interactions leveraging smart environments, wearable computing, mixed reality, and gestural interfaces.
Culturally competent design of XR spaces and artifacts, including immersive environments for medical skills assessment and other educational interventions.
Using VR, in both single-user and multi-user experiences, to improve public speaking and communication in the workplace.
Develops new adaptive materials for wearable haptic feedback and dynamic interfaces for increased immersion.
Research involves reconstructing and modeling dynamic environments and behaviors using both static and wearable devices, with the goal of developing neural representations that can accurately predict human behavior, thereby enhancing interactions in virtual and augmented reality settings.
Instrumenting environments for high-end location-based XR experiences.
Research on interaction techniques that are applicable in 3D, as well as on tools that help application developers be more efficient and productive.
Building creative tech platforms for spatial computing design.
Through humor and play, Nica creates participatory video installations and games that reveal and challenge social constructions that are reinforced by technology and performance.
Anthony Rowe's research interests are in networked real-time embedded systems with a focus on wireless communication.
Aswin Sankaranarayanan's research deals with understanding the interaction of light with materials, devising theories and imaging architectures to capture these interactions, and developing a deeper understanding of the world around us based on these interactions.
Develops approaches for high-fidelity reconstruction of environments and objects using 3D Gaussians.
Exploring system and network support for XR. This includes developing systems that can scalably capture 3D scenes in real time, designing protocols for streaming and encoding 3D content on the Internet, and developing techniques to efficiently render 3D content on viewing devices with limited computational resources.
Work in the XR space revolves around helping students (A) learn Critical Literacies for XR and (B) understand the ethical and humanistic concerns of using XR as a tool for education.
Developing and researching mixed-reality AI systems bridging physical and virtual worlds to improve children's inquiry-based STEM learning.
Dina El-Zanfaly is a computational design and interaction researcher and an Assistant Professor in the School of Design at Carnegie Mellon University (CMU). She directs hyperSENSE: Embodied Computations Lab, a research lab she recently founded.
Jun-Yan's research focuses on computer vision, graphics, computational photography, and generative models. His lab studies the collaboration between human creators and generative models, with the goal of building intelligent machines capable of helping everyone tell their visual stories.