XR
XR+VR ASL Learning Platform
Beesper: A Virtual Reality Sign Language Learning Adventure
Beesper is a VR application designed to teach American Sign Language (ASL) in an interactive and engaging way, providing real-time feedback that makes the learning process enjoyable and effective. Traditional methods of learning ASL can be dull and lack interaction; Beesper bridges that gap by creating a virtual space where learners practice ASL through games and interactive experiences. Beesper is for anyone who wants to learn ASL, whether you're a beginner or looking to improve your skills, and is designed to make learning accessible and fun for all. Winner of "Best Use of Ultraleap Hand Tracking" at MIT Reality Hack 2024.
XR+VR Animal-Perspective Journey through Human Impacts
In Their Paws
“In their Paws" transports you to 2035, where is ravaged by climate change and human encroachment. Stepping into the "paws" of a red panda, you’ll experience immersively and raise awareness about the critical challenges faced by endangered animals and inspire actionable change toward sustainability. By leveraging cutting-edge mixed reality (MR) and virtual reality (VR) technologies, the project immerses users in the lives of these animals, helping them experience the impact of environmental degradation, habitat loss, and human activities from a first-hand perspective. This experience seeks to foster empathy and understanding, encouraging users to reflect on their role in promoting sustainability and conservation. By bridging technology and environmental education, the project highlights the importance of preserving biodiversity and underscores the interconnectedness of all living beings and ecosystems.
VR
Procedural Nature Simulation
The River Story
The River Story deconstructs and reconstructs the life of a river, simulated in Unreal Engine based on research into theories of land formation, plant growth, and other natural development processes. To tell this story in visual language, the project studies camera performance to help viewers experience maximum tension and narration within the virtual environment. The resulting film traces the life of a river from a primitive snow-mountain environment, through human settlements and a densely occupied urban context, to its final "death" and simulation within a bio-lab environment, which begins another cycle of life in the virtual built environment.
VR Storytelling
Immersive Classroom: Yingxian Pagoda
This VR project, proposed for Harvard FAS CAMLab, aims to bring audiences into the historical scenes of the Yingxian Pagoda, which was built under Emperor Daozong of the Liao dynasty and is the largest well-preserved ancient wooden tower in China, and to illustrate the stories behind each masterpiece within its murals, statues, and structure.
AR
Augmented Environmental Cognition
VivariSense:
Tailored Environmental Perception
VivariSense helps users create a tailored environmental sensory experience in their daily lives. The project uses augmented reality and multi-sensory modulation via a wearable system to give users control over their environment at the point of input. The device allows users to filter out undesirable stimuli with a click, log preferences for future recall, and provide alternative feedback for a positively curated experience.
Spatial Memory of Olfaction
Olfactory Odyssey: Visualization of smell in interior environments
Vision dominates the other senses in how we perceive the world; yet much of the richness of architecture comes from the multi-sensory space we experience, the emotions it echoes, and the ways it evokes imagination. As the least considered sense in architecture, smell can integrate our perception of the world, time, and self. How can olfactory experience help us become physically, emotionally, and psychologically aware of our presence in the world? Does smell mediate our relationship with space? This project aims to visualize smells through experiments and quantitative calculations, helping articulate users' scent experience on an AR platform.
