Immersive experiences that bridge physical and digital worlds.
This project leverages cutting-edge virtual reality technology to create an immersive training environment while advancing our understanding of human cognition. By combining the Varjo VR-3 headset's high-precision eye tracking capabilities with Shimmer3 GSR sensors, we aim to measure and adapt to users' cognitive load in real time. The training module is designed to replicate complex real-world tasks in a controlled VR environment, allowing for detailed analysis of user performance and mental workload. The ultimate goal is to develop an adaptive training system that optimizes learning outcomes by dynamically adjusting the VR environment based on the user’s cognitive state.
A machine learning system that detects cognitive load in virtual reality training environments by analyzing pupil dilation and fixation duration. Tested on 22 participants learning cold spray manufacturing in VR, achieving 84% accuracy with Multi-Layer Perceptron models. This research enables adaptive VR training that personalizes difficulty based on real-time mental effort detection.
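To illustrate the kind of signal the classifier consumes, here is a minimal sketch of extracting the two features named above (pupil dilation and fixation duration) from a raw eye-tracking trace. The function names, baseline window, and toy values are illustrative assumptions, not the study's actual pipeline or the Varjo SDK's API.

```python
# Hypothetical feature extraction for cognitive-load classification.
# All names and numbers are illustrative, not the study's real configuration.

def pupil_dilation_change(samples, baseline_window=10):
    """Mean pupil diameter during the task relative to a resting baseline (mm)."""
    baseline = sum(samples[:baseline_window]) / baseline_window
    task = samples[baseline_window:]
    return sum(task) / len(task) - baseline

def mean_fixation_duration(fixations):
    """Average fixation length in milliseconds from (start_ms, end_ms) pairs."""
    durations = [end - start for start, end in fixations]
    return sum(durations) / len(durations)

# Toy trace: pupil widens from ~3.0 mm at rest to ~3.4 mm under load.
pupil = [3.0] * 10 + [3.4] * 20
fix = [(0, 250), (250, 550), (550, 750)]

print(round(pupil_dilation_change(pupil), 2))  # 0.4
print(mean_fixation_duration(fix))             # 250.0
```

In a real pipeline these per-window features would be fed to a classifier such as the Multi-Layer Perceptron mentioned above.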
An immersive art therapy system that transforms your body into a tree, using pupil dilation and biometric sensing to create responsive virtual environments. As you regulate your emotions, birds nest in your branches, and you connect through underground root networks to help other "trees" in a collective healing forest. Inspired by Persian poetry and mycorrhizal networks, the project explores how ancient wisdom about interconnection can guide therapeutic technology design.
The Four-Hand Illusion VR project explores the boundaries of perception and embodiment in virtual environments. By introducing additional virtual limbs, it challenges conventional notions of self-representation. It seeks to deepen our understanding of how the brain integrates sensory inputs to create a sense of bodily ownership. This experience aims to evoke curiosity, provoke emotional responses, and provide a space where users can question their reality and relationship with their own bodies. The project is rooted in the belief that VR can transcend entertainment to become a medium for experimentation, self-reflection, and scientific exploration. It emphasizes the potential of immersive technology to alter the user's sense of self and enable transformative, almost otherworldly experiences.
VR Bicycle is an exergame for studying women with multiple sclerosis (MS) across several factors, including depression and pain perception. In this interdisciplinary project, I am responsible for developing the VR and interface applications. The VR app is an interactive 360-degree video player whose playback speed is regulated by data received from a bicycle via Bluetooth. As a next step, users will be able to grab 3D objects or solve simple calculations during the VR experience, letting us study cognitive improvements in patients.
Teaching Social Media Skills to Middle-aged Women Members of an Entrepreneurship Association.
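The speed-regulation idea above can be sketched as a simple mapping from pedaling cadence to playback rate. The reference cadence and clamping range here are illustrative assumptions, not the project's actual tuning.

```python
# Hypothetical mapping from bicycle cadence (received over Bluetooth)
# to 360-degree video playback rate. Constants are illustrative only.

def playback_speed(cadence_rpm, base_cadence=60.0, min_speed=0.25, max_speed=2.0):
    """Scale playback linearly with cadence, clamped to a safe range."""
    speed = cadence_rpm / base_cadence
    return max(min_speed, min(max_speed, speed))

print(playback_speed(60))   # 1.0 - normal speed at the reference cadence
print(playback_speed(0))    # 0.25 - clamp so the video never fully stops
print(playback_speed(150))  # 2.0 - capped at double speed
```

Clamping keeps the experience comfortable: the scene never freezes when the patient rests, and never races when they sprint.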
Teaching technology to recognize, understand, and respond to human emotions.
This project explores the intersection of art, emotion, and technology by harnessing the power of virtual reality to evoke the profound emotion of awe. Through carefully designed immersive experiences, we aim to inspire creativity and introspection, connecting participants with vast, transcendent stimuli that challenge their ordinary perspectives. We delve into how emotions like awe influence our creativity and emotional well-being by integrating artistic elements with cutting-edge physiological sensing—such as eye-tracking and skin conductance. This work celebrates the artistic potential of VR as a medium for emotional connection and pushes the boundaries of how we understand and model human emotions in the digital age. (A: a VR scene designed to induce awe; B: a participant in the experience; C: the eye-tracking camera inside the headset.)
Where art, technology, and human experience converge—projects that defy singular categories.
This is my graduation animation, currently in final editing ahead of submission to international festivals. The story follows a girl who has lost her entire family and seeks their presence in a tree that has grown up alongside her. In this film, I experimented with multiple techniques, including live-action footage, 2D animation, and stop motion using folded paper.
This is our final stop-motion animation for the Puppet Making course at the Art University of Tehran. My friends and I had about two months to build the character and produce the film. The core idea concerns our perception of the mind: can we consider the mind a useful tool, or can it prevent us from exploring things beyond its reach, such as spiritual matters? P.S.: The poster was designed by my friend Mahboobeh Kalaee. To watch, please follow the link below: https://vimeo.com/user91389957/review/469731753/48f958f696
For this short animation, I was proudly selected to help create a clip for ASIFA's 60th anniversary. The goal was to express the concept of "spiritus mundi," revived by William Butler Yeats; I chose to illustrate the idea of blooming by animating a woman dancing in the historic Sheikh Lotfollah Mosque of Isfahan. Please visit the project at the link: https://youtu.be/TZbFrgb_vK8
This is a personal project inspired by contemporary poetry about homeland and immigration. The poet Mohammad-Reza Shafiei Kadkani describes his wish that, if only we could carry our homeland with us the way violet flowers are carried, we would have it wherever we went. P.S.: In Iran, on the first day of spring, called "Nowruz," many people buy violet flowers to plant in their gardens as a symbol of happiness. Hence, it is a common sight to see violets traveling from one garden to another with their soil in a wooden box.
A hand-drawn illustration of myself, surrounded by my favorite things. Inspired by vector art and created entirely in TVPaint.
Understanding patterns in human interaction through machine learning and neural networks.
Entropy Escape is a VR escape room that watches how you look, not just what you do. We compute gaze entropy from eye tracking to tell when you’re focused, exploring, or stuck, and use that to model your problem-solving behavior.
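The gaze-entropy idea can be sketched as Shannon entropy over the distribution of fixations across areas of interest (AOIs): low when gaze concentrates on a few regions, high when it scans widely. The AOI labels and toy sequences below are illustrative assumptions, not the project's actual implementation.

```python
import math
from collections import Counter

def gaze_entropy(aoi_sequence):
    """Shannon entropy (bits) of the fixation distribution over AOIs:
    low when gaze concentrates on few regions (focused),
    high when it spreads across many (exploring or stuck)."""
    counts = Counter(aoi_sequence)
    total = len(aoi_sequence)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy fixation sequences labeled by the object being looked at.
focused  = ["lock", "lock", "lock", "lock", "key", "lock"]
scanning = ["lock", "door", "desk", "key", "window", "painting"]

print(round(gaze_entropy(focused), 2))   # 0.65 - concentrated gaze
print(round(gaze_entropy(scanning), 2))  # 2.58 - gaze spread over 6 AOIs
```

Tracking this value over a sliding window is one plausible way to flag the transition from focused problem-solving to unproductive scanning.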
Academic contributions and publications.
This is an extended abstract presented at the UMAP conference.
Detecting cognitive load based on pupil dilation.
UIST conference: machine learning approaches to personalize interactive systems based on user behavior...
IDC paper on designing a game for children with speech disorders.
The best way to reach me is by email: mahsa.nasri27@gmail.com
Connect with me on LinkedIn: Mahsa Nasri
Download my resume: CV/Resume (PDF)
Open to collaborations, research opportunities, and interesting conversations about the intersection of technology and human experience.