Augmented Body - Echo AR

An immersive experience that introduces alternative sensory pathways to reshape how users perceive and navigate the world, using auditory, visual, and tactile stimuli to simulate echolocation.

Individual Project

Timeline: F24, 4 weeks

Tools/Skills: Interaction Design, Digital Prototyping, VR Development, Haptic Integration, Unity (MetaXR SDK)

“See” through your voice.

Research

Echolocation inspired my research because it demonstrates the incredible adaptability of sensory systems in nature. Blind individuals, like Daniel Kish, have used echolocation to navigate their environments, inspiring a rethinking of how humans perceive and interact with the world.

This led me to three guiding questions:

  • Can sensory augmentation reshape how we experience space?

  • What does it mean to "see" in ways beyond sight?

  • How can technology simulate and expand sensory modalities?

Echolocation has been described as producing "flashes of 3D geometry," where depth and spatial relationships become momentarily clear with each sound. This formed the foundation of EchoAR's design.

Insights:

  • “Sonar Flashes”: sound is converted into “flashes” of information.

  • Detail: “fuzzy” 3D depth and shape perception.

  • Control: visual information changes based on the characteristics of the noise emitted.

Approach:

  • The view fades from a black screen to the camera view and back depending on the user’s mic input.

  • A VR passthrough camera layer uses filters to replicate the “fuzzy” effect, object outlines for shape perception, and haptic feedback for object distance detection.

  • The opacity and depth of the camera view change based on the loudness and length of the user’s mic input.

Find the hidden object.

  1. The view starts off pitch black.

  2. As the user speaks, their surroundings “flash” in response after a short delay.

  3. The user’s objective is to find the hidden controller.

  4. As they continue, their own controller vibrates as they approach the hidden controller; the closer they get, the stronger the vibrations.

I added a layer of tactile simulation and gamification by incorporating haptic feedback through the Quest 2’s controllers, allowing users to “feel” their surroundings through controller vibrations when their objective is nearby.
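The distance-to-vibration mapping can be sketched as a simple ramp (Python here for illustration; the function name, range, and linear curve are assumptions, not the project's actual Unity script):

```python
def vibration_amplitude(distance_m, max_range_m=1.5):
    """Map controller-to-target distance to a 0..1 vibration amplitude.

    Beyond max_range_m the controller stays still; inside that range,
    intensity ramps up linearly as the user closes in on the hidden
    controller, so proximity is literally felt.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - (distance_m / max_range_m)

# Far away -> no feedback; halfway inside the range -> half intensity.
print(vibration_amplitude(2.0))   # 0.0
print(vibration_amplitude(0.75))  # 0.5
```

In Unity this amplitude would be fed to the controller haptics API each frame; the linear ramp is just one choice, and an exponential curve would exaggerate the "getting warmer" sensation near the target.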

Prototyping Process

I modified the style of the camera passthrough layer, using a black background as the base and green edge rendering for the “fuzzy” details that simulate shape perception.

The fade-from-microphone script changes the opacity of the black UI image layer based on the loudness of the user’s microphone input once it crosses a set threshold. The controller-feedback script tracks the live distance between the controllers and triggers vibration when they come within a certain range of each other, with intensity varying by distance.
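The fade logic amounts to a per-frame update of the overlay's opacity. A minimal sketch (Python; the threshold, rates, and 72 Hz frame time are illustrative assumptions, not the values used in the project):

```python
def black_overlay_alpha(loudness, alpha, threshold=0.1,
                        reveal_rate=4.0, fade_rate=0.5, dt=1 / 72):
    """One update step for the black overlay's opacity (1.0 = fully black).

    While mic loudness is above the threshold, the overlay fades out and
    reveals the passthrough view; in silence it drifts back to black.
    """
    if loudness > threshold:
        alpha -= reveal_rate * dt      # speech reveals the camera view
    else:
        alpha += fade_rate * dt        # silence restores the darkness
    return min(max(alpha, 0.0), 1.0)   # clamp to a valid opacity

# Half a second of loud speech at 72 Hz fully reveals the view.
alpha = 1.0
for _ in range(36):
    alpha = black_overlay_alpha(0.8, alpha)
print(alpha)  # 0.0
```

Making the reveal rate faster than the fade-back rate gives the "flash" quality from the research: each utterance opens the view quickly, and the scene slowly sinks back to black afterward.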

I imported the Oculus Quest 2 camera rig using Unity’s MetaXR SDK package. I then added a canvas UI element containing a black image that fades according to the user’s mic input, along with two empty GameObjects for the scripts: one handles detection of the microphone’s loudness, and the other reveals the VR passthrough camera based on those input levels.
