Project: Mobile Auditory User Interface

Mobile Auditory User Interfaces are an ongoing research project. The general aim is to design and build an interface that has almost the same capabilities as a visual interface but relies only on (spatial) sound. The prototype we built uses FMOD (in the first version) and OpenAL (in the second version) to position sound items on a horizontal plane around the user, who is wearing headphones. As the figure below shows, the interface consists of two virtual "rings" around the user's head on which auditory representations of items or streams are positioned.
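As a rough illustration of the ring layout, here is a minimal sketch of how items could be placed evenly on a horizontal ring around the listener. The function name and parameters are ours, not the project's actual code; the resulting coordinates are the kind you would feed to a 3D audio API such as OpenAL (which treats "straight ahead" as the negative z axis).

```python
import math

def ring_position(index, count, radius, elevation=0.0):
    """Place item `index` of `count` evenly spaced on a horizontal ring
    of the given radius around the listener at the origin.
    Returns an (x, y, z) position for a 3D audio source."""
    angle = 2 * math.pi * index / count  # 0 rad = straight ahead
    x = radius * math.sin(angle)         # positive x = to the right
    z = -radius * math.cos(angle)        # ahead is -z (OpenAL convention)
    return (x, elevation, z)
```

With two rings of different radii, the same function positions items on either the inner or the outer ring.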



The user can choose to "pull" an item closer towards them or to "push" it away. An item that is pulled closer is played louder; an item that is pushed to the background is played from a distance, allowing the user to focus on the closer items. By focusing their attention on the sounds in the background, the user can maintain an awareness of "open" applications, files, or streams.
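The louder/quieter effect of pulling and pushing falls out of ordinary distance attenuation. As a sketch (parameter names are illustrative), this is the inverse-distance gain model that OpenAL's AL_INVERSE_DISTANCE mode uses: gain is 1.0 at the reference distance and falls off as an item is pushed further away.

```python
def gain_for_distance(distance, ref_distance=1.0, rolloff=1.0):
    """Inverse-distance attenuation: an item pulled close plays at
    full gain, an item pushed to a far ring fades into the background."""
    return ref_distance / (ref_distance + rolloff * (distance - ref_distance))
```

For example, an item on a ring at three times the reference distance plays at one third of the gain of an item pulled all the way in.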



This is an interactive demo of the selection method:



And this is an interactive demo from the user's perspective (with sound). Please note that Flash doesn't support spatial sound, so this demo is in stereo only (it's 3.2 MB, so give it some time to load), and it's a bit buggy. Try activating a sound source by hitting "on". You can now hear it circling around you if you press the red < or >. You can also move it to the outer circle by pressing the red "up" icon, or move it back to the inner circle by pressing "down".



If a sound icon is pulled to the innermost ring, all other items are muted so the user can focus completely on what that icon represents (a phone call, podcast, email, etc.). In the case of a multi-party phone call, the user starts a private conversation with the person pulled to the inner ring. Pushing the representation of that person away returns the user to the conference call with all participants spatially spread out.
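The focus rule above can be sketched as a simple audibility function. This is our own minimal model, not the project's implementation: items are tracked by the ring they sit on (0 = innermost), and the presence of any item on ring 0 mutes everything else.

```python
def audibility(items):
    """Given a dict mapping item name -> ring index (0 = innermost),
    return a dict mapping item name -> whether it is audible.
    If anything sits on the innermost ring, all other items are muted."""
    focused = any(ring == 0 for ring in items.values())
    if focused:
        return {name: ring == 0 for name, ring in items.items()}
    return {name: True for name in items}
```

Pushing the focused item back out makes the condition false again, so the whole mix (e.g. all conference participants) becomes audible at once.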



We designed hand gestures made with the mobile device (for the first version we used a mockup with an InterSense InertiaCube3 sensor) to allow manipulation of the interface. By flicking the device to the left or right, the user can rotate the rings counterclockwise or clockwise. By tilting the device up or down, the user can pull items closer or push them away.
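The gesture mapping can be sketched as a small dispatch function. The gesture names, the 30-degree rotation step, and the mapping of tilt direction to push/pull are our assumptions for illustration, not the project's actual event API.

```python
def handle_gesture(state, gesture):
    """Update interface state for one recognized gesture.
    `state` holds 'rotation' (ring rotation in degrees) and
    'ring' (ring index of the selected item: 0 = inner, 1 = outer)."""
    if gesture == "flick_left":      # rotate rings counterclockwise
        state["rotation"] = (state["rotation"] - 30) % 360
    elif gesture == "flick_right":   # rotate rings clockwise
        state["rotation"] = (state["rotation"] + 30) % 360
    elif gesture == "tilt_up":       # push the selected item away (assumed direction)
        state["ring"] = min(state["ring"] + 1, 1)
    elif gesture == "tilt_down":     # pull the selected item closer (assumed direction)
        state["ring"] = max(state["ring"] - 1, 0)
    return state
```

Chaining gestures then walks an item around and between the rings, matching the interaction described above.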

Related Publications

Dicke, C., Deo, S., Billinghurst, M. and Lehikoinen, J. Experiments in Mobile Spatial Audio-Conferencing: Key-based and Gesture-based Interaction. In Proc. MobileHCI 2008, Amsterdam, 91-100.

2 comments:

  1. I love this! Where can I buy it?

  2. Hi, unfortunately you can't buy it (yet). But great to hear that you like it!
