Spatial Sound

For my interfaces I mostly use spatial sound. There is nothing exotic about spatial sound: in real life, sound is always spatial, which means you can tell, just by listening, where a sound is coming from. How does your brain do it? It relies mostly on two cues (out of seven): the Interaural Time Difference (ITD) and the Interaural Level Difference (ILD) (the other five are reflections from the pinnae, reflections from the shoulders, head motion, early echo response/reverberation, and vision). ITD simply means that the sound waves arrive at one ear earlier than at the other, so there is a time difference (unless the source is at 0/180 degrees azimuth). The brain can compute the direction of the source from this. The ILD is similar: since sound waves are pressure differences in air, the head blocks some of them. Say the sound source is on your right side; your head will then shield your left ear. This level difference is again detectable by the brain (unless the source is at 0/180 degrees azimuth) and helps it calculate where the sound source is located.
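To give a feel for the magnitudes involved, here is a minimal Python sketch of the classic Woodworth spherical-head approximation of the ITD. The head radius and speed of sound are typical textbook values, not measurements from any of the projects below.

```python
import math

def itd_woodworth(azimuth_deg, head_radius=0.0875, speed_of_sound=343.0):
    """Approximate the interaural time difference (in seconds) with the
    Woodworth model for a rigid spherical head.

    Valid for azimuths from 0 (straight ahead) to 90 degrees (directly
    to one side). head_radius in metres, speed_of_sound in m/s.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius / speed_of_sound) * (math.sin(theta) + theta)

# At 0 degrees azimuth both ears hear the sound simultaneously:
print(itd_woodworth(0))                      # 0.0
# At 90 degrees the ITD peaks at roughly 0.66 milliseconds:
print(round(itd_woodworth(90) * 1000, 2))    # 0.66
```

Even this sub-millisecond maximum is enough for the brain to localize a source, which is why the ITD cue works at all.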


Project: Elevator interface

Touch screens are widely used in mobile phones, laptops, and information displays of all kinds, and they will come to elevators, too. Buildings are getting taller and taller while "button space" is limited. And of course you can place nice ads on the screens while you transport people. I've looked at what such an interface could look like. Below you can see two prototypes I made to run a small study. They were displayed on a huge touch screen in an elevator simulator, but I've shrunk them for you. Click "read more" to see demos.

Project: Augmented Reality Guide

Last year I attended the IPCity Summer School in Vienna.

We had three days to develop an edutainment application based on the TimeWarp project developed by Fraunhofer FIT.

This is a quick hack of an augmented reality guide. The idea was to create an immersive experience based on the story of Sisi (or rather her ghost). The user becomes a ghostbuster and has to collect items for Sisi's ghost. The items are scattered all over Vienna and can be detected with a special gadget (like all serious ghostbusters own). When the user enters the "hot zone" and comes close to one of the items, the local audio theme starts playing. The sound themes are atmospheric and set the mood for the next encounter with Sisi's ghost and/or the nearby item.


The docu is a bit trashy, as we had only three days for designing, coding, implementing, video shooting, editing, and ... having discussions with the makeup artist.


Visiting Patrick Baudisch at HPI

Wow, 6 months at Patrick Baudisch's new lab at the Hasso-Plattner-Institut in Potsdam, Germany. Patrick is one of the few rock stars in HCI, especially in research on touch and multi-touch interaction. His brainstorming sessions are legendary (and I can't wait to see the sniff-o-meter in action).

By the time he bought this saw I knew he was serious about building the new Luminos (get the CHI paper here). Did I mention that the saw moved into the women's loo?



It was a brilliant time at Patrick's lab, and I think it was the most intense learning experience I've had in a while. My project with him focused on sound-based overview techniques for object detection and on metaphors to improve distance perception with sound. A publication is in preparation!

Project: Simulator sickness

In collaboration with Nokia Research we investigated whether spatial sound causes simulator sickness, or motion sickness. (There is a lengthy debate about whether simulator sickness is a form of motion sickness, so we will refer to what made people sick during our experiments as "simulator sickness".)

We started off with a recording made with an improvised binaural recording device (the head) mounted on a modified record player that allowed seamless speed manipulation. We compared the binaural recording to a stereo recording (made with a Zoom H4) and found it worth a closer look.

Click "read more" to listen to some sound demos.

Project: Mobile Auditory User Interface

Mobile Auditory User Interfaces are an ongoing research project. The general aim is to design and build an interface that has almost the same capabilities as a visual interface but relies only on (spatial) sound. The prototype we built uses fmod (in the first version) and OpenAL (in the second version) to position sound items on a horizontal plane around the user (who wears headphones). As the figure below shows, the interface consists of two virtual "rings" around the user's head on which auditory representations of items or streams are positioned.
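As a rough sketch of the geometry involved (the actual prototype does this inside fmod/OpenAL, and the ring radius here is an arbitrary placeholder), evenly spaced positions on one such ring can be computed like this:

```python
import math

def ring_positions(n_items, radius=1.0, start_deg=0.0):
    """Place n_items evenly on a horizontal ring around the listener.

    Returns (x, y, z) triples in a right-handed coordinate system as
    used by OpenAL: the listener sits at the origin facing -z, with
    x pointing right and y pointing up.
    """
    positions = []
    for i in range(n_items):
        azimuth = math.radians(start_deg + i * 360.0 / n_items)
        x = radius * math.sin(azimuth)   # positive azimuth -> to the right
        z = -radius * math.cos(azimuth)  # azimuth 0 -> straight ahead
        positions.append((x, 0.0, z))
    return positions

# Four items land at front, right, back, and left of the listener:
for x, y, z in ring_positions(4):
    print(round(x, 2), round(y, 2), round(z, 2))
```

In the OpenAL version each triple would then be handed to its source, e.g. via alSource3f(source, AL_POSITION, x, y, z); the second ring would simply use a different radius or a vertical offset.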


Click "read more" to see some interactive flash demos.

Project: in-vehicle Auditory Interface

We looked at how an in-vehicle auditory user interface can improve driver safety. To do so, we built a very simple graphical interface and compared it to a spatial auditory interface. Both had the same functionality: the user could navigate a hierarchical mobile phone menu (we used a stripped-down menu from the Nokia S60 series) and access typical items like contacts, music player, text messaging, etc. The visual interface was designed for on-board screens like those used in navigation systems, while the auditory interface was spatially spread out over a 7.1 speaker system (resembling the on-board speaker system).
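To illustrate how one level of such a menu can be spread over a 7.1 setup, here is a hypothetical Python sketch. The speaker angles follow a typical 7.1 layout, and the item-to-speaker assignment is invented for illustration; it is not the mapping from our study.

```python
# Typical 7.1 speaker azimuths in degrees (0 = straight ahead,
# negative = left). The LFE channel carries no directional cue
# and is therefore left out.
SPEAKER_ANGLES_DEG = {
    "front_left": -30, "center": 0, "front_right": 30,
    "side_left": -90, "side_right": 90,
    "rear_left": -150, "rear_right": 150,
}

def assign_items(items):
    """Assign up to seven menu items of one hierarchy level to
    distinct speakers, in declaration order of the layout above."""
    speakers = list(SPEAKER_ANGLES_DEG)
    if len(items) > len(speakers):
        raise ValueError("more items than directional speakers")
    return {item: speakers[i] for i, item in enumerate(items)}

menu = ["contacts", "music player", "text messaging"]
print(assign_items(menu))
```

Descending into a submenu would then simply re-run the assignment with that submenu's items, so each level reuses the same spatial layout.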


IPCity Summer School 2009

Just participated in the IPCity Summer School in Vienna (September 22-25). IPCity is an EU-funded Sixth Framework Programme integrated project on Interaction and Presence in Urban Environments (http://www.ipcity.eu).

We had lots of fun playing around with the Fraunhofer TimeWarp tech framework and created our own little ghostly tourism app, helping the ghost of Sisi retrieve her lost jewels (which are, of course, hidden all over Vienna).




And it was my first visit to the Naschmarkt, which I loved. I'm still not sure whether this is a nut-shaped snake or a snake-shaped nut: