Foogue - the movie

This is the video Katrin Wolf showed at MobileHCI 2010 to demo our project "foogue". You can download it here: foogue.mp4

Back to NZ

I'm back from my visit to Finland where I worked with Kai Puolamäki and his Ph.D. student Hannes Gamper. Seriously cool people. We ran a study on multiple audio source discrimination, that is: if you play multiple sound sources (mono or binaural), how much onset difference is required so that people can still discriminate between sources and detect specific ones? How does the total number of sound sources affect the results? Is there a difference between speech and non-speech sounds? Do "distractor" sounds have an influence? And so on.
We gathered a huge amount of data from 22 participants and are just about to write everything up.

Travel advice: Visit Finland in summer, not in autumn or winter. Oh, and summer starts in July and ends in August.

Auditory Interfaces for Mobile Devices

Now for download: Jaka, my colleague from the University of Ljubljana, and I have written an entry on 'Auditory Interfaces for Mobile Devices' in the Encyclopedia of Wireless and Mobile Communications.
It's a nice little overview of mobile auditory interfaces: speech and non-speech audio, interface metaphors, special-interest interfaces, and so on.

MobileHCI 2010

We are very happy that our paper has been accepted.

The paper titled "Foogue: Eyes Free Interaction for Smartphones" will soon be available for download here.

HCI 2010

"Talk to me: The Influence of Audio Quality on the Perception of Social Presence" has been accepted as full paper at HCI 2010 in Dundee, Scotland.

The paper will be available for download on the publications page soonish.

GeekTool + AppleScript

Here's a little script I've written for GeekTool, a fine little app that displays information on the desktop.
You can get the whole shebang here.

This is the script:

global output

-- Get the names of all running processes so we can check whether Cog is running.
tell application "System Events"
    set myList to (name of every process)
end tell

if myList contains "Cog" then
    -- Cog is running: ask it for the artist and title of the current entry.
    tell application "Cog"
        if (exists artist of the currententry) then
            set this_artist to (get artist of the currententry)
        else
            set this_artist to " ?"
        end if
        if (exists title of the currententry) then
            set this_title to (get title of the currententry)
        else
            set this_title to "dunno"
        end if
    end tell
    set output to "currently playing: " & this_title & " by " & this_artist
else
    set output to "Cog not running"
end if

-- Echo the output string so it can be picked up and displayed, e.g. by GeekTool.
do shell script "echo " & quoted form of output
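If you want the result on your desktop, one way to wire it up (assuming you save the script somewhere, e.g. as nowplaying.scpt - the name is just an example) is to add a shell geeklet in GeekTool that runs osascript with the path to the saved script and refreshes every few seconds; the "currently playing" line should then show up wherever you place the geeklet.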

Visiting Kai Puolamäki

Almost on my way to Espoo, Finland. Will work on enhancing orientation in an augmented reality lab guide.
Take a look at this report (PDF) - UBIQUITOUS CONTEXTUAL INFORMATION ACCESS WITH PROACTIVE RETRIEVAL AND AUGMENTATION. The title sounds a bit stiff, but what it actually says is: AWESOME GLASSES THAT LET YOU SEE ADDITIONAL INFORMATION WHEN YOU LOOK AT PEOPLE AND OBJECTS. Or: look through the TERMINATOR's eyes. The idea is to have object and face recognition figure out what you are looking at, and then the system gives you information about it. Not just any information, but the information you need, because it can tell from how you look at things. Pretty impressive.

Augmented Reality Apps are taking over. Can't wait to see more of this.

Project: Social Presence & Audio Quality

This is an ongoing project. We ran a study and looked at how audio quality influences the perception of presence and social presence, and how well listeners understand the emotional state of speakers.

So why is this important? Imagine you are talking to your friend on the phone, or maybe your partner ... would it feel different if that person seemed to stand right beside you? Or imagine you are playing a game like World of Warcraft and the voices of your raid members actually came from their positions within the game - would that make a difference? Would you have a better understanding of how NPCs feel in games like Heavy Rain depending on the sound quality?

The answer is yes, you would. That is if you are like the 82 participants we tested.

Unfortunately I'll have to leave you with a cliffhanger, as the paper on this hasn't been published yet, but I'll add more details soon. In the meantime, you can find a version of the paper with all the gory details here.

Skin as interface

Researchers from Carnegie Mellon and Microsoft came up with a way of using skin (still attached to the human, of course) as an input device. The idea is to use the body's sound-conducting properties to detect the unique acoustic signature of a tap on the forearm or hand.

But see for yourself:

Technote: OpenAL on the Nokia N900

We've got it running but it's quite slow. But still, woohoo.
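For anyone wondering what that looks like in code: below is a minimal, generic OpenAL playback sketch in plain C. It is not our actual N900 port, just an illustration of the kind of setup involved - open the default device, upload a buffer, position the source in 3D space and play it. Error checking is omitted for brevity.

/* Minimal OpenAL playback sketch (illustration only, not our N900 port).
   Build on Linux with: gcc openal_sketch.c -lopenal -lm */
#include <AL/al.h>
#include <AL/alc.h>
#include <math.h>
#include <stdint.h>
#include <unistd.h>

int main(void) {
    /* Open the default output device and create a context. */
    ALCdevice *device = alcOpenDevice(NULL);
    ALCcontext *context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);

    /* Generate one second of a 440 Hz sine wave as test data. */
    int16_t samples[44100];
    for (int i = 0; i < 44100; i++)
        samples[i] = (int16_t)(32000.0 * sin(2.0 * M_PI * 440.0 * i / 44100.0));

    /* Upload the samples into an OpenAL buffer. */
    ALuint buffer, source;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, samples, sizeof(samples), 44100);

    /* Attach the buffer to a source, place the source to the listener's right
       (spatialisation is the whole point of using OpenAL here) and play it. */
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, (ALint)buffer);
    alSource3f(source, AL_POSITION, 1.0f, 0.0f, 0.0f);
    alSourcePlay(source);

    sleep(2); /* give playback time to finish */

    /* Clean up. */
    alDeleteSources(1, &source);
    alDeleteBuffers(1, &buffer);
    alcMakeContextCurrent(NULL);
    alcDestroyContext(context);
    alcCloseDevice(device);
    return 0;
}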

Tangible interaction

I'm looking at this from the perspective of someone who wants to interact with a mobile phone. In a wider sense you could call this tangible interaction, but actually it's more about performing gestures with the mobile phone. Gesture interaction is a nice thing to have, especially when you are mobile and have to interact with an auditory interface. The WIMP paradigm (Window, Icon, Menu, Pointing device) works fine on the desktop, but not so well when you are actually moving and only have a small screen.


Auditory User Interfaces (I)

Often people ask me: Why do you research auditory interfaces? What are they good for? When I answer I feel a bit like a vegetarian who has to point out the obvious: visual interfaces are awesome (and very tasty, to stay in the metaphor), but there are drawbacks. For example, when you want to interact with a visual interface in a situation that requires a lot of visual attention: driving a car or just walking down a busy street. Or let's say you have your mobile phone in your pocket and cannot see your graphical user interface (GUI). For some of these situations auditory user interfaces (AUIs) offer an advantage over visual interfaces. Simple as that.


Metaphors

Metaphors are super important for user interfaces as they a) help the developer design the user interface and b) help the user understand the system.

A well-known example is the desktop metaphor introduced by Alan Kay at Xerox PARC in 1970. It treats the computer screen as a desktop upon which iconic representations of documents, folders, and other objects such as disks and printers can be placed. The advantage of this representation is that even users with no computer knowledge can apply their mental model of physical files and folders to learn how to operate the computer.

Beyond their appearance in literature and poetic language, a much wider ontological significance has been ascribed to metaphors by cognitive linguistics since the early 1980s.