2. Hack: LeapPlay

This application grabs a list of songs, e.g. from your playlist or a service like SoundCloud. Each song's title is run through a text-to-speech (TTS) engine and becomes a sound source positioned in 3D space (via JOAL, the OpenAL Java wrapper). You can scroll through the list using a mouse, a touchpad, or a LEAP Motion controller. A song's title is read to you when you point at it, and if you want to listen to the song you can select it by either clicking the mouse or performing the "select" gesture on the LEAP. The prototype supports selecting and playing multiple songs simultaneously.
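
A minimal sketch of how such a spatial layout could be computed. The arc arrangement and all names here are my assumptions, not LeapPlay's actual code; the resulting coordinates would then be handed to OpenAL through JOAL (e.g. via alSource3f with AL_POSITION).

```java
// Sketch: place N song titles as sound sources on a horizontal arc
// in front of the listener (listener at the origin, facing -Z).
// The arc layout and names are illustrative assumptions, not
// LeapPlay's actual implementation.
public class ArcLayout {
    /** Returns {x, y, z} for item i of n, on an arc of the given radius. */
    public static float[] positionOf(int i, int n, float radius) {
        // Spread items over a 120-degree arc centered straight ahead.
        double arc = Math.toRadians(120);
        double angle = (n == 1) ? 0 : -arc / 2 + arc * i / (n - 1);
        float x = (float) (radius * Math.sin(angle));  // left/right
        float z = (float) (-radius * Math.cos(angle)); // in front of listener
        return new float[] { x, 0f, z };
    }
}
```

With five songs, the middle one ends up straight ahead of the listener and the others fan out to the left and right, which is what lets pointing map naturally onto direction.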

This is intended to be a non-visual interface. You can use it just by listening to it and interacting with the LEAP.

Done at the Berlin Geekettes Hackathon.

1. Hack: WonderBelt - Touch Enabled Belt

Wonderbelt - in action

(Picture courtesy of )

This is a regular belt hooked up to a Makey Makey. To make the belt touch-sensitive I've added some conductive fabric (this attached to the belt and this wrapped around my hip) to connect it to the Makey Makey. And that is me in the second picture playing Tetris with it at the Berlin Geekettes Hackathon (where it won 3rd prize).

Check out all the other cool projects at Hacker League.


This is the very neat and amazingly precise LEAP I got a few weeks ago. It's going to be interesting to find applications for this that don't make you wish you could rest your hands [on a keyboard, mouse, touchpad] for a while.

New Year, new Job

Since January 2013 I have been working as a postdoc researcher at the Telekom Innovation Labs in Berlin. And this is what I call a room with a view:

How to build a low-cost evaluation framework for in-vehicle interfaces

Based on our experience developing the driving simulator for this experiment, my colleagues Jaka Sodnik and Grega Jakus and I propose a low-cost, laboratory-based testing framework for in-vehicle interfaces. Using a comparison between an auditory interface, a head-up display (HUD), and a combination of both as an example, we show how task completion times, driving penalty points, mental workload, and subjective user evaluations of the interfaces can be collected through different logging systems and user questionnaires. The driving simulator used in the experiment supports varying traffic conditions as well as different driving scenarios, including a highway and a busy city center. The framework will be presented at the 5th International Conference on Advances in Computer-Human Interactions (ACHI 2012).
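
To give a feel for how the measures named above might be pooled, here is a small sketch of one record per trial. The field names and the log sources in the comments are illustrative assumptions, not the framework's actual log format.

```java
import java.util.List;

// Sketch: one record per participant and trial, combining measures from
// the different logging systems; names are illustrative assumptions.
public class TrialLog {
    public final String participant;
    public final String condition;         // e.g. "audio", "HUD", "audio+HUD"
    public final long taskCompletionMs;    // from the interaction log
    public final int drivingPenaltyPoints; // from the simulator log
    public final double tlxWorkload;       // NASA TLX score, 0..100

    public TrialLog(String participant, String condition, long taskCompletionMs,
                    int drivingPenaltyPoints, double tlxWorkload) {
        this.participant = participant;
        this.condition = condition;
        this.taskCompletionMs = taskCompletionMs;
        this.drivingPenaltyPoints = drivingPenaltyPoints;
        this.tlxWorkload = tlxWorkload;
    }

    /** Mean driving penalty points over all trials of one condition. */
    public static double meanPenalty(List<TrialLog> logs, String condition) {
        return logs.stream()
                .filter(t -> t.condition.equals(condition))
                .mapToInt(t -> t.drivingPenaltyPoints)
                .average()
                .orElse(Double.NaN);
    }
}
```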

HUD study: first shots

This is the head-up display (HUD) that we are using in our study. We compare it to an audio-only version of the same menu and to a combination of both. We are halfway through the study, and so far our results suggest an overall preference for the audio-visual menu, although some participants clearly prefer the visual-only version and some the audio-only one.

To interact with the menu we use a customized device (in fact a small computer mouse) that is attached to the steering wheel. It is placed behind the wheel because we want to reduce the haptic distraction as much as we can. The scroll wheel of the mouse lets the user move through the items of the menu. By pressing the lower button of the two-button mouse the user can select items (or descend in the menu hierarchy), and by pressing the upper button the user can go up in the menu hierarchy.
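
The three-control navigation described above can be sketched as a small state machine: the scroll wheel moves the highlight, the lower button descends into the highlighted submenu, and the upper button goes back up. The Node type and method names are my own illustration, not the study's actual implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Sketch of the scroll-wheel / two-button menu navigation; names are
// illustrative assumptions, not the study's actual code.
public class MenuNav {
    public static class Node {
        public final String label;
        public final List<Node> children; // empty list = leaf item
        public Node(String label, List<Node> children) {
            this.label = label;
            this.children = children;
        }
    }

    private Node current;                 // menu level we are in
    private int index = 0;                // highlighted item at this level
    private final Deque<Node> path = new ArrayDeque<>();

    public MenuNav(Node root) { this.current = root; }

    /** Scroll wheel: move the highlight, clamped to the current level. */
    public void scroll(int steps) {
        index = Math.max(0, Math.min(current.children.size() - 1, index + steps));
    }

    /** Lower button: descend into the highlighted submenu (no-op on leaves). */
    public void lowerButton() {
        Node target = current.children.get(index);
        if (!target.children.isEmpty()) {
            path.push(current);
            current = target;
            index = 0;
        }
    }

    /** Upper button: go up one level in the hierarchy. */
    public void upperButton() {
        if (!path.isEmpty()) {
            current = path.pop();
            index = 0;
        }
    }

    public String highlighted() { return current.children.get(index).label; }
}
```

Selecting a leaf item (e.g. triggering a phone call) would hook into lowerButton where the sketch currently treats leaves as a no-op.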

NASA TLX Workload (HTML)

Keith Vertanen provides a one-page NASA TLX HTML+JavaScript version, which is really handy if you want to translate it and run it online. You can download it at keithv.com/software/nasatlx/. Thanks, Keith.

Update: I've added a user ID and timestamp to Keith's original version, which comes in handy if you do user testing. You can find it here.