Almost on my way to Espoo, Finland. Will work on enhancing orientation in an augmented reality lab guide.
Take a look at this report (PDF): UBIQUITOUS CONTEXTUAL INFORMATION ACCESS WITH PROACTIVE RETRIEVAL AND AUGMENTATION. The title reads a bit stiff, but what it actually says is: AWESOME GLASSES THAT LET YOU SEE ADDITIONAL INFORMATION WHEN YOU LOOK AT PEOPLE AND OBJECTS. Or: look through the TERMINATOR's eyes. The idea is to have object and face recognition figure out what you are looking at, and then the system gives you information about it. Not just any information, but the information you need, because it can tell by how you look at things. Pretty impressive.
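To give a rough flavor of the idea, here is my own toy sketch in Python (not anything from the paper; the dwell-time threshold, labels, and the little knowledge base are all made up): recognize what the wearer is looking at, wait until the gaze has lingered long enough to signal interest, then fetch and overlay a bit of information about it.

```python
# Toy sketch of gaze-driven proactive retrieval -- my own illustration,
# not code from the paper. If you fixate on a recognized person or object
# long enough, the system fetches a snippet of information and overlays it.
import time
from dataclasses import dataclass
from typing import Optional

DWELL_THRESHOLD_S = 1.5  # assumption: looking this long signals interest


@dataclass
class Fixation:
    target_id: str     # label from a (hypothetical) object/face recognizer
    started_at: float  # timestamp when the gaze landed on the target


# Stand-in knowledge base; a real system would query a contextual store.
INFO = {
    "office_printer": "Out of toner since Tuesday.",
    "alice": "Met at the lab retreat, works on haptics.",
}


def maybe_augment(fixation: Fixation, now: float) -> Optional[str]:
    """Return an overlay string once the gaze has dwelled long enough."""
    if now - fixation.started_at >= DWELL_THRESHOLD_S:
        return INFO.get(fixation.target_id)
    return None


if __name__ == "__main__":
    fix = Fixation(target_id="alice", started_at=time.time())
    time.sleep(1.6)                         # pretend the gaze stays put
    print(maybe_augment(fix, time.time()))  # -> info about "alice"
```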
Augmented Reality Apps are taking over. Can't wait to see more of this.
Project: Social Presence & Audio Quality
This is an ongoing project. We ran a study looking at how audio quality influences the perception of presence, social presence, and the understanding of speakers' emotional states.
So why is this important? Imagine you are talking to your friend on the phone, or maybe your partner ... would it feel different if that person seemed to stand right beside you? Or imagine you are playing a game like World of Warcraft and the voices of your raid members actually came from their positions within the game; would that make a difference? Would you have a better understanding of how NPCs feel in games like Heavy Rain depending on the sound quality?
The answer is yes, you would. That is, if you are like the 82 participants we tested.
Unfortunately, I'll have to leave you with a cliffhanger, as the paper on this hasn't been published yet. But I'll add more details soon. You can find a version of the paper with all the gory details here.
Skin as interface
Researchers from Carnegie Mellon and Microsoft came up with a way of using skin (still attached to the human, of course) as an input device. The idea is to use the body's ability to conduct sound to detect the unique acoustic signature of a tap on the forearm or hand.
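If you want a feel for how such a classifier might work, here is a tiny sketch (again my own illustration, not the actual Skinput pipeline; the features, sensor setup, and tap locations are invented): train a classifier on acoustic-style feature vectors for a few tap locations, then predict where a new tap landed.

```python
# Rough illustration of the general idea -- not the actual Skinput system.
# Each tap location produces a characteristic acoustic signature, so a
# classifier trained on features of those signatures can tell them apart.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)


def fake_tap_features(location_id: int, n: int = 40) -> np.ndarray:
    """Made-up feature vectors (think: band energies from arm-worn sensors)."""
    signature = np.linspace(0.0, 1.0, 8) * (location_id + 1)  # fake "signature"
    return signature + rng.normal(scale=0.1, size=(n, 8))     # plus sensor noise


locations = [0, 1, 2]  # e.g., wrist, forearm, palm
X = np.vstack([fake_tap_features(loc) for loc in locations])
y = np.repeat(locations, 40)

clf = SVC().fit(X, y)

# A new tap comes in: extract its features, predict where on the skin it landed.
new_tap = fake_tap_features(1, n=1)
print("Predicted tap location:", clf.predict(new_tap)[0])
```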
But see for yourself: