Wednesday, August 27, 2008

Head Tracking in Second Life.

Our avatars are usually just standing around while we communicate via text or voice in Second Life, adding no emotion or emphasis to what we say. Sometimes we enhance them with gestures or animations, but that never reaches the depth of communication you get from your real-life face.

VR-Wear is about to change that: they have been working for several months to integrate signal-processing capabilities into the Second Life viewer. The VR-Wear team can now connect a webcam, analyze your head motions in real time, and have your avatar act them out in Second Life.

VR-Wear has released a short YouTube video showing the tracking of a head gesturing Yes and No. Their software will be released in September with a load of emotion and motion filters that will make your avatar behave more like you. Maybe even exactly like you, though whether you would really want that is up for debate.
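VR-Wear has not published its code yet, but a rough idea of how webcam-based Yes/No head-gesture detection can work is sketched below. It uses OpenCV's stock face detector and a naive nod/shake classifier based on how the face centre moves over recent frames; the thresholds, window sizes, and overall approach are illustrative assumptions on my part, not the VR-Wear implementation.

import cv2

# Stock Haar cascade shipped with OpenCV -- an illustrative stand-in
# for VR-Wear's (unpublished) tracking pipeline.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # default webcam
history = []                       # recent face-centre positions (x, y)

def classify(history, threshold=15):
    """Crude Yes/No classifier: compare vertical vs horizontal travel."""
    if len(history) < 10:
        return None
    xs = [p[0] for p in history]
    ys = [p[1] for p in history]
    dx = max(xs) - min(xs)
    dy = max(ys) - min(ys)
    if dy > threshold and dy > dx:
        return "yes"               # nodding: mostly vertical motion
    if dx > threshold and dx > dy:
        return "no"                # shaking: mostly horizontal motion
    return None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        history.append((x + w // 2, y + h // 2))
        history = history[-20:]    # keep roughly the last second of samples
        gesture = classify(history)
        if gesture:
            # In the viewer this is where an avatar animation would be triggered.
            print("detected head gesture:", gesture)
            history.clear()
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()

A real pipeline would of course estimate full head pose rather than just the face centre, and smooth the signal before driving the avatar, but the basic loop of capture, detect, track, and classify is the same.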

For the programmers among us, there will be an opportunity to participate and create their own versions: the project will be released under a dual license (GPL-like for non-commercial applications).

via Mobitrends.com
