If you’ve never tried to increase the immersive feel of Second Life by using a video projector, it’s worth the effort to track one down and try it at least once. You need to be able to hook up your computer to a projector, and you want a large screen or, preferably, a large blank wall, a darkened room, and a comfortable chair. Sitting in a La-Z-Boy® chair with a laptop on your knee and the entire wall before you as your image of Second Life is quite dramatic. With avatars as large as real people, it can be an intense experience that makes you feel as if you are actually in the world rather than looking at it through the window of the screen. Although you are stationary with your hands doing the work, it’s possible to forget you are not actually “there” in the virtual world.
Consumer devices such as the Microsoft® Kinect are also demonstrating the potential for more interactive control of Second Life. Thai Phan, an engineer at the University of Southern California Institute for Creative Technologies, has written a piece of software that hacks the Kinect and lets you move an avatar in Second Life by using your body as input to the camera. Waving your arm in real life results in your avatar waving in Second Life.
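The core idea behind this kind of mapping can be sketched in a few lines: read the positions of the tracked skeleton joints for each camera frame, classify the pose as a gesture, and trigger the matching avatar animation. The sketch below is purely illustrative and is not Phan’s actual code; the joint names, coordinates, and the simple “hand above head” rule are all assumptions, and a real version would pull joint data from OpenNI’s skeleton tracker rather than a hand-built dictionary.

```python
# Hypothetical sketch: map one frame of skeleton-tracker output to an
# avatar gesture. Joint names and the gesture rule are illustrative,
# not taken from Phan's software or the OpenNI API.

def classify_gesture(joints):
    """Return an avatar animation name for a single skeleton frame.

    `joints` maps joint names to (x, y, z) positions in metres,
    with y pointing up, as a depth camera's tracker might report.
    """
    head_y = joints["head"][1]
    hand_y = joints["right_hand"][1]
    elbow_y = joints["right_elbow"][1]
    shoulder_y = joints["right_shoulder"][1]
    # A hand above the head with a lifted elbow reads as a wave;
    # anything else falls back to a neutral standing pose.
    if hand_y > head_y and elbow_y > shoulder_y:
        return "wave"
    return "stand"

# One frame of (fake) tracker output: the right arm raised overhead.
frame = {
    "head": (0.0, 1.6, 2.0),
    "right_shoulder": (0.2, 1.4, 2.0),
    "right_elbow": (0.3, 1.5, 2.0),
    "right_hand": (0.35, 1.8, 2.0),
}
print(classify_gesture(frame))  # -> wave
```

In practice the gesture name would then be sent to the Second Life client as an animation trigger, with some smoothing over several frames so that a momentary stretch isn’t mistaken for a wave.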
Using Kinect & OpenNI to Embody an Avatar in Second Life
At a more serious and intensive level, Stanford University in California has its Virtual Human Interaction Laboratory (VHIL), dedicated to understanding “the dynamics and implications of interactions among people in immersive virtual reality simulations.” Using a sophisticated array of technology, the researchers can provide people with a virtual environment that delivers not just a visual experience but also an auditory and touch-based one. A subject can not only feel a virtual tremor but also hear localized sounds from speakers embedded around the room.
Stanford’s Virtual Human Interaction Lab
One of their more fascinating projects examines something called the “Proteus Effect.” The basic idea is that not only do we use our real selves to create our virtual avatars, but our virtual avatars, in turn, can shape how our real selves behave. In 2007, researchers Nick Yee, Jeremy Bailenson, and Nicholas Ducheneaut conducted an experiment to see what effect avatar height had on a person’s real-life interactions with people. They had people immerse themselves in the World of Warcraft virtual world as either a tall or a short avatar. Then they had the same people spend 15 minutes interacting in the real world with confederates who were primed to be aggressive. What they found was that people who had used tall avatars responded more aggressively to the confederates than those who had used short avatars. The implication is that the avatar was, in fact, shaping how the person behaved in real life.
Of course, the VHIL setup cost thousands of dollars, and having your own basement version is still some way off. However, it may not be as far off as we might think. Only last year, Sony released its unimaginatively titled HMZ-T1 visor, which is basically a headset with tiny television screens where the glass would be in a pair of spectacles. The aim is to provide an immersive 3D experience while you sit back in a chair. Developed for the movie and gaming market, this piece of kit is not cheap, and it currently appears not to be available for the US market, but it shows how close we are to a more intense virtual experience.
Sony 3D OLED Headset Glasses – HMZ-T1
Not to be outdone in this wave of the future, Microsoft has a patent, filed back in 2010, for an Xbox gaming helmet. According to the PatentBolt blog, which is dedicated to tracking technology patents filed at the US Patent and Trademark Office, the plans cover a helmet with built-in small TV screens that sit in front of the wearer’s eyes. The aim is to allow the wearer to “view images from a computer, media player, or other electronic device with privacy and mobility.” Of course, a patent isn’t an actual device, and there doesn’t seem to be any evidence yet of a 3D gaming headset from the Wizards of Redmond, but given the right market demand, they could put something together in relatively short order.
So we’re not quite at the total immersion effect of a Star Trek holodeck just yet. But there’s a good chance that within a couple of years some of us will be donning our 3D helmets and living a very different Second Life. And hey, add to that the use of remote sex toys and some people may only log back on to Real Life for food!
Using Kinect and OpenNI to Embody an Avatar in Second Life: Gesture & Emotion Transference. USC Institute for Creative Technologies.
Stanford University, Virtual Human Interaction Laboratory. http://vhil.stanford.edu/
Yee, N., Bailenson, J. N., & Ducheneaut, N. (2009). The Proteus Effect: Implications of transformed digital self-representation on online and offline behavior. Communication Research, 36(2), 285–312. Stanford VHIL Online Publications.
Microsoft Invents Projector Eyewear for Xbox & Beyond. PatentBolt blog.