It’s been around for decades, but virtual reality has been anything but real for most people. That’s about to change as a slew of new virtual-reality technologies get set to tempt your wallet. Some of them are even available in time for this year’s holidays.
This is Scientific American’s 60-Second Science. I’m Larry Greenemeier. Got a minute? Virtual reality started off as a way for scientists to visualize their research. Ken Perlin, a computer science professor and pioneer in the field of virtual reality, explains:
[Ken Perlin:] “The first people who seriously developed virtual reality were Ivan Sutherland and his student Bob Sproull back in 1968. They built a very large device, which they nicknamed the ‘Sword of Damocles’ because it was a very large contraption that hung over your head and carried the headset with it as it moved around on a giant boom arm.”
New gadgets expected to launch in 2016 will be a bit more sophisticated.
[Perlin:] “The major commercial releases of virtual reality that will appear in the first half of 2016 track your head, and they track your two hands.”
[Perlin:] “In order to have a full social experience with other people of being in a world together, you also need to know where your feet are. Once you know your head and your hands and your feet, then you can build a computer graphic representation of everybody.”
This new generation of devices promises to address a major problem the technology faced in the past.
[Perlin:] “Motion sickness was a problem when the delay between my head movement and the graphics that I saw exceeded a certain threshold, generally about a 10th of a second. Modern technologies that make use of these inertial trackers in the headsets have pretty much gotten rid of that.”
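Perlin’s tenth-of-a-second figure can be illustrated with a toy calculation: a headset’s motion-to-photon latency is roughly the sum of its tracking, rendering, and display delays, and motion sickness sets in when that sum crosses the threshold. A minimal sketch, with made-up stage timings that are purely illustrative and not measurements of any real headset:

```python
# Toy motion-to-photon latency model: end-to-end delay is the sum of
# pipeline stages. Older rigs exceeded the ~100 ms sickness threshold;
# modern inertial-tracker pipelines stay well under it.
SICKNESS_THRESHOLD_MS = 100  # roughly a tenth of a second, per Perlin

def motion_to_photon_ms(stages):
    """Sum per-stage delays (in milliseconds) into one end-to-end latency."""
    return sum(stages.values())

# Hypothetical stage timings for illustration only.
old_rig = {"tracker": 80, "render": 50, "display": 30}  # 160 ms total
modern  = {"imu": 1, "render": 11, "display": 8}        # 20 ms total

for name, rig in [("old rig", old_rig), ("modern headset", modern)]:
    total = motion_to_photon_ms(rig)
    verdict = "under" if total < SICKNESS_THRESHOLD_MS else "over"
    print(f"{name}: {total} ms ({verdict} the sickness threshold)")
```

The point of the sketch is only that the budget is additive: shaving the tracking stage down to an inertial sensor’s millisecond-scale response is what pulls the total below the threshold.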
Virtual reality will first invade our homes, offices, and classrooms through games and educational tools. But Perlin thinks the technology will become much more than that over time.
Rachel Metz writes: I’m sitting in Gordon Wetzstein’s lab at Stanford University with a hacked-together prototype of a head-mounted display strapped to my face, using a wireless Xbox controller to manipulate a series of 3-D models: a lion, a chessboard filled with chess pieces, an espresso machine, and so on.
The images are fairly simple, run-of-the-mill models—the kind that anyone could download from the Internet. What is interesting, though, is what happens as I stare at the models, turning them with the controller so I can inspect them from different angles: I can focus on the different parts of the images at different depths as I would when gazing at something in real life, so when I look at, say, the chess pieces up close, those in the background look fuzzy, and vice versa when I focus on the pieces in the distance. And I don’t feel nauseous or dizzy like I sometimes do when I’m playing around with virtual reality, especially when looking at objects that are close to my face.
“In real life, when you’re looking at something—a flower, for instance—your eyes move and the lens in each eye adjusts to bring whatever’s in front of you into focus. With stereoscopic 3-D, a technology commonly used by companies making virtual reality headsets, things get trickier.”
Virtual reality is on the verge of commercial availability, with consumer-geared headsets like the Oculus Rift poised for release next year (see “Oculus Shows Its First Consumer Headset, Circular Hand Controls”). Yet while the technology has improved immensely in the last couple years, there are still plenty of crucial issues to be sorted out—among them that feeling of motion sickness that some people like myself have when experiencing virtual reality, which arises from what’s known as vergence-accommodation conflict.
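The conflict has a simple geometric core: in a conventional stereoscopic headset the eyes converge on the virtual object’s depth, but they must keep accommodating (focusing) at the display’s fixed focal plane. The mismatch is usually expressed in diopters (inverse meters). A rough sketch of that geometry, where the focal-plane distance and interpupillary distance are assumed typical values, not specs of any particular headset:

```python
import math

# Vergence-accommodation mismatch: the eyes converge at the virtual
# object's distance while focus stays pinned at the display's focal
# plane. The conflict is the difference in diopters (1/meters).
FOCAL_PLANE_M = 2.0  # assumed fixed focal distance of the headset optics
IPD_M = 0.063        # assumed typical interpupillary distance

def vergence_angle_deg(object_distance_m, ipd=IPD_M):
    """Angle between the two eyes' lines of sight, in degrees."""
    return math.degrees(2 * math.atan((ipd / 2) / object_distance_m))

def conflict_diopters(object_distance_m, focal_plane_m=FOCAL_PLANE_M):
    """Mismatch between vergence demand and accommodation demand."""
    return abs(1 / object_distance_m - 1 / focal_plane_m)

for d in [0.3, 1.0, 2.0, 10.0]:
    print(f"object at {d:>4} m: vergence {vergence_angle_deg(d):5.2f} deg, "
          f"conflict {conflict_diopters(d):.2f} D")
```

Because diopters grow as distance shrinks, the mismatch is largest for nearby objects (about 2.8 D at 0.3 m against a 2 m focal plane), which matches the observation that discomfort is worst when looking at objects close to the face.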
This conflict is what Wetzstein, an assistant professor of electrical engineering, and other researchers at Stanford are trying to solve with the headset I tried on, which they call a light field stereoscope—essentially, a device that uses a stack of two LCDs to show each eye a “light field” that makes virtual images look more natural than they typically do.
Mike Orcutt writes: For all the hype around smart glasses, none of them actually look like normal glasses. But Vuzix, which develops wearable display technology for military and industrial applications, plans to change that this summer by releasing a pair of sleek wraparound shades that will let users see colorful images projected over objects in the real world.
Sunglasses made with nanoscale optical technology hint at a near future of inconspicuous head-mounted displays.
Vuzix CEO Paul Travers says his company’s sunglasses will not only be less bulky and obtrusive than Google Glass, they’ll also provide an augmented reality experience that actually resembles the one portrayed in Google’s first promotional video for Glass, in which useful bits of information like navigational cues are displayed in the middle of the wearer’s field of vision. This isn’t possible today with Glass, whose display sits off to the side, above the right eye, and is the visual equivalent of a 25-inch high-definition television seen from eight feet away.