Virtual Reality is making a comeback! Twenty
years after the first wave of hype, the technology is returning
to NASA laboratories.
By Patrick L. Barry and Dr. Tony Phillips
In the Matrix sequels, the metal gates to the city of Zion
are operated by air traffic controllers who work inside a virtual
control tower: a computer-generated, heavenly-white space where
controllers use fancy virtual control panels to guide sci-fi hovercraft.
This fantasy scenario
must seem familiar to anyone who rode the wave of Virtual Reality
hype during the 1980s. Helmet-mounted displays, power gloves, 3D
sights and sounds: these technologies were supposed to make immersive
environments commonplace, revolutionizing everything from video
games to stock market analysis to psychotherapy.
It didn't happen.
"The technology of
the 1980s was not mature enough," explains Stephen Ellis, who leads
the Advanced Displays and Spatial Perception Laboratory at NASA's
Ames Research Center. Virtual Reality helmets and their optics were
too heavy. Computers were too slow. Touch-feedback systems often
didn't work. The only things consistently real about Virtual Reality
were the headaches and motion sickness - common side effects of '80s-era
helmets.
At NASA/Ames,
Dr. Stephen Ellis models a virtual reality helmet.
Twenty years later,
things have improved. Computers are thousands of times faster; Virtual
Reality peripherals are lighter and deliver a greater
sense of feedback and immersion. And, importantly, researchers are
beginning to understand crucial human factors, and are eliminating
nausea and fatigue from the virtual reality experience.
Once again, virtual
reality seems promising, and NASA is interested.
Picture this: an astronaut
on Mars sends a rover out to investigate a risky-looking crater.
Slip-sliding down the crater wall, the rover sends signals back
to the Mars Base where the astronaut, wearing virtual reality goggles
and gloves, feels like she herself is on the slope. Is the find
important enough to risk venturing out in person? Virtual reality
helps decide.
In another scenario,
astronauts could use virtual reality to perform repairs on the outside
of their spacecraft by controlling a human-like robot, such as the
Robonaut being developed at Johnson Space
Center (JSC).
Image credit: John Frassanito & Associates,
Inc.
An artist's concept of the NASA Robonaut.
Ellis, who holds advanced
degrees in psychology and behavioral science, evaluates virtual
reality for space applications. At the moment he's investigating
user interfaces for robots such as AERCam, short for Autonomous
Extravehicular Robotic Camera. These are spherical free-flying robots
being developed at JSC to inspect spacecraft for trouble spots. AERCam
is designed to float outside a vehicle such as the ISS or the space
shuttle, using small xenon-gas thrusters and solid-state cameras
to view the vehicle's outer surfaces and find damage in places (such
as the shuttle's underside) where a human spacewalker or the orbiter's
robotic arm can't safely go.
The current plan is
to use a laptop and a normal, flat monitor to operate AERCam. But
Ellis is conducting research, funded by NASA's Office of Biological
and Physical Research, to see if a virtual environment might be
a better option. With a virtual reality system, the astronaut could
maneuver the melon-sized AERCam with standard hand controls while
intuitive head movements rotate AERCam to let the astronaut "look
around."
Ellis' research is
necessary because, he says, "virtual reality isn't always the best
choice." For example, at the Wright Patterson Air Force Base researchers
have tested virtual reality interfaces for pilots. Time after time,
their tests showed that pilots perform better with traditional panel-mounted
displays.
Why? No one is sure,
but here's one possibility: The field of view of the virtual reality
helmet was narrower than the pilots' natural peripheral vision.
Ellis believes these helmets effectively divorced the pilots from
the cockpit - the environment in which they learned to fly.
"There are some surprisingly
simple ergonomic issues that can interfere with virtual reality
interfaces," adds Ellis. For example, "in the early 1990s Mattel
sold the PowerGlove (a simple virtual reality glove) as a novel
way to control video games. It was cool. But kids quickly discovered
that it's very tiring to hold your hand up in front of you long
enough to play an entire game." You'd have to be an athlete to use
it. (The glove is no longer sold.)
Since the 1980s there
has been a dawning awareness among researchers that human factors
are crucial to virtual reality. Age, gender, health and fitness,
peripheral vision, posture, the sensitivity of the vestibular system:
all of these things come into play. Even self-image matters. One
study showed that people wearing virtual reality helmets like to
glance down and see their own virtual body. It helps "ground them"
in the simulation. And the body should be correct: arms, legs, torso;
male for men; female for women.
For every virtual environment,
there is a human-computer interface, and if the interface doesn't
match the person … game over.
To address these human
factors, Ellis's group performs fundamental research on human senses
and perception. One central concern is how people cope with "latencies,"
or delays, in the virtual reality system. When you swing your head,
does the virtual view follow immediately, or is there a split-second
lag? If your eyes and your inner ear (where vestibular organs sense
orientation) send conflicting reports to the brain, you might need
a motion-sickness bag.
Organs in the inner ear (the vestibular system) affect human
balance. Vestibular adaptation is a key human factor in
Virtual Reality interface design.
"The question is how much delay can
you tolerate?" Ellis says. For movement within the virtual environment
to feel natural, most people need the delay to be less than 15 milliseconds
(thousandths of a second), according to his group's research.
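As a rough illustration of what that budget means (a hypothetical sketch, not drawn from Ellis's data), the delay a user feels is roughly the sum of tracker sampling, rendering, and display times, and the total has to stay under that roughly 15-millisecond figure:

    # Illustrative latency budget for a head-mounted display. The stage
    # times below are made up; only the ~15 ms tolerance comes from the article.

    TOLERANCE_MS = 15.0  # delay most people can accept, per Ellis's group

    stages_ms = {
        "head tracker sampling": 4.0,   # hypothetical values
        "rendering":             8.0,
        "display refresh":       6.0,
    }

    total = sum(stages_ms.values())
    print(f"motion-to-photon delay: {total:.1f} ms")
    if total > TOLERANCE_MS:
        print("over budget -- users may notice lag or feel motion sick")
    else:
        print("within the ~15 ms tolerance")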
Bernard Adelstein,
Durand Begault, and Elizabeth Wenzel, colleagues of Ellis who work
in the Advanced Displays Laboratory at Ames, have discovered that,
because sounds in a virtual environment can be generated much faster
than touch feedback from a virtual reality glove, sound can help
compensate for the delay in touch. For example, when grabbing a
virtual object, the immediate "click" sound of contact enhances
the user's tactile perception of realism.
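A minimal sketch of the idea, assuming a simple event-driven setup (the function names and delays are made up, not the Ames implementation): the contact sound is issued the instant a virtual collision is detected, while the glove's slower force feedback follows a beat later.

    # Hypothetical event-loop fragment: audio feedback is issued immediately on
    # virtual contact, while the slower haptic feedback arrives afterward.
    # Function names and delays are illustrative only.

    import time

    def play_click_sound():
        print(f"[{time.perf_counter():.3f}s] click  (audio, effectively immediate)")

    def drive_glove_actuators():
        print(f"[{time.perf_counter():.3f}s] squeeze (haptic, arrives later)")

    def on_virtual_contact(haptic_delay_s: float = 0.05):
        play_click_sound()          # sound can be generated almost at once
        time.sleep(haptic_delay_s)  # stand-in for the glove's slower response
        drive_glove_actuators()     # the click has already "filled in" the gap

    on_virtual_contact()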
The years of research
are finally beginning to pay off, Ellis says. "The fully immersive,
head-mounted system is getting to be high enough fidelity for practical
use. We'll probably have the AERCam virtual reality experiment running
by August."
The Matrix will take
a little longer.