Walk on Mars!

IBM's Dr. Mark Lucente takes a walk on Mars.
Humans should be able to interact with a computer on their own terms--not
the machine's. That's the idea behind the Visualization Space project at
IBM Research. We're not just talking about an ergonomic mouse, either. The demo
we saw used no input devices other than a human voice and body movement.
The Visualization Space computer scans a room to establish a baseline
background, then creates outlines of human forms as they move through the
space. Using in-depth color analysis, the computer can keep track of hands
and limbs. Users can then point at, grab, and move virtual objects without
resorting to clunky VR gloves or other pointing devices. Vague phrases like
"make it bigger" result in onscreen action, and the application can learn
exactly what each user means by every phrase.
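The article doesn't say how IBM implements any of this, but the room-scanning step it describes is in the spirit of simple background subtraction: capture the empty room as a baseline, then flag pixels that differ from it as a person's outline. The sketch below, in Python with NumPy, is purely illustrative; the function names, threshold, and synthetic frames are assumptions, not IBM's code.

```python
# Minimal background-subtraction sketch (illustrative only, not IBM's method).
# Frames are assumed to be grayscale NumPy arrays of identical shape.
import numpy as np

def capture_baseline(frames):
    """Average several empty-room frames to form the baseline background."""
    return np.mean(np.stack(frames), axis=0)

def extract_silhouette(frame, baseline, threshold=30.0):
    """Mark pixels that differ enough from the baseline as foreground."""
    diff = np.abs(frame.astype(float) - baseline)
    return diff > threshold  # boolean mask roughly outlining the human form

# Tiny synthetic example: an 8x8 "room" with a bright blob standing in for a person.
background_frames = [np.zeros((8, 8)) for _ in range(5)]
baseline = capture_baseline(background_frames)
frame = np.zeros((8, 8))
frame[2:6, 3:5] = 200.0
mask = extract_silhouette(frame, baseline)
print(mask.astype(int))
```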
You can even navigate virtual spaces simply by walking around. Moving to
the left or right gives you a different viewing angle, and to move from
place to place, you need only point at a spot on the screen and say, "Go
there." The demonstration we saw featured a VRML world created with the
recent Pathfinder data, allowing the user to "walk" on Mars.
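Again as a rough illustration rather than a description of IBM's system: the "step left or right to change your viewing angle" behavior could amount to mapping the user's tracked position across the room onto a camera yaw. The room width and angle range below are made-up numbers.

```python
# Hypothetical mapping from a tracked lateral position (meters) to camera yaw (degrees).
def position_to_yaw(user_x, room_width=4.0, max_yaw_deg=45.0):
    """Center the position in the room, then scale it to a yaw angle."""
    centered = (user_x / room_width) - 0.5      # -0.5 (far left) .. +0.5 (far right)
    return centered * 2.0 * max_yaw_deg         # -45 .. +45 degrees

print(position_to_yaw(1.0))   # standing left of center -> negative yaw (-22.5)
print(position_to_yaw(3.0))   # standing right of center -> positive yaw (+22.5)
```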
It's cool, but don't expect to see Visualization Space on your desktop
anytime in the next couple of years: even on a quad
Pentium Pro system with half a gigabyte of RAM, the demo ran slowly.
Friday, Nov 21, 1997