Physical mockups help in making design engineering decisions, but they’re pricey and time-consuming to make. A quicker mockup might combine some physical objects in front of a computer-aided design (CAD) display, but that might not simulate enough of the physical environment and the user’s experience. A more encompassing simulation of the physical world can come from a cave automatic virtual environment (CAVE), but CAVEs are expensive.
That’s where zSpace (zspace.com) comes in: “CAVE on a desktop,” says Veejay Gahir, director of global manufacturing solutions for the Sunnyvale, CA-based firm. zSpace is a system that generates realistic visualizations. It’s immersive, interactive, and “comfortable,” and it lets people manipulate simulated objects as if those objects were real.
The zSpace system consists of four elements: a liquid crystal display (LCD) monitor, polarized glasses, a wired stylus, and a software development kit (SDK) for integrating these components into existing rendering systems. The monitor is a full-color, high-resolution, quad-buffered, stereoscopic display measuring about 24 in., with a resolution of 1920 x 1080 pixels. It runs at 120 Hz, effectively displaying 60 Hz to each eye. (At that speed, rendered displays move smoothly in virtual space.) The glasses are passive: they are not shutter glasses, and they need neither a connection (wired or radio frequency) nor batteries. The monitor has sensors that track the location of the glasses, which also serves as a proxy for where a person is looking. Based on this location, the zSpace software automatically renders right and left 2D views of objects on the monitor in real time, making the 3D objects appear to float in space in front of the user’s eyes.

People interact with this holographic display using the stylus, which contains sensors to determine its position and movement in space and acts as a 3D pointing device. It has three buttons: one for selecting objects and two others for user-defined actions. It also has programmable vibration control for tactile feedback.
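The frame-timing arithmetic is simple enough to sketch. The toy function below models frame-sequential stereo at 120 Hz; the function name and the strict left/right alternation are illustrative assumptions, not zSpace’s published driver behavior:

```python
def frame_schedule(n_frames, display_hz=120):
    """Toy model of frame-sequential stereo: the display alternates
    left- and right-eye frames, so a 120 Hz panel delivers an
    effective 60 Hz to each eye."""
    per_eye_hz = display_hz // 2
    eyes = ["left" if i % 2 == 0 else "right" for i in range(n_frames)]
    return per_eye_hz, eyes

per_eye_hz, eyes = frame_schedule(4)
print(per_eye_hz, eyes)  # 60 ['left', 'right', 'left', 'right']
```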
The 3D image is a technological trick on the brain. People see two different images—projections from the 2D monitor, one for each eye—based on “horizontal binocular parallax,” namely what each horizontally separated eye sees. The brain fuses these two images into one, convincing the person that the displayed objects have depth. Hence the 3D.
However, there’s also motion parallax to contend with. Vision has evolved such that an image’s “3D-ness” also depends on the relative motion of one object to another as people move their heads slightly, usually unconsciously, from side to side. This lateral movement causes the background behind objects in the foreground to repeatedly appear and disappear. (Cats, for instance, induce these movements by swaying their heads from side to side while sizing up objects to pounce upon.) How much an object appears to move relative to the background depends on the distances from a person’s eyes to both that object and the background.
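That distance dependence can be written down directly. In the small-angle approximation, a lateral head movement of b metres shifts a foreground object against the background by roughly b(1/d_object − 1/d_background) radians. A minimal sketch, with an illustrative function name and example distances:

```python
import math

def parallax_shift(baseline_m, d_object_m, d_background_m):
    """Small-angle approximation of the apparent angular shift (radians)
    of a foreground object against the background when the viewpoint
    moves laterally by baseline_m metres."""
    return baseline_m * (1.0 / d_object_m - 1.0 / d_background_m)

# A 5 cm head sway, object at 0.5 m, background at 2 m:
shift = parallax_shift(0.05, 0.5, 2.0)
print(round(math.degrees(shift), 1))  # 4.3
```

The nearer the object (relative to the background), the larger the shift—which is why the cue is such a strong signal of depth.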
zSpace combines these visual cues by working backwards. It establishes the exact orientation and position of the person’s eyes by tracking the X, Y, and Z location of the eyeglasses. Based on that information, the system quickly renders a pair of stereo images to simulate the effect of binocular vision and motion parallax. The resulting views are comfortable to look at—no headaches due to mixed visual messages, no eyestrain, no unfocused or otherwise illogical views. “Visual comfort was one of the major design objectives, maybe the major design objective for our system, because stereo viewing has been done wrong so many times before to the point where the market has rejected it for the most part,” explains Dave Chavez, zSpace’s CTO.
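Working backwards starts with two virtual camera positions derived from the tracked glasses. A minimal sketch, assuming a fixed interpupillary distance and a tracked pose that supplies the glasses’ centre and a right-pointing axis; the names and the 63 mm IPD are assumptions for illustration, not zSpace’s actual SDK:

```python
import numpy as np

IPD = 0.063  # assumed average interpupillary distance, in metres

def eye_positions(glasses_center, right_axis):
    """Return (left_eye, right_eye) positions from the tracked centre of
    the glasses and a vector pointing along the wearer's right.
    Each eye becomes a virtual camera for one half of the stereo pair."""
    center = np.asarray(glasses_center, dtype=float)
    right = np.asarray(right_axis, dtype=float)
    right = right / np.linalg.norm(right)  # tolerate a non-unit input
    half = 0.5 * IPD
    return center - half * right, center + half * right

# Glasses tracked 0.5 m in front of the screen, 0.3 m above its centre:
left_eye, right_eye = eye_positions([0.0, 0.3, 0.5], [1.0, 0.0, 0.0])
```

Each returned position would then feed an off-axis (asymmetric-frustum) projection, so that both rendered views share the physical screen as their projection plane.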
zSpace lets people look around the simulated objects and visualize multiple perspectives simply by moving their head. Using the stylus, people can “grab” and move objects as they wish by reaching into the virtual volume of space in front of the monitor. Depending on the rendering software, people can combine objects, add and drop points, edit geometry, and do other design functions.
Plenty of firms sell virtual reality systems with various forms of visualization. This raises two obvious questions: What do engineers, designers, and managers want, and what are they willing to spend? CAVEs are available for rent at several major universities around the United States. Or for a few hundred thousand dollars, companies can buy such a setup. zSpace is not as immersive as a CAVE, nor does it immediately allow multiple people to view the same displays simultaneously. Virtual reality headgear solves both of those problems, and from a visual perspective, the headgear is very immersive. Until recently, though, the headgear has been physically heavy and relatively expensive. Lightweight, affordable virtual-reality glasses with heads-up displays are soon to come to market. However, as far as sitting at one’s desk, keeping hardware requirements modest, and both comfortably seeing and interacting with a high-precision, high-quality, 3D display, there seems to be only one other company offering something similar to zSpace.
zSpace runs on the Microsoft Windows operating system and professional-level OpenGL-based graphics cards (NVIDIA Quadro and AMD FirePro series) or, at the consumer level, AMD Radeon graphics cards using DirectX. zSpace lists for about $4,000; for developers, $1,500. (Says Chavez, “We’re not in the business of writing applications. We’re encouraging others to do that.”) zSpace plug-ins can work with any CAD program as long as it has stereo-rendering capabilities. For example, there’s a plug-in for Autodesk Maya, a CAD product extensively used for concept modeling, applying textures, animations, force analysis, and other product design and engineering issues. The plug-in is bidirectional: virtual holographic displays are based on the CAD data in Maya, and people can make changes to that data using the zSpace stylus.