The Immersion '94 project led by Michael Naimark at Interval Research Corporation used stereo imagery and image-based modeling and rendering techniques to capture and render a virtual hike down the trails of Banff National Forest. The project team consisted of Michael Naimark, John Woodfill, Paul Debevec, and Leo Villareal. See also the Dimensionalization Studies page maintained by Michael Naimark.


[Images: Right Image | Left Image | Computed Depth Map | Synthetic View One Meter Forward | Synthetic View One Meter Backward]

The top two photos are a stereo pair (reversed for cross-eyed stereo viewing) taken in 1993 by Michael Naimark in Canada's Banff National Forest. In the center is a depth map computed using a stereo correspondence algorithm designed by John Woodfill and Ramin Zabih; intensity indicates depth, with brighter pixels being closer. Pixels the algorithm did not reliably match are indicated in blue. Below are two virtual views generated by casting each pixel out into space based on its depth and reprojecting it into a virtual camera. On the left is the result of virtually moving one meter forward; on the right is the result of virtually moving one meter backward. Note the dark de-occluded areas produced by these virtual camera moves; these areas were not seen in the original stereo pair. In the animations below, such regions were filled in from neighboring stereo pairs.
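The sketch below illustrates the kind of depth-based reprojection described above: each pixel is cast out to a 3-D point using its depth, the virtual camera is moved along the optical axis, and the point is projected back into the new view. It is a minimal illustration only, not the project's actual code; the pinhole parameters f, cx, cy and the translation dz are assumed, unmatched pixels are marked by zero depth, and de-occluded regions are simply left black rather than filled from neighboring stereo pairs.

```python
import numpy as np

def reproject_forward(image, depth, f, cx, cy, dz):
    """Forward-warp `image` into a virtual camera moved `dz` meters
    along the optical axis, using per-pixel metric `depth`.

    Assumes a pinhole camera with focal length `f` (in pixels) and
    principal point (cx, cy). Pixels with depth == 0 are treated as
    unmatched and skipped; unfilled output pixels remain black,
    corresponding to the de-occluded regions mentioned above.
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:h, 0:w]

    valid = depth > 0
    z = depth[valid]

    # Cast each matched pixel out into space (camera coordinates).
    x3 = (xs[valid] - cx) * z / f
    y3 = (ys[valid] - cy) * z / f

    # Moving the camera forward by dz brings the points closer by dz.
    z_new = z - dz
    in_front = z_new > 0

    # Reproject the 3-D points into the virtual camera.
    u = np.round(x3[in_front] * f / z_new[in_front] + cx).astype(int)
    v = np.round(y3[in_front] * f / z_new[in_front] + cy).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)

    src_y = ys[valid][in_front][inside]
    src_x = xs[valid][in_front][inside]
    # A fuller implementation would resolve overlaps with a z-buffer
    # and splat or interpolate to avoid holes between warped pixels.
    out[v[inside], u[inside]] = image[src_y, src_x]
    return out
```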

Michael Naimark presented the Immersion '94 project at SIGGRAPH 95 at the panel session "Museums without Walls: New Media for New Museums".

Click below for a QuickTime movie of the video animations from the project:



immersion94.mov -- 2:26 -- 14,846,354 bytes


Paul E. Debevec / paul@debevec.org