22 February 2008
New Camera Chip Design Can Take Photos in 3D
A CNET blog post by journalist Stephen Shankland reports on interesting research taking place at Stanford University.
Sure, it's already possible to take 3D images with classic sensors, and even to shoot 3D photos with our lovely phones with the help of a small application that merges 2D photos taken from different angles into one 3D image, but this is something a bit different.
Most folks think of a photo as a two-dimensional representation of a scene. Stanford University researchers, however, have created an image sensor that also can judge the distance of subjects within a snapshot.
To accomplish the feat, Keith Fife and his colleagues have developed technology called a multi-aperture image sensor that sees things differently than the light detectors used in ordinary digital cameras.
Instead of devoting the entire sensor to one big representation of the image, Fife's 3-megapixel sensor prototype breaks the scene up into many small, slightly overlapping 16x16-pixel patches called subarrays. Each subarray has its own lens to view the world--thus the term multi-aperture.
Each subarray captures a small portion of the overall image, a portion that overlaps slightly with that of the neighboring subarrays. By comparing the differences, a camera can judge the distance of elements in the subject. (Note that in reality each subimage would be rotated 180 degrees; the mock-up in the article ignores this to make the idea easier to grasp.)
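To get a feel for the numbers: 3 million pixels divided into 16x16 patches (256 pixels each) works out to roughly 11,700 subarrays. And to get a feel for the comparison idea, here is a minimal sketch of depth-from-disparity between two neighboring subimages. This is not Stanford's actual algorithm or sensor geometry; the lenslet baseline, focal length, and pixel pitch below are hypothetical values chosen just for illustration.

```python
# Sketch: estimate distance by comparing two overlapping 16x16
# subimages from neighboring lenslets. All physical parameters
# are hypothetical, not the Stanford prototype's real specs.
import numpy as np

def disparity(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Find the horizontal shift (in pixels) that best aligns two
    overlapping subimages, by minimizing mean absolute difference.
    Shift 0 (infinitely far subject) is skipped to avoid a zero
    divide in the depth formula."""
    best_shift, best_err = 1, float("inf")
    for s in range(1, max_shift + 1):
        err = np.abs(left[:, s:] - right[:, :-s]).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def depth(shift_px: int, baseline_m: float, focal_m: float, pitch_m: float) -> float:
    """Triangulate distance: depth = focal length * baseline / disparity."""
    return focal_m * baseline_m / (shift_px * pitch_m)

# Toy example: one textured scene seen by two lenslets, with the
# right view shifted 3 pixels relative to the left.
rng = np.random.default_rng(0)
scene = rng.random((16, 24))
left, right = scene[:, :16], scene[:, 3:19]

s = disparity(left, right)
print("disparity:", s, "px")
print("estimated depth:",
      depth(s, baseline_m=5e-4, focal_m=2e-3, pitch_m=5e-6), "m")
```

The design point is the same one the article describes: nearby subjects shift more between neighboring subimages than distant ones, so the measured pixel shift maps directly to distance.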
Read the full article at
.:[ Stephen Shankland's blog ]:.