Try doing this on your iPhone: Researchers have developed a prototype "supercamera" that stitches together images from 98 individual cameras (each with a 14-megapixel sensor) to create a 960-million-pixel image with enough resolution to spot a 3.8-centimeter-wide object 1 kilometer away. Applied to a 120°-wide, near-fisheye view of the Seattle skyline (main image), the 93-kilogram camera (inset, upper left) captured enough detail to read the fine print on signs as much as two blocks away (bottom row, third and fourth from left). The camera's optics occupy only 3% of the volume of its 75-centimeter-by-75-centimeter-by-50-centimeter frame—a size needed both to contain the camera's circuit boards and to keep them from overheating, the researchers report online today in Nature. While other camera systems can generate gigapixel-and-larger images, those composite views are stitched together from individual images taken sequentially with one camera as it is panned across the scene; the new system takes all 98 images simultaneously, providing a "stop action" view of a scene. Future, more compact versions could inaugurate the era of handheld gigapixel photography. Such cameras could be useful for any number of military, commercial, or scientific purposes, the researchers suggest, changing the central challenge of photography from "Where should we point the camera?" to "How do we extract useful data from these superhuge images?"
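It's worth checking that the headline numbers hang together. Here's a quick back-of-the-envelope calculation: a 3.8-centimeter object at 1 kilometer subtends about 38 microradians (roughly 8 arcseconds), and covering a 120°-wide field at one pixel per object of that size takes about 55,000 pixel columns. The vertical extent of the field isn't given in the piece, so the row count below is simply whatever the stated 0.96-gigapixel total implies.

```python
import math

# Sanity check on the stated figures: a 3.8 cm object at 1 km,
# and a 0.96-gigapixel composite spanning a 120-degree-wide field.

object_width_m = 0.038
distance_m = 1_000.0

# Small-angle approximation: angle (rad) ~ size / distance.
theta_rad = object_width_m / distance_m            # ~3.8e-5 rad
theta_arcsec = math.degrees(theta_rad) * 3600.0    # ~7.8 arcsec

# Columns needed across the 120-degree width so one pixel covers the object.
fov_arcsec = 120.0 * 3600.0
cols = fov_arcsec / theta_arcsec                   # ~55,000 columns

# Rows implied by the stated 0.96-gigapixel total at that column count.
rows = 0.96e9 / cols                               # ~17,000 rows

print(f"object subtends {theta_arcsec:.1f} arcsec ({theta_rad * 1e6:.0f} microrad)")
print(f"~{cols:,.0f} columns across 120 degrees, implying ~{rows:,.0f} rows")
```

Run as written, this gives roughly 55,000 by 17,000 pixels, which multiplies out to about 0.96 gigapixels, so the resolution claim and the pixel count are consistent with each other.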
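The simultaneous-capture point is the interesting mechanism here: instead of panning one camera and stitching frames taken at different moments, all 98 exposures happen at once and are composited afterward. The sketch below shows only that compositing step in its simplest form; the 7-by-14 grid layout and the tiny tile sizes are illustrative assumptions (the actual instrument arranges its microcameras around shared optics and must correct distortion and blend overlapping fields, none of which is attempted here).

```python
import numpy as np

# Minimal sketch: place 98 simultaneously captured sub-images into one
# mosaic. Grid layout and tile size are toy assumptions, not the real design.
GRID_ROWS, GRID_COLS = 7, 14      # 7 * 14 = 98 microcameras (assumed layout)
TILE_H, TILE_W = 240, 320         # toy tile size standing in for 14 MP

rng = np.random.default_rng(0)
tiles = rng.integers(0, 256, size=(GRID_ROWS, GRID_COLS, TILE_H, TILE_W),
                     dtype=np.uint8)  # stand-ins for the 98 exposures

mosaic = np.zeros((GRID_ROWS * TILE_H, GRID_COLS * TILE_W), dtype=np.uint8)
for r in range(GRID_ROWS):
    for c in range(GRID_COLS):
        mosaic[r * TILE_H:(r + 1) * TILE_H,
               c * TILE_W:(c + 1) * TILE_W] = tiles[r, c]

print(mosaic.shape)  # (1680, 4480) for the toy sizes above
```

Because every tile comes from the same instant, a moving subject appears frozen across the whole mosaic, which is exactly the "stop action" property a panned single camera can't deliver.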