Tuesday, June 25, 2013

Many should be better than one..


Multi-aperture imaging. What is that? Many cameras (lenses/apertures) that work in some synchronized fashion to give you something more than a single camera can.. typically a wider field of view or higher resolution. Multi-aperture imaging has been prominent in many sessions at the Imaging Congress, and I’ll probably write a bit about it here.

The James Webb Space Telescope has many foldable
segments working together to form a single pupil aperture

The volume of an imaging system is roughly proportional to the cube of its aperture diameter, so using multiple smaller cameras instead of one large one results in a much shorter, lighter system. Typically in computational photography we consider multi-view images, where each camera captures an incoherent image. These images can be digitally aligned and combined to get a wider field of view or a higher-resolution image.
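To make the cube-law intuition concrete, here is a minimal toy model (my own illustrative sketch, not from the talks): replace one aperture of diameter D with n sub-apertures of the same total collecting area. At a fixed f-number the track length scales with aperture diameter, and each camera's volume scales with the cube of its diameter.

```python
import math

def multi_aperture_scaling(D, n):
    """Toy scaling model: one aperture of diameter D vs. n sub-apertures
    with the same total collecting area (so each has diameter D/sqrt(n)).
    Returns the sub-aperture diameter, the length ratio, and the ratio
    of total volume to that of the single large camera."""
    d = D / math.sqrt(n)             # sub-aperture diameter
    length_ratio = d / D             # system length vs. single camera
    volume_ratio = n * (d / D) ** 3  # total volume vs. single camera
    return d, length_ratio, volume_ratio

d, length_ratio, volume_ratio = multi_aperture_scaling(100.0, 4)
# splitting a 100 mm aperture into 4 x 50 mm: half the length, half the volume
```

Under these (admittedly crude) assumptions, four half-diameter cameras collect the same light in half the track length and half the total volume.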


The Very Large Array in New Mexico
is a radio (not light) telescope array that uses aperture synthesis

Sam Thurman from Lockheed Martin spoke yesterday on this topic at the Computational Optical Sensing and Imaging (COSI) meeting. He first spoke about physically combining the optical fields from each imager using delay lines and optical phasing, resulting in a single incoherent image capture. This gives better image quality and resolution than combining separate incoherent images. The SNR of a multi-aperture system depends on its aperture fill factor, so the arrangement of your apertures is important. If the fill factor is reduced, the exposure may need to increase to compensate, and gaps in the passband of the system can result in a loss of resolution.
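A rough sketch of the fill-factor idea (my own toy model, assuming circular, non-overlapping sub-apertures and a simple 1/fill-factor exposure penalty):

```python
import math

def fill_factor(sub_diameters, synthetic_diameter):
    """Fraction of the synthetic (circumscribing) pupil area that is
    actually covered by the sub-apertures. Toy model: circular,
    non-overlapping apertures inside a circular synthetic pupil."""
    filled = sum(math.pi * (d / 2.0) ** 2 for d in sub_diameters)
    full = math.pi * (synthetic_diameter / 2.0) ** 2
    return filled / full

# three 30 mm apertures arranged inside a 100 mm synthetic pupil
ff = fill_factor([30.0, 30.0, 30.0], 100.0)
exposure_scale = 1.0 / ff  # crude estimate of the exposure increase needed
```

Here only 27% of the synthetic pupil is filled, suggesting (very roughly) an exposure increase of about 3.7x to recover the lost light.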

Then Sam discussed coherent multi-aperture imaging with active laser illumination and digital holography. In this case each aperture records a digital hologram. A Fourier transform, shift, crop and inverse Fourier transform of this hologram gives the reconstructed object field. Obtaining the object fields from each aperture, then digitally phase-aligning and combining them, eliminates the physical delay-line hardware and results in an even lighter system. Sam showed great images taken with such coherent multi-aperture systems. (Note: I'll add images of systems like Sam's if I find free images.. so far the JWST and the VLA radio array were the most easily accessible ones online that somewhat illustrate the idea.)
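The FT, shift, crop, inverse-FT recipe for a single aperture can be sketched in a few lines of NumPy. This is my own illustrative version, not Sam's code: the parameters locating and sizing the +1-order sideband in the Fourier plane are hypothetical and would depend on the actual off-axis reference geometry.

```python
import numpy as np

def reconstruct_object_field(hologram, order_center, half_width):
    """Sketch of off-axis digital-hologram reconstruction:
    Fourier transform, shift to the +1-order sideband, crop it,
    then inverse transform to recover the complex object field.
    `order_center` (row, col) and `half_width` are illustrative
    parameters set by the reference-beam tilt."""
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    r, c = order_center
    h = half_width
    sideband = spectrum[r - h:r + h, c - h:c + h]  # crop the +1 order
    # inverse transform of the re-centered sideband: complex object field
    return np.fft.ifft2(np.fft.ifftshift(sideband))

field = reconstruct_object_field(np.random.rand(64, 64), (32, 40), 8)
```

The returned field is complex, which is exactly what makes the digital phase-alignment across apertures possible.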

The Applied Industrial Optics meeting has an excellent session planned for Wednesday afternoon, with several folks speaking about the industrial design, manufacture and application of multi-aperture imaging systems. Interested? I’ll be there!

Monday, June 24, 2013

Adding color..

Much of my time is spent with light – coaxing it to do what I need. Most often I just pop a sensor like a CCD or a CMOS in front of my beam, get the image out, and get to work processing the data to reconstruct, estimate, optimize. But when we live in the world of computational imaging, combining optics and image processing, there’s no way we can take this detector for granted. So in today’s post, it’s hats off to the folks who made a device that enables so many things!

Bayer Color Filter Array
Ref: http://en.wikipedia.org/wiki/File:Bayer_pattern_on_sensor.svg

Light detection is a complicated, ever-changing world. From projecting light temporarily on screens, to film, to digital detectors, the history of image capture has been spectacular!

On the first day of OSA’s Imaging Congress in Arlington, VA, we saw Michael Kriss pay tribute to Bryce Bayer, who is credited with the Bayer color filter – the distinctive mosaic of green, red and blue filters (with green sampled twice as densely) on the sensor pixels of most digital color cameras. The choice of wavelengths, their passbands, relative proportions and arrangement that distinguish the Bayer pattern were developed to let digital cameras emulate the tri-color vision response that we humans so easily take for granted. Since it was first developed, the pattern has been used everywhere, spawning associated work on demultiplexing the colors, displaying color images, interpolating the missing colors at each pixel from the three captured channels.. the list goes on.
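To show what the mosaic actually does to an image, here is a small sketch of my own that samples a full-color image onto a Bayer pattern, using the common RGGB ordering (one of several variants in use):

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color image (H, W, 3) onto a single-channel Bayer
    mosaic, RGGB ordering. A real camera pipeline would follow this
    with demosaicking to interpolate the two missing colors at every
    pixel; this sketch only shows the sampling step."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green (twice the density)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue
    return mosaic

# tiny example: constant channels R=1, G=2, B=3 over a 2x2 image
demo = np.zeros((2, 2, 3))
demo[..., 0], demo[..., 1], demo[..., 2] = 1.0, 2.0, 3.0
sampled = bayer_mosaic(demo)  # yields [[1, 2], [2, 3]]
```

Each pixel keeps only one of the three colors, which is why demosaicking became such a rich body of follow-on work.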

With the many advances in patterning and fabrication processes, and with the explosion in demand for images, many different filters and structures have since found their way onto sensors. People are now interested in more than just RGB wavelengths: there is a need for polarization imaging, combined IR+RGB imaging, high-dynamic-range imaging, and many other modalities.

Still… the basic color image continues to be the most loved and shared medium of capturing a personal moment.

How nice it would be to have your work reach so many across the world.. inspiring more and better work!