

The Bouman Lectures on Image Processing

A Slecture by Maliha Hossain

Subtopic 3: Space Domain Models for Optical Imaging Systems

© 2013




Excerpt from Prof. Bouman's Lecture



Accompanying Lecture Notes


PSF, OTF and MTF

Imaging systems are well approximated by linear space-invariant systems theory, so that is what we will use to describe them below.

Fig 1: Imaging a Point Source


I was a little thrown off by the professor stating during lecture that "the aperture is the H(u,v)," so after a bit of reading, I concluded that the size of the aperture and the bandwidth of the PSF of the system are directly related. None of what I am about to say in the next two paragraphs will be on the test. Think of it as a review.

If you want to look at it in the frequency domain, a larger aperture corresponds to greater bandwidth. Higher frequencies are allowed to pass through the system, resulting in greater resolution in the space domain.

If you want to think about it in the space domain and are familiar with Rayleigh's criterion and diffraction, you will recall that for parallel rays incident on a circular aperture, the angular resolution $ \theta $, and the spatial resolution $ \Delta l $ are given by

$ \sin\theta = 1.22\frac{\lambda}{A} $

$ \Delta l = 1.22\lambda N = 1.22\lambda \frac{d_f}{A} $

where $ \lambda $ is the wavelength of the light entering the imaging system, $ A $ is the diameter of the aperture, $ N $ is the f-number, and $ d_f $ is the distance to the focal plane. So we see that the larger the aperture, the smaller the values of $ \theta $ and $ \Delta l $. In other words, the larger the aperture, the smaller the minimum resolvable detail.
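To make this concrete, here is a small numerical sketch that plugs sample values into the two formulas above. The wavelength, aperture diameter, and focal distance are made-up example numbers, not values from the lecture.

```python
import numpy as np

# Hypothetical numbers: green light through a 25 mm aperture,
# with the focal plane 50 mm behind the lens.
wavelength = 550e-9   # lambda, wavelength of light [m]
A = 25e-3             # aperture diameter [m]
d_f = 50e-3           # distance to the focal plane [m]

# Rayleigh criterion: sin(theta) = 1.22 * lambda / A
theta = np.arcsin(1.22 * wavelength / A)      # angular resolution [rad]

# Spatial resolution on the focal plane: delta_l = 1.22 * lambda * d_f / A
delta_l = 1.22 * wavelength * d_f / A         # [m]

print(f"angular resolution: {theta:.3e} rad")
print(f"spatial resolution: {delta_l * 1e6:.2f} um")

# Doubling A halves both quantities, i.e. finer detail becomes resolvable.
```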

We can characterize the lens for a given aperture by its impulse response. Its Point Spread Function (PSF) is analogous to its impulse response since the PSF describes the system's response to a point input (think about a point input as $ \delta (x,y) $).

The PSF will be denoted by the function $ h(x,y) $ in the space domain. Its CSFT will be given by $ H(u,v) $.

Ideally, a point input would be represented in an image as a single point, or in a digital system, as a single pixel. Let this ideal image be $ f(x,y) $. Consider this to be the input to your imaging system.

Your actual image of the point input however, will be reproduced as something other than a single pixel. This is the output of your imaging system. Let the output of the system be $ g(x,y) $.

Let us assume, for simplicity, that your system is space-invariant. This means that input impulses at different $ x, y $ coordinates yield shifted copies of the same PSF, $ h(x,y) $. The image you form on the focal plane array is given by the convolution of the ideal image you should have formed with the PSF of the system.

$ \begin{align} g(x,y) &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}f(\xi,\eta)h(x-M\xi,y-M\eta)d\xi d\eta \\ &= \frac{1}{M^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}f(\frac{\xi}{M},\frac{\eta}{M})h(x-\xi,y-\eta)d\xi d\eta \end{align} $

So if you take away the magnification factor (or assume that $ |M| = 1 $), the resulting image is like the convolution of what the image should have been with the PSF of the system.

Alternatively, you could define the function

$ \tilde{f}(x,y) := f\left(\frac{x}{M},\frac{y}{M}\right) $

Then the imaging system acts like a 2-D convolution where

$ g(x,y) = \frac{1}{M^{2}} h(x,y)* \tilde{f} (x,y) $
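As a rough illustration of this model, the sketch below blurs an ideal single-point image with a Gaussian PSF. The Gaussian kernel and unit magnification are assumptions chosen only to keep the example simple; they are not the PSF or magnification of any real system from the lecture.

```python
import numpy as np
from scipy.signal import convolve2d

M = 1.0                                   # magnification (|M| = 1 for simplicity)
N = 128                                   # image size in pixels

# Ideal image f: a single point (discrete delta) at the center.
f = np.zeros((N, N))
f[N // 2, N // 2] = 1.0

# With M = 1, f_tilde(x, y) = f(x/M, y/M) is just f itself.
f_tilde = f

# Example PSF h: an isotropic Gaussian kernel standing in for the real PSF.
x = np.arange(-8, 9)
X, Y = np.meshgrid(x, x)
h = np.exp(-(X**2 + Y**2) / (2 * 2.0**2))
h /= h.sum()                              # normalize so H(0,0) = 1

# Output image g = (1/M^2) h * f_tilde: the point is spread into a copy of the PSF.
g = convolve2d(f_tilde, h, mode="same") / M**2

# The blur peaks at the image center and equals the PSF's peak value there.
print(g.max(), g[N // 2, N // 2])
```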

Real imaging systems are not perfectly space invariant. Consequently, the PSF varies as you move around on the image plane. You can observe this if you go out on a starry evening, place a camera on a tripod, and photograph some of the stars in the sky. The stars are like perfect point sources. The image you capture with your camera will have points on it, but if you zoom in on a point using your computer, you will notice that it is not really a point; it will be a blur spanning a cluster of pixels. If you try different apertures, you will also notice that a larger f-stop gives a bigger blur and vice versa.

So now you have essentially put the delta function, $ \delta (x,y) $, through your system to obtain the PSF, because the star acts like a delta function. If you take its Fourier transform, you get the frequency response of the system. Since your system is space-variant, you will notice that stars close to the edge of your photo produce PSFs that are different from those of stars near the center of the photo.
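If you wanted to try this numerically, a crude sketch might look like the following. The helper local_psf_and_otf and the synthetic "sky" array are hypothetical stand-ins for an actual night-sky photograph; they are not part of the lecture.

```python
import numpy as np

def local_psf_and_otf(image, half_width=16):
    """Estimate the PSF around the brightest pixel and return (psf, H)."""
    r, c = np.unravel_index(np.argmax(image), image.shape)
    patch = image[r - half_width:r + half_width,
                  c - half_width:c + half_width].astype(float)
    patch -= np.median(patch)              # crude background (sky) removal
    patch = np.clip(patch, 0, None)
    psf = patch / patch.sum()              # normalize so the PSF sums to 1
    H = np.fft.fftshift(np.fft.fft2(psf))  # local frequency response estimate
    return psf, H

# Usage on a synthetic "star": repeating this for stars near the center and
# near the edges of a real photo would reveal the space-variance of the PSF.
sky = np.random.default_rng(0).normal(10, 1, (256, 256))
sky[100, 180] += 500                       # one bright, nearly point-like star
psf, H = local_psf_and_otf(sky)
```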

At this point, let's define two other functions that characterize the performance of an imaging system. People are often interested in the magnitude of $ H(u,v) $ normalized by $ H(0,0) $. This function is called the Modulation Transfer Function (MTF) of the system. It is the absolute value of the Optical Transfer Function (OTF) of the system. So we have that

$ \begin{align} \text{OTF}(u,v) &= \frac{H(u,v)}{H(0,0)} \\ \Rightarrow \text{MTF}(u,v) &= \left|\frac{H(u,v)}{H(0,0)}\right| \end{align} $

Recall that

$ H(0,0) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(x,y) \,dx\,dy $
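Here is a short sketch of how these quantities can be computed from a sampled PSF. The Gaussian h below is just an assumed example PSF, and the DC normalization uses the fact that $ H(0,0) $ equals the sum (discrete integral) of the PSF.

```python
import numpy as np

def otf_and_mtf(h):
    """Compute the OTF and MTF of a sampled PSF h[m, n]."""
    H = np.fft.fft2(h)
    H00 = H[0, 0]                  # equals h.sum(), the integral of the PSF
    otf = H / H00
    mtf = np.abs(otf)              # MTF is the magnitude of the OTF
    return np.fft.fftshift(otf), np.fft.fftshift(mtf)

# Example with an assumed Gaussian PSF: a wider PSF gives a narrower MTF.
x = np.arange(-8, 9)
X, Y = np.meshgrid(x, x)
h = np.exp(-(X**2 + Y**2) / (2 * 2.0**2))

otf, mtf = otf_and_mtf(h)
assert np.isclose(mtf[mtf.shape[0] // 2, mtf.shape[1] // 2], 1.0)  # MTF(0,0) = 1
```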

Figure 2 shows the one-sided MTFs of an imaging system along the individual axes in the frequency domain. The units on the horizontal axes are cycles per pixel, but they are also often measured in cycles per inch. MTFs may be quoted in terms of their 3 dB cutoff frequencies. A system with a wider MTF will have higher resolution since it passes higher frequencies, corresponding to a narrower PSF in the space domain (less blur). Conversely, a system with a narrow MTF will have lower resolution, corresponding to a wider PSF in the space domain (a blurry image).

Fig 2: One-Sided MTFs of a Hypothetical Imaging System along Different Axes in the Frequency Domain



References

  • C. A. Bouman. ECE 637 class lecture, "Digital Image Processing I." Faculty of Electrical Engineering, Purdue University, Spring 2013.
  • J. Allebach. ECE 438 legacy lecture notes, "Digital Signal Processing." Faculty of Electrical Engineering, Purdue University.
  • L. Guo, Z. Wu, L. Zhang, and J. Ren. "New Approach to Measure the On-Orbit Point Spread Function for Spaceborne Imagers." Opt. Eng. 52(3), 033602, Mar. 4, 2013.



Questions and comments

If you have any questions, comments, etc. please post them on this page



