Binary Detection Using Multi-Hypothesis Log- Likelihood, Image Processing

Size: px
Start display at page:

Download "Binary Detection Using Multi-Hypothesis Log- Likelihood, Image Processing"

Transcription

1 Air Force Institute of Technology AFIT Scholar Theses and Dissertations Binary Detection Using Multi-Hypothesis Log- Likelihood, Image Processing Brent H. Gessel Follow this and additional works at: Recommended Citation Gessel, Brent H., "Binary Detection Using Multi-Hypothesis Log-Likelihood, Image Processing" (2014). Theses and Dissertations This Thesis is brought to you for free and open access by AFIT Scholar. It has been accepted for inclusion in Theses and Dissertations by an authorized administrator of AFIT Scholar. For more information, please contact

2 BINARY DETECTION USING MULTI-HYPOTHESIS LOG-LIKELIHOOD, IMAGE PROCESSING THESIS Brent H. Gessel, Captain, USAF AFIT-ENG-14-M-34 DEPARTMENT OF THE AIR FORCE AIR UNIVERSITY AIR FORCE INSTITUTE OF TECHNOLOGY Wright-Patterson Air Force Base, Ohio DISTRIBUTION STATEMENT A: APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED

3 The views expressed in this thesis are those of the author and do not reflect the official policy or position of the United States Air Force, the Department of Defense, or the United States Government. This material is declared a work of the U.S. Government and is not subject to copyright protection in the United States.

4 AFIT-ENG-14-M-34 BINARY DETECTION USING MULTI-HYPOTHESIS LOG-LIKELIHOOD, IMAGE PROCESSING THESIS Presented to the Faculty Department of Electrical and Computer Engineering Graduate School of Engineering and Management Air Force Institute of Technology Air University Air Education and Training Command in Partial Fulfillment of the Requirements for the Degree of Master of Science in Electrical Engineering Brent H. Gessel, B.S.E.E. Captain, USAF March 2014 DISTRIBUTION STATEMENT A: APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED

5 AFIT-ENG-14-M-34 BINARY DETECTION USING MULTI-HYPOTHESIS LOG-LIKELIHOOD, IMAGE PROCESSING Brent H. Gessel, B.S.E.E. Captain, USAF Approved: //signed// Stephen C. Cain, PhD (Chairman) Date 20 Feb 2014 //signed// Keith T. Knox, PhD (Member) Date 13 Feb 2014 //signed// Mark E. Oxley, PhD (Member) Date 20 Feb 2014

6 AFIT-ENG-14-M-34 Abstract One of the United States Air Force missions is to track space objects. Finding planets, stars, and other natural and synthetic objects are all impacted by how well the tools of measurement can distinguish between these objects when they are in close proximity. In astronomy, the term binary commonly refers to two closely spaced objects. Splitting a binary occurs when two objects are successfully detected. The physics of light, atmospheric distortion, and measurement imperfections can make binary detection a challenge. Binary detection using various post processing techniques can significantly increase the probability of detection. This paper explores the potential of using a multi-hypothesis approach. Each hypothesis assumes one two or no points exists in a given image. The loglikelihood of each hypothesis are compared to obtain detection results. Both simulated and measured data are used to demonstrate performance with various amounts of atmosphere, and signal to noise ratios. Initial results show a significant improvement when compared to a detection via imaging by correlation. More work exists to compare this technique to other binary detection algorithms and to explore cluster detection. iv

7 Table of Contents Abstract Page iv Table of Contents List of Figures v vii List of Tables x List of Acronyms xi I. Introduction Binary detection Space situational awareness Research objectives Organization II. Background Post-Process Imaging Atmospheric Turbulence Deconvolution Knox-Thompson Bispectrum Imaging by Correlation Binary detection by post-process image reconstruction summary Multi-Hypothesis Detection Chapter Summary III. Methodology Derivation of multi-hypothesis algorithms Zero point source derivation Single point source derivation Two point source derivation Software implementation Point source threshold zero point source implementation v

8 Page Single point source implementation Two point source implementation Decision process Simulation test model Modeling atmospheric effects Test variables Performance measurements for simulated data Simulated comparison model Measured data test model Description of measured data Deriving a valid PSF Determining success of measured data test IV. Results Multi-hypothesis binary detection simulation results Image generation Log-likelihood bias correction Probability of Detection (P D ) sample results Probability of false alarm (P f a ) simulation results P D and P f a result summary Comparison with imaging by correlation technique Probability of False Alarm (P f a ) comparison results Probability of Detection (P D ) comparison results Comparison result summary Measured data processing results First image: two points spaced far away Second image: two points in close proximity Third image: two points touching Fourth image: single point Measured data result summary V. Future work Summary Future work Bibliography vi

9 List of Figures Figure Page 2.1 Basic adaptive optic process Speckle interferometry simulation with 200 independent frames. (a) The binary source image, (b) the simulated average of short exposure images, (c) speckle transfer function, (d) speckle transfer function showing fringe spacing Stack of 50 short exposure images, each simulated through a unique random phase screen with r o = 30 cm Result of cross spectrum method with r o = 30cm, original image (a), cross section of original image (b), reconstructed image (c), cross section of reconstructed image (d) Results with different numbers of iterations: (a) 1, (b) 5, (c) 10, (d) 15, (e) 80, and (f) 100 iterations Results of the correlation technique: (a) original image, (b) original image cross section, (c) reconstructed image, and (d) reconstructed image cross section Overview of binary hypothesis method. Chapter 3 will work through derivations of each step Sample simulated detected images of binary source with 500 photons for each point. Samples taken for various background noise and D/r 0 values as given next to each image. Zoomed in to 31x31 pixels Sample simulated detected images of single point sources used in calculating false alarm rates. Samples taken for various background noise and D/r 0 values as given next to each image. Zoomed in to a 31x31 pixel area vii

10 Figure Page 4.3 False Alarm rate versus D/r 0 for a point source with an intensity of 1000 photons for, (a) background noise level=1, (b) background noise level=2, (c) background noise level= False Alarm rate versus D/r 0 for a point source with an intensity of 1500 photons for, (a) background noise level=1, (b) background noise level=2, (c) background noise level= False Alarm rate versus D/r 0 for a point source with an intensity of 2000 photons for, (a) background noise level=1, (b) background noise level=2, (c) background noise level= Detection rate versus D/r 0 for a binary source with an intensities of 500 and 500 photons for, (a) background noise level=1, (b) background noise level=2, (c) background noise level= Detection rate versus D/r 0 for a binary source with an intensities of 1000 and 500 photons for, (a) background noise level=1, (b) background noise level=2, (c) background noise level= Detection rate versus D/r 0 for a binary source with an intensities of 1000 and 1000 photons for, (a) background noise level=1, (b) background noise level=2, (c) background noise level= First measured image (a) Detected image, (b) Result of hypothesis one, (c) Result of hypothesis two. The log-likelihood of hypothesis two was larger and therefore a binary was detected First measured image (a) Detected image, (b) Result of hypothesis one, (c) Result of hypothesis two. The log-likelihood of hypothesis two was larger and therefore a binary was detected viii

11 Figure Page 4.11 Third measured image (a) Detected image, (b) Result of hypothesis one, (c) Result of hypothesis two. The log-likelihood of hypothesis two was larger and therefore a binary was detected Forth measured image (a) Detected image, (b) Result of hypothesis one, (c) Result of hypothesis two. The log-likelihood of hypothesis one was greater and so a binary was not detected ix

12 List of Tables Table Page 2.1 First 10 Zernike Circular Polynomials [33] Common symbols Probability of Detection (P D ) results using binary source with 500/500 photons Probability of Detection (P D ) results using binary source with 1000/500 photons Probability of Detection (P D ) results using binary source with 1000/1000 photons Probability of False Alarm (P f a ) results using point source with 1000 photons Probability of False Alarm (P f a ) results using point source with 1500 photons Probability of False Alarm (P f a ) results using point source with 2000 photons.. 57 x

13 List of Acronyms Acronym Definition CCD Charge-Coupled Device P D Probability of Detection P f a Probability of False Alarm PSF Point Spread Function OTF Optical Transfer Function FT Fourier Transform PSD Power Spectral Density LiDAR Light Detection And Ranging PMF Probability Mass Function SST Space Surveillance Telescope DARPA Defense Advanced Research Projects Agency GEO Geostationary Earth Orbit xi

14 BINARY DETECTION USING MULTI-HYPOTHESIS LOG-LIKELIHOOD, IMAGE PROCESSING I. Introduction Comparing statistical hypotheses to determine the likelihood of a given event is a proven technique used in many fields, particularly digital communication. The application of a multi-hypothesis test algorithm to the detection of binary stars or other space objects is an area of new exploration. This thesis will explore the usefulness of using a multi-hypothesis technique to resolve close binary objects in space. The first task is to derive a multi-hypothesis algorithm specifically for binary detection and then provide results for various simulated imaging conditions as well as measured binary images. Simulated results will be compared with another technique to explore potential detection improvements. 1.1 Binary detection The ability to discriminate between closely-spaced objects in space has and continues to be a challenge. Finding planets, stars, and other natural celestial objects, as well as trying to keep track of satellites and space debris are all impacted by how well the tools of measurement can distinguish between these objects in close proximity. In astronomy, the term binary commonly refers to two stars in the same system. Splitting a binary occurs when two stars are successfully identified. Some binaries are easily split using basic equipment. As the source intensity, object distance, and/or atmospheric distortion varies, the binaries will appear as just a single object. In this paper, the term binary refers to any two closely spaced objects. Several image post processing techniques have been developed 1

15 to increase spatial resolution and/or look for patterns common to binary objects. Most of the commonly used methods today focus on image reconstruction and deblurring. This thesis will show that if an image s Point Spread Function (PSF) is known and the source is a single or double point source, the log-likelihood of the most probable hypothesis provides a statistical comparison of how likely a blurred image contains a binary. 1.2 Space situational awareness One of the United States Air Force missions is to track space objects, particularly of the synthetic kind. Due to the physics of spatial resolution, it is extremely difficult to resolve satellites in geosynchronous orbit (35,786 km). At this distance the size of a satellite is typically smaller than one pixel on a high quality Charge-Coupled Device (CCD) camera. For example, if a satellite in geosynchronous orbit is 15 meters wide (the length of a full size school bus) and a telescope with an aperture of 4 meters and focal length of 64 meters is focused on it, the size on the CCD would be: size = (sat.size)( f ocal.length) distance = (15m)(64m) 35, 786, 000m = 2.64µm. (1.1) This is much smaller than a typical CCD pixel used in astronomical telescopes. This is also smaller than the Raleigh criteria for spatial resolution for this same setup [14]: Resolution (1.22)(λ)( f /#) = (1.22)(550nm)(16) = 8.48µm (1.2) where λ is the wavelength of light and f /# is the f-number, which is the focal length divided by the diameter of the optical system. The atmosphere will further blur this image and after a certain amount of exposure time the image will typically be a few to several pixels of light. At this distance, it can be extremely difficult to tell if there is one, two or several objects in the spot of light. The multi-hypothesis method discussed in this thesis has the potential to increase the probability of correctly detecting binaries at geosynchronous orbit and other scenarios important to the USAF. 2

16 1.3 Research objectives The question posed in this thesis is how well, if at all, can a multi-hypothesis model correctly detect a binary pair in a blurred image? To answer that question four research objectives are documented. First, a statistical model is derived in Section 3.1. Here the derivation of algorithms for zero, one, and two point sources are shown. The second objective is to successfully implement the mathematical algorithms into a software simulation model. This model includes atmospheric effects, background noise, and creates a test environment to work out threshold parameters that maximize the probability of detection while minimizing the false alarm rate. The simulation will be done in Matlab and is covered in Section 3.2. Once the binary hypothesis method is successfully implemented and results measured, it is important to compare them to another modern technique. The third objective is to compare results from another image detection method, specifically imaging by correlation with a threshold set for binary detection. Section 3.3 walks through an implementation of imaging by correlation in Matlab and results are provided in Chapter 4. The final objective is to use measured imagery data where the number of objects is known and see how well the algorithm correctly detects when a binary is present. The imagery data used is focused on a satellite in geosynchronous orbit with a dim star passing by at various distances, including the situation where the two appear as a single object. The decision to select a binary from the algorithm is compared against the truth data and presented in Chapter Organization This research document is organized in accordance with AFIT s thesis guidelines. Chapter 2 will discuss current methods used to detect binaries and contrast them with the proposed, unpublished, multi-hypothesis binary detection method. Chapter 3 provides the 3

17 derivation and simulation methods used to meet the thesis objectives outlined in the previous section. Chapter 4 contains the results of several of the simulation tests as well as the results of processing measured data. Chapter 5 discusses conclusions and opportunities for continued research and operational testing. Complete references of all sources cited are contained in the bibliography. Every attempt has been made to use consistent variables to enhance readability. 4

18 II. Background High spatial resolution of an imaging system is key to achieving detection of binary objects. A standard telescope system cannot achieve diffraction-limited resolution due to factors such as atmospheric and optical aberration. Several techniques have been developed to close the gap between diffraction-limited resolution and a system s actual resolution. This chapter will discuss some of the most common post-processing techniques to increase spatial resolution in astronomical imaging. Additionally, a brief overview of multi-hypothesis statistics is provided. Binary objects do not all behave in the same way. Although the imaging techniques discussed in this chapter can be used on all binaries, specific methods have been developed to look for planets and binary star systems. As gravitational fields of large planets and stars interact, a periodic movement, often referred to as wobble, can be detected [23]. Other indicators such as periodic eclipsing and Doppler-like wavelength patterns can infer the presence of a binary [6]. These methods have been successful on a subset of binaries but cannot be applied to binaries in general. This paper will not focus on gravitational, orbital or other spectroscopic measurements techniques, rather the focus will be on postprocessing techniques useful in detecting any binary. Aside from the natural effect of diffraction, the earth s atmosphere plays a large role in limiting spatial resolution. Virtually all real-time and post-processing de-blurring techniques require an understanding of how the atmosphere is changing the light. Because we know what a diffraction-limited point source looks like, we can compare it to a measured point source through the same atmosphere and use that information to correct the atmospheric distortion [14][12]. Lightwaves from distant point source(s) can be estimated as plane waves right before reaching earth s atmosphere [14]. The atmosphere will randomly distort this wave and the distortion can be measured by a wavefront sensor 5

19 [33]. Unfortunately, it is impossible to have a perfect natural point source everywhere in the night sky at visible wavelengths [11][37][38][39]. To overcome this limitation, one or more lasers can be used to create point sources, or guide beacons, in the area of interest [7][9][10][11][15][37]. A wavefront sensor takes the point source, or guide star, information and determines corrective adjustments then sends that data to an adaptive optics system [32]. Real world performance of adaptive optics can vary from near diffraction limited correction to no visible improvement depending on a host of factors [1][8][16][17][28][36][40]. The need for a valid point source cannot be understated since it is foundational to adaptive optics and the post-processing techniques described in this chapter. A great deal of research still remains in the field of generating and measuring quality point sources. Although adaptive optics is an important technique in moving closer to diffraction limited imaging, it is not currently a practical solution for all imaging sites. Having one or more image post-processing software solutions is a relatively affordable way to augment or supplement adaptive optic systems. It is the area of software post-processing that this paper will focus from this point on. The following sections will discuss two of the most common post-processing methods for binary detection, deconvolution and speckle interferometry. Section 2.2 finishes the chapter with a brief and general look at how multi-hypothesis statistics can be used in binary detection. 6

20 POINT SOURCE LIGHTWAVES RANDOM ATMOSPHERIC TURBULENCE TELESCOPE BEAM SPLITTER CCD ADAPTIVE MIRROR WAVEFRONT SENSOR MIRROR CONTROL Figure 2.1: Basic adaptive optic process. 7

21 2.1 Post-Process Imaging Image post-processing can be an effective and affordable way to increase image resolution. This section will look at two deconvolution methods and a speckle interferometry technique useful for visual binary detection. These processes attempt to reconstruct a higher resolution image from measured data. It is important to note that the multi-hypothesis method does not produce a reconstructed image so it is a fundamentally different approach but still fits within the post-processing category of binary detection Atmospheric Turbulence. Before discussing specific imaging techniques it is important to explain how atmospheric turbulence is modeled in this thesis. The most common methods utilize the approximation that atmospheric effects can be represented as wavefront errors in the pupil plane. If A(u, v) represents the two-dimensional, clear pupil and W(u, v) represents wavefront error as a function of position with respect to a fixed Cartesian coordinate system, (u, v), then the atmospherically blurred image, i(x, y), can be represented as: i(x, y) = F { A(u, v)e jw(u,v)} 2 (2.1) where F is the symbol denoting the Fourier Transform. The wavefront error as a function of position, commonly called a phase screen, can be simulated in various ways. One of the most common methods of phase screen generation utilizes Power Spectral Density (PSD) models, such as von Kármán and modified von Kármán [31]. These are referred to as Fourier Transform (FT) based methods [31]. The model used in the simulations of this paper are based upon another method that uses Zernike polynomials to generate phase screens. It has been shown that the wavefront error, W(u, v), can be expanded with basis functions based on the geometry of the aperture [33]. Common basis functions include, Zernike circular, Zernike annular, Gaussian-circular, and Gaussian-annular. A Zernike circular set of basis functions were used in this paper to match the geometry of the 8

22 simulated aperture. Expansion of the wavefront error, or phase screen, using Zernike circular polynomials is described below. Note the shift to polar coordinates where, ρ = u 2 + v 2 and θ =arctangent(u, v), also note u and v represent grid locations in the pupil plane. Zernike expansion equations are as follows: W(u, v) = W z (ρ, θ) = α i Z i (ρ, θ) 2(n + 1)R m n (ρ)g m (θ) if m 0 Z i (ρ, θ) = R 0 n(ρ) if m = 0 (n m)/2 R m ( 1) s (n s)! n (ρ) = s!( n+m s)!( n m s=0 2 sin(mθ) if i odd G m (θ) = cos(mθ) if i even i 2 )!ρn 2s (2.2) where combinations of the index variables m and n will produce a specific aberration effect, also the index i is a numerical index. Table 2.1 below shows the mapping of the first 10 Zernike circular polynomials and the index mapping of m, n, and the numerical index, i. The wavefront error, W(ρ, θ) can be measured or simulated. It is worth repeating that the wavefront error is using Zernike polynomials: W z (ρ, θ) = α i Z i (ρ, θ). (2.3) To simulate phase screens we need to generate the coefficients, α i, to weight each polynomial at a given polar coordinate, ρ, θ. To do this the work of Roddier was utilized, who demonstrated that by using a Cholesky decomposition of the covariance matrix of the Zernike coefficients, statistically accurate atmospheric phase screens can be generated [30]. The following discussion will provide a basic explanation of this method. First we note that α in Equation 2.3 will be an N 1 vector where N is the number of Zernike polynomials used to form the basis. Given the covariance matrix, C i, j, for two Zernike polynomials and associated amplitudes, α i and α j : i 9

23 Table 2.1: First 10 Zernike Circular Polynomials [33]. i m n Zn m (ρ, θ) Name piston ρ cos(θ) x tilt ρ sin(θ) y tilt (2ρ 2 1) 6ρ 2 sin(2θ) 6ρ 2 cos(2θ) 8(3ρ 3 2ρ) sin(θ) 8(3ρ 3 2ρ) cos(θ) 8ρ 3 sin(3θ) 8ρ 3 cos(3θ) 5(6ρ 4 6ρ 2 + 1) defocus y primary astigmatism x primary astigmatism y primary coma x primary coma y primary trefoil x primary trefoil x primary spherical C i, j = E[α i, α j ] (2.4) C = LL T (2.5) where L T denotes the conjugate transpose of the lower triangular matrix L. The covariance matrix generated from two Zernike polynomials Z i and Z j has been derived by Knoll [27][30]: C i, j = E[α i, α j ] = where: K Zi Z j δ Z Γ[(n i + n j 5/3)/2](D/r 0 ) 5/3 Γ[(n i n j 17/3)/2]Γ[(n i n j 17/3)2]Γ[(n i n j 23/3)/2] (2.6) K Zi Z j = Γ(14/3)[(24/5)Γ(6/5)]5/6 [Γ(11/6)] 2 2π 2 ( 1) (n i+n j 2m i )/2 (n i + 1)(n j + 1) (2.7) and: δ Z = [(m i = m j )] [parity(i, j) (m i = 0)]. (2.8) 10

24 If we generate a vector of zero mean with unit variance uncorrelated numbers, n, then we can solve for the amplitudes, α, that weight each Zernike polynomial by applying the properties of the Cholesky decomposition such that: L = C 1 2 (2.9) and: α = Ln. (2.10) Thus, given a randomly generated zero mean, unit variance vector n, the Freid seeing parameter, r 0, the diameter of the aperture, D, and the number of Zernike polynomials desired, a wavefront error phase screen can be calculated. Again, this is the method used in all simulation conducted as part of this thesis. For more information on this method please refer to [30] Deconvolution. Deconvolution is a de-blurring technique widely used in many fields including astrophotography. Basically, if an image is distorted with spatially invariant blur, e.g. the same atmospheric distortion is applied to the entire image, it can be modeled as the convolution of the measured point spread function and the true image [19]. Or, using the Convolution Theorem, the Fourier Transform of the image is equal to the Fourier Transform of the object multiplied by the Optical Transfer Function (OTF): i(x) = o(x) h(x) F {i(x)} = F {o(x)} F {h(x)}. (2.11) In this equation, is the convolution operation. Typically, the only information known is the blurred image and an imperfect point spread function this is known as blind deconvolution [19]. By using multiple frames with their respective measured point spread functions the number of solutions to the blind deconvolution can be reduced [35]. This is known as multiframe blind deconvolution and is an important technique used for image 11

25 restoration [5][35]. In 1970, Antoine Labeyrie observed that the speckles in a short exposure image contained more spatial frequency data when compared to a long exposure image [20]. The processes that use this speckle information to reconstruct an image is referred to as speckle imaging. There are two main steps to speckle imaging. First, to estimate the object and reference star intensity and second, to recover the phase that is lost from the first step [3]. Step one will be described in this section and two methods of phase recovery will be covered in the next two subsections. Speckle interferometry is a technique used to find the expected value of the modulus of the Fourier Transform of the object. If the source object happens to be two points the cosine fringe patterns can be seen (see Figure 2.2). If α s represents the angular separation of a binary pair and, λ D α s λ r 0, where λ D is the approximate smallest angular separation two points are detectable in a diffraction limited environment and λ r 0 is the smallest approximate angular separation two points can be detected through atmospheric turbulence then speckle interferometry can be useful [31], where r 0 is the Fried s seeing parameter and λ is the wavelength of light. If α s < λ, then D the angular separation is too small to resolve. If α s > λ r 0, then speckle interferometry will not improve the resolution. It is therefore assumed going forward that the binary separation angle, α s, falls within the range above, where deconvolution is helpful. First consider the irradiance incident on a detector. If the imaging system is properly focused on the object, we have the incident irradiance equal to the object irradiance as observed from geometry alone convolved with the PSF: d(x) = h(x y)o(y) (2.12) y where d(x) is a single measured short exposure image, h(y) is the PSF, and o(y) is the diffraction-limited object irradiance. If the source object is a binary, let o(y) = 12

26 o 1 δ(y) + o 2 δ(y y 1 ), where δ is the Dirac function and o 1 and o 2 are the intensities of the binary points. Taking the convolution and applying the sifting property of integrating a Dirac function yields: d(x) = o 1 h(x y)δ(y) + o 2 h(x y)δ(y y 1 ) y y = o 1 h(x) + o 2 h(x y 1 ). (2.13) Now take the Fourier transform of d(x): F {d(x)} = o 1 H(f) + o 2 H(f)e j2πy 1 f x. (2.14) Here we take the modulus squared of the result: F {d(x)} 2 = ( ) ( ) O 1 H(f) + O 2 H(f)e j2πy 1 f x O1 H(f) + O 2 H(f) e j2πy 1 f x = O 2 1 H(f) 2 + O 2 2 H(f) REAL { } (2.15) O 1 O 2 H(f) 2 e j2πy 1 f x noting that REAL { } e j2πy 1 f x = cos (2πy1 f x ) reducing Equation 2.15 to: F {d(x)} 2 = O 2 1 H(f) 2 + O 2 2 H(f) 2 + 2O 1 O 2 H(f) 2 cos(2πy 1 f x ) (2.16) divide both sides by H(f) 2, yields: F {d(x)} 2 H(f) 2 = O O O 1O 2 cos(2πy 1 f x ). (2.17) Let Q(f) = F {d(x)} 2 K, where K is the photon noise bias governed by Poisson statistics and Q(f) is the unbiased speckle interferometry estimator [13]. The signal-to-noise ratio, S NR Q, of Q(f) improves as follows: S NR N Q (f) = N S NR Q (f) (2.18) where S NR N Q is the signal-to-noise ratio of N averaged independent realizations of Q(f) [31]. Values of N, the number of short exposure images, range from a few hundred to 13

27 several thousand [31]. Assuming N > 1, it is necessary to take the expected value of both the numerator and denominator of Equation 2.17: E{ F {i(x)} 2 } E{ H(f) 2 } = O O O 1O 2 cos(2πy 1 f x ) (2.19) where i(x) is the detection plane irradiance of N images, H(f) is the OTF and y 1 is the separation of the binary stars, O 1 (f) and O 2 (f) is the image spectrum of the binary stars. Plotting the results of Equation 2.19 can reveal a cosine pattern if the angular separation of the binary pair is large enough [14][20][31]. The following is a simulated example of speckle interferometry using 200 independent images of two point sources. 105 (a) (b) P1:124,128 P2:132, (c) (d) , , Figure 2.2: Speckle interferometry simulation with 200 independent frames. (a) The binary source image, (b) the simulated average of short exposure images, (c) speckle transfer function, (d) speckle transfer function showing fringe spacing. In Figure 2.2 (a) the binary source image is shown, note the separation is 8 pixels; (b) is the simulated average of 200 short exposure images; (c) is the calculated unbiased speckle 14

28 interferometry estimator, Q(f) (log scale); finally, (d) shows the log scale of Q(f) E{ H(f) 2 }. The binary separation from the original image, y 1 can be calculated from the results. First looking at Equation 2.19, the period of the fringe pattern detected depends on cos(2πy 1 f x ). Where y 1 is a 2-tuple and denotes the location of the second binary point in the image plane and f x is the size of a pixel in the frequency plane. The peak-to-peak period in pixel count of the cosine fringes in Figure 2.2 (d) is 32: P = Period = f x = 1 N 1 f requency = 1 y 1 f x (2.20) where N = 256, the number of pixels and f x is the sample size in the frequency plane. P = Period = 256 y 1 from measurements, P=32 pixels : 32 = 256 y 1 y 1 = = 8 pixels. (2.21) Looking at the simulation parameters, 8 pixels was the binary separation used to generate the image. The actual physical interpretation of 8 pixels of separation will depend on the characteristics of the imaging system and distance to the object being viewed. Thus, by measuring the pixel separation of the binary fringe pattern an estimation can be made as to the actual binary separation in the object plane. Observing these cosine fringe patterns is a proven method of finding binary point sources. However, as mentioned before, the phase data is lost after taking the second moment of the image spectrum. This phase data needs to be recovered for image reconstruction. The next two subsections will discuss two common methods of phase retrieval. 15

29 Knox-Thompson. As stated before, to properly reconstruct an image using spectral imaging the phase of the source object needs to be recovered. The first technique commonly used is the Knox- Thompson or cross spectrum method. Dr. K. T. Knox and B. J. Thompson published a paper in the Astrophysics Journal in 1974 describing a method of recovering images from atmospherically-degraded short exposure images [18][31]. This method is now called the Knox-Thompson Technique or the cross spectrum technique. In their paper, they defined the cross spectrum, C(f, f), as: C(f, f) = I(f)I (f + f) (2.22) where I(f) = O(f)H(f) and O(f) is the object spectrum, and H(f) is the OTF [31][18]. The cross spectrum of the detected image is not directly proportional to the cross spectrum of the object as a bias term needs to be accounted for to properly estimate the phase [2][4][31]. If we assume individual pixels in the detection plane are statistically independent and the photon arrival is governed by Poisson statistics, then the unbiased cross spectrum for a single measured image, d(x), can be written as [31]: C u (f, f) = D(f)D (f + f) D ( f). (2.23) The term D ( f) is the conjugate of the image spectrum at f, and is defined by: D ( f) = d(x)e j2π f x (2.24) x which will be different from image to image and needs to be subtracted out before taking the average of the short exposure images. Each image also needs to be centered as the cross spectrum method is not shift invariant [2][4][18][31]. Typical values of f = ( f 1, f 2 ), the spatial frequency offset, are, f < r 0 /(λd), where r 0 is the Fried seeing parameter, λ is the wavelength of the light and d is the distance 16

30 from the pupil plane to the imaging plane [2][31]. Taking the average cross spectrum over multiple short exposure images yields the following equation [2][31]: E[C(f, f)] = O(f) O(f + f) e j[φ o(f) φ o (f+ f)] E[H(f)H (f + f)] (2.25) where second moment of the OTF, E[H(f)H (f+ f)], is the cross spectrum transfer function and relates the object spectrum, O(f) to the cross spectrum. The cross spectrum transfer function is real-valued, so the phase of the average cross spectrum is [2][4][31]: φ C (f, f) = φ o (f) φ o (f + f). (2.26) The object phase, φ o can be extracted from this equation. Let the offset vector in the x direction be f x and the offset vector in the y direction be f y, that is f = ( f x, f y ). The phase differences generated by these offset vectors are [2][31]: φ x ( f x, f y ) = φ o ( f x, f y ) φ o ( f x + f x, f y ) (2.27) φ o(f) f x f x (2.28) φ y ( f x, f y ) = φ o ( f x, f y ) φ o ( f x, f y + f y ) (2.29) φ o(f) f y f y (2.30) The partial derivatives form the orthogonal components of the gradient of the object phase spectrum, φ o (f). This angle data can be combined with the magnitude data retrieved from speckle interferometry methods to reconstruct the image [2][31]. Calculating φ o (f) can be accomplished by the following equation: N x 1 φ o (N x f x, N y f y ) = i=0 φ x (i f x, 0) + N y 1 j=0 φ y (0, j f y ) (2.31) where N x and N y are the number of pixels in the x and y direction in the image plane. For the simulations in this paper, f x = f y = 1, which provides a small offset constant without doing sub-pixel manipulations. Looking at Equation 2.31, each point in the reconstructed 17

31 object phase, φ o, can be obtained by taking the angle of the average cross spectrum in both x and y directions and then summing along the x and y axis to the desired phase coordinate to reconstruct [31]. Many summing paths can be taken to get to a particular phase coordinate. In a noise free environment all paths to a particular point will yield the same result. In a real-world system, each path will yield slightly different results depending on random noise effects. It is, therefore, a standard practice to calculate the object phase at a particular point by averaging the results of summing several paths to that point [2][31]. The result of implementing Equation 2.31 is an unwrapped phase 2D-matrix containing the reconstructed object phase in the Fourier domain. To get the reconstructed image, φ o needs to be wrapped and then multiplied by the Fourier transform of the intensity information obtained through, in this case, speckle interferometry. The following is a simulated example of a basic implementation of the Knox-Thompson or cross spectrum method. The reconstructed image is formed from the following equation: o(x, y) = F 1 { O e jφ o } (2.32) where O is the modulus of the average object intensities calculated using speckle interferometry and φ o is the object phase recovered by the cross spectrum method and o(x, y) is the reconstructed object. Figure 2.3 shows the sum of 50 short exposure images of a binary point source with a separation of 4 pixels on a 255 pixel square grid. Each image passed through a randomly generated phase screen with an r o value of 30 cm to simulate atmospheric blur. Poisson noise was added to the data calculated at the image plane. A focused telescope with a square aperture of 1 meter was used for this simulation. Figure 2.4 is the result of my implementation of the cross spectrum method described in this section when applied to the image data from Figure

32 Figure 2.3: Stack of 50 short exposure images, each simulated through a unique random phase screen with r o = 30 cm. Implementing a robust cross spectrum phase retrieval algorithm requires extensive fine tuning to remove as much noise as possible. Please refer to Ayers work for more information on implementation [2]. Reconstructing a higher resolution image is typically what is desired in astronomical imaging. A major difference between the multi-hypothesis method and phase reconstruction is the focus on detecting binaries versus producing higher resolution images. One other phase reconstruction method should be mentioned and that is bispectrum technique Bispectrum. The bispectrum is another effective method used to reconstruct the phase of an image. It is invariant to image shift, which is a valuable property when looking at multiple images of potential binaries [2][22]. It is defined as [31]: B(f 1, f 2 ) = D(f 1 )D(f 2 )D (f 1 + f 2 ). (2.33) Note the phase of the object spectrum is contained in the phase of the bispectrum at three points in frequency space (f 1, f 2, and f 1 + f 2 ) compared to the cross spectrum which needs two points to reconstruct the object phase. Many different techniques have been and 19

33 Figure 2.4: Result of cross spectrum method with r o = 30cm, original image (a), cross section of original image (b), reconstructed image (c), cross section of reconstructed image (d). are continued to be published on how best to calculate the phase using the bispectrum [2][21][24][25][26]. I will highlight one such method which is the unit amplitude phasor recursive reconstructor. To begin, the unbiased bispectrum for a single short exposure image is: B u (f 1, f 2, ) = D(f 1 )D(f 2 )D (f 1 + f 2 ) D(f 1 ) 2 D(f 2 ) 2 D(f 1 + f 2 ) 2 2K + 3Pσ 2 n (2.34) 20

34 where K is the bias caused by the random arrival of photons governed by Poisson statistics and Pσ 2 n represents additive noise caused by the imaging device [31]. The unbiased bispectrum is calculated for each short exposure image and then the average bispectrum is computed. The phase of the resulting mean bispectrum is: φ B (f 1, f 2 ) (2.35) which is equal to [2][22][31]: φ B (f 1, f 2 ) = φ O (f 1 ) + φ O (f 2 ) φ O (f 1 + f 2 ). (2.36) Given we have calculated the bispectrum phase, φ B (f 1, f 2 ), we need to know two other values of the object phase spectrum to then iteratively calculate all other remaining values for the object phase. A typical approach is to set: φ O (0, 0) = φ O (1, 0) = φ O ( 1, 0) = φ O (0, 1) = φ O (0, 1) = 0. (2.37) Thus: φ O ((0, 0) + (0, 0)) = φ B ((0, 0), (0, 0)) φ O ((1, 0) + (0, 0)) = φ B ((1, 0), (0, 0)) φ O (( 1, 0) + (0, 0)) = φ B (( 1, 0), (0, 0)) (2.38) φ O ((0, 0) + (0, 1)) = φ B ((0, 0), (0, 1)) φ O ((0, 0) + (0, 1)) = φ B ((0, 0), (0, 1)). Much like the cross spectrum, many different combinations of known values can be used to find a value at an unknown location. For example, if the point object phase spectrum φ O (4, 5) is desired, then: φ O (4, 5) = φ O (1, 0) + φ O (3, 5) φ B ((1, 0), (3, 5)) = φ O (1, 0) + φ O (3, 5) φ B ((1, 0), (3, 5)) = φ O (2, 3) + φ O (2, 2) φ B ((2, 3), (2, 2)) (2.39) = φ O (3, 4) + φ O (1, 1) φ B ((3, 4), (1, 1)) 21

35 and so forth. Each linear combination does not necessarily give the same results if noise is present. Thus, like the cross spectrum method, many paths are typically calculated and then averaged [21][31]. Lastly, due to a potential 2π bias when taking different paths, the calculations are typically done as unit phasors, thus, the final algorithm for reconstructing phase using the bispectrum is [21]: e jφ O(f 1 +f 2 ) = e jφ O(f 1 ) e jφ O(f 2 ) e jφ B(f 1,f 2 ). (2.40) No example of implementing the bispectrum method is provided in this paper. The reader can refer to the following references for examples and more information, [2][21][24][25][26]. Both the cross spectrum and bispectrum method have proven to be effective at reconstructing atmospherically blurred images to reveal binary pairs. However, as has been discussed before, if the object is only binary detection, then complete image reconstruction is unnecessary. By comparing the statistics of two hypothetical sources, i.e., a single and binary, better results for binary detection can be had then by visually inspecting reconstructed images. The next section will discuss another method of image reconstruction useful in binary detection, imaging by correlation Imaging by Correlation. This method of image recovery utilizes a process developed to recover meaningful information from random data and applies it to the problem of image recovery from second and third order correlation. This technique is unique in the fact that it simultaneously recovers the Fourier magnitude and phase as compared with speckle imaging in which amplitude and phase are recovered separately [34]. In general, correlation is an N th order process so thus for the purposes of binary detection N = 2, referred to as autocorrelation, will be used. The general strategy is to take the autocorrelation of the measured image data and the autocorrelation of an estimated image and then iterate through a log likelihood cost function to reduce the estimated image 22

36 to the most likely true image [34]. Let R(y) be the autocorrelation function of the measured image data, d(x), where R(y) = N x=1 d(x)d(x + y), the summation representing the sum over all pixels in d. Let R λ (y) be the autocorrelation of the estimated image, λ(x). Any cost function can be used, however in this work the I-divergence D(R λ, R) function used by Schulz and Snyder is adopted [34]: D(R, R λ ) = [R λ (y) R(y)] + R(y)ln R(y) R λ (y). (2.41) y By minimizing the I-divergence cost function we can solve for a λ(x) that is the most likely true image. Taking the derivative of D(R λ, R) with respect to a single point in the estimated image and then setting that equal to zero yields the necessary optimality condition: y D(R λ, R) λ(x o ) R(y) = [λ(x o + y) + λ(x o y)] R λ (y) [λ(x o + y) + λ(x o y)] = 0. (2.42) y y Schulz and Snyder then setup an algorithm that iteratively solves for an updated λ(x) based on a previous one [34]. For k iterations: λ k+1 (x) = λ k (x) 1 R 1/2 o y [R(y) + R( y)] λ k (x + y) 2R λk (y) where R o is the autocorrelation evaluated at y = 0 of the measured data. (2.43) Using the convolution and correlation theorems, multiplication can be used in the Fourier domain. For k iterations, an estimated reconstructed image, λ k (x) is found [34]. The autocorrelation is similar to the bispectrum in that image tilt does not need to be removed before processing. Figure 2.5 shows a simulated result after 100 iterations. Figure 2.6 shows the simulated result compared to the original image. The same simulation parameters used in the cross spectrum example in Section were used. The software implementation of this method is discussed more in Chapter 3, as it is used as a comparison to the multihypothesis technique proposed in this paper. 23

37 Figure 2.5: Results with different numbers of iterations: (a) 1, (b) 5, (c) 10, (d) 15, (e) 80, and (f) 100 iterations. 24

38 Figure 2.6: Results of the correlation technique: (a) original image, (b) original image cross section, (c) reconstructed image, and (d) reconstructed image cross section. 25

39 2.1.4 Binary detection by post-process image reconstruction summary. Image reconstruction via cross correlation, bispectrum and autocorrelation are all proven techniques that can enhance image spatial resolution and thus greatly impact binary detection. This section provided information for a basic understanding of how these methods can enhance images. This thesis will compare one of the methods listed above, image reconstruction by autocorrelation, to a statistical approach that does not provide a reconstructed image but focuses on the question Was this image created by a single or binary point source(s)? The next section will provide an overview of how a multi-hypothesis technique can be setup to answer this question. 2.2 Multi-Hypothesis Detection By assuming a source signal is either a zero, one, et cetera and calculating the expected value of each hypothesis, the most likely original signal can be determined. This very simple yet powerful logic is foundational to digital communication and other areas of electo optics such as Light Detection And Ranging (LiDAR) [29]. The same logic can be used in detecting binaries that have been distorted by the atmosphere. If we can measure how the atmosphere distorts a point source and simulate how that same atmosphere would distort various combinations of binary sources we can predict what an image should look like if it was the result of a binary or single point source, as perceived from the pupil plane of an imaging system. We can then compare what the image should look like given a single or binary point source to what was actually imaged through that same atmosphere. In this way a binary can be detected based on what we expect to see given two different scenarios, or hypothesis. Figure 2.7 shows the basic concept behind this idea. One of the most important assumptions made for using this technique is that a valid point spread function (PSF) can be measured that correlates to the image being analyzed for a potential binary. Another 26

40 important assumption, and an area for future research, is to assume more than just two scenarios to detect larger clusters then just a binary. Hypothesis for three, four, and so forth, as well as different shapes can all be used as comparisons for the image that was actually detected. The derivation of and detailed analysis of a proposed multi-hypothesis algorithm to detect binaries is given in Chapter 3. Results of simulated and measured data testing will be given in Chapter 4. Multi-Hypothesis for Binary Detection Overview 1 Measure Data 2 Calculate Log-likelihoods 3 Binary Decision Measured Image Hypothesis Zero Determine if image intensity is below minimum threshold for detection. Based on detection criteria determine which hypothesis is most likely Measured PSF Hypothesis One Calculate most likely intensity and position of single point source to produce measured image Hypothesis Two Calculate most likely intensity and position of two point sources to produce measured image Figure 2.7: Overview of binary hypothesis method. Chapter 3 will work through derivations of each step. 27

41 2.3 Chapter Summary As binary objects become closer and dimmer, detection becomes impossible for a standard imaging system. Using adaptive optics, looking for patterns, extracting higher resolution from speckle images, and estimating the image using correlation are current methods for binary detection. This thesis will explore a method that has not been used in astronomy for binary detection, referred to herein as the multi-hypothesis technique. Basically, by measuring the PSF we can calculate what a single point source and a binary source would look like, then compare that to what is detected. The result of this comparison can be used to make a statistical determination on whether the image detected was the result of a single point source or a binary point source. 28

Lecture 9: Speckle Interferometry. Full-Aperture Interferometry. Labeyrie Technique. Knox-Thompson Technique. Bispectrum Technique

Lecture 9: Speckle Interferometry. Full-Aperture Interferometry. Labeyrie Technique. Knox-Thompson Technique. Bispectrum Technique Lecture 9: Speckle Interferometry Outline 1 Full-Aperture Interferometry 2 Labeyrie Technique 3 Knox-Thompson Technique 4 Bispectrum Technique 5 Differential Speckle Imaging 6 Phase-Diverse Speckle Imaging

More information

AIR FORCE INSTITUTE OF TECHNOLOGY

AIR FORCE INSTITUTE OF TECHNOLOGY MEASURING ANGULAR RATE OF CELESTIAL OBJECTS USING THE SPACE SURVEILLANCE TELESCOPE THESIS Anthony J. Sligar, Captain, USAF AFIT-ENG-MS-15-M-019 DEPARTMENT OF THE AIR FORCE AIR UNIVERSITY AIR FORCE INSTITUTE

More information

1. Abstract. 2. Introduction/Problem Statement

1. Abstract. 2. Introduction/Problem Statement Advances in polarimetric deconvolution Capt. Kurtis G. Engelson Air Force Institute of Technology, Student Dr. Stephen C. Cain Air Force Institute of Technology, Professor 1. Abstract One of the realities

More information

Wavefront Sensing using Polarization Shearing Interferometer. A report on the work done for my Ph.D. J.P.Lancelot

Wavefront Sensing using Polarization Shearing Interferometer. A report on the work done for my Ph.D. J.P.Lancelot Wavefront Sensing using Polarization Shearing Interferometer A report on the work done for my Ph.D J.P.Lancelot CONTENTS 1. Introduction 2. Imaging Through Atmospheric turbulence 2.1 The statistics of

More information

A Comparison Between a Non-Linear and a Linear Gaussian Statistical Detector for Detecting Dim Satellites

A Comparison Between a Non-Linear and a Linear Gaussian Statistical Detector for Detecting Dim Satellites A Comparison Between a on-linear and a Linear Gaussian Statistical Detector for Detecting Dim Satellites Lt Stephen Maksim United States Air Force Maj J. Chris Zingarelli United States Air Force Dr. Stephen

More information

Response of DIMM turbulence sensor

Response of DIMM turbulence sensor Response of DIMM turbulence sensor A. Tokovinin Version 1. December 20, 2006 [tdimm/doc/dimmsensor.tex] 1 Introduction Differential Image Motion Monitor (DIMM) is an instrument destined to measure optical

More information

Contents Preface iii 1 Origins and Manifestations of Speckle 2 Random Phasor Sums 3 First-Order Statistical Properties

Contents Preface iii 1 Origins and Manifestations of Speckle 2 Random Phasor Sums 3 First-Order Statistical Properties Contents Preface iii 1 Origins and Manifestations of Speckle 1 1.1 General Background............................. 1 1.2 Intuitive Explanation of the Cause of Speckle................ 2 1.3 Some Mathematical

More information

AOL Spring Wavefront Sensing. Figure 1: Principle of operation of the Shack-Hartmann wavefront sensor

AOL Spring Wavefront Sensing. Figure 1: Principle of operation of the Shack-Hartmann wavefront sensor AOL Spring Wavefront Sensing The Shack Hartmann Wavefront Sensor system provides accurate, high-speed measurements of the wavefront shape and intensity distribution of beams by analyzing the location and

More information

AIR FORCE INSTITUTE OF TECHNOLOGY

AIR FORCE INSTITUTE OF TECHNOLOGY ENHANCING GROUND BASED TELESCOPE PERFORMANCE WITH IMAGE PROCESSING DISSERTATION John C. Zingarelli, Major, USAF AFIT-ENG-DS-3-D-04 DEPARTMENT OF THE AIR FORCE AIR UNIVERSITY AIR FORCE INSTITUTE OF TECHNOLOGY

More information

Imaging Geo-synchronous Satellites with the AEOS Telescope

Imaging Geo-synchronous Satellites with the AEOS Telescope Imaging Geo-synchronous Satellites with the AEOS Telescope Douglas A. Hope 1, Stuart M. Jefferies 1,2 and Cindy Giebink 1 1 Institute for Astronomy, University of Hawaii, Advanced Technology Research Center

More information

Leveraging Machine Learning for High-Resolution Restoration of Satellite Imagery

Leveraging Machine Learning for High-Resolution Restoration of Satellite Imagery Leveraging Machine Learning for High-Resolution Restoration of Satellite Imagery Daniel L. Pimentel-Alarcón, Ashish Tiwari Georgia State University, Atlanta, GA Douglas A. Hope Hope Scientific Renaissance

More information

Joint R&D and Ops: a Working Paradigm for SSA

Joint R&D and Ops: a Working Paradigm for SSA Joint R&D and Ops: a Working Paradigm for SSA 23 July 2017 Stacie Williams Program Officer Air Force Office of Scientific Research Integrity «Service «Excellence Air Force Research Laboratory 1 2 Joint

More information

Using a Membrane DM to Generate Zernike Modes

Using a Membrane DM to Generate Zernike Modes Using a Membrane DM to Generate Zernike Modes Author: Justin D. Mansell, Ph.D. Active Optical Systems, LLC Revision: 12/23/08 Membrane DMs have been used quite extensively to impose a known phase onto

More information

A Comparison of Multi-Frame Blind Deconvolution and Speckle Imaging Energy Spectrum Signal-to-Noise Ratios-- Journal Article (Preprint)

A Comparison of Multi-Frame Blind Deconvolution and Speckle Imaging Energy Spectrum Signal-to-Noise Ratios-- Journal Article (Preprint) AFRL-RD-PS- TP-2008-1005 AFRL-RD-PS- TP-2008-1005 A Comparison of Multi-Frame Blind Deconvolution and Speckle Imaging Energy Spectrum Signal-to-Noise Ratios-- Journal Article (Preprint) Charles L. Matson

More information

International Journal of Scientific & Engineering Research, Volume 4, Issue 7, July ISSN
