More Progress on my 20” Telescope, John Hayes

Description

It’s been a while since my last update on the PW20 project and a lot has happened since then.

Starting where I left off in my last report, Gaston Baudat at IFI helped me out by rebuilding my new wide-field ONAG unit using the newly redesigned parts I showed in my last post. It's important to clarify that even though it looks similar from the outside, this is a very different animal from the normal ONAG units that IFI sells: it is designed to operate with a faster beam and a larger sensor over a larger field of view with no vignetting. The assembled unit is shown on the bench in rev A. Gaston first corrected a couple of minor errors he discovered in my design and then assembled all the parts. He was very careful to set the beam splitter angle to minimize the chance of a tilted focal plane. On the day that I got it back, I quickly re-mounted it on the scope so that I could roll it out under clear skies that evening. Once it was set up, I started the procedure to co-focus the guide and imaging cameras, and that's when I started to see some unusual effects. First, when I put a B-mask on the scope, I could immediately see obvious evidence of astigmatism. Worse, when I took my first image, I discovered astigmatic stars everywhere. Oh, oh… we have a problem, Houston! I defocused the main imaging camera and took rev B so that I could share my findings with Gaston.

We both scratched our heads. On my side, I had changed the scope baffles, and that might have knocked the secondary a little out of alignment. I wondered if I had somehow moved the field lenses when I replaced the primary baffle, so I pulled the baffle to examine the assembly and quickly saw that there was no way I could have moved anything. Gaston was equally baffled. He told me that he had been super careful to align everything and that the ONAG should be working perfectly. Over the next two weeks I had the scope out numerous times to try to figure out what in the world was going on. The aberration appeared no matter where I pointed the scope, it appeared both on and off-axis, it varied with position, and it didn't change as the scope cooled down. I resisted the urge to adjust anything, and several times I had to tell myself to leave it alone until I found the problem! I really wanted to tweak the secondary, but I knew that was a bad idea, and it was!

After about two weeks, Gaston and I were discussing this problem when he casually mentioned that he remembered making one "small improvement" to the ONAG that shouldn't have any effect on its performance. Whoa…say what? Exactly what was the change? In order to more positively set the beam splitter angle, Gaston had added some spring plungers to push against the optical carrier. He described the design and the care he took to ensure that the plungers were positioned so that they shouldn't distort anything mechanically. Drawing on my 25+ years of experience designing interferometer equipment, I quickly pointed out that at the level of 10-500 nm, everything is made of rubber. You have to be extremely careful when you push on anything to avoid distorting it on the order of tens of nanometers. We concluded that that innocent "small" change had to be the source of the problem, and Gaston agreed to rebuild the unit using his standard mechanical configuration. So, the ONAG made another cross-country trip via FedEx for more work.

When I got the ONAG back, things looked a lot better, but the whole episode re-emphasized the importance of optical quality in the ONAG to ensure good imaging performance when used with a faster, wide-field system. Rev C shows a full-field defocused image demonstrating that the system was operating with MUCH less aberration. After imaging M109 for 3-4 nights, I pulled the ONAG off the system so that I could run it down to Arizona, where I have access to a WYKO 6000 interferometer at the university. With the F/6.8 beam in this scope, the data shows that the wavefront errors over the beam for the whole field could be a little better for operation under 0.5" or better sky conditions. Gaston and I agreed that the problem might still be due to a residual optics mounting issue, so we are working on a fix and a way to verify the performance. Remember that we are using a lot more of the beam splitter in this wide-field version of the ONAG compared to the standard IFI ONAG that I have on my C14 Edge system. I may share some of the data once we get the system tuned up. I'm sold on the ONAG as an excellent way to guide and maintain perfect focus, so we still have one more problem to solve before this new "wide-field/fast optics" version is ready to go.

Vignetting Solved

The good news is that the changes that Planewave made to the baffling system, along with those that I made to the new wide-field ONAG, virtually eliminated vignetting and worked perfectly to fix the flat calibration problem. Rev D shows a raw image next to a calibrated image, and it's as good as anything that I've ever produced with the C14. The crazy thing is that I can't say for sure how much vignetting is too much, but in this case, flat calibration was consistently failing with around 50% of the exit pupil vignetted. There may be some subtleties in the way that PI normalizes the flat frames that I may be missing, and those may play into the limits of how well vignetting gets corrected. I'd like to understand this better, but at this point, I can't provide a clear model to predict what will work and what won't. Just remember that if calibration doesn't work, the two most important things to check are 1) that there isn't any stray light and 2) that the level of vignetting is as "small" as possible. My sense is that trouble probably starts when the level of vignetting gets above around 30%. Anything at or above 50% has a good chance of causing problems.

Camera Considerations

One of the things that’s been a concern ever since I configured the scope was the decision to use a FLI-ML16803 camera. With 9-micron pixels, the signal will be improved by 248% over my C14 system; but, at F/6.8 those 9-micron pixels will not take advantage of the increase in aperture to reveal additional image detail. At the time, I picked the FLI camera because of the size of the sensor more than anything else, but now that new, large format CMOS cameras have become available, I went back to revisit the issue.

A key goal of this whole project is to take advantage of the 0.5 arc-sec seeing conditions that I understand to be the median sky condition at Obstech. It can be better or worse than that, but the design goal is to make sure that the scope can at least take advantage of 0.5" conditions when they occur. The way to gain some understanding of imaging quality is to first understand the MTF (modulation transfer function). (You can learn more about MTF here: https://www.youtube.com/watch?v=_d7mMNlZxRQ.) Every optical system acts as a low-pass filter, and the MTF shows the maximum spatial frequency that the system can pass. The problem is that the MTF of the optical system shows only what the optics can do. In order to understand the impact of seeing and the sensor itself, we have to figure out how the MTF is affected by both of these factors. This site really isn't a good place for me to go through the math or the whole calculation, but starting with Kolmogorov's and Fried's turbulence theory, I was able to produce a model of the MTF due to atmospheric turbulence. That model is useful for initially determining the range for optimum sampling relative to three different seeing conditions, shown in Rev E. From that, I did a full MTF analysis of the telescope under those seeing conditions using two sensors (KAF-16803 and Sony IMX455), shown in Rev F, and compared those results to what I get from my C14.
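To give a flavor of how this kind of model goes together, here is a bare-bones sketch (my own illustrative code, not the actual model) that multiplies three MTF terms: the long-exposure Kolmogorov atmospheric MTF, the diffraction-limited MTF of a circular aperture (simplified here to an unobstructed pupil, which the real analysis doesn't assume), and the pixel-aperture MTF. The function names and the 0.98·λ/r0 seeing relation are my choices for the sketch:

```python
import numpy as np

ARCSEC = np.pi / (180 * 3600)          # radians per arcsecond

def mtf_atmosphere(nu, fwhm_arcsec, wavelength=550e-9):
    """Long-exposure Kolmogorov atmospheric MTF.
    nu is spatial frequency in cycles/arcsec on the sky; the Fried
    parameter r0 is estimated from seeing FWHM ~ 0.98*lambda/r0."""
    r0 = 0.98 * wavelength / (fwhm_arcsec * ARCSEC)
    nu_rad = nu / ARCSEC               # cycles per radian
    return np.exp(-3.44 * (wavelength * nu_rad / r0) ** (5.0 / 3.0))

def mtf_diffraction(nu, aperture_m, wavelength=550e-9):
    """Diffraction-limited MTF of an unobstructed circular aperture
    (a simplification -- the real scope has a central obstruction)."""
    nu_c = aperture_m / wavelength     # cutoff frequency, cycles/radian
    x = np.clip((nu / ARCSEC) / nu_c, 0.0, 1.0)
    return (2.0 / np.pi) * (np.arccos(x) - x * np.sqrt(1.0 - x * x))

def mtf_pixel(nu, pixel_um, focal_length_mm):
    """Pixel-aperture MTF: |sinc| of the pixel's footprint on the sky."""
    scale = 206.265 * pixel_um / focal_length_mm   # arcsec per pixel
    return np.abs(np.sinc(nu * scale))             # np.sinc(x) = sin(pi*x)/(pi*x)

# The system MTF is the product of the three terms, e.g. for a 0.508 m
# (20") f/6.8 scope with 9-micron pixels under 0.5" seeing:
nu = np.linspace(0.0, 2.0, 201)        # cycles/arcsec
system = (mtf_atmosphere(nu, 0.5)
          * mtf_diffraction(nu, 0.508)
          * mtf_pixel(nu, 9.0, 508.0 * 6.8))
```

Swapping in different pixel sizes and seeing values reproduces the kind of curves shown in Rev E; the full analysis also has to handle the obstructed-aperture MTF and sampling/aliasing effects that this sketch leaves out.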

Although the results are somewhat intuitive, they are interesting because they rise above hand-waving by providing quantified results. Here are the five big takeaways:

1) First off, when seeing improves, you get better results with any given camera—maybe not by a lot, but image quality will improve. We knew that…so that’s no biggie.

2) Second, we can see that under all seeing conditions, using a camera with 9-micron pixels will do very little to produce a sharper image with the 20” Planewave relative to the 14” Celestron. In fact, under 0.5” conditions, the 20” with 9-micron pixels has a lower peak frequency response, suffers from mild aliasing, and operates with a Strehl of only 48%, which is worse than the C14 at 58% Strehl! It costs a lot of money to buy all this equipment, ship it, and import it to Chile, so the lesson is that if you want to use a 16803 sensor and you think that the bigger scope will produce sharper images, don't bother with a 20” (or 17”)! The 14” Celestron Edge will work as well, and even a little better! Since both cameras operate at nearly the same pixel scale, that conclusion might not be completely obvious. Of course, the 20” will produce images with much less exposure time than the 14”, and that's a valid improvement, but there's no reason to ship it to a place with 0.5” conditions to get that benefit. This conclusion is borne out in my recent M109 image taken with the 20” using the FLI-ML16803 camera. That image is no sharper than what the C14 would produce under the same conditions, as shown in Rev K. Remember, this analysis applies specifically to the PW20. A similar analysis for larger scopes like the 24” or 39” Planewave scopes will depend on their optical parameters and is very likely to come out differently.



3) Third, moving to the more sensitive CMOS IMX455 sensor, which can be software binned to a 2x2, 7.52-micron super pixel, does almost nothing for image sharpness compared to a 9-micron pixel, but it does achieve a 245% signal boost over the C14 signal, virtually the same as the FLI camera. This is a very viable way to operate for 1”-2” seeing conditions, for very faint objects, or for large-field nebulae lacking intricate detail.

4) Fourth, the IMX455 with no binning produces 61% of the C14 signal but it nearly triples the peak frequency response under median 0.5” conditions compared to the C14 under median 2.0” conditions. Exposures will have to go up by about 60% and it may be necessary to crop the data for bandwidth reasons, but this looks like an excellent option to maximize detail in small galaxies. Under these conditions, images from the 20” should be noticeably more detailed than what can ever be achieved with the 14” system.

5) Finally, the model shows that the improvement in image sharpness won’t be very significant with smaller pixels until the seeing reaches about 1” or better. This is an important takeaway and it’s one that I’ve seen argued endlessly among imagers on other forums. Simply using smaller pixels under poor conditions does nothing more than decrease signal, which results in low SNR in the final result. This analysis clearly shows that using small pixels only becomes a significant benefit when combined with good seeing!

I took a quick look at finding the optimum sensor, and for this scope it would be one with 4.8-micron pixels. The signal would be the same as it is for the 14” scope, and the impact on MTF relative to the 3.76-micron pixel case is minor. So, there you have it: this study actually converged to a single “best” solution. Unfortunately, no one makes a sensor with that perfect dimension!
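If you want to check the sampling numbers yourself, the angular pixel scale falls out of the standard plate-scale formula. Here is a quick sketch (the PW20 focal length of 508 mm × 6.8 is my back-of-envelope number):

```python
def pixel_scale(pixel_um, focal_length_mm):
    """Angular pixel scale in arcsec/pixel: 206.265 * p[um] / f[mm]."""
    return 206.265 * pixel_um / focal_length_mm

# PW20: 508 mm (20") aperture at f/6.8 -> ~3454 mm focal length (assumed)
f_pw20 = 508.0 * 6.8
for p in (9.0, 7.52, 4.8, 3.76):
    print(f"{p:5.2f} um pixel -> {pixel_scale(p, f_pw20):.3f} arcsec/px")
```

With these assumptions, 9-micron pixels land near 0.54"/px and 3.76-micron pixels near 0.22"/px, which is where the sampling comparisons above come from.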

Star Testing

For the last year or so, Gaston and I have been working on a new method for analyzing star test data using AI. To be clear, Gaston has done most of the heavy lifting to write the code and implement this whole idea, which at its core was his idea in the first place. Last summer we wrote a paper together that we presented at SPIE, and we finally had a chance to try this technique on the 20” under the sky. Rev G shows a schematic diagram of how the system works. The idea is to generate a random set of Zernike terms that are used to generate a large set of wavefront data. Each wavefront is then apodized, defocused, and “diffracted” using the far-field Fraunhofer diffraction equation to produce a set of defocused star-test images. The data has to compensate for coherence and include information about seeing, obscuration ratio, and decenter (as a part of the apodization process), so there's a lot that goes into computing the training data, which can contain up to 600,000 256x256 images. Once trained, the neural net (NN) does an amazing job of matching results. Rev H shows a 20-second star-test image that I took with a broadband red filter after visually aligning the secondary. Next to the actual star image is the estimated solution found by the NN. Rev I shows the wavefront computed from the estimated Zernike polynomial terms compared to the PhaseCam wavefront data that I took of the telescope when I took delivery. As you can see, the agreement is quite good; surprisingly good, actually! This data was taken on-axis with the ONAG, and it shows with a pretty high degree of confidence that the telescope is operating as well as it did when it rolled out of the factory. The beauty of this method is that it doesn't require any special equipment beyond what you would normally use for imaging. On top of that, we've been able to demonstrate that it has sensitivity on the order of 1/100 wave. With the appropriate training, the NN can identify minor high-order zones and trefoil errors. That sensitivity is far beyond anything that anyone can do “by eye” (as described in Suiter's book). We have been able to show that when it is carefully implemented, the accuracy of this method can approach the accuracy of digital interferometry.
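To give a feel for the forward model at the heart of this, here is a toy version (mine, not Gaston's code; the Zernike normalization, term set, and helper name are purely illustrative, and it skips the seeing, coherence, and decenter handling described above). It builds a wavefront from a couple of Zernike-style terms and "diffracts" it with an FFT to make a synthetic defocused star-test image:

```python
import numpy as np

def defocused_star_image(aberrations, defocus_waves=8.0,
                         n=256, obstruction=0.3):
    """Synthesize a defocused star-test image via Fraunhofer diffraction.

    aberrations: dict of aberration amplitudes in waves, e.g.
        {"astig": 0.1, "coma": 0.05} (names/normalization illustrative).
    """
    y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    pupil = (r <= 1.0) & (r >= obstruction)   # annular aperture mask

    # Wavefront in waves: a large defocus term plus low-order aberrations
    w = defocus_waves * (2 * r**2 - 1)
    w += aberrations.get("astig", 0.0) * (r**2 * np.cos(2 * theta))
    w += aberrations.get("coma", 0.0) * ((3 * r**3 - 2 * r) * np.cos(theta))

    # Far-field (Fraunhofer) diffraction: |FFT of the complex pupil|^2,
    # zero-padded 2x for finer sampling of the image plane
    field = pupil * np.exp(2j * np.pi * w)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field, s=(2 * n, 2 * n)))) ** 2
    return psf / psf.max()

img = defocused_star_image({"astig": 0.25})   # an astigmatic defocused "donut"
```

Generating many of these images from randomized Zernike terms is what produces the training set for the NN; the inverse problem, going from a real star-test image back to the Zernike terms, is what the trained network solves.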

It may be taking a really, really long time to get this scope configured and shipped off to the southern hemisphere, but in the meantime, I’m having a LOT of fun with it! For me getting the engineering right is half the fun. I know that a lot of folks will glaze over with all this technical stuff, but I post it for those of you who have an interest in what goes into an optical engineering project. So I hope that at least a few of you will enjoy it.

Finally let me mention that for anyone interested in this stuff, I’ll be giving an open Zoom seminar to the optical design class at the Wyant College of Optical Sciences on May 3rd (2021). If you want to check it out, you can find out more information here: https://www.optics.arizona.edu/news-events/events/opti617-public-talk-john-hayes-optics-adventures-during-pandemic-engineering-remo

John

P.S. I forgot to include the signal strength comparison calculation that shows how the PW20 compares to the C14. If you are interested in how it works out, you can find it in Rev J.
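The gist of that comparison is that the per-pixel signal rate scales with the clear collecting area times the solid angle each pixel covers on the sky. Here is a rough sketch of that idea (the obstruction fractions and focal lengths below are placeholder guesses, not the numbers used in Rev J):

```python
def relative_signal(aperture_mm, obstruction_mm, pixel_um, focal_length_mm):
    """Per-pixel signal rate, up to a constant factor: clear collecting
    area times the solid angle a pixel subtends on the sky. The mixed
    units cancel when comparing two systems as a ratio."""
    clear_area = aperture_mm ** 2 - obstruction_mm ** 2
    pixel_solid_angle = (pixel_um / focal_length_mm) ** 2
    return clear_area * pixel_solid_angle

# Placeholder geometry -- NOT the actual values behind the figures above:
pw20 = relative_signal(508.0, 0.47 * 508.0, 9.0, 508.0 * 6.8)
c14 = relative_signal(356.0, 0.32 * 356.0, 9.0, 356.0 * 11.0)
print(f"PW20/C14 per-pixel signal ratio ~ {pw20 / c14:.2f}")
```

A careful comparison also has to fold in optical transmission and sensor QE, which is why placeholder numbers like these won't exactly reproduce the figure in Rev J.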

I've had some discussion with folks behind the scenes about how much of an improvement the 20" might provide in image sharpness. So I did this comparison to demonstrate how seeing limits the amount of detail that can be achieved with both my C14 and the PW20. Hopefully when I get the 20" running under better seeing conditions, I'll be able to demonstrate the improvement in image detail.

For those of you who would like to view the presentation that I did for the college, you can find it here:

https://youtu.be/te4UVYi6n44

The talk doesn't start until about the 3:35 mark so you can skip past everyone signing in. Hopefully they will leave it posted for a while.

Revisions

Original, B, C, D, E, F, G, H (final), I, J, K

B

Description: Defocused image with the wide-field ONAG showing a problem with astigmatic errors varying over the whole field.

C

Description: Defocused star field after fixing the ONAG. This shows a significant improvement! If you closely examine the defocused images at the outer part of the field, you can see vignetting due to internal apertures that limit off-axis rays through the field lenses. So, in that case, the out-of-round images are not an indication of astigmatic errors; it's just vignetting.

D

Description: Before and after image calibration with the new wide-field ONAG and telescope baffles. This data is shown with a very aggressive stretch to emphasize signal fall off and uniformity. If you look with a really sharp eye, you might be able to spot a slight vertical gradient with the lightest side toward the bottom. That would be due to a slight sky gradient from the town of Redmond that sits to the north of my location. When an image is properly calibrated, you should be able to accurately render small variations due to real effects like this one across the image.

E

Description: MTF analysis showing PW20 response with 9, 7.52 and 3.76 micron pixels under seeing conditions of 2", 1", and 0.5".

F

Description: MTF performance comparison between the PW20 with the QHY600 and the C14 with the FLI-ML16803. Note: there is a typo in this image; the sensor should read KAF-16803 (not KAI-16803).

G

Description: Neural net training schematic for AI spot testing

H

Description: Actual star test images (on left) next to the best fit synthetic star test images (on right) estimated by the AI system.

I

Description: A comparison of AI estimated system wavefront derived from star test data to wavefront data measured by a PhaseCam interferometer

J

Description: This image shows how to compare the relative signals between two imaging systems.

K

Description: An apples-to-apples comparison of a C14 image with a PW20 image of M109. The C14 image (on the left) is a stack of 10x1200s subs and this is a mostly unprocessed, stacked Lum image. Stray light is flooding the left side of the image so I've zoomed way in to compare only galaxy details. The PW20 image is from stacking 24x600s Lum subs. So, there is more signal in the PW20 data and with a larger aperture, more faint stars will always be recorded. Both images were taken under 1.7"-2.0" seeing conditions, which is the key limit that determines how much detail each system can produce. Under these conditions, using the same camera, the PW20 gathers signal faster than the C14, but it can't produce any more image detail. Better seeing conditions are required to be able to use the aperture advantage of the 20" to bring out more image detail.


In these public groups

Cloudy Nights