The need for REAL signal - Thoughts on true image quality · [Deep Sky] Processing techniques · Jon Rista

jrista 8.59
Dan Kearl:
Jon Rista:
To be clear here...I am not saying not to use AI processing tools. I am also trying to be careful about the target audience here, by explicitly addressing people who are interested in creating high quality images, as I know there is a large (and ever growing, probably majority) segment of imagers now who are NOT in the hobby to produce high quality images, but more for...well, other reasons.

I really have no idea what you are talking about now.... Who exactly are you referring to?
Is there something nefarious going on here that people are unaware of?
I browse and post on this site, and it seems like most people are trying to image, learn, and produce the best images they can.
There are also real pros here who astonish me with their craft.
I guess I miss the people who you claim are up to something else?
This is getting weird .....

I don't think everyone here has an interest in really improving their skill. I think some portion of ABin does, but there are a LOT of images here that just seem to be quickly processed snaps of the night sky, and that's all they ever are, and there seem to be quite a number of imagers who post that kind of thing. Now, I'm not really talking about the main page...it's mostly when I search for an object and browse the results that I find that. It is also not just here on ABin; I've noticed this all over the net, some places a lot more heavily than others, where astrophotography seems in large part to have become another "snap a photo and share it and get some likes" thing.

I wouldn't call it nefarious or anything like that... Just a large shift in the hobby. One that admittedly disappoints me, and one I believe, based on a lot of threads I've read, is in large part fueled by AI...and an OVEREXTENDED USE of AI. I'm a consumer of AP as much as a producer, and I like quality images of deep space that are well rendered, detailed, interesting, intriguing, showing me incredible views. There is SO MUCH out there now that...isn't really much of any of that: excessively noisy, oddly contrasting colors, neither intriguing nor particularly interesting. It's something I've noticed.

This thread is for those people who DO have a deeper interest in astrophotography than snap, share, get likes... If that's weird to you, ok. I am not the kind of person to give out rewards to everyone, or to say everyone is doing a sublime job all the time no matter the results, etc. I think there are levels and gradations, good and bad, nice and not, etc. I don't believe everyone is in this hobby for the craft itself, either. For those who ARE...maybe this thread will give them some insight.
rockstarbill 11.02
Jon Rista:
There is the Mach 2, which is an evolution of the Mach 1. It's even more pricey. Not many people are going to be using high-end mounts like that. The common equipment, maybe, is more accessible, but I would dispute that it's all that much better.

The Mach 2 sells faster than they can make them. The waiting list for one is about 2 years long, give or take a few months. They exploded in popularity mostly because of the Absolute Encoders they have, the significant move forward for Sky Modeling and unguided imaging, and the fact that you can run up to a 12.5" scope on one of them, with the right care and balancing. So yes the price is high on them, but unless you want to move to a 14"+ scope, it will last you forever.

The performance gain is mostly in the encoders, which as you know will eliminate any backlash in DEC, remove any periodic error in RA, and provide minor wind resistance (very minor in my experience). All of that does lend itself to image quality, especially when seeing is fantastic -- as any error at all (especially at today's imaging scales) will show up rather easily in images. While BXT can correct for some of this, the old adage that the data should be as good as possible before processing still applies.

None of that is news though; encoders have been on mounts for a long time. They just were not as popular until the rise of the Mach 2 and more widespread unguided imaging. There are 10Micron mounts as well, which do the same. I would argue that, at least in the States, the Mach 2 was more responsible for the boom due to its lower price (without the added cost of importing from Italy) and AP's renowned customer support.

The other big change was the advent of the Strain Wave geared mounts (improperly referred to as Harmonic Drive mounts) which greatly increase portability and can shed counterweights entirely -- at the tradeoff of extremely high periodic error. Some have RA encoders on them to reduce the PE, but it never goes to zero and still requires rather short guide pulses to tame it. 

The Mach 1 is still awesome. I have one myself and would never sell it.
jrista 8.59
Dan Kearl:
Jon Rista:
The only way to truly improve a digital signal is with MORE SIGNAL. A giant aperture and a giant sensor can certainly do that. They are also most assuredly NOT accessible to the majority of imagers. But the topic I tried to start here is more concerned with astrophotographers who have a distinct interest in creating high quality images...vs. the average imager these days just looking to "snap" some "space photos" (which seems to have pervaded places like Facebook).

I am not a Facebook person, but I have seen some images posted, and I have to ask WHY you have a problem with it?
People are in hobbies for all kinds of reasons. I don't think the majority here are into this hobby for "other reasons," as you stated. If someone wants to take images and post them on Facebook or Instagram, is that a big problem for you?
I still have no idea who you are referring to with your cryptic posts.
You seem to just be putting down people who you feel apparently should not be in this hobby because they are not "pure" enough for you?

You seem to be entirely missing the point. It's not about where people post... I'm not saying certain people shouldn't be in the hobby. I said this thread was really intended for people who have a dedicated interest in producing the best quality images, and who don't think AI is really going to get them there on its own.

I do NOT agree with you that everyone in the hobby IS in it for that purpose (creating high quality images). Not at all. I believe a growing majority are in it for the quick snaps and quick likes. Ok, fine, but imagers with those goals weren't why I posted this thread. If you just want quick processes, quick shares, and quick likes, I have no problem with that. BUT you weren't the target audience of the thread I started. If you don't like the thread, or how I post, then you have no obligation to stay here and misinterpret what I'm saying.
SemiPro 7.67
Jon Rista:
FWIW I am not saying AI tools shouldn't be used... Just that they should be used effectively, and not overused, and in particular not to become a crutch or a replacement for real signal.


Yes, I agree with this.
Jon Rista:
Clean noise characteristic (not necessarily the absence of, which IMO is terrible, but clean.)


Even something like this is subjective, and I think it betrays your longer tenure in the hobby; newer imagers are moving away from having noise in their images if they can. Older imagers and those who are on the more conservative side see this as a negative. There is no objective line to be drawn here. I tend to wax and wane on my opinion when it comes to noise. Sometimes I like a bit of fine grain to an image, other times I like to see it nice and noise free.
Jon Rista:
Regarding mounts and telescopes getting better... Have they? I've heard about some of this new mount technology, and about some of this more plug-and-play technology. I will certainly agree that increases the accessibility and ease of use of the equipment... Is it actually BETTER though? I own an AP Mach 1. It's a darn good mount. I don't know that the mounts most people are using these days even come close to it. There is the Mach 2, which is an evolution of the Mach 1. It's even more pricey. Not many people are going to be using high-end mounts like that. The common equipment, maybe, is more accessible, but I would dispute that it's all that much better.


I would say, yes, on average the equipment is better. It is certainly more accessible. The biggest revolution has been in imaging cameras which I will get to in just a second.
Jon Rista:
I don't agree that you can do more with less, not if your goal is to create a high quality image.

You can, by mathematical certainty, do more with less just by switching from an old CCD or even a 1600MM to a newer camera. You talked a lot about signal, which I am saying you can now collect more efficiently. If that is not enough, you already said it yourself:
Jon Rista:
I like the power of NXT even with 50 hours of overall signal...it allowed me to be just a bit more aggressive than I would have otherwise been.


This is the very definition of doing more with less. Perhaps with older processing techniques, you would not have been able to push that 50 hours all the way and would have had to go out and get 60 or 70 hours.
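
To put rough numbers on the "do more with less" point, here is a minimal SNR sketch in Python. The camera parameters and photon rates below are purely illustrative (not anyone's measured gear from this thread), under a simple shot-noise model with a fixed sub length:

```python
import math

def snr(total_hours, sub_s, qe, read_noise, obj_eps=0.05, sky_eps=0.5):
    """SNR of a mean-stacked faint target.

    obj_eps / sky_eps: photons per second hitting the pixel from the
    object and the sky (illustrative values, not measurements).
    """
    n_subs = total_hours * 3600.0 / sub_s
    obj = obj_eps * qe * sub_s            # object electrons per sub
    sky = sky_eps * qe * sub_s            # sky electrons per sub
    noise_per_sub = math.sqrt(obj + sky + read_noise**2)
    return obj * math.sqrt(n_subs) / noise_per_sub

def hours_for_snr(target, **kw):
    """Hours needed to reach a target SNR (SNR grows with sqrt of time)."""
    one_hour = snr(1.0, **kw)
    return (target / one_hour) ** 2

# Hypothetical "old CCD" vs "modern CMOS" numbers, for illustration only.
old = dict(sub_s=600, qe=0.60, read_noise=8.0)
new = dict(sub_s=600, qe=0.85, read_noise=1.5)

target = snr(50, **old)   # whatever SNR 50 hours on the old camera gives
print(f"old camera, 50 h -> SNR {target:.1f}")
print(f"new camera needs ~{hours_for_snr(target, **new):.1f} h for the same SNR")
```

With these made-up numbers, the higher-QE, lower-read-noise camera reaches the same stack SNR in roughly half the hours, which is the "more with less" effect in a nutshell.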
jrista 8.59
Álvaro Méndez:
I agree with your thoughts on this. Although I'm not a great photographer and my objects usually lack a lot of hours (I live in a cloudy/rainy province in the north of Spain where we have 2 clear nights a month tops, and I'm only two years into the hobby), I hate the AI sharpening look and the blobby/orange-peel noise reductions. At first it is tempting to overdo it, but you quickly realize that it is not a nice result and that everybody is overdoing it. The worst part is that I myself do not know when I have crossed the line.

Since I usually work with short integration totals taken with an OSC (again, the weather does not allow me to shoot many hours, go figure going the mono route), I devised a noise reduction method that I use so I can keep the grain -whenever possible- and skip any other noise reduction process. It consists of the following. After the initial stretch, I save a copy of the luminance. Then I go on working on the colour. Obviously this will bring out a lot of noise. When I have achieved my result, I separate the color channels, convolve them (I won't need the detail) and create an LRGB using the unaltered luminance. This is the best way I can think of for showing the actual data that I have gathered. Of course there are times when the noise is so high that I use ACDNR or TGV, for example in very dim objects with loads of sky background (i.e. I have reached the 24-hour exposure mark without being able to naturally reduce that hard noise, so then I resort to NR).

And I think BXT is a great tool, but I use it in Correct Only mode. I believe in deconvolution because it is mathematically precise, and even though I always keep my scopes perfectly collimated, it helps with tiny imperfections.

And I’m not a fan of background extraction because I believe it destroys a good part of my precious signal, so I use it only when it is strictly necessary.

Bottom line is I wish I could get more hours and use fewer cosmetic tricks. I think they are there to help, but we need to learn to use them wisely. And I'm the first one who needs to be reminded of this. The eye gets trained with time; now, for example, I am in the "saturate less, boy!!" phase because I tend to go overboard with that. Hopefully in the future I'll be able to have a nice portfolio of photographs that I can be proud of, but I will only be able to feel that if there's no cheating involved (and the weather behaves lol).

That being said, I also think there needs to be a balance between perfectionism and joy. I am not having a good time if I have to dedicate 6 months to one object. Two nights in a month really means one night, because I usually discard half the data. This is a complicated and sometimes frustrating hobby, so I admire those who dedicate 50+ hours to one object. I just can't. But I am okay with that.
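
For anyone who wants to try the LRGB idea Álvaro describes above, here is a minimal numpy/scipy sketch of the same concept, assuming the stretched luminance and the stretched, colour-processed RGB are already available as float arrays (the real workflow would of course use your processing tool's own channel extraction, convolution and LRGB combination steps):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lrgb_keep_grain(lum, rgb, blur_sigma=2.0):
    """Blur only the colour information and recombine it with the
    untouched luminance, so the grain in L is preserved.

    lum: 2D float array (stretched luminance, left untouched)
    rgb: 3D float array, shape (H, W, 3), stretched and colour-processed
    blur_sigma: strength of the convolution applied to each colour channel
    """
    # Convolve (blur) each colour channel; fine detail is not needed here.
    rgb_blurred = np.stack(
        [gaussian_filter(rgb[..., c], sigma=blur_sigma) for c in range(3)],
        axis=-1,
    )

    # Rebuild chrominance from the blurred RGB: per-pixel colour ratios.
    blurred_lum = rgb_blurred.mean(axis=-1, keepdims=True)
    ratios = rgb_blurred / np.clip(blurred_lum, 1e-6, None)

    # "LRGB combine": apply the colour ratios to the original luminance.
    return np.clip(ratios * lum[..., None], 0.0, 1.0)
```

A proper LRGB combination works in a luminance/chrominance colour space rather than with simple ratios, but the idea is the same: colour comes from the blurred RGB, while grain and detail come only from the untouched L.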

I actually use BXT in its full mode...but, I do greatly reduce the degree to which it affects stars. I like the overall shape corrective aspect of BXT, but I try not to let it overly reduce the stars. I'm still working on that...I'm still new to all of these tools, so I'm having to figure out how to use them to moderate effect on my own.

Regarding integration times... Maybe I have a tip for you there. Do you image ONLY on TOTALLY clear nights? If so, I would change that. Most of my imaging time involves nights that are not 100% totally clear. I learned a long time ago that to really get lots of integrated exposure time, you need to take advantage of every wholly clear night, as well as every hole. Even on nights where you might have some passing clouds, there are holes in the clouds where you can get some good data. Most of my data sets end up getting culled. I started using 600s subs for my NB imaging the last year or so that I was still imaging, and I would STILL discard tens of subs from the full data set. I often acquired 300, 400 subs across all three narrow band channels, and in some cases I would discard as many as 70 or 80 or more. That can be as much as 13-15 hours of data. BUT!! If I hadn't imaged on those nights that weren't totally clear...I would have had even LESS total data to integrate.

So one of the best ways to get more total data, is to image on those less than perfect nights. The main thing is, is it gonna rain or snow, or will the wind be blowing hard? Obviously, on those nights, keep it packed in. But there are plenty of nights that are mostly just patchy clouds, and you can often get quite a lot of usable data from such nights.
rockstarbill 11.02
Luka Poropat:
I can tell you that the image looks soft, the stars are "muted", the background is not neutral, with splotchy blues and yellows (overall nonuniformity), it is noisy for the integration time, and the core is too orange/brown in terms of saturation. Nothing bad in the image as a whole, just nothing special to deserve some kind of award, distinction or anything. It's just "another" M31 for me.

The image was taken with a ~6e read noise CCD camera, the FLI ML16200 to be exact, in my backyard in the PNW. That is why it is noisy; cameras from those days have 4x-8x the read noise modern cameras do.

I am not sure what you mean by soft, or that the stars are muted. The core matches the color I see in scientific images, although I am not looking at this on a wide-gamut screen, which is something I always wonder about when people remark on or view images. I do not see these blues and yellows in the background either.
HotSkyAstronomy 2.11
Bill Long - Dark Matters Astrophotography:
V.M Legary:
Among the upper echelon of imagers there is a tremendous 'arms race' going on that is pushing the hobby further along, and as you mention, a consequence of that is long integration times becoming more commonplace or, in the absence of that, an increasing amount of collaboration to combine multiple individual efforts. The expectations for award-winning images are getting higher and higher. That alone leads me to question the entire first part of your post.

Beating you all to full-NIR color images, sorry

NIR images used to be really popular. I forget who it was, but someone posted an NIR enhanced image of the Horsehead Nebula years back (maybe 2017-ish) that was pretty amazing. Taken with a Sony CCD camera from QSI if I am remembering right.

I say go for it. Could have some cool results.

Already am! Very successful, IMX410 is insanely good for NIR.
AstroLux 8.03
Jon Rista:
I don't think everyone here has an interest in really improving their skill. I think some portion of ABin does, but there are a LOT of images here that just seem to be quickly processed snaps of the night sky, and that's all they ever are, and there seem to be quite a number of imagers who post that kind of thing. Now, I'm not really talking about the main page...it's mostly when I search for an object and browse the results that I find that. It is also not just here on ABin; I've noticed this all over the net, some places a lot more heavily than others, where astrophotography seems in large part to have become another "snap a photo and share it and get some likes" thing.


This is evident in the IOTD statistics, which indicate that approximately 10%* of users receive distinct awards. To put it simply, ~10%* of AstroBin users contribute high-quality images, while the remaining ~90%* follow a more casual approach of "snap a photo, share it, and receive some likes."


*Just the contribution index has 10,000 contributors, with at least double that in total users on AstroBin, and 1,820 distinct users with awards (as of 09.03.2024).
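
A quick back-of-the-envelope check of that figure, using only the numbers in the footnote above (and assuming "at least double" means roughly 20,000 total users):

```python
awarded_users = 1820            # distinct users with awards (09.03.2024)
contributors = 10_000           # users on the contribution index
total_users = 2 * contributors  # "at least double" -> assume ~20,000

print(f"share of all users with awards: {awarded_users / total_users:.1%}")      # ~9.1%
print(f"share of contributors with awards: {awarded_users / contributors:.1%}")  # ~18.2%
```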
AstroLux 8.03
Bill Long - Dark Matters Astrophotography:
Luka Poropat:
I can tell you that the image looks soft, the stars are "muted", the background is not neutral, with splotchy blues and yellows (overall nonuniformity), it is noisy for the integration time, and the core is too orange/brown in terms of saturation. Nothing bad in the image as a whole, just nothing special to deserve some kind of award, distinction or anything. It's just "another" M31 for me.

The image was taken with a ~6e read noise CCD camera, the FLI ML16200 to be exact, in my backyard in the PNW. That is why it is noisy; cameras from those days have 4x-8x the read noise modern cameras do.

I am not sure what you mean by soft, or that the stars are muted. The core matches the color I see in scientific images, although I am not looking at this on a wide-gamut screen, which is something I always wonder about when people remark on or view images. I do not see these blues and yellows in the background either.

I do not want to stray from the topic, but if you want, you can send me a personal message and I will show you what I'm seeing.
rockstarbill 11.02
Luka Poropat:
I do not want to stray from the topic, but if you want, you can send me a personal message and I will show you what I'm seeing.

Oh, the image itself I am not too worried about. It's backyard data from a high LP zone and taken with a 92mm refractor and an old CCD chip. I am more concerned about what differentiates a good image from a great image, and what that looks like to others. That's in the spirit of "true image quality" from my perspective. I am also very interested in the actual differences sRGB panels vs Wide Gamut panels show, as that certainly impacts the perspective one would have of an image. I have had cases where someone with a wide gamut panel had a much different take than I did by looking at an image on my MacBook Pro, or on my Windows Desktop, for example.
HegAstro 11.91
Luka Poropat:
while the remaining ~90%* follow a more casual approach of "snap a photo, share it, and receive some likes."


I'm going to make the statement that it is a reach and borderline disrespectful to state that, just because an imager did not have their images selected for an award, they are simply taking an approach of snapping photos. It is the competitive aspect of this, and the flawed notion that an image has to win some award to be "worthy", that turns a lot of people off.
rockstarbill 11.02
Arun H:
Luka Poropat:
while the remaining ~90%* follow a more casual approach of "snap a photo, share it, and receive some likes."


I'm going to make the statement that it is a reach and borderline disrespectful to state that, just because an imager did not have their images selected for an award, they are simply taking an approach of snapping photos....

Also, stating that only 10% of imagers contribute quality images to AstroBin is not a good look, as that is solely based on the IOTD process -- which, no matter what I do (including reading Sal's writeup multiple times), I still cannot understand what it is the IOTD staff is actually evaluating. This is based on my own experience and on reviewing images at all three award levels on a consistent basis.

I see images with background issues that are IOTD. Star issues, IOTD. Noise issues, IOTD. Extreme over-use of BXT, IOTD. All of those issues, IOTD. Since the process does not have a feedback mechanism in it, you have no idea what was liked or not liked about an image. In the interest of "true image quality" (and again, I applaud Jon for opening this discussion), it would be good to know -- from the IOTD staff or from other interested parties -- exactly what that means.

It is also good to know what people think on the opposite side: those who disagree with the images selected by those criteria and think they are less than great, as well as those who do not care about the IOTD process itself, but do care about "true image quality". There is a middle ground where all of these perspectives come together -- and that is yet another thing I am interested in learning.

-Bill
jrista 8.59
Jon Rista:
FWIW I am not saying AI tools shouldn't be used... Just that they should be used effectively, and not overused, and in particular not to become a crutch or a replacement for real signal.


Yes, I agree with this.
Jon Rista:
Clean noise characteristic (not necessarily the absence of, which IMO is terrible, but clean.)


Even something like this is subjective, and I think it betrays your longer tenure in the hobby; newer imagers are moving away from having noise in their images if they can. Older imagers and those who are on the more conservative side see this as a negative. There is no objective line to be drawn here. I tend to wax and wane on my opinion when it comes to noise. Sometimes I like a bit of fine grain to an image, other times I like to see it nice and noise free.
Jon Rista:
Regarding mounts and telescopes getting better... Have they? I've heard about some of this new mount technology, and about some of this more plug-and-play technology. I will certainly agree that increases the accessibility and ease of use of the equipment... Is it actually BETTER though? I own an AP Mach 1. It's a darn good mount. I don't know that the mounts most people are using these days even come close to it. There is the Mach 2, which is an evolution of the Mach 1. It's even more pricey. Not many people are going to be using high-end mounts like that. The common equipment, maybe, is more accessible, but I would dispute that it's all that much better.


I would say, yes, on average the equipment is better. It is certainly more accessible. The biggest revolution has been in imaging cameras which I will get to in just a second.
Jon Rista:
I don't agree that you can do more with less, not if your goal is to create a high quality image.

You can, by mathematical certainty, do more with less just by switching from an old CCD or even a 1600MM to a newer camera. You talked a lot about signal, which I am saying you can now collect more efficiently. If that is not enough, you already said it yourself:
Jon Rista:
I like the power of NXT even with 50 hours of overall signal...it allowed me to be just a bit more aggressive than I would have otherwise been.


This is the very definition of doing more with less. Perhaps with older processing techniques, you would not have been able to push that 50 hours all the way and would have had to go out and get 60 or 70 hours.

Regarding noise, it's not so much about how much or how little... It's the characteristic. Does that make sense? You could have clearly visible noise, but it might still look better than other images, because the characteristic of the noise is pleasant. Some images may have lower noise, but a poorer quality characteristic.

Now, one thing I think is terrible these days is the utter obliteration of noise. That is, in a nutshell, one of the radical overreliance-on-AI issues I'm alluding to. I think obliterating noise is terrible! Maybe it's a trend, maybe newer imagers like it...if you think it's producing a better quality image, that's one of the things I'm trying to call attention to. I would strongly disagree. This is one of the kind of sad and depressing trends I've seen...OBLITERATION. Not just of the noise, but of the finer details as well. It seems the newer generation of imagers may have...well, I guess not lost, maybe never developed...an eye for the NUANCES of their images. Nuances which seem to get obliterated rather readily these days, between excessive NR and what I believe, based on my newfound recognition of these particular artifacts, is star removal and star addition. I've rarely done star removal, as it always seemed to be destructive to the finer details and nuances of the image. Supposedly there are some more manual approaches these days that can preserve the details; I still have to try them out. But with things like SXT, there is a characteristic in the artifacts of the image that is destructive to the details. I see it all over the place. Obliteration. It's really disappointing, and in this case I am not in alignment with the majority...I don't think it looks good, I don't think it will ever look good. It looks damaged, to be perfectly honest.

I think one of the most intriguing things about astrophotography is the exploration of the fine details. There are some amazing structures up in space...and I love exploring a quality, high resolution image of space, even for objects I'm quite familiar with, as you never know what new details you might come across. These days, a heck of a lot of the time, when I view the full-size image...there just ARE NO DETAILS. That's been really sad and disappointing, and part of the reason I started this thread. I honestly don't know if it is intentional...my assumption, honestly, is that it is not...but more of a side effect of how current-generation AI processing tools work. They work a certain way. People use them (and even recommend using them) a certain way. The results look a certain way. Obliteration of details seems common enough now...I'm not saying people are doing that totally intentionally. I honestly wonder, do people even know they are obliterating so many details?

I actually am working with RC-Astro to try and improve NXT's recognition of fine, dark details, which are among the kinds of detail most readily destroyed by NXT. I'm feeding him carefully selected exemplars to try and help future NXT trainings identify and recognize fine, dark details, so that darker structures don't just get totally smoothed over. I have always found that the details you can find in images with bright backgrounds, and lots of dark foreground dust, are some of the most intriguing. These days, dark dust is almost always rendered very smooth, flat, largely structureless (utterly so at finer scales, somewhat at coarser scales) and largely lacking in any interesting detail.

Regarding technology: some of the best images ever produced are still from CCDs. In particular the KAF-16803 CCD, which is still the king of the best images I've ever seen. SOME CMOS images are starting to get there, but I'm still not quite sure I've come across a CMOS image that really surpassed the best 16803 images I've ever encountered. If you only account for the pixel, then sure, maybe you can come up with some math that is "certain" about the superiority of CMOS. But we don't image with just cameras. We image with telescopes. Those telescopes are under skies of a certain brightness. There are more factors than just the sensor.

FWIW, overall signal collection efficiency isn't just about Q.E. A lot of it is about aperture and image scale. An old, noisy sensor with big pixels and a big aperture, even if it had, say, 60% Q.E., could still be a more efficient system. Read noise is also a factor that can be negated if you can expose long enough per frame. Q.E. differences can often be overcome with small adjustments to image scale. Newer CMOS cameras are amazing (I can't wait to get a QHY600), but...there are images produced with CCDs paired with systems that were incredible light guzzlers, in their time and today.
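
A minimal sketch of that point, with made-up example systems (the apertures, pixel sizes and Q.E. values below are illustrative, not anyone's actual gear): the photon rate per pixel from an extended object scales roughly with Q.E. x aperture area x (pixel scale)^2, so a big aperture with big pixels can out-collect a smaller, higher-Q.E. setup.

```python
import math

def electrons_per_pixel_per_s(aperture_mm, focal_mm, pixel_um, qe,
                              surface_rate=1.0):
    """Rough per-pixel signal rate for an extended source.

    surface_rate is an arbitrary photons/cm^2/arcsec^2/s constant,
    so only the ratio between systems is meaningful here.
    """
    area_cm2 = math.pi * (aperture_mm / 20.0) ** 2     # aperture area, mm -> cm radius
    scale_arcsec = 206.265 * pixel_um / focal_mm       # arcsec per pixel
    return qe * area_cm2 * scale_arcsec ** 2 * surface_rate

# Illustrative only: an old big-pixel CCD on a big scope vs a modern CMOS on a small one.
old_ccd = electrons_per_pixel_per_s(aperture_mm=400, focal_mm=2800, pixel_um=9.0, qe=0.60)
new_cmos = electrons_per_pixel_per_s(aperture_mm=130, focal_mm=910, pixel_um=3.76, qe=0.85)

print(f"old CCD system:  {old_ccd:7.1f} e-/pixel/s (relative units)")
print(f"new CMOS system: {new_cmos:7.1f} e-/pixel/s (relative units)")
print(f"ratio: {old_ccd / new_cmos:.1f}x in favour of the big-aperture CCD system")
```

Per-pixel comparisons like this ignore resolution, of course; the point is just that aperture and image scale can matter as much as Q.E.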

To your final point, touché. I guess I can do a little bit more with...well, I wouldn't say less. I did more with the SAME, for sure. Such is the benefit of AI! I will still invest the time to get tens of hours of integration, regardless. I would not, however, reduce my integration times by half, or more, and assume that AI could give me the same image in the end. In fact, I know it cannot, because with my latest image, I originally started with about a third of my total data, and even with AGGRESSIVE AI use and extensive processing, I was still not able to produce the same quality as the full data set (over 47 hours, after discarding over 12). Further, with the full data set, the final image required much less processing and far less aggressive use of AI to produce a vastly superior result. That, too, is one of the reasons I've posted this thread... I was pretty darned amazed at what modern processing tools were doing for me at first. Then I realized I was severely handicapped on the data side, and once I resolved that issue...well, AI is still very intriguing, but not quite as much as it was.
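
For what it's worth, a quick back-of-the-envelope on why a third of the data can't simply be pushed to the same result, assuming roughly sky-limited data where SNR grows with the square root of integration time:

```python
import math

full_hours = 47.0               # final integration after culling
partial_hours = full_hours / 3  # the initial ~one-third subset

# Under a sky-limited (shot-noise dominated) assumption, SNR ~ sqrt(time),
# so the full stack carries roughly this much more SNR than the subset:
print(f"SNR advantage of the full stack: ~{math.sqrt(full_hours / partial_hours):.2f}x")  # ~1.73x
```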

I'm not trying to diss anyone using AI. Just to offer that AI isn't actually a replacement for more real signal. It can do amazing things...but it can do even more amazing things with more real signal. Another way to put it, I guess...if you never captured photons from something in the first place, AI could never reveal it. It simply isn't there in the data. If you actually acquire some signal on something, then AI can certainly help you reveal it. If you get even more signal on something, AI can help you reveal it and make it look amazing. But without any signal at all, AI can't help you at all...
AstroLux 8.03
Bill Long - Dark Matters Astrophotography:
Oh, the image itself I am not too worried about. It's backyard data from a high LP zone and taken with a 92mm refractor and an old CCD chip. I am more concerned about what differentiates a good image from a great image, and what that looks like to others. That's in the spirit of "true image quality" from my perspective. I am also very interested in the actual differences sRGB panels vs Wide Gamut panels show, as that certainly impacts the perspective one would have of an image. I have had cases where someone with a wide gamut panel had a much different take than I did by looking at an image on my MacBook Pro, or on my Windows Desktop, for example.


What differentiates a good image from a great image is being unique and never seen before, with perfect colors, no noise, great detail, etc. (and with realism and being scientifically correct in terms of colors: Ha regions ruby red, Ha+Hb regions pink, OIII cyan/green, dark dust dark, IFN gray, etc.). True image quality is that, among other technical requirements. Yes, different panels show the same image differently; I see it all the time on images I look at daily, including my own. The best you can do, if you take astrophotography seriously, is have a serious panel at minimum 100% sRGB. It's the same as with refractors: if you want the best, you get fluorite, FPL-53 or FCD-100, and not FPL-51 or undisclosed glass that produces purple/blue fringing, etc.


But I can tell you confidently that those don't matter 100% if the "flaws" are easy to notice (whatever the panel or monitor, etc.). And whether I look at it on my phone, my monitor or other panels, your M31 will still have the same technical "flaws" I already mentioned.

Realistically speaking, not everyone will have the best panel, or even a good one. The goal is to produce the best you can from your own view, have others point out stuff you might have missed, and improve upon that.
njr95 1.43
Hi @Jon, your advice on CloudyNights has helped me massively on what to prioritise when shooting from the city (narrowband targets) and in dark skies (true RGB).

I agree that nothing beats long integration times, although new tools like AI allow us to push images harder. I have a couple of principles that help me adhere to this belief:
1. If I’m going to be spending lots of time trying to fix images with bad/insufficient data, I might as well use that same time to wait and acquire more data. Staring at a monitor too long gives me strained eyes and a headache.
2. There’s a quote from Adam Block that I keep close. To paraphrase “if certain details cannot be pulled out from an image without breaking it, that means I haven’t earned the right to bring out those details”. Which means getting more or better quality data.
AstroLux 8.03
Arun H:
Luka Poropat:
while the remaining ~90%* follow a more casual approach of "snap a photo, share it, and receive some likes."


I'm going to make the statement that it is a reach and borderline disrespectful to state that, just because an imager did not have their images selected for an award, they are simply taking an approach of snapping photos. It is the competitive aspect of this, and the flawed notion that an image has to win some award to be "worthy", that turns a lot of people off.

I was just trying to say that it is statistically correct that there are astrophotographers who do not really care about perfection and similar things in an image, and are simply having fun with astrophotography. That post was deliberately an oversimplification of Jon's point about there being multiple types of astrophotographers. I completely agree with you that it's a flawed notion when taken 100% seriously; even I think that others, including me, have great images on AstroBin that have not been awarded but are still great in their quality.
jrista 8.59
Luka Poropat:
Jon Rista:
I don't think everyone here has an interest in really improving their skill. I think some portion of ABin does, but there are a LOT of images here that just seem to be quickly processed snaps of the night sky, and that's all they ever are, and there seem to be quite a number of imagers who post that kind of thing. Now, I'm not really talking about the main page...it's mostly when I search for an object and browse the results that I find that. It is also not just here on ABin; I've noticed this all over the net, some places a lot more heavily than others, where astrophotography seems in large part to have become another "snap a photo and share it and get some likes" thing.


This is evident in the IOTD statistics, which indicate that approximately 10%* of users receive distinct awards. To put it simply, ~10%* of AstroBin users contribute high-quality images, while the remaining ~90%* follow a more casual approach of "snap a photo, share it, and receive some likes."


*Just the contribution index has 10,000 contributors, with at least double that in total users on AstroBin, and 1,820 distinct users with awards (as of 09.03.2024).

I haven't looked into the actual statistics of IOTD. I also don't want to claim I know what the other 90% "ARE" doing... I am sure some of that 90% are trying to improve their craft. I would also say that, here on ABin, I think it is more likely that an imager has a deeper interest in the hobby. Some of what I've come across is on other sites. Instagram, Facebook...there are so many images shared there, and the mentality you often see there is clearly geared towards "getting likes"... Such is the nature of those platforms.

I was just disputing the notion that "everyone" here was interested in improving their craft. I think it's stating the obvious that that couldn't be the case. What I have seen is more readily visible if you hit the ABin search, just search for some object (notably the popular ones), and then start browsing through. The main page is a different story, and IOTD is a different story; I guess I wasn't really looking at those as a basis for the trends I've been observing.
jrista 8.59
Joon Ren:
Hi @Jon, your advice on CloudyNights has helped me massively on what to prioritise when shooting from the city (narrowband targets) and in dark skies (true RGB).

I agree that nothing beats long integration times, although new tools like AI allow us to push images harder. I have a couple of principles that help me adhere to this belief:
1. If I’m going to be spending lots of time trying to fix images with bad/insufficient data, I might as well use that same time to wait and acquire more data. Staring at a monitor too long gives me strained eyes and a headache.
2. There’s a quote from Adam Block that I keep close. To paraphrase “if certain details cannot be pulled out from an image without breaking it, that means I haven’t earned the right to bring out those details”. Which means getting more or better quality data.

Both of these are excellent! The tradeoff inherent in #1, is very real. Boy, I've spent some HOURS trying to make do with insufficient data... Time I wish I could have back.

I hadn't heard Adam Block's quote before. It's about as fundamental as it gets though, I think.
rockstarbill 11.02
Luka Poropat:
What differentiates a good image from a great image is being unique and never seen before, with perfect colors, no noise, great detail, etc. (and with realism and being scientifically correct in terms of colors: Ha regions ruby red, Ha+Hb regions pink, OIII cyan/green, dark dust dark, IFN gray, etc.). True image quality is that, among other technical requirements. Yes, different panels show the same image differently; I see it all the time on images I look at daily, including my own. The best you can do, if you take astrophotography seriously, is have a serious panel at minimum 100% sRGB. It's the same as with refractors: if you want the best, you get fluorite, FPL-53 or FCD-100, and not FPL-51 or undisclosed glass that produces purple/blue fringing, etc.

I think this is a reasonable take, except for "no noise" -- as you are then saying that a great image is impossible. Not even the Hubble and JWST images are completely free of any noise at all. I do not want the "no noise" part to take away from the rest of this, though. I have mulled over the panel I use to process images with, as I never really thought it would be all that impactful, but it sounds like it can be. While off-topic, you may have helped me make a decision about the new 2024 Dell 40" 5K screen...

Back to true image quality, though. I think your analogy about optical systems is a good one. I guess the question I would have is: if one took a wide-gamut screen (Jon would know which of the standards is best to align with, maybe Adobe RGB? DCI-P3?) and did all of their processing and image creation on one -- what would that then look like on an sRGB screen? Better? Worse? All of this would certainly impact the perspective of "true imaging quality", I would imagine?
AstroLux 8.03
Bill Long - Dark Matters Astrophotography:
I think this is a reasonable take, except for "no noise"


I mean more a great SNR; of course we don't live in a perfect world, so "no noise" is impossible to achieve.

Bill Long - Dark Matters Astrophotography:
All of this would certainly impact the perspective of "true imaging quality", I would imagine?


You mentioned earlier a distinction between IOTD and "true image quality". Can you elaborate on this? Specifically, what do you define as "true image quality"? IOTD is not a perfect system; it has its guidelines, and I am sure you have read them and seen the examples of things to avoid in images. But what would you say is different between a regular IOTD and what you call a true quality image?
rockstarbill 11.02
Luka Poropat:
Bill Long - Dark Matters Astrophotography:
I think this is a reasonable take, except for "no noise"


I mean more a great SNR; of course we don't live in a perfect world, so "no noise" is impossible to achieve.

Bill Long - Dark Matters Astrophotography:
All of this would certainly impact the perspective of "true imaging quality", I would imagine?


You mentioned earlier a distinction between IOTD and "true image quality". Can you elaborate on this? Specifically, what do you define as "true image quality"? IOTD is not a perfect system; it has its guidelines, and I am sure you have read them and seen the examples of things to avoid in images. But what would you say is different between a regular IOTD and what you call a true quality image?

The "true image quality" I referred to, is the topic of the thread. I do not have a definition.
rockstarbill 11.02
Luka Poropat:
I mean more a great SNR; of course we don't live in a perfect world, so "no noise" is impossible to achieve.


Okay, so from an image by image perspective, how do you look at an image and determine its SNR? What yardstick are you using to make that call?
SemiPro 7.67
Jon Rista:
Now, one thing I think is terrible these days is the utter obliteration of noise. That is, in a nutshell, one of the radical overreliance-on-AI issues I'm alluding to. I think obliterating noise is terrible! Maybe it's a trend, maybe newer imagers like it...if you think it's producing a better quality image, that's one of the things I'm trying to call attention to. I would strongly disagree. This is one of the kind of sad and depressing trends I've seen...OBLITERATION. Not just of the noise, but of the finer details as well. It seems the newer generation of imagers may have...well, I guess not lost, maybe never developed...an eye for the NUANCES of their images. Nuances which seem to get obliterated rather readily these days, between excessive NR and what I believe, based on my newfound recognition of these particular artifacts, is star removal and star addition. I've rarely done star removal, as it always seemed to be destructive to the finer details and nuances of the image. Supposedly there are some more manual approaches these days that can preserve the details; I still have to try them out. But with things like SXT, there is a characteristic in the artifacts of the image that is destructive to the details. I see it all over the place. Obliteration. It's really disappointing, and in this case I am not in alignment with the majority...I don't think it looks good, I don't think it will ever look good. It looks damaged, to be perfectly honest.


As you mention later in your post, AI will never reveal what was not there in the first place. In that sense, when you leave granular noise in the image, are we sure our eyes are not just being fooled into thinking there is detail when it really is just noise? Part of the reason I leave noise in images is for this reason. When I flip back and forth between no noise reduction and full noise reduction, the only thing I am 'obliterating' is the noise, but to the untrained eye - which is everyone who does not see the raw data - that noise appears as if it is detail.

If you were talking about images with low integration times I would be inclined to agree here, but there are images out there that people have completely de-noised while preserving a good amount of detail.

I am also not going to sit here and pretend that people don't nuke details with de-noise techniques. It lies with the skill of the processor.
Jon Rista:
Regarding technology: some of the best images ever produced are still from CCDs. In particular the KAF-16803 CCD, which is still the king of the best images I've ever seen. SOME CMOS images are starting to get there, but I'm still not quite sure I've come across a CMOS image that really surpassed the best 16803 images I've ever encountered.


I think this would have been true before the rise of the IMX571 and friends. Respectfully, I have to say this is a very dated opinion that would hold if we were comparing images from the 1600MM with CCD cameras. The best images I see today are from CMOS users. Some of the big names are clinging to their CCD cameras, to be sure, but to say that only some CMOS images are starting to approach CCD images is a bit of a smack in the face to a lot of the high quality work that has been completed in the past few years.

Since we are just two people arguing on the internet, I suppose this is very subjective. So I will say that when I look at competition-winning images (not AB's IOTD), the only place where CCDs still have a lot of fight left in them is in images of galaxies, i.e. where their large pixels can still excel with large telescopes. However, even here they are losing ground to CMOS images.
Jon Rista:
FWIW, overall signal collection efficiency isn't just about Q.E. A lot of it is about aperture and image scale. An old, noisy sensor with big pixels and a big aperture, even if it had, say, 60% Q.E., could still be a more efficient system. Read noise is also a factor that can be negated if you can expose long enough per frame. Q.E. differences can often be overcome with small adjustments to image scale. Newer CMOS cameras are amazing (I can't wait to get a QHY600), but...there are images produced with CCDs paired with systems that were incredible light guzzlers, in their time and today.


Yes, I am aware, and I am glad you brought this up, because I think more people need to realize this! Still, the only realm where CCDs can maintain any sort of competitiveness is with large, long-focal-length telescopes that can fully utilize a CCD's large pixel size. Even then, I think someone building a system from scratch is just better off buying a CMOS these days. It's not going to get any better for CCDs as the CMOS revolution marches on.
Jon Rista:
I actually am working with RC-Astro to try and improve NXT's recognition of fine, dark details, which are among the kinds of detail most readily destroyed by NXT. I'm feeding him carefully selected exemplars to try and help future NXT trainings identify and recognize fine, dark details, so that darker structures don't just get totally smoothed over. I have always found that the details you can find in images with bright backgrounds, and lots of dark foreground dust, are some of the most intriguing. These days, dark dust is almost always rendered very smooth, flat, largely structureless (utterly so at finer scales, somewhat at coarser scales) and largely lacking in any interesting detail.


At any rate, it's good to see you getting back into things, and I hope your work with RC-Astro pays off, because we all benefit!
rockstarbill 11.02
I think this would have been true before the rise of the IMX571 and friends. Respectfully, I have to say this is a very dated opinion that would hold if we were comparing images from the 1600MM with CCD cameras. The best images I see today are from CMOS users. Some of the big names are clinging to their CCD cameras, to be sure, but to say that only some CMOS images are starting to approach CCD images is a bit of a smack in the face to a lot of the high quality work that has been completed in the past few years.

Some of the KAF-16803 sensors have lower read noise per micron squared than the new Sony chips do. I have a PL16803 with a measured 8e of noise, which is ridiculously low. It's currently on its way to FLI for its final service (they cease servicing them in May 2024), and once it comes back, it'll get built out and sent to New Mexico as the backup camera for me there. I currently use the IMX461 on the CDK14 there, so at some point I will be able to get the same data, with the same scope, in the same location, with both the modern Sony architecture and the old behemoth.

I expect to find that it'll trounce the Sony chip on broadband targets, and get its lunch handed to it in narrowband. 

-Bill
AstroLux 8.03
Bill Long - Dark Matters Astrophotography:
I have a PL16803 with a measured 8e of noise, which is ridiculously low.


Considering modern technology, that is ridiculously high. Even the IMX455 in LCG mode has at most 3.5e of read noise, while reaching 1.5e of read noise in HCG mode.
As someone who is using both large CCD and CMOS sensors (above full-frame size), I can tell you the only upside of the CCD architecture in 2024 is the size of the pixels, which are a good match for longer-focal-length telescopes in average seeing conditions. Apart from that, CMOS sensors demolish them completely: read noise, dark current, higher QE, among other things. The future is now.
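
As a quick sanity check on the "read noise per micron squared" framing from the earlier post, using only the read-noise figures quoted in this exchange and the published pixel pitches (9 um for the KAF-16803, 3.76 um for the IMX455; the IMX461 shares the 3.76 um pitch):

```python
def read_noise_per_um2(read_noise_e, pixel_um):
    """Read noise divided by pixel area, in e- per square micron."""
    return read_noise_e / (pixel_um ** 2)

print(f"KAF-16803, 8.0 e, 9.00 um pixels: {read_noise_per_um2(8.0, 9.00):.3f} e/um^2")
print(f"IMX455 HCG, 1.5 e, 3.76 um pixels: {read_noise_per_um2(1.5, 3.76):.3f} e/um^2")
print(f"IMX455 LCG, 3.5 e, 3.76 um pixels: {read_noise_per_um2(3.5, 3.76):.3f} e/um^2")
```

By that per-area yardstick, the old CCD and the Sony chip in its high-gain mode come out very close, while the low-gain figure is higher; per pixel, of course, the CMOS read noise is far lower.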