The need for REAL signal - Thoughts on true image quality [Deep Sky] Processing techniques · Jon Rista · ... · 79 · 2150 · 0

jrista 8.59
·  14 likes
Having just gotten back into the astrophotography fray after being largely out of the game for a few years, I thought I'd start a discussion on the need for real signal. I landed here, in 2024, right in the middle of the advent of AI tools. To lay out a key disclaimer...I'm old school. I processed everything manually in the past, from all pre-processing steps to what were sometimes extremely extensive post-processing workflows. So I come into this modern AI-fueled world with that mindset, and a deep recollection of the nature of images and their image quality, spanning back over a decade now. 

Having found myself dropped straight into the boiling pot of AI-powered processing, I thought I would offer a key observation and one recommendation.

-=[ 1 ]=-

First, the observation. My honest, heartfelt opinion regarding AI-powered processing is that it has resulted in a marked, notable decline in general image quality. I suspect this stems from the probably mistaken assumption that it can REPLACE something it cannot (more on that in a bit). While AI certainly makes some processing easier, and I am even a fan of some tools such as NoiseXTerminator for noise reduction, I think AI tools have GREAT potential to allow poorer-quality results if they are not used sparingly, with a measured, tempered hand. AI-powered processing tools have GREAT beneficial potential as well...but they need to be used sparingly and less aggressively, along with some spice of traditional processing techniques, if they are to produce exceptional quality images. If you are interested in creating high quality images, I have one key recommendation for you down below.

I've spent a lot of time browsing images (all around the web, not just here on ABin, which is actually a repository of some of the best images from recent years!) that people have been creating with a lot of these AI processing tools, and certain characteristics float to the top that have become key markers to me for when an image was processed with AI vs. processed for maximum quality. AI-powered star removal in particular tends to produce certain kinds of artifacts that are destructive to finer details. Excessive AI-powered noise reduction has made the "plastic look" or "orange peel" even more common today than it was back in 2019-2020. AI-powered star reduction has led to what is often a highly neutralized and normalized look to starfields, where stars often have little diversity in size, which diminishes one of the key aesthetic aspects of astro images (a diverse starfield, with sizes AND colors representative of the actual stars, does more for an image than star reduction so extensive that it leaves a field of highly normalized pinpoints of light). 

AI has the potential to help us produce better images. The ability to correct stars that have issues, such as coma at the periphery of the field due to the real-world challenges of achieving exactly the right spacing, tilt correction, etc., can be a helpful bonus until such issues can be properly corrected (IF they can be...the advent of CMOS, with its ultra-tiny pixels, can make this a very tall order for some scopes!). AI-powered NR has the potential to greatly simplify one of the more complex and challenging linear-stage processing tasks, if we can avoid taking it to excess. A little bit of fine grain, IMHO, can be a key factor in preserving fine details. 

-=[ 1.1 ]=-

I think that traditional processing techniques still have a place today. In fact, they may hold that place more strongly now than ever, given the advent of AI and its excessive power to be destructive (i.e. AI NR can utterly wipe out fine details if overdone). I think a blend of traditional processing, combined with measured and effectively constrained AI-powered processing, should lead to optimally high quality images. In fact, given the astounding quality of many images going back a decade or more, I strongly believe that if you have strong traditional processing skills, AI becomes a means of exerting LESS EFFORT on your PROCESSING, even though you could still produce the same quality image without access to AI processing tools. The techniques and tools to extract incredible IQ from even frustratingly scratchy data have been around for quite some time. I see AI as a means of reducing effort, rather than anything that could replace fundamental data quality. 

-=[ 2 ]=-

Now, the recommendation. The advent of AI-powered processing tools seems to have led to some misconceptions, which in turn have resulted in a portion of the astrophotography community thinking that AI can REPLACE something it simply cannot: REAL SIGNAL. I've read many times since I got back into this hobby that AI processing tools can produce high-SNR results from less data. I will be blunt: 

This is FALSE!!!!

No amount of processing can actually increase SNR. We can improve apparent image quality by, say, reducing the appearance, character, and strength of noise and increasing the smoothness of the image...however, even with AI processing, this is almost always destructive to real details to one degree or another. Noise reduction will usually soften something (even NoiseXTerminator and its AI friends!). Artificial enhancement of details will rarely reconstruct them accurately, and will often introduce details that never existed (and don't exist in reality). Artificial reduction of stars can improve the overall aesthetic of an image, but it is often intrinsically destructive to the stars themselves. 
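To make the "noise reduction will usually soften something" point concrete, here is a minimal sketch (my own illustration with made-up numbers, not any particular tool's algorithm): smoothing a noisy frame lowers the measured background noise, but it also flattens a real fine-scale feature.

```python
# Illustration only: smoothing trades real fine detail for apparent smoothness.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Noisy background with one real fine-scale feature (a faint "star").
image = rng.normal(loc=100.0, scale=5.0, size=(64, 64))
image[32, 32] += 40.0

smoothed = gaussian_filter(image, sigma=2.0)

# The background noise drops...
print(image[:16, :16].std(), smoothed[:16, :16].std())
# ...but so does the peak of the real feature above the background.
print(image[32, 32] - 100.0, smoothed[32, 32] - 100.0)
```

The smoothed frame looks cleaner, but the real feature is attenuated right along with the noise; no signal was added.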

SNR can only really be defined relative to some reference point. If we don't have a reference point, we cannot really define SNR. Most image processing is done toward some subjective aesthetic goal, and such processing, while it may increase image quality on subjective factors, will rarely improve the signal-to-noise ratio of the data. In fact, I'll go so far as to say that one of the very few ways we can actually improve SNR is to acquire more data. More REAL SIGNAL. In other words: spend more time under the real night sky, exposing. There is no replacement for this...not even AI. IMHO, no amount of AI could ever replace the acquisition of real signal, no matter how good AI gets. Discussion of the pros and cons and potential pitfalls that AI (especially generative AI) might bring to the world of astrophotography is a whole different can of worms, and I won't get into that here.
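As a toy illustration of why only more exposure helps (my own sketch, assuming uncorrelated per-exposure noise): averaging N frames leaves the signal alone while the noise falls as roughly 1/sqrt(N), so SNR grows as roughly sqrt(N). No post-processing step can replicate that.

```python
# Toy model: stacking N independent exposures improves SNR ~ sqrt(N).
import numpy as np

rng = np.random.default_rng(42)
signal = 50.0        # "true" flux per pixel (arbitrary units)
noise_sigma = 10.0   # per-exposure noise

def stacked_snr(n_frames, n_pixels=100_000):
    frames = signal + rng.normal(0.0, noise_sigma, size=(n_frames, n_pixels))
    stack = frames.mean(axis=0)          # average the exposures
    return stack.mean() / stack.std()    # measured SNR of the stack

for n in (1, 4, 16, 64):
    print(n, round(stacked_snr(n), 1))   # roughly 5, 10, 20, 40
```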

Mainly, my key recommendation, for those astrophotographers interested in producing high quality, accurate representations of the objects and structures in deep space, is to get more real signal. Don't let AI become a crutch that you must rely on in order to produce an image. Ideally, maintain your ability to create high quality astrophotography on your own, so that AI is simply a means of achieving your goals with LESS EFFORT...so that it does not become a crutch. Instead of focusing on AI, focus on time under the night sky, exposing real objects in space with real detectors, producing real data containing real signal. There is, IMHO, no better way to improve the quality of this craft. 

-=[ P.S. ]=-

Anyway...in some cases, overall integration times have increased. I've come across a number of team images produced from the data of 2-3 or more individual astrophotographers, and these images often reach 100 hours or more of total integrated exposure time. That said, I have come across significantly more images with less than 10 hours of total integrated exposure time, and perhaps even more with less than 5-7 hours. I've also come across a number of posts where people have literally stated that AI can replace integration time, which is IMHO a readily falsifiable notion. AI can simplify some of the more challenging aspects of image processing, but it will never be able to REPLACE the acquisition of real signal. I felt that someone should mention that the only way SNR can truly be improved (outside of a few processing tasks, such as removal of junk pixel data from cosmic ray strikes and sat/aircraft trails, or rejection of notably bad frames) is to actually acquire more real signal. I think that in an era where AI is rapidly taking over tasks once dominated by humans, once directed by human thought, it was important to note this key distinction.
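The frame-rejection exception mentioned above can be sketched with a simple sigma-clipped stack (a hand-rolled illustration; real stacking tools use more refined rejection schemes): corrupted samples such as cosmic-ray hits are excluded before averaging, which is why this step genuinely helps SNR instead of inventing signal.

```python
# Sketch of sigma-clipped stacking: reject outlier samples (cosmic rays,
# satellite trails) per pixel, then average only the surviving samples.
import numpy as np

def sigma_clip_stack(frames, kappa=3.0):
    """Mean-stack along axis 0, masking samples > kappa*sigma from the median."""
    frames = np.asarray(frames, dtype=float)
    med = np.median(frames, axis=0)
    std = frames.std(axis=0)
    keep = np.abs(frames - med) <= kappa * std
    counts = np.maximum(keep.sum(axis=0), 1)   # avoid division by zero
    return np.where(keep, frames, 0.0).sum(axis=0) / counts

rng = np.random.default_rng(1)
frames = 100.0 + rng.normal(0.0, 3.0, size=(20, 32, 32))  # 20 aligned subs
frames[5, 10, 10] += 500.0   # simulated cosmic-ray hit in one sub

plain = frames.mean(axis=0)
clipped = sigma_clip_stack(frames)
print(plain[10, 10], clipped[10, 10])   # plain ~125, clipped stays near 100
```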
battleriverobservatory 6.06
·  5 likes
Long story short, nothing can replace integration time or a dark sky. I consider 30 hours to be my acceptable minimum now. I've used AI tools for a long time, and they are not magic.
HotSkyAstronomy 2.11
I have to agree. I like how AI tools can make my images look, but I have found myself liking my images less and less when I use them.

I've stopped using DeepSNR (which I really only used for 3 images) after the last image I took, as it just doesn't look right. My higher-SNR datasets just look better from the get-go without it. I'm also starting to use NIR luminance as a replacement for BlurXterminator, which has been a great help in increasing true sharpness vs. the AI tool.
rockstarbill 11.02
·  2 likes
Agreed on all counts, Jon.

I would add that there's a really terrible trend occurring with images that have been "enhanced" using tools like Photoshop or Affinity Photo. I'm not talking about traditional post processing people prefer to use these tools for. I'm talking about way over the line alterations to imaging data that take it outside of the realm of actual Astrophotography and place it more in the realm of artist renditions, or CGI.

While the AI tools are here to stay, and I welcome them, the Photoshopping is destructive to the hobby and will only serve to have folks inside the hobby and outside the hobby lose faith and credibility in what is being produced.

Bill
AstroDan500 4.67
·  2 likes
Bill Long - Dark Matters Astrophotography:
Agreed on all counts. 

I would add that there's a really terrible trend occurring with images that have been "enhanced" using tools like Photoshop or Affinity Photo. I'm not talking about traditional post processing people prefer to use these tools for. I'm talking about way over the line alterations to imaging data that take it outside of the realm of actual Astrophotography and place it more in the realm of artist renditions, or CGI.

While the AI tools are here to stay, and I welcome them, the Photoshopping is destructive to the hobby and will only serve to have folks inside the hobby and outside the hobby lose faith and credibility in what is being produced.

Bill

I guess I would need an example of what you are referring to because the James Webb images are very highly processed using every tool there is.
They look completely CGI to me so who is the arbiter of what is too little, too much, etc. in the Astrophotography world?
The images on this site have improved a lot in the last year with the new AI tools from what I see.
I am in awe of the Processing done using all the current tools.
This sounds like the discussions about film vs. digital in 1998 or something, the new tools are just going to be improved, they are not going away.
rockstarbill 11.02
·  1 like
Not going to engage at that level of discussion and go pointing fingers. There are plenty of examples of it on here as well. You can find them rather easily, and that is the end of what I'll say about it. I'm happy saying my piece and moving along to other parts of the discussion.

I am not an old film guy screaming at clouds in the sky. What I called out is very real, impacts this hobby significantly more than it did a few years back, and can greatly damage the perception of the hobby from both the inside and the outside.

Bill
HotSkyAstronomy 2.11
·  2 likes
Dan Kearl:
Bill Long - Dark Matters Astrophotography:
Agreed on all counts. 

I would add that there's a really terrible trend occurring with images that have been "enhanced" using tools like Photoshop or Affinity Photo. I'm not talking about traditional post processing people prefer to use these tools for. I'm talking about way over the line alterations to imaging data that take it outside of the realm of actual Astrophotography and place it more in the realm of artist renditions, or CGI.

While the AI tools are here to stay, and I welcome them, the Photoshopping is destructive to the hobby and will only serve to have folks inside the hobby and outside the hobby lose faith and credibility in what is being produced.

Bill

I guess I would need an example of what you are referring to because the James Webb images are very highly processed using every tool there is.
They look completely CGI to me so who is the arbiter of what is too little, too much, etc. in the Astrophotography world?
The images on this site have improved a lot in the last year with the new AI tools from what I see.
I am in awe of the Processing done using all the current tools.
This sounds like the discussions about film vs. digital in 1998 or something, the new tools are just going to be improved, they are not going away.

James Webb data does not use AI tools because the scope is in space shooting infrared, and its sensors are calibrated to near-perfection. That's why we put scopes in space: so we can have many times more detail than from Earth. The issue is that people are getting lazy with their image collection and are using AI tools to compensate, which, when you have poor data, tends to create detail that is not there. This is a Real vs. Not Real argument, completely different from the Film vs. Digital argument.
AstroDan500 4.67
·  1 like
V.M Legary:
James Webb data does not use AI tools because the scope is in space shooting infrared, and its sensors are calibrated to near-perfection. That's why we put scopes in space: so we can have many times more detail than from Earth. The issue is that people are getting lazy with their image collection and are using AI tools to compensate, which, when you have poor data, tends to create detail that is not there. This is a Real vs. Not Real argument, completely different from the Film vs. Digital argument.

https://webbtelescope.org/contents/articles/how-are-webbs-full-color-images-made

Of course they have the best data one could get, but the processing is done using every tool available, including Photoshop-like methods to enhance the things they want to enhance.
Much like people do here.
phsampaio 3.61
·  3 likes
I'm not old-school by any means (I started in 2021), but I learned to process before the advent of AI tools. I followed every bit of advice and tutorial online to learn denoising, deconvolution, and star reduction. Most of those tools were very laborious and required a lot of testing to get right. Many of my images up until 2023 were processed with those tools.

With StarNet, and then the Xterminator toolkit, processing became a lot less stressful, and it was easier to get the same (and sometimes better) results than before. But as Jon says, those tools cannot substitute for real data, and certainly cannot increase SNR. I experienced that first hand: when re-processing some older pictures, I used BXT instead of a deconvolution process. The result was not much better than before, and the increase in detail was marginal. But that's because the data itself is of much worse quality: terrible guiding, imperfect focus, too little integration time. AI, and especially BXT, can better approximate the deconvolution equation, but it cannot work miracles.
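For reference, the classical deconvolution mentioned here can be written down directly. A hand-rolled Richardson-Lucy iteration (an illustration of the textbook algorithm, not BXT's actual method) sharpens only as far as the captured data allows:

```python
# Textbook Richardson-Lucy deconvolution for a known PSF.
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=30):
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(blurred, blurred.mean())
    for _ in range(n_iter):
        reblurred = np.maximum(fftconvolve(estimate, psf, mode="same"), 1e-12)
        correction = fftconvolve(blurred / reblurred, psf_mirror, mode="same")
        estimate = np.maximum(estimate * correction, 0.0)
    return estimate

# Tiny demo: blur a synthetic "star" with a Gaussian PSF, then restore it.
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 1.5**2))
psf /= psf.sum()

truth = np.zeros((64, 64))
truth[32, 32] = 1.0
blurred = np.clip(fftconvolve(truth, psf, mode="same"), 0.0, None)
restored = richardson_lucy(blurred, psf)
print(blurred.max(), restored.max())
```

In practice the PSF must be estimated from stars in the frame, and noise forces you to stop the iteration early; that is the limit no tool, learned or classical, can get around.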

And for my latest images, I've even skipped star removal (though I think it still has a place in NB imaging for RGB stars). I think that even the best star removal processes (SXT) leave too many artifacts. I prefer to do a proper deconvolution with BXT and stretch the image with GHS while protecting the highlights (i.e. the stars).
rockstarbill 11.02
·  2 likes
Dan Kearl:
V.M Legary:
James Webb data does not use AI tools because the scope is in space shooting infrared, and its sensors are calibrated to near-perfection. That's why we put scopes in space: so we can have many times more detail than from Earth. The issue is that people are getting lazy with their image collection and are using AI tools to compensate, which, when you have poor data, tends to create detail that is not there. This is a Real vs. Not Real argument, completely different from the Film vs. Digital argument.

https://webbtelescope.org/contents/articles/how-are-webbs-full-color-images-made

Of course they have the best data one could get, but the processing is done using every tool available, including Photoshop-like methods to enhance the things they want to enhance.
Much like people do here.

In the video on the page you linked to, they specifically state that they work with scientists to ensure the images produced are scientifically accurate, and that they take great care to ensure the integrity of the resulting images.

That's completely in line with what I said about folks using the tooling (like Photoshop) properly. My call-out was for those that go way beyond that, where the result wouldn't pass the sniff test from Webb's scientists.

They use GIMP, by the way. Well, that's what the page says anyhow. Mostly they correct black star cores, which are fine for science (no meaningful data to find) but not good for pretty photos. So they fix them. I'm okay with that.

Bill
rockstarbill 11.02
·  1 like
Pedro A. Sampaio:
I'm not old-school by any means (I started in 2021), but I learned to process before the advent of AI tools. I followed every bit of advice and tutorial online to learn denoising, deconvolution, and star reduction. Most of those tools were very laborious and required a lot of testing to get right. Many of my images up until 2023 were processed with those tools.

With StarNet, and then the Xterminator toolkit, processing became a lot less stressful, and it was easier to get the same (and sometimes better) results than before. But as Jon says, those tools cannot substitute for real data, and certainly cannot increase SNR. I experienced that first hand: when re-processing some older pictures, I used BXT instead of a deconvolution process. The result was not much better than before, and the increase in detail was marginal. But that's because the data itself is of much worse quality: terrible guiding, imperfect focus, too little integration time. AI, and especially BXT, can better approximate the deconvolution equation, but it cannot work miracles.

And for my latest images, I've even skipped star removal (though I think it still has a place in NB imaging for RGB stars). I think that even the best star removal processes (SXT) leave too many artifacts. I prefer to do a proper deconvolution with BXT and stretch the image with GHS while protecting the highlights (i.e. the stars).

I like that you brought up skipping star removal. I did the same on my most recent image (Whirlpool Galaxy) and used GHS to stretch the image and keep my stars within reason. This worked exceptionally well. Better than I thought going in. 

BXT and NXT were used but only light doses. The data was pretty good on its own.

Bill
spacetimepictures 4.07
·  1 like
Jon,

I have been quietly following your posts here and there for a long time now. I share some of your frustration, but hear me out:

I would state that for the general public, in astrophotography specifically, but also in every other form of art, and in even more unimaginable ways, AI-generated content may well take an ever larger share of the human intellectual landscape from now on. Not that it's a thought we should be excited about, but it's the way it's going. Technological disruptions are always scary, and in retrospect equally amazing in unexpected ways.

One possible outcome of the AI storm is that, for a minority of enthusiasts, hobbyists, scientists, and the like, the quest and appreciation for grounded, organic, handcrafted images will grow even stronger than before. Communities like ours could thrive on the need and desire to study or contemplate "proof of work" work.

This has happened in the food industry, with the advent last century of mass-produced processed food. However ubiquitous artificial (and, may I say, hallucinated) food became, there is still to this day, and in a growing manner, a need and desire for things thought up and made by humans to delight our still-physical senses.

Laurent
AstroLux 8.03
·  3 likes
garbage in = garbage out

Also, to be clear, you are talking about machine learning algorithms that are trained on datasets, with processes based entirely on mathematical equations and applied to maps created by the algorithm (e.g. BlurX). 

"AI" is just a recent buzzword that has nothing to do with these, and there is no real AI in existence currently, only software trying to act intelligent.  

The better the raw data, the better the final image; "AI" tools can only improve it to some degree. 
What you are describing and seeing in images is mostly inexperienced or uneducated astrophotographers pushing their datasets to a degree where anyone with some experience or knowledge can spot them as fake or "overprocessed". Have a look at the IOTD guidelines for quick examples. 

But I will also say that even before "AI" tools existed, there were astrophotos that were overprocessed or pushed past the limit of realism. 

I personally think these tools are beneficial and will remain so in the future, but always strive for the first thing I said in this post.
AlvaroMendez 2.39
·  3 likes
I agree with your thoughts on this. Although I'm not a great photographer and my objects usually lack a lot of hours (I live in a cloudy/rainy province in northern Spain where we have two clear nights a month tops, and I'm only two years into the hobby), I hate the AI-sharpened look and the blobby/orange-peel noise reductions. At first it is tempting to overdo it, but you quickly realize that it is not a nice result and that everybody is overdoing it. The worst part is that I myself do not know when I have crossed the line.

Since I usually work with short total integrations taken with an OSC (again, the weather does not allow me to shoot many hours, go figure going the mono route), I devised a noise reduction method that lets me keep the grain, whenever possible, and skip any other noise reduction process. It consists of the following. After the initial stretch, I save a copy of the luminance. Then I go on working on the colour. Obviously this will bring out a lot of noise. When I have achieved my result, I separate the colour channels, convolve them (I won't need the detail), and create an LRGB using the unaltered luminance. This is the best way I can think of for showing the actual data that I have gathered. Of course there are times when noise is so high that I use ACDNR or TGV, for example in very dim objects with loads of sky background (i.e. I have reached the 24-hour exposure mark without being able to naturally reduce that hard noise, so then I resort to NR).
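If I read the method right, the recombination step can be sketched in numpy roughly like this (the function name and the simple luminance proxy are my own simplifications, not what the actual tools compute): smooth only the colour, then re-impose the untouched luminance.

```python
# Sketch: blur the colour channels, then rescale so the original (unsmoothed)
# luminance is carried through unchanged.
import numpy as np
from scipy.ndimage import gaussian_filter

def lrgb_chroma_smooth(rgb, sigma=3.0):
    """rgb: float array (H, W, 3) in [0, 1]. Smooth chroma, keep luminance."""
    lum = rgb.mean(axis=2)   # simple luminance proxy
    smoothed = np.stack(
        [gaussian_filter(rgb[..., c], sigma) for c in range(3)], axis=2
    )
    smoothed_lum = np.maximum(smoothed.mean(axis=2), 1e-6)
    # Rescale the smoothed colour so its luminance matches the original.
    return np.clip(smoothed * (lum / smoothed_lum)[..., None], 0.0, 1.0)
```

Colour noise is suppressed by the blur, while the detail-bearing luminance passes through unchanged, which matches the spirit of keeping the grain.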

And I think BXT is a great tool, but I use it in Correct Only mode. I believe in deconvolution because it is mathematically precise, and even though I always keep my scopes perfectly collimated, it helps with tiny imperfections.

And I’m not a fan of background extraction because I believe it destroys a good part of my precious signal, so I use it only when it is strictly necessary.

Bottom line is I wish I could get more hours and use fewer cosmetic tricks. I think they are there to help, but we need to learn to use them wisely. And I'm the first one who needs that reminder. The eye gets trained with time; now, for example, I am in the "saturate less, boy!!" phase because I tend to go overboard with that. Hopefully in the future I'll be able to have a nice portfolio of photographs that I can be proud of, but I will only be able to feel that if there's no cheating involved (and the weather behaves, lol).

That being said, I also think there needs to be a balance between perfectionism and joy. I am not having a good time if I have to dedicate 6 months to one object. Two nights in a month really means one night, because I usually discard half the data. This is a complicated and sometimes frustrating hobby, so I admire those who dedicate 50+ hours to one object. I just can't. But I am okay with that.
SemiPro 7.67
·  7 likes
I am not sure I completely agree with your assertion about image quality. Today's NoiseX and BlurX artifacts are yesterday's noise reduction and deconvolution artifacts. It's not a one-to-one comparison, but let's not look at the past through rose-tinted glasses and pretend image artifacts did not exist en masse before the advent of AI processing tools; it was just a different type of artifact plaguing images.

Obviously my experience does not go back decades, but in my own time I can attest to the massive increase in expectations from what is considered a "good" image. Let's take M31 for example. When I started in 2020 you could expect an award on this website with a really solid broadband rendition. However, soon that was not enough and you needed at the very minimum a solid Ha component. Wait a little longer and now you need the Oiii as well. But wait! Now you need the faint background Ha. NOW you need the fabled Oiii arc as well. As we speak, even that is becoming commonplace among high level M31 images.

In such a short amount of time, the telescopes have gotten better. The mounts have gotten better. The cameras have gotten better. The processing tools have gotten better. In that sense, you can do 'more' with 'less'. Maybe not a lot has changed with the average image, but the average image is no longer a 10 hour integration with a CCD, but a 5 hour integration with a CMOS.

Among the upper echelon of imagers there is a tremendous 'arms race' going on that is pushing the hobby further along and as you mention a consequence of that is long integration times becoming more commonplace, or in absence of that an increasing amount of collaboration to combine multiple individual efforts. The expectations of award winning images are getting higher and higher. That alone leads me to question the entire first part of your post.

I have a lot of respect for you because your tutorials and other posts sprinkled around the web have contributed greatly to my knowledge in this hobby.

Now, as a member of the new generation of imagers, let me offer some advice in turn: don't get stuck in the past. Embrace what is new and use your experience to help guide the present and future instead of stifling it. Fighting change is a losing battle that will only leave you bitter; it's better to contribute to change and help guide it instead.
HegAstro 11.91
·  2 likes
Obviously my experience does not go back decades, but in my own time I can attest to the massive increase in expectations from what is considered a "good" image. Let's take M31 for example. When I started in 2020 you could expect an award on this website with a really solid broadband rendition. However, soon that was not enough and you needed at the very minimum a solid Ha component. Wait a little longer and now you need the Oiii as well. But wait! Now you need the faint background Ha.


Are we seriously making the claim that the OIII arc is needed for an M31 image to be considered good? What self appointed gods of astrophotography decided that? I think Jon's point is much more basic - not using AI to push acquired data beyond the limits of acquired signal and create artifacts. I didn't read that he is against the use of AI tools in all cases. His statements are simply that these tools are not a substitute for real signal and that they should be viewed as tools to make processing easier.

I personally use the new AI tools - but I like to think I use them to save time, rather than as a substitute for data acquisition. I actually think most good imagers, or imagers whose work I respect, use them in this way, and do so regardless of integration time. It helps to have a realistic expectation of what you can achieve given your constraints, and to enjoy the images that you can obtain within them. If someone wants to take part in an arms race for bragging rights, well, that is something they are free to do with their money and for their own enjoyment. I personally enjoy a well taken and processed M31 or M33 image, regardless of whether it wins an award or not, and regardless of whether it contains the OIII arc or not.
HotSkyAstronomy 2.11
Among the upper echelon of imagers there is a tremendous 'arms race' going on that is pushing the hobby further along and as you mention a consequence of that is long integration times becoming more commonplace, or in absence of that an increasing amount of collaboration to combine multiple individual efforts. The expectations of award winning images are getting higher and higher. That alone leads me to question the entire first part of your post.

Beating you all to full-NIR color images, sorry
rockstarbill 11.02
·  2 likes
Arun H:
Obviously my experience does not go back decades, but in my own time I can attest to the massive increase in expectations from what is considered a "good" image. Let's take M31 for example. When I started in 2020 you could expect an award on this website with a really solid broadband rendition. However, soon that was not enough and you needed at the very minimum a solid Ha component. Wait a little longer and now you need the Oiii as well. But wait! Now you need the faint background Ha.


Are we seriously making the claim that the OIII arc is needed for an M31 image to be considered good? What self appointed gods of astrophotography decided that? I think Jon's point is much more basic - not using AI to push acquired data beyond the limits of acquired signal and create artifacts. I didn't read that he is against the use of AI tools in all cases.

I personally use the new AI tools - but I like to think I use them to save time, rather than as a substitute for data acquisition. It helps to have a realistic expectation of what you can achieve given constraints, and enjoy the images that you can obtain within these constraints. If someone wants to take part in an arms race for bragging rights - well that is something they are free to do with their money and for their own enjoyment.  I personally enjoy a well taken and processed M31 or M33 image, regardless of whether it wins an award or not, and regardless of whether it contains the OIII arc or not.

It's the second part of what he said that really nails it. To be considered award-worthy, an M31 image must have these additional components, and those requirements have crept further and further in that direction over time.

It's not just M31 though, although that's a fantastic example.

Here is an image of M31 I posted recently:

https://www.astrobin.com/k408qb/

Didn't get a single sniff from the IOTD review. What's wrong with that image? It certainly is the older take on the data, without clouds of Ha in the background or the hot new OIII arc and OIII blossom. Absent those, though, it's pretty good IMO. Would love to hear from others. Feel free to pick it completely apart. Who knows, I may learn something, and I love learning. That's my favorite part of this whole thing. Learning.

The arms race mentioned is extremely real. I believe it also lends itself to people juicing their images beyond what is reasonable. Like any competitive sport, you'll have some participants going outside of the rules to get an advantage. In some cases these extravagant images are very real; in others they are completely fabricated. However, when they win awards and move the expectations needle in the wrong direction, they cause very real harm not only to the integrity of the process (IOTD in this case) but also to the integrity of the hobby itself.

Bill
rockstarbill 11.02 · 1 like
V.M Legary:
Among the upper echelon of imagers there is a tremendous 'arms race' going on that is pushing the hobby further along and as you mention a consequence of that is long integration times becoming more commonplace, or in absence of that an increasing amount of collaboration to combine multiple individual efforts. The expectations of award winning images are getting higher and higher. That alone leads me to question the entire first part of your post.

Beating you all to full-NIR color images, sorry



NIR images used to be really popular. I forget who it was, but someone posted an NIR enhanced image of the Horsehead Nebula years back (maybe 2017-ish) that was pretty amazing. Taken with a Sony CCD camera from QSI if I am remembering right.

I say go for it. Could have some cool results.
jrista 8.59 · 2 likes
To be clear here...I am not saying not to use AI processing tools. I am also trying to be careful about target audience here, by explicitly stating people who are interested in creating high quality images, as I know there is a large (and ever growing, and probably majority) segment of imagers now, who are NOT in the hobby to produce high quality images, but more for...well, other reasons. 

I'm trying to corral the topic largely to the imagers who have an actual interest in creating high quality images, and I guess by extension, those who may have the MEANS to create high quality images.
AstroDan500 4.67 · 5 likes
Jon Rista:
To be clear here...I am not saying not to use AI processing tools. I am also trying to be careful about target audience here, by explicitly stating people who are interested in creating high quality images, as I know there is a large (and ever growing, and probably majority) segment of imagers now, who are NOT in the hobby to produce high quality images, but more for...well, other reasons.

I really have no idea what you are talking about now... Who exactly are you referring to?
Is there something nefarious going on here that people are unaware of?
I browse and post on this site, and it seems like most people are trying to image, learn, and produce the best images they can.
There are also real pros here who astonish me with their craft.
I guess I'm missing the people who you claim are up to something else?
This is getting weird...
rockstarbill 11.02 · 1 like
Jon Rista:
To be clear here...I am not saying not to use AI processing tools. I am also trying to be careful about target audience here, by explicitly stating people who are interested in creating high quality images, as I know there is a large (and ever growing, and probably majority) segment of imagers now, who are NOT in the hobby to produce high quality images, but more for...well, other reasons. 

I'm trying to corral the topic largely to the imagers who have an actual interest in creating high quality images, and I guess by extension, those who may have the MEANS to create high quality images.



I think your points and intent are great. I'm very eager to hear what others have to say. I definitely have a strong interest in producing the best images I can. I always have had that goal myself. I know many others that are in the same boat.

@Chris White- Overcast Observatory may have some good thoughts on this as well. 🙂
jrista 8.59 · 1 like
V.M Legary:
I am not sure if I agree with your assertion completely about image quality. Today's NoiseX and BlurX artifacts are yesterday's noise reduction and deconvolution artifacts. It's not a one-to-one comparison, but let's not look at the past through rose-tinted glasses and pretend image artifacts did not exist en masse before the advent of AI processing tools; it was just a different type of artifact plaguing images.

Obviously my experience does not go back decades, but in my own time I can attest to the massive increase in expectations for what is considered a "good" image. Let's take M31 for example. When I started in 2020 you could expect an award on this website with a really solid broadband rendition. However, soon that was not enough and you needed at the very minimum a solid Ha component. Wait a little longer and now you need the OIII as well. But wait! Now you need the faint background Ha. NOW you need the fabled OIII arc as well. As we speak, even that is becoming commonplace among high level M31 images.

In such a short amount of time, the telescopes have gotten better. The mounts have gotten better. The cameras have gotten better. The processing tools have gotten better. In that sense, you can do 'more' with 'less'. Maybe not a lot has changed with the average image, but the average image is no longer a 10 hour integration with a CCD, but a 5 hour integration with a CMOS.

Among the upper echelon of imagers there is a tremendous 'arms race' going on that is pushing the hobby further along and as you mention a consequence of that is long integration times becoming more commonplace, or in absence of that an increasing amount of collaboration to combine multiple individual efforts. The expectations of award winning images are getting higher and higher. That alone leads me to question the entire first part of your post.

I have a lot of respect for you because your tutorials and other posts sprinkled around the web have contributed greatly to my knowledge in this hobby.

Now, as a member of the new generation of imagers, let me offer some advice in turn: don't get stuck in the past. Embrace what is new and use your experience to help guide the present and future instead of stifling it. Fighting change is a losing battle that will only leave you bitter; it's better to contribute to change and help guide it instead.

I'm working with what is new... That is partly what spurred this. I have very high standards. These new tools, while good, DEFINITELY have their detracting aspects. I have spent weeks here, browsing around astro images shared all over the web. The...distinct characteristics of AI processing PERVADE. There are notable detracting aspects to images primarily or heavily reliant on AI processing. I think there are some trends as well, such as the utter pervasiveness of star removal, that have led to some sad generalized outcomes (i.e. common artifacts that I can now trace right back to a couple of specific tools).

Given what I've seen, the trends, the increasingly pervasive AI-derived characteristics to a lot of processing, have made me very wary of just wholly and openly embracing all the AI stuff. Hence the post. I know some people are just going to stick with it, go full on AI, and never look back, and that's ok... Some people though, might be given pause here, and maybe AI tools will be used with more care, and maybe quality of the results for each individual imager who does take pause might increase, as they learn that maybe AI isn't the ultimate solution to "SNR" and image quality. 

FWIW, I am not saying AI tools shouldn't be used... Just that they should be used effectively and not overused, and in particular that they should not become a crutch or a replacement for real signal. I think trying to use AI as a replacement for real signal en masse would lead to a long-term and very great decline in astrophotography, what it represents, and what the world gains from it.

Also, I guess I should state, I don't really care who is winning contests. Those are very subjective a lot of the time; IOTD certainly has its subjectivity, for sure.

I am just talking about general quality. I think if we wanted to, we could come up with a set of objective quality factors to help determine quality. Some things are largely... well, I can't think of a better term than "obvious", or perhaps more of an "innate skill" for most. Certain aspects of things like color can be highly personal... but some aspects generalize: there are combinations of colors that most humans find pleasing, and others that just clash in terrible ways for most people. Clean noise characteristics (not necessarily the absence of noise, which IMO is terrible, but clean). Accurate details (i.e. not destroyed by a star removal algorithm). A lack of notable processing artifacts. Good depth of color. Either accurate (for the filter set used) or aesthetically pleasing color rendition. There are factors of image quality that will make almost any viewer go "wow"... There are also quality detractors that will, at the very least, result in someone moving on to the next image promptly.

IMHO, IOTD is not the deciding factor here. Does the image have impact? Are the key quality factors there, and are they at high levels? 

Regarding mounts and telescopes getting better... Have they? I've heard about some of this new mount technology, and about some of this more plug-and-play technology. I will certainly agree that it increases the accessibility and ease of use of the equipment... Is it actually BETTER, though? I own an AP Mach 1. It's a darn good mount. I don't know that most of the mounts I'd say most people are using these days even come close to it. There is the Mach 2, which is an evolution of the Mach 1. It's even more pricey. Not many people are going to be using high-end mounts like that. The common equipment may be more accessible, but I would dispute that it's all that much better.

I don't agree that you can do more with less, not if your goal is to create a high quality image. I'm sure that you can create AN image a lot more easily, and with a lot less, than what we might have been able to make do with in the past. But when it comes to true quality, IMHO, LESS will NEVER result in an improvement. ;) Ever. Fundamentally, no matter what equipment you use or what kind of skies you have... we are working with digital signals. The only way to truly improve a digital signal is with MORE SIGNAL. A giant aperture and a giant sensor can certainly do that. They are also most assuredly NOT accessible to the majority of imagers. But the topic I tried to start here is more interested in astrophotographers who have a distinct interest in creating high quality images... vs. the average imager these days just looking to "snap" some "space photos" (which seems to have pervaded places like Facebook).
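To put a rough number on the "more signal" point: in the shot-noise-limited regime, SNR grows with the square root of total integration time, so 16x the sub-exposures buys only about 4x the SNR. Here is a minimal simulation of that scaling; the photon rates are hypothetical, purely for illustration, and it ignores read noise and other real-world terms:

```python
import numpy as np

rng = np.random.default_rng(42)

SIGNAL_RATE = 100.0  # photons per sub from the target (hypothetical)
SKY_RATE = 400.0     # photons per sub from sky background (hypothetical)

def stacked_snr(n_subs, n_trials=20000):
    """Estimate the SNR after stacking n_subs shot-noise-limited subs.

    Each trial records Poisson(signal + sky) total counts over the
    stack; the known sky mean is subtracted, and SNR is the mean
    recovered signal divided by its standard deviation.
    """
    totals = rng.poisson(n_subs * (SIGNAL_RATE + SKY_RATE), size=n_trials)
    recovered = totals - n_subs * SKY_RATE
    return recovered.mean() / recovered.std()

snr_1 = stacked_snr(1)
snr_16 = stacked_snr(16)
# Shot-noise statistics predict SNR ~ sqrt(n_subs): 16x subs -> ~4x SNR.
```

This is why no amount of post-processing substitutes for integration time: noise reduction can hide the variance, but only more photons actually shrink it.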

If you are not interested in producing quality results and just want to enjoy the hobby, I have no problem with that. Enjoy!! Space is amazing, beautiful and always overhead. Have fun with your hobby.

For those who do want to produce the best quality images they can, the main thing is: get more real signal. Don't let AI take over, and don't use it as a crutch for a lack of real signal (I like the power of NXT even with 50 hours of overall signal... it allowed me to be just a bit more aggressive than I would have otherwise been). Leverage some traditional processing techniques, and maintain a measured hand in all your processing. Don't go overboard; I think that is one of the easiest ways to lose IQ and introduce an ever-growing volume of artifacts... every image can handle a certain amount of processing before it breaks down, and the stronger the REAL signal, the more processing the image is likely to be able to handle.
AstroLux 8.03 · 2 likes
Arun H:
Are we seriously making the claim that the OIII arc is needed for an M31 image to be considered good? What self-appointed gods of astrophotography decided that?


No, you can have a good M31 image without the OIII arc, or the background Ha, or the Ha inside M31. However, over the years the standard of quality work, and how far amateur images push the limits, have gone up by a lot. More and more authors are spending more and more time on their images, presenting a better overall image. A sharp broadband M31 image (without Ha or OIII), without lots of noise or other technical flaws, is, well... basic, average, and "not unique". Which does not mean it's bad; it just means it's most likely not going to get awarded. Yet again, because the standards are so high now. The self-appointed gods are the photographers who pushed the boundaries of such images with longer integration times and set the bar extremely high. If the whole point is to get better, why get bitter if, for example, a 5-year-old IOTD is now just an average image? Everyone who takes astrophotography seriously in some way wants to set the bar as high as they can go and create the best image they can.

Bill Long - Dark Matters Astrophotography:
Feel free to pick it completely apart. Who knows, I may learn something and I love learning. That's my favorite part of this whole thing. Learning.


I can tell you that the image looks soft, the stars are "muted", the background is not neutral, with splotchy blues and yellows (overall nonuniformity), it is noisy for the integration time, and the core is too orange/brown in terms of saturation. Nothing bad in the image as a whole, just nothing special to deserve some kind of award, distinction, or anything. It's just "another" M31 for me.
AstroDan500 4.67 · 2 likes
Jon Rista:
The only way to truly improve a digital signal, is with MORE SIGNAL. A giant aperture and a giant sensor, can certainly do that. They are also most assuredly NOT accessible to the majority of imagers. But, the topic I tried to start here, is more interested with astrophotographers who have a distinct interest in creating high quality images...vs. the average imager these days just looking to "snap" some "space photos" (which seems to have pervaded places like FaceBook.)

I am not a Facebook person, but I have seen some images posted, and I have to ask WHY you have a problem with it?
People are in hobbies for all kinds of reasons. I don't think the majority here are into this hobby for "other reasons" as you stated. If someone wants to take images and post them on Facebook or Instagram, is that a big problem for you?
I still have no idea who you are referring to with your cryptic posts.
You seem to be just putting down people who you feel should apparently just not be in this hobby because they are not "pure" enough for you?