How dark may the sky be? [Deep Sky] Processing techniques · Olaf Fritsche

This topic contains a poll.
May the background on DSO photos be extremely dark or even black?
Yes.
No.
It depends on the picture.

Astrobird 10.16
In the comments and on the forum, I keep reading that the background on DSO images must not be black but a dark gray. What I never read is the reasoning behind this. Is there even a comprehensible reason?
Why is the sky not allowed to be as dark as it looks to the eye during visual observation? Is this just a matter of fashion?
(By the way, in many Hubble images the background is very dark to black.)

Can someone tell me a reason why the background absolutely must be gray?

andreatax 7.80
The reason for it not being jet black, utterly devoid of light, is that in fact it isn't. That holds in an absolute sense, from the point of view of imaging down here rather than up there. On top of that, most of us will be imaging under some or a lot of LP, one more reason for it not being black. Even if there is no artificial LP, there are natural sources such as zodiacal light, the gegenschein and so forth, never mind imaging an object against the background of the Milky Way. The final reason is that very few images taper off nicely into the black point; you just help them along aesthetically by raising the black point value to some grade of dark grey.

barnold84 10.79
Hi Olaf,

First of all, you are free to set the black point in your images in the way you want unless your goal is photometric accuracy. Considering the common pretty picture approach, you might notice that images with a gray sky background look aesthetically more pleasing (to most observers).

From a physics perspective, I see the following points: here on Earth we have to deal with light pollution, but also with effects like natural airglow, which add light to the sky background (ESA about airglow). Even if you got rid of all that, a visual observer would still perceive Eigengrau (Wikipedia), our perception in the absence of any incoming light.

Now to space-based observation: there should always be photons in the visual spectrum. Either they come from objects so far away that they haven't been resolved by the telescope (even Hubble), or from Planck radiation itself. Modeling the universe as a black body, there is always a non-zero contribution at all wavelengths, never total absence.
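
For reference, Planck's law gives a strictly positive spectral radiance at every wavelength for any temperature above absolute zero, so even the roughly 2.7 K cosmic microwave background contributes something:

$$
B_\lambda(\lambda, T) \;=\; \frac{2hc^{2}}{\lambda^{5}}\,\frac{1}{e^{hc/(\lambda k_{B} T)}-1} \;>\; 0
\qquad \text{for all } \lambda > 0,\ T > 0
$$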

CS,
Björn

stevendevet 6.77
"May"? "not allowed"? you can do whatever you want.

Unless you are taking pictures for scientific value, there are no rules on what you "may" or "may not" do.
Once the "scientific" process of data collection is done, most of us move on to an artistic stage when creating our images. So, as with every art and artist, it's up to you to create whatever vision you want.


Personally, I don't go full-on black if I can avoid it and if noise levels allow it.

Mainly to keep a nice contrast with any dark dust in nebulae that I want to show: I want those dark bits of dust to be the darkest elements of the image, to draw attention to them. The same goes for fainter dust at the edges of nebulae, which wouldn't be visible if I went black with my backgrounds.

Why? Because I think it looks best.
That's it, and that's all you should really care about.

Astrobird 10.16
Thank you, Andrea, Björn and Steven! 

So far we have two arguments against a nearly black background:
1. It is a matter of personal taste. One can have different opinions about this; it is just a personal view.
2. The natural background is not really black, for various reasons. I don't find this argument particularly compelling. We have no scruples about enhancing the brightness and colorfulness of extremely faint objects. Why then should we not make the dark parts completely dark?

Personally, I have no problem making my own images match my personal taste. But in various discussions I have often read that it is downright "forbidden" to make the background almost black, that only total beginners would do that. Someone even gave exact RGB values for the "correct" background, unfortunately always without giving reasons.

So far I don't see any reason beyond personal taste.

DarkStar 18.84
Since we are doing art and not science, you may do whatever you want. You may make the background red, purple or pink and publish it in a "fine art group".
Whether others like it or not is their taste. These "dos" and "don'ts" are really subjective, and I agree with your point: we do not hesitate to manipulate curves in many ways, but for the background there are suddenly strict rules? That rule is, per se, void.

Also: you only get a non-dark background when taking the image from the ground. The Hubble images have an almost pitch-black background. So the question is: should my image look like one taken from space or from the ground?

jonnybravo0311 7.83
As others have stated, it's your data and your interpretation of that data. Make it purple, yellow, orange and green if you want.

Now, I typically do not bring my black point so far to the right as to completely clip the background. To me, that makes the image appear "plastic" and "unnatural". I also feel that people tend to clip the background to hide problems in the data; for example, the easiest way to get rid of some background noise is simply to bring the black point high enough to bury it. Again, this is my opinion and my taste. Your opinion may be different, and that's perfectly fine.

andreatax 7.80
The only "objective" point for avoiding a purely black background is that you don't know where exactly your data ends, both in amplitude and in distribution. The larger the sensors get the more unlikely is that your background is uniform, both in colour and in intensity. Keeping the background (a possibly tiny value) above the purely black (0,0,0) is to avoid clipping your data which is why most beginners are recommended to avoid making the background pitch black. Personally I'll consider clipping the data by raising the black point value above the background level a serious mistake. What you do with your data is however entirely up to you.

bennyc 8.42
Aside from the comments above: our vision is pretty poor at noticing details at either end of the histogram. Get too close to either 100% (white) or 0% (black) and details just above or below the respective end become harder to notice than midtones. On top of that, pretty much everyone's screen is poorly calibrated, or not calibrated at all. Again, things at the edges of the histogram can almost disappear on a screen like that.

You may not care about the background, but there is usually interesting stuff sitting just above it (faint distant galaxies, dust, ...). If you make the background too dark (I'd say below 10%, some say 15 - I think *that* depends on the picture), those little things become harder to see, and that detracts from the image. Also, from what I've seen, that is often the purpose: pictures with a very black (or even clipped) background are often trying to hide poor LP gradient removal, artefacts, or both. Sometimes it is just inexperience. Either way, a very black background is a bit of a red flag.
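
As a rough sketch of how you might check where your background sits (assuming a stretched, normalized 0-1 luminance array; the function name and sampling fraction are just illustrative):

```python
import numpy as np

def background_level(img, sample=0.25):
    """Estimate the sky background as the median of the darkest 'sample' fraction of pixels."""
    flat = np.sort(img.ravel())
    return float(np.median(flat[: int(sample * flat.size)]))

# stretched = ...  # your stretched luminance image, values in 0..1
# print(background_level(stretched))  # e.g. aim for roughly 0.10-0.15 rather than 0.0
```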

Last but not least, this is sort of a community consensus (even if there are good reasons behind it). You may of course do as you please, but as with all social media (and let's for a moment count AB among that, for better or worse), going against the stream may not be the easy path.

Krizan 5.73
My argument against clipping by setting the black point to 0: no nebula has a hard edge where the nebula is and is not; it's a gradual fading of the Ha into the dark. The same is true of galaxies: they slowly fade into the dark void. By setting the black point to 0,0,0 you will chop off the faint outer parts and give the object too hard an edge. I try to keep mine around 20 to 30 so as not to clip very faint data. That then demands better background noise control, but it is better to over-smooth the faint stuff than to clip it completely.

I agree, this is often a beginner mistake, and it is often done to eliminate background noise.

I also agree that nebulae usually do not exist against a black void. Planetary nebulae may be the exception, but some imager will always demonstrate that there is still more outer nebulosity around that planetary nebula. The Dumbbell is a prime example.

Lynn K.

DarkStar 18.84
Hello Lynn,

I have a different understanding of background. If there is any signal from the observed object (e.g. the M27 you mentioned) or from other faint objects, then in my view it is not background. I am also critical of clipping as a means of noise reduction. But the question was actually about the background itself.
For me, the background is the areas showing only skyglow and no distinct object signal - effectively just noise.

bennyc 8.42
Ruediger:
"I have a different understanding of background. If there is any signal from the observed object (e.g. the M27 you mentioned) or from other faint objects, then in my view it is not background. I am also critical of clipping as a means of noise reduction. But the question was actually about the background itself."

Objects have soft edges, and small or faint non-target (background) objects and structures are everywhere, often just one or two percent above the "background": things like IFN, Ha filaments, matter bridges, more distant galaxies. You may not have the data to feature them prominently, but by using the black point aggressively you destroy (or hide) them. Hard clipping is not the answer; these structures literally fade *into* the noise and become part of it.

Ruediger:
"For me, the background is the areas showing only skyglow and no distinct object signal - effectively just noise."

The right tools (DBE/ABE in PI, LP removal in APP, GradientXterminator in PS, ...) should be used for the LP/skyglow gradients you may have to deal with. For noise there is some mitigation (not complete removal) with the right NR algorithms, but if those are insufficient, then to get a palatable result there is the option of resampling the image ("binning in post") or, probably most importantly, simply not stretching so hard to begin with. If I have an issue with background noise that I can't handle with NR, it is usually because I stretched the image further than the data I captured supports. That means I either need to stretch less, or stop processing there and revisit after I have added more data. Clipping the noise with the black point is not the way to go, IMHO.
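
As a rough sketch of the "binning in post" idea (assuming a normalized 2D NumPy image with even dimensions; the numbers are made up): averaging 2x2 pixel blocks halves the resolution but reduces uncorrelated background noise by roughly a factor of two.

```python
import numpy as np

def bin2x2(img):
    """Average 2x2 pixel blocks ("binning in post"); assumes a 2D array with even dimensions."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(1)
noise = 0.02 * rng.standard_normal((512, 512))   # pure background noise, sigma = 0.02
print(noise.std(), bin2x2(noise).std())          # roughly 0.02 vs about 0.01
```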

DarkStar 18.84

Hi Benny,

1. There might be dim structures everywhere, but only those with a signal above the background noise actually exist in the data; if their signal is lower than the noise floor, they are effectively non-existent. Additionally, it is questionable whether you can map a 1 or 2 percent signal onto a curve that still yields a distinguishable level at the low end of a standard 8-bit TFT, especially if the display's black point is not calibrated (see the quick calculation after this list).

2. I did not talk about gradients; I was talking about evenly distributed skyglow.

3. I also stated that clipping is not a suitable noise-suppression approach, and I too dislike the "over-clipped plastic look".

4. Simply adding more data is often not a practicable option. Looking outside at the moment, I am happy to get one clear night in two months.
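
As a quick back-of-the-envelope check of point 1 (the numbers below are just illustrative arithmetic): on an 8-bit display, a 1-2 percent signal spans only a handful of code values.

```python
# How many 8-bit display levels a 1-2 percent signal spans.
full_scale = 255
for pct in (0.01, 0.02):
    print(f"{pct:.0%} of full scale is about {pct * full_scale:.1f} code values")
# roughly 2.5 and 5 levels - easy to lose on a display with a miscalibrated black point
```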

To make it absolutely clear: I can follow and agree with all the arguments given above, but in the end it is still the image creator's decision. You can like or dislike the result, but it is the artist's choice what to present to others. Without really knowing and understanding what they want to express in their image, you cannot say it is wrong. As so often in life, the answer is "it depends".
This is my humble opinion.

Astrobird 10.16
Thanks to all of you for your contributions! 

If I summarize it correctly, here are the arguments against a very dark to black background: 

- The background is not really black, because there are always some photons there from various sources.
Well, if with the naked eye I see only the noise of my own sensory cells, then to me that spot is as black as it gets. Besides, the aim of my photos is not to reproduce the actual view as exactly as possible, because then I shouldn't brighten anything, increase contrast, etc., either.

- Most objects do not have a sharp border, and no artificial sharp border should be created by processing. 
Here I agree with Ruediger's argument 1: If the signal at one point is weaker than the noise, then no signal exists at that point. The gravitational force of an object also never decreases to zero, yet on Earth we do not take into account the gravity of the Andromeda Galaxy when we calculate, say, the trajectory of an asteroid. 

- Clipping creates artifacts and also destroys faint objects.
Yes, but stretching also distorts the information. And if I don't use the whole visual spectrum but only narrowband filters, I also exclude information, even more so with clone stamping, cropping, etc.

Intermediate conclusion for me: If you make the background very dark, you move away from the natural view. This also happens with many other techniques. Everyone must decide for themselves which technique serves their goal.

andreatax 7.80
The point of AP is not to represent unfiltered DS reality but to represent DS reality as if it were perceived as a daylight reality. Therefore non-linear stretching of the image does not destroy anything; it gives another view of these celestial objects, one more realistic to human eyes.