“Correct” Color with DSLR Shots? [Deep Sky] Processing techniques

schmaks 0.00
Hi,

How do I know if my color is “correct” in my shots (using DSLR)?

I could certainly try to emulate the color in other people's shots of the same objects, but I'm curious whether there are ways to know without doing so?

Thanks!

P.S. I know my color is off in my first photo... to be updated soon!
matthew.maclean 3.97
Oh, and on this topic, I don't claim to be an expert, but I think you will find that the "colors" of DSOs are in the eye of the beholder and somewhat arbitrary. It's part of the artistic component of this hobby. And by using light pollution or narrowband filters (which many of us need to do), you are already affecting the captured color balance anyway. One of the main benefits of Astrobin is that you can see a lot of examples of other people's results and get a sense of the range of acceptability. But personal preference during image processing is definitely involved.

I will point out that both Astro Pixel Processor and PixInsight have Star Calibration tools that can help at least make the color palette of the background stars realistic. In APP, it's under 9) Tools => calibrate star colors.
dkamen 6.89
Hi,

I agree with Matthew that color is somewhat arbitrary. None of these objects is visible to the naked eye (at least not in colour), and some types of photography, such as narrowband, are false color by definition.

However, let's assume you define "correct" color as "what the eye would see if I were really close to the subject and able to perceive color from really faint light sources as well as from intense ones".

Two key facts allow us to answer this question. First, our eyes evolved to see in daylight; colors look very different under a tungsten lamp, a blue LED, or moonlight, for example, which is why the light source is known as the "white reference". Second, our eyes lose the ability to see colour in the dark, and they are less sensitive to red to begin with, so red suffers most. This is why things look too red when you take a photo in a dimly lit room: the sensor is not wrong, it is your eye that cannot see all that red.

In a nutshell:

Things generally tend to be much redder (or shifted toward red, e.g. pink, purple, orange) than we have in mind, especially where there is a lot of dust and stars. Excluding influences from nebulae (such as the Pleiades, which are very blue), the general, default color of the deep-space background is shades of red and orange. When the sky background is shifted to blue or green, this is usually for reasons closer to home: airglow, zodiacal light, moonlight, artificial light pollution. And it can only be pitch black where a really dense dark nebula obscures the background, such as the Coalsack.

Now, to get "correct" colors with sunlight as the white reference, you need to calibrate. You can use Astropixel Processor which uses a statistical model to restore star colors so that they match their expected distribution, or the more advanced Photometric Color Calibration of PixInsight. This solves the image, finds stars with known colors and makes necessary adjustments so that they look they way you would expect, given a known white reference. If you want sunlight as your white reference, use G2V star. If you want Rigel-light to be your white reference, use B8 star. If you are shooting a galaxy (a heterogeneous light source), it is probably best to use the "Average Spiral Galaxy" white reference.

Or you can process your photos non-linearly from the beginning, using a known white reference (preferably Daylight). This is actually a good idea if you are using a DSLR and do not have much light pollution.
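If you go that route with a DSLR, the daylight reference can be locked in at raw conversion. A minimal sketch with the rawpy library (the file name is a placeholder; whether you keep the data linear afterwards depends on your workflow):

import rawpy
import imageio

with rawpy.imread("light.cr2") as raw:          # placeholder file name
    rgb = raw.postprocess(
        user_wb=raw.daylight_whitebalance,      # camera's daylight multipliers
        gamma=(1, 1),                           # keep the data linear for now
        no_auto_bright=True,                    # no automatic stretch
        output_bps=16,
    )
imageio.imwrite("light_daylight_wb.tiff", rgb)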

Please see this discussion for more details. It is truly a huge subject:
https://clarkvision.com/articles/color.of.the.night.sky/
ODRedwine 1.51
From one of my Facebook posts, explaining why my photo looked so different from the Hubble photo in Wikipedia:

"The famous Hubble telescope version seen in Wikipedia and elsewhere is a false color image mapping Sulphur to Red, Hydrogen-Alpha to Green and Oxygen-III to Blue. The colors in this photo are closer to the actual colors; however it is hard to be exact since most observers have a very limited ability to see the colors of any but the brightest nebula through a telescope eyepiece. The process of developing an astrophoto from the various reference and target frames can lead to color shifts."
Die_Launische_Diva 11.14
Please see this discussion for more details. It is truly a huge subject:
https://clarkvision.com/articles/color.of.the.night.sky/


I would take the advice from that particular site with a grain of salt. Please read https://forum.startools.org/viewtopic.php?f=4&t=912
MortenBalling 1.20
Hi

Color is far from arbitrary. Color is a quale, a subjective experience, and cannot be measured directly. However, each photon has a wavelength corresponding to a "color". In an RGB imaging system (camera, monitor, etc.) you can calibrate the whole system to reproduce color very precisely. Typically we use the correspondence between color and monochromatic wavelength established in 1931, hence the name CIE 1931 color space.

https://en.wikipedia.org/wiki/CIE_1931_color_space

Color space theory is extremely complicated, but color management can definitely be done.
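As a tiny illustration (assuming you have the colour-science Python package installed), you can map a monochromatic wavelength through the CIE 1931 standard observer and into sRGB:

import colour  # pip install colour-science

XYZ = colour.wavelength_to_XYZ(550)         # 550 nm under the CIE 1931 observer
srgb = colour.XYZ_to_sRGB(XYZ / XYZ.max())  # normalise, then encode to sRGB
print(srgb)  # pure spectral colors fall outside the sRGB gamut, so clip to [0, 1]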

If you want the short version, select a part of the field in the image that contains only stars and no nebulae, etc. Also make sure that part does not contain bright red or blue stars: basically just your everyday average white stars on a black background. Then balance the R, G, and B gain. This will give a pretty decent "true" color balance.

Better still is to start by balancing RGB on a "black" part of the field. It can be difficult to select a part without any stars, but a few faint stars won't change much. Then, after balancing the dark area (the offset), adjust the gain as described above; a rough sketch of both steps follows below.
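Here is a rough Python sketch of those two steps. The patch coordinates are placeholders, and a linear RGB image is assumed.

import numpy as np

def balance_rgb(image, dark_box, star_box):
    # image: (H, W, 3) linear RGB; boxes are (y0, y1, x0, x1) pixel ranges
    out = image.astype(np.float64).copy()

    # 1) Offset: shift each channel so the background levels coincide
    y0, y1, x0, x1 = dark_box
    bg = np.median(out[y0:y1, x0:x1], axis=(0, 1))
    out -= bg - bg.mean()
    common_bg = bg.mean()

    # 2) Gain: scale the signal above the background so that average
    #    white stars come out with R = G = B (green is the reference)
    y0, y1, x0, x1 = star_box
    signal = out[y0:y1, x0:x1].mean(axis=(0, 1)) - common_bg
    return (out - common_bg) * (signal[1] / signal) + common_bg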

I don't know what software you're running, but if you use PixInsight, it has features for color calibration, and it even has a histogram to show the color balance. Similar tools are found in Photoshop.

If you want to be really precise, you can calibrate your image using a few known stars in the field. Look up the RGB flux for each of the stars using Aladin (or whatever) and adjust your gamma curves to correspond to the measured RGB ratios in the (linear) image. I use a calibration curve generated in Excel based on ~50-100 stars and galaxies. Once calibrated, the image measures within a few percent of the flux data.
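A minimal sketch of that kind of per-channel curve fit (the star-to-catalog matching is assumed to be done already, and the variable names are placeholders):

import numpy as np

def fit_gamma(measured, catalog):
    # Fit catalog ~ a * measured**g in log space, one channel at a time
    g, log_a = np.polyfit(np.log(measured), np.log(catalog), 1)
    return np.exp(log_a), g

# e.g. a_r, g_r = fit_gamma(star_flux_r, catalog_flux_r) over ~50-100 stars,
# then calibrated_r = a_r * image[..., 0] ** g_r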

Next up is calibrating your monitor. If you don't have a probe, select a white balance of 6500 K on your monitor and hope for the best. If you can see an overcast sky from where your monitor is (or can move it there), a 50% grey should look like the sky, not redder or bluer. In a dark room, grey will look slightly "orange" on a calibrated monitor.

Good luck! 
whwang 11.57
You guys have gone too far. I think what's relevant to the OP at this stage is how to get consistent and pleasant colors, rather than the tons of small details in color science. What will really help the OP is to keep practicing image processing and to learn from other people's workflows (via books, articles, or video tutorials on the internet).