Dithering calibration — [Deep Sky] Acquisition techniques

dave1968 2.81
For those dithering between frames with a DSLR, what calibration frames do you take? Currently I take flats and bias, and I also load the bias as darks. What do others do?
GoldfieldAstro 0.90
When we've shot with a DSLR we've used flats, flat darks and darks.
dave1968 2.81
Dithering does away with darks, so are you dithering?
GoldfieldAstro 0.90
Dithering is definitely an important aspect, but it doesn't do away with darks per se. When you want to detect faint structures with a DSLR, darks help with removing the accumulation of dark current. Dithering helps with hot pixels and helps mitigate fixed pattern noise, but it doesn't replace darks.
dave1968 2.81
I was thinking bias frames contained the dark current, hence the idea of loading bias as darks. I will try dark flats next time though, thanks.
GoldfieldAstro 0.90
Bias frames contain the base background signal, but dark current accumulates over time, and that's what separates a bias from a dark. A dark frame is a bias + accumulated dark current.
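
In numpy terms the relationship looks like this. A minimal sketch, where the offset, read noise and dark-current rate are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)     # toy sensor
bias_level = 512.0     # constant offset the camera adds (ADU)
read_noise = 3.0       # ADU per frame
dark_rate = 0.05       # ADU per second per pixel; grows with temperature

def closed_shutter_frame(seconds):
    # dark current accumulates with exposure time; read noise does not
    dark_current = rng.poisson(dark_rate * seconds, shape)
    return bias_level + dark_current + rng.normal(0.0, read_noise, shape)

bias = closed_shutter_frame(0)    # zero accumulation: offset + read noise only
dark = closed_shutter_frame(300)  # offset + 300 s of dark current + read noise

# mean(dark) - mean(bias) ~ dark_rate * 300 = 15 ADU of accumulated dark current
print(dark.mean() - bias.mean())
```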
dkamen 6.89
Hi,

This is with a Nikon D7500 and without any calibration frames whatsoever, with the exception of a Bad Pixel Map:

https://www.astrobin.com/d3jo9p/?nc=user

It's 14 nights, 400 subs, with dithering between subs and a different camera angle every night. If I was doing the project again, I would even skip the Bad Pixel Map. I did correct vignetting with RawTherapee's vignetting correction tool.

What calibration frames you need is highly dependent on your camera. I believe my D7500 is an exception among modern cameras, in that it needs dark frames when the sub is above 240 seconds or so (and you are under a very dark sky and the temperature is above 10 degrees). Most modern cameras, for most bright targets, do not need darks. My D3300 definitely did not.

Bias frames are completely unnecessary for my camera. If you load one and hover over its pixels, you will see their values are all zero, except for a few that have the value 1/65535. It is completely pointless to subtract that. Other cameras might be different, but not by much. Bias frames were quite significant with my D3300, but most of my subs were less than 30 seconds. Even at 45 seconds read noise is dwarfed by signal and other kinds of noise, and it becomes completely insignificant above 60 seconds. I think read noise (which is what bias frames are supposed to correct for) is much more of a factor in CCD and older CMOS sensors. It just isn't so with CMOS sensors made after 2013 or so.

That leaves flats. Flats correct vignetting and dust motes. I prefer to deal with vignetting in RawTherapee since this will not introduce any noise. But if I see a dust mote that won't go away with the camera's autoclean function, I just take a few flats to be done with it, and also use them for vignetting, killing two birds with one stone.
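
For what it's worth, the arithmetic a stacking program applies with flats is conceptually just this. A minimal sketch; the array names are placeholders, not any particular program's API:

```python
import numpy as np

def apply_flat(light, master_flat, master_flat_dark):
    # Remove the flat's own offset/dark signal, then normalise the flat
    # to a mean of 1 so dividing does not change overall brightness.
    flat = (master_flat - master_flat_dark).astype(float)
    flat /= flat.mean()
    # Dividing boosts the vignetted corners and dust shadows back up.
    return light / flat
```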
jesco_t 1.81
dkamen, your D3300 was actually clipping the black point when taking darks - you can't do darks with the D3x00 series.

I agree, though: with dithering and a bad pixel map you can get away without darks - unless your camera has amp glow.
GoldfieldAstro 0.90
Went back into the archives and found some old data, culled it down to 2 hours and quickly processed it.

https://www.astrobin.com/full/jft0v2/0/

This is 2 hours with a Nikon D810, dithered and well calibrated with darks, flats and flat darks (we prefer them over bias personally).

It's rarely an apples-to-apples comparison when comparing different setups against one another, but it does give a bit of an illustration. You do have 5x our integration, and we'd argue that we have nicer dusty regions and more contrast between the bright nebulosity and the darker dusty patches. There are so many factors that come into play here, like light pollution (we have very little) and processing.

We do feel that confirmation bias does creep into some discussions at times.
dave1968 2.81
Thanks for everyone's help and thoughts. I use a Nikon D5300, which does give a clean image if you get enough subs (4+ hours). Once I get my new flat panel working I will give dark flats a go and experiment with different options to see if it makes a difference.
dave1968 2.81
https://www.astrobin.com/full/jft0v2/0/

This is 2 hours with a Nikon D810, dithered and well calibrated with darks, flats and flat darks (we prefer them over bias personally).

It's rarely an apples-to-apples comparison when comparing different setups against one another, but it does give a bit of an illustration. You do have 5x our integration, and we'd argue that we have nicer dusty regions and more contrast between the bright nebulosity and the darker dusty patches. There are so many factors that come into play here, like light pollution (we have very little) and processing.

We do feel that confirmation bias does creep into some discussions at times.

Thanks for your input, lovely image btw 👍 I will definitely have a play with different options once I get my flats working to my satisfaction.

Regards, Dave
Starstarter86 1.51
can't do darks with the D3x00 series.


Really? I always do darks on my D3200a, and it has turned out pretty well so far: https://www.astrobin.com/full/nfhweu/E/

I also tried without darks, but there would always be a discernible pattern in the dark current (a broad stripe in the middle and glow around the edges), so with dark frames it always looked better. On my unmodified D5300 it's pretty much the same.

CS, Marc
dkamen 6.89
Hi David,

I think the biggest difference (apart from light pollution) is your 130mm f/5 sextuplet vs my 60mm f/6 doublet. Combined with the 15% larger pixel pitch of the D810, I would say you might have 1/5th the exposure time, but you collect well over 50% more light per pixel in a given time. And everyone knows 1x300 seconds is not the same as 5x60 seconds, which is what I was struggling with, especially in terms of the faint signal.
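
As a back-of-envelope check: photons per pixel scale as (aperture/focal length)^2 times pixel pitch squared. A quick sketch, assuming approximate published pitches of 4.88 um for the D810 and 4.20 um for the D7500, so treat the result as a ballpark only:

```python
def flux_per_pixel(aperture_mm, focal_mm, pitch_um):
    # photons per pixel ~ (aperture / focal length)^2 * pixel pitch^2
    return (aperture_mm / focal_mm) ** 2 * pitch_um ** 2

rig_130 = flux_per_pixel(130, 650, 4.88)  # 130 mm f/5 with the D810
rig_60 = flux_per_pixel(60, 360, 4.20)    # 60 mm f/6 with the D7500
print(rig_130 / rig_60)                   # ~1.9x per pixel for these pitches
```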

Basically, wow.

But apart from that, the number of subs has an important impact on what processing methods are available. If you have 10 or 20 subs, then dithering-like methods are not the best choice, since undesired features such as hot pixels would appear in 5-10% of your subs, which means you cannot get rid of them without removing significant parts of the desirable faint signal. The best approach there is to improve the individual subs as much as possible instead. Whether that is done best with calibration frames or without them is another discussion altogether. The more subs you have, the more it makes sense to lean towards techniques that rely on statistics to separate the good stuff from the bad stuff. With hundreds of subs, taken at different angles and so on, even large dust motes (which are the worst kind of undesired signal meant to be removed by calibration) go away.
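
Sigma clipping, for example, is conceptually just this. A minimal sketch, assuming the subs are already registered; kappa and the iteration count are arbitrary choices:

```python
import numpy as np

def sigma_clipped_mean(stack, kappa=3.0, iterations=3):
    # stack: (n_subs, height, width) array of registered, dithered subs
    data = stack.astype(float)
    for _ in range(iterations):
        mean = np.nanmean(data, axis=0)
        std = np.nanstd(data, axis=0)
        # reject pixels more than kappa sigma from the per-pixel mean
        data = np.where(np.abs(data - mean) > kappa * std, np.nan, data)
    return np.nanmean(data, axis=0)
```

With only 10 or 20 subs the per-pixel mean and sigma are weak estimates, which is exactly why these methods want many subs.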

Cheers,
D.
GoldfieldAstro 0.90
I think the biggest difference (apart from light pollution) is your 130mm f/5 sextuplet vs my 60mm f/6 doublet. Combined with the 15% larger pixel pitch of the D810, I would say you might have 1/5th the exposure time, but you collect well over 50% more light per pixel in a given time. And everyone knows 1x300 seconds is not the same as 5x60 seconds, which is what I was struggling with, especially in terms of the faint signal.


Without getting too far off topic or too nitty-gritty: our setup captures 82% more photons per pixel, but your D7500 has half the read noise (newer technology), so you need exposures 4x shorter to swamp the read noise; taking the focal ratio difference into consideration brings it closer to 1/3 of the exposure. So 3x100s is close to 1x300s, but if you have some light pollution to contend with (ours was taken under Bortle 1 skies) then you may not need longer than 60s exposures to swamp the 1.5e- read noise.

Something to remember is that if you don't use darks you are still stacking the dark-signal term and relying on averaging to beat it down. Yes, signal increases faster than the noise terms, but they're still there, potentially corrupting the data.
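
For anyone who wants to play with the numbers, the swamping rule of thumb can be sketched like this (the 0.5 e-/s sky rate and the factor k here are assumptions, not measurements):

```python
def min_sub_exposure_s(sky_e_per_s, read_noise_e, k=10.0):
    # Expose until the sky puts at least k * RN^2 electrons in each pixel,
    # so sky shot noise dominates read noise. k of 3-10 is a common choice.
    return k * read_noise_e ** 2 / sky_e_per_s

print(min_sub_exposure_s(0.5, 3.0))  # 3.0 e- read noise -> 180 s
print(min_sub_exposure_s(0.5, 1.5))  # halve the read noise -> 45 s, 4x shorter
```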
dkamen 6.89
I think the biggest difference (apart from light pollution) is your 130mm f/5 sextuplet vs my 60mm f/6 doublet. Combined with the 15% larger pixel pitch of the D810, I would say you might have 1/5th the exposure time, but you collect well over 50% more light per pixel in a given time. And everyone knows 1x300 seconds is not the same as 5x60 seconds, which is what I was struggling with, especially in terms of the faint signal.
Without getting too far off topic or too nitty-gritty: our setup captures 82% more photons per pixel, but your D7500 has half the read noise (newer technology), so you need exposures 4x shorter to swamp the read noise; taking the focal ratio difference into consideration brings it closer to 1/3 of the exposure. So 3x100s is close to 1x300s, but if you have some light pollution to contend with (ours was taken under Bortle 1 skies) then you may not need longer than 60s exposures to swamp the 1.5e- read noise.

Something to remember is that if you don't use darks you are still stacking the noise term and averaging it out. Yes, signal increases faster than the noise terms but they're still there potentially corrupting the data.


First of all, sorry for mixing you up with David, the OP. I kept wondering why you say "we"; now it is clear you are referring to a group of people. I would also like to apologise in advance if I sound too direct. I am Greek; I am not being hostile or anything, it is just how we speak. I do enjoy the conversation, a lot.

Now, our topic (which you are correct that we should stay on) is calibration techniques when dithering. I presented an image that was extensively dithered and used no calibration frames whatsoever. You presented a much better image that used calibration frames. But is it on topic? I don't think so, because it is only 20 subs. That excludes any meaningful dithering; indeed it is at the lower limit of any statistically based method for reducing noise and undesired signal, including sigma clipping or plain averaging. It is actually an irrelevant picture when discussing calibration methods when dithering, since it does not rely on dithering.

I concur that the difference between your picture and mine is indeed dramatic. However, the question is whether this is because of calibration frames. Are you suggesting that if I had used calibration frames the image would magically become the equivalent of 20 hours of integration instead of the 10 hours that it is? I assure you this is not the case. This was a 4-month project and I did try using darks. I decided that I didn't need them, which is a completely different thing. Conversely, do you think that if you removed the darks your image would suddenly become as noisy as mine? I seriously doubt it.

The fact of the matter is that your image was taken under a Bortle 1 sky, with equipment that costs as much as my equipment _and_ a small car, which I could use to load up my gear and drive thirty minutes to a Bortle 4 location instead of my balcony, which sits on the border between a red and a yellow zone. These, and the difference in gear, are order-of-magnitude factors when it comes to detail and contrast; calibration frames are not.

I think you are giving read noise more significance than it's really worth. Swamping the read noise is indeed important, but collecting signal is more important. If you have twice the level of light pollution, you indeed need half the exposure time to swamp the read noise. But you also need roughly twice the total exposure time to reach the same faint detail, because the sky background adds shot noise of its own. Otherwise people would be chasing places with high light pollution. Why expose even 30 seconds if you can go under a street light and "swamp the read noise" in 30 milliseconds? I do not know exactly how much darker a Bortle 1 sky is than a Bortle 6-7 sky, but I think the difference is more than 10x, which means your 2 hours capture as much faint detail as 20+ hours of mine, even if the equipment were identical, which it isn't.

I also sense a misconception regarding what exactly it is that darks do. Perhaps I am only misunderstanding an informal use of the term "read noise", but darks do not remove read noise; darks are completely unrelated to read noise. Indeed darks, like all calibration frames, remove no noise whatsoever; they add some. You cannot remove noise by subtracting a dark frame that itself contains noise, because noise is random and adds in quadrature. The things we use darks for we all call "noise" informally, or for simplicity's sake, but from an information theory point of view they are actually signal, albeit undesired signal. Specifically they are (for a given sensor operating at a given gain):
1) Amp glow (function of temperature and time)
2) Hot pixels (their number is a function of temperature and time)
3) Dark signal (a function of temperature and time)

Each of these three kinds of undesired signal is essentially predictable for a given sensor configuration, exposure and temperature, which is why subtracting a dark takes it away. However, each has its own associated noise (proportional to the square root of the signal), and this noise is NOT removed when you subtract darks. It is increased, because each dark brings in roughly the same amount of noise as a light. The only things that reduce noise in the whole process are averaging and sigma clipping, or a noise reduction algorithm, which also uses statistics but in a different way. These are not inferior alternatives; they are the only ways of actually reducing noise.
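
A toy simulation makes the point; all numbers here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (500, 500)
dark_signal, frame_noise = 20.0, 5.0   # invented ADU figures

def closed_frame():
    # fixed thermal signal plus random noise
    return dark_signal + rng.normal(0.0, frame_noise, shape)

light = closed_frame()   # stand-in for the dark-signal part of a light
single_dark = closed_frame()
master_dark = np.mean([closed_frame() for _ in range(50)], axis=0)

# Subtracting removes the 20 ADU of signal in both cases, but:
print(np.std(light - single_dark))  # ~ 5 * sqrt(2) ~ 7.1: noise went UP
print(np.std(light - master_dark))  # ~ 5 * sqrt(1 + 1/50) ~ 5.05: barely up
```

The master dark removes the same signal while adding almost no extra noise, which is why, when darks are warranted, many of them should be averaged.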

So the question becomes whether including darks brings more benefits than costs, compared to other methods of dealing with the undesired signal, in the very specific context of (a) a DSLR and (b) dithering. I will say up front that I am not against darks. There is a reason the VLT and Hubble raw datasets include darks. But this applies to specific types of gear, especially CCDs and some CMOS sensors that exhibit severe amp glow. Here is a 300 second Ha sub from my ASI 178MM, whose sensor falls into that category (all images are stretched):

[image: the uncalibrated 300 s Ha sub]

And here is the same sub, calibrated with a master dark (not a perfect match, but very close, and the improvement is clear):

[image: the same sub after subtracting the master dark]

Here is the master dark itself. This is the "noise" (actually thermal signal) that gets subtracted:

[image: the master dark from the ASI 178MM]

I agree this is significant, and no amount of dithering can remove it (except perhaps with thousands of subs, in which case it is still best to just take a dark). There is no disagreement here; it would be silly not to use darks. When you are eliminating a thing that looks like the sun in the corner, it doesn't matter if you add a little noise to the result.

But we are not talking about the ASI178. We are talking about DSLRs.

Now let's see a master dark from the D7500 (again, stretched to extreme levels). This is actually built from 54x240-second subs, right below the threshold where the D7500 perhaps starts needing darks:

[image: master dark from the D7500, heavily stretched]

There is simply no comparison. There is almost no dark signal, since the D7500 has in-sensor dark current suppression (meaning the main source of thermal signal is "choked" electronically inside the pixel, before any electron gets captured). There is very little amp glow on the left edge, which one would probably want to crop anyway, a few hundred hot pixels, and that's all.

Would I want to add the faint random noise of that image to my lights just to get rid of the hot pixels? No, I wouldn't. It is much better to use a BPM or a hot pixel detection algorithm in the raw processor. They will not introduce noise, unlike the dark, and they will produce a more correct result, as they interpolate the value of the hot pixel from the neighboring ones instead of clipping it to zero.
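
Conceptually, a BPM correction is nothing more than this. A minimal sketch, not any particular raw processor's implementation:

```python
import numpy as np

def fix_bad_pixels(img, bad_mask):
    # img: 2-D frame; bad_mask: boolean map, True where a pixel is bad
    fixed = img.copy()
    for y, x in zip(*np.nonzero(bad_mask)):
        y0, y1 = max(y - 1, 0), min(y + 2, img.shape[0])
        x0, x1 = max(x - 1, 0), min(x + 2, img.shape[1])
        patch = img[y0:y1, x0:x1]
        good = patch[~bad_mask[y0:y1, x0:x1]]  # neighbours that are not bad
        if good.size:
            fixed[y, x] = np.median(good)      # interpolate, don't subtract
    return fixed
```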

And like I said, the D7500 is an exception in that it probably needs darks past some point. It is very different with most other modern DSLRs.

To be sure, I am not saying "do nothing about the undesired signal". I am saying that with a DSLR, and especially in a dithering context (which is our subject), one can, and in most cases should, deal with it in better ways:
-You *should* swamp the read noise and the bias signal. But it is very difficult not to do that with any modern DSLR, given the quality of the electronics: just avoid the lowest ISO, and anything above 30 seconds will swamp both. No bias frames needed. Besides, how much sense does it make to use bias frames if the bias signal is clipped in camera?
-You can use a BPM or an algorithm to deal with hot pixels, although they are much less of a problem with dithering.
-You can use algorithmic vignetting correction instead of flats. You *should* prefer flats if you have large dust motes.
-You can use noise reduction *before* integration to deal with the various kinds of noise, something which calibration frames won't do.

I do apologise for the lengthy and somewhat passionate post.

Cheers,
Dimitris