Diffuse gas/illumination · Pleiades Astrophoto PixInsight

Entropy 2.11
Hey all,

So I have some narrowband data for the Heart Nebula, but I'm getting a lot of diffuse illumination, I think from the hydrogen-alpha channel, that I don't see in other images of the nebula.

Does anyone know how to fix this? Is this normal for the Heart Nebula, or is it simply over-stretched? It's really bugging me.
SH2190_astrobinhelp.jpg

Here is a Google Drive link to the stacked subs; any help is appreciated:
https://drive.google.com/drive/folders/1iOLLkcVlBn2UXv4SAfJgR80n9_vxqSxs?usp=drive_link

PABresler 0.00
Here is mine:  https://www.astrobin.com/i5m69a/B/?nc=&nce=

The masters might be clipped. Did you use Dynamic Background Extraction on each master before stretching them?

Peter
ognvet 0.90
Hi there, 

As already commented, doing DBE on each of your masters before stretching helps a lot to solve this. I will try it with your files and send the result to you.

CS

Obdulio
pfleurant 0.00
What camera, filter wheel, etc., are you using? Let's see the flats.
mxpwr 4.37
My recommendation is to use DBE only with a first-order polynomial.
If you use order 2 or higher, you tend to introduce artifacts like the ones you are seeing here.
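For illustration only, here is a tiny synthetic sketch in Python/numpy (nothing to do with the actual data in this thread) of why higher polynomial orders can misbehave: fit the same handful of noisy background samples with a 1st- and a 4th-order polynomial and see how far each model can wander from the underlying gentle gradient between the sample points.

    import numpy as np

    # Synthetic 1-D "background": a gentle linear gradient sampled at a few points,
    # with a little noise on top (stand-ins for DBE sample values, not real data).
    rng = np.random.default_rng(1)
    x_samples = np.linspace(0.0, 1.0, 8)
    true_bg = lambda x: 0.10 + 0.05 * x
    y_samples = true_bg(x_samples) + rng.normal(0.0, 0.005, x_samples.size)

    # Fit a 1st-order and a 4th-order polynomial to the same samples.
    x_fine = np.linspace(0.0, 1.0, 500)
    fit1 = np.polyval(np.polyfit(x_samples, y_samples, 1), x_fine)
    fit4 = np.polyval(np.polyfit(x_samples, y_samples, 4), x_fine)

    # How far each model strays from the true gradient; the high-order fit
    # tends to ripple between the samples, which shows up as artifacts in the image.
    print("1st order, max error:", np.max(np.abs(fit1 - true_bg(x_fine))))
    print("4th order, max error:", np.max(np.abs(fit4 - true_bg(x_fine))))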
Die_Launische_Diva 11.14
My guesses are problems with the flats (and calibration in general), or a light leak. Rule those out first before trying anything else in post-processing (DBE/ABE, etc.).
menardre
As others have said, doing DBE and focusing on the problem areas helps.

I have found that sometimes I can correct this using StarXterminator, which essentially takes your one image and makes two: one with only the stars and one starless. You can then process the stars and starless images separately, working to eliminate the problem areas, and then recombine them (PixelMath) when you are done.
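As a minimal sketch of the recombination step (my own numpy illustration with placeholder array names; the screen blend is the usual choice, equivalent to the PixelMath expression ~((~stars)*(~starless))), assuming the stars-only and starless results are normalized [0,1] float arrays:

    import numpy as np

    def recombine(stars, starless):
        """Screen-blend a stars-only image back onto a starless image.
        Both inputs are assumed to be float arrays normalized to [0, 1]."""
        return 1.0 - (1.0 - stars) * (1.0 - starless)

    # Placeholder data just so the sketch runs; in practice these would be the
    # processed StarXterminator outputs loaded from your FITS/XISF exports.
    rng = np.random.default_rng(0)
    stars = rng.random((64, 64)) * 0.3
    starless = rng.random((64, 64)) * 0.5
    combined = np.clip(recombine(stars, starless), 0.0, 1.0)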
Entropy 2.11
Thanks to everyone here. I spent all morning trying your suggestions regarding DBE instead of ABE, and I think the results are much better.
Don't mind some of the compression artifacts; it's natively a 48 MP picture and I have to compress it pretty hard to get it under the 1 MB limit.
SH2_190 Astrobin.jpg
ognvet 0.90
Hi @Entropy, 

Check this out: Light Vortex Astronomy - Tutorials

Full of great tutorials to learn from. 
I just took a look at your data; the masters are really good. One important thing: once you do star alignment, pay extra attention to not leaving black pixels in any of the frames (your OIII and SII had them); this affects the data in the linear phase and doesn't give the best stretching results. Having BlurXterminator, StarXterminator, and NoiseXterminator makes things a lot easier. I apply BlurXterminator immediately after DBE, StarXterminator before or after the stretch (depending on the project), and only after stretching, once StarXterminator is done, do I apply NoiseXterminator. I did a test with your data and it is really good; see below for an example of what you are actually getting.
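As a quick way to check for the black-pixel issue mentioned above, here is a small numpy sketch (my own illustration, not a PixInsight tool): count the exactly-zero pixels in a registered frame, and if there are more than a handful, crop the registration edges before DBE and stretching.

    import numpy as np

    def count_black_pixels(frame):
        """Count pixels that are exactly zero, e.g. the empty borders that
        star alignment can leave around a registered frame."""
        return int(np.count_nonzero(frame == 0.0))

    # Example with a fake registered frame that has an empty 10-pixel border.
    frame = np.full((256, 256), 0.1)
    frame[:10, :] = 0.0
    print(count_black_pixels(frame), "black pixels -> consider cropping before DBE")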
Personally, I analyze my images over and over and I'm still never happy with the results, but by trying many times you learn a lot in the process.

This is the image after processing YOUR DATA!
Heart_Nebula_Entropy.jpg
CS
Obdulio
SergeC 0.00
Nobody's mentioned LinearFit yet. It's a standard part of my workflow: check the histogram of each NB channel and LinearFit the others to the one whose peak is furthest to the right. I usually follow this with BlurX (set to correct only), and then maybe another round of BlurX set to nonstellar, then stellar, while setting a manual PSF (these latter two steps I added after watching Russell Croman's interview with Adam Block). YMMV.
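For what LinearFit does conceptually, here is a rough numpy sketch (my own simplification; the actual process does a robust linear fit with rejection limits): solve for a scale and offset that map one channel onto the reference, then apply them.

    import numpy as np

    def linear_fit(channel, reference):
        """Least-squares fit reference ~= a*channel + b, then return the
        rescaled channel. A crude stand-in for PixInsight's LinearFit."""
        a, b = np.polyfit(channel.ravel(), reference.ravel(), 1)
        return a * channel + b

    # E.g. fit SII and OIII to the Ha master (the channel whose histogram
    # peak sits furthest to the right); placeholder arrays are used here.
    rng = np.random.default_rng(2)
    Ha = rng.random((64, 64)) * 0.4 + 0.1
    Sii = Ha * 0.3 + rng.normal(0.0, 0.01, Ha.shape)
    Sii_fit = linear_fit(Sii, Ha)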
Gunshy61 10.10
Hi Entropy,

Just my two cents' worth, but I feel that nowhere near enough attention is paid to background extraction, and the subtleties of its use can dramatically affect the results of your processing, especially when the field of view contains nebulosity over most of the frame.

You want to make sure your point selection only contains sky glow and not dim nebulosity. I take my strongest signal-to-noise masters (lum, Ha) and combine them to form the strongest signal-to-noise linear mono image I can. Then I overstretch it, particularly the dimmest portions of the image. This image is used only for determining where I should place my background points. Make sure these points are selected only at places where the signal is truly zero: either the dark nebula is so thick that it is truly black, or the nebulosity is truly absent. Because this image is over-stretched, you may have to change the colour of the points to actually see them. If, as in the Heart Nebula, there are very few places that are truly black, then you may have few points to compute your background. Try to avoid the temptation to add points where the subject matter isn't black. Once this is done and your points are selected, iconify DBE, exit dynamic mode, and you can delete this image.

Take one of your master frames (either colour or mono) and double-click on the iconified DBE so that the points you selected actually appear on the linear image. I agree with what is stated above, in that you should avoid making the background spline-fit tolerance too big. You just want all of your points included, but you don't want to over-match the spline fit to your background, or you will generate terrible artifacts. Splines can be great, but they can also be terrible, and it is a fine art to know when and how to use them.

Clicking the check mark without any correction method will display the background model that will be used. Make sure this makes sense before doing any correction on each of your masters.

The final crucial part of DBE is which correction to apply: subtraction or division. With experience you will see the different results these methods yield. Subtraction can clip actual data, while division can amplify or diminish one of the signal channels too much. I usually create two "background removed" images, one using subtraction and one using division, and then take a linear combination of the two (50% of each?) to create the final result. The exact proportion will depend on whether you want to show the dim stuff or not. At least including some portion of the division result will keep data from being clipped.
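As a sketch of that last point, here is a small numpy version of blending the subtraction and division corrections (the array names, the 50/50 default, and the median rescaling of the division result are just my illustration, not DBE's exact internals):

    import numpy as np

    def blend_corrections(image, background, w_sub=0.5):
        """Blend subtraction- and division-corrected versions of a linear image.
        'image' is the uncorrected linear master, 'background' the DBE model."""
        bg = np.maximum(background, 1e-6)         # guard against divide-by-zero
        sub = image - bg                          # removes additive gradients, can clip
        div = image / bg * np.median(bg)          # removes multiplicative errors, rescaled
        return w_sub * sub + (1.0 - w_sub) * div

    # Usage: corrected = blend_corrections(master_array, dbe_model_array, w_sub=0.5)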

I could go on and on about this, but DBE needs to be applied very carefully. You can try GraXpert, but this involves leaving PixInsight to do your background extraction.

Hope this helps,
Dave
SteveCooper 2.41
Here is more of a "Modified Hubble Palette" with the blues and gold accentuated. It seemed more like your original intent to me.
RZ_RZ_SOS.JPG
ognvet 0.90
For my own data I would have done a dynamic blend, as described here: https://thecoldestnights.com/2020/06/pixinsight-dynamic-narrowband-combinations-with-pixelmath/

This is what I apply to most of my own projects, with some exceptions.
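Roughly, the idea there is to weight each channel by its local brightness so whichever signal is stronger at a given pixel dominates, instead of using fixed percentages. A simplified numpy sketch of that kind of blend (my own, not the tutorial's exact PixelMath expressions; the channel names are placeholders):

    import numpy as np

    def dynamic_blend(a, b):
        """Pixel-wise blend of two normalized channels, weighted by how much
        each contributes locally (a gets more weight where it is brighter)."""
        w = a / np.maximum(a + b, 1e-6)
        return w * a + (1.0 - w) * b

    # Placeholder stretched, normalized masters and a possible SHO-style mapping.
    rng = np.random.default_rng(3)
    Ha, Oiii, Sii = (rng.random((64, 64)) for _ in range(3))
    R = dynamic_blend(Sii, Ha)
    G = dynamic_blend(Ha, Oiii)
    B = Oiii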

Clear skies

Obdulio
Entropy 2.11
Steve Cooper:
Here is more of a "Modified Hubble Palette" with the blues and gold accentuated. It seemed more like your original intent to me.
RZ_RZ_SOS.JPG

Definitely my intent. It seems the unanimous opinion is to strengthen my DBE game; however, I can't seem to get rid of that texturing in what I'm assuming is the SII channel around the edges. I feel like I'm still missing a part of the puzzle.

At this point I've tried division, subtraction, combinations of both, and various different tolerances.
SteveCooper 2.41
Definitely my intent. It seems the unanimous opinion is to strengthen my DBE game; however, I can't seem to get rid of that texturing in what I'm assuming is the SII channel around the edges. I feel like I'm still missing a part of the puzzle.

At this point I've tried division, subtraction, combinations of both, and various different tolerances.


Forgive me if this has already been addressed, but you are applying DBE to linear data, correct? The DBE tool is a difficult one to master; selecting the right sampling points and settings is very tricky. There are many instructional videos on this. @David Payne had some good tips too.
HotSkyAstronomy 2.11
D. Jung:
My recommendation is to use DBE only with a first-order polynomial.
If you use order 2 or higher, you tend to introduce artifacts like the ones you are seeing here.

That's why I set my own samples when I use 5th order: I set the smoothing to unweighted and use modestly small samples with small gaps, then fill the gaps in with jumbo-sized samples. Imagine a square made of 4 evenly spaced samples, and then a very large sample that covers those 4, placed in the middle of the gaps. The only issue is that it takes a very long time to set the ~700 samples, so I rarely do it. But every time I do, oh boy.
HotSkyAstronomy 2.11
At this point I've tried division, subtraction, combinations of both, and various different tolerances.

Try setting your own samples; see my other post in this thread. It takes a while to set up and calculate, but the result is mind-blowing.
Gunshy61 10.10
Hi again, I would stretch using GHS to control the brightness of the background to taste.
CS, Dave
Entropy 2.11
David Payne:
Hi again, I would stretch using GHS to control the brightness of the background to taste.
CS, Dave

Hey David, I've definitely been using GHS. I've been reading up on DBE, trying to get a better grasp. It's nice to be able to see what's possible with the data, but it's definitely frustrating not being able to get there.

I'm kind of caught in a weird spot: I keep either completely crushing out the faint nebulosity at the edges, or it manifests in a weird way that just looks like noise.

It's frustrating, but I'm happy so many experienced people have commented with help.
Entropy 2.11
V.M Legary:
At this point I've tried division, subtraction, combinations of both, and various different tolerances.

Try setting your own samples; see my other post in this thread. It takes a while to set up and calculate, but the result is mind-blowing.

Thanks VM, I have tried setting my own samples. I've been reading up on DBE during my commute. I'm going to give it another go tonight.
HotSkyAstronomy 2.11
V.M Legary:
At this point I've tried division, subtraction, combinations of both, and various different tolerances.

Try setting your own samples; see my other post in this thread. It takes a while to set up and calculate, but the result is mind-blowing.

Thanks VM, I have tried setting my own samples. I've been reading up on DBE during my commute. I'm going to give it another go tonight.

If PixInsight's DBE doesn't work, try SIRIL's background extraction; it's very stable and does a great job of simplifying the process.
kyh2791 0.00
Image36.jpg

I also tried a quick fix...
I was able to solve your problem with ABE by using a different correction order, but I think it would be better to try DBE for a little more detail.

It was really fun to see the correction results from so many people on this topic. Thank you for the valuable information; it was good correction practice.
mxpwr 4.37
From my experience, when you have to rely on DBE, you have already lost, typically due to a bad flat frame and/or nasty light pollution.
You can mitigate the problem with DBE, but usually not fix it.
If you have a well-matching flat frame and "nice" gradients, you apply ABE with first order and you are done.

I recall there was a post about a somewhat more complex solution to this problem that requires taking an image of the same object at a shorter focal length and using it to find the correct background:
https://pixinsight.com/tutorials/multiscale-gradient-correction/
I've tried it a few times with mixed results; it can be very good, or not...
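Conceptually, and this is only my loose numpy sketch of the idea behind that tutorial, not the actual script: register and intensity-match the wide-field image to your frame, keep only the large-scale component of both, and subtract their difference as the gradient.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def remove_gradient(narrow, wide_matched, sigma=64):
        """Estimate the gradient as the difference between the large-scale
        components of the narrow-field frame and of a registered,
        intensity-matched wide-field reference, then subtract it."""
        gradient = gaussian_filter(narrow, sigma) - gaussian_filter(wide_matched, sigma)
        return narrow - gradient

    # Usage: corrected = remove_gradient(narrow_array, wide_registered_array)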
pfleurant 0.00
Let's see the flats!