Quality Control Astrobin Community Survey · James Tickner

profbriannz 16.52
@James Tickner  you are in the best position to judge what you need going into the mosaicing work.  My only (slight) concern is that BXT [correct only] mode is so good at fixing de-focus across a field that I am not sure we will be able to harness its full powers post-mosaicing.  But let's fix mosaicing first!

CS Brian
MichaelRing 3.94
James Tickner:
Stage 2 is 'matching' of background levels between mosaiced frames. I have a trial of this function working, which just fits gradients in overlapping fields with a very simple linear function and iteratively adjusts the fits in each field until everything matches. From some early tests on images taken near the N and S pole, this results in quite good matching, allowing even the very faint polar IFN to be seen.
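For context, the kind of iterative overlap matching James describes could look roughly like this - a minimal sketch with made-up data structures, not his actual code, and using a single constant offset per frame where he fits a linear gradient:

```python
# Rough sketch (not the real pipeline): each frame gets an additive background
# offset, adjusted until the median difference in every overlap region is
# close to zero.  'overlaps' maps a frame pair (a, b) to two arrays holding
# the background samples of the shared sky region as seen in frame a and b.
import numpy as np

def match_offsets(frame_names, overlaps, n_iter=100, damping=0.5):
    offset = {name: 0.0 for name in frame_names}
    for _ in range(n_iter):
        for name in frame_names:
            diffs = []
            for (a, b), (pa, pb) in overlaps.items():
                if name == a:
                    diffs.append(np.median((pb + offset[b]) - (pa + offset[a])))
                elif name == b:
                    diffs.append(np.median((pa + offset[a]) - (pb + offset[b])))
            if diffs:
                offset[name] += damping * np.mean(diffs)   # move towards the neighbours
    mean_off = np.mean(list(offset.values()))
    return {k: v - mean_off for k, v in offset.items()}    # remove the overall degeneracy
```

Replacing the per-frame constant with a fitted plane (a + b·x + c·y) gives the gradient matching described in the quote.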


The idea I am currently following is to define background levels based on the 40mm frames. For that I want to extract from the 40mm frame an area matching the frame we wish to modify, and then compare the background levels found in a few sample areas of the extracted part and of the original image (see the sketch below).

This could help make sure that we do not accumulate errors in background levels when images do not match.
This step may become obsolete once we also remove the 'gross' gradients based on the 40mm images, but in my current tests I was not able to reproduce the very good results we got when Steffen (the GraXpert developer) did this; my results always show a color shift in some areas that even I can see...
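To make the 40mm comparison concrete, here is roughly what I mean - only a sketch, with the crop coordinates assumed to come from plate-solving both frames and all names made up:

```python
# Sketch of the 40mm reference idea: crop the part of the wide-field (40mm)
# frame that covers a given survey field, resample it onto the field's pixel
# grid, then compare median background levels in a few sample boxes.
import numpy as np

def background_offsets(field_img, wide_img, crop, boxes, box_size=25):
    """crop: (y0, y1, x0, x1) of the field inside the 40mm frame (from the WCS).
    boxes: list of (y, x) sample positions in field-image pixel coordinates."""
    y0, y1, x0, x1 = crop
    ref = wide_img[y0:y1, x0:x1]
    # nearest-neighbour resampling is good enough for a smooth background
    ys = np.linspace(0, ref.shape[0] - 1, field_img.shape[0]).astype(int)
    xs = np.linspace(0, ref.shape[1] - 1, field_img.shape[1]).astype(int)
    ref = ref[np.ix_(ys, xs)]
    h = box_size // 2
    offsets = []
    for (y, x) in boxes:
        field_bg = np.median(field_img[y - h:y + h + 1, x - h:x + h + 1])
        ref_bg = np.median(ref[y - h:y + h + 1, x - h:x + h + 1])
        offsets.append(field_bg - ref_bg)   # field minus 40mm reference
    return offsets
```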

So I am currently trying to learn PI scripting well enough to script the necessary steps in PI and then try to fix the errors I see, hopefully with Steffen's help. As we are working with a number of images at the same time, things very easily get confusing, so I think a script is a better way to do this.

Likely what @James Tickner  is doing will get us good results a lot faster, but this whole journey is also about learning new tricks, so I guess it is good to try different ways and to always be able to redo steps in an automated fashion, so that if we find a better way we can redo things without much manual effort...

Parts of this idea have already ended up in the latest GraXpert: the team accepted a change request from my side, and now we can also use the non-AI method of GraXpert to clean up gradients and re-use the sample points and settings that produced that result, so that we can (for example) reuse those points when data gets added to a field or we introduce a new tool to our workflow.

@Brian Boyle : let me know if you are interested in this, then I can create a new thread on the topic. The reason I invested time in this area was that some of the gradients we had were not fully removed by the AI version, and sometimes I think I saw that around bright stars the AI was artificially darkening things a bit too much.

Michael
Astrogerdt 0.00
@Michael Ring In case you need any help with PixInsight scripting or programming in general, let me know - I already have some (albeit limited) experience with PixInsight scripting and more with programming in general.

And, also in case you need the functionality, I wrote a small script to find a 25x25 region of true background in RGB images. Here is my announcement in the new scripts part of the PI forum: https://pixinsight.com/forum/index.php?threads/findbackground-new-script-for-finding-true-background-in-an-image.22417/
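The gist of it is very simple (this is just an illustrative sketch, not the script itself): slide a 25x25 window over the image and keep the darkest, smoothest tile.

```python
# Basic idea only: pick the 25x25 tile with the lowest median that is also
# smooth (i.e. contains no faint stars or nebulosity).
import numpy as np

def find_background_region(img, size=25, step=5):
    """img: 2D mono or (H, W, 3) RGB array.  Returns (y, x) of the best tile's
    top-left corner."""
    lum = img if img.ndim == 2 else img.mean(axis=2)
    best, best_score = None, np.inf
    for y in range(0, lum.shape[0] - size, step):
        for x in range(0, lum.shape[1] - size, step):
            tile = lum[y:y + size, x:x + size]
            med = np.median(tile)
            mad = np.median(np.abs(tile - med))
            score = med + 10.0 * mad   # favour dark AND smooth tiles; the weight is a guess
            if score < best_score:
                best, best_score = (y, x), score
    return best
```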

CS Gerrit
MichaelRing 3.94
Nice!

Many thanks, I will message you directly when I need help...

Michael
Astrogerdt 0.00
Hi all, 

@James Tickner I recently read this post by Juan (PixInsight developer): https://pixinsight.com/forum/index.php?threads/spcc-failure.22572/post-143815

It is about the suitability of the GAIA magnitude values for color calibration without using the spectral data. His conclusion is that the G filter used is not very similar to the ones normally used. 
With my own script I also encountered some issues with the catalog brightnesses; however, I am not quite sure whether the error is in my code or in the data themselves. 

Maybe this is worth looking into for our own color calibration. 

CS Gerrit
james.tickner 1.20
Hi guys

Sorry for the long delay due to some work travel, but I finally got around to looking at the impact of BXT on star colour and brightness. I took the 9 images that Gerrit had kindly processed through BXT and ran them through my processing chain. I then compared the QC reports of the BXT-sharpened and original images.

The results for all 9 fields are available in my Google Drive in the BXT comparison folder (link here https://drive.google.com/drive/folders/1xCLRfFOiEqI-pp01m8FBZ8GFEuPEbJV1?usp=sharing).

The Stage 0 reports contain information about star colour and brightness. They show that BXT unfortunately does not conserve flux, meaning that the total luminosity of stars changes. As far as I can tell, it seems that stars with bad optical distortion end up brighter than they should be. The quality of the agreement between measured and catalog luminosities is degraded after applying BXT, with the effect most pronounced for images where the distortion is greatest.
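For anyone who wants to reproduce this kind of check on their own data, the comparison boils down to something like the following sketch (not the Stage 0 report code itself; the matched star lists are assumed to exist already):

```python
# Sketch: quantify how well measured star magnitudes track the catalog.
# m_meas and m_cat are matched arrays (same stars, same order).
import numpy as np

def mag_agreement(m_meas, m_cat):
    resid = np.asarray(m_meas) - np.asarray(m_cat)
    zero_point = np.median(resid)                             # arbitrary flux scale
    scatter = 1.4826 * np.median(np.abs(resid - zero_point))  # robust RMS (mag)
    return zero_point, scatter

# Running this on the raw and the BXT-processed versions of the same field and
# comparing the two scatter values shows how uniformly flux has been preserved.
```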

Here's an example for field 002 where the distortion is quite low.

First the result for the original raw image - note the good correlation between measured and catalog magnitudes.
image.png
Now the result for the BXT image - correlation is worse and the star flatness is not so good.
image.png

Now here are the results for field 221 where the distortion was large, particularly in the top-right corner. Again, here is the raw image result first:
image.png
And now after applying BXT. Stars in the top-right and bottom-left corners have been brightened by 0.5-1.0 mag, and the result is a poor correlation with catalog magnitudes. The star flatness is very bad. It seems in this case that even the background flatness is also affected.
image.png
The good news is that as expected, BXT significantly improves resolution across the image, and does a great job of reducing apparent optical distortion. This is shown in the stage 1 reports. Here is an example for the same field 221 which shows bad tilt.

First, the result for the raw image without BXT - note the poor star sizes at top-right and bottom-left.
image.png
Next the result for the BXT image:
image.png
Resolution in the centre of the image has improved from 22.6" to 15.5", and the tilt and off-axis aberration are both much better. 

I experimented with applying a smooth, artificial flat to the BXT images to improve the flatness of the star brightness. This works to a reasonable degree, but then introduces a significant non-flatness to the background. Again, here is field 221 as an example - stage 0 processing on the BXT image with a multiplicative correction applied to flatten out star magnitudes across the field.
image.png
Star flatness is improved (but still not as good as in the original raw image), but the background has been artificially brightened in the lower left part of the image. I could probably remove this with gradient subtraction, but it feels like the different corrections are now starting to fight one another. I'm also concerned that BXT's impact on star brightness might be different from its impact on the brightness of extended structures, so we would be introducing still more distortions.
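For reference, the multiplicative correction I am describing is conceptually just this (a simplified sketch of the idea, not the exact code I used):

```python
# Simplified sketch: fit a smooth low-order 2D polynomial to the star
# magnitude residuals (measured - catalog, zero point removed) as a function
# of position, then turn it into a multiplicative correction image.
import numpy as np

def star_flat(x, y, dmag, shape, order=2):
    terms = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack([x**i * y**j for (i, j) in terms])
    coeff, *_ = np.linalg.lstsq(A, dmag, rcond=None)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    dmag_map = sum(c * xx**i * yy**j for c, (i, j) in zip(coeff, terms))
    # a star measured 0.5 mag too bright has dmag = -0.5, giving a factor of
    # 10**(0.4 * -0.5) ~ 0.63, i.e. the image is dimmed there
    return 10.0 ** (0.4 * dmag_map)

# corrected = bxt_image * star_flat(x_stars, y_stars, dmag_stars, bxt_image.shape)
```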

So, it seems that if we want the benefits of BXT on resolution and star shape then we have to live with its impact on star colour and brightness, or vice versa. Gerrit: which version of BXT did you use? Do I correctly recall some forum comments indicating that this problem might be reduced in later versions? Would it be worth sharing the results with the BXT developer for his thoughts?

I'd appreciate your suggestions!

James
profbriannz 16.52
Hi James

Great work.  I would certainly share this with Russell.  He appears very approachable on these matters.

Mathematically, deconvolution is inherently flux-preserving, so the change in star brightness should not be happening if this is true deconvolution.   But I am no software expert, so perhaps this is inevitable in the way it is coded up.
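For what it's worth, the textbook Richardson-Lucy update makes the flux argument explicit. With data $d$, estimate $u$ and a normalised PSF $p$ (and ignoring edge effects):

$$u^{(t+1)}(x) = u^{(t)}(x)\sum_{y}\frac{d(y)}{\big(u^{(t)}*p\big)(y)}\,p(y-x)
\;\Longrightarrow\;
\sum_{x}u^{(t+1)}(x)=\sum_{y}\frac{d(y)}{\big(u^{(t)}*p\big)(y)}\sum_{x}u^{(t)}(x)\,p(y-x)=\sum_{y}d(y),$$

so the total flux of the estimate equals the total flux of the data at every iteration. Whatever BXT does internally, it evidently is not constrained in this way.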

Regarding what we keep [consistent star shapes, star brightnesses or background flatness], I would certainly vote for star shapes and background if we couldn't keep all three.

I think the survey will be used primarily for visual inspection and not stellar photometry [GAIA does a better job of that].  In any event, I imagine the final survey product will be a stretched non-linear image.   Anything that allows us to seamlessly merge and stretch multiple fields, minimising seams while maximising visibility of both bright and faint objects, would have my vote.

By the way, I managed to complete and upload the first 2 fields of the Feb lunation.  There is every likelihood we will complete a first pass of the Southern Sky below -38 [and possibly below -33] in the next month.

CS Brian
MichaelRing 3.94
Most important: yes, please share your insight with Russell. I had contact with him several times in the past (when StarXterminator was still in its early days...) and I agree with Brian, he is quite approachable. If you do not want to contact him directly, I guess either Brian or I can contact him, as we are paying customers of his tools.

On star brightness a thought:

@James Tickner , if nothing comes back from the discussion with Russell we might think about bringing in another tool, StarXterminator, if we really need to bring star brightness under control.
That way you could apply your gradient to the stars only, to correct their brightness. The main question is of course how strong the impact of the too-bright stars effectively is, as I guess we will likely not use stars at their full brightness anyway (and by this reduce dynamic range) to make the dust/objects stand out more.
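Concretely, I am thinking of something like this (only a sketch, assuming a linear image and a starless frame from SXT; 'correction' would be the smooth 2D map James derives from the catalog comparison):

```python
# Sketch of a star-only brightness correction: split the linear image into a
# starless part (from StarXterminator) and a stars-only part, scale the stars
# by a positional correction map, then recombine.
def correct_stars_only(image, starless, correction):
    stars_only = image - starless                 # additive star separation
    return starless + stars_only * correction     # correction: 2D array, same shape as image
```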

Is my interpretation of the stage 1 reports correct that there is no extreme color cast after applying BlurXterminator? I remember that one of the most obvious problems we saw was a certain color cast after SPCC/BlurX correct-only, which does not seem to show up when we use your color management routines - or is this a misinterpretation?

By the way, @James Tickner , did you find the time to upload some of your re-stacked files to your share? I would love to gather SPCC stats from them...

Michael
james.tickner 1.20
@Michael Ring @Brian Boyle Thanks for the feedback!

Whilst the BlurXterminator home page describes it as 'an AI-powered deconvolution tool', I do wonder if it is actually performing a RL or similar style deconvolution (with the PSF determined via AI), or whether it is performing a straight neural-network mapping (eg taking a 20x20 pixel region and mapping the 400 input values to 400 output values via a neural net). If the former then flux-preservation would be expected; if the latter, then it would depend on whether the training set was designed to be flux preserving or not.
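Out of curiosity, a textbook RL loop on synthetic data (obviously nothing to do with BXT's internals) shows the flux bookkeeping one would expect from a true deconvolution:

```python
# Toy check: a plain Richardson-Lucy loop conserves total flux essentially
# exactly when the PSF is normalised (the small pedestal just keeps the
# division well behaved in this synthetic example).
import numpy as np
from scipy.signal import fftconvolve

truth = np.zeros((128, 128))
truth[40:90:10, 40:90:10] = 1000.0                       # a grid of fake stars
yy, xx = np.mgrid[-10:11, -10:11]
psf = np.exp(-(xx**2 + yy**2) / (2 * 3.0**2))
psf /= psf.sum()                                         # normalised PSF
data = fftconvolve(truth, psf, mode="same") + 50.0       # blurred stars + pedestal

estimate = np.full_like(data, data.mean())
for _ in range(30):
    blurred = fftconvolve(estimate, psf, mode="same")
    ratio = data / np.maximum(blurred, 1e-9)
    estimate *= fftconvolve(ratio, psf[::-1, ::-1], mode="same")

print(data.sum(), estimate.sum())                        # the two totals agree
```

If BXT were doing the straight neural-network mapping instead, there is nothing in the maths that forces the output flux to match the input.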

Could one of you PM me Russell's email address, or even better email him and make me an introduction (I'm a bit old-fashioned like that ). I'll pick up the conversation with him.

Whilst I agree that the benefits of sharpness and field flatness are ones that we want, I'm a bit concerned that the results on field 221 seem to suggest that BXT is affecting both star brightness and also background brightness, and the impact is different in the two cases. So even if we fix up the stars (by applying a calculated flat-field correction, or removing via StarXterminator and then re-adding after correction), I worry that the effect on diffuse structures will remain. I appreciate your comment Brian that we're not aiming for a photometrically perfect result (and non-linear stretching will break that aim anyway), but my concern is that we might introduce mosaicing artefacts, for example where a 'tilted-and-BXT-corrected-and-brightened' image meets an 'untilted-unbrightened' image. This should be something we can test though, just by picking a couple of neighbouring images that pass through an obvious bright structure like the SMC, LMC or a portion of the Milky Way.

@Michael Ring  I don't think anything I've done recently should have affected the colour casts, but I haven't looked carefully at this. Also, I'll upload my newly stacked images when I get a chance at work - it will take forever from my home connection. @Brian Boyle Do I remember correctly that you were going to restack your files as well to deal with the 'donut' issue? Give me a heads-up as these new files become available and I'll pick them up and remake the QC reports.
Astrogerdt 0.00
I finally found the time to catch up with what is going on here. 

@James Tickner I am using the current Version, with AI Version 4. 

As for the destructive vs beneficial use of BXT, what if we apply it as a kind of last step?
After color calibration we have a set of clean data that is as pure as possible. From that, we could create a set of resized, stretched and stitched images that show the sky as purely as possible. This part of the workflow should be automated; at least I think this is possible with PixInsight, and certainly with other software too. 
We could also create a second set of images based on the same color-calibrated data: applying BXT and SXT, heavily stretching the nebulae and slightly stretching the stars, and then combining them again. This should also be entirely possible in PixInsight. Then stitch these images too. 

Yes, that creates additional work, but we could get one set where we know for sure that there can be nothing that is invented by some AI, and another set that is much deeper and shows more fine details. The first could be used as a reference for processing, to know what certainly is there, while the second could be used for identifying interesting spots on the sky and planning exposures. 

What do you think about that approach?

As for whether BXT does real deconvolution or some AI magic, I personally believe it does the latter. The differentiation between nebulae, stars and background is a clear indicator, to me, that there is more at work than simple traditional algorithms. 

CS Gerrit
profbriannz 16.52
Hi James,

I can restack my files if required, but I haven't started yet.  It is a massive job.  About 200 fields at 1.5hours/field using my 2017 iMac.  [And I have to do it when I am not using my Mac for anything else].

It would be helpful to know which are acceptable for mosaicing purposes, so I can just focus on them.  However, until we have a successful mosaic we won't know that either.

The safest way for me to avoid doughnuts is simply to switch off [or dramatically reduce] the cosmetic correction.  Can you give me the numbers of fields which span the range of severity of the issue [from minor to major], so I can experiment with the best approach?

Once I have that, I will slowly work my way out from the South Celestial Pole and record my progress on the booking sheet.  

Thanks
Brian
profbriannz 16.52
@James Tickner introductory email sent to Russell Croman...
james.tickner 1.20
Astrogerdt:
I finally found the time to catch up with what is going on here. 

@James Tickner I am using the current Version, with AI Version 4. 

As for the destructive vs beneficial use of BXT, what if we apply it as a kind of last step?
After color calibration we have a set of clean data that is as pure as possible. From that, we could create a set of resized, stretched and stitched images that show the sky as purely as possible. This part of the workflow should be automated; at least I think this is possible with PixInsight, and certainly with other software too. 
We could also create a second set of images based on the same color-calibrated data: applying BXT and SXT, heavily stretching the nebulae and slightly stretching the stars, and then combining them again. This should also be entirely possible in PixInsight. Then stitch these images too. 

Yes, that creates additional work, but we could get one set where we know for sure that there can be nothing that is invented by some AI, and another set that is much deeper and shows more fine details. The first could be used as a reference for processing, to know what certainly is there, while the second could be used for identifying interesting spots on the sky and planning exposures. 

What do you think about that approach?

As for whether BXT does real deconvolution or some AI magic, I personally believe it does the latter. The differentiation between nebulae, stars and background is a clear indicator, to me, that there is more at work than simple traditional algorithms. 

CS Gerrit

Hi Gerrit

I had a look at the description of BXT AI v 4. It does suggest that flux conservation should be better than earlier versions, so it's interesting that we are seeing something different. One thing it does suggest is to initially apply BXT in 'correct only' mode (ie just fixing optical distortions). Sharpening should then be done later after the mosaic is assembled.

Just to rule this out, would you be able to run BXT on just field 221 in 'correct only' mode?  I'm interested to see whether we still see the brightness distortion with this setting. 

Regarding your workflow suggestions, I'm still hoping to get to the point where we can produce good mosaics of linear images that can then be stretched to taste afterwards. 

James
james.tickner 1.20
Brian Boyle:
@James Tickner introductory email sent to Russell Croman...

Thanks Brian - sorry not to have picked this up sooner, but I don't use Gmail very much any more.

And I've been called many things in my life, but I don't think I've ever been auto-corrected to James Twaddle before 
james.tickner 1.20
Brian Boyle:
Hi James,

I can restack my files if required, but I haven't started yet.  It is a massive job.  About 200 fields at 1.5hours/field using my 2017 iMac.  [And I have to do it when I am not using my Mac for anything else].

It would be helpful to know which are acceptable for mosaicing purposes, so I can just focus on them.  However, until we have a successful mosaic we won't know that either.

The safest way for me to avoid doughnuts is simply to switch off [or dramatically reduce] the cosmetic correction.  Can you give me the numbers of fields which span the range of severity of the issue [from minor to major], so I can experiment with the best approach?

Once I have that, I will slowly work my way out from the South Celestial Pole and record my progress on the booking sheet.  

Thanks
Brian

I'll put together some donut-detection code and let you know. Probably something simple like comparing flux in a small radius (1-2 pixels) with flux in a large radius (my usual 7 pixels for star flux determination) - that should identify the problem hopefully.
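Roughly what I have in mind (a sketch only - the threshold is a guess and the cutouts are assumed to be background-subtracted):

```python
# Sketch of a donut check: for each detected star, compare the flux inside a
# small central aperture with the flux inside the usual 7-pixel aperture.
# A healthy PSF puts most of its light in the core; a donut does not.
import numpy as np

def donut_fraction(image, stars, r_core=2.0, r_outer=7.0, threshold=0.3):
    """image: background-subtracted frame; stars: list of (y, x) centroids."""
    yy, xx = np.indices(image.shape)
    flagged = 0
    for (y, x) in stars:
        r = np.hypot(yy - y, xx - x)
        core = image[r <= r_core].sum()
        total = image[r <= r_outer].sum()
        if total > 0 and core / total < threshold:   # too little light in the core
            flagged += 1
    return flagged / max(len(stars), 1)
```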
Astrogerdt 0.00
@James Tickner I checked my BXT processed files again and they were all processed in Correct only mode already. 

Seems like this setting does not preserve the flux as well as we hoped it would. 

CS Gerrit
profbriannz 16.52
James Tickner:
Brian Boyle:
Hi James,

I can restack my files if required, but I haven't started yet.  It is a massive job.  About 200 fields at 1.5hours/field using my 2017 iMac.  [And I have to do it when I am not using my Mac for anything else].

It would be helpful to know which are acceptable for mosaicing purposes, so I can just focus on them.  However, until we have a successful mosaic we won't know that either.

The safest way for me to avoid doughnuts is simply to switch off [or dramatically reduce] the cosmetic correction.  Can you give me the numbers of fields which span the range of severity of the issue [from minor to major], so I can experiment with the best approach?

Once I have that, I will slowly work my way out from the South Celestial Pole and record my progress on the booking sheet.  

Thanks
Brian

I'll put together some donut-detection code and let you know. Probably something simple like comparing flux in a small radius (1-2 pixels) with flux in a large radius (my usual 7 pixels for star flux determination) - that should identify the problem hopefully.


@James Tickner Are F168 and F200 [now on DropBox] any better from the doughnut viewpoint?

CS Brian
james.tickner 1.20
A quick update on the BXT colour issues. As noted in earlier discussions, BXT seems to introduce some odd colour and brightness changes to stars, the sky background and DSOs. It also does a great job of both cleaning up ugly stars in the corner of the fields, and improving visibility of faint stars and DSOs, so it's a good thing to include in the workflow. Brian and I have communicated with Russell, who has noted that BXT is intended mainly for visual image improvement and does not guarantee flux conservation, so for now this is something we will have to work with.

The image below is for field 221, which shows bad camera tilt and strong BXT colour artefacts. Row 1 shows the original image with a simple x5 linear stretch. Row 2 shows the BXT image with the same x5 stretch. Actually, BXT also introduces a strong offset to the sky background, which I've corrected by simply subtracting a separate constant value for R, G and B. Without this correction, the images are completely washed out. The resulting greenish background and green/yellow star colour cast is quite apparent.

Row 3 shows the 'corrected' BXT image. I've subtracted an estimate of the sky background gradient, and applied a smoothed (x,y) dependent multiplicative correction based on the comparison of measured and catalog RGB star magnitudes. Effectively, this is trying to get star colours correct across the field. 

Column 1 shows the entire image field; column 2 is a crop of the top right corner of the image where star shape is worst; column 3 is a crop of the bottom centre of the field.

By construction, the corrected BXT image has a flat, neutrally coloured background and approximately 'correct' star colours (there might be a bit more to be tweaked in the star colour correction - still working on the details of this). The main issue is the colours introduced to the galaxies, which appear with a purple cast in column 2 and a yellow cast in column 3. Galaxies in both crops are neutrally coloured in the original raw image.

I feel that the increased definition and star clarity resulting from BXT is something we should aim to include, but can we live with the colour casts that it introduces?

I should emphasise that field 221 has some of the worst distortions, and consequently the worst colour distortions. The impact on other fields should be significantly less I think.

Appreciate any thoughts or suggestions!

field221_comparison.png
Alan_Brunelle
Brian Boyle:
James Tickner:
A quick update ...
  • A combination of turning off the hot/cold spot correction in DSS and replacing my old, poor-quality flats with a more recently acquired set seems to have cleaned up most of my field flatness issues. I'm now consistently seeing a flatness metric of < 20%, comparable to what I get on images from Brian, Michael, Dan and Todd. The bad news is that I have to laboriously restack ~10,000 images, which is going to take a while. I can't get DSS's batch stacking process working, so it's one at a time for the next few days! Once the files are updated, I'll re-upload them to my Google Drive.
  • @Brian Boyle On closer inspection, I'm also seeing the 'donut' stars reported by Gerrit in at least some of your raw images (I've only checked a handful). I wonder if this might account for the greater catalog v measured star magnitude dispersion I'm seeing. I don't know if you've tracked down the cause of this yet.



I was preparing myself for the day when I might have to re-process all my images.    I suspect I do have holes from my cosmetic correction in WBPP.

It's been tough working through this long thread to get an idea of what a newbie here needs to do to contribute usable data.  I saw this issue mentioned with cosmetic correction in PI and recalled I had the same issue with donut-hole stars.  I also tracked it down to CosmeticCorrection used in WBPP.  For me the solution is to check Auto detect, enable hot sigma and set it to 10 rather than the default.  I don't use any other settings in CC since, as far as I can tell, it generally provides little benefit to my images.  Sorry if this issue is resolved later in the thread, but reading this whole thing is slow going.

Alan
Alan_Brunelle
James Tickner:
A quick update on the BXT colour issues. As noted in earlier discussions, BXT seems to introduce some odd colour and brightness changes to stars, the sky background and DSOs. It also does a great job of both cleaning up ugly stars in the corner of the fields, and improving visibility of faint stars and DSOs, so it's a good thing to include in the workflow. Brian and I have communicated with Russell, who has noted that BXT is intended mainly for visual image improvement and does not guarantee flux conservation, so for now this is something we will have to work with.

The image below is for field 221, which shows bad camera tilt and strong BXT colour artefacts. Row 1 shows the original image with a simple x5 linear stretch. Row 2 shows the BXT image with the same x5 stretch. Actually, BXT also introduces a strong offset to the sky background, which I've corrected by simply subtracting a separate constant value for R, G and B. Without this correction, the images are completely washed out. The resulting greenish background and green/yellow star colour cast is quite apparent.

Row 3 shows the 'corrected' BXT image. I've subtracted an estimate of the sky background gradient, and applied a smoothed (x,y) dependent multiplicative correction based on the comparison of measured and catalog RGB star magnitudes. Effectively, this is trying to get star colours correct across the field. 

Column 1 shows the entire image field; column 2 is a crop of the top right corner of the image where star shape is worst; column 3 is a crop of the bottom centre of the field.

By construction, the corrected BXT image has a flat, neutrally coloured background and approximately 'correct' star colours (there might be a bit more to be tweaked in the star colour correction - still working on the details of this). The main issue is the colours introduced to the galaxies, which appear with a purple cast in column 2 and a yellow cast in column 3. Galaxies in both crops are neutrally coloured in the original raw image.

I feel that the increased definition and star clarity resulting from BXT is something we should aim to include, but can we live with the colour casts that it introduces?

I should emphasise that field 221 has some of the worst distortions, and consequently the worst colour distortions. The impact on other fields should be significantly less I think.

Appreciate any thoughts or suggestions!

field221_comparison.png

As I have worked up my first images, I had thought about the use of BXT.  Having now read the whole thread, I am pleased that this and other processing ideas have been under discussion for some time.  Not to bury my primary question: are these processing steps, such as applying BXT to the images, intended to be done only by you guys once you get your hands on our stacked images?  It was my understanding that the images should be delivered without any manipulation beyond the stacking process.  I really have not seen the color shift in stars or background that you are discussing, but then I am not using the tools you are using to measure it.  In any case, if you want us contributors to apply BXT, I have no problem doing it, if you provide the rules of the road for such activities.

I'm not sure how paranoid you want to be about the effect on flux or star brightness for stars that have undergone a BXT correct-only treatment.  A simple way to see whether there is any real issue is to measure magnitudes with and without the treatment in ASTAP, to see whether the stars are affected and by how much.  If done for stars of different magnitude classes, it would also show whether the problem is universal across the whole range of stars in a typical image or only affects certain brightness classes.
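If anyone wants to script that comparison rather than click through ASTAP, it is only a few lines (a sketch; the before/after star lists are assumed to be matched already):

```python
# Sketch: compare star magnitudes measured before and after the BXT
# correct-only pass, binned by catalog brightness, to see whether any flux
# change depends on how bright the star is.
import numpy as np

def delta_by_brightness(m_before, m_after, m_catalog, edges=(8, 10, 12, 14, 16)):
    m_before, m_after, m_cat = map(np.asarray, (m_before, m_after, m_catalog))
    delta = m_after - m_before                         # magnitude change per star
    for lo, hi in zip(edges[:-1], edges[1:]):
        sel = (m_cat >= lo) & (m_cat < hi)
        if sel.any():
            print(f"{lo}-{hi} mag: median dm = {np.median(delta[sel]):+.3f}  ({sel.sum()} stars)")
```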

I have been working on my 135 for some time to try to reduce and eliminate the tilt in my system.  Shimming here and there and then testing.  I have since learned that the tilt is within my camera (an old-style QHY268C, which uses a dovetail/rotator connection).  A thin shim of electrical tape on the key location of the male edge of my dovetail and now my tilt is mostly gone, and it remains decent even upon camera rotation.  That said, my stars now show no coma, but the stars in the extremes (C-sensor) are a bit X-shaped, mostly seen in the smallest stars.  I see essentially no color or halos anymore.  BXT makes these stars look perfect.  I thought of stopping down the lens, and even bought a few stop-down rings to use, but I really want to work wide open.  Stopping down adds time, and time kills throughput for this weather-limited imager.  Not only that, but my small stars are quite small (working with 3.76 um pixels) and I am not sure the shape of the stars will even show up when the image is rescaled to 10"/pix.

I am going to claim a grouping of frames on the spreadsheet with the hope of getting 6 or so done when I travel to a dark sky site for a couple nights.  Still no guarantee of clear skies, so fingers crossed.  If I claim these blocks and fail to get them during the next new moon, is there a mechanism to "unclaim" them, so others can do them if I cannot?  Not that there isn't lots of other stuff available up here on the top of the world!

Alan
MichaelRing 3.94
Hi Alan,
I reserved the fields for you. Currently there is still so much to do in the North that we can simply keep the reservation and you can add data later. If by accident or chance I also happen to get data on a field that you reserved, I will add my name to the reservation and we can later see whether it's worth combining our data or not.

On the topic of BXT: please do not apply it - after doing the stacking your job is done... We will very likely apply BXT to the images, but the processing flow is not finalized yet, and who knows which tool will be invented next to bring our images to the next level. For that reason we should do only what is absolutely necessary to our data, so that we can adapt to the requirements of new tools as they evolve without needing to request the data again from the contributors.

I wish you lots of fun and of course clear skies on your trip!

Michael
Alan_Brunelle
Hi Michael,

Thanks for your clear and concise answer on BXT.  I think the window of opportunity for some of the fields that I signed up for is getting late, so I sure hope I get that clean night to capture them.  If I fail, then those fields will likely have to wait until next year.

If my understanding is correct, at my chosen travel site I will be at Bortle 1: SQM 22.00, brightness 0.172 mcd/m2, artificial brightness 0.370 μcd/m2.  So I am hoping the 1 hr total integration time works for me and I can do a decent number of fields.  If I need lower artificial-light numbers, there are vast areas east of my chosen location that have 0.000 numbers, but the drive time and exposure to the elements and unknowns mean I will need to get more familiar with the area before I risk it.
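(As a sanity check, those two numbers are consistent with each other via the commonly quoted SQM-to-luminance conversion:

$$L \approx 10.8\times10^{4}\times10^{-0.4\,\mathrm{SQM}}\ \mathrm{cd/m^{2}}
 = 10.8\times10^{4}\times10^{-0.4\times22.0} \approx 1.7\times10^{-4}\ \mathrm{cd/m^{2}} \approx 0.17\ \mathrm{mcd/m^{2}},$$

which matches the 0.172 mcd/m2 quoted.)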

Best,
Alan
profbriannz 16.52
A quick summary and some issues

1) Good progress on imaging in the south, and encouraging developments in the north [thanks to @Alan Brunelle @Michael Ring @James Tickner ]. 

2) @James Tickner is working on the mosaicing using BXT.  Do we know when we might see the next iteration of mosaicing output?  I think this will also be helpful for QC.   Camera tilt and colour gradients are all being addressed, but depth and (possibly) haze will, I think, prove the key factors in rejecting frames.  Any tabular output from the mosaicing process giving these characteristics for each field would be helpful as we move into the 2nd year of the survey.  

3) I am conscious I still have a 150-field backlog to process and re-process.  @James Tickner to what extent are my doughnut stars causing an issue for you?   I am saving my pennies for a modern Mac.   I could start the slow process of re-processing if this were a priority, or alternatively process my outstanding fields if e.g. we wished to produce another progress map.  

4) @Michael Ring and I plan to image [or have imaged] the sky with a 40mm Sigma Art f1.4.  @James Tickner To what extent will this be helpful to you?


Overall, we continue to make good progress against my expected timeline of 2-3 years for the survey, with the final pipeline process happening only a little before survey completion!    Great work everyone.  

CS Brian
profbriannz 16.52
@Alan Brunelle

Thanks for the helpful advice on the use of CosCor.  You are absolutely correct in your thinking that CosmeticCorrection was to blame. 

In my recent WBPP pipeline, I have simply not used CosmeticCorrection.  As you said, I am not sure CC actually provides any help, as most [all?] of the hot pixels that would be picked up by a 10-sigma detection appear to be taken out by the rejection algorithm during integration.  

My quandary is whether to re-process with high-sigma rejection in CC or with no CC at all, as I can't see any difference.

CS Brian
MichaelRing 3.94
@Brian Boyle  I have built a quite fast PC for stacking; if you are interested, you can send me a USB stick with your raw data and we can stack it on this PC.