TheSky X autodark issue suspected

Discussion in 'Legacy Models - Community Support' started by Colin Haig, Nov 20, 2020.

  1. Colin Haig

    Colin Haig Staff Member

    Joined:
    Oct 27, 2014
    Messages:
    7,962
    Location:
    Earth
    I had another look at that 9114 image, as I realized you probably didn't give us a raw unprocessed image.
    It was auto-dark subtracted, not a raw image, and it looks like the auto-dark wasn't right - I can see faint stars were subtracted.
    I also see you changed the version of TheSkyX that you are running.
    Now you are at 12827. I just checked, and the latest daily build is 12831.
    Possible next steps:
    (a) Provide us with raw, unprocessed images (a light through the Ha filter and a dark), binned 1x1, in FITS format, preferably of the same target. Then we can rule an ongoing camera fault in or out.
    (b) Contact Software Bisque regarding the apparent autodark problem.
    (c) Why are you using autodarks anyway? Usually you want to take perhaps a few dozen light frames, build a dark library, and then calibrate them.
     
  2. Doug

    Doug Staff Member

    Joined:
    Sep 25, 2014
    Messages:
    10,316
    If you stretch the 9114 image very hard, you get dark spots on the left side of stars. That is suggestive of a failed dark subtraction caused by stars being present in the dark frame. I see these have "BD" status, meaning bias and dark have been subtracted.

    Might I suggest a simpler and much better solution? You should never be using autodarks in the first place. They're fine when focusing or centering targets, but when you're taking long exposures they are the last thing you want.

    You want to use full calibration, in post processing, for multiple reasons:
    1. If there is a problem with your calibration frame(s), you do not ruin all of your raw images. Thus you avoid having to throw out data that was arduously collected on the rare clear dark nights.
    2. Calibration frames add their own noise to your final result. You want to average a bunch of calibration frames in order to reduce that noise contribution (see the sketch at the end of this post). Typically the software will have this capability as part of the calibration function.
    3. If you're not happy with the final result, you can go back and reshoot dark frames etc.
    4. You can collect tons of dark frames etc. at the beginning or end of your observing session, and you don't waste quality skies on them. Spending time recording the back of the shutter is not an efficient use of clear dark skies!
    5. You can build up a collection of calibration frames and use them for data from multiple nights. I often keep calibration frames for months before reshooting them.
    So start shooting your images RAW, with no autodark or other processing. Shoot your dark frames separately, and check them before you use them. You will get much better results, and whatever is happening here won't destroy your images.
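
    To put a number on point 2, here is a toy sketch in Python/numpy - simulated data only, not from any real camera - showing why an averaged master dark adds far less noise than a single dark:

    import numpy as np

    rng = np.random.default_rng(0)
    read_noise = 10.0      # ADU, a made-up read noise level
    n_darks = 16           # size of a hypothetical dark library

    # Simulated darks: a fixed thermal pattern plus fresh read noise in each frame.
    pattern = rng.uniform(100.0, 200.0, size=(512, 512))
    darks = pattern + rng.normal(0.0, read_noise, size=(n_darks, 512, 512))

    single = darks[0]               # roughly what a single autodark gives you
    master = darks.mean(axis=0)     # averaged master dark

    print("noise added by one dark   :", (single - pattern).std())   # ~10 ADU
    print("noise added by the master :", (master - pattern).std())   # ~10 / sqrt(16) = 2.5 ADU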
     
  3. JoeGafford

    JoeGafford Cyanogen Customer

    Joined:
    Feb 22, 2016
    Messages:
    30
    Location:
    Denver, CO
    I see the same dark pixels in the Lum, and more clearly in the green and Ha filtered images when stretched. They show up most in the narrowband images. These dark pixels are normal and may increase in number as the CCD chip ages. Use dark and light pixel mapping to get rid of them. I have them in my KAI 2020 chip. Master darks remade every 6 months, plus flat fielding, may help, especially with the donuts and most of the vignetting.
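
    For what it is worth, here is a rough sketch of the kind of pixel mapping I mean, in Python with numpy, scipy and astropy (the file names are just placeholders, and your processing software almost certainly has its own tool for this):

    import numpy as np
    from astropy.io import fits
    from scipy.ndimage import median_filter

    # Placeholder file names: a master dark and one light frame to repair.
    master_dark = fits.getdata("master_dark.fits").astype(float)
    light = fits.getdata("light.fits").astype(float)

    # Flag hot pixels as anything far above the typical dark level.
    med = np.median(master_dark)
    mad = np.median(np.abs(master_dark - med))
    hot_mask = master_dark > med + 10 * 1.4826 * mad   # threshold is a guess - tune it

    # Replace the flagged pixels with a local median of their neighbours.
    repaired = np.where(hot_mask, median_filter(light, size=5), light)

    fits.writeto("light_pixelmapped.fits", repaired.astype(np.float32), overwrite=True)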
     

    Attached Files:

  4. William B

    William B Cyanogen Customer

    Joined:
    Jan 8, 2015
    Messages:
    641
    Location:
    Christchurch, Dorset UK
    I had a different thought when looking at the 9114 image; it reminded me of a calibrated image where the dark was contaminated by an incompletely flushed preceding ‘light’ image.

    The slight offset to the left in this case would be due to the intervening mount movement (or dither) between the previous ‘light’ image and the auto-dark and ‘light’ of the current image.

    The appearance of the dark ‘spot’ artefacts against only the brightest, most saturated stars, together with the shift in location, makes me think this is a ghost of the previous ‘light’ frame contaminating the auto-dark.

    I wonder whether your camera or acquisition software has a problem with sensor flushing or RBI mitigation between exposures.

    You could try taking an automated image series of a dense star field containing several bright saturated stars, consisting of say five ‘light’ frames followed immediately by five ‘dark’ frames, in your chosen capture program, using the worst-affected filter for the lights (which you would expect to be red, Ha and SII if this is largely an RBI effect) and with no auto-dark selected.

    When you examine the image series you should see a complete change from a normal ‘light’ frame with stars to a ‘dark’ image with no stars, even when stretched hard. If instead you see a dark image that contains a faint ‘ghost’ of the previous image, and that ‘ghost’ either disappears completely or becomes even fainter in the subsequent ‘darks’, then that would confirm that sensor flushing or RBI mitigation is not working properly.

    If the above proves to be the case then, since sensor flushing and RBI mitigation are controlled by both software and hardware, you should repeat the test using CCDOps or another capture program to determine whether this is a hardware or a software issue.
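
    If you would like to put numbers on what your eyes see, here is a minimal sketch in Python with astropy and numpy (the file names for the five consecutive ‘darks’ are only placeholders) that checks whether any residual signal fades frame by frame:

    import numpy as np
    from astropy.io import fits

    # Placeholder names for the five darks taken straight after the lights.
    darks = [fits.getdata(f"dark_{i}.fits").astype(float) for i in range(1, 6)]

    # Use the last dark (the most flushed) as a reference and look for residual
    # signal in the earlier darks - a ghost should show up as excess counts
    # that fade from one frame to the next.
    reference = darks[-1]
    for i, d in enumerate(darks, start=1):
        residual = d - reference
        print(f"dark {i}: median residual {np.median(residual):+.1f} ADU, "
              f"99.9th percentile {np.percentile(residual, 99.9):+.1f} ADU")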

    HTH
     
    Last edited: Nov 26, 2020
  5. Colin Haig

    Colin Haig Staff Member

    Joined:
    Oct 27, 2014
    Messages:
    7,962
    Location:
    Earth
    William, that's what I was getting at in earlier posts - looking like RBI.
     
  6. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    OK, but I just called up image 9114 and there they are, just to the left of the bright stars. Maybe, because I am able to make them disappear, this is becoming a moot point and not worth your time?
    But, thank you for the help so far.
    Jim
     
  7. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    Thank you William. I will attempt this exercise once the moon is down.
    Jim
     
  8. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    You guys are my heroes. I have always used AutoDark because I didn't know any better. I learned more today than I have in quite a while.
    I will stop using AutoDark, and start using my calibration library. I will send a raw image through the red filters after the moon disappears.
    Thank you all.
    Jim
     
  9. Colin Haig

    Colin Haig Staff Member

    Joined:
    Oct 27, 2014
    Messages:
    7,962
    Location:
    Earth
    Jim, I'm reminded of that "NO HUNTING" sign up at the cottage... the one with bullet holes. ;)
    Once we have an image without autodark, we'll be able to tell if the camera image is clean, just like the sign was before the gunfire.
     
  10. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    Colin, I like the metaphor and will get some non-AutoDark images once the moon goes away. Jim
     
  11. JoeGafford

    JoeGafford Cyanogen Customer

    Joined:
    Feb 22, 2016
    Messages:
    30
    Location:
    Denver, CO
    I have a cap over the camera adapter flange and set the camera in the window at night, exposed to the outside when it is cold, and shoot my series of darks then. This frees up time during observing sessions. You need at least 4 images for each combination of exposure time and temperature. I do darks of 1, 2, 3, 5, and 10 minutes in the window, and less than 1 minute at the DSS. My temperature setpoints are -7C (20F) in summer, and -18C (0F), -25C (-13F), and -35C (-31F) for fall, winter, and spring.
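
    In case it helps, here is a tiny Python sketch of how I think of the library (the folder layout and file names are just an illustration, not what any particular program expects):

    from pathlib import Path

    # Illustrative layout: darks/master_dark_{seconds}s_{temp}C.fits
    LIBRARY = Path("darks")
    EXPOSURES = [60, 120, 180, 300, 600]      # seconds
    SETPOINTS = [-7, -18, -25, -35]           # degrees C

    def pick_master_dark(exposure_s, ccd_temp_c):
        """Return the library master dark closest to the requested exposure and temperature."""
        exp = min(EXPOSURES, key=lambda e: abs(e - exposure_s))
        temp = min(SETPOINTS, key=lambda t: abs(t - ccd_temp_c))
        return LIBRARY / f"master_dark_{exp}s_{temp}C.fits"

    print(pick_master_dark(180, -19.5))   # darks/master_dark_180s_-18C.fits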

    Joe
     
  12. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    Hello Colin
    Attached are raw .fit files taken this evening of M1. They have not been calibrated, and no Auto Dark was applied. One each of luminance, red, green and blue.
    Let me know what you see.
    Jim
     

    Attached Files:

  13. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    Attached is a color combination of the above files. So what are the red dots?
     

    Attached Files:

  14. William B

    William B Cyanogen Customer

    Joined:
    Jan 8, 2015
    Messages:
    641
    Location:
    Christchurch, Dorset UK
    Hi Jim.

    I had time this morning for a quick look at your latest images and there is nothing unusual in them.

    The "red" dots are the uncalibrated hot pixels in the red channel.

    If you look carefully you will see that there are also blue dots and green dots in the blue and green channels corresponding to the same group of hot pixels on the sensor.

    The displacement of the three colour 'dots' is normal and shows that the combined image aligned correctly on the stars during stacking while mount dithering moved those stars to different pixels on the sensor relative to the static hot pixels.

    These pixel artefacts will calibrate out of the final image once you apply calibration frames prior to combination, ideally the full set of bias, darks and flats, and use multiple subs in each channel so that sigma reject can correctly identify and remove those groups of bad pixels during combination.

    Because your hot pixels are grouped in several rather large clusters, you will need to use a large mount dither between sub-frames so that, when you combine the subs using a sigma-reject method, the bad pixel artefacts are rejected from the combined image.

    Looking at your worst group of hot pixels I see at least seven adjacent pixels vertically and five horizontally, so you should aim for a mount dither larger than that to avoid 'dark' spots appearing over light regions of the image after calibration has dealt with the bad pixels.

    Unfortunately, once the sensor begins to collect large clusters of adjacent hot, cold and dead pixels, there is no way around the need for a large dither and plenty of sub-exposures in each channel if you want to fully remove all pixel artefacts from the final combined image.
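
    If you want to measure that cluster size rather than count by eye, here is a minimal sketch in Python with numpy, scipy and astropy (the master dark file name is a placeholder and the hot-pixel threshold is only a guess for your sensor):

    import numpy as np
    from astropy.io import fits
    from scipy import ndimage

    dark = fits.getdata("master_dark.fits").astype(float)   # placeholder file name

    # Flag hot pixels well above the typical dark level.
    med = np.median(dark)
    mad = np.median(np.abs(dark - med))
    hot = dark > med + 10 * 1.4826 * mad

    # Group adjacent hot pixels into clusters and measure their extent.
    labels, n = ndimage.label(hot)
    widest = 0
    for sl_y, sl_x in ndimage.find_objects(labels):
        widest = max(widest, sl_y.stop - sl_y.start, sl_x.stop - sl_x.start)

    print(f"{n} hot-pixel clusters, largest spans {widest} pixels")
    print(f"so dither by more than {widest} pixels between subs")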

    Finally, your red channel is underexposed.
    The L, G and B images were taken at 180 seconds and the R at only 60 seconds, which has left the final combined image weak in the red.
    Use equal exposure weighting for the colour channels and colour balance in post processing if necessary.
    It is easier to reduce the contribution of a colour channel in post processing than to try to boost a weak one, which increases chrominance noise.
    The use of a strong deconvolution has sharpened the nebula but left rather prominent halos around the brighter stars.

    I expect Colin will add some comments too...

    William.
     
  15. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    Hi William
    Thank you for your quick response. First, the underexposed red group is my error. I must not have been paying enough attention last night when I set up the series.
    I am dithering 7 pixels. Should I raise that to 10? Also, do you mean just more exposures in the color channels? I took three each in this image, and 12 or 15 in luminance.
    Jim
     
  16. Doug

    Doug Staff Member

    Joined:
    Sep 25, 2014
    Messages:
    10,316
    I suspect the hot pixel problem isn't caused by lack of dithering... I think it may be due to the stacking process - you're not rejecting outlier pixels for some reason. Perhaps it is time to ask your software provider for assistance.
     
  17. William B

    William B Cyanogen Customer

    Joined:
    Jan 8, 2015
    Messages:
    641
    Location:
    Christchurch, Dorset UK
    Hi Doug.

    I might be wrong but I think the latest images that Jim posted are just single subs, no calibration or stacking involved.

    Hi Jim.

    Yes, raise the dither to 9 or 10 pixels and increase the number of colour subs per channel.

    For sigma reject to work adequately during combination you need enough subs to calculate an accurate mean value for any given pixel, so that sub-images containing pixels that deviate from the mean can be fully rejected in the final combined image.

    It is difficult to suggest a number, as so many variables are at work, but 6 or 8 per colour channel as a minimum would be a good guess.

    Once this problem of dark spots in your images is solved, and if the program you use for calibration and combination can combine sub-images with different binning, then try a series of LRGB with the RGB taken at bin 2x2 and the L taken at bin 1x1.

    With the increased sensitivity of binning 2x2 you can reduce the colour sub exposure time by around a third to save telescope time, and the final image won’t look much different to one taken using binning 1x1 for all channels.

    The downside is that you need matching calibration frames for the colour subs at bin 2x2, but at least these can be taken later without wasting telescope time.

    If you get a chance, try taking a series of the same object on the same night with L at bin 1x1 and two sets of RGB at 1x1 and 2x2 binning, then process the two sets separately and compare; you may find that binning the colour channels works well with your setup and helps save time.
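
    If your combination program needs the colour frames resampled to match the luminance first, here is a minimal numpy sketch (placeholder file names, and it assumes the 2x2-binned frame covers the same field as the 1x1 luminance):

    import numpy as np
    from astropy.io import fits

    lum = fits.getdata("L_bin1.fits").astype(float)    # placeholder 1x1-binned luminance
    red = fits.getdata("R_bin2.fits").astype(float)    # placeholder 2x2-binned red

    # Duplicate each binned pixel into a 2x2 block so the red frame
    # lands on the same pixel grid as the luminance frame.
    red_up = np.kron(red, np.ones((2, 2)))

    # Trim both to a common size in case of odd rows or columns.
    h = min(lum.shape[0], red_up.shape[0])
    w = min(lum.shape[1], red_up.shape[1])
    lum, red_up = lum[:h, :w], red_up[:h, :w]

    print(lum.shape, red_up.shape)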

    William.
     
  18. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    William and Doug
    Actually this color image was calibrated, bloom-removed, registered, normalized, sigma-rejected, and combined separately for each channel, all at 1x1 binning, before the color combine. I will try your suggestions of binning, more dithering and increased sub images tonight.
    I will send this over to CCD Ware and get their input as well.
    The four raw images I sent first showed no dark spots. Have we solved this with the shutter adjustment, or do you see something in them that needs attention?
    Jim
     
  19. Doug

    Doug Staff Member

    Joined:
    Sep 25, 2014
    Messages:
    10,316
    Hang on... did you subtract a dark frame from these before combining? That's the basic first step.

    Here's the process for creating decent image stacks:
    1. Take a series of LRGB images, say 10 images per filter. They should be dithered.
    2. Take a series of dark frames, say 10 images total. These have to be the same exposure time and CCD temperature as above - take multiple dark frame sets if the exposure times differ between filters.
    3. Create a master dark frame.
    4. Subtract that master dark frame from ALL of your images. Use the calibration / reduction feature of your software to do steps 3 and 4.
    5. Realign and stack the images for each color band. Use SD Mask if available, or Sigma Clip if not (a rough sketch of steps 2 through 5 follows at the end of this post).
    6. Combine the resulting LRGB color frames.
    Edit: Okay you say you are doing this... if so then there's a processing issue.

    At this point this is NOT a camera issue. Everything looks normal.
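
    For illustration only, here is a bare-bones Python/astropy sketch of steps 2 through 5 for one filter. The file names are placeholders, and it assumes the light frames are already registered - your reduction software handles the alignment and will do all of this for you.

    import glob
    import numpy as np
    from astropy.io import fits

    # Steps 2-3: median-combine the dark frames into a master dark.
    darks = np.stack([fits.getdata(f).astype(float) for f in sorted(glob.glob("dark_*.fits"))])
    master_dark = np.median(darks, axis=0)

    # Step 4: subtract the master dark from every light frame (red filter shown).
    lights = np.stack([fits.getdata(f).astype(float) - master_dark
                       for f in sorted(glob.glob("red_*.fits"))])

    # Step 5: sigma-clipped mean combine - pixels far from the per-pixel median
    # (e.g. hot pixels that dithering moved around) are excluded from the average.
    med = np.median(lights, axis=0)
    sigma = np.std(lights, axis=0)
    keep = np.abs(lights - med) <= 2.5 * sigma
    stacked = np.where(keep, lights, np.nan)
    red_master = np.nanmean(stacked, axis=0)

    fits.writeto("red_stacked.fits", red_master.astype(np.float32), overwrite=True)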
     
  20. Jim Bradburn

    Jim Bradburn Cyanogen Customer

    Joined:
    Jan 9, 2018
    Messages:
    32
    Sorry, there were 12 Lum and 3 each RGB
     

Share This Page