Basic Processing Help with images from STF8300

Discussion in 'My Astrophotos' started by John Cangelosi, Oct 19, 2022.

  1. John Cangelosi

    John Cangelosi Standard User

    Joined:
    Dec 3, 2021
    Messages:
    21
    Hello,
I'm hoping to get some basic help with the image processing workflow. I'm a teacher at a high school and we have a pretty good observatory (12" Newt f/7.8, Paramount ME, STF-8300, autoguider, filter wheel, etc.). I had some good results last year collecting files of M51 over multiple nights (decent autoguided 5-minute exposures), stacking them, and color processing in Photoshop. Our next target (for no particular reason) is the Ring Nebula, and I'm running into some issues with the basics.

    I have included some screen captures of a single 35-second exposure where I used CCDOps5 to 'Auto Contrast' and then manually adjusted the background and contrast to what I thought looked OK. I've also included the original FITS file.

    When collecting the test images (including the attached) earlier this month, I found that my longer exposures made the Ring look really blown out (in the TheSkyX preview that shows right after the image is taken), so I backed my time way down to 4-6 seconds depending on the filter (even though the details of good images I found online said they were collected with pretty long exposures).
    When I tried to stack in DeepSkyStacker I ran into some weird issues, such as the red channel not stacking (not enough stars for that exposure, I assume). So I did some adjusting of the images in MaxIm LT and re-stacked, but I ended up making things worse with the color balance and with trying to make everything uniform (in terms of contrast).
    Upon further reading (the CCDOps manual) it seems like I should just be concentrating on my background readings(?)

    Here are some questions if you have a moment:
    If I can capture a long exposure, should I, even though the object of interest looks blown out and without detail (no darker center of the Ring)? (Should I just strive for a certain background count with my 8300?)

    Should I go through and adjust contrast on all my images before stacking? If so, is there a way to 'batch process' in MaxIm LT or CCDOps5 to make everything uniform?

    To me it looks like people typically use different exposure times for different filters. Is there a way to determine this or is there a conventional method (I use basic LRGB filters)?

    Any information or suggestions would be greatly appreciated!!!
    Thanks,
    John
     

    Attached Files:

  2. William B

    William B Cyanogen Customer

    Joined:
    Jan 8, 2015
    Messages:
    641
    Location:
    Christchurch, Dorset UK
    John.

    From very much an amateur's level...

    You have quite a wide-ranging set of questions here, and you are also operating across different software applications, which makes it difficult to cover everything on a forum dedicated largely to Diffraction Limited software and products. Also, your images are not showing for some reason, although I could download the .FITS image.

    As a Paramount user myself I have some experience with TheSky, so perhaps that's a good place to start.

    When an image is captured and displayed in TheSky's Photo Viewer, a variable automatic screen stretch is applied.

    The screen stretch is calculated from the range of pixel values in that image, so the screen representation will vary from one filter to the next, from one exposure duration to the next, and from one target to the next.

    Depending on local light pollution conditions, moon phase, and target elevation, which affect the background sky brightness as well as the target, the same target with the same filter and exposure time can result in very different screen-stretch representations of the image during capture. It is important to understand that you can't rely on the screen-stretch representation alone to judge your exposure times.

    Luckily, TheSky's Photo Viewer contains a numerical indicator for pixel ADU (Analogue-to-Digital Units), and with that indicator you can judge the exposure time using a measurement that is unaffected by the screen stretch applied to the image.

    TheSky Photo Viewer.jpg

    After the image is captured in TheSky, click on the Histogram button at the top left of the Photo Viewer window (the button that looks like a bar graph); the lower part of the Photo Viewer will expand to show the histogram display.

    At the bottom right of the histogram display is a drop-down menu, "Method"; if not already selected by default, choose the "Heuristic" option and then adjust the "Scale" and "Highlights" slider tabs to better display the target object for evaluation.

    Now you can measure the pixel values (ADU) by moving the cursor over various objects and locations in the image to determine whether you need to lengthen or shorten the exposure.

    Loading your image "NoAutoDarkClear1x1NGC 672000003863.fit" in TheSky's Photo Viewer and moving the cursor over the brightest star in the frame (top left of centre), TYCHO 2643:823, you will note that the ADU readout for the cursor position at the bottom left of the photo frame, e.g. coordinates 1456,392 (1456 = X, 392 = Y), shows a value of 65,535. That is maximum saturation for the pixel, 65,535 being the largest decimal value a 16-bit binary number can represent.

    Now, if you move the cursor to the brightest part of the Ring Nebula's outer shell, for example at location 2706,1676, the pixel reading is only 3446 ADU, and your exposure time was 36.5 seconds. At that exposure time the brightest part of the Ring Nebula is roughly twenty times below the saturation value.

    You could double the exposure time to 73 seconds and, ignoring for the moment the dark current that is giving an artificially high background ADU in this image, the ADU for the same pixel at those coordinates would double to a predicted ~6892, which is still approximately ten times smaller than the saturation value for this sensor.
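    The scaling above assumes a linear sensor response, so a pixel's ADU reading can be extrapolated to any candidate exposure time. A minimal sketch of that arithmetic, using the values quoted in this thread (3446 ADU at 36.5 s, 16-bit saturation at 65,535):

    ```python
    # Sketch of the exposure-scaling estimate above (values from this thread).
    # Assumes the sensor response is linear and ignores dark current.
    FULL_WELL_ADU = 65535  # 16-bit saturation for the STF-8300 readout

    def predicted_adu(measured_adu, measured_exp_s, new_exp_s):
        """Linearly scale a pixel's ADU reading to a new exposure time."""
        return measured_adu * (new_exp_s / measured_exp_s)

    adu = predicted_adu(3446, 36.5, 73.0)   # brightest part of the Ring's shell
    headroom = FULL_WELL_ADU / adu          # how far below saturation we'd be
    print(round(adu), round(headroom, 1))   # 6892 9.5
    ```

    The same function lets you check any proposed exposure time against the cursor readout before committing a whole night to it.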

    When you take your trial exposures, the balancing act is finding an exposure time that does not saturate the target or leave parts of it buried below the noise level, and that also does not saturate all of the star cores and cause excessive bloating.

    You will find the Auto Dark option in TheSky's Camera Control setup helpful here, as it will give the auto screen stretch better data to work with and allow you to more easily visualise how the final image will appear. The cursor readout over the target will give you an absolute indication of whether or not you are over- or under-exposed.

    Looking at your image, there should be a small barred spiral galaxy just visible to the far right-hand side of the Ring, close to the edge of the frame: IC1296 (PGC 62532, mag 15.39). I think IC1296 is just becoming visible in the noise, while the very faintest of the Ring Nebula's outermost shell is not visible at all. That is to be expected, as the outermost shell is extremely difficult to capture, and an exposure long enough to record it would completely saturate the Ring's brighter shells, as in this image:

    Screenshot 2022-10-21 at 13.15.07.png

    I suspect that with an appropriate dark subtraction you could increase the exposure time to between 60 and 80 seconds for the clear filter, and 90 to 120 seconds for the colour filters; use the cursor pixel ADU readout to measure across the Ring and check that the shells are not saturated.
    Setting as low as 4-6 seconds for your RGB filters was much too short; it was the auto-stretch function of TheSky's Photo Viewer that made the Ring's shell appear blown out.
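    If you want to automate that saturation check across a whole frame rather than probing pixel by pixel with the cursor, a simple scan works. This is a hedged sketch, not anything TheSky provides: `frame` stands in for pixel data you would load from a .fit file yourself (e.g. via astropy), and here it is just a plain 2-D list.

    ```python
    # Sketch: scan a frame for saturated pixels before committing to an
    # exposure time. 'frame' is a placeholder for data loaded from a .fit
    # file; a small synthetic array is used here for illustration.
    SATURATION = 65535  # full well for a 16-bit readout

    def saturation_report(frame, threshold=0.95):
        """Return the peak ADU and the count of pixels at or near saturation."""
        peak, near = 0, 0
        for row in frame:
            for adu in row:
                peak = max(peak, adu)
                if adu >= threshold * SATURATION:
                    near += 1
        return peak, near

    frame = [[1200] * 10 for _ in range(10)]
    frame[5][5] = 65535              # simulate one blown-out star core
    print(saturation_report(frame))  # (65535, 1)
    ```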

    As for adjusting the stretch of the images before stacking: no, this is never done. Images are calibrated and stacked only in the linear state; all stretching and other enhancements are carried out after stacking.

    There are two schools of thought regarding adjusting the exposure times per RGB filter.
    You can set the times to achieve a matching background level as a quick-and-dirty method of determining per-filter exposures, but the results only compute a match for your local sky background, which may be light polluted or affected by moon glow, so the resulting RGB combination will show a neutral sky background and biased object colour rather than the true object colour.

    A far better method is to point the telescope at a known G2V-class star (Sun type) near the zenith and take a set of exposures through each filter in turn, setting an exposure time that achieves a pixel value at the star's core of approximately 50% saturation, so that the camera's sensor and amplifiers are operating in the linear portion of the camera's response curve; for your camera that would be in the ~30,000 ADU region. Calibrate the images and then accurately measure the combined light of all the pixels covered by the G2V star's Airy disk using a photometry program in aperture mode.

    Unfortunately TheSky can't do this; it has no photometry support AFAIK. You would need to use MaxIm, or some other program with photometry support. When you calculate that G2V star's combined output as average or median ADU against exposure time for each filter, which gives you ADU per unit of time (s), you can then determine the ratio between each filter for that known G2V-type star.

    Once you have the ratios determined, it is easy to set an exposure time for the luminance filter that does not saturate the target and then use the calculated ratios to determine the respective exposure times for the red, green, and blue filters. This is a one-time measurement, and the ratios are then constant for any target on your imaging system, so long as the camera, filters, and optics are not replaced with something different.
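    The ratio arithmetic above can be sketched in a few lines. The ADU-per-second figures below are made-up placeholders; substitute the aperture-photometry measurements from your own G2V star test.

    ```python
    # Hedged sketch of the G2V exposure-ratio method described above.
    # The flux values are illustrative placeholders, not real measurements:
    # replace them with your own per-filter ADU/s from aperture photometry.
    g2v_flux_adu_per_s = {
        "L": 4200.0,
        "R": 1500.0,
        "G": 1750.0,
        "B": 1400.0,
    }

    def exposure_times(lum_exposure_s, flux):
        """Scale each filter's exposure so every channel collects the same
        total signal from a G2V (solar-type) star as the luminance channel."""
        lum_flux = flux["L"]
        return {f: lum_exposure_s * lum_flux / v for f, v in flux.items()}

    for f, t in exposure_times(60.0, g2v_flux_adu_per_s).items():
        print(f"{f}: {t:.0f} s")
    ```

    With these placeholder numbers, a 60 s luminance exposure would call for 168 s red, 144 s green, and 180 s blue; the ratios stay fixed until the camera, filters, or optics change.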

    A final couple of observations:

    The Ring might appear "blown out", as you described, in the colour filters because of the differing proportions of luminance versus chrominance that the filters pass. In a filtered red image only the structure in the red band will be visible; the contribution to structure that the green and blue components gave in the clear filter will be missing.
    This lack of detail gives the impression that the object is over-exposed through a colour filter, and is why the cursor's numerical ADU readout is more important for judging exposure times.
    Once the three RGB channels are re-combined, the fine structure is revealed once more.

    Lastly, for this target the detail in the gas shells is very fine and seeing conditions have to be quite good to resolve them. I can see some slight tracking issues too; the stars are slightly oval, which is slightly blurring the fine detail you might otherwise expect to see.

    That's my take on the subject, hopefully there will be some other contributions with a more expert explanation than my own limited experience can provide.

    HTH

    William.
     
  3. John Cangelosi

    John Cangelosi Standard User

    Joined:
    Dec 3, 2021
    Messages:
    21
    Thank you William! This is incredibly useful and I'll definitely be using your reply as a frequent reference.
    I did an imaging run last night (before seeing your reply) with a 100-second exposure (but binned 2x2), so I have some better images to work with. After reading your reply today, I did the ADU analysis in TheSky's Photo Viewer and that was very helpful. There is plenty of room in the nebula (the highest value is around 19K), but there are some stars that are maxed out, so I'll scale the next session back to your suggestions and analyze from there. I also quickly stacked and color combined the images from last night's short session and things look better, but I'm looking forward to collecting more files correctly to improve the result.
    Thanks again!
    -John
     

    Attached Files:

  4. William B

    William B Cyanogen Customer

    Joined:
    Jan 8, 2015
    Messages:
    641
    Location:
    Christchurch, Dorset UK
    John.

    The latest image looks better for exposure but the colour channels are incorrect, see attached extract below.

    Make a visual check of the filter assignments in your filter wheel setup. You can do that without removing the camera by selecting each filter in turn and triggering an exposure of several minutes.
    This forces the wheel to rotate to the chosen filter; while the camera shutter is still open, shine a flashlight into the front of the OTA to illuminate the filter wheel, and as you peer down the OTA you will be able to see the colour of the selected filter reflected back from the camera sensor behind the filters.

    For interference-type filters, the strong colour reflected from the front of the filter is the complementary colour, while the weaker colour reflected back from the sensor while the shutter is open is the pass colour of the filter.

    If your filters are older absorption filters the colour reflected back will be primarily the same for both sides of the filter.

    The outermost shell of the Ring should be red and the heart of the nebula blue, with mainly blue-white between the two and a thin wisp of golden yellow adjoining the outer red ring. Your image, however, has a green outermost shell with a red heart and golden-yellow between, so either your filters are loaded in the wheel in a different order than you think, the filter designation is incorrect in the capture program, or, if you are combining the colour channels manually in post-processing, the wrong image stacks were assigned to the respective colour channels.

    I extracted the RGB channels from your image and recombined them correctly as shown in the attached image.

    Be careful also not to clip the black level so hard as the faintest wisps of outer shell of the Ring and any background galaxies or dust in the FOV will be lost.

    William.

    TestRing.jpg
     
    Last edited: Oct 22, 2022
  5. John Cangelosi

    John Cangelosi Standard User

    Joined:
    Dec 3, 2021
    Messages:
    21
    William- I can't tell you how helpful this is! As you called it, my filters were out of order. I was able to verify that with the described method and as a double check I assembled a quick color image. When I compare it to your example (thank you for including that) I'm more hopeful that things are in the correct place.
    Thank you so much.
    -John
     

    Attached Files:

