basic help

Discussion in 'Image Processing' started by Gregory Farley, Jun 4, 2016.

  1. Gregory Farley

    Gregory Farley Cyanogen Customer

    Joined:
    Jun 4, 2016
    Messages:
    4
    I am running a demo version that is supposed to be a full edition. I have been using Images Plus for two years, and I am planning to upgrade from a DSLR to a QSI camera next week, so I wanted to understand the program beforehand. I am using LRGB images from a different source and I cannot get them to stack. I follow the tutorial, and on the rare occasion (maybe 10% of the time) that I end up with a Group image, it is not in color. So why do I not end up with a final image? I always get errors, but no images are marked with a red "x". The PinPoint Astrometry window opens, and I am not sure what to do with that. Why are my images monochrome instead of color? Is the tutorial missing steps?
     
  2. Colin Haig

    Colin Haig Cyanogen Customer

    Joined:
    Oct 27, 2014
    Messages:
    845
    For the community to help, it would be good to have a bit of info, and maybe one of us can explain a couple of things.
    It would be good to know what errors you are encountering.

    It might help to upload your sample images (see the Upload a File button next to Post Reply).
    What are the source images? What kind of camera, filters, and optics? What's in the field?
    e.g. 4 monochrome CCD images taken through each of the L, R, G, B filters, exposure length X, with darks, flats, and bias frames.
    Or were these DSLR images filtered?
    Are they FITS format? Or CR2 or ???
    Is it a star field, planet, nebula, galaxy you are working on?
    Are the coordinates embedded in the FITS header?

    Are you using Combine Color to take the LRGB FIT files and convert them to one color image? That might be what you want to do.
    If, on the other hand, you are trying to Color Stack, then the images all need to be aligned properly, which means each image must have accurate position data. So when you mention PinPoint, what is likely going on is that MaxIm is using the PinPoint LE plate-solving engine to work out where in the sky the image was taken (RA, Dec), because that data isn't in the FITS header. PinPoint needs to be set up with a stellar position catalog such as GSC 1.1 or better.

    When you ask why your images are not in color, I suspect you think each of the source images should show up as Red, Green, Blue on screen. But really, when you think about the data, they are just mono images. No point in displaying them in color until you combine them.
    Unless, of course, you started with a one-shot color camera or a DSLR image, which has a Bayer mask over the sensor (1 red, 1 blue, and 2 green pixels in a repeating pattern). In that case, your red-filtered image still has color data, so you need to follow a different workflow.
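    The Bayer pattern described above can be sketched in code. This is a generic illustration of the repeating filter layout (not MaxIm DL's actual debayering), assuming the common RGGB arrangement:

    ```python
    # Sketch of an RGGB Bayer mosaic: each 2x2 cell holds
    # one red, two green, and one blue sample.
    def bayer_color(row, col):
        """Return which color filter sits over pixel (row, col) in an RGGB mosaic."""
        if row % 2 == 0:
            return "R" if col % 2 == 0 else "G"
        else:
            return "G" if col % 2 == 0 else "B"

    # Print the repeating 4x4 corner of the pattern.
    for r in range(4):
        print(" ".join(bayer_color(r, c) for c in range(4)))
    ```

    Each 2x2 cell carries twice as many green samples as red or blue, which is why raw DSLR frames need a demosaic step before they look like a normal color image.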

    The difference between programs like Images Plus and MaxIm DL is a bit like the difference between Art and Science. MDL is very analytic, and relies on good data and precision. IP is a bit more fun for making pretty pictures.
    So, you kind of need to "forget" how IP works when using MaxIm.
     
  3. Gregory Farley

    Gregory Farley Cyanogen Customer

    Joined:
    Jun 4, 2016
    Messages:
    4
    I am using a Nikon D600 for the photos that I take with my telescope. All the images I take with the Nikon are in .nef (RAW) format. However, since I am looking at acquiring a QSI 683 Mono with a filter set, I have been playing with LRGB images that I get from Slooh.com in Images Plus. I would like to make the move to MaxIm DL Pro when I get the new camera, so I wanted to use this time to get acquainted with the program. I have already been able to get the program set up with the telescope and the Nikon camera (no actual photos yet, because my guide camera failed). I have uploaded a set of .FITS files that I have been using.

    How do I get the data for PinPoint? Is it not included? I know this is a separate problem for a different thread, but I will be trying to tie in my Starry Night Pro with MaxIm as soon as I figure out the POTH thing. Living in the Midwest, I do not have a lot of locals I can talk to for help, so any you can give is greatly appreciated.

    I will try to process some of my DSLR photos today to see if I can get that to work. I was just following the tutorial steps for processing photos (Open File, Brightness and Contrast, Stacking, Filtering, Stretching...). Maybe that is not the order to follow for LRGB photos? I found a "Best Practices" list for MaxIm and it has basically the same order.


    Thank you again, gratefully...
    Gregory
     

    Attached Files:

  4. Colin Haig

    Colin Haig Cyanogen Customer

    Joined:
    Oct 27, 2014
    Messages:
    845
    Here are two output images, done two different ways. There are other variations you can play with.
    One is done with Color Stacking: the 2 luminance images set to white, and the r, g, b images as red, green, blue.
    The other is done with Combine Color.
    Since I don't have any info on the relative transmissivity of the SLOOH filters, I left it with 1,1,1 coefficients. Note that means that "esthetically" the images won't look pretty, but you'll see what I mean if you click the images posted here.
    In other words, usually you need a longer blue exposure than red, as the chip is usually less sensitive in blue. But, these are all the same length exposures, and we don't know the details of how much light gets through each filter and how sensitive the camera is.
    Normally when stacking, you'd take like 10 images of each color, and stack each of them, then color combine the resultant r,g,b for example.
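    The stack-then-combine workflow can be sketched numerically. This is a toy example (not MaxIm's actual stacking algorithm) assuming simple average stacking of equal-length subframes, with made-up pixel values:

    ```python
    # Toy stack-then-combine: average several subframes per filter,
    # then assemble the averaged channels into RGB triples.
    def average_stack(frames):
        """Average a list of equally-sized frames (flat lists of pixel values)."""
        n = len(frames)
        return [sum(px) / n for px in zip(*frames)]

    # Three noisy 4-pixel subframes per filter (made-up numbers).
    red_subs   = [[100, 102, 98, 100], [101, 99, 100, 100], [99, 99, 102, 100]]
    green_subs = [[50, 52, 48, 50],    [51, 49, 50, 50],    [49, 49, 52, 50]]
    blue_subs  = [[20, 22, 18, 20],    [21, 19, 20, 20],    [19, 19, 22, 20]]

    r = average_stack(red_subs)
    g = average_stack(green_subs)
    b = average_stack(blue_subs)

    # Combine into RGB triples, one per pixel.
    rgb = list(zip(r, g, b))
    print(rgb[0])  # first pixel's (R, G, B)
    ```

    The point is the order of operations: stack within each filter first to beat down the noise, then combine the cleaned-up channels into one color image.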

    As the SLOOH images are very close, I was able to just hit the Align... button, pick the first L as the reference, and then use Auto - Star Matching.

    The final FITS were a bit big, so here's jpg versions.
    colorstack.jpg m16_20160516_031XXX_X_XXXX_LRGB.jpg

    Later, if you want to play with PinPoint (e.g. if you do want to get the coordinates solved), you'll need a fast internet connection to download one of the catalogs.
    For example, GSC 1.1: http://cdsarc.u-strasbg.fr/ftp/cats/bincats/GSC_1.1/
    On my PC, I set up a C:\stardata directory, and put all the catalogs underneath e.g. C:\stardata\GSC11
     
  5. Colin Haig

    Colin Haig Cyanogen Customer

    Joined:
    Oct 27, 2014
    Messages:
    845
    As far as the POTH thing goes, it's pretty simple.

    Program 1 --> POTH --> ASCOM Telescope Mount --> your scope on a COM port.
    Program 2 --> POTH
    Each program thinks it is talking to your scope.

    Set up ASCOM and the telescope ASCOM driver e.g. Meade LX200.
    Check from Starry Night that it will talk.
    e.g. Starry Night telescope mount set to ASCOM, and you choose your telescope and set its properties.
    After you know that works, change SN --> telescope mount set to ASCOM POTH.
    Set the ASCOM POTH properties to point to the Meade LX200 driver.
    Then in other programs, e.g. Program 2 --> telescope mount set to ASCOM POTH.
    I just recommend driving the scope from one program.

    Have fun!
     
  6. Gregory Farley

    Gregory Farley Cyanogen Customer

    Joined:
    Jun 4, 2016
    Messages:
    4
    First and foremost, thank you for your help. I went in and tried the Color Stacking on the same set of images and got the attached photo. I did play around with some of the other features (Kernels, Color, Levels, etc.) before settling on the attached image. At least now that I have an image, I can start playing with the other features and tools to figure them out. Thanks again. I will try connecting POTH on Tuesday when I return from work.

    Not to take up too much of your time, but you said "10 images of each color". I understand the importance of multiple images, but I often have a difficult time figuring out exposure time. Typically I have borrowed from my photography playbook and taken exposures of differing lengths to capture the different light levels. I usually max out the exposure time at the point where the details of the object start being lost to the background lighting. Do I need to mess with varying exposure times? Also, do you alternate darks into the sets, or do the darks at the end? Living in the Midwest I may only have one or two good nights every couple of weeks to set up and take photos (and frequently I have to travel 30 min to 1 hr to reach dark sky). If it helps, let's take M51 under rural dark sky (where I can just resolve the Milky Way bands): how would you set up your exposures?

    Thanks, again
    Gregory
     

    Attached Files:

  7. Colin Haig

    Colin Haig Cyanogen Customer

    Joined:
    Oct 27, 2014
    Messages:
    845
    Hey, my pleasure helping you get started. Glad to see you are on the right track.
    Just like on your DSLR, you want to ensure you have as much signal as you can, without swamping the chip or having a really bright sky background. It depends a lot on your optics and sky conditions, as well as the sensitivity of the camera.
    If you've played with the histogram of an image on the DSLR, the idea is to ensure you are making good use of the range. You don't want the brightest stars to be at the maximum brightness possible - e.g. clipped at the upper end.
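    Checking for clipping can also be done programmatically. A hypothetical sketch, assuming a 16-bit image stored as a flat list of pixel values:

    ```python
    def clipped_fraction(pixels, max_adu=65535):
        """Fraction of pixels sitting at the sensor's maximum value (i.e. clipped)."""
        return sum(1 for p in pixels if p >= max_adu) / len(pixels)

    # Made-up frame: mostly mid-range values with a couple of saturated stars.
    frame = [12000, 30000, 65535, 45000, 65535, 8000, 22000, 15000]
    print(f"{clipped_fraction(frame):.1%} of pixels clipped")
    ```

    A handful of clipped star cores is normal; a large clipped fraction means the exposure (or the stretch) is too aggressive.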

    Want to calculate your exposures?
    For maximum exposure:
    You need to know how many electrons fill up the pixel buckets, aka the Full Well Depth, sometimes expressed as a number like 60,000 e-, versus the digitization range (a 16-bit camera goes to 65535). In a perfect camera, every photon turns into an electron, and the A/D gain (e.g. 1 e- per ADU) determines how many electrons per digitized count. Camera designers try to balance the FWD with the system gain so that a full well corresponds to the maximum value, but it's not always the case.
    If you overfill the buckets (the pixels), they will bloom and spill into their neighbours - equivalent to an overexposed image.
    Sometimes people will intentionally do this, to get the most signal from the faint nebulosity or something.
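    The full-well vs. digitization balance above works out with a little arithmetic; a sketch using the example numbers (60,000 e- full well, 16-bit ADC):

    ```python
    # With a 60,000 e- full well and a 16-bit ADC (0..65535 ADU),
    # the system gain that maps a full well onto the ADC's top value:
    full_well_e = 60_000
    adc_max = 65_535

    gain_e_per_adu = full_well_e / adc_max
    print(f"gain ~ {gain_e_per_adu:.3f} e-/ADU")

    # At 1 e-/ADU (a common design point), the well saturates before
    # the ADC runs out of numbers:
    adu_at_full_well = full_well_e / 1.0
    print(f"at 1 e-/ADU, a full well reads {adu_at_full_well:.0f} ADU of {adc_max}")
    ```

    In other words, with these example numbers a pixel saturates physically at 60,000 ADU even though the file format could record values up to 65,535.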

    A better way is to take many images, as the signal-to-noise ratio grows as the square root of the number of frames - e.g. stacking 9 x 10-second images gives you 3x the SNR of a single 10-second image. Plus, you can always throw out the occasional airplane that flies across the image ;-) And it's easier to get a shorter image.
    The other thing is you can't always see the object in your image until you stack a bunch of them.

    You can actually work it all out mathematically, but then your results will vary in practice.
    For example, your chip is most sensitive (look for the phrase QE or Quantum Efficiency) in certain wavelengths, and there is usually a manufacturer's data sheet that says something like 70% QE for reds around 680nm, and 60% QE green at 560, and 40% blue at 450.
    So, on QE alone, blue will take 70/40 times as long to match the red exposure level.
    And if your blue filter only transmits 50% versus the red filter permits 80% to get through, then you have to multiply that up.
    70/40 * 80/50 = 2.8x the red exposure.
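    That exposure-ratio arithmetic, transcribed directly into code using the example QE and transmission numbers above:

    ```python
    # Relative exposure needed for blue to match red, given:
    qe_red, qe_blue = 0.70, 0.40   # sensor quantum efficiency per band
    t_red, t_blue = 0.80, 0.50     # filter transmission per band

    blue_over_red = (qe_red / qe_blue) * (t_red / t_blue)
    print(f"blue needs {blue_over_red:.1f}x the red exposure")
    ```

    The same pattern extends to green, or to any other filter, by substituting that band's QE and transmission figures from the manufacturer's data sheets.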

    Another issue is that a bigger, faster scope (larger diameter, faster f/ratio) takes less time for the exposure, but brings up the sky background brightness pretty fast too. I have an 11" f/2.2 astrograph, but my skies are crappy, so I am limited to less than 30-second exposures without filters.
    With narrowband filters, you can go a lot longer, but you are only getting a small % of the available light.
    My recommendation would be start in the back yard and have low expectations. ;-)
     
  8. Colin Haig

    Colin Haig Cyanogen Customer

    Joined:
    Oct 27, 2014
    Messages:
    845
  9. Gregory Farley

    Gregory Farley Cyanogen Customer

    Joined:
    Jun 4, 2016
    Messages:
    4
    Thanks, now we are talking my language (I am a math guy). I will check out the webpage. All I have been able to do is read about this stuff and when the sky is good, set everything up to put it to practical use.

    Thanks again,
    Gregory
     
  10. Colin Haig

    Colin Haig Cyanogen Customer

    Joined:
    Oct 27, 2014
    Messages:
    845
  11. Nick Bernier

    Nick Bernier Cyanogen Customer

    Joined:
    Sep 9, 2016
    Messages:
    2
    Location:
    Georgia, USA
    Hi Colin,
    I am connecting StarryNight, and MaxIm-DL/MaxPoint to my ASCOM telescope driver. MaxPoint said to use it as the hub instead of POTH, which seems to work ok (StarryNight and MaxIm-DL point to MaxPoint hub which then connects to ASCOM driver).

    Questions:
    Should I use MaxPoint or POTH as my hub?
    I just purchased FocusMax and was wondering if it should also be connected to MaxPoint hub?

    Thanks for your help!

    Regards,
    Nick Bernier
     
  12. Doug

    Doug Staff Member

    Joined:
    Sep 25, 2014
    Messages:
    2,989
    If you are running MaxPoint, then use that as your hub. POTH is redundant.
     
  13. Nick Bernier

    Nick Bernier Cyanogen Customer

    Joined:
    Sep 9, 2016
    Messages:
    2
    Location:
    Georgia, USA
    Hi Doug,
    That's what I thought, but wanted to make sure. Thanks for the quick response!

    Regards,
    Nick
     
    Last edited: Sep 21, 2016
