TIME-OBS accuracy

Discussion in 'CCDOPS and SBIG Universal Driver (Retired)' started by CraigNZ, Sep 1, 2015.

  1. CraigNZ

    CraigNZ Cyanogen Customer

    Joined: Dec 29, 2014
    Messages: 93
    Location: Ngutunui, New Zealand
    STF-8300M camera.

    I am doing some drift scan studies and I need to know the exact start (or end) of an exposure (down to a few msecs).

    I noticed the line TIME-OBS = '10:33:46.479' in the FITS header. Does this mean the time is known to the microprocessor to 1 msec? If so, how would I then set the microprocessor time to 1 msec accuracy with a GPS? Given the delays across the PC/USB/microprocessor chain I can see all sorts of delay issues.
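
    For reference, this is how I am dumping the header keywords (a minimal Python sketch, assuming astropy is installed; 'drift0001.fit' is just a placeholder filename):

        # Minimal sketch: print the timing-related keywords from a FITS header.
        from astropy.io import fits

        with fits.open("drift0001.fit") as hdul:   # placeholder filename
            hdr = hdul[0].header
            for key in ("DATE-OBS", "TIME-OBS", "EXPTIME"):
                print(key, "=", hdr.get(key))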

    Another method: if there is some signal (maybe the Guide connector signals?) that indicates precisely when the integration begins (or ends), I could monitor it with an external analyzer and superimpose GPS timing on it to precisely determine the start (or stop) time of the exposure.
     
  2. CraigNZ

    CraigNZ Cyanogen Customer

    Thinking some more about this, I suspect TIME-OBS is written into the FITS header by the SkyX software. The question then is what time is used: it could be the PC clock, which can report in msec, or it could come from the microcontroller as part of the protocol between the driver and the camera. So I guess we need to start with the source of this time.
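
    If it is the PC clock, it can at least be read to the millisecond and below; something like this (Python sketch) prints it in the same HH:MM:SS.mmm form as TIME-OBS:

        # Minimal sketch: read the PC clock at sub-millisecond resolution.
        # Resolution is not accuracy: the absolute error still depends on
        # how well the clock is disciplined (NTP, GPS, etc.).
        import datetime

        now = datetime.datetime.now(datetime.timezone.utc)
        print(now.strftime("%H:%M:%S.%f")[:-3])   # e.g. 10:33:46.479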
     
  3. Doug

    Doug Staff Member

    Joined: Sep 25, 2014
    Messages: 9,932
    The software writes it. You would have to ask Bisque what they do.

    FYI, MaxIm DL has a shutter delay calibration procedure, which gives you meaningful precision. It will only record the time of observation to high precision if you use that feature; otherwise the extra digits are meaningless and are suppressed.

    Also, I would recommend using a GPS to maintain PC clock synchronization.
     
  4. CraigNZ

    CraigNZ Cyanogen Customer

    Hi Doug,
    Looks like I will not be able to use the TIME-OBS parameter because of all the delays between the software writing the timestamp and the actual moment charge is recorded in the sensor.

    When integration starts inside the camera (when it begins accumulating electron charge in each pixel), is there any external signal (or noise?) that indicates this precise instant? My thought is that if there is, I can correlate it to an exact time using a GPS time reference.

    Alternatively, if there is an exact fixed delay (e.g., 43 msec) between the time the driver sends a 'start' exposure and when the chip begins collecting charge I could use that.
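
    If that delay turns out to be fixed, the correction itself is trivial; a minimal Python sketch, where the 43 msec figure is only an example and start_cmd_utc would be the PC time logged when the driver call is issued:

        # Minimal sketch: shift a logged command time by a fixed, empirically
        # measured start-of-integration delay. 43 msec is a hypothetical value.
        import datetime

        SHUTTER_DELAY = datetime.timedelta(milliseconds=43)  # measured per camera

        def true_exposure_start(start_cmd_utc):
            """PC time of the 'start exposure' call -> estimated time the
            chip actually began accumulating charge."""
            return start_cmd_utc + SHUTTER_DELAY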
     
  5. Doug

    Doug Staff Member

    I can't easily come up with that value.

    I suggest you try the MaxIm DL demo and use the shutter delay measurement feature to determine it empirically. The delay should be the same in other software.
     
  6. CraigNZ

    CraigNZ Cyanogen Customer

    I can see how that works and will write an equivalent program. The interesting thing will be how consistent the latency is. I can see where Windows might cause delays (especially when it decides to pop up something like a Java software update), resulting in a wide variance in latency.
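
    To get a feel for the jitter I will collect a batch of latency samples and look at the spread; a rough Python sketch of the idea, with the driver call and the external GPS-timed detection stood in for by a simulated random delay:

        # Minimal sketch of the latency test. The real version would issue the
        # driver's start-exposure call and detect the shutter/integration event
        # with GPS-timed hardware; here that round trip is only simulated.
        import random
        import statistics
        import time

        def one_trial():
            t_cmd = time.perf_counter()
            time.sleep(0.043 + abs(random.gauss(0, 0.002)))  # simulated delay + OS jitter
            t_start = time.perf_counter()
            return (t_start - t_cmd) * 1000.0  # latency in msec

        samples = [one_trial() for _ in range(50)]
        print(f"mean {statistics.mean(samples):.1f}  stdev {statistics.stdev(samples):.2f}  "
              f"min {min(samples):.1f}  max {max(samples):.1f}  (msec)")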

    A couple of other ideas:

    a) connect the TTL 1-second (PPS) output from a GPS time reference to a small gating circuit; when an exposure is started, the gate opens for one pulse. The pulse would activate a push solenoid that 'taps' the camera, producing a glitch on the image that can be precisely measured (in a drift scan). The glitch would mark the second precisely.

    b) connect the output from above to the DEC guide input of the drive. When the pulse comes through, the dec motor would turn a very small amount (a pixel or so) and produce a 'step' in the drift scan, which again could be measured precisely and would represent a 1 second time mark.

    Ideally the camera would output a TTL signal indicating the start/stop of an integration. That signal could then be captured with a microprocessor, using a counter and a GPS time reference, to determine the exact moment of integration start and stop.

    With drift scanning and the motor off, the rate of travel across the pixels can be accurately determined. What is needed is some sort of indicator on the image to reference as a time mark. I suspect the declination 'step' could be very accurate, since the latency would be repeatable, depending on how the signal at the guide port is translated into motor steps.
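
    Converting a measured pixel offset back to time is then simple arithmetic; a Python sketch, where the plate scale and declination are example numbers rather than my actual setup:

        # Minimal sketch: with the drive off, a star drifts at the sidereal
        # rate scaled by cos(dec). A feature offset in pixels from a known
        # tick converts directly to a time offset. Example values only.
        import math

        PLATE_SCALE = 1.55        # arcsec/pixel (example)
        DEC_DEG = -35.0           # example declination
        SIDEREAL = 15.041         # arcsec of RA per second of time, at the equator

        drift = SIDEREAL * math.cos(math.radians(DEC_DEG))   # arcsec/sec on the sky
        px_per_sec = drift / PLATE_SCALE

        offset_px = 3.7           # measured offset of the glitch/step (example)
        print(f"time offset = {offset_px / px_per_sec * 1000:.1f} msec")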
     
