I will give the per-session bias a go. However, how will that affect my darks, since each dark carries a different bias? With enough darks, would the difference show, or would the dark current noise swamp it out and essentially treat the bias as a relatively constant offset? The important part of the bias is the amp glow, or what appears to be amp glow. That shows very clearly in my uncalibrated stacks, and luckily it is reliably corrected across the datasets.

Also, to an earlier question: is the longer download time in MaxIm DL vs. CCDOPS normal? CCDOPS reports a rate of 7.8 MP/s, close to the stated 10 MP/s, but MaxIm DL takes about twice as long. That makes me suspect either the handshake between the driver and MaxIm is slower than in CCDOPS, or the default digitization rate in CCDOPS is faster than MaxIm's "Normal Exposure" digitization rate. That is why I wonder whether the bias frames from CCDOPS are comparable to the MaxIm DL ones: a faster digitization rate should mean more read noise in the bias, but I measured no difference in read noise between the two.

I will upload a night of OIII data and the corresponding calibration frames, since OIII shows the banding most clearly in my IC443 image. That way you have as much data as you see fit and can download just what you want.

Is the read noise mainly Johnson noise in the FETs and amplifiers, plus whatever the type of CDS used contributes? Detectors are among my favorite things to study, so I am curious. Metrology is what I am considering doing after my degree, so I am more obsessed with characterizing my camera than maybe I should be.
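For anyone wanting to repeat the read-noise comparison: a common way to estimate read noise is to difference two bias frames taken back to back, so the fixed pattern (offset and the amp-glow-like structure) cancels, then take std(B1 - B2) / sqrt(2). A minimal Python sketch with synthetic pixel data standing in for real frames (the function name and the noise values are mine, just for illustration):

```python
import math
import random
import statistics

def read_noise_adu(bias1, bias2):
    """Estimate read noise (in ADU) from two bias frames of the same sensor.

    Frames are given as flat lists of pixel values. Differencing cancels
    the fixed-pattern component (offset, amp glow); the remaining scatter
    is sqrt(2) times the single-frame read noise.
    """
    diff = [a - b for a, b in zip(bias1, bias2)]
    return statistics.pstdev(diff) / math.sqrt(2.0)

# Synthetic example: 1000 ADU offset, 10 ADU read noise per frame.
rng = random.Random(0)
b1 = [1000 + rng.gauss(0, 10) for _ in range(20000)]
b2 = [1000 + rng.gauss(0, 10) for _ in range(20000)]
print(read_noise_adu(b1, b2))  # should come out close to 10 ADU
```

Running this once per capture program (CCDOPS bias pair vs. MaxIm DL bias pair) gives directly comparable numbers, independent of any offset difference between the two programs.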