Player One SDK 3.6.1 gain issue #887

Closed
ijessen opened this issue Jan 23, 2024 · 6 comments
Labels: bug (Something isn't working)

Comments

ijessen (Contributor) commented Jan 23, 2024

Describe the bug
The full gain setting does not have the expected effect of producing 0.003 e-/ADU.

driver: indi_playerone_ccd v1.13
library: libPlayerOneCamera.so.3.6.1

Imaging results suggest the actual gain is something like 190 or less (no less than 0.125 e-/ADU).

It's unclear to me whether the issue is related to the driver, the SDK, or the camera hardware. My instinct says SDK, but I figured I'd open the issue here since I know both @PlayerOneAstronomy and @hiro3110i may be able to chime in.

Update: the cause is definitely the update to SDK 3.6.1.

To Reproduce

See the updated reproduction steps in a later comment.

Exact steps to reproduce the behavior:
1. Pull the latest code, then build and install both the library and the driver
2. Run the driver, then launch and connect Ekos
3. Set the gain to 550
4. Do not touch the offset (left at the out-of-the-box default value of 20)
5. Enable LRN mode (not sure if this is necessary, but that was the mode I was shooting in)
6. Confirm "auto-gain" is disabled
7. Image in 16-bit RAW mode

Expected behavior
Images should reflect 0.003 e-/ADU by exhibiting a minimum step size between differently illuminated pixels of 333 ADU (with expected variance due to fixed-pattern and read noise). Instead, examination of the resulting images puts a ceiling on the minimum step size of only 8 ADU, corresponding to a minimum of 0.125 e-/ADU.
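For reference, the arithmetic relating conversion gain to the minimum step between illumination levels:

```
expected:  0.003 e-/ADU          =>  1 e- / 0.003 e-/ADU ≈ 333 ADU per electron (minimum step)
observed:  minimum step ≤ 8 ADU  =>  conversion gain ≥ 1 e- / 8 ADU = 0.125 e-/ADU
```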

Screenshots
The following screenshot shows statistics for a small selection of pixels within the vignetted corner of a 30" image. This image was shot at 550 gain and 20 offset. Testing shows that each offset unit corresponds to an 8 ADU increase in bias, so the minimum value of each pixel in this scenario (absent noise) is 160 (confirmed by darks). Note that the mean and median values in this selection are 168, the min 161, and the max 175. This would make sense for pixels being illuminated by a small number of photons, where the e-/ADU value is anywhere above 0.125. An e-/ADU of 0.125 would correspond to most pixels in this selection being illuminated by a single photon, some by two, and some by none.

[Screenshot: pixel statistics for a selection in the vignetted corner]

The next screenshot shows statistics for a small selection of pixels at the heart of a bright star (located below the statistics window) from the same image as above. The max pixel value within this star is 702. If the gain of 550 were in fact applied, that would correspond to maybe 2 photons received from this bright star in 30". There would also be no explanation for the appearance of gradients and other signal (see the nearby galaxies?) at brightnesses between the heart of this star and the background.
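The arithmetic behind that estimate:

```
(702 ADU peak - 160 ADU bias) × 0.003 e-/ADU ≈ 1.6 electrons collected in 30"
```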

[Screenshot: pixel statistics at the core of a bright star]

The FITS header confirms the gain setting of 550.

[Screenshot: FITS header showing the gain setting of 550]

Finally, the raw FITS file is attached.

M_86_Light_427.zip

ijessen added the bug label Jan 23, 2024
ijessen (Contributor, Author) commented Jan 23, 2024

I played around a little - this is in fact a bug introduced with the update to SDK 3.6.1 and it's easily reproducible.

Reproduction:

  1. Pull, build, and install the latest driver / SDK version
  2. Launch INDI and set the camera gain to max (550 for the Poseidon-C) via some client
  3. Kill and restart INDI. This has to be a full teardown of indiserver; restarting the driver alone is not sufficient. Note that when INDI restarts and the driver is connected, the previous gain setting will be restored from the saved config
  4. Launch Ekos and take an image. Note that pixel stats do not reflect the lowest possible e-/ADU, which should accompany the max gain
  5. Set the camera gain to 0 and take another image. Note that pixel stats don't appreciably change, suggesting that the previous image was taken at gain 0 (it was)
  6. Change the camera gain back to max and take an image. Note that pixel stats now reflect the expected e-/ADU

You can swap the older (3.6.0) SDK back in by relinking the library (without changing the driver version), and the anomalous behavior goes away.

I connected a debugger and confirmed that the INDI driver (v1.13) is calling the SDK to set the gain value after restoring the saved config. In the case of SDK 3.6.0 this has the expected effect. However, in the case of SDK 3.6.1, this first call doesn't change the camera gain (which is initialized to 0). Subsequent calls do, though.

N.B. the driver doesn't make any calls to the SDK unless you change the gain to a different value (see here); simply setting the gain to the current property value is a no-op.
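Roughly, the guard looks like the following. This is a simplified, standalone sketch with made-up names, not the actual driver code:

```cpp
// Standalone sketch of the "only call the SDK on change" behavior described above.
// Names are illustrative; see the linked driver source for the real implementation.
#include <iostream>

static double currentGainProperty = 550.0;   // value restored from saved config at startup

// Hypothetical stand-in for the driver path that ends in POASetConfig().
static bool sendGainToSDK(double gain)
{
    std::cout << "SDK call: set gain to " << gain << "\n";
    return true;
}

static bool updateGain(double requestedGain)
{
    // Setting the gain to the value the property already holds is a no-op:
    // no SDK call is made, so the camera keeps whatever gain it actually has.
    if (requestedGain == currentGainProperty)
        return true;

    currentGainProperty = requestedGain;
    return sendGainToSDK(requestedGain);
}

int main()
{
    // After the restart in the repro, the property says 550 but (due to the bug)
    // the camera is actually at gain 0. Re-entering 550 in a client changes nothing:
    updateGain(550.0);   // no-op, no SDK call
    updateGain(0.0);     // value changed -> SDK call
    updateGain(550.0);   // value changed -> SDK call, gain finally takes effect
    return 0;
}
```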

ijessen changed the title from "Player One Poseidon-C gain setting" to "Player One SDK 3.6.1 gain issue" Jan 23, 2024
ijessen (Contributor, Author) commented Jan 23, 2024

Root cause identified.

The new SDK exposes capabilities to enable auto mode for gain and the three white balance channels. Existing driver code, which is now triggered by these new capabilities, treats the associated numeric values as floats when, in fact, all of the existing auto-able config parameters (gain, WB_R, WB_G, WB_B) are integer-valued (see here).

When the auto gain switch property is updated at connection time (after the numeric property has already been processed), the associated gain value is sent as well. The type error results in a garbage value being sent to the camera, which is interpreted as a 0.

I'll open a PR with the extremely simple fix, but if future revisions of the SDK ever introduce auto capabilities for float-type values, new logic will be required.
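A minimal sketch of the mix-up, assuming the SDK's POAConfigValue union (long intValue / double floatValue / POABool boolValue) and the POASetConfig(cameraID, configID, value, isAuto) signature declared in PlayerOneCamera.h. This is illustrative only, not the driver's actual code:

```cpp
// Illustrative only -- not the indi_playerone_ccd source. Assumes the PlayerOne SDK
// header and the POA_GAIN / POA_FALSE identifiers it defines.
#include <PlayerOneCamera.h>

void applyGain(int cameraID, double numPropertyValue)   // e.g. 550.0 from the INDI number property
{
    POAConfigValue confVal;

    // Buggy path: gain is an integer-valued config, but the value was written to the
    // union's float member. The SDK reads the long member instead, sees a garbage bit
    // pattern, and the camera ends up at gain 0.
    // confVal.floatValue = numPropertyValue;

    // Fixed path: integer-valued configs (gain, WB_R, WB_G, WB_B) go through intValue.
    confVal.intValue = static_cast<long>(numPropertyValue);

    POASetConfig(cameraID, POA_GAIN, confVal, POA_FALSE /* manual, not auto */);
}
```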

ijessen added a commit to ijessen/indi-3rdparty that referenced this issue Jan 23, 2024
PlayerOneAstronomy commented:

@ijessen
Sorry for this problem.
I have forwarded this issue to our engineer, and he is investigating the cause of the problem.

knro (Collaborator) commented Jan 24, 2024

Thank you for the PR, @ijessen. Can you recheck?

hiro3110i (Contributor) commented:
@ijessen Thank you for fixing the issue.
Is this a compatibility issue with SDK v3.6.1?
Did the previous version not have this issue?
I'll check it in my environment.

ijessen (Contributor, Author) commented Jan 24, 2024

@PlayerOneAstronomy no problem - the bug wasn't in the SDK. I will say, though, that I would have expected the library call (POASetConfig) to have returned an error code (POA_ERROR_INVALID_CONFIG) when presented with the bad config input, rather than failing silently.

@hiro3110i at least for my Poseidon-C the issue is only triggered by the update to SDK 3.6.1. The issue doesn't exist when I downgrade the SDK to 3.6.0 (while leaving the driver version at 1.13). The new SDK exposes auto gain for the Poseidon-C, which in turn triggers the driver code path with the bug.

@knro looks good. I built and installed the driver from master, and the issue is resolved. One question from me (since I'm by no means a C++ expert): why doesn't the compiler balk at confVal.intValue = num.value? Shouldn't it require an explicit cast from the double to the long? Elsewhere in the driver the same operation uses an explicit static_cast.
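For concreteness, here is the assignment boiled down to a standalone snippet (nothing driver-specific):

```cpp
// Standalone illustration of the double-to-long assignment in question.
int main()
{
    double num  = 550.0;
    long   gain = 0;

    gain = num;            // compiles: implicit floating-to-integral conversion (truncates);
                           // no diagnostic by default, though -Wconversion will warn
    // long gain2{num};    // brace initialization, by contrast, rejects this narrowing conversion

    return gain == 550 ? 0 : 1;
}
```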

ijessen closed this as completed Jan 24, 2024