iSIM Device Adapter #829
Conversation
This is great!
Are you trying to implement the two-channel imaging for "live" display mode or for an MDA (or maybe both)? From the image.sc thread, I had "live" display in my head for some reason.
To seamlessly create the digital output patterns you describe, you would need to create a StateDevice in your device adapter analogous to DigitalOutputPort in https://github.com/micro-manager/mmCoreAndDevices/blob/main/DeviceAdapters/NIDAQ/NIDAQ.h (or AnalogOutputPort if you use analog outputs). The "OnState" action handler has to implement a couple of interesting things. When you have 3 channels defined in the MDA that differ only in their DigitalOutputPort State, the acquisition engine will send the appropriate sequence to the device (AfterLoadSequence), start the sequence, and then tell the camera to run a sequence (with only the three channels, it will be a sequence of 3 images). After that, you have all the information you need in the device adapter to do what you need to do. Please ask if this does not make sense; it is all not terribly intuitive.
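The load-then-start flow described above can be sketched with a self-contained toy. The `ActionType` names echo Micro-Manager's `MM::AfterLoadSequence`/`MM::StartSequence` action types, but `SequencedPort` and this `OnState` function are illustrative stand-ins, not the actual device adapter classes:

```cpp
#include <vector>

// Illustrative action types, named after the MM::ActionType values involved
// in hardware sequencing.
enum class ActionType { AfterSet, AfterLoadSequence, StartSequence, StopSequence };

// Toy port standing in for a sequenceable StateDevice property.
class SequencedPort {
public:
    void LoadSequence(const std::vector<long>& states) { sequence_ = states; }
    void StartSequence() { running_ = true; }
    void StopSequence() { running_ = false; }
    bool IsRunning() const { return running_; }
    const std::vector<long>& Sequence() const { return sequence_; }
private:
    std::vector<long> sequence_;
    bool running_ = false;
};

// Sketch of an "OnState"-style handler: the acquisition engine first delivers
// the per-frame states, then starts the sequence before triggering the camera.
void OnState(SequencedPort& port, ActionType eAct, const std::vector<long>& data) {
    switch (eAct) {
    case ActionType::AfterLoadSequence: port.LoadSequence(data); break;
    case ActionType::StartSequence:     port.StartSequence();    break;
    case ActionType::StopSequence:      port.StopSequence();     break;
    default: break;  // AfterSet would set a single state immediately
    }
}
```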
It wasn't clear to me on the image.sc thread whether there should be a distinction for when the interleaving occurs, but during testing I realized that the interleaving makes sense only during Multi-D Acquisitions. During Live and Snap, the device adapter currently allows you to illuminate 0, 1, 2, 3, or 4 AOTF lines simultaneously, depending on whether the corresponding MOD IN channel is set to
Thank you very much once again @nicost. I'll take a look at the code and see what I can do. Just a point of clarification: I'm actually using analog outputs for the AOTF signals and not digital outputs because our synthesizer accepts a 0–10 V analog signal whose value determines the laser intensity (more specifically, the diffraction efficiency). But I doubt this matters for making the hardware sequencing work.
Nice work, @kmdouglass! Sorry I'm late to the discussion. I don't have much to add to @nicost's excellent recommendations. If you needed multi-channel during snap/live, there might be a way to hack it by imitating Multi Camera. But the sequenceable state device is much cleaner for MDA in this application.
I can do this for you when merging this PR. We'll just point to the same place as the NIDAQ adapter. (Maybe leave out Makefile.am for now unless actually testing on Linux.) Small reminder: we'll need a license.txt.
Looks very good! Dying to hear if it works as expected!
Indeed. If you want to be able to output different voltages (controlled by the UI), you would need to implement something other than a StateDevice, but if it is all just 0 or 10 V, a StateDevice should be fine.
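For the simple all-or-nothing case, the state-to-output mapping reduces to a small lookup. A minimal sketch, assuming a 4-bit state where each bit enables one MOD IN channel; the per-channel "on" voltage parameter is hypothetical (with a plain StateDevice it would simply be full scale, e.g. 10 V):

```cpp
#include <array>

// Map a 4-bit state (one bit per AOTF MOD IN channel) to per-channel output
// voltages. onVolts holds each channel's assumed "on" voltage.
std::array<double, 4> StateToVoltages(long state, const std::array<double, 4>& onVolts) {
    std::array<double, 4> v{};
    for (int ch = 0; ch < 4; ++ch)
        v[ch] = (state & (1L << ch)) ? onVolts[ch] : 0.0;
    return v;
}
```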
@nicost Does the new Java acquisition engine support hardware sequencing? The sequencing through a state device appears to be working with the Clojure engine, but not with the Java engine. This is against the MockDAQAdapter; I'll test it against the real hardware again soon.
Exposure time clamping has been implemented as well so that (1) we do not exceed the peak-to-peak voltage requirements imposed by the waveform configuration and (2) we take into account the "real" exposure time from the camera.
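The clamping might look like the following sketch, where both bounds are stand-ins for the adapter's real constraints (exposure must be at least the readout time for all rows to expose simultaneously, and no longer than what the waveform period allows):

```cpp
#include <algorithm>

// Clamp a requested exposure (ms) into [readout time, waveform-imposed max].
// Both bounds are illustrative, not the adapter's exact formulas.
double ClampExposureMs(double requestedMs, double readoutMs, double maxWaveformMs) {
    return std::min(std::max(requestedMs, readoutMs), maxWaveformMs);
}
```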
…rleaving waveforms. Also do not output galvo or blanking waveforms if no MOD IN channels are enabled.
@nicost @marktsuchida The device adapter is now ready for review. I am attaching a compiled device adapter (against the mock DAQ) and a configuration file in case you want to see how everything looks without needing a real DAQ. Just a note: I am in the process of rebuilding the microscope, so I have only tested this against an oscilloscope. There might be updates to this device adapter over the next few months as I get fully into integration testing. I intend to write the device adapter documentation page after the PR is accepted. Let me know if something is not clear or if you have questions.
```cpp
const char* g_ReadoutTimeNone = "None";

// Set to true to use MockDAQAdapter instead of the real NIDAQmx adapter
static constexpr bool kUseMockDAQ = false;
```
This is only used for testing. I'm not sure whether you have compile-time flags for this sort of thing or not. It should always be false for release builds.
This could be made a compile-time flag (can be set in the Visual Studio project properties), but it won't be much different from editing the source line.
What might be more useful is to just always build and ship both (mock and real). This prevents bit rot and is also most convenient once in the nightly. This can be done either as two separate adapters (would need two .vcxproj files), two devices in the same adapter, or even a pre-init property.
But I'd suggest worrying about that in a later PR rather than changing the structure of the code right now.
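The pre-init-property option could be sketched as a small runtime factory. All type and function names below are placeholders (the real seam would return the PR's actual DAQ abstraction), shown only to illustrate selecting mock vs. real without a compile-time constant:

```cpp
#include <memory>
#include <string>

// Placeholder backend interface; stands in for the adapter's DAQ abstraction.
struct DAQBackend {
    virtual ~DAQBackend() = default;
    virtual std::string Name() const = 0;
};

struct RealDAQBackend : DAQBackend {
    std::string Name() const override { return "NIDAQmx"; }
};

struct MockDAQBackend : DAQBackend {
    std::string Name() const override { return "MockDAQAdapter"; }
};

// A pre-init "Backend" property value would select which implementation to
// construct at device initialization time.
std::unique_ptr<DAQBackend> MakeBackend(const std::string& choice) {
    if (choice == "Mock")
        return std::make_unique<MockDAQBackend>();
    return std::make_unique<RealDAQBackend>();
}
```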
Fair point. I'll do this in a follow-up as you suggest.
```cpp
std::vector<std::string> devices = daq_->getDeviceNames();
CPropertyAction* pAct = new CPropertyAction(this, &iSIMWaveforms::OnDevice);
std::string defaultDevice = devices.empty() ? "" : devices[0];
deviceName_ = defaultDevice;
CreateStringProperty("Device", defaultDevice.c_str(), false, pAct, true);
for (const auto& dev : devices)
    AddAllowedValue("Device", dev.c_str());

// Discover channels for default device
availableChannels_ = daq_->getAnalogOutputChannels(deviceName_);
```
This is the only device communication that I do in the constructor. I discover all the available DAQ devices and their available analog output channels. The NIDAQ device adapter, for comparison, only discovers DAQ devices.
```cpp
long numPositions_;
bool sequenceRunning_;

static const long MAX_SEQUENCE_LENGTH = 1024;
```
Probably overkill. 4 is the maximum length as determined by the number of MOD IN channels. I left this at 1024 to avoid any problems in case something changes in the future.
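A hypothetical guard using this constant might look like the following; the error code value is a placeholder, and 1024 mirrors the quoted `MAX_SEQUENCE_LENGTH`:

```cpp
#include <vector>

// Hypothetical sequence-length check, rejecting sequences longer than the
// advertised maximum. ERR_SEQ_TOO_LONG is an illustrative error code.
constexpr long MAX_SEQUENCE_LENGTH = 1024;
constexpr int ERR_SEQ_TOO_LONG = 101;

int ValidateSequence(const std::vector<long>& seq) {
    if (seq.size() > static_cast<unsigned long>(MAX_SEQUENCE_LENGTH))
        return ERR_SEQ_TOO_LONG;
    return 0;  // DEVICE_OK-style success
}
```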
```cpp
/// pulse train (gated by an external trigger), and an AO task that uses the
/// counter's internal output as its sample clock for continuous waveform
/// regeneration.
class NIDAQmxAdapter : public IDAQDevice
```
All actual hardware communication is confined to this class. This helps separate device adapter logic problems from hardware communication problems.
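That seam can be sketched as follows. `getDeviceNames` and `getAnalogOutputChannels` appear in the quoted code above; `writeWaveforms`, the mock's return values, and the exact class shapes are illustrative, not the PR's actual signatures:

```cpp
#include <string>
#include <vector>

// Interface seam: waveform-construction logic talks only to IDAQDevice, so
// tests can swap in a mock and never touch NIDAQmx.
class IDAQDevice {
public:
    virtual ~IDAQDevice() = default;
    virtual std::vector<std::string> getDeviceNames() = 0;
    virtual std::vector<std::string> getAnalogOutputChannels(const std::string& dev) = 0;
    virtual void writeWaveforms(const std::vector<std::vector<double>>& samples) = 0;
};

// Mock implementation that records what would have been written to hardware.
class MockDAQAdapter : public IDAQDevice {
public:
    std::vector<std::string> getDeviceNames() override { return {"MockDev1"}; }
    std::vector<std::string> getAnalogOutputChannels(const std::string&) override {
        return {"MockDev1/ao0", "MockDev1/ao1"};
    }
    void writeWaveforms(const std::vector<std::vector<double>>& s) override {
        written_ = s;
    }
    std::vector<std::vector<double>> written_;
};
```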
I pushed a commit that points the project to our internal copy of DAQmx 9.2 (old, but shouldn't matter for building what's used here -- it will use the locally installed version at run time). Should be ready to merge. When working on the code, you can either edit the project properties temporarily, or you can put a copy (or symlink, if enabled) of the DAQmx include and lib64 directories where the build points.
This is great! If helpful, happy to test on our hardware combos (NI DAQ + Hamamatsu and NI DAQ + Photometrics) once you are ready. We can log the outputs of the DAQs to ensure they generate the waveforms you are hoping for.
Thanks a lot for your offer @dpshepherd! I have tested against Photometrics but do not have a Hamamatsu camera on hand to test against. I suggest waiting to test against Hamamatsu cameras until I write the documentation page for the device adapter, which I intend to do in parallel with my current task of rebuilding the microscope. There are some minor idiosyncrasies at configuration time that will be clarified in the documentation. In the meantime, it would be helpful if you could tell me whether DCam exposes its current readout time in its device adapter and, if so, what the device property's name is and the units attached to its value (milliseconds, nanoseconds, etc.). Also, let me know if you think this could be useful for you; I have some ideas about how to generalize this to other waveforms.


This is a pull request for a device adapter that constructs analog output waveforms from a NIDAQ board for an instant structured illumination microscope (iSIM). It originated with a thread between @dpshepherd, @nicost, and me here: https://forum.image.sc/t/custom-waveform-generation-for-ni-daq-galvo-control/118662
The main difference from the pre-existing NIDAQ device adapter is that it constructs and outputs analog waveforms for the iSIM's camera, galvanometer mirror, and AOTF. The NIDAQ device adapter, on the other hand, acts as a state machine by outputting predetermined signals corresponding to each state. The galvo waveform for the iSIM in particular is somewhat complex; it is composed of linear ramps and constant parking periods that need tight synchronization with the camera's exposure.
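The ramp-plus-park structure can be illustrated with a toy generator. Sample counts and voltages here are placeholders; the real adapter derives them from the camera's exposure and readout timing:

```cpp
#include <vector>

// Build one period of an illustrative galvo waveform: a linear ramp from
// vStart to vEnd over rampSamples samples, followed by parkSamples samples
// held at the park (start) position. rampSamples must be >= 2.
std::vector<double> GalvoWaveform(int rampSamples, int parkSamples,
                                  double vStart, double vEnd) {
    std::vector<double> w;
    w.reserve(rampSamples + parkSamples);
    for (int i = 0; i < rampSamples; ++i)
        w.push_back(vStart + (vEnd - vStart) * i / (rampSamples - 1));
    for (int i = 0; i < parkSamples; ++i)
        w.push_back(vStart);  // constant parking period
    return w;
}
```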
The device adapter currently works for Live and Snap imaging by wrapping a physical camera to intercept some relevant calls into the Core, much like the TeensyPulseGenerator. There are still some questions I have regarding hardware sequencing before it fully meets my requirements.
A few points:

- There is an interface, `IDAQAdapter`, between the device adapter code and the actual calls to the NIDAQmx library. This allows mocking out the hardware to test waveform construction logic.

Help Needed
I need help understanding how to make this compatible with hardware sequencing. Currently, I have only implemented what I call "normal" mode imaging. This means that if an AOTF MOD IN channel is enabled, then light from that channel will illuminate the sample on each frame.
What I would like to do is two-color (or more) interleaved imaging. I know how to construct the waveforms for this, but I do not currently know how the information about a multichannel Multi-D Acquisition is passed into the device. I would need this information at the device adapter level to construct the waveforms.
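The interleaving itself reduces to a round-robin over the enabled channels, one channel per camera frame. A sketch with hypothetical names:

```cpp
#include <vector>

// For each frame, turn on exactly one of the enabled AOTF lines, cycling
// through them in order. enabled holds channel indices (e.g. {0, 2});
// numLines is the total number of MOD IN channels.
std::vector<std::vector<bool>> InterleavePattern(const std::vector<int>& enabled,
                                                 int numFrames, int numLines) {
    std::vector<std::vector<bool>> on(numFrames, std::vector<bool>(numLines, false));
    if (enabled.empty())
        return on;  // nothing enabled: all lines stay off
    for (int f = 0; f < numFrames; ++f)
        on[f][enabled[f % enabled.size()]] = true;
    return on;
}
```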
Below is an example of the desired waveforms for three color interleaved imaging.
I'd be happy to discuss further and to answer any questions you might have.
To Dos

- ~~Alert user when the exposure time setting is too short to allow for all rows to expose simultaneously due to the camera's rolling shutter~~ I can't do this because `SetExposure` returns `void`, so I clamp the requested exposure time instead.
- ~~Related to the above, decide what to do about the initial exposure time in the GUI (10 ms) being shorter than the readout time~~ This is both a configuration issue and a problem with PVCAM, not iSIMWaveforms. Setting the starting exposure time for PVCAM in the configuration file so that it's longer than the readout time at startup solves it. I will need to document this later.
- ~~Figure out what to do about hard-coded NIDAQmx include and dependency paths in the VS solution file~~ @marktsuchida will take care of this when merging. Note that I'm currently testing against NIDAQmx 2026 Q1.
- ~~Rebuild waveforms whenever the ROI size changes because the readout time will change~~ No longer needed since `SyncTimingFromCamera()` was implemented.