
iSIM Device Adapter #829

Merged
marktsuchida merged 26 commits into micro-manager:main from kmdouglass:isim_device_adapter on Feb 17, 2026

Conversation

@kmdouglass (Contributor) commented Feb 9, 2026

This is a pull request for a device adapter that constructs analog output waveforms with a NIDAQ board for an instant structured illumination microscope (iSIM). It originated in a thread among @dpshepherd, @nicost, and me: https://forum.image.sc/t/custom-waveform-generation-for-ni-daq-galvo-control/118662

The main difference from the pre-existing NIDAQ device adapter is that it constructs and outputs analog waveforms for the iSIM's camera, galvanometer mirror, and AOTF. The NIDAQ device adapter, on the other hand, acts as a state machine, outputting predetermined signals corresponding to each state. The galvo waveform for the iSIM in particular is somewhat complex; it is composed of linear ramps and constant parking periods that need tight synchronization with the camera's exposure.
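For illustration, one frame of such a galvo waveform could be built as in the sketch below. This is a sketch only, with invented parameter names, not the adapter's actual code:

#include <cstddef>
#include <vector>

// One frame of a galvo waveform: a linear sweep across the field during
// exposure, followed by a constant parking level during readout.
std::vector<double> BuildGalvoFrame(double sweepAmplitudeV, double parkV,
                                    std::size_t sweepSamples,
                                    std::size_t parkSamples)
{
   std::vector<double> wf;
   wf.reserve(sweepSamples + parkSamples);
   for (std::size_t i = 0; i < sweepSamples; ++i)
   {
      // Linear ramp from -A/2 to +A/2 over the exposure window.
      double t = (sweepSamples > 1) ? double(i) / (sweepSamples - 1) : 0.0;
      wf.push_back(-sweepAmplitudeV / 2.0 + sweepAmplitudeV * t);
   }
   // Hold the mirror at the parking voltage while the camera reads out.
   wf.insert(wf.end(), parkSamples, parkV);
   return wf;
}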

The device adapter currently works for Live and Snap imaging by wrapping a physical camera to intercept some relevant calls into the core, much like the TeensyPulseGenerator. There are still some questions I have regarding hardware sequencing before it fully meets my requirements.

A few points:

  • This device adapter is highly coupled to our particular setup. I think it could serve as a starting point for a more general device adapter. This would require quite a lot of discussion.
  • I put an interface called IDAQAdapter between the device adapter code and the actual calls to the NIDAQmx library. This allows mocking out the hardware to test waveform construction logic.
  • As I explain here, the triggering logic is a bit indirect but necessary to avoid having to reload the waveforms in software after each frame: I use a counter channel on the NIDAQ that can be triggered by a TTL input, and this counter serves as the clock for the analog waveforms. A minimal sketch of this pattern follows this list.
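Here is a minimal sketch of one way to realize this with plain NIDAQmx C API calls. It is not the adapter's actual code: the device name, counter, PFI terminal, sample rate, and channel are assumptions for illustration, and error checking is omitted.

#include <NIDAQmx.h>
#include <vector>

void ConfigureTriggeredWaveform(const std::vector<double>& waveform)
{
   const uInt64 samplesPerFrame = waveform.size();
   TaskHandle ctrTask, aoTask;

   // Counter task: a finite pulse train, re-armed by each TTL rising edge
   // on PFI0, so nothing needs to be reloaded in software between frames.
   DAQmxCreateTask("ctr", &ctrTask);
   DAQmxCreateCOPulseChanFreq(ctrTask, "Dev1/ctr0", "", DAQmx_Val_Hz,
                              DAQmx_Val_Low, 0.0, 100000.0, 0.5);
   DAQmxCfgImplicitTiming(ctrTask, DAQmx_Val_FiniteSamps, samplesPerFrame);
   DAQmxCfgDigEdgeStartTrig(ctrTask, "/Dev1/PFI0", DAQmx_Val_Rising);
   DAQmxSetStartTrigRetriggerable(ctrTask, TRUE);

   // AO task: a continuously regenerated waveform clocked by the counter's
   // internal output, so it only advances while the counter is pulsing.
   DAQmxCreateTask("ao", &aoTask);
   DAQmxCreateAOVoltageChan(aoTask, "Dev1/ao0", "", -10.0, 10.0,
                            DAQmx_Val_Volts, NULL);
   DAQmxCfgSampClkTiming(aoTask, "/Dev1/Ctr0InternalOutput", 100000.0,
                         DAQmx_Val_Rising, DAQmx_Val_ContSamps,
                         samplesPerFrame);
   DAQmxWriteAnalogF64(aoTask, (int32)samplesPerFrame, FALSE, 10.0,
                       DAQmx_Val_GroupByChannel, waveform.data(),
                       NULL, NULL);

   DAQmxStartTask(aoTask);  // AO waits for sample clocks from the counter
   DAQmxStartTask(ctrTask); // counter waits for the camera's TTL trigger
}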

Help Needed

I need help understanding how to make this compatible with hardware sequencing. Currently, I have only implemented what I call "normal" mode imaging. This means that if an AOTF MOD IN channel is enabled, then light from that channel will illuminate the sample on each frame.

What I would like to do is two-color (or more) interleaved imaging. I know how to construct the waveforms for this, but I do not currently know how the information about a multichannel MultiD Acquisition is passed into the device. I would need this information at the device adapter level to construct the waveforms.

Below is an example of the desired waveforms for three-color interleaved imaging.

[image: desired waveforms for three-color interleaved imaging]

I'd be happy to discuss further and to answer any questions you might have.

To Dos

  • Implement hardware sequencing for multi-channel MDAs
  • Alert user when the exposure time setting is too short to allow all rows to expose simultaneously due to the camera's rolling shutter. (I can't do this because SetExposure returns void, so I clamp the requested exposure time instead.)
  • Read back exposure time from camera after setting it
  • Related to the above, decide what to do about the initial exposure time in the GUI (10 ms) being shorter than the readout time. (This is both a configuration issue and a problem with PVCAM, not iSIMWaveforms. Setting the starting exposure time for PVCAM in the configuration file so that it's longer than the readout time at startup solves it. I will need to document this later.)
  • Discover PFI channels for the trigger port so that the user selects which one to use (instead of typing in the PFI channel)
  • Decouple the device adapter from PVCAM's readout time property
  • Figure out what to do about hard-coded NIDAQmx include and dependency paths in the VS solution file. (@marktsuchida will take care of this when merging. Note that I'm currently testing against NIDAQmx 2026 Q1.)
  • Add License.txt
  • Remove Makefile.am
  • Remove blanking output if no MOD IN channels are enabled and we're in Snap or Live mode
  • Rebuild waveforms whenever the ROI size changes because the readout time will change. (No longer needed since SyncTimingFromCamera() was implemented.)

@dpshepherd commented

This is great!

What I would like to do is two-color (or more) interleaved imaging. I know how to construct the waveforms for this, but I do not currently know how the information about a multichannel MultiD Acquisition is passed into the device. I would need this information at the device adapter level to construct the waveforms.

Are you trying to implement the two-channel imaging for "live" display mode or for an MDA (or maybe both)? From the image.sc thread, I had "live" display in my head for some reason.

@nicost (Member) commented Feb 10, 2026

To seamlessly create the digital output patterns you describe, you would need to create a StateDevice in your device adapter analogous to DigitalOutputPort in https://github.com/micro-manager/mmCoreAndDevices/blob/main/DeviceAdapters/NIDAQ/NIDAQ.h (or AnalogOutputPort if you use analog outputs). The "OnState" action handler has to implement a couple of interesting things. The branch

else if (eAct == MM::IsSequenceable)

should result in

pProp->SetSequenceable(maxLength)

where maxLength is the maximum sequence length. It should also have branches for

else if (eAct == MM::AfterLoadSequence)
else if (eAct == MM::StartSequence)
else if (eAct == MM::StopSequence)

Now, when you have 3 channels defined in the MDA that only differ in their DigitalOutputPort State, the acquisition engine will send the appropriate sequence to the device (AfterLoadSequence), start the sequence, then tell the camera to run a sequence (if you have only the three channels it will be a sequence of 3 images), after which you have all the information you need in the device adapter to do what you need to do.
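For concreteness, here is a minimal sketch of such a handler. The class name and the long state_ member are hypothetical; MAX_SEQUENCE_LENGTH is the constant defined in this PR, and the comments mark where the adapter-specific work would go:

int iSIMSwitchState::OnState(MM::PropertyBase* pProp, MM::ActionType eAct)
{
   if (eAct == MM::BeforeGet)
   {
      pProp->Set(state_); // state_ is a long member holding the current state
   }
   else if (eAct == MM::AfterSet)
   {
      pProp->Get(state_);
      // Apply the single (non-sequenced) state to the outputs here.
   }
   else if (eAct == MM::IsSequenceable)
   {
      pProp->SetSequenceable(MAX_SEQUENCE_LENGTH);
   }
   else if (eAct == MM::AfterLoadSequence)
   {
      std::vector<std::string> sequence = pProp->GetSequence();
      // One entry per MDA channel; build the interleaved waveforms here.
   }
   else if (eAct == MM::StartSequence)
   {
      // Arm the DAQ tasks so each camera trigger advances the sequence.
   }
   else if (eAct == MM::StopSequence)
   {
      // Stop the tasks and restore the single-state output.
   }
   return DEVICE_OK;
}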

Please ask if this does not make sense. It is all not terribly intuitive.

@kmdouglass (Contributor Author) commented

Are you trying to implement the two-channel imaging for "live" display mode or for an MDA (or maybe both)? From the image.sc thread, I had "live" display in my head for some reason.

It wasn't clear to me on the image.sc thread whether there should be a distinction for when the interleaving occurs, but during testing I realized that the interleaving makes sense only during Multi-D Acquisitions. During Live and Snap, the device adapter currently allows you to illuminate 0, 1, 2, 3, or 4 AOTF lines simultaneously, depending on whether the corresponding MOD IN channel is set to Enabled.

then tell the camera to run a sequence (if you have only the three channels it will be a sequence of 3 images), after which you have all the information you need in the device adapter to do what you need to do.

Thank you very much once again @nicost . I'll take a look at the code and see what I can do.

Just a point of clarification: I'm actually using analog outputs for the AOTF signals and not digital outputs because our synthesizer accepts a 0 - 10 V analog signal whose value determines the laser intensity (more specifically the diffraction efficiency). But I doubt this matters for making the hardware sequencing work.

@marktsuchida (Member) commented

Nice work, @kmdouglass! Sorry I'm late to the discussion.

I don't have much to add to @nicost's excellent recommendations. If you needed multi-channel during snap/live, there might be a way to hack it by imitating Multi Camera. But the sequenceable state device is much cleaner for MDA in this application.

Figure out what to do about hard-coded NIDAQmx include and dependency paths in the VS solution file

I can do this for you when merging this PR. We'll just point to the same place as the NIDAQ adapter. (Maybe leave out Makefile.am for now unless actually testing on Linux.)

Small reminder: we'll need a license.txt.

@nicost (Member) commented Feb 10, 2026

Looks very good! Dying to hear if it works as expected!

Just a point of clarification: I'm actually using analog outputs for the AOTF signals and not digital outputs because our synthesizer accepts a 0 - 10 V analog signal whose value determines the laser intensity (more specifically the diffraction efficiency). But I doubt this matters for making the hardware sequencing work.

Indeed. If you want to be able to output different voltages (controlled by the UI), you would need to implement something different from a StateDevice, but if it is all just 0 or 10 V, a StateDevice should be fine.

@kmdouglass kmdouglass force-pushed the isim_device_adapter branch 2 times, most recently from 5d267cd to 92e9053 on February 12, 2026
@kmdouglass (Contributor Author) commented

@nicost Does the new Java acquisition engine support hardware sequencing? The sequencing through a state device appears to be working with the Clojure engine, but not with the Java engine.

This is against the MockDAQAdapter; I'll test it against the real hardware again soon.

@kmdouglass (Contributor Author) commented

The waveforms are now correctly output by the NIDAQ for both live/snap imaging and hardware-sequenced, two-channel MDAs. I put two example images below from my oscilloscope. The Expose Out signal is from the camera, not the NIDAQ.

I still have a few minor things to work on, but I will soon mark this PR as ready for review.

Live Imaging

[image: isim_device_adapter_live]

Two channel interleaved

[image: isim_device_adapter_2_channel_mod_in]

Exposure time clamping has been implemented as well (see the sketch after this list) so that:

1. we do not exceed the peak-to-peak voltage requirements imposed by the waveform configuration
2. we take into account the "real" exposure time from the camera
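A hypothetical sketch of that clamping logic follows; the member and helper names here are invented for illustration and are not the adapter's actual code:

#include <algorithm>

void iSIMWaveforms::SetExposure(double requestedMs)
{
   // Lower bound: with a rolling shutter, the exposure must be at least the
   // readout time for all rows to expose simultaneously.
   double minMs = readoutTimeMs_;
   // Upper bound: keeps the galvo ramp within the peak-to-peak voltage
   // limits imposed by the waveform configuration.
   double maxMs = MaxExposureForWaveformsMs();
   double clampedMs = std::max(minMs, std::min(requestedMs, maxMs));

   wrappedCamera_->SetExposure(clampedMs);
   // Read back the "real" exposure time the camera actually applied.
   exposureMs_ = wrappedCamera_->GetExposure();
}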
Commit: …rleaving waveforms

Also do not output galvo or blanking waveforms if no MOD IN channels are enabled.
@kmdouglass kmdouglass marked this pull request as ready for review February 16, 2026 12:38
@kmdouglass (Contributor Author) commented

@nicost @marktsuchida The device adapter is now ready for review. I am attaching a compiled device adapter (against the mock DAQ) and configuration file in case you want to see how everything looks without needing a real DAQ.

iSIMWaveforms.zip

Just a note: I am in the process of rebuilding the microscope, so I have only tested this against an oscilloscope. There might be updates to this device adapter over the next few months as I get fully into integration testing. I intend to write the device adapter documentation page after the PR is accepted.

Let me know if something is not clear or if you have questions.

const char* g_ReadoutTimeNone = "None";

// Set to true to use MockDAQAdapter instead of the real NIDAQmx adapter
static constexpr bool kUseMockDAQ = false;
@kmdouglass (Contributor Author):

This is only used for testing. I'm not sure whether you have compile-time flags for this sort of thing or not. It should always be false for release builds.

Reviewer (Member):

This could be made a compile-time flag (can be set in the Visual Studio project properties), but it won't be much different from editing the source line.

What might be more useful is to just always build and ship both (mock and real). This prevents bit rot and is also most convenient once in the nightly. This can be done either as two separate adapters (would need two .vcxproj files), two devices in the same adapter, or even a pre-init property.

But I'd suggest worrying about that in a later PR rather than changing the structure of the code right now.
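For reference, such a compile-time switch might look like the following; ISIM_USE_MOCK_DAQ is a hypothetical macro that would be set in the project's Preprocessor Definitions, not something that exists in this PR:

// Hypothetical compile-time switch; define ISIM_USE_MOCK_DAQ in the
// project's Preprocessor Definitions to build against the mock DAQ.
#ifdef ISIM_USE_MOCK_DAQ
static constexpr bool kUseMockDAQ = true;
#else
static constexpr bool kUseMockDAQ = false;
#endif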

@kmdouglass (Contributor Author):

Fair point. I'll do this in a follow up as you suggest.

Comment on lines +167 to +176
std::vector<std::string> devices = daq_->getDeviceNames();
CPropertyAction* pAct = new CPropertyAction(this, &iSIMWaveforms::OnDevice);
std::string defaultDevice = devices.empty() ? "" : devices[0];
deviceName_ = defaultDevice;
CreateStringProperty("Device", defaultDevice.c_str(), false, pAct, true);
for (const auto& dev : devices)
AddAllowedValue("Device", dev.c_str());

// Discover channels for default device
availableChannels_ = daq_->getAnalogOutputChannels(deviceName_);
@kmdouglass (Contributor Author):

This is the only device communication that I do in the constructor. I discover all the available DAQ devices and their available analog output channels. The NIDAQ device adapter, for comparison, only discovers DAQ devices.

long numPositions_;
bool sequenceRunning_;

static const long MAX_SEQUENCE_LENGTH = 1024;
@kmdouglass (Contributor Author):

Probably overkill. 4 is the maximum length as determined by the number of MOD IN channels. I left this at 1024 to avoid any problems in case something changes in the future.

/// pulse train (gated by an external trigger), and an AO task that uses the
/// counter's internal output as its sample clock for continuous waveform
/// regeneration.
class NIDAQmxAdapter : public IDAQDevice
@kmdouglass (Contributor Author):

All actual hardware communication is confined to this class. This helps separate device adapter logic problems from hardware communication problems.
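For context, a rough sketch of that seam, based only on what appears in these excerpts (getDeviceNames, getAnalogOutputChannels, and the IDAQDevice base class); the remaining members are illustrative guesses:

#include <string>
#include <vector>

class IDAQDevice
{
public:
   virtual ~IDAQDevice() {}
   virtual std::vector<std::string> getDeviceNames() = 0;
   virtual std::vector<std::string> getAnalogOutputChannels(
      const std::string& deviceName) = 0;
   // ...plus waveform upload and task start/stop methods (not shown here).
};

// NIDAQmxAdapter implements this against NIDAQmx; MockDAQAdapter implements
// it in memory, so waveform construction can be tested without hardware.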

@marktsuchida (Member) commented

I pushed a commit that points the project to our internal copy of DAQmx 9.2 (old, but shouldn't matter for building what's used here -- it will use the locally installed version at run time). Should be ready to merge.

When working on the code, you can either edit the project properties temporarily, or you can put a copy (or symlink, if enabled) of the DAQmx include and lib64 directories where the build points. $(MM_3RDPARTYPRIVATE) points to ../../3rdparty relative to mmCoreAndDevices root.

@dpshepherd commented

This is great! If helpful, I'm happy to test on our hardware combos (NI DAQ + Hamamatsu and NI DAQ + Photometrics) once you are ready. We can log the outputs of the DAQs to ensure they generate the waveforms you are hoping for.

@marktsuchida marktsuchida merged commit 7e4af61 into micro-manager:main Feb 17, 2026
2 checks passed
@kmdouglass (Contributor Author) commented

This is great! If helpful, I'm happy to test on our hardware combos (NI DAQ + Hamamatsu and NI DAQ + Photometrics) once you are ready. We can log the outputs of the DAQs to ensure they generate the waveforms you are hoping for.

Thanks a lot for your offer, @dpshepherd! I have tested against Photometrics but do not have any Hamamatsu camera on hand to test against.

I suggest waiting to test against Hamamatsu cameras until I write the documentation page for the device adapter, which I intend to do in parallel with my current task of rebuilding the microscope. There are some minor idiosyncrasies at configuration time that will be clarified in the documentation.

In the meantime, it would be helpful if you could tell me whether DCam exposes its current readout time in its device adapter and, if so, what the device property name is and what units are attached to its value (milliseconds, nanoseconds, etc.).

Also, let me know if you think this could be useful for you. I have some ideas about how to generalize this to other waveforms.
