
How do people currently handle complex hardware synchronization? #26

Open
henrypinkard opened this issue Apr 11, 2021 · 9 comments
Labels
Microscope builder input needed: Actively seeking input from microscope builders/users

Comments

@henrypinkard
Member

Specifically, what devices do you use? (e.g. FPGA, Arduino, National Instruments board)

How do you program that device?

What are the essential features provided by this setup?

And how could this be generalized into a device/type API?

@henrypinkard
Member Author

Example of FPGA use in micro-manager: https://github.com/jdeschamps/MicroFPGA

@nanthony21
Member

I often use TTL synchronization between a Hamamatsu camera and various forms of tunable filters. This involves TTL output from the camera (indicating the end of an exposure) as well as TTL input to the camera (triggering a new exposure).

The difficulty in developing a standard API is that there is such huge variety in how much configuration complexity each device supports.

@henrypinkard henrypinkard added the Microscope builder input needed Actively seeking input from microscope builders/users label Apr 26, 2021
@jdeschamps

jdeschamps commented Apr 29, 2021

> Example of FPGA use in micro-manager: https://github.com/jdeschamps/MicroFPGA

We use that. It has a simple layout for synchronization: it receives a camera TTL trigger (high at the start of each frame, low in between frames), processes it into more complex patterns, and redistributes it to all lasers (TTL). We had a similar system with an Arduino before. The reason we switched to the FPGA was that we could have >4 lasers triggered in parallel with pulses as short as 1 µs (by design, not limited by the FPGA), each following an independent complex sequence (a 16-bit sequence of 0 = OFF, 1 = ON) on either the rising or falling edge of the camera trigger. All of that in one board, with control of many other devices on the side (servomotors for filters and lenses, TTL for flip-mirrors and brightfield switches, analog output for an AOM/AOTF, PWM for custom laser power, analog read-out to check focus stabilization, temperature, and laser power).

It is programmed in an HDL-like language from the manufacturer. In MM, all "subdevices" (laser trigger, servomotors, TTL device, PWM device, analog in) are independent but sit under a shared Hub device.
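For illustration, here is a minimal Python model of the laser-sequencing logic described above (the real board is programmed in the manufacturer's HDL-like language; all names and the exact trigger-mode semantics here are assumptions, not the MicroFPGA implementation):

```python
# Illustrative model of the FPGA laser sequencing described above: each
# camera-trigger edge advances through a per-laser 16-bit sequence, and a
# laser pulse is emitted when its current sequence bit is 1.
from dataclasses import dataclass

@dataclass
class LaserChannel:
    sequence: int     # 16-bit pattern; bit i = fire on frame i (mod 16)
    pulse_us: int     # pulse length in microseconds (>= 1 us by design)
    on_rising: bool   # follow the rising or the falling camera-TTL edge

def frame_edges(camera_ttl):
    """Yield (frame_index, edge) for each transition of the camera TTL."""
    prev, frame = 0, -1
    for level in camera_ttl:
        if level and not prev:
            frame += 1
            yield frame, "rising"
        elif prev and not level:
            yield frame, "falling"
        prev = level

def fire_events(camera_ttl, lasers):
    """Return (frame, laser_index, pulse_us) for every pulse to emit."""
    events = []
    for frame, edge in frame_edges(camera_ttl):
        for i, laser in enumerate(lasers):
            wanted = "rising" if laser.on_rising else "falling"
            if edge == wanted and (laser.sequence >> (frame % 16)) & 1:
                events.append((frame, i, laser.pulse_us))
    return events

# Laser 0 fires every frame on rising edges; laser 1 fires every other
# frame on falling edges.
lasers = [LaserChannel(0xFFFF, 10, True), LaserChannel(0x5555, 5, False)]
print(fire_events([0, 1, 0, 1, 0, 1, 0], lasers))
```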

Something that I've seen in many microscopes in the context of FPGAs is to use the board to trigger the camera(s) as well. I have no experience with this, so I can't really comment on how much of the "camera API" then falls under the responsibility of the triggering device (exposure time, interval between frames). I guess one of the big advantages is when you have devices other than lasers to synchronize (e.g. galvos for beam positioning): it is easier to maintain synchronization when you also trigger the camera (less delay?).

Finally, we also had a hacky set-up with two cameras (different models, different sensor and image pixel sizes): one camera (EMCCD, Photometrics) would trigger the other camera (sCMOS, Hamamatsu), which in turn would send the trigger to be processed by the FPGA. Alternatively, the main camera could also trigger the FPGA in parallel with the second camera. Note that one camera was controlled by MM1.4 and the other by MM2, which highlights an important point for #23.

> The difficulty in developing a standard API is that there is such huge variety in how much configuration complexity each device supports.

I also have trouble picturing it. Do you know how this is done in the big Python packages for microscope control? It would be interesting to know what choices were made there.

@campagnola

I have mostly done this with NI DAQ devices + camera/shutter/etc. TTL lines. Every type of device is going to have its own idiosyncrasies regarding synchronization, but a high-level API should allow the user to specify that they want Device1 to wait for a trigger from Device2, and then tell Device2 to start. The details of that triggering process can be negotiated by the code and configuration supporting the devices. For example, in my configuration I would specify that a camera can trigger the DAQ on a specific PFI line, or the DAQ can trigger the camera by raising a specific DO line. Then the supporting code decides what actions are needed in order to synchronize the devices.
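A minimal sketch of what that high-level layer could look like; this is hypothetical, not an existing Micro-Manager API, and all class and method names are invented:

```python
# Hypothetical high-level sync API: the user only declares "this device
# waits for a trigger from that device"; code supporting each device
# negotiates the actual wiring (PFI line, DO line, polarity) behind it.

class SyncGraph:
    def __init__(self):
        self.edges = []   # (waiter, source) pairs

    def wait_for(self, waiter, source):
        """Declare that `waiter` must not run until `source` triggers it."""
        self.edges.append((waiter, source))

    def start(self):
        # Arm every downstream device first, then start the sources, so no
        # trigger fires before its receiver is listening.
        for waiter, _ in self.edges:
            waiter.arm_external_trigger()   # e.g. camera external-trigger mode
        for _, source in self.edges:
            source.start()                  # e.g. DAQ raises its DO line

# Usage, matching the example above (device objects are hypothetical):
#   graph = SyncGraph()
#   graph.wait_for(camera, daq)   # the DAQ's DO line triggers the camera
#   graph.start()
```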

@kbellve

kbellve commented Apr 30, 2021

> Specifically, what devices do you use? (e.g. FPGA, Arduino, National Instruments board)

I programmed the Heka ITC18 Device Adapter specifically for hardware synchronization. It predates µManager's Hardware Synchronization API, and I never adopted that API in the ITC18 Device Adapter due to its limitations.

This is a system I developed in the late 90s to early 2000s, but brought over almost completely when I switched to µManager around 2007/2008. It didn't take much development time to port to µManager because it was already working and proven in image acquisition software I had written previously.

> How do you program that device?

I have a main CSH script that outputs time along with a 16-bit digital number for the TTL states of the ITC18. That script manages everything that needs a TTL signal (camera(s), shutters, lasers on/off). I then have other scripts for ±10 V devices (piezos), which also output time along with an analog number, using the same time base for synchronization. Everything gets concatenated into a single file, which I call an imaging protocol file. Repeatable TTL/voltage sequences can be used. The ITC18 Device Adapter uses a thread that processes that file to continually feed the ITC18 buffer.
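As a rough illustration of that file format (the real files come from CSH scripts; this layout, one time column plus a 16-bit TTL word per line, is inferred from the description rather than exact):

```python
# Sketch of generating an "imaging protocol" text file: each line is a time
# plus the 16-bit word holding every TTL state at that time. The bit
# assignments, timings, and exact format here are assumptions.
CAMERA_BIT, SHUTTER_BIT, LASER_BIT = 0, 1, 2

def ttl_word(camera=False, shutter=False, laser=False):
    return (camera << CAMERA_BIT) | (shutter << SHUTTER_BIT) | (laser << LASER_BIT)

with open("protocol.txt", "w") as f:
    t_us = 0
    for frame in range(10):
        f.write(f"{t_us} {ttl_word(camera=True, shutter=True, laser=True):016b}\n")  # trigger + expose
        t_us += 10_000                                       # 10 ms exposure
        f.write(f"{t_us} {ttl_word():016b}\n")               # everything low
        t_us += 5_000                                        # 5 ms gap
```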

I use bsh scripts inside Micro-Manager to load the imaging protocol file into the ITC18 Device Adapter, and to set up and initiate µManager for the acquisition. µManager's role is to wait for incoming images.

> What are the essential features provided by this setup?

Timing precision and accuracy, speed, flexibility, and operation asynchronous from the operating system and µManager. µManager (and the ITC18 Device Adapter) is only used to keep the ITC18's internal buffer filled. I can control devices from 12 Hz to 200 kHz with microsecond accuracy.

A huge advantage is that everything is external to the imaging application. Everything is in readable text (scripts and imaging protocols) and can be easily modified or validated with a text editor. Adding another TTL device just means modifying a text script (adding it to the bit pattern). Validating an imaging protocol requires a text editor rather than a debugger.

> And how could this be generalized into a device/type API?

Uhm...

I have to think about this...

@xcasas

xcasas commented May 17, 2021

  • We use a National Instruments board.

  • We program it with the Python library nidaqmx (https://nidaqmx-python.readthedocs.io/en/latest/).

  • The essential features are opening and closing tasks, writing/reading analog or digital data on those tasks, and some triggering and sync functions; a minimal sketch follows below.
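For concreteness, a minimal nidaqmx sketch covering those features ("Dev1", the DO line, and the PFI terminal are placeholders for whatever the hardware exposes):

```python
# Open a digital-output task, clock out a finite pulse train, and gate it
# on an external start trigger (e.g. a camera's exposure-out signal).
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

with nidaqmx.Task() as task:
    task.do_channels.add_do_chan("Dev1/port0/line0")
    task.timing.cfg_samp_clk_timing(
        rate=1000, sample_mode=AcquisitionType.FINITE, samps_per_chan=100)
    task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)
    task.write([True, False] * 50, auto_start=False)   # 100 samples, 1 kHz
    task.start()                 # armed; output begins when PFI0 goes high
    task.wait_until_done(timeout=10)
```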

@palmada

palmada commented Apr 22, 2022

We use an NI board as the master controller for triggering because we generate a waveform and want to trigger at specific points in that waveform.
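One way to get that behavior with nidaqmx (a sketch under assumptions, not palmada's actual setup; device and terminal names are placeholders) is to clock a digital task off the analog-output sample clock, so trigger pulses land on exact waveform samples:

```python
# Generate a sine on AO and raise a DO line at chosen sample positions,
# with both tasks sharing the AO sample clock as a common time base.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

N, RATE = 1000, 10_000
waveform = np.sin(np.linspace(0, 2 * np.pi, N, endpoint=False))
triggers = np.zeros(N, dtype=bool)
triggers[[250, 750]] = True      # fire at the peak and the trough

with nidaqmx.Task() as ao, nidaqmx.Task() as do:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(
        RATE, sample_mode=AcquisitionType.FINITE, samps_per_chan=N)
    do.do_channels.add_do_chan("Dev1/port0/line0")
    do.timing.cfg_samp_clk_timing(
        RATE, source="/Dev1/ao/SampleClock",
        sample_mode=AcquisitionType.FINITE, samps_per_chan=N)
    ao.write(waveform, auto_start=False)
    do.write(triggers.tolist(), auto_start=False)
    do.start()   # armed first; it waits for the AO clock
    ao.start()   # starting the AO starts both tasks
    ao.wait_until_done(timeout=10)
```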

@jondaniels

We commonly do hardware triggering with ASI controllers. For anything complicated we use the Tiger controller with a "PLC" card that emulates an FPGA, so it can be programmed to do some complicated things. Programming that functionality is the bottleneck: somebody has to grok the signaling scheme well enough to translate it into a logic diagram, then that diagram has to be translated into the PLC functions, and then that has to be converted into a script setting MM properties (or equivalently a series of serial commands) to program that functionality into the card. Documentation, including examples, is at http://asiimaging.com/docs/tiger_programmable_logic_card.
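As a sketch of that last step (using pymmcore; the device label, config file, property names, and values below are assumptions modeled on the ASI PLogic adapter, to be checked against the documentation linked above):

```python
# Program one PLC logic cell by setting Micro-Manager properties: point the
# cell pointer at an address, then write the cell's type, config, and input.
import pymmcore

core = pymmcore.CMMCore()
core.loadSystemConfiguration("MMConfig_tiger.cfg")   # hypothetical config

PLC = "PLogic"   # hypothetical device label for the Tiger PLC card

def program_cell(addr, cell_type, config, input1):
    core.setProperty(PLC, "PointerPosition", str(addr))
    core.setProperty(PLC, "EditCellType", cell_type)
    core.setProperty(PLC, "EditCellConfig", str(config))
    core.setProperty(PLC, "EditCellInput1", str(input1))

# e.g. make cell 1 a one-shot that stretches a camera trigger pulse
# (the cell-type string and addresses here are illustrative, not verified):
program_cell(1, "14 - one shot", 10, 41)
```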

Overall this is super flexible and accessible via Micro-Manager properties. If you give us an API we can probably implement it with the PLC, but it's so flexible that it's probably pointless to base an API on it...

@henrypinkard
Member Author

A belated thank you to all of you for all this helpful feedback!

We're finally moving forward on new features that address many of these limitations. Please feel free to chime in and get involved at: https://github.com/micro-manager/mmCoreAndDevices/issues
