04-10-2024 04:30 PM - edited 04-10-2024 04:31 PM
I am using a PXI-6289 to monitor a number of digital signals via its digital I/O pins. I am reading them with DAQmx, clocked off the 10 MHz onboard reference clock. Since I read 2,000 samples at a time, I would expect the loop to execute every 200 microseconds (us). When I run the attached code and time the loop with a tick-count microsecond timer, the actual loop time varies between 166 and 247 us, and only 34% of the loops execute in exactly 200 us. I was expecting the sample clock to do a better job of controlling sample timing and computational jitter. Any suggestions? I have considered replacing the while loop with a timed loop; if I do that, do I still need to use the 10 MHz sample clock?
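For what it's worth, the spread I'm seeing can be quantified offline from the logged tick-count values. A minimal pure-Python sketch (the timestamp list is synthetic, standing in for one tick-count reading per loop iteration; no DAQ hardware is involved):

```python
# Sketch: quantify software-loop jitter from per-iteration timestamps.
# The timestamps are in microseconds, logged once per DAQmx Read iteration.

def loop_time_stats(timestamps_us, nominal_us=200, tolerance_us=1):
    """Return (min, max, fraction of intervals within tolerance of nominal)."""
    intervals = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
    on_time = sum(1 for dt in intervals if abs(dt - nominal_us) <= tolerance_us)
    return min(intervals), max(intervals), on_time / len(intervals)

# Synthetic example: iterations nominally 200 us apart, with jitter.
ts = [0, 200, 366, 613, 813, 1013]
lo, hi, frac = loop_time_stats(ts)  # lo=166, hi=247, frac=0.6
```

This only characterizes the software loop, not the hardware sample clock, but it makes it easy to report the min/max/fraction-on-time numbers consistently.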
04-10-2024 06:37 PM
The jitter you are measuring is actually the time difference between successive DAQmx Read API calls, not the jitter of the sample clock itself.
If you want to measure the jitter of the sample clock, use the DAQ device to acquire a known signal (e.g., a 1 MHz pulse train from a function generator) and check the returned waveform for distortion. See Accuracy of the Waveform Timestamp Returned by NI-DAQmx.
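One way to sketch that check, assuming the captured digital waveform is already a Python list of 0/1 samples (the 1 MHz reference and the 10 MS/s rate implied by the thread are assumptions; these helpers are illustrative, not DAQmx APIs):

```python
# Sketch: check a captured digital waveform of a known 1 MHz reference
# against the nominal sample clock. At 10 MS/s, one 1 MHz period spans
# exactly 10 samples, so every rising edge should land 10 samples apart;
# any other spacing indicates distortion or a sample-clock problem.

def rising_edge_indices(samples):
    """Indices where a 0 -> 1 transition occurs (samples are 0/1 ints)."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] == 0 and samples[i] == 1]

def edge_spacings(samples):
    """Sample counts between consecutive rising edges."""
    edges = rising_edge_indices(samples)
    return [b - a for a, b in zip(edges, edges[1:])]

# Synthetic capture: ideal 1 MHz square wave sampled at 10 MS/s
# (5 samples high, 5 samples low per period).
capture = ([1] * 5 + [0] * 5) * 4
spacings = edge_spacings(capture)  # every entry should be 10
```

If the sample clock is healthy, the spacings are constant; with real hardware you would look at the statistics of the deviations rather than expecting exact equality.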
04-11-2024 07:53 AM
FWIW, I would consider jitter in the realm of 40-50 microseconds in a Windows-controlled 5 kHz software reading loop to be a big win. That's far better than I would dare tell someone to expect.
Would your app allow for reading more samples at a lower loop rate? Such as 20k samples at a time for a 500 Hz loop rate?
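A quick sanity check of that tradeoff (the 10 MS/s rate is implied by 2,000 samples per 200 us loop; reading more samples per call only changes the software loop rate, not the hardware-timed sample rate):

```python
# Sketch: samples-per-read vs. software loop rate at a fixed sample rate.
# Reading more samples per DAQmx Read call lowers the loop rate and
# relaxes the Windows timing requirement; the hardware sample clock
# keeps acquiring at the same rate either way.

SAMPLE_RATE_HZ = 10_000_000  # 10 MS/s, implied by 2,000 samples per 200 us

def loop_rate_hz(samples_per_read):
    """Software loop rate needed to keep up with the acquisition."""
    return SAMPLE_RATE_HZ / samples_per_read

# 2,000 samples/read -> 5 kHz loop; 20,000 samples/read -> 500 Hz loop.
```

The cost is latency: each chunk of data is up to 2 ms old at 500 Hz instead of 200 us, which may or may not matter for the application.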
-Kevin P