LabVIEW


Synchronizing DAQmx to an NTP Server

Hello!

 

I'm wondering if there is a way to synchronize my DAQmx acquisition to our system clock (which, in turn, is synchronized to an NTP server).

I'm running on a Windows 10 PC connected to a PXIe-1082 chassis via a PXIe-8398 MXI connection. I've got some PXIe-4309 AI boards that I'd like to synchronize to the server.

 

The issue I've run into is that the DAQmx timestamp will eventually drift away from the system clock. Run for 10 minutes and the DAQmx timestamp matches the system clock timestamp, but over the course of several hours a substantial drift develops between the two and keeps growing over time. This makes sense: the DAQmx hardware has its own on-board oscillator, so while the DAQmx tasks are synchronized amongst themselves, their clock runs independently of the system clock.

 

I've looked into TSN; however, it seems like my hardware setup might not be compatible with this scheme:

https://www.ni.com/en/shop/data-acquisition/designing-distributed-tsn-ethernet-based-measurement-sys... - this article lists cDAQ, cRIO, and FieldDAQ as compatible systems.

https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000kKC4SAM&l=en-US - this article indicates that IEEE 802.1AS is not supported on Windows

https://knowledge.ni.com/KnowledgeArticleDetails?id=kA03q000000x3IVCAY&l=en-US - this one has a workaround using NI-Sync with a combination of the PXIe-6682 and PXIe-6683 cards, but it seems to be more about syncing multiple chassis to each other rather than to a time server itself.

Extrapolating from that last article, perhaps I could use a PXIe-6682, synchronize it to a GPS clock (the NTP server), and use that board to generate a clock for DAQmx to use?

 

Has anyone solved this problem before, or is there a standard solution that I'm not seeing?

Message 1 of 7

DAQmx does not get a real timestamp when it first acquires data. It grabs the system time at the first read, and all subsequent timestamps are derived from the sample rate. Software controls when that system time is grabbed, so even the start time is not completely accurate.
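
To put a rough number on that drift mechanism (a back-of-the-envelope Python sketch, not DAQmx code; the 5 ppm oscillator error is just an assumed figure):

```python
# Rough illustration (not DAQmx code) of why the drift appears: the driver reads the
# system time once at the first read, and every later timestamp is t0 + n / fs,
# derived from the board's own oscillator. If that oscillator is off by a few ppm
# relative to the NTP-disciplined PC clock, the error grows linearly with run time.

osc_error_ppm = 5.0    # hypothetical oscillator error, parts per million

def drift_after(hours: float) -> float:
    """Seconds of accumulated timestamp drift after the given run time."""
    return hours * 3600.0 * osc_error_ppm * 1e-6

for h in (0.2, 1, 8, 24):
    print(f"after {h} h: ~{drift_after(h) * 1000:.1f} ms of drift")
# 5 ppm is ~18 ms per hour: negligible over a 10-minute run,
# but very visible after several hours, as described above.
```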

 

Since you have a PXI system you can try the following:

  1. Get a GPS Card for the system.
  2. Use the GPS Card for triggering and for conditioning the sample clocks. That is, use NI-Sync to trigger your device at a known time and use the 10 MHz out of the GPS to replace the reference clock in the PXI backplane. The reference clock on the GPS should be conditioned by the PPS from the GPS; thus it should correct itself every second.
  3. Use the reference clock in the backplane to phase lock your DAQmx boards; that way their sample clock is in a PLL with the reference clock.

Now your acquisition will have a known start time and be locked to GPS, so it shouldn't drift with respect to absolute time. You will need to post-process your data so that the DAQmx timestamp is replaced by the GPS timestamp; that is only necessary if you use the built-in logging functions. If you roll your own logging, you can do the replacement on the fly.
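
To make that post-processing step concrete, here is a minimal sketch (plain Python rather than LabVIEW; the function name and values are illustrative): once the trigger time is known from the GPS/NI-Sync card and the sample clock is phase-locked to the GPS-disciplined reference, the timestamps can simply be regenerated from the start time and the programmed sample rate.

```python
import numpy as np

# Hypothetical post-processing for the approach above: with the start trigger issued
# by the GPS/NI-Sync card at a known time and the sample clock phase-locked to the
# GPS-disciplined 10 MHz reference, the true timestamp of sample n is simply
# t_start_gps + n / fs, so the DAQmx-generated timestamps can be regenerated outright.

def gps_referenced_timestamps(t_start_gps: float, fs: float, n_samples: int) -> np.ndarray:
    """t_start_gps: trigger time reported by the GPS card (UNIX seconds);
    fs: programmed sample rate in Hz, locked to the GPS-conditioned reference."""
    return t_start_gps + np.arange(n_samples) / fs

# Example: 1 MS acquired at 2 kS/s starting at a GPS-reported trigger time
ts = gps_referenced_timestamps(t_start_gps=1_700_000_000.0, fs=2000.0, n_samples=1_000_000)
```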

 

Another option is to digitize the PPS from the GPS on a channel. You won't know the absolute time but you will know that every PPS should occur on a whole second, so you can post process your data that way.
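
As an illustration of that PPS idea (a rough Python sketch under the assumption that the PPS is recorded on a spare analog channel; names and threshold are made up): each rising edge of the digitized PPS should land on a whole second, so a straight-line fit of edge sample index against second count recovers the actual sample interval and a relative time base for every sample.

```python
import numpy as np

# Sketch: place every sample on the PPS time base. Each rising edge of the digitized
# PPS marks a whole second, so a least-squares line through
# (edge sample index, elapsed whole seconds) gives the true sample interval and a
# relative time for every sample, even without knowing the absolute time of edge 0.

def pps_time_base(pps: np.ndarray, threshold: float = 2.5):
    """Return (slope, intercept) so that t_rel[n] = slope * n + intercept, in seconds."""
    high = pps > threshold
    edges = np.flatnonzero(high[1:] & ~high[:-1]) + 1   # rising-edge sample indices
    seconds = np.arange(edges.size, dtype=float)        # 0, 1, 2, ... s at each edge
    slope, intercept = np.polyfit(edges, seconds, 1)
    return slope, intercept

# Usage (pps_channel_data is the digitized PPS waveform from the spare AI channel):
# slope, intercept = pps_time_base(pps_channel_data)
# t_rel = slope * np.arange(len(pps_channel_data)) + intercept
```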

Message 2 of 7

@mcduff already explained the facts.

 

Now, why do you need the absolute time that the acquisition started?

 

Are you trying to time correlate data from systems that are not integrated?

 

 

If all the instruments that capture data are part of the same system and you have synchronized them well, there is no need for a very accurate start time, as long as all the channels in the system are synchronized to the best possible extent.

Santhosh
Soliton Technologies

Message 3 of 7

Here's a snippet I once made to illustrate the beginnings of a simple but crude technique you might consider.  Read through more of the thread for context and further thoughts.

 

Summary: the GPS hardware solution from mcduff is a *real* solution; my link is just a little band-aid, though it could potentially be "good enough" for some situations.

 

 

-Kevin P

Message 4 of 7

The remarks in the thread Kevin mentions are pretty good, except for one: the part suggesting that the PC clock might suddenly jump back a few seconds.

 

The OS doesn't normally jump the clock in modern versions (it did do so in older versions); instead it slowly skews the PC clock to adjust it to the NTP or whatever external time source, except when the difference reaches a certain threshold, in which case it actually can brutally adjust the clock in one or several larger steps.

 

So your PC clock is generally guaranteed to be synchronized to some external, atomic-clock-derived time source (unless you have no network connection or have disabled NTP and similar services). But its short-term accuracy and monotonicity can be pretty bad in comparison to a quartz-oscillator-generated clock such as the one used on DAQ cards.

 

So if you sample with sub-millisecond sample intervals, you absolutely want to use the clock from the DAQ card. If you sample with intervals above a second, the real-time clock of the PC is likely the better timing base, but you need to be aware that there is a possibility of large jumps due to adjustment of the real-time clock.

 

If you sample with a fast sample clock over very long periods of time and want the data to stay synchronized with the real-time clock, you are in the realm of high-effort engineering.

 

My go at that would most likely be to store the start date and time of the data, calculate the accumulated difference between the sample-clock time and the real-time-clock time, and then report the adjusted sample interval to some 15 digits of precision. 😁
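
In plain numbers, that bookkeeping might look something like this toy Python sketch (all values invented):

```python
# Toy version of that bookkeeping (every value here is invented):
# store the real-time-clock time at the start, compare elapsed RTC time against
# elapsed sample-clock time at the end, and report the adjusted effective interval.

n_samples = 72_000_000          # samples acquired (10 h at 2 kS/s)
nominal_dt = 1.0 / 2000.0       # programmed sample interval, s
rtc_start = 1_700_000_000.000   # PC/NTP time at the first sample, s
rtc_end = 1_700_036_000.180     # PC/NTP time at the last sample, s

elapsed_rtc = rtc_end - rtc_start
elapsed_daq = (n_samples - 1) * nominal_dt
adjusted_dt = elapsed_rtc / (n_samples - 1)

print(f"accumulated difference: {elapsed_rtc - elapsed_daq:+.3f} s")
print(f"adjusted sample interval: {adjusted_dt:.15f} s")
```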

 

The relative timing of the data coming from the DAQ source is guaranteed to be within the accuracy of the oscillator, and is typically much, much more accurate than that (unless you alternately expose the board to heat and freezing temperatures 😀). So assuming that the sample interval is actually constant is a pretty good approximation. If that is not accurate enough, you really will need to go down the path of a GPS-synchronized external DAQ clock. But that is neither easy nor very cheap to create.

Rolf Kalbermatter
My Blog
Message 5 of 7

I agree with rolfk's remarks about backward or step-function time jumps NOT being an issue to be concerned about.  My remarks in the other thread were based on a naive, wrong, and long out-of-date notion for how such time-of-day discrepancies got handled.  

 

Another nitpick I'd make with my earlier remarks is that it probably isn't terribly valuable to query the time difference on every DAQ read, and for sure you shouldn't make any kind of adjustment / correction that often. The jitter of software execution and driver calls would become a bigger source of error than the clocks.

    I might do a run where I checked once every 5-10 minutes to convince myself that the skew stayed pretty constant.  If it did, I'd probably fall back to something more like checking once an hour -- just often enough that I could do a linear regression on the skew and apply a correction in a post-processing step.
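
A rough sketch of that scheme (Python with purely illustrative numbers, not LabVIEW code): log the skew between the DAQ-derived time and the PC time at each check, fit a line through it, and apply the fitted correction to the DAQ timestamps in post-processing.

```python
import numpy as np

# Sketch of the periodic skew check (numbers are illustrative only): every hour,
# record the difference between the PC (NTP-disciplined) time and the DAQ-derived
# time, fit skew vs. elapsed DAQ time, then map DAQ timestamps onto the PC time base.

daq_elapsed = np.array([0.0, 3600.0, 7200.0, 10800.0])   # s, from sample count / fs
skew = np.array([0.000, 0.018, 0.037, 0.054])             # s, PC time minus DAQ time

rate, offset = np.polyfit(daq_elapsed, skew, 1)           # skew ≈ rate * t + offset
print(f"skew rate ≈ {rate * 1e6:.1f} ppm")

def corrected_time(t_daq: np.ndarray) -> np.ndarray:
    """Apply the fitted linear correction to DAQ-derived elapsed times."""
    return t_daq + (rate * t_daq + offset)
```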

 

 

-Kevin P

Message 6 of 7

Thanks for all the feedback, guys!

 

I'd really like to be able to say that the system is "hardware synchronized" and not mess with corrections in software. I think that means the path forward is importing a hardware clock via an NI Sync board, getting that routed on the backplane and having the DAQmx tasks use that as their PLL reference clock.

 

Now I've got to figure out if I can use GPS (might be difficult to physically route an antenna into this particular facility) or if I can export a clock signal directly from our grandmaster clock HW to the NI Sync board - I think the latter would be the preferred solution.

Message 7 of 7