
Discrepancy between Get Date/Time In Seconds and Waveform Graph time axis

Solved!

Hi Everyone,

I have a question about time drift I am observing between manually calculated time using the "Get Date/Time In Seconds" block and time reported by the x-axis of a Waveform Graph.  This code runs on a PC, not a dedicated real-time target.

 

For a logging application, I am generating timestamps within my VI to synchronize incoming sensor data from various sources that do not have a real-time clock.  The idea is to synchronize the data as much as possible by feeding everything through a VI that generates a timestamp, which then writes a frame containing every current sensor data point and corresponding timestamp out to a file.

 

My timestamps are generated using the "Get Date/Time In Seconds" block.  For operator monitoring purposes during logging, I would also like to plot this data to a graph, specifically a scrolling graph whose x-axis is elapsed time since VI execution. 

 

I initially tried to do this by simply feeding the data stream to a Waveform Graph.  However, I notice that the time displayed on the graph's x-axis drifts significantly from the manually calculated elapsed time.  (I have verified that the manually calculated elapsed time is correct.) 

 

While I am able to work around this issue by using a manually constructed waveform and a Waveform Chart, I would like to understand why the drift occurs.  Is it because the Waveform Graph is sample based and simply assumes a constant dt between samples when generating times for plotting?  (If so, executing the code on a dedicated real-time target, where the loop period actually is constant, would eliminate this behavior.)
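To make the suspicion concrete, here is a small simulation (Python, not LabVIEW; the 2 ms jitter figure is an assumption for illustration only) of a graph that assumes a constant dt while the actual loop period jitters:

```python
import random

# Illustrative sketch, not LabVIEW code. A graph that assumes a constant
# dt between samples accumulates drift relative to wall-clock time when
# the actual loop period jitters, as software timing on a PC does.

ASSUMED_DT = 0.010      # the graph's assumed sample spacing: 10 ms
N_SAMPLES = 60_000      # roughly 10 minutes of samples at 10 ms

graph_time = 0.0        # elapsed time the graph's x-axis would report
real_time = 0.0         # elapsed time a wall-clock timestamp would report

random.seed(0)
for _ in range(N_SAMPLES):
    # A software-timed 10 ms wait actually takes 10 ms plus OS jitter
    # (up to 2 ms assumed here for illustration).
    real_time += ASSUMED_DT + random.uniform(0.0, 0.002)
    graph_time += ASSUMED_DT   # the graph just adds its fixed dt

print(f"graph x-axis: {graph_time:.1f} s, wall clock: {real_time:.1f} s")
print(f"accumulated drift: {real_time - graph_time:.1f} s")
```

With these assumed numbers the graph reads 600 s while roughly 660 s of wall-clock time have passed; the drift grows without bound because the fixed-dt assumption never re-reads the clock.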

 

I've attached a sample VI that illustrates this behavior.


Thank you for your help!

Message 1 of 11

I would like to know what the hardware is.

 

I can state that you have no common clock source between your DAQ device and the OS, so they cannot be "synchronized".

 

Software timing will "drift"; you need hardware to synchronize timing.  (No insult intended, but it is kind of a common misconception.)


"Should be" isn't "Is" -Jay
Message 2 of 11

Hi Jeff,

Thanks for your quick reply.  I understand the synchronization issues without dedicated real-time hardware.  The intended application is simply to get data from disparate sensors relatively "aligned" through software timing -- think multiple load cells and thermistors connected to the computer through two independent DAQ boards.

 

The machine executing the VI is a laptop running Windows 10.  Both the Get Date/Time In Seconds and Waveform Graph use software timing in this case.  I would just like to understand why there is a discrepancy between the two reported times from these items, since, fundamentally, I would expect them both to rely on system time.

 

Edit for further clarification -- the intention of the scrolling graph is simply to let the operator know roughly how much time has elapsed since data collection started.  If the graph reports that an hour has elapsed when in reality only 50 minutes of data have been collected, that would be a problem in my case.

Message 3 of 11

Is your SSP up-to-date and registered to your MyNI account?

 

There is online training for DAQ synchronization.  Without a common clock source, your software-only timing will (well, at the risk of restating the obvious) be DIS-synchronous!

 

Now, ask yourself how "synchronous" you need it to be.  The type of timing then becomes a key design consideration.

Spoiler
I am an amateur horologist - please don't tell the LMB, "Lovely Mrs. Bohrer" 

Message 4 of 11

Hi Jeff,

I will look into this, thank you.  However, can you provide any help on the initial question of why the Waveform Graph and Get Date/Time In Seconds produce different results?  I attached an example VI illustrating this behavior to the original post.

Message 5 of 11

They do rely on "system time".

 

But, they have no data dependency! Exactly when did you wish for the graph to have two zeros?  (NaN would be better, but moot for the moment)  

 

The "toDBLs" are insane, as is the False constant wired to the while loop.

 

In text:

while (1)
    ;


Message 6 of 11

Right-click your "Voltage Graph" and go to

 

Properties >>>> Scales

 

then set your multiplier to "0.01".
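For context on why this helps (my understanding, stated as an assumption, not an NI-documented formula): when a Waveform Graph receives plain data with no timestamp information, it plots sample i at t = offset + i * multiplier, and the default multiplier is 1 (one second per sample). A loop producing one sample every 10 ms therefore needs a multiplier of 0.01 for the x-axis to read in real seconds. In Python terms:

```python
# Hypothetical model of how a Waveform Graph maps sample index to x-axis
# time when no timestamps are attached to the data (an assumption for
# illustration): t_i = offset + i * multiplier

def axis_time(i: int, offset: float = 0.0, multiplier: float = 1.0) -> float:
    """Time the graph would display for sample index i."""
    return offset + i * multiplier

# 100th sample, default multiplier of 1: shown at t = 100 s.
print(axis_time(100))                    # 100.0
# 100th sample, multiplier = 0.01: shown at t = 1 s, matching a loop
# that produces one sample every 10 ms.
print(axis_time(100, multiplier=0.01))   # 1.0
```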

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 7 of 11

Hi Ben,


Thank you for this information.  There is still some discrepancy between the two time measurements that accumulates during program execution (after 2 minutes on the "elapsed time" counter, the voltage graph reads 1:57), but changing the scale has made the problem much less severe.

 

May I ask why changing the multiplier on the graph's x-axis scale has this effect?  Does the behavior I am seeing relate to the 10 ms delay attached to the wait function?


Thanks again!

Message 8 of 11


Did you look at the online training?

 

My guess: 8-Ball, totally shaken.

File...New... from template

 


Message 9 of 11
Solution
Accepted by topic author schep

Yes, the multiplier determines the spacing.

 

There is some jitter in the Wait function, combined with thread scheduling as it transitions from a resource-wait state to executable, blah, blah, blah.

 

You could use a high-resolution timer to improve the quality of the t0 used by the waveform version, to make it work a bit better.
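To see how much a software-timed wait actually drifts, you can time the loop with a high-resolution timer. This Python sketch (standing in for a LabVIEW high-resolution timestamp; the iteration count is arbitrary) compares ideal and measured elapsed time:

```python
import time

REQUESTED_S = 0.010   # ask for a 10 ms wait each iteration
N = 100               # number of loop iterations (arbitrary)

start = time.perf_counter()   # high-resolution monotonic timer
for _ in range(N):
    time.sleep(REQUESTED_S)   # software-timed wait: 10 ms plus OS jitter
elapsed = time.perf_counter() - start

ideal = N * REQUESTED_S
print(f"ideal: {ideal:.3f} s  measured: {elapsed:.3f} s  "
      f"drift: {elapsed - ideal:+.3f} s over {N} iterations")
```

The measured time is typically longer than the ideal time, because the OS generally wakes the thread at or after the requested deadline; that one-sided error is what accumulates into the drift seen on the graph.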

 

If you have DAQ hardware in your PC, you could use its on-board clock to drive a Timed Loop, but you will never get it perfect on a Windows machine.

 

Ben

Message 10 of 11