LabVIEW


Unreliable system clock

Hello all -

 

I'm writing a VI that records 2 analog inputs while a single analog output is being generated.  The user first initiates continuous analog output, at which time I create a file with one arbitrary datapoint, for the purpose of saving the timestamp in the header.  The user then begins recording the analog inputs at a later, desired time, which I continually write to another file.  Once input and output have been terminated, I need to be able to match up the timing of the input and output samples as accurately as possible (no worse than 1 msec error). I only need to match the timing offline, after the data has been collected.

 

Using the timestamps from the headers in these two files to indicate when input and output began, I should be able to easily match up everything, since I already know the analog output sample rate.  BUT, I've discovered that generating the analog output slows down my system clock (by about 1 second every 20 seconds)!  This means that I can't trust the timestamp from the analog input file as the true start time of analog input.  (e.g., If the input file timestamp is X seconds after the output file timestamp, more than X seconds of output would have been generated by the time the input began.)
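For illustration only, if the drift really were constant at the observed rate (1 second lost every 20 seconds), a timestamp read from the slowed clock could be rescaled to real elapsed time like this, a minimal Python sketch (in practice the drift is unlikely to be this constant, which is the whole problem):

```python
# Illustration only: correcting elapsed time for a constant, known clock drift.
# The 5% rate comes from the observed "1 second lost every 20 seconds".

DRIFT = 1.0 / 20.0  # fractional slowdown of the system clock

def true_elapsed(clock_elapsed: float) -> float:
    """Convert elapsed time read from the slowed clock to real elapsed time."""
    # If the clock runs at (1 - DRIFT) of real speed, then
    # clock_elapsed = real_elapsed * (1 - DRIFT).
    return clock_elapsed / (1.0 - DRIFT)

# The input file's timestamp is 19 s after the output file's timestamp
# according to the slowed clock; in real time, 20 s of output has elapsed.
print(true_elapsed(19.0))  # -> 20.0
```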

 

It seems to me that the only way around this is to start input and output at the same time.  This would eliminate the need to match up timestamps, since they're (ideally) the same. However, I'd rather not do this since I'd have lots of extraneous recorded input.

 

 

Any ideas?

 

I'm using LV 7.1 w/ DAQmx 8.0.0f0 on a Dell Latitude D810 w/ Windows XP.

 

 

Thanks in advance,

-B

Message 1 of 8

bme123 wrote:


I need to be able to match up the timing of the input and output samples as accurately as possible (no worse than 1 msec error).


This won't work anyway. At least on Windows, the resolution of the timestamp is roughly 16 ms. You could try using the Tick Count primitive or calling the performance counter API functions to get higher resolution, but you might run into the same slowdown there as well (though I haven't personally encountered it).
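As a rough Python analogue of the situation described above (not LabVIEW, just a sketch of the idea): the system clock ticks in coarse steps, while the performance counter is much finer grained, which you can see by counting how many distinct values each clock produces in a fixed interval.

```python
import time

# time.time() reads the system clock (historically ~15.6 ms granularity on
# Windows XP); time.perf_counter() wraps the high-resolution performance
# counter (QueryPerformanceCounter on Windows).

def distinct_steps(clock, duration=0.1):
    """Count how many distinct values `clock` produces in `duration` seconds."""
    end = time.perf_counter() + duration
    seen = set()
    while time.perf_counter() < end:
        seen.add(clock())
    return len(seen)

# On Windows XP the system clock would yield only ~6-7 distinct values in
# 100 ms (one per ~16 ms tick); the performance counter yields thousands.
print(distinct_steps(time.time), distinct_steps(time.perf_counter))
```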

 

In any case, you should be aware that Windows is not a real-time OS - it does not guarantee that it will do what you want (increment the software clock correctly, for instance) exactly when you want it. You might be able to improve this by improving your performance (let's say by preallocating the file or by collecting the data in memory).

 

___________________
Try to take over the world!
Message 2 of 8

Hi! 

The clock on your DAQ hardware runs more accurately than your PC's clock. So the timing of the signal itself is always correct; it is the PC clock that drifts further and further from the true time the later the input starts.

 

But you could use the sample rate to determine time.

 

Lets say:

 

AO begins at 0 ms, writing 100 samples per loop at 1 kHz, so you have to send samples 10 times per second.

 

The user starts AI after 251 blocks of samples were sent (count the iterations of the while loop around your continuous output).

=> So AI started at 251 * 100 ms = 25 seconds and 100 ms.
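The bookkeeping above can be sketched in a few lines of Python (not LabVIEW, just the arithmetic): the AI start time is derived purely from the hardware-paced AO loop count, so the PC clock never enters into it.

```python
# Reconstruct the AI start time from the number of completed AO loop
# iterations, independent of the (drifting) PC clock.

SAMPLES_PER_WRITE = 100   # samples written to the AO buffer per loop iteration
SAMPLE_RATE_HZ = 1000.0   # AO sample clock, paced by the DAQ hardware

def ai_start_time_s(ao_loops_completed: int) -> float:
    """Seconds of AO elapsed when AI was started, per the hardware clock."""
    return ao_loops_completed * SAMPLES_PER_WRITE / SAMPLE_RATE_HZ

print(ai_start_time_s(251))  # -> 25.1 (25 seconds and 100 ms)
```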

 

If you put AO and AI in sequence (e.g., by wiring the error outputs together), this might work better than just taking the PC time. But whether it is good to 1 ms... I do not know.

 

Greets Christian

Message 3 of 8

tst-

 

Thanks for the reply, I'll have to give the primitive and/or API functions a shot.  That's interesting about the timestamp resolution -- could you tell me where you read this or how you figured it out (just curious)?

 

Could you also let me know if this is what you are referring to by preallocating?  I forgot to mention earlier that I'm generating the waveform for my analog output by repeatedly reading X lines of a text file (Read from Spreadsheet.vi) every time my AO loop iterates.  If I'm understanding you correctly, you're saying that I should create a dummy array of length X and replace its values with those output by Read from Spreadsheet at the beginning of each loop iteration?

 

Thanks,

-B

 

Message 4 of 8

Christian,

 

Thanks for the reply.  In your example, wouldn't the accuracy of the start time of analog input be limited by the length of time the AO loop takes to complete each iteration (100 msec for 100 samples @ 1kHz)? Since the AO while loop counter only updates once each iteration, I'd need to make that loop iterate once every millisecond to achieve the accuracy I need, correct? 

 

I'll give this a try later, but I'm not sure if my computer can write samples to the output buffer that quickly.

 

Thanks,

-B

Message 5 of 8

In a cyclic voltammetry project I did a while back, I needed to closely correlate the stimulus with the response. In that app I clocked my AI samples using the AO sample clock, so every reading was taken at the same time as an output sample. There are shipping examples that show how to sync input samples to outputs.
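A hedged sketch of that idea using the modern Python nidaqmx API (not what was available on LV 7.1 + DAQmx 8.0, but the same technique; "Dev1" and the channel names are placeholders, and this requires actual NI hardware to run):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 1000.0
N = 1000

with nidaqmx.Task() as ao, nidaqmx.Task() as ai:
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)

    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # Key line: the AI task uses the AO sample clock as its timing source,
    # so every input sample is hardware-aligned with an output sample.
    ai.timing.cfg_samp_clk_timing(
        RATE,
        source="/Dev1/ao/SampleClock",
        sample_mode=AcquisitionType.CONTINUOUS,
    )

    ai.start()          # AI arms first and waits for AO clock edges
    ao.write([0.0] * N)
    ao.start()          # starting AO also starts AI sampling
    data = ai.read(number_of_samples_per_channel=N)
```

Because both tasks share one hardware clock, no PC timestamp is needed at all: sample i of the input corresponds to sample i of the output by construction.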

 

Ben

Message 6 of 8

bme123 wrote:


could you tell me where you read this or how you figured it out (just curious)?


I don't have any official source for this. The timestamp data type itself uses 64 bits for the fractional part, so in principle it can represent resolution far finer than 16 ms, but my understanding is that LV is limited on Windows because it uses an API function that relies on the system clock, which has that resolution. A Google search should turn up references to it. In any case, you can easily verify this yourself by running the primitive in a loop with a 1 ms delay and building the results into an array: you will see the results jump in ~16 ms intervals.

 


Could you also let me know if this is what you are referring to by preallocating?


Yes, that's one example of working in memory instead of on disk. Note that repeatedly reading from a file (or writing to it) can significantly hurt your performance, but knowing exactly what to do requires some experience. I don't have much experience with timers and synchronization, so I suggest you follow Ben's advice.


Message 7 of 8

"I'd need to make that loop iterate once every millisecond to achieve the accuracy I need, correct?"

 

No, because the write command doesn't take 100 ms. It only needs a short time; then the loop iterates and waits before the write command can start again (triggered by the DAQ hardware).

So the limitation is that you only have 10 possible, but quite exact, starting points per second.

But I thought it would not matter, because it is started by hand anyway.

 

It would be a rather simple solution. But whether it is exact to 1 ms... With additional timing hardware you can, of course, do better.

 

[Attachment: labview.png]

Oh, and you have to initialize the reading before your trigger, because the init would take too long.

Message Edited by CtheR on 02-25-2010 01:08 PM
Message 8 of 8