Avoiding Millisecond Counter (Tick Count) Overflow

Hi All & Happy new year to you!
 
I have an application which is designed to be left running for months. Basically I continually read the value of the tick counter within a while loop, if a certain event occurs I store the tick counters value and on subsequent loop iterations i read & subtract the stored count value from the current count value and compare this to a constant.
 
In simple terms I am using the tick counter as a timer to keep track of how long an event has been running for, so if for instance someone scans the wrong barcode I want to display a "scan error" message on an indicator for 10 seconds and then resume back to an "Idle" mode.
 
My problem is that I am aware the LabVIEW tick counter will overflow or reset after approx 50 days, if this occurs during an event like I have described above then there is the possibility that an erroneous timing will occur, this could be critical in my industrial application.
 
Does anyone know how this can be avoided or circumvented?
 
Thanks in advance for your assistance.
 
 
 
 
 
 
Message 1 of 5
Instead of using the tick count, you could use the Get Date/Time In Seconds function, which returns time values with the same resolution (ms), with the additional advantage that it is an absolute value...
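For what it's worth, here is a rough text-language sketch of that idea in C (C only because LabVIEW code is graphical; the POSIX call clock_gettime stands in for Get Date/Time In Seconds):

#include <stdio.h>
#include <time.h>

/* Absolute wall-clock time in seconds, analogous to LabVIEW's
   Get Date/Time In Seconds: no 32-bit rollover to worry about. */
static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    double start = now_seconds();   /* stored when the event occurs */
    /* ... subsequent loop iterations ... */
    double elapsed = now_seconds() - start;
    printf("elapsed = %.3f s\n", elapsed);
    return 0;
}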
Chilly Charly    (aka CC)

Message 2 of 5
Thanks Chilly, I will try that out in a test VI and let you know how I get on.
Message 3 of 5


@ATE Man wrote:
[original question quoted in full; see Message 1 above]

As long as the interval between two related events is not larger than the range of a uint32, there is no problem at all.
If you keep the values as uInt32 and subtract the earlier event time from the current event time with the Subtract function, you will get the correct interval even if there has been a counter rollover. This is a property of standard modulo-2^32 unsigned integer arithmetic, and LabVIEW implements it that way too. Just try it out by subtracting 4294967295 from 10, both set as unsigned int32, and you will see the result is 11, which is the difference between the two numbers when looked at as unsigned int32 values.

If any interval you are interested in can be greater than 4294967295 ms (about 1193 hours, or 49.7 days), then you would have to resort to using the timestamp instead of the tick count, as the previous post suggested.
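A minimal C sketch of the same arithmetic (C only because LabVIEW code is graphical; the tick values are hypothetical stand-ins for Tick Count (ms) readings):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Hypothetical tick counts straddling the rollover point. */
    uint32_t start = 4294967295u;   /* stored just before the counter wraps */
    uint32_t now   = 10u;           /* read just after the wrap */

    /* Unsigned subtraction wraps modulo 2^32, so the elapsed time
       comes out correct even across the rollover. */
    uint32_t elapsed = now - start;
    printf("elapsed = %u ms\n", elapsed);   /* prints 11 */

    /* The 10-second timeout check from the original question. */
    const uint32_t TIMEOUT_MS = 10000u;
    if (elapsed >= TIMEOUT_MS) {
        puts("timeout elapsed: return to Idle mode");
    }
    return 0;
}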


Message Edited by rolfk on 01-04-2006 10:37 AM

Rolf Kalbermatter
My Blog
Message 4 of 5

Thanks Rolfk.

I tried out a little test using constants: my imaginary tick count is at or close to its wrap-around point, and I store its value to mark when my event started.

A little later I subtract that first tick count value from a second one; this second value represents something typical of what I would find just after the wrap-around point.

Sure enough, even with the wrap-around, the difference between the two is correct!
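For instance, with hypothetical values: a start count of 4294967290 (six ticks before the wrap) and a later reading of 5 (five ticks after it) gives 5 - 4294967290 = -4294967285, which modulo 2^32 is 11, the true elapsed time in milliseconds.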

Interestingly, I did find example code someone had written to try and get around this (attached); whether this is a historical thing or a misunderstanding I'm not sure.

It's been a while since I did any computer studies and I should know this, but is this something to do with the unsigned representation and a carry or sign bit?

Thanks for your help!

 

Message 5 of 5