LabVIEW


Millisecond timer with pause

Solved! (solution accepted below)

Hi,

I am very new to LabVIEW and programming.

I am trying to implement a millisecond timer using two Tick Count (ms) functions. The timer must run from the start and pause on a rising edge of a digital signal. Although I am able to achieve this, there is a problem with the accuracy of the timer: it shows at least 2 ms less than the actual value. Moreover, sometimes (especially at the beginning) it also shows odd values like 1 ms. Please let me know if there is a better way to implement the timer, or how I can solve the inaccuracy.  timer_accuracy.PNG
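Since only a screenshot is attached, here is a rough Python approximation of the approach described above, purely to make the discussion concrete. The names and structure are illustrative guesses, not taken from the attached VI: a start tick is read once, a simulated digital line is polled in a loop, and the elapsed milliseconds freeze on the first rising edge.

import time

def run_timer(read_digital_line, poll_interval_s=0.001):
    """Rough text-only stand-in for the VI described above: the elapsed time
    runs from the start and freezes on the first rising edge of the line."""
    start_tick = time.monotonic()          # plays the role of the first Tick Count (ms)
    previous_level = False
    while True:
        level = read_digital_line()        # stand-in for the DAQmx digital read
        if level and not previous_level:   # rising edge detected: pause the timer
            return (time.monotonic() - start_tick) * 1000.0
        previous_level = level
        time.sleep(poll_interval_s)        # like the Wait (ms) inside the loop

# Example: a simulated line that goes high roughly 25 ms after the timer starts
t0 = time.monotonic()
print(run_timer(lambda: time.monotonic() - t0 > 0.025))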

Message 1 of 16

Hi gok,

 


@gok010 wrote:

Please let me know if there is a better way to implement the timer, or how I can solve the inaccuracy.


You forgot to attach the code.

All we got is an image: we cannot edit/run/debug images in LabVIEW!

 

Comments/Improvements:

  • There are terminals with missing labels.
  • There are uninitialized shift registers.
  • No need for the 0/1 constants at the IndexArray node.
  • Why do you convert the millisecond difference to DBL?
  • You can (probably) replace the innermost case structures with Select nodes…
  • There is a 20ms wait inside the loop. You will never measure anything in the 1ms/2ms range (see the sketch after this list)…
  • You can use the output of the wait functions instead of that TickCount inside the loop; they give the same value…
  • The DAQmxDelete belongs outside the case structure!
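To make the point about the loop wait concrete, a back-of-the-envelope Python sketch (not LabVIEW code, with made-up numbers): if the loop only wakes up every 20 ms, a rising edge is noticed at the next iteration at the earliest, so any short interval gets rounded up to a multiple of the loop period.

import math

loop_period_ms = 20.0   # the Wait (ms) inside the loop
for true_ms in (1, 2, 15, 37):
    # the edge is only seen at the next loop iteration after it actually occurs
    measured_ms = math.ceil(true_ms / loop_period_ms) * loop_period_ms
    print(f"true: {true_ms:3d} ms  ->  measured: {measured_ms:.0f} ms")
# true:   1 ms  ->  measured: 20 ms
# true:   2 ms  ->  measured: 20 ms
# true:  15 ms  ->  measured: 20 ms
# true:  37 ms  ->  measured: 40 ms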
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 16

In addition to Gerd's notes on the code, there is a more general flaw in your plan: you are running on a non-real-time operating system (presumably Windows).

 

Any operating system that isn't real-time can and will stop running your program for a few milliseconds from time to time.  Even if you fix all the potential errors in the code itself, Windows might suddenly decide that there's something else more important than LabVIEW to do for 10 ms, or 50 ms, or even 1 second.

 

The solution is that you have to move the timing to the DAQ itself, which can capture and timestamp independently of Windows.  Right now you are capturing the "instant" state of the DAQ at whatever time it gets to the "read" function.  If you change to a read function that instead captures continuously 1000 times a second (or more) into a hardware buffer, then each time you read from it you will instead get multiple time-stamped readings that you could then go back and analyze to actually get the measurement you're trying to make.
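To sketch that idea in text (plain Python with a simulated buffer, since the exact DAQmx calls depend on your hardware and are not shown here): once the samples come from a hardware-timed buffer at a known rate, the edge time falls out of the sample index and no longer depends on when Windows happens to schedule the read.

def rising_edge_time_ms(samples, sample_rate_hz):
    """Time of the first rising edge in a buffered digital capture, derived
    from the sample index rather than from software tick counts."""
    for i in range(1, len(samples)):
        if samples[i] and not samples[i - 1]:
            return i * 1000.0 / sample_rate_hz
    return None  # no rising edge in this buffer

# Simulated 1 kHz capture: the line goes high at sample 23 -> 23 ms
buffer = [False] * 23 + [True] * 77
print(rising_edge_time_ms(buffer, 1000))   # 23.0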

Message 3 of 16

In addition to what has been said, please clarify what you are trying to do:

 

  • You read two digital lines; do you want to measure each independently?
  • Most of your shift registers are not initialized, so they contain stale data from previous runs. What is outside the currently visible code? Is there a top-level loop? Are you using the "continuous run" button? Is this a subVI?
  • On the first iteration, both ticks are read within nanoseconds, so the difference will probably be close to zero.
  • You only need DBL if you expect to accumulate times larger than the range of U32. Are you?
  • If you really want orange, use High Resolution Relative Seconds instead.
  • Please list all requirements. Should falling edges be ignored? What if there is another rising edge later?
  • Where is the terminal of the stop button???
  • You have a glaring race condition due to the reading of the two inputs (start boolean, digital line specs) in parallel to resetting everything to defaults. Whatever happens last wins and the outcome cannot be predicted at all because execution order is not defined.

 

Please attach your entire code and explain exactly how it should be used. For simplicity, change the code to a simulation by substituting the digital reads with a boolean array control and eliminating all DAQ. This way it is much easier to work out the logic!
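In that spirit, here is a minimal text-based simulation of one possible interpretation of the timer logic (Python with made-up names, driven by a boolean array instead of DAQ reads, so it is only a sketch). The open questions above, such as what to do with falling edges or later rising edges, show up as explicit choices in such a simulation.

def simulate_timer(line_samples, dt_ms=1.0):
    """Run pre-recorded digital samples through the timer logic: elapsed time
    accumulates until the first rising edge, then freezes. Falling edges and
    any later rising edges are simply ignored here - exactly the kind of
    requirement question raised above."""
    elapsed_ms = 0.0
    previous = False
    frozen = False
    history = []
    for level in line_samples:
        if not frozen:
            if level and not previous:   # first rising edge: pause the timer
                frozen = True
            else:
                elapsed_ms += dt_ms
        previous = level
        history.append(elapsed_ms)
    return history

# Line is low for 5 samples (5 ms), then goes high: the timer freezes at 5 ms
print(simulate_timer([False] * 5 + [True] * 3))
# [1.0, 2.0, 3.0, 4.0, 5.0, 5.0, 5.0, 5.0]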

Message 4 of 16
Solution
Accepted by topic author gok010

See if this can give you some ideas....

 

altenbach_2-1702403908443.png

 

 

 

Message 5 of 16

Hello Altenbach,

 

Thank you for replying to my post. Your code works well down to about 30 ms, but it is inaccurate when measuring less than 30 ms.

Message 6 of 16

Hi gok,

 


@gok010 wrote:

your code works well down to about 30 ms, but it is inaccurate when measuring less than 30 ms


In which way is it inaccurate? Can you provide some numbers/examples?

The measurement limit/accuracy is ~2 ms because of the 1 ms wait inside the loop (without taking into account the influences introduced by the OS).

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 7 of 16

Hi,

I have tried to acquire in hardware-timed single-point mode with a sample rate of 1000, but it still fails to measure values accurately for times less than 30ms.

Message 8 of 16

Yes. For example, when the time of the rising edge is less than 30 ms, the timer shows a value of zero.

Message 9 of 16

Hi gok,

 


@gok010 wrote:

Yes. For example, when the time of the rising edge is less than 30 ms, the timer shows a value of zero.


Why do you think this is a problem of Altenbach's code?

 


@gok010 wrote:

I have tried to acquire in hardware-timed single-point mode with a sample rate of 1000, but it still fails to measure values accurately for times less than 30ms.


Does your hardware allow for such sample rates in single-point mode?

Does your hardware allow for continuous measurements using hardware timing? (Why don't you use this when possible?)

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 10 of 16