09-23-2016 09:42 AM
Hi AF Experts,
I'd like to be sure about the best way to have an actor perform an action periodically.
I was using the time-delayed functionality and I found some strange behaviour that I think we have explained, but I'd like to know what you guys think about it.
The time-delayed send works with a notifier whose timeout triggers the message send, all inside a while loop.
To me, this timeout works like a wait function, so the real loop iteration duration is (the timeout) + (the duration of the other actions in sequence in the loop).
-> So the messages are not enqueued at the rate we expected, but at a slightly lower rate (a longer period).
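A minimal arithmetic sketch of that accumulation (Python here, since LabVIEW block diagrams can't be pasted as text; the 5 ms of per-iteration work is an assumed figure):

```python
PERIOD_MS = 100  # desired period
WORK_MS = 5      # assumed time spent on the other actions in the loop

def fire_times(n):
    """Times (ms) at which messages are sent when each iteration
    waits the full timeout and then does WORK_MS of work."""
    t, out = 0, []
    for _ in range(n):
        t += PERIOD_MS + WORK_MS  # real iteration = timeout + work
        out.append(t)
    return out

ideal = [(i + 1) * PERIOD_MS for i in range(10)]
drift = [a - b for a, b in zip(fire_times(10), ideal)]
print(drift)  # error grows linearly: [5, 10, 15, ..., 50]
```

Every iteration adds the full work time on top of the timeout, so the error never stops growing.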
I made a simple test project to check this idea.
The actor (AUT) has a method (Action.vi) which we want to be called periodically (100ms).
I compared 3 methods:
- 1 - With the AF time delayed VI.
- 2 - Through a parallel loop timed by the metronome.
- 3 - Through a parallel Timed Loop.
The Action.vi itself compares the time at which it is called with the expected time in a perfect world (based on the 100 ms period).
The result appears in the chart graph.
As you can see, 2 and 3 are stable (2 has more jitter), BUT 1 shows a slow drift. That shows that the emission period of the time-delayed function is a little longer than what we want.
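The difference between the methods can be sketched with a simulated clock (Python; the 3 ms per-iteration overhead is an assumed figure, and the Timed Loop is treated like the metronome for this purpose):

```python
PERIOD = 100   # ms
OVERHEAD = 3   # assumed ms of work per iteration

def errors(method, n=50):
    """Timing error (actual - ideal call time) at each tick, in ms,
    for a simulated clock."""
    now, errs = 0, []
    for i in range(1, n + 1):
        if method == "fixed_timeout":            # method 1: wait PERIOD, then work
            now += PERIOD + OVERHEAD
        else:                                    # methods 2/3: wait for next multiple
            now = (now // PERIOD + 1) * PERIOD + OVERHEAD
        errs.append(now - i * PERIOD)
    return errs

print(errors("fixed_timeout")[-1])  # 150: error grows without bound
print(errors("metronome")[-1])      # 3: error stays bounded by OVERHEAD
```

The metronome-style wait re-synchronizes to the schedule every iteration, which is why methods 2 and 3 show jitter but no shift.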
Do you think there is something wrong in this reasoning?
(the project used for the tests is attached)
09-23-2016 12:26 PM
Your assessment is correct. The period for a Time Delay Message will drift over time. If you need rigid timing, you should use one of the other mechanisms you've described.
09-23-2016 01:16 PM
If you are not on a real-time system (i.e. Windows, Mac or desktop Linux), anything less than a quarter second is subject to significant error bars anyway. On an RTOS, you'd put the message generation into a Timed Loop and get it on a rigorous schedule. That will generate the messages at a fixed rate... it says *nothing* about the rate at which those messages get handled unless that too is deterministic.
09-23-2016 02:25 PM
One can correct for (i.e. eliminate) this drift by recording a "zero" time and calculating the timeout required to reach the next "metronome tick" (using the mod function). One still has jitter, but no long-term drift.
It looks like this:
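In textual form, the idea looks roughly like this (a Python sketch, not the actual VI; `do_work` is a placeholder for the message send):

```python
import time

PERIOD = 0.1  # 100 ms

def drift_free(n, do_work):
    """Each timeout is computed from a recorded 'zero' time, so
    per-iteration overshoot never accumulates (jitter remains)."""
    t0 = time.monotonic()
    for _ in range(n):
        do_work()
        elapsed = time.monotonic() - t0
        # Sleep until the next metronome tick; the mod keeps the
        # wait in (0, PERIOD] even if do_work overran a tick.
        time.sleep(PERIOD - (elapsed % PERIOD))

start = time.monotonic()
drift_free(10, lambda: time.sleep(0.005))
total = time.monotonic() - start
# total stays close to 10 * PERIOD regardless of how long do_work takes
```

Because each wait targets an absolute multiple of the period rather than a relative delay, errors from one iteration are absorbed by the next instead of accumulating.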
09-23-2016 02:29 PM
You can only "correct" it if it is wrong. It is a behavior. It might not be the behavior you want, but I do not consider it to be a bug.
09-23-2016 02:53 PM
Scientific meaning of "correct", as in "make an adjustment for", not "correct a bug". The OP can make a modified version of the Time-Delayed message feature that gives driftless timing.
09-23-2016 03:14 PM
Ah. I see. Perhaps we should check the scientific meaning of "context"?
09-24-2016 06:28 AM
Though not a bug, it would perhaps be better if the Time-Delayed Message did adjust itself to eliminate drift. That behaviour is sometimes useful, is more intuitive, and adds no meaningful overhead.
09-26-2016 02:32 AM
Ok, thank you for the answers. It confirms what we thought.
I think we will develop and use our own time-delayed VI based on something like James' solution. I don't like to modify the framework directly, as we often move our projects from one computer to another.
Anyway, I also think it would be nice to have the AF time-delayed function work with no drift, as the solution is very simple (for the next version of LV). I can't find any use case where the current behaviour would be better than a no-drift solution.
09-27-2016 10:13 AM
Great post!
I was curious why, in one of my recent projects, I was seeing the timestamps on different chart displays drift over time.
I had multiple charts whose data was produced by different actors using the time-delayed message to fire a message to read data.
Over the first day things were OK; the timestamps were pretty much in sync. By the second day, I noticed a few seconds of difference in the timestamps between the charts. By the fourth day, it was minutes of difference.
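A quick back-of-envelope check (Python; the 2 s/day drift and 100 ms period are assumed figures at the scale described above) shows how little per-iteration overhead it takes to produce this:

```python
PERIOD_MS = 100          # assumed message period
DRIFT_S_PER_DAY = 2.0    # assumed observed drift

ticks_per_day = 24 * 3600 * 1000 // PERIOD_MS    # 864,000 ticks
overhead_ms = DRIFT_S_PER_DAY * 1000 / ticks_per_day
print(overhead_ms)  # ~0.0023 ms: a couple of microseconds of extra
                    # work per iteration drifts seconds per day
```

At hundreds of thousands of ticks per day, even microsecond-scale overhead per send becomes visible on a chart.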
I will look at James' solution as a possible patch to my application.
Thanks for the continued great ideas!