Actor Framework Discussions


The danger of time-delayed messages?

Hi AF Experts,

I'd like to be sure about the best way to have an actor perform an action periodically.

I was using the time-delayed functionality and found some strange behaviour that I think we've explained, but I'd like to know what you think about it.

The time-delayed VI works with a notifier whose timeout triggers the message send, all inside a while loop.

To me, this timeout works like a wait function. So the real loop iteration duration is (the timeout) + (the duration of the other actions in sequence in the loop).

-> So the messages are not enqueued at the rate we expected, but at a slightly lower rate (a longer period).
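The reasoning above can be sketched in Python (LabVIEW code is graphical, so this is only a text-form illustration with hypothetical numbers, not the actual VI):

```python
PERIOD_MS = 100  # desired message period

def drifting_schedule(n_msgs, work_ms):
    """Send times (ms) for a naive 'timeout, then work' loop.

    The notifier timeout only restarts after the rest of the loop body
    has run, so each iteration takes PERIOD_MS + work_ms and the n-th
    message lags the ideal schedule by n * work_ms.
    """
    t = 0
    times = []
    for _ in range(n_msgs):
        t += PERIOD_MS + work_ms  # timeout + other actions in sequence
        times.append(t)
    return times

ideal = [PERIOD_MS * (n + 1) for n in range(10)]
actual = drifting_schedule(10, work_ms=2)
lag = [a - i for a, i in zip(actual, ideal)]  # 2, 4, 6, ... : linear drift
```

Even a couple of milliseconds of work per iteration makes the lag grow without bound, which is exactly the slow shift measured below.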

I made a simple test project to check this idea.

The actor (AUT) has a method (Action.vi) which we want to be called periodically (100ms).

I compared 3 methods:

- 1 - With the AF time-delayed VI.

- 2 - Through a parallel loop timed by the metronome.

- 3 - Through a parallel 'timed loop'.

Action.vi itself compares the time at which it is actually called with the expected time in a perfect world (based on the 100 ms period).

The result appears in the chart graph.

Capture.PNG

As you can see, 2 and 3 are stable (2 has more jitter), BUT 1 presents a slow shift. That shows that the emission period of the time-delayed function is slightly longer than what we want.

Do you think there is something wrong in this reasoning?

(the project used for the tests is attached)

Message 1 of 13

Your assessment is correct. The period for a Time-Delayed Message will drift over time. If you need rigid timing, you should use one of the other mechanisms you've described.

Message 2 of 13

If you are not on a real-time system (i.e. Windows, Mac or desktop Linux), anything less than a quarter second is subject to significant error bars anyway. On an RTOS, you'd put the message generation into a Timed Loop and get it on a rigorous schedule. That will generate the messages at a fixed rate... it says *nothing* about the rate those messages get handled unless that too is deterministic.

Message 3 of 13

One can correct for (i.e., eliminate) this drift by recording a "zero" time and calculating the required timeout to reach the next "metronome tick" (using the mod function). One still has jitter, but no long-term drift.

It looks like this:

Metronome timing.png
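In text form, the mod trick might look like the following Python sketch (hypothetical names; the original is a LabVIEW block diagram, so this only illustrates the arithmetic):

```python
PERIOD_MS = 100  # metronome period

def next_timeout(zero_ms, now_ms, period_ms=PERIOD_MS):
    """Timeout that lands the loop on the next metronome tick.

    Instead of always waiting a full period, wait only the remainder
    to the next multiple of the period past the recorded 'zero' time,
    so per-iteration errors never accumulate.
    """
    elapsed = now_ms - zero_ms
    return period_ms - (elapsed % period_ms)

def drift_free_schedule(n_msgs, work_ms, zero_ms=0):
    """Simulated send times: do the work, then wait next_timeout()."""
    t = zero_ms
    times = []
    for _ in range(n_msgs):
        t += work_ms                   # other actions in the loop body
        t += next_timeout(zero_ms, t)  # wait until the next tick
        times.append(t)
    return times

# With 2 ms of work per pass, sends stay locked to 100, 200, 300, ... ms
```

The per-iteration work still shifts each individual send (jitter), but because the timeout is always computed back from the fixed zero time, the error resets every tick instead of accumulating.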

Message 4 of 13

You can only "correct" it if it is wrong. It is a behavior. It might not be the behavior you want, but I do not consider it to be a bug.

Message 5 of 13

Scientific meaning of "correct", as in "make an adjustment for", not "correct a bug".  The OP can make a modified version of the Time-Delayed message feature that gives driftless timing.

Message 6 of 13

Ah. I see. Perhaps we should check the scientific meaning of "context"? 

Message 7 of 13

Though not a bug, it would perhaps be better if the Time-Delayed Message did adjust itself to eliminate drift. That behaviour is sometimes useful to have, is more intuitive, and adds no meaningful overhead.

Message 8 of 13

Ok, thank you for the answers. It confirms what we thought.

I think we will develop and use our own time-delayed VI based on something like James's solution. I don't like to modify the framework directly, as we often move our projects from one computer to another.

Anyway, I also think it would be nice to have the AF time-delayed function work with no drift, since the solution is very simple (for the next version of LV). I can't find any use case where the current behaviour would be better than a no-drift solution.

Message 9 of 13

Great post!

I was curious why, in one of my recent projects, I was seeing the timestamps on different chart displays drift over time.

I had multiple charts whose data was produced by different actors, each using the time-delayed message to fire a message to read data.

Over the first day things were ok, the timestamps were pretty much in sync. By the second day, I noticed a few seconds difference in the timestamps between the charts. By the fourth day, it was into the minutes of difference.
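As a rough sanity check (with purely hypothetical figures, since the post doesn't give the actual period or loop overhead), even a sub-millisecond overshoot per message accumulates into minutes per day:

```python
# Hypothetical figures: 100 ms period, 0.5 ms extra per iteration
period_ms = 100
overshoot_ms = 0.5

msgs_per_day = 24 * 3600 * 1000 / (period_ms + overshoot_ms)
drift_per_day_s = msgs_per_day * overshoot_ms / 1000
# roughly 430 seconds (about 7 minutes) of drift per day
```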

I will look at James' solution as a possible patch to my application.

Thanks for the continued great ideas!

Message 10 of 13