02-27-2006 08:52 AM
02-28-2006 02:51 PM
03-02-2006 09:34 AM
Ryan,
Thank you for your answer. I have already solved the problem. I found that DAQ Assistant2.vi has a control parameter called timeout, which is set to 10 s by default; it is missing from the detailed help for this VI.
Regards,
Jarda T.
03-03-2006 09:27 AM
03-26-2008 08:04 AM
I am having a similar problem. I am using the DAQ Assistant to acquire signals on demand, with the loop set to iterate every 25 ms, which I thought would give me the required sample rate of 40 samples per second. However, when I open the associated log file, it shows only 8 readings per second. I have attached the VI. Can anyone see what is wrong here? In a previous post I was told to try a producer/consumer architecture. I did, but it did not improve my logging rate, so I now suspect the DAQ Assistant itself. If anyone has had and solved a similar problem, please shed some light.
Thanks.
03-26-2008 08:27 AM
03-26-2008 08:47 AM - edited 03-26-2008 08:56 AM
EDIT: I may be mistaken about the DAQ Assistants creating and clearing the task on every iteration. Looking closer at the subVI, I see it uses First Call? primitives to execute those tasks only the first time the subVI is called. But I would still recommend converting the Express VIs to the basic DAQmx functions.
You may also want to put some tick count (timer) functions in your code, with sequence structures before and after the different code parts such as the DAQ Assistants and file writes, to see where the code is spending the most time. I would have thought the file write might be taking longer than you want, but you said the problem went away when you removed the DAQ Assistants and put in a random number generator, implying that the file-write portions are okay. If that turns out not to be the case, it is probably a good idea to move the file writes into their own loop and pass the data through a queue; look at the producer/consumer architecture for how to do this.
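Since LabVIEW is graphical, here is the producer/consumer idea sketched in text form as a Python analogy (the queue and thread names are illustrative, not part of the original VI): the acquisition loop only reads and enqueues data, while a separate loop does the file writing, so slow disk I/O can never stall the sample rate.

```python
import queue
import threading

def producer(q, n_samples):
    # Acquisition loop: only read data and enqueue it; never touch the file here.
    for i in range(n_samples):
        sample = i * 0.1  # stand-in for a DAQ read
        q.put(sample)
    q.put(None)  # sentinel: acquisition finished

def consumer(q, log):
    # Logging loop: runs independently, so a slow write only grows the
    # queue temporarily instead of delaying the next acquisition.
    while True:
        sample = q.get()
        if sample is None:
            break
        log.append(sample)

q = queue.Queue()
log = []
t = threading.Thread(target=consumer, args=(q, log))
t.start()
producer(q, 40)   # 40 samples, i.e. one second's worth at 40 S/s
t.join()
print(len(log))   # all 40 samples reach the log
```

In LabVIEW terms, the producer is the loop containing the DAQ read, the consumer is the loop containing Write To Measurement File, and the queue is an Obtain Queue/Enqueue/Dequeue trio shared between them.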
03-26-2008 11:20 AM
Thanks for the info. By breaking up the DAQ Assistant and doing what you suggested:
"I would recommend setting up your DAQ tasks before the loop, read the task only inside the loop, and clear the task after the loop"
This has solved my problem. I am now logging at the rate I require.
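For anyone finding this thread later, the structure that fixed it (create the task once before the loop, only read inside the loop, clear once after) can be sketched in text form. This is a Python sketch with a simulated task object, not actual DAQmx code; the class and method names are placeholders for the DAQmx Create/Read/Clear VIs:

```python
class SimulatedTask:
    """Stand-in for a DAQmx task; create and clear are the expensive steps."""

    def __init__(self):
        self.created = True    # expensive setup: done ONCE, before the loop

    def read(self):
        return 1.23            # cheap: safe to call on every iteration

    def clear(self):
        self.created = False   # teardown: done ONCE, after the loop

task = SimulatedTask()                        # setup before the loop
readings = [task.read() for _ in range(40)]   # only read inside the loop
task.clear()                                  # clear after the loop
print(len(readings))  # 40 readings, none delayed by per-iteration setup
```

The point is the shape, not the numbers: when the create/clear cost is paid on every iteration instead of once, the loop period balloons and the effective logging rate drops, which matches the 40 S/s versus 8 S/s symptom above.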