Multifunction DAQ

How can I use the internal clock to measure the time of acquiring?

Hi all,
    I'm using PCIe-6251 board to acquire the analog input from the output port of itself. I wanna make this run as fast as possible, and measure its rate and then calculate the time variance and standard deviation. I store the data aquired into the queue and would like to save the time into it as well. What I did at first is to use a real-time tick count to record the time of acquiring, and I've several problems:
    1. Is the real-time tick count the internal clock of the board or the reference clock of the software? I think it may be generated by the software, which may bring in some extra errors. So how can I use the internal clock so that it records the "real" time of acquisition? Since I wanna make it very accurate (microsecond), so I don't think the software timer is available.
    2. Can I change the format of writing data into the file? The way it writes is not two columns of voltage and time, but one voltage followed by one time, and then the other pair. I've to write another code to deal with the array.
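(For illustration, the rearrangement in question 2 amounts to de-interleaving a flat array; a minimal Python sketch — my real code is a LabVIEW VI, and the alternating voltage/time layout is as described above:)

```python
# De-interleave a flat [v0, t0, v1, t1, ...] sequence into two columns.
def deinterleave(flat):
    voltages = flat[0::2]  # every even index holds a voltage
    times = flat[1::2]     # every odd index holds a timestamp
    return voltages, times

v, t = deinterleave([1.0, 0.0, 1.1, 0.001, 0.9, 0.002])
# v == [1.0, 1.1, 0.9]; t == [0.0, 0.001, 0.002]
```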
   Thank you so much.
 
Best wishes
Bo
------------------------
My blog Let's LabVIEW.
Message 1 of 9
P.S. I'm using NI LabVIEW 8.5 under Windows XP.
Thanks


Message Edited by foolooo on 02-08-2008 11:17 AM
Message 2 of 9
To summarize my problem: usually the internal clock is used to "generate" signals, but can I use it to "measure" (i.e., timestamp) the acquired signals? If so, how can I implement that? Thank you.
Message 3 of 9
Hi,

In response to your first post: for time-critical systems, I recommend that you consider using LabVIEW Real-Time. More information can be found here:

http://www.ni.com/realtime/

When you use a DAQmx task to measure signals, hardware timing is used to acquire them. The samples are then stored in a buffer, which your program reads. There are various DAQmx examples under Help>>Find Examples that demonstrate this.
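(As an illustration of that pattern, here is a minimal Python sketch; it simulates the buffer with a queue instead of calling the real DAQmx driver, and the sample values are made up:)

```python
from collections import deque

# Simulated version of the buffered DAQmx pattern: the hardware sample
# clock deposits samples into a buffer; the program drains it in chunks.
buffer = deque()

def hardware_fills(samples):
    buffer.extend(samples)       # done by the driver/hardware in reality

def program_reads(n):
    # Read up to n samples from the front of the buffer.
    return [buffer.popleft() for _ in range(min(n, len(buffer)))]

hardware_fills([0.1, 0.2, 0.3, 0.4, 0.5])
chunk = program_reads(3)   # -> [0.1, 0.2, 0.3]; two samples stay buffered
```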

If you take a look at those examples and still have any specific questions, let me know and I'll do my best to help.

Best Regards,

Ian Colman
Applications Engineer
National Instruments UK & Ireland


Message 4 of 9

Hi Ian,

Thanks for your reply; I'm trying to create a new project on the real-time side. But although I could get more accurate timing with LabVIEW Real-Time, I still cannot measure the time variance or determine how accurate the timing is. What I want is to characterize the hardware itself, both its speed and its timing accuracy. Is there any way to do that? Thank you very much.
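(The statistics I mean can be computed from a list of timestamps like this; a minimal Python sketch using the standard library's sample variance and standard deviation, with made-up timestamps:)

```python
import statistics

def interval_stats(timestamps):
    """Mean, variance, and standard deviation of the inter-sample intervals."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return (statistics.mean(intervals),
            statistics.variance(intervals),   # sample variance
            statistics.stdev(intervals))      # sample standard deviation

# Example: four timestamps in seconds; intervals are 1 ms, 1 ms, 1.5 ms.
mean, var, std = interval_stats([0.0, 0.001, 0.002, 0.0035])
```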

Best wishes,

Bo

------------------------
My blog Let's LabVIEW.
Message 5 of 9

Hi Ian,

I'm afraid I didn't create the LabVIEW Real-Time project correctly, because the wizard didn't recognize my board, a PCIe-6251 (M Series). I just want to record the time of each acquisition and write the values to a file continuously. The timestamps I get now come from software, which may not be reliable. Do you have any idea how I can record the time from the hardware? I'm new to LabVIEW, so sorry to trouble you all. Thanks. (Attached is a picture of my code.)

Best wishes

Bo

 



Message Edited by foolooo on 02-15-2008 10:26 AM
Message 6 of 9

Hi Ian and others,

After a short study of LabVIEW Real-Time, I understand that I would need to change my OS to an RT target. Is there any way to avoid that? I don't need to eliminate the Windows timing error completely for now; I just want to get timing information together with the acquired data, and I'd like it to come from the hardware. Thank you all for your kind replies.

Best wishes

Bo

Message 7 of 9
Hi Bo,

If you simply want your samples to have a time associated with them, just acquire your data in waveform format. The software provides the start time t0, and the hardware sample clock provides the sample interval dt automatically, so you get accurate timing associated with your samples.
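(In text form, the waveform timing model amounts to the following; a minimal Python sketch where t0 and dt stand in for the values carried by the LabVIEW waveform data type:)

```python
def sample_times(t0, dt, n):
    """Per-sample timestamps implied by a waveform:
    t0 comes from software, dt from the hardware sample clock."""
    return [t0 + i * dt for i in range(n)]

# 4 samples at 100 kS/s (dt = 10 microseconds), starting at t0 = 0.0
times = sample_times(0.0, 1e-5, 4)
```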

Hope this helps,

Ian
Message 8 of 9

Hi Ian,

Thanks for your kind help! It works.

Best wishes

Bo

Message 9 of 9