
Sample rate vs. samples per channel

Hoping someone can shed some light on this for me. I'm still a little lost as to exactly what sample rate and samples per channel mean in LabVIEW as opposed to other data acquisition software and hardware I've used in the past. I'm using LabVIEW 10 and a cDAQ-9174.

 

If you look at the attached screenshot, the way I read it, this task will tell the module in the cDAQ to start acquiring samples. The module will acquire 10 samples, 100 times a second; or, to state it a different way, it acquires 1000 samples per second. It is my understanding that the module and the chassis itself have some buffer room on them. Once the loop iterates, the Read VI pulls the samples from the buffer, displays the voltage on the indicator labeled Voltage, and enqueues the data. What exactly is the data that gets displayed and enqueued? Since the polymorphic selector is set to 1Chan 1Samp, is it just a sample grabbed at random, or is there some rhyme or reason to it? If I instead set it to 1Chan NSamp, would I get a 1D array of all 1000 samples, or just an array containing an average of every 10 samples? This is the part where I get confused.
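In case it helps to see my mental model written out, here is a rough text sketch of what I think the task is doing, using the nidaqmx Python API instead of the block diagram (the channel name is just a placeholder, and the comments mark exactly what I'm unsure about):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder channel, not my real module
    # 1000 samples per second overall, with the module/chassis buffering for me
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=1000)
    task.start()

    one_sample = task.read()                                   # "1Chan 1Samp" -- which sample is this?
    ten_samples = task.read(number_of_samples_per_channel=10)  # "1Chan NSamp" -- raw samples, or averages?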

 

Any clarity that someone can shed would be helpful.

 

Thanks,

 

-Ian

Message 1 of 9

Sample rate is how fast you sample a signal.  Samples per channel is the size of the buffer allotted per channel.  If you have a sample rate of 1000 samples/s and 1000 samples per channel, your buffer will overflow if it is not read before 1 s has elapsed.  You always want to set your samples per channel to a few times your sample rate.
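To put numbers on it (plain arithmetic, nothing hardware-specific):

rate = 1000             # samples per second, per channel
buffer_size = 1000      # "samples per channel" given to DAQmx Timing

headroom = buffer_size / rate   # seconds until overflow if nothing is read
print(headroom)                 # 1.0 s of headroom

safer_buffer = 5 * rate         # a few times the rate, as suggested above
print(safer_buffer / rate)      # 5.0 s of headroom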

Message 2 of 9


 

 

Just for clarification,  regarding the sample clock, you said that "samples per channel" is the size of the buffer per channel.  Is this true for CONTINUOUS, FINITE, and H/W TIMED SINGLE POINT, or, only for CONTINUOUS sample mode? 

 

In the case where you cause the buffer to overflow by setting the samples per channel less than the sampling rate, do you lose data, or do you get an error?  What happens when you overflow the buffer?

 

With my application, I am collecting a voltage using virtually the identical VI shown above, and I am then sending that voltage value to some calculations which then generate a string command that is sent to my external device through a serial connection.  Essentially, I am converting voltage measurements to device commands, e.g. high voltage means "go FAST", and low voltage means "go SLOW" etc.  I turn it on, voltage varies, and the commands flow.

 

I am confused because, it appears that the commands are lagging the voltage readings, meaning that if the voltage increases (force sensor/strain gauge) I should see my device change speed, but, I do not see it every time.  Instead, SOMETIMES it changes speed immediately, but, other times, it takes a long time, like more than 1 second.  Sometimes, it appears to go the wrong direction, and then "catch up."  For example, if the voltage is increased, the external device may slow down initially, and then speed up later, say over a 1 to 2 second time period.  This makes me think that there is some buffer that is collecting voltage values faster than my command algorithm can send them to the external device.  I am using the NI 9211 in cDAQ 9174 chassis, to collect the voltage data, which I was told should be several times per second (the 9211 has 4 channels and can sample 14/s, so, each channel would be about 3/s, according to my understanding). 

 

So, I am wondering if my algorithm that creates and writes the string commands to the external device is taking more time than it takes to collect one voltage data point.  What I want is for my VI to measure one voltage, and create/send that command from that voltage value, BEFORE it measures another voltage value.  In other words, I want EVERY voltage value to be converted to a command for the device.  How can you tell the relative speed of two parts of a VI, and how can I determine whether the voltage data is accumulating in a buffer and then sending commands "late"?  In my VI, the Samples per Channel, and the Rate are both set by the user on the FRONT PANEL.
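To make the intent concrete, here is a rough sketch of the loop I have in mind, written against the nidaqmx Python API rather than as a block diagram (the channel name and send_command are stand-ins for my real hardware and serial code), with timing added so I can compare how long the read takes versus the command generation:

import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

def send_command(voltage):
    # stand-in for my real "convert voltage to a string command and write it over serial" code
    pass

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder channel
    # the 9211 is slow (~14 S/s aggregate), so only a few samples per second per channel
    task.timing.cfg_samp_clk_timing(rate=3.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    for _ in range(100):
        t0 = time.perf_counter()
        voltage = task.read()      # one sample per iteration
        t1 = time.perf_counter()
        send_command(voltage)      # command built from THIS sample, before the next read
        t2 = time.perf_counter()
        print(f"read took {t1 - t0:.3f} s, command took {t2 - t1:.3f} s")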

 

Thanks for your thoughts.

Message 3 of 9

Hi,

 

Please see my responses inline below, after each quoted paragraph.

 

 

Just for clarification,  regarding the sample clock, you said that "samples per channel" is the size of the buffer per channel.  Is this true for CONTINUOUS, FINITE, and H/W TIMED SINGLE POINT, or, only for CONTINUOUS sample mode? 

For all modes.

 

In the case where you cause the buffer to overflow by setting the samples per channel less than the sampling rate, do you lose data, or do you get an error?  What happens when you overflow the buffer?

When an overflow occurs, you normally get an error. However, there are different settings you can apply to your DAQ buffer to bypass the error.
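For example, if I remember correctly there is a DAQmx Read property that lets the driver overwrite unread samples instead of raising the overflow error. A rough sketch with the nidaqmx Python API (the exact property and enum names here are from memory, so treat them as an assumption):

import nidaqmx
from nidaqmx.constants import AcquisitionType, OverwriteMode

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder channel
    task.timing.cfg_samp_clk_timing(rate=1000.0, sample_mode=AcquisitionType.CONTINUOUS)
    # assumed property: let old, unread samples be overwritten rather than erroring out
    task.in_stream.overwrite = OverwriteMode.OVERWRITE_UNREAD_SAMPLES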

 

With my application, I am collecting a voltage using virtually the identical VI shown above, and I am then sending that voltage value to some calculations which then generate a string command that is sent to my external device through a serial connection.  Essentially, I am converting voltage measurements to device commands, e.g. high voltage means "go FAST", and low voltage means "go SLOW" etc.  I turn it on, voltage varies, and the commands flow.

 

I am confused because, it appears that the commands are lagging the voltage readings, meaning that if the voltage increases (force sensor/strain gauge) I should see my device change speed, but, I do not see it every time.  Instead, SOMETIMES it changes speed immediately, but, other times, it takes a long time, like more than 1 second.  Sometimes, it appears to go the wrong direction, and then "catch up."  For example, if the voltage is increased, the external device may slow down initially, and then speed up later, say over a 1 to 2 second time period.  This makes me think that there is some buffer that is collecting voltage values faster than my command algorithm can send them to the external device.  I am using the NI 9211 in cDAQ 9174 chassis, to collect the voltage data, which I was told should be several times per second (the 9211 has 4 channels and can sample 14/s, so, each channel would be about 3/s, according to my understanding). 

My guess is that the issue is in how you are reading your array of measurements.  For example, every time you read, you read 1000 samples, and the sample of interest could be buried somewhere in that array.  Each time you do a read, you should take the min, max, average, etc. of the array to process the information.  You can also process the array one element at a time to detect changes.
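Something along these lines, as a plain-Python sketch (the function names and threshold are made up):

def summarize(chunk):
    # chunk is whatever one DAQmx Read returned, e.g. a list of 1000 voltages
    return min(chunk), max(chunk), sum(chunk) / len(chunk)

def first_big_change(chunk, threshold=0.1):
    # walk the array one element at a time and report the first significant jump
    for i in range(1, len(chunk)):
        if abs(chunk[i] - chunk[i - 1]) > threshold:
            return i
    return None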

 

So, I am wondering if my algorithm that creates and writes the string commands to the external device is taking more time than it takes to collect one voltage data point.  What I want is for my VI to measure one voltage, and create/send that command from that voltage value, BEFORE it measures another voltage value.  In other words, I want EVERY voltage value to be converted to a command for the device.  How can you tell the relative speed of two parts of a VI, and how can I determine whether the voltage data is accumulating in a buffer and then sending commands "late"?  In my VI, the Samples per Channel, and the Rate are both set by the user on the FRONT PANEL.

 

Thanks for your thoughts.

Message 4 of 9

Sorry to interfere in your chit-chatting 😉 but I also have a question regarding DAQmx timing (sample clock). Correct me if I'm wrong. For the DAQ hardware, the sample clock means: the rate is how fast the A/D converter digitizes my data into the buffer, and the samples per channel input on the same DAQmx Timing VI sets the size of the buffer that the data is written to. Then, when I read the data with DAQmx Read, the samples per channel I request there is how many samples I remove from the buffer. So I should empty the buffer as fast as I can, so that I neither overflow the buffer nor end up in a situation where I read from an empty buffer.

 

 

Message 5 of 9

If I am understanding you correctly, I believe you are accurately describing the functionality of the sample clock. From our LabVIEW Help, the DAQmx Timing VI (which is where the sample clock is configured) "Configures the number of samples to acquire or generate and creates a buffer when needed. The instances of this polymorphic VI correspond to the type of timing to use for the task."
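In other words, sketched here with the nidaqmx Python API rather than the LabVIEW diagram (the buffer-size query is my assumption about how that driver exposes it):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder channel
    # rate = how fast samples go INTO the buffer;
    # samps_per_chan = the buffer size you request (DAQmx may round it up)
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=4000)
    print(task.in_stream.input_buf_size)  # assumed property: the buffer DAQmx actually allocated
    task.start()
    for _ in range(10):
        # each read pulls this many samples OUT of the buffer; keep up with the
        # rate so the buffer neither overflows nor is read while empty
        data = task.read(number_of_samples_per_channel=1000)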

 

I hope this clears things up.

 

Regards,

 

Doug B

Applications Engineer
National Instruments
Message 6 of 9

Hi,

We are also having issues with DAQmx Sample Clock. What units is the rate input in? We assumed this would be in Hz (1/s), but the data that is output does not reflect this. Thanks! 

Message 7 of 9

You can expect better, more useful help if you present your question/problem in a better, more useful way.  This usually involves some moderate amount of effort to describe the problem, discuss what you've tried and observed, and then post code and/or default data whenever possible.   Help us help you.   That said...

 

1. Yes, the sample rate units are Hz (samples per second).

2. Some devices only support a relatively high *minimum* rate.  Many (most?  nearly all?) devices will silently (mostly) coerce an out-of-range sample rate request into a valid legal value. 

    For example, I know some cDAQ bridge interface modules have a minimum rate of about 1.6 kHz.  That's what you'll end up getting if you ask for a lower rate, even as low as 1 Hz.

3. If you read from the device in waveform format, the "dt" field will reflect the true, actual sample rate.  You can also drop a DAQmx Timing property node to query the sample rate after setting it.  This will also return the true, actual rate.
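    For instance, in text-API form (nidaqmx Python here; the same idea applies to a DAQmx Timing property node in LabVIEW):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")  # placeholder channel
    task.timing.cfg_samp_clk_timing(rate=1.0,              # ask for 1 Hz...
                                    sample_mode=AcquisitionType.CONTINUOUS)
    print(task.timing.samp_clk_rate)  # ...then query the rate actually in effect; a module with a
                                      # high minimum rate reports the coerced value here, not 1.0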

 

 

-Kevin P

Message 8 of 9

I found that when I use a 10 kS/s sampling rate and 1000 samples per channel, the DAQ card doesn't saturate as long as I read the same number of samples (1000) each time. I assume that as long as the samples per channel can be read in time, the buffer won't overflow.

 

I'm also still learning. Thanks.

Message 9 of 9