LabVIEW


NI DAQ device analog input onboard buffer continuous acquisition

Solved!

I have AO0 connected to AI0 and AO1 connected to AI1

One output is a cosine function and the other a sine.

 

I am trying to utilize the NI device buffer to acquire/store data

 

I can increase the buffer read time so that the buffer should fill and start overwriting the first values, but I see 150 samples in my buffer of size 100, and the chart doesn't indicate any overwritten or missed data.

 

cwhw112_0-1702516076330.png

 

 

 

The AvailSampPerChan seems to reset when the read occurs

Does the read clear the buffer?

 

The specs for the NI 6251 say the input FIFO is 4095 samples.

I am interpreting this to be the maximum buffer size.

I can put a much larger value in for my buffer size (200,000,000) and don't get an error (go too high and it says memory can't be allocated).

What is the maximum buffer size?

 

The limit on sampling rate seems to be around 750,000 (instead of the spec'd 1 MHz, which causes a USB error).

Increasing my buffer read time to 10 sec, I see 7,684,692 in Samples in Buffer.

With my buffer size set to 750,000 and the rate to 750,000, I see around 743,000 in my Samples in Buffer with the buffer read set to 965 ms (any higher loop rate and I get an error that the application can't keep up, although I can increase the buffer size to get rid of the error).

So it doesn't like Samples in Buffer being greater than the buffer size by 7,000.

Buffer size 10,000, sampling rate 10,000, update rate 999 ms: causes the keep-up error.

Buffer size 1,000, sampling rate 1,000, update rate 10,000 ms: no error (Samples in Buffer greater than buffer size by 9,000).

Buffer size 5,000, sampling rate 5,000, update rate 2,000 ms: sometimes an error, sometimes not.

I can't find a pattern to determine when the error will occur.

 

How much greater does AvailSampPerChan have to be than Buffer size before an error occurs?

Will the error occur before the buffer is filled up and begins overwriting itself?

 

With larger values, my x-axis updates every 1/2 sec but spans 1023, even though 1.023 seconds haven't passed.

I believe these are samples, not milliseconds, and the right side of the axis seems to update to match this idea, but the x-axis autoscale must only keep the highest sample value, showing the previous 1023 (the maximum).

Is my interpretation correct?

 

 

Message 1 of 8

@cwhw112 wrote:

 

The specs for the NI 6251 say the input FIFO is 4095 samples.

I am interpreting this to be the maximum buffer size.

I can put a much larger value in for my buffer size (200,000,000) and don't get an error (go too high and it says memory can't be allocated).

What is the maximum buffer size?


The FIFO is a hardware buffer; the one you can change, and are changing through the API calls, is a "software" buffer. As the FIFO empties out, it fills the buffer (RAM) on your computer. This is what you are allocating.
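If it helps to see the distinction outside of LabVIEW, here is a minimal sketch in the nidaqmx Python API (the LabVIEW VIs sit on the same DAQmx driver); the device name, rate, and numbers are assumptions, not values from your VI:

import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

# The 4095-sample FIFO is fixed hardware on the 6251. The buffer you size
# through DAQmx is host RAM that the driver fills as it pulls data off the FIFO.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")   # AI0 and AI1; device name assumed
    task.timing.cfg_samp_clk_timing(
        rate=100_000,                                     # per-channel rate
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=100_000,                           # hint DAQmx uses to size the host buffer
    )
    task.control(TaskMode.TASK_COMMIT)                    # reserve resources and allocate the buffer
    # Much larger than 4095: this is PC memory, not the device FIFO.
    print("Host buffer size (samples per channel):", task.in_stream.input_buf_size)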

 

The limit on sampling rate seems to be around 750,000

The NI 6251 is muxed; the samples-per-second spec is a maximum aggregate rate of 1 MS/s, so for 2 channels the max rate is 500 kS/s per channel.

 

How much greater does AvailSampPerChan have to be than Buffer size before an error occurs?

When your buffer overflows on the software side, you will get an error and the acquisition will stop. The way your VI is set up now, you are reading all available samples in the buffer.

 

Here are some suggestions:

  1. Get rid of the Wait in the loop; the loop rate will be controlled by the DAQ Read.
  2. Read about 100 ms of data every loop; thus for a 1 MS/s rate, read 100 kS of data each loop.
  3. The samples per channel specifies the number of samples to acquire or generate for each channel in the task if sample mode is Finite Samples. If sample mode is Continuous Samples, NI-DAQmx uses this value to determine the buffer size. If you don't want to mess with buffer functions, set this value to the sample rate, that is, 1 s of data.
  4. Why are you transposing the data? Use a Waveform Graph.

See if this gives you some ideas.
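If a text version helps, here is roughly that pattern as a minimal sketch with the nidaqmx Python API; the device name, rate, and run time are placeholders, not values from the original VI:

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 500_000             # per-channel rate (2 channels on the muxed 6251 -> 1 MS/s aggregate)
READ_SIZE = RATE // 10     # about 100 ms of data per read

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")   # AO0->AI0, AO1->AI1 loopback
    task.timing.cfg_samp_clk_timing(
        rate=RATE,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=RATE,                              # in continuous mode this sizes the buffer: ~1 s of data
    )
    task.start()
    for _ in range(50):                                   # ~5 s of acquisition
        # No extra wait in the loop: the read blocks until READ_SIZE samples
        # are available, so it paces the loop by itself.
        data = task.read(number_of_samples_per_channel=READ_SIZE)
        # data is [[AI0 samples...], [AI1 samples...]]; plot or process here.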

 

snip.png

Message 2 of 8

Never mind, mcduff's explanation is much clearer than mine ...

 

Bob Schor

Message 3 of 8

I am just working through an Activity from the "LabVIEW for Everyone" book.

OK, I went back and re-read: the buffer is not on the hardware, it is in the PC, but it gives a place to store data without having to handle each point one at a time. (I may have gotten the hardware trigger blurred into this.)

There does appear to be a max buffer limit, and if you go over it, you get an error.

I didn't realize aggregate meant for both channels combined, divided by 2 for each. Thank you.

The example uses a chart but doesn't transpose it (maybe due to their older version). The book associates graphs with plotting everything at the end once it has all been collected, and charts with plotting the data as you get it, so maybe you lose the previous buffer scan's data if using a graph.

To confirm, the read does clear the buffer?

Also, the x-axis is samples, not time?

 

The most unclear thing to me is the image example: the buffer size is set to 100, but I read 1.5 seconds, so 150 values, well over the buffer size.

I even slowed it down and see 4 times the buffer size, with no sign of corrupted/overwritten data.

cwhw112_0-1702601747544.png

 

Message 4 of 8
Solution
Accepted by topic author cwhw112

@cwhw112 wrote:

 

To confirm, the read does clear the buffer?


Yes

 


@cwhw112 wrote:

 

Also, the x-axis is samples, not time?


You are plotting arrays of points, so there is no timing information; the data is plotted by array index, that is, dt = 1. If you changed the datatype to waveform, you would have timing information there. The Read VI is polymorphic; try changing it.
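The same idea in text form, a small sketch (the rate and the stand-in data are placeholders for whatever your task is actually configured with):

import numpy as np

RATE = 1_000                              # per-channel sample rate the task was configured with
samples = np.zeros(150)                   # stand-in for one channel from a DAQmx read: plain numbers, no timing

# Plotted as a bare array, the x-axis is just the sample index (dt = 1), not time.
# To label it in seconds, build the time axis from the sample clock rate:
t = np.arange(samples.size) / RATE        # 0, 0.001, 0.002, ... seconds relative to the start of the read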

 


@cwhw112 wrote:

 

The most unclear thing to me is the image example: the buffer size is set to 100, but I read 1.5 seconds, so 150 values, well over the buffer size.

I even slowed it down and see 4 times the buffer size, with no sign of corrupted/overwritten data.


Maybe you should read this: https://www.ni.com/docs/en-US/bundle/ni-daqmx/page/mxcncpts/buffersize.html

 

There are commands to explicitly set the buffer size, but you are not using them. For continuous acquisition I never wire the number of samples in the timing mode. I also always use the buffer VIs to set the buffer size. Look in the DAQmx palette for them. (No LabVIEW on this computer.)
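As a rough illustration of where that sizing goes (a sketch in the nidaqmx Python API, where the input-buffer property plays the role of the buffer VI / property node; device name and numbers are assumptions):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")
    # Continuous mode without wiring a samples-per-channel value...
    task.timing.cfg_samp_clk_timing(rate=10_000, sample_mode=AcquisitionType.CONTINUOUS)
    # ...then size the input buffer explicitly, after the timing configuration
    # and before starting the task (the same setting the DAQmx buffer VI configures).
    task.in_stream.input_buf_size = 50_000                    # 5 s of data per channel at 10 kS/s
    task.start()
    data = task.read(number_of_samples_per_channel=1_000)     # 100 ms per read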

Message 5 of 8

Here's a key idea for your learning:  with continuous sampling, DAQmx treats the buffer as though it's circular.  When it gets to the end, it starts right over again at the beginning.

 

Once you start a continuous sampling task, the driver is doing most of the dirty work for you in the background, continuously pushing data into the software task buffer in a circular fashion.  However, it also keeps track of which data your app has pulled from the buffer with calls to DAQmx Read.  So it knows if it's about to write over top of previous data that you haven't pulled out yet - in which case it throws an error.

 

Your job as a programmer is to keep calling DAQmx Read to pull data out of the buffer at a rate that, on average, keeps up with the rate that the driver pushes it in, which in turn is trying to keep up with the rate that the device is sampling.

 

Typically, the driver works at a very low level with minimal overhead, so it pushes small chunks of data into the buffer very frequently.  DAQmx Read works up at the app level where there's more overhead associated with each read, so it tends to work more reliably to pull decent-sized chunks of data from the buffer less frequently.  A very useful rule of thumb is to pull out 0.1 sec worth of data per read -- DAQmx can typically support a very wide range of sample rates when you follow this rule.
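To make that concrete, a small sketch (again nidaqmx Python rather than LabVIEW; names and rates are assumptions) that watches the backlog the driver is tracking:

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 100_000
READ_SIZE = RATE // 10                      # ~0.1 s of data per read, per the rule of thumb above

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")
    task.timing.cfg_samp_clk_timing(rate=RATE, sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=RATE)      # ~1 s circular buffer
    task.start()
    for _ in range(100):
        # Each read consumes READ_SIZE samples per channel from the circular buffer.
        data = task.read(number_of_samples_per_channel=READ_SIZE)
        # Samples the driver has acquired but the app has not read yet.
        backlog = task.in_stream.avail_samp_per_chan
        # If the backlog trends up toward the buffer size, the app is falling behind;
        # when the driver would have to overwrite unread data, it raises an error instead.
        print("backlog after read:", backlog, "samples")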

 

 

-Kevin P

Message 6 of 8

https://www.ni.com/docs/en-US/bundle/ni-daqmx/page/mxcncpts/buffersize.html

The link is broken, but I think I found it.

 

I put this after the timing block, before the Start Task.

 

cwhw112_0-1702853128533.png

This seems to set the buffer. I can't seem to go 1 over without an error.

 

 

Message 7 of 8

That is one of them; there is the same VI in the palette below. (The property node is just wrapped in an icon.)

 

Can you explain your error better? The buffer settings have certain limits and conditions.

 

mcduff_0-1702873597949.png

 

Message 8 of 8