Sample rate fluctuates for certain sampling frequencies


Hardware: PXIe-1082 chassis, PXIe-8861 controller, PXIe-4492 Dynamic Signal Acquisition (DSA) analog input device.

Software: LabVIEW 2019 (19.0.1f5 32-bit)

 

The title describes my problem well. The analog signal comes from a single pre-polarized condenser microphone.

As you can see, the dt between two consecutive samples has a "sawtooth" behavior for nominal sampling rates of 30 kHz, 50 kHz, 70 kHz, and 150 kHz. The horizontal line in each subplot represents the nominal expected time interval (dt = 1/fs).

 

untitled.png

Here are screenshots of the Block Diagram and the Front Panel; the .vi file is attached:

Capture (2).PNG

Capture.PNG

 

I could not help noticing that the problematic frequencies are not exact divisors of the Sample Clock Timebase, but I still don't see the solution to this problem. Any idea on the nature of the issue would be very welcome. Thank you in advance!

Message 1 of 6
Solution
Accepted by topic author gbillot

How do you measure dt?

Could it be that the numerical resolution of your txt file is the reason?

 

EDIT: And always read back the actual SR after configuration. Not every SR is possible; the requested rate is rounded to the next higher possible SR.

 

Measuring the actual ADC sample clock jitter is another task 😉
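Henrik's file-resolution hypothesis is easy to check numerically. A minimal Python sketch (with assumed values, not the poster's actual data: a 1e-4 s time-column resolution is taken for illustration) generates perfectly uniform 70 kHz timestamps, truncates them, and compares the dt of both versions:

```python
import numpy as np

# Illustration of the file-resolution hypothesis (assumed values, not the
# poster's data): uniform 70 kHz timestamps truncated to 1e-4 s resolution.
fs = 70_000.0        # nominal sample rate (Hz)
res = 1e-4           # assumed time-column resolution in the file (s)
n = 1_000

t_true = np.arange(n) / fs                 # perfectly uniform timestamps
t_file = np.floor(t_true / res) * res      # what survives truncation

dt_true = np.diff(t_true)                  # a single value: 1/fs
dt_file = np.diff(t_file)                  # only 0 and res: the "sawtooth"

print(np.unique(np.round(dt_true, 12)))
print(np.unique(np.round(dt_file, 12)))
```

The truncated dt toggles between 0 and 1e-4 s even though the underlying timestamps are perfectly uniform.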

 

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 2 of 6

Henrik's right to zero in on the question of how you measure dt. 

 

Those plots have all the hallmarks of a *quantization* problem in your measurement of dt.  Notice the regular pattern of how it toggles between only two discrete values.  That's exactly what you would expect from a quantized measurement.

 

The *actual* sample rate is not fluctuating that way; only your quantized *estimate* of it is.
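A quick way to see the estimate/actual distinction (a Python sketch with assumed numbers, not the poster's data): quantize uniform 70 kHz timestamps to a 1e-4 s grid and look at the resulting dt. It takes exactly two discrete values, yet its mean still recovers 1/fs, because the underlying clock is uniform:

```python
import numpy as np

# Quantized *estimate* vs. *actual* rate (assumed resolution for illustration).
fs = 70_000.0
res = 1e-4                  # assumed timestamp resolution
n = 10_000

t_quant = np.round(np.arange(n) / fs / res) * res
dt_quant = np.diff(t_quant)

print(np.unique(np.round(dt_quant, 9)))   # exactly two discrete values
print(dt_quant.mean(), 1.0 / fs)          # mean of the estimate ~= true dt
```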

 

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Message 3 of 6

@Henrik_Volkers wrote:

How do you measure dt?

Could it be that the numerical resolution of your txt file is the reason?

 

EDIT: And always read back the actual SR after configuration. Not every SR is possible; the requested rate is rounded to the next higher possible SR.

 

Measuring the actual ADC sample clock jitter is another task 😉

 


See this link for allowed sample rates; as Henrik said, you are probably not sampling at the rate you think you are. Some allowed sample rates for the 4492 are 204.8 kSa/s, 163.84 kSa/s, 102.4 kSa/s, 81.92 kSa/s, 51.2 kSa/s, 40.96 kSa/s, 25.6 kSa/s, 20.48 kSa/s, 12.8 kSa/s, 10.24 kSa/s, and 6.4 kSa/s.
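Under Henrik's round-up rule, coercion against that list can be sketched as a small helper (my own illustration, not an NI API; the real 4492 supports more rates than the ones quoted above):

```python
# Illustrative coercion helper (not an NI API): round a requested rate up to
# the next allowed PXIe-4492 rate, using only the rates quoted above.
ALLOWED_SA_PER_S = sorted([
    204_800, 163_840, 102_400, 81_920, 51_200, 40_960,
    25_600, 20_480, 12_800, 10_240, 6_400,
])

def coerce_rate(requested):
    """Smallest allowed rate >= requested (DAQmx rounds up, per Henrik)."""
    for rate in ALLOWED_SA_PER_S:
        if rate >= requested:
            return rate
    raise ValueError(f"{requested} Sa/s exceeds the maximum allowed rate")

for fs in (30_000, 50_000, 70_000, 150_000):
    print(fs, "->", coerce_rate(fs))   # every problematic rate gets coerced up
```

In practice, don't hard-code a table: read the coerced rate back from the driver after configuring the task (the DAQmx Sample Clock Rate property) and use that value everywhere downstream.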

Message 4 of 6

Hi Henrik,

 

Indeed, the issue seems to lie in the precision with which the time column is stored. The Write to Measurement File Express VI (advised against by many) has an option to write an "X column" obtained as t = sample index * (1/fs). These values are truncated by default to 10e-4, and that's how the bias is introduced for those fs whose inverse (dt) has many decimals, like 70 kHz. I have now replaced the Express VI with the Write Delimited Spreadsheet VI, which lets me specify the format (%.nf). However, I still don't know how to write a time array with the actual sampling instants rather than an array built from the nominal dt. All I have found is how to time parts of the VI itself, like the while loop, but that concerns computation time, not hardware sampling.
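For the time column itself, the closest text-file equivalent (a Python sketch with assumed values, mirroring the Write Delimited Spreadsheet approach rather than reproducing the VI) is to build t from the coerced rate and write it with explicit precision:

```python
import io
import numpy as np

# Build the time column from the coerced rate and write it with enough
# decimals (assumed values; io.StringIO stands in for the actual file).
fs_actual = 81_920.0              # coerced rate read back from the driver
n = 8
t = np.arange(n) / fs_actual      # nominal sample instants, full precision

buf = io.StringIO()
np.savetxt(buf, t, fmt="%.9f")    # 9 decimals instead of 4
t_back = np.loadtxt(io.StringIO(buf.getvalue()))

print(np.unique(np.round(np.diff(t_back), 9)))   # one value: no sawtooth
```

As for the "actual" sampling instants: DAQmx only reports a t0 and a constant dt derived from the coerced rate; per-sample hardware timestamps are not available from the driver, so a time column built from the coerced rate is as close to the hardware as you can get, short of measuring the clock jitter externally as Henrik noted.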

 

Thank you for pointing me in the right direction.

Message 5 of 6
