Example Code

Setting AI Convert Clock Rate for the Longest Settling Time in DAQmx.

Products and Environment

This section reflects the products and operating system used to create the example.

To download NI software, including the products shown below, visit ni.com/downloads.

    Hardware

  • Data Acquisition (DAQ)

    Software

  • LabVIEW

    Driver

  • NI DAQmx

Code and Documents


Description

Overview
This example demonstrates how to programmatically change the rate of the AI convert clock.

Description:
The default convert clock rate that NI-DAQmx selects for a DAQ device is often faster than necessary. In some applications the user instead wants the longest possible time between channel conversions, because the extra settling time before each measurement yields more accurate readings. This example slows the convert clock down to the minimum rate that still scans every channel once per sample clock period.
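
For reference, the same idea can be sketched in text form. The snippet below is a minimal sketch using the nidaqmx Python API rather than the attached VI; the device name cDAQ1Mod1, the channel range, and the rates are placeholder assumptions to adapt to your hardware.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    SAMPLE_RATE = 1000.0   # per-channel sample clock rate in Hz (assumed)
    NUM_CHANNELS = 4       # channels in the scan list (assumed)

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0:3")
        task.timing.cfg_samp_clk_timing(rate=SAMPLE_RATE,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=100)

        # Slowest convert rate that still scans every channel once per
        # sample clock period; any slower and the scan could not finish.
        task.timing.ai_conv_rate = SAMPLE_RATE * NUM_CHANNELS

        data = task.read(number_of_samples_per_channel=100)

Setting the convert rate to exactly sample rate × number of channels spreads the conversions across the whole sample period, which maximizes the settling time between channels.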

Steps to Implement or Execute Code:

To implement this example:

1. Define the input physical channel.
2. Set the parameter values as needed.
3. Run the VI.
4. (Optional) Turn on Highlight Execution to watch the data flow through the VI.

To execute this example:

1. Install the required software.
2. Connect DAQ hardware that supports analog input.
3. Confirm the connection in Measurement & Automation Explorer (MAX) using a test panel; a programmatic alternative is sketched below.
4. Open the VI and follow the implementation steps above.
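
For step 3, the connection can also be confirmed programmatically instead of through MAX. A small sketch, again assuming the nidaqmx Python package is installed:

    import nidaqmx.system

    # List every device NI-DAQmx can see, similar to the device tree in MAX.
    for device in nidaqmx.system.System.local().devices:
        print(device.name, device.product_type)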
 

Requirements
Software
LabVIEW 2012 or compatible
NI-DAQmx 9.0 or compatible

Hardware
CompactDAQ (cDAQ) chassis with a C Series analog input module

 

clockfp.PNG

clockbd.PNG

 

**This document has been updated to meet the current required format for the NI Code Exchange.**

Additional Information:

KnowledgeBase 2XPE1QCW: How is the Convert (Channel) Clock Rate Determined in NI-DAQmx and Tradition...

KnowledgeBase 30LDURMV: Difference Between the Sample Clock (Scan Clock) and the Convert Clock (C...

Eric S.
AE Specialist | Global Support
National Instruments

Example code from the Example Code Exchange in the NI Community is licensed with the MIT license.

Comments
Cavarval
Active Participant

The actual rate may be a little higher due to limitations of the clock generating hardware.

To account for that, instead of using the Rate control value to calculate the Ideal Min Convert Rate, you can use the actual sample rate. Just place a Sample Clock > Rate property node in step 3 and use its output both to verify the actual sample rate and to calculate the Ideal Min Convert Rate. See the snippet below.

Change convert Clock Rate Snippet.png
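
For completeness, the same refinement sketched in text form with the nidaqmx Python API (device name, channel range, and rates are placeholder assumptions, as in the earlier sketch):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    NUM_CHANNELS = 4

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0:3")
        task.timing.cfg_samp_clk_timing(rate=1000.0,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=100)

        # The driver may coerce the requested rate upward, so query the
        # rate the hardware will actually use before computing the
        # Ideal Min Convert Rate.
        actual_rate = task.timing.samp_clk_rate
        task.timing.ai_conv_rate = actual_rate * NUM_CHANNELS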

Regards,

Camilo.

Camilo V.
National Instruments