To download NI software, including the products shown below, visit ni.com/downloads.
Overview
This example demonstrates how to programmatically change the rate of the AI convert clock on a DAQ device.
Description:
The default convert clock rate of a multiplexed DAQ device is often faster than necessary. Occasionally, a user instead wants the longest possible time between channel conversions, because the extra time lets each signal settle before it is measured, giving more accurate readings.
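The trade-off above can be sketched with a little arithmetic (a sketch with illustrative numbers; the rates and channel count are assumptions, not values from the example VI):

```python
# Arithmetic behind slowing the convert clock (illustrative numbers,
# not taken from the example VI).
def settling_time_us(convert_rate_hz):
    # Time between consecutive channel conversions, in microseconds.
    return 1e6 / convert_rate_hz

sample_rate = 1000.0   # samples/s per channel (assumed)
channels = 4           # channels sharing one multiplexed ADC (assumed)

# The convert clock can run no slower than sample_rate * channels,
# or one full scan of all channels would not finish within a sample period.
slowest_convert_rate = sample_rate * channels          # 4000 Hz

fast_default = 200_000.0  # e.g. a device converting at 200 kHz (assumed)
print(settling_time_us(fast_default))          # 5.0 µs to settle
print(settling_time_us(slowest_convert_rate))  # 250.0 µs to settle
```

Slowing the convert clock from the fast default to the ideal minimum multiplies the settling time per channel by fifty in this example.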
Steps to Implement or Execute Code:
To implement this example:
1. Define the Input Physical Channel
2. Set the parameter values as needed
3. Run the VI
4. (Optional) Turn on Highlight Execution to watch the flow of the VI
To execute this example:
1. Install the required software.
2. Connect DAQ hardware that supports analog input
3. Confirm the connection in Measurement & Automation Explorer (MAX) using a test panel
4. Open the VI and follow the implementation steps above
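For readers without LabVIEW at hand, the same configuration can be sketched with the nidaqmx Python API. This is a sketch, not the VI itself: the channel string `cDAQ1Mod1/ai0:3`, the rate, and the sample count are placeholder assumptions, and the hardware-dependent function only runs when a device is present.

```python
def min_convert_rate(sample_rate_hz, num_channels):
    # Slowest convert clock that still completes one scan of all
    # channels within each sample period.
    return sample_rate_hz * num_channels

def acquire(physical_channels="cDAQ1Mod1/ai0:3", rate_hz=1000.0, samples=100):
    # Hardware-dependent part: requires the nidaqmx package and a device.
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan(physical_channels)
        task.timing.cfg_samp_clk_timing(
            rate_hz, sample_mode=AcquisitionType.FINITE, samps_per_chan=samples
        )
        # Slow the AI convert clock to its ideal minimum, maximizing
        # settling time between channel conversions.
        task.timing.ai_conv_rate = min_convert_rate(
            rate_hz, task.number_of_channels
        )
        return task.read(number_of_samples_per_channel=samples)

print(min_convert_rate(1000.0, 4))  # 4000.0
```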
Requirements
Software
LabVIEW 2012 or compatible
NI-DAQmx 9.0 or compatible
Hardware
cDAQ chassis with a C Series analog input module
Additional Information:
Example code from the Example Code Exchange in the NI Community is licensed with the MIT license.
The actual rate may be a little higher due to limitations of the clock generating hardware.
To account for that, instead of using the Rate control value to calculate the Ideal Min Convert Rate, you can use the actual sample rate. Place a Sample Clock > Rate property node in step 3 and use its output both to verify the actual sample rate and to calculate the Ideal Min Convert Rate. See the snippet below.
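As a rough Python analog of why the actual rate differs from the requested one (a simplified model of clock-divider coercion with an assumed 20 MHz timebase; it is not the actual NI-DAQmx coercion algorithm, and real devices use different timebases):

```python
import math

TIMEBASE_HZ = 20_000_000  # assumed master timebase; real devices vary

def coerced_sample_rate(requested_hz, timebase_hz=TIMEBASE_HZ):
    # Simplified model: hardware derives the sample clock by integer
    # division of a master timebase. Rounding the divisor down coerces
    # the actual rate up to the nearest achievable value.
    divisor = math.floor(timebase_hz / requested_hz)
    return timebase_hz / divisor

actual = coerced_sample_rate(48_000)
print(actual)  # slightly above the requested 48 kHz

# Under this model, the actual (coerced) rate, not the requested one,
# should feed the Ideal Min Convert Rate calculation.
channels = 4
ideal_min_convert = actual * channels
```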
Regards,
Camilo.