To download NI software, including the products shown below, visit ni.com/downloads.
Demonstrates how to convert a string's hexadecimal representation to its decimal integer equivalent.
Description
The example converts the individual ASCII characters of the string to their decimal byte values and then to their hexadecimal equivalents, which are returned as strings. These hex equivalents are concatenated into an unbroken run of digits; the result is identical to the Hex String Input except without the spacing. The concatenated hex digits are then converted to their decimal equivalent and displayed.
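Since the example itself is a LabVIEW VI, here is a minimal Python sketch of the same logic for reference. The function name `string_bytes_to_decimal` is my own; each character is rendered as two zero-padded hex digits, the digits are concatenated, and the result is parsed as one hexadecimal number.

```python
def string_bytes_to_decimal(s: str) -> int:
    # Render each character's byte value as exactly two hex digits
    # (zero-padded so 0x01 becomes "01", not "1"), concatenate them,
    # then parse the whole run of digits as a single hex number.
    hex_digits = "".join(format(ord(c), "02x") for c in s)
    return int(hex_digits, 16)

print(string_bytes_to_decimal("AB"))  # bytes 0x41 0x42 -> "4142" -> 16706
```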
Requirements
LabVIEW 2012 (or compatible)
Steps to Implement or Execute Code
Additional Information or Resources
VI Block Diagram
**This document has been updated to meet the current required format for the NI Code Exchange.**
Example code from the Example Code Exchange in the NI Community is licensed with the MIT license.
Just using Type Cast would be a lot easier...
This is something I was playing around with for a while and I couldn't figure out another way to get the conversion to work. Would you mind showing how you'd do this with a Type Cast?
Since we are dealing with a U64 in the end, we first need to make sure we have 8 bytes in our string. From there, it is just a simple Type Cast to a U64.
I also fixed a couple of bugs I found in your code while doing comparisons. If there are more than 8 bytes in the string, your code just maxes out the U64. Also, try feeding a 0x01 byte into your code: it turns into a single hex character when it should be two.
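For readers without the VI, the Type Cast approach described above can be sketched in Python. This is an assumed equivalent, not the posted diagram: the string is left-padded with zero bytes to 8 bytes and then interpreted as a big-endian U64, which matches how LabVIEW's Type Cast flattens data.

```python
import struct

def typecast_to_u64(s: str) -> int:
    data = s.encode("latin-1")  # one byte per character, like a LabVIEW string
    if len(data) > 8:
        # More than 8 bytes cannot fit in a U64, so fail loudly
        # instead of silently saturating.
        raise ValueError("string is longer than 8 bytes")
    padded = data.rjust(8, b"\x00")       # prepend zero bytes up to 8 total
    return struct.unpack(">Q", padded)[0]  # big-endian U64, like Type Cast

print(typecast_to_u64("AB"))  # -> 16706
```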
I'll let you decide which one looks simpler, but I know mine will be more efficient.
That's a clever way to handle it. And you're right; your way is definitely more efficient. Thanks for posting that!