LabVIEW


How do I convert a U32 integer to ASCII?

Solved!
Go to solution

I am communicating with a digital device. I can send it commands and receive ASCII as output. In many instances, the data I receive is in packets, which I decode. When I combine my 8-bit values with the Build Array function, I get a U32 integer. Currently I am converting that to a U8 integer and displaying the result as a program rev. We would like to display it as an ASCII value, but I have been unable to figure out how to do it. I am including a screenshot of where I am trying to make the conversion. Thank you!

 

Message 1 of 6
Solution
Accepted by topic author SciManStev

Hi SciManStev,

What do you mean by "ASCII value"? If you need a string, you can use the "Type Cast" function.
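Type Cast works here because it reinterprets the raw bytes of the number as string data rather than converting the numeric value to its printed digits. LabVIEW is graphical, so this can't be shown as G code, but a rough textual analogue of the same byte reinterpretation (a sketch in Python, not the LabVIEW function itself) looks like:

```python
import struct

# A U8 holding hex 31 (decimal 49) -- the program-rev byte from the device.
value = 0x31

# Pack the number into its single raw byte, then interpret that byte
# as ASCII text -- analogous to wiring a U8 into Type Cast with a
# string as the type input.
raw = struct.pack("B", value)   # b'\x31'
text = raw.decode("ascii")      # "1"
print(text)
```

The key point is that no arithmetic happens: the bit pattern stays the same, only its interpretation changes from "number" to "character".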

 

Mike

Message 2 of 6

Please just post your image as a .png or .jpg file attached to your message. Don't embed it in a Word document.

 

What you want is the function Number to Decimal String located on the String >> String/Number conversion palette.

 

EDIT:  The function you used (Format Into String) will also work if you get rid of the "%%" format constant you have, or replace it with %d.

Message Edited by Ravens Fan on 03-19-2009 09:28 AM
Message 3 of 6

The number-to-string conversion does not change the number; it just changes it from a number to a string. For example, the U8 result I get is 31 hex, which is a "1" in ASCII.
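The distinction being made here can be illustrated outside LabVIEW: formatting the number 0x31 (decimal 49) as a decimal string gives the digits "49", while interpreting the same value as an ASCII code gives the character "1". A minimal Python sketch of the two behaviors:

```python
value = 0x31  # the U8 program-rev byte, decimal 49

# Decimal formatting: what Number To Decimal String (or %d) produces.
decimal_string = str(value)   # "49"

# ASCII interpretation: what Type Cast to a string produces.
ascii_char = chr(value)       # "1"

print(decimal_string, ascii_char)
```

This is why Number To Decimal String didn't solve the problem: the device encodes the revision as an ASCII character code, not as a number to be printed.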

I have included a JPEG screenshot.

 

Message 4 of 6
The type cast did it! Thank you so much! I had never used that function, so I learned something new.
Message 5 of 6

Thanks bro!

Message 6 of 6