01-19-2009 04:04 AM
Hi,
I have a 4-byte PIC-Hex value that I know the decimal equivalent of (8B 00 46 57 = 4104.79).
Initially the PIC-Hex is converted to normal Hex by shifting some bytes, which returns 45 80 46 57, which in turn is converted to 4104.79.
I'm struggling to find a method for converting from the initial state to the decimal state using LabVIEW; any help would be much appreciated.
Cheers
Chris
P.s. My knowledge of byte shifting is zero so any information on how to do it would be useful too.
01-19-2009 04:52 AM - edited 01-19-2009 04:53 AM
Chris Woodhams wrote:Hi,
I have a 4 byte PIC-Hex value that I know the decimal equivalent of (8B 00 46 57 = 4104.79).
Initially the PIC-Hex is converted to normal Hex by shifting some bytes, which returns 45 80 46 57, which in turn is converted to 4104.79.
The value 45 80 46 57: what is the datatype, and how is it represented (if it's a string indicator, is the display set to Hex)?
8B004657 is not 4104.79 (see this online calculator)
Let's assume it is a string and the indicator is set to hex:
LabVIEW is fully based on IEEE 754, and typecasting the data from string to SGL will do:
Ton
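(For readers without LabVIEW at hand, here is a minimal Python sketch of what that Type Cast does; the byte values are from the thread, everything else is illustrative. The cast simply reinterprets the four raw bytes as a big-endian IEEE 754 single.)

```python
import struct

# The four raw bytes of the already-corrected value, as a hex-display
# string would hold them: 45 80 46 57.
raw = bytes([0x45, 0x80, 0x46, 0x57])

# Type Cast string -> SGL: reinterpret the bytes as a big-endian
# IEEE 754 single-precision float.
value = struct.unpack('>f', raw)[0]
print(round(value, 2))  # 4104.79
```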
01-19-2009 05:18 AM
I know 8B 00 46 57 doesn't directly convert to 4104.79, first I have to convert the 8B 00 46 57 to 45 80 46 57 by shifting some of the bytes around.
Your solution works great for the conversion of 4580 4657 to decimal, I was inputting the string in a different format, I left a space between each byte (45 80 46 57)!
Do you know anything about shifting and carrying bytes? Getting from 8B 00 46 57 to 45 80 46 57.
Chris
01-19-2009 05:49 AM - edited 01-19-2009 05:51 AM
I have been looking at the (bit-wise) data.
If you shift the bits of the first two bytes (the upper word) one place to the right, you get the translation (I used a bit-shift function). Why one would only shift the upper word, I have no idea.
Here's the full conversion:
Ton
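As a text sketch of that full conversion (an illustration, not Ton's actual VI): shift the upper 16-bit word of the PIC value one bit to the right, leave the lower word alone, then reinterpret the 32-bit pattern as an IEEE 754 single. Note that the plain right shift is enough for this particular value because it is positive and the second byte is zero; a fully general PIC-to-IEEE conversion would also have to move the sign bit.

```python
import struct

pic = 0x8B004657                     # PIC-format value from the thread

upper = (pic >> 16) & 0xFFFF         # 0x8B00: exponent byte + sign/mantissa byte
lower = pic & 0xFFFF                 # 0x4657: low mantissa bytes, untouched

ieee = ((upper >> 1) << 16) | lower  # shift only the upper word right by one
# ieee is now 0x45804657, as in the thread

# Reinterpret the 32-bit pattern as an IEEE 754 single (the Type Cast step)
value = struct.unpack('>f', struct.pack('>I', ieee))[0]
print(round(value, 2))  # 4104.79
```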
01-19-2009 06:31 AM
That is brilliant, thank you!
I have one more question! The string is not read in the format (with a space after every two bytes) that LabVIEW requires at the input of the Type Cast. How do I get it into that format and ensure it is in hex format as well? That might not make sense, so I have attached a VI showing the string that is read and what I tried.
I don't know why it would be set up like that; this is how I have received it from the supplier.
Cheers
Chris
01-19-2009 06:54 AM - edited 01-19-2009 06:59 AM
Chris Woodhams wrote:That is brilliant, thank you!
I have one more question! The format that the string is read is not in the format (with a space after two bytes) that LabVIEW requires at the input to the type cast,
Chris
You are welcome, (kudos too).
The string shown in the code snippet I added is a normal string set to Hex display (right-click the control, Hex Display). If you have set the string to Normal display and see the hex characters as text, you have to convert the data with a string-to-number conversion.
EDIT: I looked at your attachment and see you have a normal string (ASCII display) containing hex-like text:
Suggestion for study: try to understand the difference between the last two code snippets!
Ton
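(To make that distinction concrete outside LabVIEW, a hedged sketch rather than the actual VI: a hex-display string holds the four raw bytes themselves, while an ASCII-display string holds eight text characters spelling out those bytes. The two need different handling before the typecast.)

```python
import struct

raw_bytes = b"\x45\x80\x46\x57"  # what a hex-display string contains: 4 bytes
ascii_hex = "45804657"           # what an ASCII-display string contains: 8 characters

# The ASCII text must first be decoded into the raw bytes...
decoded = bytes.fromhex(ascii_hex)

# ...before the typecast-to-SGL step makes sense.
value = struct.unpack('>f', decoded)[0]
print(round(value, 2))  # 4104.79
```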
01-19-2009 07:02 AM - edited 01-19-2009 07:10 AM
This is how I would do it, I think. Your string is a hex number in string format, so you have to convert it to a U32 before you typecast it as TonP did.
As I was writing, TonP was quicker to come up with a solution 😉
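In Python terms that route looks roughly like this (illustrative only, not the poster's VI): parse the hex text into a U32, then reinterpret that integer's bit pattern as a single-precision float.

```python
import struct

text = "45804657"   # hex number in string (ASCII) format

u32 = int(text, 16)                                    # hex-string -> U32
value = struct.unpack('>f', struct.pack('>I', u32))[0] # typecast U32 -> SGL
print(round(value, 2))  # 4104.79
```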
01-19-2009 08:11 AM
Thank you for all your help, much appreciated; everything is working as expected.
With regard to understanding the difference between the two code snippets: is it purely the difference between Type Cast and numeric conversion?
Thanks again
Chris
01-19-2009 08:36 AM
Chris Woodhams wrote:With regard to understanding the difference between the two code snippets: is it purely the difference between Type Cast and numeric conversion?
Yes it is, and the display mode of the string.
Ton