Best way to safely convert an ASCII string number to a U16, with the ability to detect out-of-range values and/or non-numeric chars

I honestly can't believe I've never considered this before. I'll admit the likelihood of this happening, and of it harming anything, is quite minimal, but now that I've thought about it I need to put my mind at ease. I must be missing something very obvious.

 

I'm working on a driver for a serial device. When I request certain state values from it, I get them back without any indication of units or of which attribute the value relates to. If everything goes as planned, that's not a big issue: presumably, when I request the temperature, the next read value of, say, "21.5\r" is the answer to my request. However, this shows how critical it is to keep my reading frame synchronized. That should be no problem, but I've already detected some undocumented data that this device seems to output, which could absolutely mess up this synchronization. So far it seems to occur when the device's end board reboots or clears alarms. Unfortunately, this device also has a user control pad that is not locked out, so it's possible people will interact with it and break my synchronization that way too.

 

But on to the heart of the matter. Let's say I'm expecting a numeric reply like "65001\r" but I get something unexpected like "fault\r". It seems that running my string into the Decimal String To Number function with a U16 wired to the default value terminal will simply give me the default value, which is in fact a perfectly valid U16 number that will then be parsed as a bit field, with all manner of implications and side effects, I would presume.
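
In text form, the pitfall looks something like this (a Python analogue; the helper name is mine, invented just to mirror the default-value terminal):

```python
def parse_with_default(text: str, default: int = 0) -> int:
    """Mimic the default-value behavior: bad input quietly
    becomes the default instead of producing an error."""
    try:
        return int(text.strip())
    except ValueError:
        return default

print(parse_with_default("65001\r"))  # 65001
print(parse_with_default("fault\r"))  # 0 -- a valid U16; the failure is swallowed
```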

 

My thought was: I'll convert using Fract/Exp String To Number first, check the range on the value (since I know the valid ranges, and when to expect a U16 response), then convert to U16 for the bit-field-parsing cases if the DBL is within the U16 range (0-65535). If it's not within range, I know something is amiss, most likely that I've become unsynchronized with my reading data, and I can flush the buffer (or read it until it's empty or times out) and retry the query command, with the hope that I'll be synchronized with the following response.
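
Sketched in a text language (Python here, since G doesn't paste well into a forum post; the function name and the choice of exception are mine), the flow would be roughly:

```python
def parse_u16(reply: str) -> int:
    """Parse a device reply as a U16, raising instead of defaulting."""
    value = float(reply.rstrip("\r\n"))   # fails loudly on "fault"
    if not value.is_integer() or not (0 <= value <= 65535):
        raise ValueError(f"reply {reply!r} is not a valid U16")
    return int(value)                     # safe narrowing conversion

try:
    status_bits = parse_u16("fault\r")
except ValueError:
    # Probably unsynchronized: flush/read the buffer until it's empty
    # or times out, then retry the query.
    ...
```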

 

But I can't get over the fact that there's no obvious, dead-simple way to convert a string to a U16 with an error-out wire that tells me whether the string is in fact a valid U16 value. What am I missing?

 

I also had ChatGPT concoct a pretty impressive regex to attempt this, even handling the \r and various letter chars adjacent to a U16 number. That still doesn't seem like the right answer for how to do this in LabVIEW, but writing regex strings is the single thing I use ChatGPT for most, and it is freaking brilliant at it! It even explains the purpose of each part of the regex it generates for you. The days of the office regex virtuoso smugly looking down on everyone who stops by to beg for help are definitively numbered.
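
For reference, one regex in that spirit (my own sketch, not the generated one the poster mentions) that accepts exactly 0-65535 with an optional trailing \r:

```python
import re

# Each alternative covers one slice of 0-65535; the anchors reject stray chars.
U16_RE = re.compile(
    r"^(?:6553[0-5]|655[0-2][0-9]|65[0-4][0-9]{2}"
    r"|6[0-4][0-9]{3}|[1-5][0-9]{4}|[0-9]{1,4})\r?$"
)

print(bool(U16_RE.match("65001\r")))  # True
print(bool(U16_RE.match("65536\r")))  # False (out of range)
print(bool(U16_RE.match("fault\r")))  # False (non-numeric)
```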

Message 1 of 6

This would be nice. One thing you could do is set the default to something other than 0: a value a lot larger than anything you expect, or even a negative number. Then test for that sentinel and, when you see it, generate an error in the error cluster.
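
Rendered as a rough sketch (Python; the sentinel value and names are illustrative), the idea is:

```python
SENTINEL = -1.0  # assumes the device never legitimately sends a negative

def parse_or_sentinel(reply: str) -> float:
    try:
        return float(reply.strip())
    except ValueError:
        return SENTINEL   # the "default other than 0"

if parse_or_sentinel("fault\r") == SENTINEL:
    print("conversion failed -> generate the error here")
```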




Joe.
"NOTHING IS EVER EASY"
Message 2 of 6

Here it is in LV 2018.

Also fixed, because I forgot to connect the connector pane in the original.

"If you weren't supposed to push it, it wouldn't be a button."
Message 5 of 6

Scan From String with a format string of "%d" will enforce decimal numbers only, and will throw an error if any characters other than 0-9 are present. The only thing it won't do by itself (which Paul's code DOES handle) is catch a decimal number larger than the valid range of a U16.
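
The equivalent two-step check in text form (a Python sketch; "%d" itself belongs to Scan From String, the rest is mine) might be:

```python
def scan_u16(reply: str) -> int:
    text = reply.rstrip("\r\n")
    if not text.isdigit():        # the "%d"-style digits-only check
        raise ValueError(f"non-decimal reply: {reply!r}")
    value = int(text)
    if value > 65535:             # the range check "%d" won't do for you
        raise ValueError(f"out of U16 range: {value}")
    return value
```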

Message 6 of 6