

VISA Read gets incorrect data from serial connection

I am having difficulty using the VISA functions in LabVIEW to read data from a virtual COM port. Data is being sent from a serial-to-USB chip over a USB connection using the OpenSDA drivers. I also have a Python program that reads from this chip, and it never has an issue. However, when trying to do the same read in LabVIEW, I run into the problem of getting incorrect data in the read buffer from the VISA Read function.

 

I have a VISA Configure Serial Port function set up with a control to select the COM port that the device is plugged into. Baud rate is left at the default of 9600. The termination character of my data is a newline, which is also the default, and Enable Termination Char is set to true. A VISA Open function follows this configuration and feeds the VISA resource name into a while loop, where a VISA Read function displays the data in its read buffer. The byte count for the VISA Read is set to 20 so I can see more of the erroneous data, although the actual data is only 6-12 bytes. The while loop has a wait function, and no matter how much I slow down the readings (I have tried 20 ms through 1000 ms) I still get incorrect data.
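For reference, the same configuration in text form looks roughly like this in Python with PyVISA (just a sketch for comparison; the resource name "ASRL3::INSTR" and the 2-second timeout are examples, not my actual values):

```python
# Rough PyVISA sketch of the serial setup described above.
# "ASRL3::INSTR" (COM3) and the 2 s timeout are example values only.
import pyvisa

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL3::INSTR")   # virtual COM port from the OpenSDA device
port.baud_rate = 9600                     # default baud rate
port.read_termination = "\n"              # data is newline-terminated
port.timeout = 2000                       # ms

while True:
    print(port.read())                    # blocks until a newline-terminated message or timeout
```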

 

The data I expect to receive in the read buffer from VISA Read is in the form of "0-255,0-255,0-255\n", like the following:

 

51,93,31\n
or
51,193,128\n

 

I do receive this data correctly, but intermittently (sometimes every couple of reads, sometimes a couple of times in a row) I get incorrect readings like these:

 

51,1\n

51,193739\n

\n

51,1933,191\n

51,,193,196\n

51,1933,252 

 

51,203,116203,186\n

 

It looks like the read data is truncated, missing characters, or has extra characters. Looking at these values, though, it seems the read itself is being done incorrectly, because the bytes are partially correct (51 is the first number even in the incorrect reads).
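To be concrete about what I count as "incorrect": a good line should be three comma-separated values, each 0-255, ending in a newline. A check along these lines is what flags the bad reads (Python sketch only; "COM3" is just an example, and this is not my actual reader):

```python
# Sketch: flag any line that is not three comma-separated 0-255 values.
# "COM3" is an example port name; this is illustration, not the real reader.
import re
import serial

PATTERN = re.compile(r"^\d{1,3},\d{1,3},\d{1,3}$")

with serial.Serial("COM3", 9600, timeout=2) as port:
    for _ in range(100):
        line = port.readline().decode(errors="replace").strip()
        ok = bool(PATTERN.match(line)) and all(0 <= int(f) <= 255 for f in line.split(","))
        print("OK " if ok else "BAD", line)
```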

 

I have searched but haven't found a similar issue, and I am not sure what to try from here. Any help is appreciated. Thanks!

 

Message 1 of 18

The first thing is that none of the error clusters are connected, so you could be getting errors that you are not seeing. Are you sure about the comm parameters? Finally, I have never had a lot of luck looking for termination characters. You might want to just capture the data and append each read into one long string, to see whether you are still seeing this strangeness.
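In text form, that capture amounts to something like this (a Python sketch just to illustrate the idea; "COM3" and the 5-second window are examples):

```python
# Sketch: capture the raw byte stream for a few seconds and print it in one
# piece, to see whether the strangeness is in the data itself or in how it
# gets split into individual reads. "COM3" and 5 s are example values.
import time
import serial

captured = bytearray()
with serial.Serial("COM3", 9600, timeout=0.1) as port:
    end = time.time() + 5                 # capture window
    while time.time() < end:
        captured.extend(port.read(1024))  # grab whatever has arrived so far

print(captured.decode(errors="replace"))
```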

 

What sort of device is returning the data? How often does it spit out the data? How much distance is there between it and your computer? Can you configure it to append something like a checksum or CRC to the data?
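Even something as simple as an XOR checksum at the end of each line would let the reader reject corrupted messages. A purely hypothetical sketch in Python (I don't know whether your device can do this):

```python
# Hypothetical framing "payload*HH\n", where HH is the XOR of the payload
# characters in hex. The device does NOT do this today; this only shows how
# a checksum lets the reader reject corrupted lines.
def xor_checksum(payload: str) -> str:
    value = 0
    for ch in payload:
        value ^= ord(ch)
    return f"{value:02X}"

def line_is_valid(line: str) -> bool:
    payload, _, received = line.rpartition("*")
    return bool(payload) and xor_checksum(payload) == received

print(line_is_valid("51,93,31*" + xor_checksum("51,93,31")))  # True
print(line_is_valid("51,93,3*" + xor_checksum("51,93,31")))   # False (corrupted)
```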

 

Mike...


Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 2 of 18

You do not need the 20 ms Wait. The VISA Read will wait until it gets 20 characters, the termination character, or a timeout, whichever occurs first. If you get a short message such as "1,1,1\n", which only takes about 6 milliseconds, the loop will wait out the rest of the 20 ms. During that time one or two more complete messages could have been sent. I am not sure how the port handles such situations, but this might explain part of your issue.

 

Try removing the Wait.

 

Lynn

Message 3 of 18

I added the Wait so that there is less chance of this loop hogging the CPU, and so it is easy to visually check the read buffer (at 1000 ms waits); however, I have tried removing it and the problem still persists. Thanks for the insight into how VISA Read works.

Message 4 of 18
As long as the port buffer doesn't overrun, the messages should be buffered. Of course, if it did overrun you'd never know about it due to the lack of error handling.

Mike...

Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 5 of 18

Would this be the correct way to attach error clusters? I have not implemented error handling in LabVIEW before. 

 

I verify the COM port number using Device Manager (I only have one device plugged into the computer). Baud rate is set to 9600 on the chip. I have tried every option for flow control. I've always left data bits at 8.

 

I'll try to capture a long string now and see if the output is still fudged. 

 

Will get device info and reply to second part of question shortly. 

Message 6 of 18

I agree with Mike; you need to have the error terminals wired in order to really know what's going on. It's hard to debug when errors happen and you don't know it.

 

I also note that you configure some of the serial parameters but don't mention anything about data bits, stop bits, parity, flow control, etc. Make sure ALL parameters are correct.
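Spelled out, the full set that has to match the device looks like this (Python/pySerial sketch; the values are the common 9600-8-N-1 defaults with no flow control, which may or may not be what the chip actually uses):

```python
# Every serial parameter that must match the device, spelled out explicitly.
# These are the common 9600-8-N-1, no-flow-control defaults; they are
# examples, not necessarily what the chip actually uses. "COM3" is an example.
import serial

port = serial.Serial(
    port="COM3",
    baudrate=9600,
    bytesize=serial.EIGHTBITS,     # data bits
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    xonxoff=False,                 # software flow control
    rtscts=False,                  # hardware RTS/CTS flow control
    dsrdtr=False,                  # hardware DSR/DTR flow control
    timeout=2,                     # seconds
)
```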

 

Kelly Bersch
Certified LabVIEW Developer
Kudos are always welcome
Message 7 of 18

One thing I've used to try to detect what's going on is the VISA Bytes at Serial Port property node inside the while loop, and it reads about 12,000. Not sure if this is expected / okay?

Message 8 of 18
The answers to your questions are (in order):

No, and Hell No.

You are not keeping up with the flow of data coming in on the port.
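The fix is to let the read itself pace the loop and keep the backlog near zero; in text form the idea looks roughly like this (Python sketch; "COM3" is an example, and in_waiting is pySerial's equivalent of Bytes at Port):

```python
# Sketch: read each message as soon as it arrives (no fixed wait) and watch
# the backlog. in_waiting is pySerial's equivalent of "Bytes at Port" and
# should stay near zero if the reader keeps up. "COM3" is an example.
import serial

with serial.Serial("COM3", 9600, timeout=2) as port:
    port.reset_input_buffer()                # discard the stale backlog once
    while True:
        line = port.readline()               # blocks until the next full message
        print(line.decode(errors="replace").strip())
        if port.in_waiting > 1000:           # backlog growing => falling behind
            print("falling behind:", port.in_waiting, "bytes queued")
```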

Mike...

Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 9 of 18

Here is one way to handle errors. It could be done better, but this is a start.

 

Serial_Read_debugging_added_Error_clusters.png

 

Kelly Bersch
Certified LabVIEW Developer
Kudos are always welcome
Message 10 of 18