
Read data from µC / I/O-Assistant

Solved!

Hi,

 

I have the following problem:

I have a microcontroller connected to a COM-Port.

The microcontroller just counts upwards.

I want to analyse the data using LabVIEW.

So I started the Instrument I/O Assistant, chose the COM port, and selected "read and analyse".

When I now read the data from the device, the analysis window shows

 

00 01 01 01 02 01 03 01 04 01 05 etc.

 

As you can see, the output apparently consists of two-byte values. Now I want to tell LabVIEW that two bytes always belong together.

The problem is that you apparently have to create the tokens yourself.

Is there a way to tell LabVIEW to simply group two bytes together (without the tokens) and put the output on the graph?

 

If that doesn't work, I'd also be glad to know how I can put the single bytes, e.g.

 

00 > 01 > 01 > 01 > 02 > 01 > 03...

 

on the graph.

 

Hope you can give me a clue...

Message 1 of 13

Hi laserco,

 

what is the µC sending? Bytes or words? What do you expect from the µC (and not: what is the I/O Assistant providing to you)?

 

To get single bytes from serial string data you can simply use "String to Byte Array". Use "Index Array" afterwards to get the single bytes.

 

To get words from a serial string you could use "Type Cast" with a proper datatype (like an array of U16). Use "Index Array" afterwards to get the single words...
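Since LabVIEW is graphical, here is a rough Python sketch of what the two approaches above do with the byte stream from the question. Note the assumption that "Type Cast" to an array of U16 interprets the bytes big-endian (high byte first), which is LabVIEW's flattening convention:

```python
# Data as shown in the original post: 00 01 01 01 02 01 ...
raw = bytes([0x00, 0x01, 0x01, 0x01, 0x02, 0x01])

# "String to Byte Array" equivalent: one U8 value per byte of the string
single_bytes = list(raw)

# "Type Cast" to array[U16] equivalent: consecutive big-endian byte pairs
words = [int.from_bytes(raw[i:i + 2], "big") for i in range(0, len(raw), 2)]

print(single_bytes)  # [0, 1, 1, 1, 2, 1]
print(words)         # [1, 257, 513]
```

If the µC actually sends low byte first, the pairs would need `"little"` instead, so it is worth checking the device's byte order before wiring up the typecast.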

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 13

laserco wrote:

When I now read the data from the device, The analyse window shows

00 01 01 01 02 01 03 01 04 01 05 etc.

As you can see, the output consists of two bytes (?). Now I want to tell LabView that always 2 bytes belong together.

The problem is, that you seem to have to create the tokens by yourself.

Is there a way how I can tell LabView just to take two bytes together (w/o the tokens) and put the output on the graph?

If that doesn't work, I'd also be glad to know how I can put the single bytes e.g.

00 > 01 > 01 > 01 > 02 > 01 > 03...

on the graph.

Hope you can give me a clue...


 

I am not quite sure what you mean. Can you tell us exactly what the data format from the µC is? Is it 8 or 16 bit? But don't worry, you have tools in LabVIEW for merging numbers. ;)
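One of the merging tools alluded to here is presumably LabVIEW's "Join Numbers" primitive, which combines two U8 inputs into one U16 with the first input as the high byte. A minimal sketch of that operation:

```python
def join_numbers(hi: int, lo: int) -> int:
    """What LabVIEW's "Join Numbers" does for two U8 inputs:
    hi becomes the upper byte, lo the lower byte of a U16."""
    return ((hi & 0xFF) << 8) | (lo & 0xFF)

print(join_numbers(0x01, 0x05))  # 0x0105 = 261
```

Equivalently, `hi * 256 + lo`; the bit shift just makes the byte positions explicit.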

 



Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
(Sorry, no LabVIEW "brag list" so far)
Message 3 of 13

Hi,

 

I tried using VISA instead of the assistant, and that way I got it to work, more or less.

The µC always sends a Hi-Byte and a Low-Byte.

So I set "bytes to read" to 2 and connected it to the waveform graph. Now I can see a line... The only "problem": the graph "jumps" because I let the graph configure the x- and y-axes automatically...

Any ideas what I can set the settings to?

 

Another problem is that data is read even if the VI is stopped. That means that when I stop the VI for a while, some data is still sent, and if I restart the VI, I get a lot of data at once...

Message Edited by laserco on 10-21-2009 02:55 AM
Message 4 of 13

Hi laserco,

 

"if I restart the VI, a get a lot of data at once..."

 

Did you close the VISA connection?

 

"The graph "jumps""

How can it jump when the µC just counts up?

Did you enable "Loose Fit" ? Just use automatic scaling without it...

 

Can "only" really be escalated to "onliest"? ;)

Message Edited by GerdW on 10-21-2009 10:15 AM
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 5 of 13
I'll attach the VI (is editing a post twice not possible?)
Message 6 of 13
Solution
Accepted by topic author laserco

Hi laserco,

 

you can edit as often as you want, given the 10 min time limit and that no one else has written a message in between ;)

 

Your VI shows some common mistakes:

- you don't open the VISA connection properly

- in each loop iteration you configure your serial port again - why?

- you don't close the VISA connection at the end

 

You should get into the habit of handling such "connections" (VISA sessions, queues, file handles) properly. Every time: open the connection once, configure it once, do the read/write in a loop, and clean up (i.e. close) at the end!
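The open/configure/loop/close pattern described above can be sketched in Python. `FakePort` is a made-up stand-in for a VISA serial session (so the example runs without hardware); the comments map each step onto the corresponding VISA node:

```python
class FakePort:
    """Hypothetical stand-in for a VISA serial session."""
    def __init__(self, name):       # "VISA Open"
        self.name = name
        self.count = 0
    def configure(self, baud):      # "VISA Configure Serial Port" - ONCE, not per iteration
        self.baud = baud
    def read(self, n):              # "VISA Read" - this belongs inside the loop
        self.count += 1
        return bytes([self.count % 256] * n)
    def close(self):                # "VISA Close" - once, at the end
        self.name = None

port = FakePort("COM1")     # open once, before the loop
port.configure(9600)        # configure once, before the loop
samples = []
for _ in range(5):          # only the read lives in the loop
    samples.append(port.read(2))
port.close()                # clean up at the end

print(samples)
```

Reconfiguring the port on every iteration (as in the attached VI) wastes time at best and can reset the device's buffers at worst, which is why the configure step sits outside the loop.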

 

Edited:

Next bad habit: there is also no error handling included. I didn't add it to the VI either...

Edited 2:

Changed attachment: switched off "Loose Fit" for y-scale...

Message Edited by GerdW on 10-21-2009 10:25 AM
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 7 of 13

No, I forgot to close the connection! :O

I should take a closer look at the examples...

 

By saying that the graph "jumps" I mean that the y-axis is always rescaled when the plot reaches the top.

That's why I now set the y-axis manually from 0 to 255...

 

Another strange thing is that the graph is sometimes "inverted" when I start the VI. Then it looks like the attached screenshot...

Message 8 of 13

Hi laserco,

 

does this "inversion" only happen for the first datapoint?

Then it may come from "inside" the chart: it remembers the last datapoint and starts to draw the line from this value...

 

"y-axis is always autoscaled when the graph reaches the top."

What else did you expect from autoscaling? ;)

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 9 of 13

Okay, I used String to Byte Array... Well, the array is [x, y], where x is the number of the iteration and y the actual number the µC sends. So after some time it shows e.g. [20, 150]...

But when I use Index Array now, I cannot put this data on a waveform graph, can I?

Message 10 of 13