Community Documents


Is LabVIEW a general programming language?

The VIEW, by Jeff Kodosky

 

I often hear, and sometimes get drawn into, the debate over whether LabVIEW is a general purpose language or an application-specific development environment for measurement and automation. On the one hand, experienced programmers point out features of popular languages which are missing from LabVIEW, but on the other hand, some users point to elaborate general purpose applications they built using LabVIEW but without any data acquisition or analysis at all.

 

A survey of LabVIEW users would likely match a recent informal poll of developers on the team, where the overwhelming majority were of the opinion that LabVIEW has sufficient functionality to be classified as general purpose and, in fact, used it that way. The most often cited deficiencies are the lack of native recursion and recursive data types, followed by object-oriented constructs, but neither was a serious obstacle to the general purpose applications that were built.

 

The Wrong Question.
Poll results notwithstanding, I think this is the wrong question and trying to answer it leads in the wrong direction. To me, it is a little like asking: is a car a place to sit? Of course you can sit in a car, but if that is all you do with it you are missing the main point of having one. A better question is: can LabVIEW be used for general purpose programming? Or better still: can LabVIEW be used to create general purpose applications?

 

The new formulation of the question has the same fuzziness about what is considered general purpose, but it downplays the sometimes religious argument of whether or not LabVIEW is a programming language. Some people do not consider it a language because it isn't text-based and it isn't sequential. Oddly enough, the narrowest views of what is considered a programming language are held by people with a computer science background.

 

The most important aspect of the revised question, however, is that it gets the inclusiveness turned around the right way. Stated another way, the original question indirectly implies that general purpose programming is in some sense a larger problem or a superset of measurement and automation programming. However, the subsetting actually goes the other way.

In general, measurement and automation programs have to deal with all of the same issues as general purpose programs, the common issues of data structures and algorithms, file I/O, network I/O, user I/O, database access, printing, and on and on. But measurement and automation programs also have to deal with many more issues than general purpose programs, such as physical I/O, real-time constraints and hardware configuration. They can also have some of the most demanding user interface requirements of all. Measurement and automation programs deal with a superset of the concerns dealt with by general purpose programs.

 

If tool A and tool B can be used for a certain set of tasks, but tool B has more functionality that makes it useful for additional tasks, which tool is really the more general one? This is precisely where we are with LabVIEW. LabVIEW's suitability for measurement and automation applications comes about not because its fundamental programming abilities are restricted in some way, but because they are enhanced and extended.

 

This is why it is important to ask the question "can LabVIEW be used to create general purpose applications?" as opposed to the question "is LabVIEW a general purpose programming language?" We do not want to limit the scope or the further evolution of LabVIEW by having it viewed as only a programming language.

 

LabVIEW is much more than a programming language. It is a highly interactive environment for the rapid prototyping and incremental development of applications, from measurement and automation to real-time embedded to general purpose. And now, with the ability to target FPGAs, LabVIEW is a hardware design tool as well.

 

Dataflow.
At the heart of LabVIEW is structured dataflow diagramming. Data flow has been around for a long time and is well understood. It is, in fact, a much richer computational model than the control flow of popular text-based languages because it is inherently parallel. C/C++ and BASIC are not. They have to rely on library calls to operating system functions to achieve parallelism. As a result, the compiler cannot help ensure that shared sections of code are properly protected, making it difficult to build parallel programs. These problems do not exist in LabVIEW. Even a novice can design a highly parallel application. And no extra effort or knowledge is needed to get it to automatically spread over multiple tightly-connected processors.
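The contrast can be sketched in a conventional language (a hypothetical illustration, not LabVIEW itself; the function names are invented). In a dataflow model, two nodes with no wire between them have no data dependency, so a scheduler is free to run them concurrently without any thread or lock code appearing in the diagram. In a text language, that dispatch must be written out explicitly:

```python
from concurrent.futures import ThreadPoolExecutor

def acquire_a():          # stands in for one independent subdiagram
    return sum(range(1000))

def acquire_b():          # stands in for another, sharing no data with the first
    return max(range(1000))

def combine(a, b):        # a node that fires only when both inputs have arrived
    return a + b

# In control flow, the programmer must call into a threading library
# to get the parallelism that a dataflow scheduler infers automatically.
with ThreadPoolExecutor() as pool:
    fa = pool.submit(acquire_a)   # both independent "nodes" dispatched at once
    fb = pool.submit(acquire_b)
    result = combine(fa.result(), fb.result())

print(result)  # 500499
```

Note that correctness here depends on the programmer seeing that `acquire_a` and `acquire_b` share no state; in a dataflow diagram that independence is visible in the absence of a wire.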

 

Data flow has long been advocated as a design tool for business applications. It is being advanced as an alternative computer architecture to avoid the von Neumann bottleneck. Data flow analysis is at the heart of optimizing compilers. Why not program applications using data flow? The natural representation of a data flow is a graph or diagram, so until the mouse and bitmapped graphics came along, it was simply impractical; text entry of a data flow diagram would be analogous to writing a text description of a street map, time consuming and error prone. But now, with fast, large memory machines, and ubiquitous big screens, direct interactive editing of a data flow diagram is simple.

 

Sometimes when showing a LabVIEW diagram, I hear the question, "where is the code?", as if the diagram isn't real unless it generates text. I can't help but marvel at how successful we have been collectively, as an industry, in convincing the world that a limitation in our traditional programming tools is actually an advantage. In reality, it is a severe disadvantage to limit the connection between program editor and program compiler to a simple ASCII stream. People don't ask where the text is when handed a music CD. We don't have or need a text version of the CD contents because we have tools that edit and play music directly from a binary storage format convenient for the tools. Same for video. VCRs record and play video without any intermediate text representation.

 

So why is it different for programming languages? Historically, it was necessary to have a separate editor and compiler, and the easiest thing to do was have them connect with the lowest common denominator, ASCII characters. As machines got larger and faster, integrated development environments appeared, but still the lowest common denominator connection persists. The wealth of information in the indented layout of a program's text, for instance, is completely ignored by the compiler. There were many attempts to design syntax-directed editors, but all ultimately failed because character-by-character editing is so ingrained that it is impossible to get to a higher level of editing structure-by-structure. The compiler just accepts a stream of 7-bit ASCII characters directly assembled using the editor. We use different fonts and colors and type styles when making documents for human consumption, but there is no attempt to exploit these dimensions in our programming language editors or compilers.

 

Interestingly enough, some researchers experimenting with graphical and pictorial programming models have a similarly limited view. The editor produces a picture that the compiler parses. The 2D image is the program and it can just as easily be understood printed on paper as on the screen. The knowledge of how the image was constructed is completely ignored by the compiler as it parses the image from scratch.

 

LabVIEW takes a different approach. LabVIEW's data flow diagram is a little more than 2D, with a wealth of information that can pop up as needed, such as tip strips, but isn't around all the time to clutter the diagram. It is possible to print out a LabVIEW application, but it is much easier to view it and navigate it within LabVIEW. The compiler does not have to parse the diagram, because it is already parsed. The editor constructs the parse tree as the diagram is interactively constructed. All the user gestures that construct the graphics also construct the parse tree. What gets passed to the compiler or saved in a file is much richer than the graphics visible on the screen. So, LabVIEW is much more like the VCR model than the text editor model in this respect. And the richer data that is passed to the compiler makes it possible to compile the diagram very fast, to the point where the user can simply ignore that it's happening. That means the cycle time between making a change and trying it out can be very short.
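The editor-builds-the-parse-tree idea can be illustrated with a toy sketch (all names invented, and greatly simplified from anything LabVIEW actually does): each editing gesture updates the program structure directly, so there is never a text stream for a compiler to re-parse from scratch.

```python
class Node:
    """A diagram node: an operation plus wires to its upstream inputs."""
    def __init__(self, op):
        self.op = op          # a Python callable standing in for a primitive
        self.inputs = []      # wires: references to upstream nodes

class Diagram:
    def __init__(self):
        self.nodes = []

    def add(self, op):        # editor gesture: drop a node on the diagram
        node = Node(op)
        self.nodes.append(node)
        return node

    def wire(self, src, dst): # editor gesture: draw a wire between nodes
        dst.inputs.append(src)

    def run(self, node):      # "compiler": walk the graph that already exists,
        # evaluating upstream nodes first -- no lexing or parsing step at all
        return node.op(*(self.run(i) for i in node.inputs))

d = Diagram()
x = d.add(lambda: 6)
y = d.add(lambda: 7)
mul = d.add(lambda a, b: a * b)
d.wire(x, mul)
d.wire(y, mul)
print(d.run(mul))  # 42
```

The point of the sketch is that `add` and `wire`, the editing gestures, are also the parser: by the time `run` is called, the structure a text compiler would have to recover is already in memory.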

 

The speed of the compiler is just one of many reasons for the high productivity users experience working with LabVIEW. Because the editor constructs the parse tree, it is able to give syntactic and semantic feedback immediately, so mistakes are detected earlier and fixed more quickly.

 

The editor has a rich set of operations to quickly create elaborate user interfaces by direct manipulation. The fact that every module, or VI, has a user interface means that interactive testing is simple to do at each step, without writing any extra code. The fraction of the application that has to be completed before meaningful testing can take place is much smaller in LabVIEW than in traditional programming tools, making design iteration much faster.

 

Even the data types on the diagram are easy to use. Strings and arrays can be routed and operated on without worrying about the details of memory allocation, which means a whole host of errors, such as losing or overwriting memory, just do not exist.
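The same property holds in any managed language; a minimal sketch in Python (the variable names are invented for illustration) shows strings and arrays growing as needed, with no allocation, resizing, or bounds bookkeeping for the programmer to get wrong:

```python
# Storage grows automatically as samples are appended -- there is no
# buffer to pre-size, so "forgot to resize" bugs cannot occur.
waveform = []
for sample in range(5):
    waveform.append(sample * 2)

# Concatenation builds a new string; the old one is reclaimed by the
# runtime, so nothing is lost or overwritten behind the program's back.
label = "ch0"
label += "-filtered"

print(waveform, label)  # [0, 2, 4, 6, 8] ch0-filtered
```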

 

The net result of all these capabilities in LabVIEW is greatly increased productivity. Anecdotal evidence from many sources suggests four to ten times the productivity of traditional programming tools. This, therefore, may be the biggest reason for not considering LabVIEW to be a general purpose programming language. It is a much higher leverage design tool which scales from desktop machine to embedded processor to FPGA. Calling it just a language would be a big disservice to the entire LabVIEW community.

 

The Bottom Line.
As we continue the development and evolution of LabVIEW, we will continue improving productivity and performance, extending functionality, and expanding the number of possible deployment targets. However, we will not be constrained by the traditional boundaries between language, editor, compiler, debugger, device driver, and so on, because we believe there are possibilities for improving performance while simultaneously reducing complexity by rethinking the situation from first principles. And by working closely with the LabVIEW user community, we intend to bring these possibilities to fruition.

So, bottom line, is LabVIEW a general purpose programming language? No, it is much more than that. Can LabVIEW be used to create general purpose applications? Absolutely.


Author Biography


Jeff Kodosky co-founded National Instruments in 1976 and has served as a Director since that time. He was appointed Vice President of the company in 1978, served as Vice President, Research & Development from 1980 to 2000, and was recently named NI Business and Technology Fellow. He is well-known as the inventor of LabVIEW, the company's graphical instrumentation software package. Prior to 1976, he was employed at ARL, UT Austin. Jeff received his BS in Physics from Rensselaer Polytechnic Institute.


*Reposted from earlier article for content migration
