LabVIEW Idea Exchange

Broken_Arrow

Change CLUSTER SIZE to a Wired Input

Status: New

Cluster Size as a Wired Input:

 

  • Easier to see
  • More explicit
  • Nearly impossible to forget to set (if it were a required input).

 Cluster Size.gif

Richard






30 Comments
myle00
Member
You should be able to do that today by creating a polymorphic VI, which is basically a group of VIs that you drop as a single VI, choosing one of them at edit time. Usually the relevant VI is selected automatically, based on the data types you wire into it, but you can also show a selector ring, which would function much like the enum you want.

That'll actually work well, especially if I name the VIs with the size of their clusters, because the cluster size will then be visible on the block diagram. And inlining the VIs should remove the calling overhead.

Brian_Powell
Active Participant

I'd like to step back and think about this a different way, especially after seeing that there are two different requests...  (1) Show the cluster size visually, and (2) have the number of cluster elements determined dynamically.  Right now, I can tell you we will probably never do #2; we want stuff like this statically typed.

 

But anyway, why do we want to convert between arrays and clusters?  I would argue that a common use case is because we are using the array as a sort of polymorphic data type for the cluster.  E.g., I used Read Spreadsheet File to get an array of values from a spreadsheet, and I want to initialize a cluster with those values.  The array was a convenient data type for reading the data.  Do you think this is a common use case for Array to Cluster?

 

So what could we do to obviate the need for Array to Cluster?  (At least most of the time?)  Is there something else we should be doing instead?

 

Also, the cluster coming out is, for lack of a better word, generic.  If you are going to convert it to a cluster, wouldn't you rather name the elements?  I think someone suggested having a required cluster input for data type (a la Type Cast).  Wouldn't this be better than making the number visible?

myle00
Member
So what could we do to obviate the need for Array to Cluster?  (At least most of the time?)  Is there something else we should be doing instead?

I use Array to Cluster only when I need to interface to a DLL. For example, say I have a cluster, cluster A, that needs to be passed into the DLL. If cluster A contains an array, Adapt to Type won't pass the array inside the cluster properly, so I create a new cluster B in which the array is replaced by a cluster (via Array to Cluster), so that the array's elements become embedded directly in the top-level cluster. This typically comes up when I want to pass text embedded in a cluster (i.e., convert string to array to cluster). It also works well here because, if the string is smaller than the required size, the cluster is always padded out to the right size.

Kevin_Price
Proven Zealot

Here's a little something I discovered as a potentially unsafe workaround.  For some context, go see this older posting.

 

 

array typecast to cluster.png

 

 

Typecasting an array to a cluster seems to function in a predictable and expected way. The type input of the Type Cast function then becomes a way to control, and document, the number of cluster elements.

 

In my initial app, the clusters in question were typedef'ed. When design changes led me to add or remove cluster elements, none of the code broke or got buggy, as it would have with the "Array to Cluster" primitive.

 

Caution: as tst warned in the thread I linked, this may not have guaranteed behavior in future versions of LV.

 

-Kevin P

CAUTION! New LabVIEW adopters -- it's too late for me, but you *can* save yourself. The new subscription policy for LabVIEW puts NI's hand in your wallet for the rest of your working life. Are you sure you're *that* dedicated to LabVIEW? (Summary of my reasons in this post, part of a voluminous thread of mostly complaints starting here).
Jeff-P
NI Employee (retired)

I came to post this idea and found this one. I really like it, but what I'd really like is to be able to specify a cluster input. My main reason: if my cluster is a typedef and I change the number of elements in it, I don't want to hunt down all of my Array to Cluster nodes to update the number. I can wrap them in a VI so that I only have to change it in one place, but that feels like a step I shouldn't have to take. The additional input would also eliminate the coercion step when I convert the result to the typedef'ed cluster.

 

Jeff Peacock 

 

Product Support Engineer | LabVIEW R&D | National Instruments 

smithd
Active Participant

Adding kudos to Jeff's variant of the idea.

Darin.K
Trusted Enthusiast

Jeff's variant is better addressed by (1) making Coerce to Type a first-class citizen and (2) adding support for clusters to it.

rolfk
Knight of NI

The cluster size as a numeric input shouldn't be possible, but the size should somehow be visible. I like the idea of an editable label below or above the node that shows the actual size (and that the user could hide if desired). Mutated code from old versions could start out with this label hidden, to maintain the existing look and avoid overlaying other objects located too close to the node.

In addition, rather than starting out with an arbitrary size when placed on the diagram, the node should start as adaptable and deduce the size from the downstream cluster, if possible, as soon as it gets wired up.

This function existed as far back as LabVIEW 2, and most likely even earlier, and LabVIEW didn't have backwards datatype propagation back then, so it could not have been implemented that way. But this node has never seen any love since. That right-click dialog for setting the size feels awkwardly out of place nowadays; if anything, it should be part of a proper Properties dialog.

I also like the additional option, in the form of a right-click menu entry "Enable Datatype Input", that would allow wiring a cluster with the desired configuration. This input wire should be broken if the datatype contains different types of scalar elements or any non-scalar element, but the node should allow normal coercion from one numeric datatype to another. So it would properly coerce from an array of integers to a cluster containing floats, for instance, with the corresponding coercion dot on the input.

Rolf Kalbermatter
My Blog
wiebe@CARYA
Knight of NI

Wiring the size as a constant would work if LabVIEW had something similar to C++'s constant expressions.

A constant expression is evaluated at compile time. This would be useful in situations like this one (and a few others, like malleable VIs). Wiring a control could break the wire, with the error "constant expression required".

I don't see LabVIEW constant expressions coming any time soon, but it would be funky.

JohnatanBravo
Member

Wow, it's crazy to see that after all this time there's still no solution for this.

I came upon this idea while searching for "how to change cluster size programmatically"; I needed it for my database connectivity.

This is my solution. It was easy to set up because, in my case, the size will never exceed 30 (I still made provision for sizes up to 100, since it was quick enough to just copy and paste...).

 

ezgif.com-video-to-gif.gif