ChatGPT AI Assisted Programming with LabVIEW Discussion


@paul_a_cardinale wrote:

I think that "AI" is misnamed. I think it should be called 'Artificial Stupidity'.


And any solutions that come from it should be abbreviated ASS

Message 21 of 30

@paul_a_cardinale wrote:

I think that "AI" is misnamed. I think it should be called 'Artificial Stupidity'.


I think the issue is LabVIEW, not the "Artificial Stupidity". For example:

Hello ChatGPT, please make me a PWM controller in C:

Ok, here you go:
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define PWM_PERIOD_US 20000 // PWM period in microseconds

void pwmInit() {
    // Set up GPIO pins or any other hardware initialization
    
    // For example, if you're using the Raspberry Pi GPIO:
    // Initialize GPIO pin as output
    // gpioSetMode(pinNumber, PI_OUTPUT);
}

void pwmWrite(int pin, float dutyCycle) {
    int pulseWidth = (int)(PWM_PERIOD_US * dutyCycle);
    
    // Set the pin high
    // For example, if you're using the Raspberry Pi GPIO:
    // gpioWrite(pin, PI_HIGH);
    
    // Delay for the pulse width duration
    usleep(pulseWidth);
    
    // Set the pin low
    // For example, if you're using the Raspberry Pi GPIO:
    // gpioWrite(pin, PI_LOW);
    
    // Delay for the remaining period
    usleep(PWM_PERIOD_US - pulseWidth);
}

int main() {
    pwmInit(); // Initialize PWM
    
    // Example usage: Generate a PWM signal with a duty cycle of 50% on pin 18
    int pwmPin = 18;
    float dutyCycle = 0.5;
    
    while (1) {
        pwmWrite(pwmPin, dutyCycle);
    }
    
    return 0;
}

 

 

 

Hello Dall-E, please make me a PWM controller in LabVIEW:

Response:

[image: gpt labview.png]

... lol : )

I think our jobs are safe for now.

 

 

 

______________________________________________________________
Have a pleasant day and be sure to learn Python for success and prosperity.
Message 22 of 30

@Jay14159265 wrote:

LAbviak, the spooderman version of LabVIEW

Message 23 of 30

@alexderjuengere wrote:

@Jay14159265 wrote:

LAbviak, the spooderman version of LabVIEW


spooderman lol : )

I have ported all my code to LAbviak; the UI is crap, the compiler is crap, but it's open source!

______________________________________________________________
Have a pleasant day and be sure to learn Python for success and prosperity.
Message 24 of 30

@alexderjuengere wrote:

@Jay14159265 wrote:

LAbviak, the spooderman version of LabVIEW


I just got the notification for this thread and didn't remember the context (and didn't scroll up), so I googled "LAbviak" to see what it was.

 

Maybe don't Google that one on your work computers... 😉

Message 25 of 30

sorry 😄

 

Spoiler

[image: alexderjuengere_1-1691063312809.png]

 

 

Message 26 of 30

@Kathlynn wrote:

Yes, ChatGPT usually acts pretty sure of itself and might make up believable stuff instead of saying it doesn't know. So, I don't think programmers will be out of work anytime soon. But even if AI can write most of the code and messes up sometimes, you can still fix those mistakes and get things done faster.


I disagree. I think it would be difficult to debug. No one wrote it, so no one is familiar enough with the code to know where to look if something goes sideways.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 27 of 30

@Kathlynn wrote:

Yes, ChatGPT usually acts pretty sure of itself and might make up believable stuff instead of saying it doesn't know. So, I don't think programmers will be out of work anytime soon. But even if AI can write most of the code and messes up sometimes, you can still fix those mistakes and get things done faster.


Another problem with that assertion is that things like ChatGPT are often used by people who are barely able to write a "hello world" program in their programming language of choice. They have no way to understand what the program really does, and even less chance of spotting inaccuracies or outright errors, let alone fixing them.

 

There are enough examples in other areas. AI has fabricated legal cases that were then used by professional lawyers in front of courts. Judges usually seem to actually check whether those cases really exist, but once they use ChatGPT too for that research, things spiral totally downwards.

 

Before you know it, you have a self-reinforcing spiral of fabricated truth that is almost impossible to penetrate. Well, we already have that in a way without AI. It's proven that hearing a falsehood often enough will eventually make many people believe it, despite actually knowing that it is not the truth. Thanks to social media, it's very easy to reach lots of people over and over again, and in that way influence what the masses feel is the actual truth.

Rolf Kalbermatter
My Blog
Message 28 of 30

I have gone to the dark side and done quite a bit of C# programming recently. 

I am getting better at prompting the free version of ChatGPT, and my coding skills are good enough to find, tweak, and validate its output. It's also been good with compiler errors, and for explaining C# language features. I have C / MFC experience from over 10 years ago, and ChatGPT has been a game changer for the code I have been writing. I would not have got half the tickets done in twice the time using Google / Stack Overflow / Code Project, etc., and figuring it out for myself. It's been easy enough and enjoyable (I enjoyed LabVIEW more in the past) to develop with.

 

I got ChatGPT to whip up a class to RawPrint ZPL data to a Zebra label printer, and a serial port class for barcode scanning. I was impressed.

 

One of my gripes with other languages is that in MFC, at least, it took something like 10 lines of code to prompt for a file path, and there were 10 different ways to do it across C / MFC / Microsoft Java, etc. In LabVIEW there is one standard VI you can just plonk on the block diagram. I feel ChatGPT has removed this issue for me; I feel considerably more productive.

 

Yes, garbage in, garbage out is true for AIs as well. I think that will require better-curated data. This may give NI an advantage, since no one else will create their own version of Nigel for LabVIEW. Just run the AI over this forum. If they add a meta tag (with version) to the AI output, they should be able to avoid the garbage-in problem as they refine their AI.

Message 29 of 30

@Kiwi-wires wrote:

 

One of my gripes with other languages is that in MFC, at least, it took something like 10 lines of code to prompt for a file path, and there were 10 different ways to do it across C / MFC / Microsoft Java, etc. In LabVIEW there is one standard VI you can just plonk on the block diagram. I feel ChatGPT has removed this issue for me; I feel considerably more productive.

 

 


same here.
 

Message 30 of 30