LabVIEW crashes when reading from a very large amount of data

The basic task is to store a very large amount of data (e.g. a 10 GB U8 array) in something similar to a global variable. Since LabVIEW can sometimes copy data in unexpected ways, I used an array of Data Value References (DVRs), each referencing a 1D U8 array, to avoid that. I first allocate the memory, then perform as many of these operations as I wish, and in the end deallocate it.
I can't use a Functional Global, because this "variable" must have several separate instances. So my colleague and I created a VI that is used like an FG, but it has no shift registers or feedback nodes; instead, it passes a cluster of DVRs in and out. The data written and read are files of identical size, so we also store an array of file offsets (effectively U64 pointers) so that we can access them at will.
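In plain C terms (just a conceptual analogue, since our actual code is LabVIEW G; all the names here are made up for illustration), the layout is roughly:

    #include <stdint.h>
    #include <stddef.h>

    /* Conceptual analogue of the design: each "DVR" is one heap-allocated
     * U8 chunk, and offsets[] records where each fixed-size stored block
     * begins in the combined address space of all chunks. */
    typedef struct {
        uint8_t  **chunks;      /* one pointer per DVR (1D U8 array)       */
        size_t     num_chunks;
        size_t     chunk_size;  /* e.g. 1 GB per chunk                     */
        uint64_t  *offsets;     /* start offset of each stored block (U64) */
        size_t     num_blocks;
    } BigBuffer;

    /* Map a global offset to a position inside one chunk. A block that
     * spans a chunk boundary touches two chunks, hence "one or two DVRs"
     * per access in the LabVIEW code. */
    static uint8_t *addr_of(const BigBuffer *b, uint64_t off)
    {
        return b->chunks[off / b->chunk_size] + (off % b->chunk_size);
    }
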
But here comes the actual problem: all of this works up to a limit, and, curiously enough:
1. allocation: initializing a 1 GB U8 array and creating a given number of DVRs in a loop works fine.
2. writing: accessing the appropriate DVR (or two of them, when a block spans two chunks) via an In Place Element structure and replacing a subarray works fine.
3. reading: accessing the DVRs in a similar way and getting an array subset crashes LabVIEW.
The tester sequentially allocates the memory, writes to it (even 10 GB), reads from it, and deallocates it. The problem occurred only after roughly 6 GB had been read successfully.
It is not caused by a lack of physical RAM, as I tested it on a 16 GB computer. Moreover, on that computer the crash was an abrupt (about one-second) shutdown of LabVIEW, with no message from it or from Windows. On two different 8 GB computers the behavior was different: LabVIEW hangs and a Windows message just says "program LabVIEW 12.0.1f5 Development System stopped working".
I never got an "out of memory" error, as all those machines use a swap file. After RAM ran out, the program kept running, although it was painfully slow. This problem occurs on 64-bit Windows 7 with 64-bit LabVIEW.
Is something wrong with LabVIEW itself, or maybe with the way Windows manages memory? Or should I use a totally different approach?

Message 1 of 2

Hi Andrzej, and welcome to the NI forums!

My first question would be whether you really need 10 GB of data resident in memory. If access throughput or latency does not demand it, it would make more sense (at least to me) to stream to/from a fast disk rather than keeping everything in memory. You can create a huge binary file and a similar FGV-style API to access said file from anywhere in the system. There is even an OpenG toolkit written specifically for accessing large files, available on VI Package Manager.
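To illustrate the idea in plain C rather than G (this is not the OpenG API; the block size, helper names, and error handling are placeholders):

    #include <stdio.h>
    #include <stdint.h>

    /* Sketch: keep equally sized blocks in one big binary file and access
     * them by offset instead of holding everything in RAM. _fseeki64 is
     * the 64-bit seek in the Microsoft C runtime, since the thread is on
     * 64-bit Windows. */
    #define BLOCK_SIZE (100u * 1024u * 1024u)  /* 100 MB per block */

    static int write_block(FILE *f, uint64_t index, const uint8_t *data)
    {
        if (_fseeki64(f, (int64_t)index * BLOCK_SIZE, SEEK_SET) != 0)
            return -1;
        return fwrite(data, 1, BLOCK_SIZE, f) == BLOCK_SIZE ? 0 : -1;
    }

    static int read_block(FILE *f, uint64_t index, uint8_t *out)
    {
        if (_fseeki64(f, (int64_t)index * BLOCK_SIZE, SEEK_SET) != 0)
            return -1;
        return fread(out, 1, BLOCK_SIZE, f) == BLOCK_SIZE ? 0 : -1;
    }

The FGV-style wrapper would then only need to hold the file reference and the block size, just as your current one holds the cluster of DVRs.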

 

That being said, if large-scale in-memory storage is what you need, then the array of DVRs you built is a good way to go. A few things I would suggest:

  • Check to see if any error logs were created in Documents\LabVIEW Data\lvfailurelog. Those might help to identify the crashes you're experiencing.
  • If LabVIEW just shuts down with no error message, that looks to me like it was terminated by an external process. Try checking your antivirus settings.
  • Try varying how the data is divided among the DVRs. For example, try 1 DVR with 10 GB, 100 DVRs with 100 MB each, or 1000 DVRs with 10 MB each (see the sketch after this list).
  • I don't think you need that many In Place Element structures. In some places you use three nested layers; a single one (the one handling the array data inside the DVR) should be sufficient.
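
To make the third point concrete, here is the kind of stress test I have in mind, sketched in plain C (the real test would of course be a VI; the chunk counts and sizes are just the example values above):

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Allocate, write, and read back num_chunks chunks of chunk_size bytes
     * each, mimicking the tester's allocate/write/read/deallocate sequence.
     * Try e.g. (1, 10 GB), (100, 100 MB), (1000, 10 MB). */
    static int stress(size_t num_chunks, size_t chunk_size)
    {
        uint8_t **chunks = calloc(num_chunks, sizeof *chunks);
        if (!chunks)
            return -1;
        for (size_t i = 0; i < num_chunks; i++) {
            chunks[i] = malloc(chunk_size);
            if (!chunks[i]) {
                printf("allocation failed at chunk %zu\n", i);
                return -1;
            }
            memset(chunks[i], (int)(i & 0xFF), chunk_size);  /* "write" */
        }
        uint64_t sum = 0;
        for (size_t i = 0; i < num_chunks; i++)              /* "read" */
            for (size_t j = 0; j < chunk_size; j += 4096)
                sum += chunks[i][j];
        printf("checksum: %llu\n", (unsigned long long)sum);
        for (size_t i = 0; i < num_chunks; i++)
            free(chunks[i]);
        free(chunks);
        return 0;
    }

    int main(void)
    {
        return stress(100, 100u * 1024u * 1024u);  /* 100 chunks x 100 MB */
    }

If the crash point moves with the chunk size, that would point at per-DVR overhead; if it stays near the same total byte count, it points at overall memory pressure.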

Let me know if any of this helped. If you're willing to share, I'd also be curious about your application: what is the broader goal you're trying to accomplish?

Best regards,
Andrew Valko

NI Hungary 

Message 2 of 2