Storing data in txt or xls, appending to columns other than the first

Hi,

 

I am running a VI many times (in a For Loop), and each run produces 4096 samples. When I write these data (4096 samples per run) to Excel or a txt file in append mode, everything ends up in the first column, one run after another. Is there any way to put each run's data in the next column? Excel is limited to about 64 thousand rows, and I need to post-process the data. I am using Write To Spreadsheet File.vi for this in LabVIEW 2009. Also, I need to set Write To Spreadsheet File.vi to "not append" mode the first time I write; after that, I can write in append mode.

[Attached image: block diagram snippet from the VI]

Thanks,

 

Ouz

NSC

0 Kudos
Message 1 of 8
(2,770 Views)

Your data is being gathered as a 1D array with 4096 elements.  Each time you write it to the spreadsheet, it adds 4096 rows.  To get what you want, rows of 4096 columns, you have to acquire all the data first and then write to the spreadsheet once.  In other words, gather 4096 data points into a 1D array.  On the next pass (loop iteration), gather another 4096 points and form a 2D array together with the first set.  Keep adding a new row to the 2D array with each pass.  When done, write the 2D array to the spreadsheet file.  You will see many rows of 4096 columns each.

 

The following picture shows how to create 3 rows of 5 columns.  Do the same with your samples.

 

[Attached image: block diagram showing how to create 3 rows of 5 columns]
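
For anyone who thinks in text code, here is a rough Python sketch of the same idea; acquire_run below is just a placeholder for whatever your real 4096-sample acquisition returns, and the file name is hypothetical:

# Accumulate one row per run, then write the whole 2D array once at the end.
import random

def acquire_run(n_samples=4096):
    return [random.random() for _ in range(n_samples)]   # stand-in for the DAQ read

runs = []                                # becomes the 2D array, one row per run
for _ in range(20):                      # one iteration per run
    runs.append(acquire_run())

with open("all_runs.txt", "w") as f:     # hypothetical output file
    for row in runs:                     # each run = one row of 4096 columns
        f.write("\t".join("%.6f" % x for x in row) + "\n")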

- tbob

Inventor of the WORM Global
Message 2 of 8
(2,759 Views)

Hi tbob,

 

thanks for the reply. I think you are suggesting auto-indexing to build a 2D array from the 1D arrays and writing the whole 2D array at the end. It is a good and easy idea, but could it cause a memory error? Sometimes I leave the setup under test for a whole weekend, so it collects a lot of data and gives a memory error. Because of this, I tried to reuse the same 1D array of size 4096. Also, as you can see from the picture, I used Build Array to combine some other data (other arrays) with each 4096-sample set (Build Array is not necessary for a single array as in the picture 😉).

 

Thanks,

 

Ouz

0 Kudos
Message 3 of 8
(2,755 Views)

 


@ouz wrote:

.... Is there any way to put each run's data in the next column ...


 

Data in a file is arranged linearly, one row at a time, so to insert a new column you cannot leave the original data in place. Everything needs to be rewritten, because every row changes size.

 

You can only append if you can write one row at a time instead of one column at a time. If that is acceptable, simply don't transpose at the file I/O.
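
As a rough text-code illustration of appending one row per run (a Python sketch with a hypothetical file name; the idea carries over directly to Write To Spreadsheet File.vi in append mode):

# Append one tab-delimited row per run. The file only grows at the end,
# so nothing already written has to be rewritten.
def append_run(path, samples):
    with open(path, "a") as f:           # "a" = append mode
        f.write("\t".join("%.6f" % x for x in samples) + "\n")

# append_run("runs_by_row.txt", samples_from_this_iteration)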

 

 

Message 4 of 8
(2,748 Views)

My concern with gathering all of the data and then writing it is usually not memory related as much as Bill Gates related.  Who knows when Windows will decide you need to reboot?  Who knows when the power is going to fail?  I know: at the worst possible time.

 

Since you are stuck appending data by rows, I would do that.  Gather 4096 points and write a new row to your spreadsheet file (I happen to use a lot of binary files, but that is just me).  When you are finished gathering data, read the file in a single post-processing step, transpose the data, then write a new file (I am also paranoid about overwriting data).

 

This way you are not going to lose much data, or waste time constantly reading and writing entire files.
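
A rough Python sketch of that post-processing step, assuming the acquisition wrote one tab-delimited row per run (the file names below are hypothetical):

# Read the row-per-run file, transpose it, and write a NEW file so each
# run ends up as a column of 4096 values (the original file is untouched).
def transpose_file(src, dst):
    with open(src) as f:
        rows = [line.rstrip("\n").split("\t") for line in f if line.strip()]
    with open(dst, "w") as f:
        for column in zip(*rows):        # transpose: runs become columns
            f.write("\t".join(column) + "\n")

# transpose_file("runs_by_row.txt", "runs_by_column.txt")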

Message 5 of 8
(2,738 Views)

@ouz wrote:

 

 

... Is there any way to put each run's data in the next column? Excel is limited to about 64 thousand rows, and I need to post-process the data. I am using Write To Spreadsheet File.vi for this in LabVIEW 2009.


I am going to make the argument that if Excel isn't giving you the amount of space you need, because of the limit on rows or the limit on columns, then perhaps Excel is not the best tool to be using to post-process your data.

 

If you are looking at creating a spreadsheet that has 4096 rows and inserting data column by column, then you are going to hit the limit of 256 columns that Excel has.  How many sets of data of 4096 elements each are you looking at collecting in a long run?

 

It is hard to give much advice without knowing how much data you are trying to handle, and how you are trying to process it.  But there are a few things you could do.

 

1.  Do your post-processing in LabVIEW with another VI you create.

2.  Do some intermediate processing.  Let the first VI generate the data row by row, 4096 elements at a time.  Later, create a VI that rereads that file, transposes the array, and creates a new file that Excel will like.  Of course, if you are having memory problems creating a full array in LabVIEW to begin with, you might have problems reading it later to rewrite it.  But if that happens, why aren't you having memory problems with Excel?  You could also read portions of the original file and jump around in it to get to the data you need with creative use of the file pointer (see the sketch after this list).

3.  Create an Excel VBA macro that will read the file, and populate the spreadsheet column by column instead of row by row.

4.  Read Managing Large Data Sets in LabVIEW
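
To make the file-pointer idea in item 2 concrete, here is a rough Python sketch; it assumes each run is stored as 4096 float64 samples in a flat binary file, which is an assumption about the format, not something from the original post:

# Jump straight to one run's 4096 samples without reading the whole file.
import struct

SAMPLES = 4096
RECORD_BYTES = SAMPLES * 8               # 8 bytes per float64 sample

def read_run(path, run_index):
    with open(path, "rb") as f:
        f.seek(run_index * RECORD_BYTES)             # move the file pointer
        data = f.read(RECORD_BYTES)
        return struct.unpack("<%dd" % SAMPLES, data)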

 

 

Message 6 of 8
(2,737 Views)

Hi altenbach,

 

I see your point, and it is good to learn that data in a file is arranged linearly, one row at a time. However, I can only put 256 samples in a single row; when I try to write all 4096 samples, it results in an error (not all of the data ends up in Excel).

 

Thanks,

 

Ouz 

0 Kudos
Message 7 of 8
(2,735 Views)

Hi Ravens,

 

I sometimes do 50-60 runs of 4096 samples (by the way, this is not a weekend run, so I don't have a memory problem in this situation; although more runs would be better, this amount is enough for the rest of the processing), but I usually handle 15-20 runs of 4096 samples (and in addition to these, I have some other arrays in my VI). However, I can't get that many runs when I am not near my computer, because I always have to process the data in Excel before the next set arrives (moving the previous set to the next column by hand). Otherwise, I can only get 16 runs of 4096 samples each due to Excel's row limitation.

 

Thanks for your suggestions. I will try those immediately.

 

Ouz.

 

 

0 Kudos
Message 8 of 8
(2,725 Views)