
Reading large data using Database Connectivity Toolkit

Solved!

Hi,

 

I am using the LabVIEW Database Connectivity Toolkit to connect to my SQLite v3 database over ODBC. I simply connect to the database (using the Open Connection VI) and read all of the values in it (using the Select Data VI). The problem is that my database is huge (around 2 GB), and I get this error:

 

NI_Database_API.lvlib:Rec Fetch Recordset Data (R).vi->NI_Database_API.lvlib:DB Tools Select Data.vi->real_web.vi->real_web.vi.ProxyCallerADO Error: 0x8007000E Exception occured in Provider: Not enough storage is available to complete this operation. in NI_Database_API.lvlib:Rec Fetch Recordset Data (R).vi->NI_Database_API.lvlib:DB Tools Select Data.vi->real_web.vi->real_web.vi.ProxyCaller 

 

What should I do? It seems that the toolkit reads all of the data into memory, and there is not enough space for it. Once the data is read, I want to write it to a file.

 

(Attached screenshot: 1-9-2013 12-35-48 AM.jpg)

Message 1 of 6
Solution
Accepted by topic author ManiAm

Read it in chunks. Write each chunk to the file and then repeat with the next chunk. Trying to read and process the entire database in one shot is not very practical and, as you found out, won't work.
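Not LabVIEW code, but a rough Python/sqlite3 sketch of that chunk-and-write loop. The database file, table, and output file names are made up for illustration; in LabVIEW you would build the equivalent query strings and wire them through the DCT VIs in a while loop.

```python
import csv
import sqlite3

CHUNK = 10_000  # rows per query; tune to how much memory you can spare

# "measurements.db", table "samples", and "samples.csv" are placeholder names.
conn = sqlite3.connect("measurements.db")

with open("samples.csv", "w", newline="") as out:
    writer = csv.writer(out)
    offset = 0
    while True:
        # Fetch one manageable slice of the table per query.
        rows = conn.execute(
            "SELECT * FROM samples LIMIT ? OFFSET ?", (CHUNK, offset)
        ).fetchall()
        if not rows:
            break                  # nothing left to read
        writer.writerows(rows)     # write this chunk to the file
        offset += len(rows)        # then move on to the next chunk

conn.close()
```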



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 2 of 6

Thank you Mark for the suggestion.

Can you give me a hint on how to read the database in chunks?

Is there a block for that, or should I use an SQL command?

 

Thanks!

Message 3 of 6

You will have to do it via your SQL query. Determine some reasonable selection criteria for your data and use that to retrieve the data in chunks.
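As a hedged illustration of what such selection criteria could look like, here is a Python/sqlite3 sketch that pages through the table on an assumed integer key column. The table and column names are placeholders; the same WHERE / ORDER BY / LIMIT pattern would go into the query string you build in LabVIEW.

```python
import sqlite3

CHUNK = 10_000  # rows per query

# Assumes a table "samples" with an integer key column "id" (placeholder names).
conn = sqlite3.connect("measurements.db")
last_id = 0
while True:
    # Selection criterion: everything after the last key we have already seen.
    rows = conn.execute(
        "SELECT id, value FROM samples WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, CHUNK),
    ).fetchall()
    if not rows:
        break
    # ... process / write this chunk to the file here ...
    last_id = rows[-1][0]   # remember where this chunk ended
conn.close()
```

Paging on a key column like this avoids the cost of large OFFSET values, since each query only has to scan forward from the last row already handled.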



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 4 of 6
An alternative is to perform a single query and process the results in chunks.

When you perform a query, ADO creates what is called a recordset. The DCT tries to read the entire recordset at once. The alternative is not to read all the records in one go. Rather:

* read a bunch of records from the recordset
* process those records
* repeat until the recordset property EOF goes true

The advantage is that you don't have to worry about how to define a bunch of individual queries that return datasets of the right size.
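For comparison, here is a Python/sqlite3 sketch of that single-query, read-until-done loop; fetchmany stands in for pulling a block of records from the ADO recordset, and the empty result plays the role of the EOF property going true. The file and table names are placeholders.

```python
import sqlite3

CHUNK = 10_000  # records to pull from the result set per iteration

conn = sqlite3.connect("measurements.db")    # placeholder file name
cur = conn.execute("SELECT * FROM samples")  # one query for everything

while True:
    rows = cur.fetchmany(CHUNK)  # read a bunch of records from the result set
    if not rows:                 # an empty list is the equivalent of EOF = TRUE
        break
    # ... process / write this chunk to the file here ...

conn.close()
```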

Mike...

Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 5 of 6

@mikeporter wrote:
An alternative is to perform a single query and process the results in chunks.

When you perform a query, ADO creates what is called a recordset. The DCT tries to read the entire recordset at once. The alternative is not to read all the records in one go. Rather:

* read a bunch of records from the recordset
* process those records
* repeat until the recordset property EOF goes true

The advantage is that you don't have to worry about how to define a bunch of individual queries that return datasets of the right size.

Mike...

However, it would still require 2 GB+ for the recordset, so you will still run into memory issues on the machine.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 6 of 6