01-08-2013 05:37 PM - edited 01-08-2013 05:38 PM
Hi,
I am using the LabVIEW Database Connectivity Toolkit to connect to my SQLite v3 database over ODBC. I simply connect to the database (using the DB Tools Open Connection VI) and read all the values in it (using the DB Tools Select Data VI). The problem is that my database is huge (around 2 GB), and I get this error:
NI_Database_API.lvlib:Rec Fetch Recordset Data (R).vi->NI_Database_API.lvlib:DB Tools Select Data.vi->real_web.vi->real_web.vi.ProxyCaller
ADO Error: 0x8007000E Exception occured in Provider: Not enough storage is available to complete this operation. in NI_Database_API.lvlib:Rec Fetch Recordset Data (R).vi->NI_Database_API.lvlib:DB Tools Select Data.vi->real_web.vi->real_web.vi.ProxyCaller
What should I do? It seems the toolkit reads all the data into memory and there is not enough space for it. Once the data is read, I want to write it to a file.
Solved! Go to Solution.
01-08-2013 06:00 PM
Read it in chunks. Write each chunk to the file, then repeat with the next chunk. Trying to read and process the entire database in one shot is not very practical and, as you found out, won't work.
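Not LabVIEW, but here is a minimal sketch of the chunked read-and-write idea in Python using the built-in sqlite3 module (the table, column names, and file name are made up for the demo):

```python
import csv
import sqlite3

# Sketch: export a SQLite table to a file in fixed-size chunks so the
# whole table never sits in memory at once. A small in-memory table
# stands in for the real 2 GB database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany(
    "INSERT INTO readings (value) VALUES (?)",
    [(i * 0.5,) for i in range(25)],
)

CHUNK = 10  # tune to whatever fits comfortably in memory

rows_written = 0
last_id = 0
with open("dump.csv", "w", newline="") as f:
    writer = csv.writer(f)
    while True:
        # Keyset pagination: resume after the last id seen. This stays
        # fast on huge tables, unlike a growing OFFSET, which re-scans
        # all the skipped rows on every query.
        chunk = conn.execute(
            "SELECT id, value FROM readings WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, CHUNK),
        ).fetchall()
        if not chunk:
            break
        writer.writerows(chunk)
        last_id = chunk[-1][0]
        rows_written += len(chunk)

conn.close()
print(rows_written)  # 25
```

In the DCT you would do the same thing by wiring a parameterized `SELECT ... WHERE id > ? ORDER BY id LIMIT ?` query into DB Tools Execute Query inside a while loop, writing each chunk to the file before fetching the next.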
01-08-2013 06:19 PM
Thank you, Mark, for the suggestion.
Can you give me a hint on how to read the database in chunks?
Is there a block for that, or should I use an SQL command?
Thanks!
01-08-2013 06:25 PM
You will have to do it via your SQL query. Determine some reasonable selection criterion for your data and use it to retrieve the data in chunks.
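To illustrate "selection criteria" concretely (again in Python rather than LabVIEW, with made-up table and column names): partition the table by some ordered column, here an integer id, and issue one bounded query per range.

```python
import sqlite3

# Sketch of chunking via a selection criterion: one query per id range.
# A small in-memory table stands in for the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (id INTEGER PRIMARY KEY, msg TEXT)")
conn.executemany("INSERT INTO log (msg) VALUES (?)", [("m",)] * 100)

STEP = 30  # rows per range; tune to available memory
total = 0
for lo in range(0, 100, STEP):
    # Each query returns at most STEP rows; process/write them,
    # discard them, then move to the next range.
    rows = conn.execute(
        "SELECT id, msg FROM log WHERE id > ? AND id <= ?",
        (lo, lo + STEP),
    ).fetchall()
    total += len(rows)

print(total)  # 100
```

Any column that is indexed and roughly evenly distributed (an id, a timestamp) works as the criterion; the point is that each query's result set is small enough to hold in memory.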
01-09-2013 04:40 AM
01-09-2013 09:03 AM
@mikeporter wrote:
An alternative is to perform a single query and process the results in chunks.
When you perform a query, ADO creates what is called a recordset. The DCT tries to read the entire recordset at once. The alternative is not to try to read all the records in one go. Rather:
* read a bunch of records from the recordset
* process those records
* repeat until the recordset property EOF goes true
The advantage is that you don't have to work out how to define a series of individual queries that each return a dataset of the right size.
Mike...
However, it would still require 2 GB+ of memory for the recordset, so you would still run into memory issues on the machine.
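The single-query, read-until-EOF pattern from the quoted post can be sketched in Python (not ADO; sqlite3's `fetchmany` plays the role of reading a bunch of records from the open recordset). Note the caveat above still applies to ADO: whether one query avoids the memory problem depends on whether the provider streams rows or materializes the whole recordset client-side first.

```python
import sqlite3

# One SELECT, then fixed-size batches from the open cursor until it is
# exhausted (the empty batch plays the role of the EOF flag).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (x INTEGER)")
conn.executemany("INSERT INTO t (x) VALUES (?)", [(i,) for i in range(1000)])

cur = conn.execute("SELECT x FROM t")  # single query for everything
batches = 0
seen = 0
while True:
    batch = cur.fetchmany(256)  # "read a bunch of records"
    if not batch:               # empty batch == EOF
        break
    # ...process/write the batch here, then discard it...
    batches += 1
    seen += len(batch)

conn.close()
print(batches, seen)  # 4 1000
```

SQLite itself streams rows lazily through the cursor, so this loop never holds more than one batch; with ODBC/ADO you would need a server-side or forward-only cursor to get the same behavior.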