
Improve stability for exporting large tables

PostPosted: Fri Feb 27, 2015 12:48 pm
by stefaneidelloth
If I export a MySQL table with 3 000 000 entries, I get an out-of-memory exception.
Suggestion: export the data of large tables step by step and free the memory in between.

I use Eclipse Luna and installed DBeaver from the update site http://dbeaver.jkiss.org/update/3.1.5/
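For illustration, the step-by-step export idea amounts to fetching the table in fixed-size segments with keyset pagination, so only one segment is ever held in memory. A minimal sketch in Python, using an in-memory SQLite table as a stand-in for the MySQL source (table name, key column, and segment size are illustrative, not DBeaver's actual implementation):

```python
import sqlite3

def export_in_segments(conn, table, key, segment_size, writer):
    """Fetch rows in fixed-size segments via keyset pagination,
    so only one segment is in memory at a time."""
    last_key = None
    while True:
        cur = conn.execute(
            f"SELECT * FROM {table}"
            + ("" if last_key is None else f" WHERE {key} > ?")
            + f" ORDER BY {key} LIMIT {segment_size}",
            () if last_key is None else (last_key,),
        )
        rows = cur.fetchall()
        if not rows:
            break
        for row in rows:
            writer(row)
        last_key = rows[-1][0]  # assumes the key is the first column

# Demo with a small in-memory SQLite table standing in for MySQL
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"v{i}") for i in range(10)])

exported = []
export_in_segments(conn, "t", "id", 3, exported.append)
print(len(exported))  # 10
```

Keyset pagination (`WHERE key > last_key`) is preferable to `LIMIT ... OFFSET` for very large tables, since OFFSET forces the server to skip over all earlier rows on every segment.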

Re: Improve stability for exporting large tables

PostPosted: Fri Feb 27, 2015 9:53 pm
by Serge
This feature is already implemented.
For MySQL, use the by-segment export option (see screenshot).

Re: Improve stability for exporting large tables

PostPosted: Sun Mar 01, 2015 9:02 pm
by stefaneidelloth
Ok, I see. Thanks a lot.

Re: Improve stability for exporting large tables

PostPosted: Mon Mar 02, 2015 10:18 am
by stefaneidelloth
Today I tried the segment option. I do not get an out-of-memory exception, but the process hangs
at about 5% when I use segments of 100 000. With segments of 10 000 the program hangs at about
14% and shows the following error:

Error occurred during hourly_values_of_year_entry data load
Reason:
GC overhead limit exceeded

The following SQLite file includes an example table that I could not transfer from MySQL to SQLite with DBeaver.
(I transferred it with another tool.)
http://www.filedropper.com/hourlyvalues (about 76 MB)

Re: Improve stability for exporting large tables

PostPosted: Mon Mar 02, 2015 6:08 pm
by Serge
Yes, there is a problem when importing a big amount of data into a table (the export itself works OK).
It will be fixed in the next version.
Thanks for the report.
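For reference, the usual way to keep memory bounded on the import side is to insert in fixed-size batches and commit after each one, so neither the driver nor the transaction holds every row at once. A minimal sketch with Python's sqlite3 module (table name, columns, and batch size are illustrative assumptions, not DBeaver's code):

```python
import sqlite3
import itertools

def import_in_batches(conn, rows, batch_size=10_000):
    """Insert rows in fixed-size batches, committing after each batch
    so memory use stays bounded regardless of the total row count."""
    it = iter(rows)
    total = 0
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            break
        conn.executemany(
            "INSERT INTO hourly_values (id, val) VALUES (?, ?)", batch)
        conn.commit()
        total += len(batch)
    return total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hourly_values (id INTEGER PRIMARY KEY, val REAL)")
# A generator: rows are produced lazily, never all materialized in memory
source = ((i, i * 0.5) for i in range(25_000))
n = import_in_batches(conn, source, batch_size=10_000)
count = conn.execute("SELECT COUNT(*) FROM hourly_values").fetchone()[0]
print(n, count)  # 25000 25000
```

Feeding the importer a generator rather than a list is what makes the pipeline end-to-end streaming: at no point do all 25 000 rows exist in memory at once.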