How to optimize this coding
Posted: 20.10.2009, 07:30
by andresayang
Hi,
With Delphi 2007, I have a lot of INSERT commands to execute.
I do something like this:
var
Connection: IZConnection;
Statement: IZStatement;
...
Statement := Connection.CreateStatement;
Statement.ExecuteQuery(sqltosend.Text);
Statement.Close;
...
sqltosend is a TStringList containing roughly 80,000 INSERT queries!
How should I configure the Connection and the Statement to make this run as fast as possible?
(I mean autocommit and all the other parameters.)
Thanks a lot
Posted: 20.10.2009, 18:04
by WorldWalker
Hi,
Did you try ZSQLProcessor?
Posted: 20.10.2009, 20:03
by andresayang
Hi, I will have a look!
The thing is that my code was quite simple at the beginning: back up one database and restore the backup into another one.
But the app now does more and more, and I have only adapted the original code; I would like to keep it (rather than rebuild everything from zero).
Thanks
Posted: 24.10.2009, 14:31
by mdaems
andresayang,
I'm afraid you already have the most optimal setup when sending the SQL string list as one statement. Normally, even with autocommit, there is only one commit in your case.
You could consider splitting the script into smaller blocks with a commit in between if overly large transactions slow down your database server, but I think that depends on the server you are using.
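A rough sketch of that splitting idea, reusing the IZConnection/IZStatement variables from the first post (the BatchSize of 5000 is an arbitrary value to tune against your server, not something ZeosLib prescribes):

```pascal
// Sketch only: commit every BatchSize statements instead of
// running one huge transaction. Assumes the ZeosLib Connection,
// Statement and sqltosend variables from the first post.
const
  BatchSize = 5000; // arbitrary; tune for your server
var
  I: Integer;
begin
  Connection.SetAutoCommit(False);
  Statement := Connection.CreateStatement;
  try
    for I := 0 to sqltosend.Count - 1 do
    begin
      Statement.ExecuteUpdate(sqltosend[I]);
      if (I + 1) mod BatchSize = 0 then
        Connection.Commit; // flush one block of inserts
    end;
    Connection.Commit; // commit the remainder
  finally
    Statement.Close;
  end;
end;
```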
Mark
Posted: 25.10.2009, 04:34
by andresayang
Thanks mdaems
I have noticed that if I send an SQL string list containing two files at once (i.e. 160,000 inserts), it is faster than sending them separately. That is why I was asking whether there are any parameters to change to increase the speed.
Posted: 27.10.2009, 04:48
by manz
Dear andresayang,
You could optimize your MySQL server configuration to handle this, with a well-chosen storage engine such as InnoDB.
Manz
Posted: 28.10.2009, 11:16
by andresayang
manz wrote: Dear andresayang,
You could optimize your MySQL server configuration to handle this, with a well-chosen storage engine such as InnoDB.
Manz
Sorry, I am working with Postgres!
But I started with ZeosLib, and my small program works perfectly for what I want to do with it.
The normal workflow uses string lists containing around 500 INSERT queries, so it is not "so slow". I have the program and a database on my laptop for development, and what I do with it there is not exactly the "normal" workflow, so I have bigger files to import/export (it is much easier to optimize the code for speed when working on huge files).
I only wanted a short explanation of the effect of the connection parameters on the speed of the transactions.
Rgds
Posted: 28.10.2009, 19:20
by niztor
I think the most important parameter for speed is autocommit. If you set Autocommit := True, every INSERT statement you execute starts its own transaction, so the database needs to set up a transaction environment for every statement. Setting Autocommit := False before the bulk insert means just one transaction for the whole batch.
Triggers on the tables are another source of delay for bulk inserts.
If you haven't yet, please check the Postgres manual for the transactions and triggers topics.
Sorry for my poor English X)
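As a sketch against the interfaces from the first post, the single-transaction variant looks like this: turn autocommit off, run all the inserts, commit once, and roll everything back if any statement fails:

```pascal
// Sketch: one transaction for the whole bulk insert
// (Autocommit = False). Assumes the ZeosLib Connection,
// Statement and sqltosend variables from the first post.
Connection.SetAutoCommit(False);
Statement := Connection.CreateStatement;
try
  for I := 0 to sqltosend.Count - 1 do
    Statement.ExecuteUpdate(sqltosend[I]);
  Connection.Commit; // single commit for all the inserts
except
  Connection.Rollback; // undo everything on any error
  raise;
end;
Statement.Close;
```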
Posted: 29.10.2009, 11:08
by andresayang
Thanks,
I will check whether I can disable some triggers. As for autocommit, I had already found that and it is set to False.
I still have to read the Postgres documentation (I have already read a lot of it while learning SQL).
Rgds