Re: Perl won't read large table
Posted by:
Apachez
Date: August 24, 2006 04:37PM
Another hint is to process one row at a time by enabling mysql_use_result:
$dbh->{'mysql_use_result'} = 1;
Otherwise the API will try to fetch all rows into memory (or at least a large chunk of them) before it starts to process them. With mysql_use_result enabled, no such "precaching" is performed, and your queries will most likely get a slight speedup in total processing time.
The difference is that in the default mode the client copies all rows so the server can release its connection-based memory sooner. With mysql_use_result enabled, the server keeps using its connection-based memory (the sort buffer and the other per-connection buffers configured in my.cnf) until the last row has been processed.
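To illustrate, here is a minimal sketch of a DBI script that streams rows with mysql_use_result; the DSN, credentials, table, and column names are placeholders for your own:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connect as usual; replace DSN and credentials with your own.
my $dbh = DBI->connect('DBI:mysql:database=test;host=localhost',
                       'user', 'password',
                       { RaiseError => 1 });

# Stream rows from the server one at a time instead of buffering
# the whole result set on the client (the default "store result" mode).
$dbh->{'mysql_use_result'} = 1;

my $sth = $dbh->prepare('SELECT id, payload FROM big_table');
$sth->execute();

# Rows arrive as you fetch them; the server holds its per-connection
# buffers until the last row has been read, so fetch promptly.
while (my ($id, $payload) = $sth->fetchrow_array()) {
    # process one row at a time here
}

$sth->finish();
$dbh->disconnect();

The attribute can also be set per statement handle instead of on the database handle, which limits the streaming behaviour to the one large query that needs it.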