MySQL Forums » Connector/Python

Exporting 5 million records
Posted by: Anjanesh Lekshminarayanan
Date: December 01, 2009 01:13PM

I have a table with 5 million rows of data.
I need to do an export without including the PK column, and mysqldump can't select individual columns.
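(The one server-side alternative I know of is SELECT ... INTO OUTFILE, which does let you pick columns, but it needs the FILE privilege and writes on the server's own filesystem, so it may not apply here. A rough sketch, with the output path just a placeholder:)

# Sketch of the server-side alternative: let MySQL write the non-PK columns
# to a file itself. Needs the FILE privilege; '/tmp/tbl1.csv' is a placeholder.
cursor.execute("""
    SELECT `col2`,`col3`,`col4`,`col5`,`col6`,`col7`
    FROM `tbl1`
    INTO OUTFILE '/tmp/tbl1.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
""")
# The file could then be re-imported into tbl2 with LOAD DATA INFILE.

Since I actually want INSERT statements, I am generating them from Python instead: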

sql = """SELECT `col2`,`col3`,`col4`,`col5`,`col6`,`col7` FROM `tbl1`;"""
cursor.execute(sql)

while True:
    r = cursor.fetchone()
    if r is None: break
    print "INSERT IGNORE INTO `tbl2` VALUES " , ("",str(r['col2']),r['col3'],str(r['col4']),str(r['col5']),str(r['col6']),str(r['col7'])),";"

python export.py > insert.sql

But this is taking forever.
I could understand it if fetchall() pulled all the data into a client-side buffer first, but I thought fetchone() would only keep one row at a time in the buffer.
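To make it concrete, this is roughly what I think the streaming version should look like. It is only a sketch: it assumes MySQLdb and its server-side SSCursor (which uses mysql_use_result, so the client does not buffer the whole result set), and the connection parameters are placeholders; if Connector/Python's unbuffered cursor (buffered=False) streams the same way, that would suit me just as well.

# Sketch only: stream rows with a server-side cursor so the client never
# holds all 5 million rows at once. Assumes MySQLdb; host/user/passwd/db
# below are placeholders.
import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="mydb",
                       cursorclass=MySQLdb.cursors.SSCursor)
cursor = conn.cursor()
cursor.execute("SELECT `col2`,`col3`,`col4`,`col5`,`col6`,`col7` FROM `tbl1`")

for row in cursor:  # SSCursor fetches rows from the server as they are consumed
    # row is a plain tuple; the leading '' mirrors the placeholder in my script above
    values = "'', " + ", ".join("'%s'" % conn.escape_string(str(col)) for col in row)
    print "INSERT IGNORE INTO `tbl2` VALUES (%s);" % values

cursor.close()
conn.close()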

Output of free (values in kB):
             total       used       free     shared    buffers     cached
Mem:       8061084    8014704      46380          0       1228      40300
-/+ buffers/cache:    7973176      87908
Swap:      2000084    1673828     326256
So how do I select all the rows and print/update each one without running out of memory?
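The only other approach I can think of is chunking the SELECT on the primary key so only a slice of the table is on the client at any time. A rough sketch (assuming the PK column of tbl1 is called id, and reusing conn and cursor from an ordinary client-buffered connection):

# Sketch: page through tbl1 on its primary key in fixed-size chunks so only
# chunk_size rows are buffered client-side at a time. Assumes the PK is `id`
# and that conn/cursor come from a normal (client-buffered) connection.
chunk_size = 10000
last_id = 0
while True:
    cursor.execute(
        "SELECT `id`,`col2`,`col3`,`col4`,`col5`,`col6`,`col7` "
        "FROM `tbl1` WHERE `id` > %s ORDER BY `id` LIMIT %s",
        (last_id, chunk_size))
    rows = cursor.fetchall()
    if not rows:
        break
    for row in rows:
        last_id = row[0]
        values = "'', " + ", ".join("'%s'" % conn.escape_string(str(col)) for col in row[1:])
        print "INSERT IGNORE INTO `tbl2` VALUES (%s);" % values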

Anjanesh





