
Importing huge dump files into mysql
Posted by: Piyush Popli
Date: January 21, 2010 01:35AM

Hi all,
I'm new to MySQL, so please bear with me if you find this query redundant or frivolous.. :(

I'm trying to import a .sql dump file (approx. 22 GB in size) into mysql.. It's been 62 hours now and the import is still not complete..
In the first few hours mysqld would use a lot of CPU, but now it seems to have gone dormant, even though the whole import is not complete.. The reason I say this is because the mysqlshow command hangs on the 7th table.. there are about 150 tables in the dump..
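(To be clear, by "the mysqlshow command" I mean something along these lines, using the same credentials as the import itself:
mysqlshow -u<username> -p<password> <database-name>
It gets as far as the 7th table and then just sits there.)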

The steps I followed were..
1.) mysqldump -u<username> -p<password> --no-create-db --no-create-info <database-name> > dump.sql
2.) mysql -u<username> -p<password> <database-name> < dump.sql
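(In hindsight I probably should have piped the dump through pv so I could at least see how far along it is. Assuming pv is installed on the box, that would be something like:
pv dump.sql | mysql -u<username> -p<password> <database-name>
As it stands though, I have no real visibility into the progress.)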

Few points:
1.) I use the --no-create-db and --no-create-info options because I already have the schema of the DB created at the destination. The schema is perfectly fine.. I've tested it and compared it to the source.. they're identical..
2.) I had to copy the file over the network, so I bzip2'ed it with "bzip2 --best dump.sql" and then bunzip2'ed it with "bunzip2 dump.sql.bz2" (see the one-liner just after this list)
3.) I googled this and found that a few other people were facing the same issue
4.) Both my source and destination DBs are on 5.1 versions
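(Regarding point 2: I realize I could probably have skipped the separate decompress step and streamed the compressed file straight into mysql, something along the lines of:
bzcat dump.sql.bz2 | mysql -u<username> -p<password> <database-name>
I doubt that has anything to do with the hang though, just mentioning it for completeness.)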

OS (Source and Destination): Red Hat Enterprise Linux ES release 4 (Nahant Update 5)
MySQL (Source): 5.1.32-community
MySQL (Destination): 5.1.30-ndb-6.3.20-cluster-gpl-log

Any help would be greatly appreciated.. :)

Thanks
Piyush
