Perl script to parse PostgreSQL database into MySQL at XX intervals
Hey all
I was hoping someone could give me some advice. I have a requirement to "parse" (not migrate) a single PostgreSQL database into an equivalent MySQL database, to be used offline for another purpose. The way I have it working now is like this:
1) A Perl (DBI) script executes every 40 minutes, performs a SELECT on the SOURCE tables, and INSERTs the rows into MySQL (the DESTINATION).
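For reference, this is roughly the shape of the current script (the connection strings, table name and column list here are just placeholders for the real ones):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details - substitute the real host/db/user/password.
my $pg = DBI->connect('dbi:Pg:dbname=sourcedb;host=pghost', 'pguser', 'pgpass',
                      { RaiseError => 1 });
my $my = DBI->connect('dbi:mysql:database=destdb;host=myhost', 'myuser', 'mypass',
                      { RaiseError => 1, AutoCommit => 0 });

# "orders" and its columns stand in for the real table and the big long column list.
my $select = $pg->prepare('SELECT id, customer_id, amount, created_at FROM orders');
my $insert = $my->prepare(
    'INSERT INTO orders (id, customer_id, amount, created_at) VALUES (?, ?, ?, ?)');

$select->execute;
while (my @row = $select->fetchrow_array) {
    $insert->execute(@row);    # one INSERT per source row
}
$my->commit;

$pg->disconnect;
$my->disconnect;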
Problems and open questions I have:
1) In my script I had to specify ALL of the column names to SELECT on (a big long list) and ALL of the column names again to INSERT on. Can I not just perform a SELECT * and drive the INSERT from that? I couldn't get this to work (see the sketch after this list for what I have in mind).
2) The SOURCE database automatically updates once every 30 minutes, and my script runs just after that update completes (a different scheduler, not the same one, so we just hope the two never overlap - this is not ideal).
3) Must I use SELECT / INSERT? Is this the only way? It is very slow (at the PostgreSQL end :-) ). Is there a more efficient method? (I am using filters in my SQL on the PostgreSQL end, but I don't need to; I only did this to cut down the query time!)
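On question 1, this is the kind of thing I'm imagining: do the SELECT *, let DBI report the column names of the result ($sth->{NAME} is a standard DBI statement-handle attribute), and build the INSERT list from them. The table name below is again a placeholder. Is this a sensible way to do it?

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connections, as in the first sketch.
my $pg = DBI->connect('dbi:Pg:dbname=sourcedb', 'pguser', 'pgpass', { RaiseError => 1 });
my $my = DBI->connect('dbi:mysql:database=destdb', 'myuser', 'mypass',
                      { RaiseError => 1, AutoCommit => 0 });

my $table = 'orders';    # stand-in table name

# SELECT * and ask DBI for the resulting column names, so neither list is hard-coded.
my $select = $pg->prepare("SELECT * FROM $table");
$select->execute;

my @cols         = @{ $select->{NAME} };    # column names, in result order
my $col_list     = join ', ', @cols;
my $placeholders = join ', ', ('?') x @cols;
my $insert = $my->prepare("INSERT INTO $table ($col_list) VALUES ($placeholders)");

while (my @row = $select->fetchrow_array) {
    $insert->execute(@row);
}
$my->commit;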
I did consider using ODBC, but I'm not seeing any benefit to using it instead of the above.
Thanks all - any input would be gratefully appreciated.