Linking 200M and 100M tables
We're importing two big tables from an external system. The tables are loaded from CSV files and contain 200M and 100M rows respectively. The 200M-row table has an FK referencing the 100M-row table.
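For context, the CSV import itself is done with something along these lines (the file path and import options below are just placeholders, not our exact settings):

LOAD DATA INFILE '/path/to/200M-rows.csv'
  INTO TABLE `200M-rows-table`
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  IGNORE 1 LINES;
-- the 100M-row table is loaded the same way from its own CSV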
To link them we import the original foreign keys (two columns) that are NOT otherwise relevant to our DB. Then, once the tables are imported, we generate our own FK by joining the imported tables on the original FK columns.
Both tables are indexed on the original FK columns (a single composite index covering both columns in each table).
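In other words, each table carries an index roughly like the following (the index name is illustrative):

ALTER TABLE `200M-rows-table`
  ADD INDEX idx_orig_fk (orig_fk_column1, orig_fk_column2);
ALTER TABLE `100M-rows-table`
  ADD INDEX idx_orig_fk (orig_fk_column1, orig_fk_column2);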
We run a statement similar to the following:
UPDATE `200M-rows-table` `200M`
  INNER JOIN `100M-rows-table` `100M`
    ON (`100M`.orig_fk_column1 = `200M`.orig_fk_column1
        AND `100M`.orig_fk_column2 = `200M`.orig_fk_column2)
SET `200M`.new_fk = `100M`.pk;
We run the update in batches of 100K rows.
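To illustrate, if a batch were driven by a range of the 200M table's own primary key (the `id` column, the bounds, and the exact batching mechanism below are placeholders), a single batch would look roughly like this:

-- one 100K-row slice of the overall update
UPDATE `200M-rows-table` `200M`
  INNER JOIN `100M-rows-table` `100M`
    ON (`100M`.orig_fk_column1 = `200M`.orig_fk_column1
        AND `100M`.orig_fk_column2 = `200M`.orig_fk_column2)
SET `200M`.new_fk = `100M`.pk
WHERE `200M`.id BETWEEN 1 AND 100000;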
The problem is that each batch takes too long, and there is a huge variation in duration between batches.
Some 100K batches finish in 20 seconds (which already seems too long to me for 100K rows), while others take 100 seconds, five times longer.
The question is: how can we troubleshoot and possibly tune the performance of this update?
I would greatly appreciate any ideas or tips.
Thanks,
Philopator.