Re: Decreasing Checkpoints & allowing for heavy load
Posted by: Adam D
Date: October 10, 2005 01:23AM
Is there some sort of gauge for determining a fair value for these, based on average runtime figures such as:
Hourly: selects: 50,000; deletes: 15,000; updates: 15,000; inserts: 15,000
It probably depends heavily on row size, but is there a 'fair' calculation for settings like these, similar to the one in the docs on memory requirements per data node? It's quite hard to predict what 'load' settings the cluster requires without actually having it in production. It would be great if something like that existed.
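For what it's worth, a rough back-of-envelope starting point can be derived from those hourly figures alone. The sketch below converts them to per-second rates and estimates a floor for concurrently open operations; the assumed 1-second worst-case operation lifetime and the 2x burst headroom are my own guesses, not anything from the NDB docs:

```python
# Back-of-envelope sizing from the hourly figures quoted above.
# AVG_OP_SECONDS and SAFETY_FACTOR are assumptions, not documented values.

HOURLY = {"selects": 50_000, "deletes": 15_000, "updates": 15_000, "inserts": 15_000}

per_second = {op: n / 3600 for op, n in HOURLY.items()}
total_ops_per_sec = sum(per_second.values())  # roughly 26.4 ops/s for these figures

AVG_OP_SECONDS = 1.0   # assumed worst-case time an operation stays open
SAFETY_FACTOR = 2.0    # assumed headroom for bursts

# Little's law style estimate: concurrency = arrival rate * time in system
concurrent_ops_floor = total_ops_per_sec * AVG_OP_SECONDS * SAFETY_FACTOR

print(f"total ops/sec: {total_ops_per_sec:.1f}")
print(f"estimated concurrent-operations floor: {int(concurrent_ops_floor)}")
```

At these rates the estimate comes out far below typical defaults, which suggests the per-row memory cost (row size, indexes) dominates the sizing question much more than the operation rate does.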
Adam