Re: Decreasing Checkpoints & allowing for heavy load
Posted by:
Adam D
Date: October 10, 2005 01:23AM
Is there some sort of gauge for determining a fair value for these settings based on average runtime figures such as:
Hourly: selects: 50,000, deletes: 15,000, updates: 15,000, inserts: 15,000
It probably depends heavily on row size, but is there a 'fair' calculation for settings like these, similar to the one in the docs for memory requirements per data node? It's quite hard to predict what 'load' settings a cluster requires without actually having it in production. It would be great if something like that were around.
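As a starting point, the hourly figures above can at least be turned into a rough write-traffic estimate; the row size and per-operation overhead below are purely hypothetical placeholders, not values from the MySQL Cluster docs:

```python
# Back-of-envelope estimate of write volume from the hourly figures in
# the post. Row size and per-op overhead are assumed, not documented.
hourly_writes = 15_000 + 15_000 + 15_000  # deletes + updates + inserts
writes_per_sec = hourly_writes / 3600     # ~12.5 write ops/sec

assumed_row_bytes = 200   # hypothetical average row size
overhead_bytes = 72       # hypothetical per-operation logging overhead

bytes_per_sec = writes_per_sec * (assumed_row_bytes + overhead_bytes)
mb_per_hour = bytes_per_sec * 3600 / (1024 * 1024)
print(f"~{bytes_per_sec:.0f} B/s, ~{mb_per_hour:.1f} MB/h of write traffic")
```

A number like this could then be compared against how much redo log the checkpoint settings provide, but the mapping from write volume to specific parameter values would still need to come from the documentation or from testing.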
Adam