
Re: At Performance Limit for 2 GB RAM?
Posted by: Jay Pipes
Date: July 22, 2005 02:13PM

One more thing, Nick. If this is the type of query you will be issuing constantly, you may do well to convert the y_metadatatemp table to the MEMORY storage engine, which is essentially "pinning" the table in memory, if you're familiar with that concept from SQL Server or Oracle.
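
Something along these lines would do it (just a sketch; I'm assuming y_metadatatemp has no TEXT/BLOB columns, which the MEMORY engine doesn't support, and the _mem table name is made up):

ALTER TABLE y_metadatatemp ENGINE=MEMORY;

-- or, if you'd rather keep the original table on disk, make an in-memory copy:
CREATE TABLE y_metadatatemp_mem ENGINE=MEMORY
  SELECT * FROM y_metadatatemp;

Note that CREATE TABLE ... SELECT does not copy the index definitions, so with the second approach you would need to add the indexes back on the copy yourself.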

Because, as you rightly pointed out, you will not be able to fit the entire y_statsinfo PRIMARY KEY index in the key_cache, you will undoubtedly run into a situation where MySQL "thrashes" between disk and memory, repeatedly moving index blocks in and out of the cache to satisfy other requests. By using the MEMORY storage engine for the y_metadatatemp table, you ensure that the index records from this table (which are used in *every* query) are never pushed out of memory, and you free up key_cache room for the y_statsinfo index.
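
A quick way to see whether the key cache is actually thrashing (these are standard server status counters, nothing specific to your setup):

SHOW STATUS LIKE 'Key_read_requests';
SHOW STATUS LIKE 'Key_reads';

Key_read_requests is the number of index block reads served from the cache, and Key_reads is the number that had to go to disk. As a rule of thumb, if Key_reads climbs to more than a percent or two of Key_read_requests under load, the key cache is too small for your working set.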

Of course, you would need to keep a permanent copy of y_metadatatemp in MyISAM or InnoDB as well, since the data in a MEMORY table is lost when the server restarts.
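
One way to handle that (again just a sketch; the file path and the _mem table name are made up) is to have the server reload the MEMORY table automatically at startup using the init-file option:

# in my.cnf
[mysqld]
init-file=/path/to/load_metadatatemp.sql

# load_metadatatemp.sql
DELETE FROM y_metadatatemp_mem;
INSERT INTO y_metadatatemp_mem SELECT * FROM y_metadatatemp;

The statements in that file run every time mysqld starts, so the in-memory copy is repopulated from the permanent table before clients start hitting it.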

Just a thought.

BTW, another question worth asking: when you say 1,900 transactions per minute, are you actually referring to INSERT/UPDATE requests (transactional writes), or to SELECT requests? There's a big difference, and each calls for a different tuning strategy.
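
If you're not sure of the mix, the server's command counters will tell you roughly how the load breaks down (nothing here is specific to your schema):

SHOW STATUS LIKE 'Com_select';
SHOW STATUS LIKE 'Com_insert';
SHOW STATUS LIKE 'Com_update';
SHOW STATUS LIKE 'Com_delete';

Sample those counters a minute apart and the differences give you reads versus writes per minute.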

Jay Pipes
Community Relations Manager, North America, MySQL Inc.

Got Cluster? http://www.mysql.com/cluster
Personal: http://jpipes.com
