MySQL Forums
Forum List  »  Optimizer & Parser

Re: At Performance Limit for 2 GB RAM?
Posted by: Nick Matheson
Date: August 01, 2005 11:18AM

Jay-

The y_metadatatemp table is required because it allows selection of a subset of a given station's reports from the y_statsinfo table.

The main metadata table consists of date, offset, region, and other information. I select a group of metadata_ids based on various parameters (putting the results in the y_metadatatemp table) and then retrieve the corresponding data from the y_statsinfo table, which has one entry for every stn_id/metadata_id pairing.
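A minimal sketch of that two-step pattern, using SQLite in place of MySQL so it runs standalone. The date/offset/region columns, the y_metadatatemp and y_statsinfo names, and the stn_id/metadata_id keys come from the post; the value column, the filter on region, and the sample data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Assumed schemas -- the poster's actual DDL is not shown in the thread.
cur.execute('CREATE TABLE metadata (metadata_id INTEGER PRIMARY KEY, '
            'date TEXT, "offset" INTEGER, region TEXT)')
cur.execute('CREATE TABLE y_statsinfo (stn_id INTEGER, metadata_id INTEGER, '
            'value REAL, PRIMARY KEY (stn_id, metadata_id))')

# Tiny sample: 3 metadata points x 2 stations.
cur.executemany("INSERT INTO metadata VALUES (?,?,?,?)",
                [(1, "2005-08-01", 0, "east"),
                 (2, "2005-08-01", 6, "west"),
                 (3, "2005-08-02", 0, "east")])
cur.executemany("INSERT INTO y_statsinfo VALUES (?,?,?)",
                [(s, m, s * 10 + m) for s in (100, 200) for m in (1, 2, 3)])

# Step 1: select a subset of metadata_ids into the temp table.
cur.execute("CREATE TEMP TABLE y_metadatatemp AS "
            "SELECT metadata_id FROM metadata WHERE region = 'east'")

# Step 2: join the temp table against y_statsinfo for a single station.
cur.execute("""SELECT s.metadata_id, s.value
               FROM y_metadatatemp t
               JOIN y_statsinfo s ON s.metadata_id = t.metadata_id
               WHERE s.stn_id = ?
               ORDER BY s.metadata_id""", (100,))
rows = cur.fetchall()
print(rows)  # the two 'east' metadata points for station 100
```

The composite (stn_id, metadata_id) primary key is what makes step 2 an index lookup per station rather than a scan.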

There are a total of ~5,000 metadata points. In theory a query could select anything from all 5,000 down to just a few. Then, for each of these metadata points, it retrieves the data for one station out of approximately 4,000 stations. So even if I have selected every metadata point, I would be returning at most 5,000 rows, since I am only returning data for one station. As I understand it, a full table scan on y_statsinfo shouldn't happen, since I am returning at most about 0.02% of the total rows with any query.
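The worst-case selectivity works out as follows, using the counts from the post (the exact fraction is 0.025%, in the same ballpark as the ~0.02% figure above):

```python
# Back-of-the-envelope selectivity for a worst-case query:
# every metadata point selected, but only a single station returned.
metadata_points = 5000
stations = 4000
total_rows = metadata_points * stations   # ~20 million stn/metadata pairings
max_rows_returned = metadata_points       # all metadata points, one station
selectivity = max_rows_returned / total_rows
print(f"{selectivity:.4%} of the table")  # a tiny fraction of total rows
```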

Thanks for the input. I guess I am trying to understand how performance varies across different operations.

I have baselined some simple queries to give me upper bounds, and I am trying to understand whether 1 sec for 5K rows and 5 sec for 50K rows is simply too slow, or whether it is inherent to the query I have here.

For example:

1. Generating a sum over a key value in a table with a million rows, restricted by a keyed index range, I can process 800,000 rows per second. (This trend is linear from 1 row selected through the entire data set.)
2. For a join of two million-row tables, summing a value over a subset of the results based on a keyed index range, I can process 125,000 rows per second. (Again a linear trend.)
3. For both of the above tests I also ran smaller table sizes, and they showed that increasing table size had no effect on the result time; only the number of rows returned mattered.

But I am only getting 5,000-10,000 rows per second on my test tables of 20 and 200 million rows.

Again, I am just looking for a sanity check. Since my baseline results show very linear performance that is also much better (10-100 times faster than what I am seeing here), I just want to know whether it is something I have done.

Nick



