MySQL Forums
Optimizer & Parser

How to optimise a query for 10+ million records
Posted by: Azhar Rahi
Date: February 09, 2012 03:30AM

Hello, everyone. I have recently written a query that runs fine on a small number of records but slows down once the table grows into the millions. The table can grow by millions of rows within 3-4 days. I have applied indexes on the columns used in the WHERE clause, but in some cases the query still scans the whole table.
The query has a LEFT JOIN to another table, so that when no matching row exists in the joined table, NULLs are returned. The problem is that the joined table contains a column 'DestinationName' that we filter on, and it is quite possible that the user wants to see all destinations. To handle this, I applied a condition with an IFNULL in the WHERE clause:

i.e.

IFNULL(destination.DestinationName,'Unknown') LIKE @l_Destination
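For reference, the query is roughly shaped like this (simplified; cdr and the join key DestinationId are just stand-ins for our real names):

SELECT c.*,
       IFNULL(d.DestinationName, 'Unknown') AS DestinationName
FROM   cdr AS c
LEFT JOIN destination AS d
       ON d.DestinationId = c.DestinationId
WHERE  IFNULL(d.DestinationName, 'Unknown') LIKE @l_Destination;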

Now when the user asks for records for all destinations, the query scans the whole table and ignores the indexes. I found this by analysing the query with the EXPLAIN keyword. But if I pass an actual keyword in @l_Destination, EXPLAIN shows that the index is used. In other words, for "all destinations" there can be NULL destination names as well, and in that case the index is skipped.
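One idea I am considering is to build the WHERE clause in the application instead of wrapping the column in IFNULL(), since a function around the column seems to be what stops the optimizer from using the index. Something like this (again with placeholder table names):

-- when the user picked a specific destination: the index on DestinationName can be used
SELECT c.*, d.DestinationName
FROM   cdr AS c
LEFT JOIN destination AS d ON d.DestinationId = c.DestinationId
WHERE  d.DestinationName LIKE @l_Destination;

-- when the user asked for all destinations: leave the predicate out entirely
SELECT c.*, IFNULL(d.DestinationName, 'Unknown') AS DestinationName
FROM   cdr AS c
LEFT JOIN destination AS d ON d.DestinationId = c.DestinationId;

Would that be the right direction, or is there a way to keep it in a single statement without losing the index?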

Can anybody help me avoid this full table scan?

Also, we only have to show 100 records per page, but we are doing the paging in application code, not in the database. Could paging in the database also help avoid the full table scan?
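Is something like this in the database what would be needed? (CallId is just a placeholder for our primary key, and the "seek" condition is my assumption of how deep pages are usually handled, since a plain OFFSET still reads and discards all the skipped rows):

SELECT c.*, IFNULL(d.DestinationName, 'Unknown') AS DestinationName
FROM   cdr AS c
LEFT JOIN destination AS d ON d.DestinationId = c.DestinationId
WHERE  c.CallId > @last_seen_id          -- last id shown on the previous page
ORDER BY c.CallId
LIMIT  100;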




