
Re: Optimised way to search over 2 million POI data in MySQL
Posted by: Rick James
Date: September 02, 2014 07:22PM

Part of the problem is that FLOAT and DECIMAL are (were?) not supported for PARTITION keys. So the conversion turns lat and lng into INTs, then uses them for RANGE partitioning. Yes, it is messy, but it seems to be the only way to achieve the goal of partitioning on latitude with the PK leading on longitude.
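To make that concrete, here is a minimal sketch of what such a "search" table could look like. The names (poi_search, poi_id) and the scale factor of 10000 (roughly 16 meters of resolution) are my own choices for illustration, not something you have to copy:

-- Latitude and longitude stored as scaled MEDIUMINTs so they can be used
-- in RANGE partitioning. MySQL requires the partition column (lat) to be
-- part of every unique key, which is why it also appears in the PRIMARY KEY.
CREATE TABLE poi_search (
  lat    MEDIUMINT NOT NULL,        -- latitude  * 10000  (-900000 .. 900000)
  lng    MEDIUMINT NOT NULL,        -- longitude * 10000  (-1800000 .. 1800000)
  poi_id INT UNSIGNED NOT NULL,     -- pointer into the bulky table
  PRIMARY KEY (lng, lat, poi_id)    -- longitude first, as discussed
) ENGINE=InnoDB
PARTITION BY RANGE (lat) (
  PARTITION p_s80 VALUES LESS THAN (-800000),
  PARTITION p_s60 VALUES LESS THAN (-600000),
  -- ... more latitude stripes here ...
  PARTITION p_max VALUES LESS THAN MAXVALUE
);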

Note also how I split the one big table into two tables: one with a minimal number of columns, which does the heavy lifting of the search; and one with all the other columns, which carries the bulky stuff you might need once you have found the one row (or 10 pizza parlors) you are after.
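The second table would then hold everything else, keyed by the same id. Again, this is only a sketch with invented column names; the point is that none of these wide columns get dragged through the search:

-- Hypothetical companion table: looked up by poi_id only after the
-- search table has narrowed things down to a handful of candidates.
CREATE TABLE poi_detail (
  poi_id      INT UNSIGNED NOT NULL,
  name        VARCHAR(255) NOT NULL,
  address     VARCHAR(255),
  phone       VARCHAR(32),
  description TEXT,
  PRIMARY KEY (poi_id)
) ENGINE=InnoDB;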

Tackling it a different way...
* How much RAM do you have?
* How big is that one big table? (SHOW TABLE STATUS)
* What ENGINE are you using? (SHOW CREATE TABLE)
* How much cache have you configured? (SHOW VARIABLES LIKE '%buffer%')

With the answers to the above, we can see whether all the "find nearest" queries can always be performed in RAM. If so, then the I/O cost is not a problem. (My algorithm assumes your data is too big to be kept in RAM, so it works hard to avoid I/O.)
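If you want a quick back-of-the-envelope check yourself, something like this (assuming InnoDB, and substituting your own schema and table names) compares the table's footprint to the buffer pool:

-- Total data + index size of the table, in MB:
SELECT ROUND((DATA_LENGTH + INDEX_LENGTH) / 1024 / 1024) AS total_mb
  FROM information_schema.TABLES
 WHERE TABLE_SCHEMA = 'your_db' AND TABLE_NAME = 'your_poi_table';

-- Compare it to how much cache InnoDB has to play with:
SHOW VARIABLES LIKE 'innodb_buffer_pool_size';

If total_mb is comfortably below the buffer pool size (leave headroom for other tables and activity), the working set should stay resident and disk hits stop being the bottleneck.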
