Re: Optimised way to search over 2 million poi data in mysql
Posted by:
Rick James
Date: September 02, 2014 07:22PM
Part of the problem is that FLOAT and DECIMAL are (were?) not supported for PARTITION KEYs. So, that conversion is to turn lat and lng into INTs, then use them in RANGE partitioning. Yes, it is messy, but it seems to be the only way to achieve the goal -- partitioning on latitude; PK on longitude.
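A sketch of what that conversion might look like (table name, scale factor, and partition boundaries are all illustrative, not from the original post; scaling degrees by 10000 fits a MEDIUMINT and keeps roughly 11 m of resolution):

```sql
-- Hypothetical slim "search" table: lat/lng stored as scaled INTs so
-- RANGE partitioning is allowed (FLOAT/DECIMAL are not valid partition keys).
CREATE TABLE poi_search (
    lat MEDIUMINT NOT NULL,         -- latitude  * 10000  (-900000..900000)
    lng MEDIUMINT NOT NULL,         -- longitude * 10000  (-1800000..1800000)
    id  INT UNSIGNED NOT NULL,      -- joins to the bulky details table
    PRIMARY KEY (lng, lat, id)      -- PK leads with longitude
) ENGINE=InnoDB
PARTITION BY RANGE (lat) (          -- partitioned on latitude
    PARTITION p0 VALUES LESS THAN (-600000),
    PARTITION p1 VALUES LESS THAN (-300000),
    PARTITION p2 VALUES LESS THAN (0),
    PARTITION p3 VALUES LESS THAN (300000),
    PARTITION p4 VALUES LESS THAN (600000),
    PARTITION p5 VALUES LESS THAN MAXVALUE
);
```

Note that MySQL requires every unique key (including the PRIMARY KEY) to contain all the partitioning columns, which is why `lat` must appear in the PK even though longitude leads it.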
Note also how I split the one big table into two big tables: one with a minimal number of columns, which does the heavy lifting of the search; and one with the other columns, which carries the bulky stuff you might need once you have found the one row (or 10 pizza parlors) you are after.
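The split might look like this (column names are invented for illustration; the point is that the wide columns never participate in the range scan):

```sql
-- Hypothetical bulky companion table: fetched only after the slim
-- search table has narrowed things down to a handful of ids.
CREATE TABLE poi_details (
    id      INT UNSIGNED NOT NULL AUTO_INCREMENT,
    name    VARCHAR(255) NOT NULL,
    address VARCHAR(255),
    phone   VARCHAR(32),
    descr   TEXT,
    PRIMARY KEY (id)
) ENGINE=InnoDB;

-- Typical two-step usage: the slim table produces candidate ids cheaply,
-- then one point-lookup per hit pulls the bulky columns:
--   SELECT d.* FROM poi_details d WHERE d.id IN ( ...ids from the search... );
```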
Tackling it a different way...
* How much RAM do you have?
* How big is that one big table? (SHOW TABLE STATUS)
* What ENGINE are you using? (SHOW CREATE TABLE)
* How much cache have you configured? (SHOW VARIABLES LIKE '%buffer%')
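For convenience, the diagnostics above can be pasted straight into the mysql client (the table name `poi` is a placeholder for your actual table):

```sql
SHOW TABLE STATUS LIKE 'poi';    -- Data_length + Index_length = on-disk size
SHOW CREATE TABLE poi\G          -- reveals the ENGINE and the indexes
SHOW VARIABLES LIKE '%buffer%';  -- key_buffer_size (MyISAM) or
                                 -- innodb_buffer_pool_size (InnoDB)
```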
With the answers to the above, we can see whether you can always perform the "find nearest" queries entirely in RAM. If so, then the I/O cost is not a problem. (My algorithm assumes your data is so big that it cannot be kept in RAM, so it works hard to avoid I/O.)