Re: millions of records
Kevin,
Guess what, I ate my own words with a fork when I said all of that in my previous reply to you.
HERE IS WHAT I DID. You could use this kind of method to organize your data, like the customer ID # idea I told you about: take the first 3 digits of the ID (100, 101, and so on), create a table for each prefix, and store every customer whose ID begins with 100 in the table called 100. I ran a HUGE simulation, and I have INCREDIBLE results!!! MySQL is more powerful than I ever imagined!!!
I call this the 39 Million Row March :P
Creating 39 million lines to simulate emails + ID #'s
example:
Once you have created your KEY DATA table (I did an example below), you can query it for the ID #, take the first 3 digits of that ID, and use those 3 digits to locate the TABLE the user's data is in. This lets you split millions of rows into separate tables. Let's say you have your ID #'s set up like this:
Customers 1 to 500,000 have ID #'s 10099999 to 39099999. That's 291 three-digit prefixes (100 through 390), so 291 tables. That might be a bit too much for your case, so you might want to drop down to the first digit, or the first 2 digits.
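To make the prefix idea concrete, here is a minimal PHP sketch of the routing step. The function name shardTable is my own invention, and the digit count is a parameter so you can try splitting on 1, 2, or 3 digits:

```php
<?php
// Hypothetical helper (the name is mine): map a customer ID to the
// table that holds that customer's row, using the ID's first digits.
// 3 digits by default, so ID 10399999 lands in table `103`.
function shardTable(string $id, int $digits = 3): string
{
    return substr($id, 0, $digits);
}

echo shardTable('10399999') . "\n";     // 103 -> query table `103`
echo shardTable('10399999', 2) . "\n";  // 10  -> query table `10`
```

Fewer digits means fewer tables but more rows per table; that's the trade-off you're tuning.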
I think you get where I am going with this, right? You should experiment and get your math together before you do it for real, because I have to do the same thing with my web site.
WANT TO TEST IT?
IF YOU WANT TO SEE HOW MYSQL CAN HANDLE 39 MILLION ROWS OF KEY DATA
Create a table, call it userlocation (or whatever you want)
Then add two columns, "email" and "id"; make email VARCHAR(100) and id VARCHAR(15).
Remember, this is all you need. Don't put extra stuff in this table; it will cause a lot of slow-down.
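For reference, here is the two-column table above as a CREATE TABLE statement (a minimal sketch; adjust types or storage engine to taste):

```sql
-- Minimal key-data table: just the two columns described above.
CREATE TABLE userlocation (
    email VARCHAR(100),
    id    VARCHAR(15)
);
```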
So if my ID # is 10399999, then I would be stored in a table called 103 (or table 10 if you used 2 digits). You could work this any way you want: first digit, first two digits, first three, even first four. Three and four start to create a lot of tables, and I don't know how that would work out, but either way this method lets you split them up.
Then in PHP, make a script like this:
# WARNING WARNING WARNING
# SCRIPT WILL EAT AROUND 1.09 GIGABYTES
# WARNING WARNING WARNING
<?php
// Creates exactly 39 million rows of test data (email TAB id).
// May take up to 14 minutes depending on machine / HD speed.
echo "Writing...<br><br>";
$fp = fopen("C:/mil.txt", "wb");
for ($x = 1000000; $x < 40000000; $x++) {
    // One tab-separated line per row, e.g. "1000000@aol.com<TAB>1000000"
    fwrite($fp, "$x@aol.com" . chr(9) . $x . "\r\n");
}
fclose($fp);
echo "Finished writing...";
?>
# NOW TO GET THE 39 MILLION ROWS INTO MYSQL
After you have your C:\mil.txt file ready, use LOAD DATA INFILE to load it into the userlocation table (remember, this is just a performance test).
HERE IS THE SYNTAX TO DO THIS:
LOAD DATA INFILE 'C:/mil.txt' INTO TABLE userlocation LINES TERMINATED BY '\r\n';
The LINES TERMINATED BY '\r\n' matches the Windows line endings the script writes; without it, MySQL's default '\n' would leave a stray \r on the end of every id.
Give it a minute or so, it's 39 million lines :P
__________________________________________________________
LOAD DATA STATEMENT RETURN TIME: 1 minute 6.91 seconds. (39 Million Rows)
___________________________________________________________
INDEX YOUR EMAIL COLUMN, BECAUSE YOU WILL HAVE TO SEARCH BY IT.
MySQL Time: 14 minutes.
The cardinality of email should be 39 million, since every address in the test file is unique.
Click the LIGHTNING icon on the EMAIL row for your table in phpMyAdmin.
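If you'd rather do it in SQL than click through phpMyAdmin, the equivalent statement is (the index name is my own):

```sql
-- Index email so lookups by address don't scan all 39 million rows.
ALTER TABLE userlocation ADD INDEX idx_email (email);
```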
___________________________________________________________
Querying the table:
SELECT * FROM `userlocation` WHERE email = '24123765@aol.com'
Query return time: 0.05 sec...
This seems to work.
Now that you have matched the email address, you can take the ID # from that row and use its first digits to locate the user's table.
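Putting the two steps together, here is a hedged PHP sketch that just builds the two queries (the function names are mine; in real code, escape the values and run these through your MySQL client functions):

```php
<?php
// Step 1: find the user's ID by email in the key-data table.
function keyLookupQuery(string $email): string
{
    // NOTE: escape $email (e.g. mysql_real_escape_string) in real code.
    return "SELECT id FROM userlocation WHERE email = '$email'";
}

// Step 2: read the full record from the table named after the
// ID's first three digits (the split described above).
function shardLookupQuery(string $id): string
{
    $table = substr($id, 0, 3);  // e.g. '241' for ID 24123765
    return "SELECT * FROM `$table` WHERE id = '$id'";
}

echo keyLookupQuery('24123765@aol.com') . "\n";
echo shardLookupQuery('24123765') . "\n";
```

Two cheap indexed lookups instead of one scan over a giant table is the whole point of the scheme.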
Good luck! :)