MySQL Forums
Forum List  »  Performance

Seeking advice on setup
Posted by: Levi Tedder
Date: October 27, 2010 05:43AM


I have the following table:

CREATE TABLE `content_description` (
  `id` int unsigned NOT NULL AUTO_INCREMENT,
  `md5hash` char(32) NOT NULL,
  `date` datetime NOT NULL,
  `NameA` varchar(100) DEFAULT NULL,
  `NameB` varchar(100) DEFAULT NULL,
  `Description` text NOT NULL,
  PRIMARY KEY (`id`),
  FULLTEXT KEY (`Description`)
) ENGINE=MyISAM;
The table currently holds about 1.2 million rows (reported size 1.6 GB), will grow past 2 million, and gains about 10,000 rows per day, loaded in hourly batches. The size of 'Description' varies. 'md5hash' is actually unique and could have been the primary key, but I've added an integer 'id' column instead, in case a char(32) primary key would slow things down.

This table is the only thing we have on a separate server running MySQL 5.1.50. It's a Windows 2000 server with 4 GB of RAM.

Users are supposed to search the 'Description' column for anything they want. The result should be the 200 latest (by 'date') rows that match their search.

I've set it up at the moment as MyISAM with a full-text index on 'Description'. Would this be the most efficient storage engine for this, or should InnoDB be used?
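For reference, such an index can be added after the fact like this (the index name 'ft_desc' is just an illustrative choice). Note that InnoDB did not gain FULLTEXT support until MySQL 5.6, so on 5.1.50 MyISAM is the only engine that offers it:

ALTER TABLE content_description ADD FULLTEXT INDEX ft_desc (Description);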

Which configuration variables should I tune to make searches as fast as possible, and perhaps make better use of the available memory?
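A sketch of the variables that usually matter for MyISAM full-text search; the values below are illustrative placeholders, not tuned recommendations (on a 32-bit Windows build the usable per-process memory is limited, so the key buffer may need to be smaller):

# my.ini fragment (values are illustrative, not recommendations)
[mysqld]
key_buffer_size  = 1G   # MyISAM index cache; the FULLTEXT index is served from here
ft_min_word_len  = 4    # shortest word that gets indexed (default 4)

Changing ft_min_word_len only takes effect after the full-text index is rebuilt, e.g. with REPAIR TABLE content_description QUICK.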

I'm hoping searches will complete within seconds; at the moment it seems to choke when searching for terms I know appear in a lot of rows.

My current search:
SELECT NameA,NameB,date,md5hash
FROM content_description
WHERE MATCH(Description) AGAINST ('searchstring')
LIMIT 0,200
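As written, the query returns 200 arbitrary matching rows; to get the 200 latest by 'date', as described above, it would need an ORDER BY, e.g.:

SELECT NameA, NameB, date, md5hash
FROM content_description
WHERE MATCH(Description) AGAINST ('searchstring')
ORDER BY date DESC
LIMIT 200;

Be aware that the ORDER BY forces MySQL to fetch and sort every matching row before applying the LIMIT, which is likely part of the slowdown on common terms.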

Thank you very much for any input!




