MySQL Forums

Large table optimization
Posted by: Viorel Serbu
Date: December 16, 2009 06:38AM

I have an application, currently in testing, that will produce a very large number of database records.
In short, several continuous data acquisition systems each take 2 scans per second of 20 samples. Each sample value is inserted into a table together with the acquisition date - id, year, month, day, hour, sec, msec, systime, value.
systime is a BIGINT, value is a FLOAT, and the others are INTs.
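For reference, the table described above might look something like this. This is only a sketch inferred from the column list; the actual names and definitions may differ (note that a minute column is not in the list above but appears in the query below, so it is assumed here):

```sql
-- Hypothetical DDL reconstructed from the description; may differ
-- from the real table.
CREATE TABLE acval (
  id      INT,       -- acquisition system / channel id
  year    INT,
  month   INT,
  day     INT,
  hour    INT,
  minute  INT,       -- assumed: referenced by the query, not listed above
  sec     INT,
  msec    INT,
  systime BIGINT,    -- acquisition timestamp
  value   FLOAT
) ENGINE=InnoDB;
```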

The keywords here are "continuous" and "500 ms sampling rate".

The table has no indexes and uses the InnoDB engine.
The instance is a standard installation with no special tuning.

At the moment the samples are inserted into the acval table, and after a few days of running with limited functionality the systems have already produced 3 million records in this table.

So far the inserts are no problem and keep pace with the sampling rate (for 5 of the 20 samples), but the selects are starting to worry me. A query takes around 7 seconds to execute, and displaying the data requires 3 to 8 queries. A query looks like this:

select year, month, day, hour, minute, avg(systime), avg(value)
from acval
where systime >= t1 and systime <= t2 and id = x
group by year, month, day, hour, minute
order by systime
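For context, since the table currently has no indexes, a query like the one above forces a full table scan. A composite index matching the WHERE clause would typically look like this (a sketch, assuming the table and column names described above - equality column first, then the range column):

```sql
-- Hypothetical index for the query's WHERE clause:
-- equality lookup on id, then the systime range scan.
ALTER TABLE acval ADD INDEX idx_id_systime (id, systime);
```

With such an index, MySQL can narrow the scan to the rows for one id within the [t1, t2] window instead of reading the whole table; the trade-off is some extra cost on every insert.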

I'm asking myself what will happen at 30 million, or 300 million, or... records.

Any suggestion on how to optimize this is welcome.



