Open Table Cache (1024 limit vs. 100 mil. opened tables)
Posted by: Kim Waldorff Østergaard
Date: February 27, 2019 09:17AM

Hi,

I'm rather new to MySQL, but our company administers an Azure-based WordPress site backed by a MySQL instance at ClearDB. It is not performing well.

I started looking at various things, came across the open tables cache, and suspect something is very wrong here. MySQL Workbench also shows the "Table Open Cache" efficiency as 0%, which is what triggered my investigation.

A few results from queries:


SHOW GLOBAL STATUS LIKE 'Open%' -->

# Variable_name, Value
'Open_files', '249'
'Open_streams', '0'
'Open_table_definitions', '991'
'Open_tables', '1024'
'Opened_files', '353286840'
'Opened_table_definitions', '105680951'
'Opened_tables', '108351330'


SHOW GLOBAL STATUS LIKE 'Uptime%' -->

# Variable_name, Value
'Uptime', '6549312'
'Uptime_since_flush_status', '6549312'
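The numbers above already give a rough miss rate: with Open_tables pinned at the 1024 limit and Opened_tables at ~108 million over an uptime of ~6.5 million seconds, tables are being re-opened roughly 16–17 times every second on average. A quick sketch of that arithmetic (values copied from the status output above; Python here is just my calculator, not anything MySQL-specific):

```python
# Values from the SHOW GLOBAL STATUS output above
opened_tables = 108_351_330   # cumulative table opens since server start
open_tables = 1_024           # tables currently held open (at the cache limit)
uptime_s = 6_549_312          # server uptime in seconds (~75.8 days)

# Average rate at which tables had to be (re)opened because they
# were not found in the open table cache.
opens_per_second = opened_tables / uptime_s
print(f"~{opens_per_second:.1f} table opens/second on average")

# With the cache saturated at 1024 entries against 100M+ lifetime
# opens, the cached fraction collapses toward 0%, which matches the
# 0% efficiency figure Workbench reports.
cached_fraction = open_tables / opened_tables * 100
print(f"cache size vs. lifetime opens: {cached_fraction:.6f}%")
```

That sustained churn is consistent with the cache being far too small for the working set of tables.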

Some connection query results -->

# Variable_name, Value
Max_used_connections 146
max_connections 1000


I've watched the Opened_tables over 24 minutes, and below are results at time 0, +11m and +23m:

time count increase
15:50 108351330
16:01 108362801 11471
16:13 108375427 12626

So roughly a 10,000+ increase per 10 minutes, at least at this time of day (afternoon).
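The sampled deltas line up with the lifetime average from the status counters. A small sketch of that cross-check (timestamps and counts copied from the table above; Python again just for the arithmetic):

```python
# Samples from the 24-minute observation: (elapsed minutes, Opened_tables)
samples = [(0, 108_351_330), (11, 108_362_801), (23, 108_375_427)]

# Per-second open rate between consecutive samples
rates = []
for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
    rate = (c1 - c0) / ((t1 - t0) * 60)
    rates.append(rate)
    print(f"+{t0}m -> +{t1}m: {rate:.1f} opens/second")
```

Both intervals come out in the same ballpark as the long-run average, so the afternoon samples are representative rather than a spike.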

If you have any suggestions for quick wins/optimizations, I'm all ears. I have not yet fully dived into the table cache and max connections logic.





