> We are deleting old data based on the datum column: begin transaction; select id from availability where datum < 'yesterday'; loop: delete from availability where id = ?; end loop; commit;
Terribly inefficient.
DELETE FROM availability WHERE datum < NOW() - INTERVAL 24 HOUR LIMIT 1000;
Run it every hour, or more often if you expect more than 1000 expired rows per hour. See also:
http://mysql.rjweb.org/doc.php/deletebig
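One way to schedule that hourly run is MySQL's own event scheduler rather than an external cron job. A minimal sketch, assuming `event_scheduler = ON` and the table/column names from the question (the event name is made up for illustration):

```sql
-- Hedged sketch: purge expired rows in capped batches, once an hour.
-- Requires the event scheduler: SET GLOBAL event_scheduler = ON;
CREATE EVENT purge_availability
    ON SCHEDULE EVERY 1 HOUR
    DO
        DELETE FROM availability
        WHERE datum < NOW() - INTERVAL 24 HOUR
        LIMIT 1000;   -- cap the batch so the delete stays short and lock-friendly
```

A cron job running the same DELETE via the `mysql` client works just as well; the LIMIT is what matters, since it keeps each transaction small.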
Do ANALYZE TABLE whenever there is a problem; it will _usually_ solve it. For a more permanent fix you need 5.6.xx (5.6.8 or later), which adds "persistent statistics". But upgrade to 5.5 for now; move to 5.6 after you have shaken down 5.5.
> What monitoring tools do you prefer for being proactive?
Slowlog, with long_query_time = 1.
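To enable it, a my.cnf sketch (the log file path is an example; adjust for your system):

```ini
[mysqld]
slow_query_log      = ON
slow_query_log_file = /var/log/mysql/slow.log   # example path
long_query_time     = 1                         # log queries taking >= 1 second
```

Reviewing the slowlog periodically shows you which queries are degrading before users complain.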