"Doing anything that locks those tables takes forever" suggests you're using MyISAM. If the tables are not InnoDB, convert!
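To check what engine you're on and do the conversion, something like this (schema and table names are placeholders for yours):

```sql
-- See which engine each table uses (replace 'mydb' with your schema)
SELECT table_name, engine
FROM information_schema.TABLES
WHERE table_schema = 'mydb';

-- Convert to InnoDB. This rebuilds the table, so on 65M rows
-- expect it to run for a while and need extra disk space.
ALTER TABLE mytable ENGINE = InnoDB;
```

(Also check that the disk has room for a full copy of the table before the `ALTER`, since the rebuild writes a new copy alongside the old one.)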
A well-designed table with 65M rows and adequate indexing should query quite comfortably. But you wouldn't want to put that much data on a shared hosting server; you'd need a dedicated server where you can choose the RAM, disk space, and CPU power for fast imports and queries.
Then tune the server and the tables; see:
http://mysql.rjweb.org/doc.php/memory,
http://mysql.rjweb.org/doc.php/ricksrots#indexing
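The single most important knob for InnoDB is the buffer pool. As a hedged sketch (illustrative numbers, assuming a dedicated machine with around 16GB of RAM and only InnoDB tables):

```sql
-- innodb_buffer_pool_size is dynamic in MySQL 5.7.5+; on older
-- versions set it in my.cnf instead. ~70% of RAM is the usual
-- starting point for an InnoDB-only server (see the memory doc above).
SET GLOBAL innodb_buffer_pool_size = 11 * 1024 * 1024 * 1024;  -- ~11GB of 16GB
```

The exact figure matters less than making sure the buffer pool is big enough to hold your working set of index blocks.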
Typically, database files have lots of air, so they compress a lot. If each of the 65M rows has an ID and 28 FLOATs, that's roughly 120 bytes of data per row (maybe 150 after InnoDB overhead), so the table might be 9-10GB. That should compress to 1-2GB, which could be painlessly ftp-ed to a hosting server, then uncompressed and imported.
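You can check the actual on-disk footprint before dumping anything; a quick sketch against `information_schema` (schema name is a placeholder):

```sql
-- Approximate data and index size per table, in GB
SELECT table_name,
       ROUND(data_length  / 1024 / 1024 / 1024, 1) AS data_gb,
       ROUND(index_length / 1024 / 1024 / 1024, 1) AS index_gb
FROM information_schema.TABLES
WHERE table_schema = 'mydb';
```

Note these are estimates (InnoDB statistics), but close enough for planning the transfer.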
That said, if most queries can get by without nine-tenths of the data, you could install an archiving process that leaves the original data intact but keeps a 10% sample for day-to-day query access.
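A minimal sketch of that sampling step, with hypothetical table and column names:

```sql
-- Keep `readings` intact; expose a 10% sample for routine queries
CREATE TABLE readings_sample LIKE readings;

INSERT INTO readings_sample
    SELECT *
    FROM readings
    WHERE id % 10 = 0;   -- every 10th row, assuming a dense AUTO_INCREMENT id
```

If the ids have big gaps, `WHERE RAND() < 0.1` gives a roughly 10% random sample instead, at the cost of a non-repeatable result.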