Re: Archive Some Data from Huge Table (>1TB)
Posted by: Ravi Rai
Date: May 07, 2019 11:35PM
Hi All,
I need to archive some data (based on a condition) from a huge table (roughly 1 TB). The table structure is below; it has foreign keys and also TEXT-type columns. I need a suggestion that requires minimum downtime.
MySQL version: 5.6
CREATE TABLE `emp_details` (
`id` BIGINT(20) NOT NULL AUTO_INCREMENT,
`emp_ref_no` BIGINT(20) NOT NULL,
`emp_client_id` BIGINT(20) NOT NULL,
`date_of_joining` DATE NULL DEFAULT NULL,
`job_status` INT(11) NULL DEFAULT NULL,
`user_request` TEXT NULL,
`user_response` MEDIUMTEXT NULL,
`user_raw_response` LONGTEXT NULL,
`is_online` TINYINT(1) NULL DEFAULT '0',
`user_request_id` VARCHAR(20) NULL DEFAULT NULL,
`errors` TEXT NULL,
`date_of_user_request` DATE NULL DEFAULT NULL,
`user_process_def_key` VARCHAR(100) NULL DEFAULT NULL,
`active` TINYINT(1) NOT NULL DEFAULT '1',
`user_eligibility_rule_status` INT(2) NULL DEFAULT '0',
`is_processed_by_batch` TINYINT(1) NULL DEFAULT '0',
PRIMARY KEY (`id`),
INDEX `fk_user_reference_id` (`emp_ref_no`),
INDEX `fk_user_client_id` (`emp_client_id`),
CONSTRAINT `fk_user_client_id` FOREIGN KEY (`emp_client_id`) REFERENCES `m_user` (`id`),
CONSTRAINT `fk_user_reference_id` FOREIGN KEY (`emp_ref_no`) REFERENCES `m_user_app_reference` (`id`)
)
COLLATE='utf8_general_ci'
ENGINE=InnoDB
AUTO_INCREMENT=2400278677;
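To make the question concrete, something along the lines of the chunked copy-and-delete sketch below is what I have in mind. The archive table name, the date_of_joining cutoff, and the batch size here are just placeholders, not my actual criteria:

-- Archive table with the same columns and indexes
-- (CREATE TABLE ... LIKE does not copy the foreign keys, which is fine for an archive).
CREATE TABLE emp_details_archive LIKE emp_details;

-- Copy one batch of rows matching the archive condition.
INSERT INTO emp_details_archive
SELECT * FROM emp_details
WHERE date_of_joining < '2016-01-01'
ORDER BY id
LIMIT 10000;

-- Remove the same batch from the source table; small batches keep lock time short.
DELETE FROM emp_details
WHERE date_of_joining < '2016-01-01'
ORDER BY id
LIMIT 10000;

-- Repeat the copy/delete pair in a loop until no more rows match.

I have also seen pt-archiver from Percona Toolkit mentioned for automating this kind of chunked copy-and-delete. Would that be the better approach here, or is there a safer way given the table size, the TEXT columns, and the foreign keys?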