
max_allowed_packet error
Posted by: Ra Nala
Date: February 02, 2016 05:12AM

Hi Sir/Madam,

I am trying to load millions of records with the ETL tool Pentaho Data Integration, and the load keeps failing with a max_allowed_packet error.
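
As far as I understand, the limit applies to each individual packet/statement the client sends, and I believe the tool groups rows into multi-row INSERT statements, roughly like the sketch below (the table and column names are placeholders, not my real schema), so the whole batch has to fit inside max_allowed_packet:

-- hypothetical batch as generated by the ETL step; the entire statement
-- travels to the server as a single packet
INSERT INTO staging_orders (id, customer_id, payload)
VALUES
  (1, 101, '...large text...'),
  (2, 102, '...large text...'),
  -- ... thousands more rows per batch ...
  (5000, 5100, '...large text...');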

I increased max_allowed_packet to 64M in the my.ini file and restarted the service, and the change took effect. When I run SHOW VARIABLES, it reports:

max_allowed_packet = 67108864

However, I am still hitting the max_allowed_packet error. Please suggest how I can resolve this issue. The full SHOW VARIABLES output is pasted below.
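
For reference, here is a minimal sketch of the change, assuming the setting belongs under the [mysqld] section of my.ini:

[mysqld]
# server-side cap on the size of any single packet / SQL statement
max_allowed_packet=64M

and the statements I used to check it after the restart:

-- check the server-side value
SHOW VARIABLES LIKE 'max_allowed_packet';
-- returns 67108864 (= 64M)

-- the global value can also be raised on a running server without a restart,
-- but only connections opened afterwards pick up the new value:
SET GLOBAL max_allowed_packet = 67108864;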


Variable_name Value
------------------------------------------------------ -----------------------
auto_increment_increment 1
auto_increment_offset 1
autocommit ON
automatic_sp_privileges ON
avoid_temporal_upgrade OFF
back_log 80
basedir D:\MySQLinstallation\
big_tables OFF
bind_address *
binlog_cache_size 32768
binlog_checksum CRC32
binlog_direct_non_transactional_updates OFF
binlog_error_action IGNORE_ERROR
binlog_format STATEMENT
binlog_gtid_simple_recovery OFF
binlog_max_flush_queue_time 0
binlog_order_commits ON
binlog_row_image FULL
binlog_rows_query_log_events OFF
binlog_stmt_cache_size 32768
binlogging_impossible_mode IGNORE_ERROR
block_encryption_mode aes-128-ecb
bulk_insert_buffer_size 8388608
character_set_client utf8
character_set_connection utf8
character_set_database utf8
character_set_filesystem binary
character_set_results
character_set_server utf8
character_set_system utf8
character_sets_dir D:\MySQLinstallation\share\charsets\
collation_connection utf8_general_ci
collation_database utf8_general_ci
collation_server utf8_general_ci
completion_type NO_CHAIN
concurrent_insert AUTO
connect_timeout 10
core_file OFF
datadir D:\MySQLinstallation\Data\
date_format %Y-%m-%d
datetime_format %Y-%m-%d %H:%i:%s
default_storage_engine InnoDB
default_tmp_storage_engine InnoDB
default_week_format 0
delay_key_write ON
delayed_insert_limit 100
delayed_insert_timeout 300
delayed_queue_size 1000
disconnect_on_expired_password ON
div_precision_increment 4
end_markers_in_json OFF
enforce_gtid_consistency OFF
eq_range_index_dive_limit 10
error_count 0
event_scheduler OFF
expire_logs_days 0
explicit_defaults_for_timestamp OFF
external_user
flush OFF
flush_time 0
foreign_key_checks ON
ft_boolean_syntax + -><()~*:""&|
ft_max_word_len 84
ft_min_word_len 4
ft_query_expansion_limit 20
ft_stopword_file (built-in)
general_log OFF
general_log_file SNOWBREEZE.log
group_concat_max_len 1024
gtid_executed
gtid_mode OFF
gtid_next AUTOMATIC
gtid_owned
gtid_purged
have_compress YES
have_crypt NO
have_dynamic_loading YES
have_geometry YES
have_openssl DISABLED
have_profiling YES
have_query_cache YES
have_rtree_keys YES
have_ssl DISABLED
have_symlink YES
host_cache_size 279
hostname SNOWBREEZE
master_info_repository FILE
master_verify_checksum OFF
max_allowed_packet 67108864
