Describe the bug
myloader gets stuck for hours (10+) sorting the tables in the metadata file.
From pstack:
Thread 1 (Thread 0x7fcbfa34a880 (LWP 14448)):
#0 0x00000000004283f6 in compare_dbt_short ()
#1 0x00007fcbf8e433a0 in g_list_insert_sorted_real () from /lib64/libglib-2.0.so.0
#2 0x000000000042845e in refresh_table_list_without_table_hash_lock ()
#3 0x0000000000424a51 in append_new_db_table ()
#4 0x000000000042629e in process_metadata_global ()
#5 0x0000000000428c85 in process_directory ()
#6 0x000000000041f039 in main ()
One CPU core is maxed out at 100%, presumably the core this thread is running on.
To Reproduce
Commands executed:
mydumper with all its parameters:
mydumper --outputdir=/data2/mysql/mydumper/mydumper_export_no_load_data --logfile=/data2/mysql/mydumper/mydumper_no_load_data.log --no-backup-locks --threads=32 -v 3
myloader with all its parameters:
myloader --directory=/data2/mysql/mydumper/mydumper_export_no_load_data --logfile=/data2/mysql/mydumper/myloader.log --threads=32 --max-threads-for-schema-creation=16 -v 3 --debug
Which mydumper and myloader versions have been used?
mydumper --version
mydumper v0.16.1-3, built against MySQL 5.7.44-48 with SSL support
myloader --version
myloader v0.16.1-3, built against MySQL 5.7.44-48 with SSL support
Expected behavior
myloader should not take this long sorting the metadata file.
Log
Last snippet in the log when it is stuck:
2024-05-02 21:45:43 [DEBUG] - [T17] Thread 17: Starting import
2024-05-02 21:45:43 [DEBUG] - [T17] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [T26] Thread 26: Starting import
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (24 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [T26] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [T25] Thread 25: Starting import
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [T25] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (25 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (26 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [T28] Thread 28: Starting import
2024-05-02 21:45:43 [DEBUG] - [T28] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (27 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [T29] Thread 29: Starting import
2024-05-02 21:45:43 [DEBUG] - [T32] Thread 32: Starting import
2024-05-02 21:45:43 [DEBUG] - [T29] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [T32] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (28 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (29 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [T30] Thread 30: Starting import
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [T30] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (30 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:43 [DEBUG] - [T31] Thread 31: Starting import
2024-05-02 21:45:43 [DEBUG] - [T31] refresh_db_queue <- THREAD
2024-05-02 21:45:43 [DEBUG] - [CJT] refresh_db_queue -> THREAD (31 loaders waiting)
2024-05-02 21:45:43 [DEBUG] - [CJT] Thread is asking for job again
2024-05-02 21:45:43 [DEBUG] - [CJT] No job available
2024-05-02 21:45:45 [DEBUG] - [MDT] Reading metadata: metadata
2024-05-02 21:45:45 [DEBUG] - [MDT] metadata: quote character is `
The metadata file is 94 MB.
Environment (please complete the following information):
uname -a
Linux amzn2.x86_64 #1 SMP x86_64 x86_64 x86_64 GNU/Linux