[BUG] myloader 0.15.1.3 import speed is slow #1377
After importing several tables, it gets stuck.
Hi @jiugem,
@davidducos Hi, I tried mydumper-0.15.2-6.el7.x86_64.rpm. The mydumper export file is not large, totaling 444 MB.
Hi, just to rule out decompression: perhaps test by uncompressing all the files first and then importing, to see if that changes the speed? I saw a 20x difference related to compression when dumping with 0.15, and perhaps the same lock exists in loading? Regards,
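A minimal sketch of that test, assuming a zstd-compressed dump sitting in /data/backup (the directory name and the myloader connection flags are assumptions, not the reporter's actual invocation):

```sh
# Decompress every .zst file in place; --rm removes each compressed
# file once it has been decompressed successfully.
cd /data/backup
for f in *.zst; do
    zstd -d --rm "$f"
done

# Re-run the import against the now-uncompressed directory and compare speeds.
myloader --host=127.0.0.1 --user=root --ask-password \
         --directory=/data/backup --threads=4
```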
Hi @jiugem
Thanks,

```
util.dumpSchemas(['db1','db3'],'/data/backup')
117% (16.96M rows / ~14.44M rows), 21.20K rows/s, 3.40 MB/s uncompressed, 1.07 MB/s compressed

util.loadDump("/data/backup")
Executing DDL - done
```
Hi @jiugem, please, can you test again with the latest prerelease?
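On CentOS 7, testing the prerelease could look like this (a hypothetical sketch; the filename simply repeats the package name mentioned earlier in the thread):

```sh
# Hypothetical: upgrade to the prerelease RPM named earlier in this thread.
sudo rpm -Uvh mydumper-0.15.2-6.el7.x86_64.rpm
```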
Hi @davidducos
Hi @jiugem,
Installed, but it always reports this; I don't know what's going on. CentOS 7.

> On Feb 26, 2024, at 21:54, David Ducos wrote:
> Hi @jiugem,
> That means that the zstd command was not found in the default locations. Did you install zstd?
Hi @jiugem, the standard paths for the zstd command are "/usr/bin/zstd" and "/bin/zstd". Do you have those? If neither of them is there, execute …
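A quick way to check on a CentOS 7 host as in this thread (the yum step assumes the zstd package is available to the host, e.g. via EPEL on el7):

```sh
# See whether zstd exists at either standard path mentioned above.
ls -l /usr/bin/zstd /bin/zstd

# If neither is present, locate it elsewhere or install it.
command -v zstd || sudo yum install -y zstd
```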
Hi @jiugem, about the initial issue that you reported: I tested with 50k tables and was not able to reproduce your issue. Please, can you share a test case?
MySQL version 8.0.33
mydumper-0.15.1-3.el7.x86_64.rpm
The number of exported tables was 50,000, and the export speed was OK. The import was very slow: after importing more than 180 tables, it got stuck. The amount of data in each table is not large, less than 1 GB.
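For context, a hedged sketch of how such an import might be run while watching for the stall; the reporter's actual command line is not given in the issue, so every flag value below is an assumption, and `-v 3` only raises myloader's verbosity:

```sh
# Hypothetical reproduction: import the 50,000-table dump with verbose logging
# to see which table the loader is on when it gets stuck.
myloader --host=127.0.0.1 --user=root --ask-password \
         --directory=/data/backup --threads=8 -v 3
```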