
Slow performance when using compression #60 #1037

Merged 2 commits into paramiko:master on Sep 6, 2017



@DrNeutron commented Aug 8, 2017

Looking at the performance of Paramiko when copying a 1.3 GB CSV file between two machines connected by a 100 Mbps LAN segment. The root cause of the slow performance with compression is the choice of compression level in paramiko/paramiko/

| zlib compression level | Time to send (secs) | Data over WAN (MBytes) | Compression % |
|---:|---:|---:|---:|
| 0 | 117 | 1,301 | 0% |
| 1 | 33 | 317 | 76% |
| 2 | 35 | 291 | 78% |
| 3 | 37 | 269 | 79% |
| 4 | 41 | 266 | 80% |
| 5 | 53 | 245 | 81% |
| 6 | 64 | 232 | 82% |
| 7 | 74 | 227 | 83% |
| 8 | 159 | 221 | 83% |
| 9 | 278 | 219 | 83% |

The change here is to use zlib's default compression level (6) rather than the maximum of 9, which is a bad trade-off in CPU and time versus the actual gain in compression.
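The trade-off in the table can be reproduced with a small micro-benchmark against Python's `zlib` module. This is a hedged sketch, not the reporter's actual test: the payload below is a synthetic stand-in for the 1.3 GB CSV, and timings will vary by machine.

```python
import time
import zlib

# Synthetic stand-in for the reporter's CSV file: repetitive text that,
# like real CSV data, compresses well at every level.
payload = b"timestamp,sensor_id,value\n" + b"2017-08-08T12:00:00,42,3.14159\n" * 20000

for level in (0, 1, 6, 9):
    start = time.perf_counter()
    compressed = zlib.compress(payload, level)
    elapsed = time.perf_counter() - start
    saved = 100 * (1 - len(compressed) / len(payload))
    print("level %d: %8d bytes, %5.1f%% smaller, %.4fs"
          % (level, len(compressed), saved, elapsed))
```

The general shape matches the table: the size gain flattens out after the early levels, while the CPU time keeps climbing toward level 9.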

DrNeutron added 2 commits Aug 8, 2017
The previous setting of the compression level to 9 is a poor trade-off: the CPU and time spent compressing outweigh the size gain over zlib's default level, which is 6.

@bitprophet commented Aug 10, 2017

Only gut reactions:

  • Was anyone actually relying on level 9 compression in a manner that will be problematic at level 6? I'm gonna guess "lol, no" but I guess we'll see. Almost guaranteed the change will help far more folks than it hurts...
  • It'd be nice to make this configurable, but it looks like that'll require poking holes in a handful of related APIs, so I'm fine punting until somebody has a serious complaint.

Thanks again for this.


@bitprophet commented Aug 10, 2017

I'll be focusing on Paramiko releases, including the next feature release, next Monday, fwiw.


@bz2 commented Sep 1, 2017

Just as additional review, this change looks sensible to me.


@bitprophet commented Sep 5, 2017

Reproducing this locally:

  • Airport Extreme w/ one wired and one wireless [802.11n] client
  • both ends are Apple laptops
  • the side running Paramiko is a 2017 MBP 13" w/ TB
  • Python 2.7
  • 1.7 GB text file generated by concatenating a bunch of Project Gutenberg texts
  • Fabric 2 Connection.put() + connect_kwargs(compression=True)
  • runs of 5 consecutive executions run under zsh's time builtin

I'm not seeing results anywhere near as dramatic as OP's "level 6 compression only takes 1/4 the time of level 9 compression", though there is still a noticeable difference.

With compression set to 9:

112.70s user 7.75s system 104% cpu 1:55.58 total
107.96s user 7.07s system 104% cpu 1:50.36 total
109.76s user 7.42s system 103% cpu 1:53.21 total
108.46s user 7.25s system 103% cpu 1:51.48 total
107.69s user 7.27s system 104% cpu 1:50.17 total

Set to 6:

97.72s user 8.40s system 104% cpu 1:42.02 total
94.92s user 7.11s system 101% cpu 1:40.84 total
95.55s user 8.20s system 102% cpu 1:41.38 total
92.29s user 7.16s system 104% cpu 1:34.86 total
93.46s user 7.05s system 103% cpu 1:37.23 total

For funsies, set to 3:

55.46s user 6.56s system 104% cpu 59.166 total
56.70s user 6.85s system 103% cpu 1:01.31 total
56.32s user 6.80s system 105% cpu 1:00.01 total
56.46s user 6.83s system 104% cpu 1:00.37 total
55.66s user 7.12s system 105% cpu 59.751 total

But anyway, it's basically moot since OpenSSH demonstrably defaults to level 6, and matching OpenSSH is always our basic guideline (well, mine; it clearly wasn't the original author's, unless SSH changed its behavior sometime in the last 15 years…)
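The merged change boils down to how the per-connection zlib compressor is constructed. The sketch below is in the spirit of paramiko's `ZlibCompressor`, but simplified for illustration rather than copied from the actual source:

```python
import zlib

class ZlibCompressor:
    """Simplified per-connection compressor in the style of paramiko's."""

    def __init__(self):
        # Previously this was hard-coded as zlib.compressobj(9). Calling
        # compressobj() with no arguments uses Z_DEFAULT_COMPRESSION,
        # currently equivalent to level 6, matching OpenSSH's default.
        self.z = zlib.compressobj()

    def __call__(self, data):
        # Z_FULL_FLUSH emits all buffered output, so the receiver can
        # decompress this packet's payload without waiting for more data.
        return self.z.compress(data) + self.z.flush(zlib.Z_FULL_FLUSH)
```

Because the compressor object persists for the life of the connection, the level chosen here applies to every packet sent while compression is negotiated.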

@bitprophet bitprophet merged commit 5a363f6 into paramiko:master Sep 6, 2017
3 checks passed
codecov/patch 100% of diff hit (target 75.17%)
codecov/project 75.24% (+0.06%) compared to 08f5037
continuous-integration/travis-ci/pr The Travis CI build passed
bitprophet added a commit that referenced this pull request Sep 6, 2017