cmd/link: combining dwarf on darwin can be very slow #12259
Can you try again with -ldflags="-v=2"? I'm willing to have a look at this if you can tell me how to reproduce it on linux (I don't have access to OS X).
---
Just tried this on linux. The Linux build time is OK. This is the output of a build with -ldflags="-v=2" added, on OS X:
---
Heh, well that wasn't very helpful. IMO the next thing to try would be
running the go command with -x -work, noting the $WORK env var that it
prints, re-running the cmd/link command line with -cpuprofile slowlink.pprof
added, and seeing what that profile reveals.
---
From your log, it seems it's the external linker (ld64) that's taking the
majority of the time, probably due to the static linking of v8.
Have you tried linking a simple C++ program (one that simply references all
the v8 symbols the Go program is using) against the static v8 libraries to
see how long it takes?
---
If I understand correctly, both the 1.4.2 6l and the 1.5 link are using clang++ in the link step. 1.4.2
1.5
And a little more readable:
1.5
---
Oh, one thing that is new on darwin is the stuff around combining the debug information. Try -ldflags=-s ?
---
Flag -s has an impact: build time dropped from ~16s to ~2s, measured with:
time go build
time go build -ldflags "-s"
---
OK, that explains the slowdown on OS X, but I still can't explain
the slowdown on linux
(http://kokizzu.blogspot.com/2015/08/go-15-compile-speed.html).
---
Sure, but that should be a different bug. I think it makes sense to retitle this one "cmd/link: combining dwarf on darwin can be very slow" or something.
---
Perhaps we can do something in 1.5.1. Labeling this issue as Go1.5.1
tentatively.
---
@minux is there a separate issue open to track the linux compile speed issues?
---
No, please feel free to file a new one. But please remember to include a
way to reproduce the problem.
---
The fact that -s helps does not imply that the problem is the DWARF combining code, as implied by the current issue title. It could be that clang++ is slow to generate the debug info during host link on the Mac, or it could be that dsymutil, used to pull it out, is slow. We need to understand which one. I can't reproduce this on my Mac, because I don't have pkg-config. Can you try applying this diff:
Then rebuild with make.bash and see what that does to the timings. Thanks.
---
A 15 second link is annoying but probably not worth a significant change in a point release.
---
The version with the two returns makes a difference. Before:
With both returns:
After removing the first return:
---
OK, great. Thanks. That confirms it really is machoCombineDwarf and not clang++.
---
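For context on what machoCombineDwarf has to do: it rewrites the Mach-O file so the DWARF data produced by dsymutil lands in a new __DWARF segment, with file offsets rounded up to the platform alignment. The helper below is my own toy illustration of that rounding arithmetic, not the linker's actual code:

```go
package main

import "fmt"

// roundUp rounds n up to the next multiple of align, which must be a
// power of two. The linker performs this kind of rounding whenever it
// places a new segment after the existing ones in the output file.
func roundUp(n, align int64) int64 {
	return (n + align - 1) &^ (align - 1)
}

func main() {
	// A hypothetical end-of-file offset, bumped to a 4KB boundary
	// before the new segment is written.
	fmt.Printf("%#x\n", roundUp(0x12345, 0x1000)) // prints 0x13000
}
```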
Hmm. I got v8worker working but I still can't reproduce the problem. I see the combining step happening, but it takes under a second. Of course, I probably have a different version of Xcode than you do, which is probably writing out different files. If you can still see this happening, please try adding
Thanks.
---
It's not better with the latest head.
Here is the profile output: I have Xcode 7.0 (7A220) and the Xcode command line tools.
---
I too have a codebase that uses v8worker and compiles much more slowly than it used to. I see 3s builds with
OS X 10.10.5. Does this help? Do we need to create a better reproduction case? Is there more information I can provide? (Is this too many questions in one place? 😉)
---
How embarrassing. The kext that fixes that refuses to load at boot like it's supposed to, and I forgot about it. Hopefully this is better, and sorry for your wasted time: link-pprof-sethwklein-psignalfix-2016-01-06.tar.gz (BTW, if you need to check what versions I'm running, I pasted a bunch of them a couple-four posts up.)
---
@sethwklein, thanks, fascinating data. Can you also attach the final binary?
---
I probably could post the final binary from the previous run, but I'd rather not. It's my personal information manager and may have personal information in it. But here's a run with the test case from the OP, which also exhibits 16s builds normally and sub-2s builds with
When wondering what could differ between my setup and yours, @rsc, I realize that I installed
I also have a list in
---
@sethwklein Thanks. I looked at the tgz you provided when you posted it 6 days ago, and the profile suggests that we're just spending a lot of time in Write. The writes are not buffered, so you'd think maybe there are lots of small writes happening, but I don't see that either. It looks like it copies one whole DWARF section at a time. So I thought maybe there are many (like 100s) of DWARF sections. But there are not. So I can't tell why it should be so bad. I may poke at this again over the next couple weeks, but it may have to wait until Go 1.7. I know that's not great.
---
Not going to get to this until Go 1.7. Sorry.
---
@rsc I am sorry, but I think you are wrong. Removing the first return enables it back. Next I ran
---
---
I'm noticing a ~15x increase in build time after switching to Go 1.5.
It is somehow connected with the package github.com/ry/v8worker, a binding for the v8 JavaScript engine.
I'm building on OS X.
This is a trivial program:
1.4 build time:
real 0m1.075s
user 0m0.964s
sys 0m0.108s
1.5 build time:
real 0m16.816s
user 0m16.225s
sys 0m1.226s
Go 1.5 spends almost all of the time on the link step:
/usr/local/Cellar/go/1.5/libexec/pkg/tool/darwin_amd64/link -o $WORK/slow_build/_obj/exe/a.out -L $WORK -L /Users/ianic/work/web/src/golang/pkg/darwin_amd64 -extld=clang++ -buildmode=exe -buildid=14ca37ec220e553c7bf9829fe97751df5a041f5f $WORK/slow_build.a
1.4 output for: time go build -v -x
WORK=/var/folders/dx/p9b300dd27dcbqcrz0h9kn800000gn/T/go-build829145301
slow_build
mkdir -p $WORK/slow_build/_obj/
mkdir -p $WORK/slow_build/_obj/exe/
cd /Users/ianic/work/web/src/golang/src/slow_build
/usr/local/Cellar/go/1.4.2/libexec/pkg/tool/darwin_amd64/6g -o $WORK/slow_build.a -trimpath $WORK -p slow_build -complete -D _/Users/ianic/work/web/src/golang/src/slow_build -I $WORK -I /Users/ianic/work/web/src/golang/pkg/darwin_amd64 -pack ./main.go
cd .
/usr/local/Cellar/go/1.4.2/libexec/pkg/tool/darwin_amd64/6l -o $WORK/slow_build/_obj/exe/a.out -L $WORK -L /Users/ianic/work/web/src/golang/pkg/darwin_amd64 -extld=clang++ $WORK/slow_build.a
mv $WORK/slow_build/_obj/exe/a.out slow_build
real 0m1.075s
user 0m0.964s
sys 0m0.108s
1.5 output for: time go build -v -x
WORK=/var/folders/dx/p9b300dd27dcbqcrz0h9kn800000gn/T/go-build145414805
slow_build
mkdir -p $WORK/slow_build/_obj/
mkdir -p $WORK/slow_build/_obj/exe/
cd /Users/ianic/work/web/src/golang/src/slow_build
/usr/local/Cellar/go/1.5/libexec/pkg/tool/darwin_amd64/compile -o $WORK/slow_build.a -trimpath $WORK -p main -complete -buildid 14ca37ec220e553c7bf9829fe97751df5a041f5f -D _/Users/ianic/work/web/src/golang/src/slow_build -I $WORK -I /Users/ianic/work/web/src/golang/pkg/darwin_amd64 -pack ./main.go
cd .
/usr/local/Cellar/go/1.5/libexec/pkg/tool/darwin_amd64/link -o $WORK/slow_build/_obj/exe/a.out -L $WORK -L /Users/ianic/work/web/src/golang/pkg/darwin_amd64 -extld=clang++ -buildmode=exe -buildid=14ca37ec220e553c7bf9829fe97751df5a041f5f $WORK/slow_build.a
mv $WORK/slow_build/_obj/exe/a.out slow_build
real 0m16.816s
user 0m16.225s
sys 0m1.226s