
Regression: Startup times for influx && influxd have increased by ~3 secs each #18292

Closed
jsteenb2 opened this issue May 29, 2020 · 9 comments

@jsteenb2
Contributor

The startup times for both influx and influxd have increased dramatically since the algo-w merge.

Steps to reproduce:

  1. Check out and build the algo-w merge commit: git checkout d22380dc8b && make build
  2. Time it: time influx -h or time influxd -h
  3. Check out and build the preceding commit: git checkout 97dc4d18db && make build
  4. Time it: time influx -h or time influxd -h

Expected behavior:
The startup times for influx and influxd should not have increased.

Actual behavior:
Roughly a 3-second increase for each binary.

@rickspencer3
Contributor

I assume this is a 6-second total delay in startup?

Even 3 seconds is annoying, and we should fix this. However, for now, I think we need to keep the Flux team 100% focused on releasing the performance improvements that are already in the pipeline.

@jsteenb2
Contributor Author

jsteenb2 commented Jun 1, 2020

This issue is to record the regression; startup has become very slow. I trust you all to do the prioritization 👍.

I would like to see tests added to verify these startup times don't regress, rather than having a regression go unnoticed until someone hits it in their CLI. The Flux deps have been the driver behind the slow startup times; it would be nice to keep this in our field of vision moving forward.

@mark-rushakoff
Contributor

Last time this happened, it was unnecessary Flux compilation at startup. See #15441 and #15777. Maybe there's something similar at play this time.

@jsteenb2
Contributor Author

jsteenb2 commented Jun 1, 2020

@mark-rushakoff I believe that's right. The regression appears at the algo-w merge.

@nathanielc nathanielc added this to the Sprint 20-Q2-6 milestone Jun 1, 2020
@jsternberg
Contributor

This seems to be due to a refactor that is loosely connected with the algorithm-w refactor. Previously, the bulk of the initialization work happened when finalizing the builtins. After the refactor, much of that work happens in RegisterPackage instead, since we now perform semantic analysis there rather than delaying it until the builtins are finalized.

There are probably two different ways to handle this. One would be to stop initializing the runtime in func init(), so that we aren't dependent on import order for initialization. Another might be to refactor the runtime registration so that it goes back to saving the AST and delaying the semantic analysis until the builtins are finalized.

@adrian-thurston
Contributor

Possibly init() is the wrong place to do this work. The init() function could simply record that the Flux package exists, and the real initialization work could happen later, just before any Flux code is evaluated.

@wolffcm

wolffcm commented Jun 25, 2020

Possible solution to this issue:

  • Modify the builtins tool so that it finds stdlib packages that have a function named StdlibInit(), and generates a function InitFlux() in packages.go that (1) calls all those functions and (2) performs the finalizing steps needed for a workable Flux runtime
  • Refactor flux repl, influx and influxd to call InitFlux()

Source code for builtins tool, relevant code:
https://github.com/influxdata/flux/blob/master/internal/cmd/builtin/cmd/generate.go#L156

@wolffcm

wolffcm commented Jun 29, 2020

We think there's a simple fix; we're timeboxing it to one day. If it takes longer, we'll need to move it into the next sprint.

@jsternberg
Contributor

The fix for this is in flux v0.70.0 which was included in #18783.

7 participants