
Migrate to jsii-srcmak #276

Merged: 4 commits into hashicorp:master, Aug 20, 2020

Conversation

@jsteinich (Collaborator)

Addresses #272.

Python integration test is currently failing, but all other tests are passing.

@jsteinich (Collaborator, Author)

It's also extremely slow, so hopefully that's related to the failure

@jsteinich (Collaborator, Author)

I believe the error I'm encountering is the same as cdk8s-team/cdk8s#273 since both the aws provider and the vpc module are getting set to the generated namespace.

@jsteinich (Collaborator, Author)

@eladb Before I start profiling things, do you have any idea why jsii-srcmak would be so much slower than jsii + jsii-pacmak in a simple wrapper script?
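(For context, a rough sketch of what this call looks like from a wrapper's point of view: previously cdktf drove jsii to compile and then jsii-pacmak to emit per-language packages, while jsii-srcmak wraps both behind one programmatic call. The option names below are based on jsii-srcmak's README around this time and are assumptions, not code from this PR.)

```ts
// Sketch only: option names approximate jsii-srcmak's documented API circa 2020.
import { srcmak } from 'jsii-srcmak';

export async function generatePythonBindings(srcdir: string, outdir: string): Promise<void> {
  // One call replaces the separate `jsii` + `jsii-pacmak` steps.
  await srcmak(srcdir, {
    python: {
      outdir,                // where the generated Python sources are written
      moduleName: 'imports', // hypothetical module name
    },
  });
}
```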

@eladb (Collaborator) commented Aug 3, 2020

> @eladb Before I start profiling things, do you have any idea why jsii-srcmak would be so much slower than jsii + jsii-pacmak in a simple wrapper script?

No idea. I haven't noticed this myself but now I am curious. Let me know what you find out!

@skorfmann (Contributor)

> It's also extremely slow, so hopefully that's related to the failure

Yes, for Python it feels pretty slow indeed.

> I believe the error I'm encountering is the same as awslabs/cdk8s#273 since both the aws provider and the vpc module are getting set to the generated namespace.

Looks like there might be a fix on master. But the build failed for some reason and therefore no package was published. Perhaps we can give it a shot straight from github?

@eladb (Collaborator) commented Aug 3, 2020

> Looks like there might be a fix on master. But the build failed for some reason and therefore no package was published. Perhaps we can give it a shot straight from github?

I am on it

@eladb (Collaborator) commented Aug 3, 2020

Released

@skorfmann (Contributor)

> It's also extremely slow, so hopefully that's related to the failure
>
> Yes, for Python it feels pretty slow indeed.

Well, without measuring it, the current version feels slow as well. :)

@skorfmann (Contributor)

> Released

Thanks!

@skorfmann mentioned this pull request Aug 4, 2020
const manifest = JSON.parse(await fs.readFile(jsiiPath, 'utf-8'));

// patch cdktf version in manifest because it's not stable
manifest.dependencies.cdktf = '999.999.999';
Contributor (commenting on the diff above)

Could you elaborate on this number?

@jsteinich (Collaborator, Author)

It could be anything. That value is just what cdk8s uses.

The idea is that the dependency version doesn't provide any meaningful information to the snapshot and so should be effectively ignored.
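(For illustration, a minimal sketch of how such a normalization typically feeds a snapshot test, assuming a Jest-style test; the path and test name are hypothetical and not taken from this PR.)

```ts
// Sketch: pin volatile fields before snapshotting so the snapshot only
// captures structure that actually matters.
import { readFile } from 'fs/promises';

const jsiiPath = '/tmp/example-project/.jsii'; // hypothetical path to the generated manifest

test('generated jsii manifest is stable', async () => {
  const manifest = JSON.parse(await readFile(jsiiPath, 'utf-8'));

  // The cdktf version changes on every build and carries no information for
  // the snapshot, so it is replaced with a fixed placeholder before comparison.
  manifest.dependencies.cdktf = '999.999.999';

  expect(manifest).toMatchSnapshot();
});
```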

@jsteinich (Collaborator, Author)

In my testing, the python integration test went from 4.5 minutes to 27 minutes with this change. I don't think we should merge until that time is cut down.

I haven't definitively tracked down the cause yet. My current hunch is that this line: https://github.com/aws/jsii-srcmak/blob/b3ca41e005af735adef79422033d179192875dfc/src/srcmak.ts#L20 is causing a lot of unnecessary file I/O.
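(One cheap way to narrow this down is to time the generation step in isolation; a sketch, with the actual call site left as a hypothetical comment:)

```ts
// Sketch: wrap the suspected step with a timer to see whether code generation
// itself or the surrounding file copying dominates the runtime.
export async function timed<T>(label: string, step: () => Promise<T>): Promise<T> {
  console.time(label);
  try {
    return await step();
  } finally {
    console.timeEnd(label);
  }
}

// Hypothetical usage around the generation call:
//   await timed('jsii-srcmak', () => srcmak(srcdir, options));
```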

@skorfmann (Contributor)

> In my testing, the python integration test went from 4.5 minutes to 27 minutes with this change. I don't think we should merge until that time is cut down.

OK, that's a huge difference. I'll try this on my machine again. What kind of setup do you have: Linux, Mac, Windows?

@jsteinich (Collaborator, Author)

> OK, that's a huge difference. I'll try this on my machine again. What kind of setup do you have: Linux, Mac, Windows?

That's running the same Docker container used by the build workflow, via the Docker Desktop WSL2 backend on a WSL2 filesystem, from Windows.

The GitHub runner also shows a decent increase in time: from ~7 minutes for all integration tests to ~23 minutes.

@eladb (Collaborator) commented Aug 4, 2020

> I haven't definitively tracked down the cause yet. My current hunch is that this line: https://github.com/aws/jsii-srcmak/blob/b3ca41e005af735adef79422033d179192875dfc/src/srcmak.ts#L20 is causing a lot of unnecessary file I/O.

Sounds like this is something we should be able to fix. Can you guys raise an issue against jsii-srcmak and we'll tackle it in short order?

@skorfmann (Contributor)

> The GitHub runner also shows a decent increase in time: from ~7 minutes for all integration tests to ~23 minutes.

I'm experimenting with #98 and a full jsii build takes 9 minutes. That's with cdktf@latest and "jsii": "^1.9.0". So it's probably not jsii-srcmak that got slower, but jsii itself?

(Screenshot: jsii build timing, 2020-08-05)

Also, we probably should upgrade and align all jsii dependencies.

@jsteinich (Collaborator, Author)

> I haven't definitively tracked down the cause yet. My current hunch is that this line: https://github.com/aws/jsii-srcmak/blob/b3ca41e005af735adef79422033d179192875dfc/src/srcmak.ts#L20 is causing a lot of unnecessary file I/O.
>
> Sounds like this is something we should be able to fix. Can you guys raise an issue against jsii-srcmak and we'll tackle it in short order?

While there is some unnecessary work being done (it's already being called from a temporary directory), that doesn't appear to be the issue. I'm still working to track down the main cause; once I have more conclusive information I'll file an issue.

@eladb (Collaborator) commented Aug 5, 2020

Copying @RomainMuller

@RomainMuller

I suggest you give 0x a shot to try to figure out where time is spent.

@skorfmann (Contributor)

@jsteinich thanks for creating the issue in the jsii repo

@skorfmann (Contributor) commented Aug 7, 2020

Is my assumption correct that this is mostly a CI issue around integration tests? I did some local testing again, and the times for cdktf get for the aws example ranged between one minute (macOS) and roughly two minutes (Docker, hashicorp/jsii-terraform), for both master and this branch.

If so, I'd probably ignore the performance regression for now.

If it's an actual user-facing issue, we should submit a PR to fix aws/jsii#1856.

Thoughts?

@jsteinich (Collaborator, Author)

> mostly a CI issue around integration tests

The other place where it will show up is the first time someone sets up a python project, or every project init if they use a different virtual environment per project.

@skorfmann (Contributor)

> mostly a CI issue around integration tests
>
> The other place where it will show up is the first time someone sets up a python project, or every project init if they use a different virtual environment per project.

So, a cdktf init --language python would cause it?

@jsteinich (Collaborator, Author)

> So, a cdktf init --language python would cause it?

If it's the first time running on a system / virtual env, then yes. Subsequent runs use cached data and are just subject to the standard slowness.

@skorfmann (Contributor)

> Subsequent runs use cached data and are just subject to the standard slowness.

Do you know how to clear that cache to reproduce easily?

@jsteinich (Collaborator, Author)

> Do you know how to clear that cache to reproduce easily?

Easiest repro I know is creating a new pipenv.
psf/black#248 (comment) might work for clearing.

@skorfmann (Contributor)

> Easiest repro I know is creating a new pipenv.

Hm, I believe I did that, and in a Docker container I'd get a new one anyway, I guess.

@jsteinich (Collaborator, Author)

> what do you think, push for a fix in jsii or just merge it?

Doesn't look like it would be too difficult to create a PR to make black optional. I don't know how long it would take to get the changes approved and released, though. We'd also have to get it added to jsii-srcmak, but that seems more likely to go quickly.

I'm not sure of the urgency of better Windows support, Java support, and anything else this is holding up, though. If any of those are wanted for the next release, I think it would make sense to get this merged, but then hold off on releasing until performance is improved.

@eladb (Collaborator) commented Aug 10, 2020

We can short-circuit black in srcmak in the meantime (just include a stub and add it to the beginning of the PATH when we execute pacmak).

Once merged, it will be released immediately.
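(A minimal sketch of that workaround, assuming a POSIX shell environment; the helper name and the direct pacmak invocation are hypothetical, not code from this PR or from jsii-srcmak:)

```ts
// Sketch: shadow `black` with a no-op shim on PATH for the duration of the
// packaging step, so the Python target skips reformatting entirely.
import { spawnSync } from 'child_process';
import * as fs from 'fs';
import * as os from 'os';
import * as path from 'path';

export function runPacmakWithoutBlack(projectDir: string) {
  // 1. Create a throwaway directory containing an executable named `black`
  //    that exits successfully without doing any work (POSIX-only sketch).
  const shimDir = fs.mkdtempSync(path.join(os.tmpdir(), 'black-shim-'));
  const shimPath = path.join(shimDir, 'black');
  fs.writeFileSync(shimPath, '#!/bin/sh\nexit 0\n');
  fs.chmodSync(shimPath, 0o755);

  // 2. Prepend the shim directory to PATH so the stub wins over the real black.
  const env = {
    ...process.env,
    PATH: [shimDir, process.env.PATH ?? ''].join(path.delimiter),
  };

  // 3. Run the packaging step (hypothetical direct pacmak invocation) with the
  //    patched environment; cdktf actually drives this through jsii-srcmak.
  return spawnSync('npx', ['jsii-pacmak'], { cwd: projectDir, env, stdio: 'inherit' });
}
```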

@skorfmann (Contributor)

> I'm not sure of the urgency of better Windows support, Java support, and anything else this is holding up, though.

It would be good to get this merged, so we don't have too many parallel tracks here.

> We can short-circuit black in srcmak in the meantime (just include a stub and add it to the beginning of the PATH when we execute pacmak).

Also a good idea 👍

@eladb (Collaborator) commented Aug 10, 2020

Would you guys like to pick this up?

@jsteinich (Collaborator, Author)

> Would you guys like to pick this up?

I'd feel comfortable stubbing black within cdktf, but I haven't managed to get a custom version of jsii-srcmak to work when calling from cdktf yet.

@eladb (Collaborator) commented Aug 10, 2020

On it

@eladb (Collaborator) commented Aug 10, 2020

Okay, coming up: aws/jsii#1864 (by @RomainMuller)

@jsteinich (Collaborator, Author)

@eladb Any idea when the next jsii release will be?

@skorfmann (Contributor) commented Aug 19, 2020

> @eladb Any idea when the next jsii release will be?

Just got released a few hours ago: https://github.com/aws/jsii/releases/tag/v1.11.0

@skorfmann (Contributor)

This is likely relevant: #328

@jsteinich (Collaborator, Author)

> @eladb Any idea when the next jsii release will be?
>
> Just got released a few hours ago: https://github.com/aws/jsii/releases/tag/v1.11.0

Looks like jsii-srcmak doesn't pass tests when upgrading to jsii 1.11.0. Once that's fixed and released (let me know if I can help), I'll get this updated.

@eladb (Collaborator) commented Aug 19, 2020

> Looks like jsii-srcmak doesn't pass tests when upgrading to jsii 1.11.0. Once that's fixed and released (let me know if I can help), I'll get this updated.

Released jsii-srcmak with jsii 1.11.0

@jsteinich marked this pull request as ready for review August 20, 2020 03:46
@skorfmann (Contributor)

Awesome work @jsteinich and thanks for the quick support @eladb @RomainMuller

@skorfmann merged commit 76c3e12 into hashicorp:master Aug 20, 2020
@anubhavmishra (Member) commented Aug 20, 2020

@skorfmann can we close #272?

@github-actions (bot)

I'm going to lock this pull request because it has been closed for 30 days. This helps our maintainers find and focus on the active issues. If you've found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions bot locked as resolved and limited conversation to collaborators Dec 14, 2022