LPS-163590 - Update bundler and add package.json dependencies back in #2753
Conversation
CI is automatically triggering the following test suites:
✔️ ci:test:sf - 1 out of 1 jobs passed in 18 minutes
Base Branch: master
Sender Branch: LPS-163590
1 Successful Job
Jenkins Build: test-portal-source-format#3440
Jenkins Report: jenkins-report.html
Jenkins Suite: sf
Pull Request: liferay-frontend#2753
Testray Routine: EE Pull Request
Testray Build: [master] ci:test:sf - bryceosterhaus > liferay-frontend - PR#2753 - 2022-09-28[03:55:49]
Testray Importer: publish-testray-report#12646
👌
Does this PR mean I would have to add frontend-js-web to a module's package.json? These occurrences are rare if they even exist, but I think we should be careful about merging either without coordinating.
Jenkins Build: test-portal-acceptance-pullrequest(master)#5184
Jenkins Report: jenkins-report.html
Jenkins Suite: relevant
Pull Request: liferay-frontend#2753
Testray Routine: EE Pull Request
Testray Build: [master] ci:test:relevant - bryceosterhaus > liferay-frontend - PR#2753 - 2022-09-28[04:16:23]
Testray Importer: publish-testray-report#12652
Thank you very much for mentioning me, @bryceosterhaus! 🙂 I downloaded this branch to my machine; I'll compare its results against the master branch and then get back to you. In addition, just to let you know, I set the property […]
@kresimir-coko yeah, going forward we would add all dependencies to package.json, including frontend-js-web. However, even if this PR gets merged and you forget to add frontend-js-web […]
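For illustration only (the version range here is a placeholder, not a convention this PR prescribes), a module that uses frontend-js-web would now declare it explicitly in its package.json:

```json
{
	"dependencies": {
		"frontend-js-web": "*"
	}
}
```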
Hi @bryceosterhaus, I ran the comparison between this branch and master: […] All results mentioned above are based on data extracted using the property mentioned in my previous comment. A few months ago, Brian Chan created a script to measure […] Thanks
Hi @bryceosterhaus, I just finished running the script created by Brian and I noticed the following:
Please check the evidence below. Sorry about the quality of the images; I've had some issues taking screenshots. 😁 Therefore, the impact of merging the […] @bryceosterhaus, feel free to share this analysis with Brian if he asks about the performance decrease related to this pull request. Thanks
@natocesarrego thanks so much for the detailed report! So just to be clear, the worst case scenario you are seeing is […]
Am I correct in seeing that? For this point: […]
That seems like a really large gap, especially when considered against your other tests. Do you think the 7-minute difference is really due to my changes, or do you think it stems from the other processes that were going on? I will use your info and continue to look into this next week, and hopefully I can get this PR more in line with master's build times.
Hi @bryceosterhaus, answering your questions:
1. Yes.
2. Honestly, I don't think your proposed changes are causing the 7-minute delay; it is probably caused by a high load on the network. IMHO, we can consider my latest runs a valid performance analysis. Even so, I believe it's worth taking a look at the reports mentioned here, since they can help us identify the biggest build-time offenders. 🙂 Thanks for your partnership!
Just wanted to note some things about the tests… […]
3 may sound like overkill, but it's the only correct way to check these things 🤷 I admit that I haven't done it that way either when I've made performance changes, but maybe we should start doing it 🤔 In the end, we are talking about hundreds of machines running ant all every day…
Yeah, I agree, 8 seconds seems pretty negligible compared to 8 minutes.
I am getting a VM for running nightly performance tests, and one thing we could also add on that machine is measuring how long the bundle process takes in DXP. That way we could track larger deviations. The nightly performance checks should only run once or twice a day, so we can use the remaining machine time to track build times for our FE tools.
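As a rough illustration of that kind of measurement (a sketch, not part of this PR; the ant all command and the repeated-run idea are taken from this thread), a small Node script could time a few builds and average them:

```js
// time-build.js — illustrative sketch for timing repeated builds.
const {execSync} = require('child_process');

const RUNS = 3; // repeated runs help smooth out machine/network noise
const COMMAND = 'ant all'; // assumed build command, per the thread

const times = [];

for (let i = 0; i < RUNS; i++) {
	const start = Date.now();

	// Run the build synchronously, streaming its output to the console.
	execSync(COMMAND, {stdio: 'inherit'});

	times.push((Date.now() - start) / 1000);
}

const average = times.reduce((sum, t) => sum + t, 0) / times.length;

console.log(`Runs (s): ${times.map((t) => t.toFixed(1)).join(', ')}`);
console.log(`Average (s): ${average.toFixed(1)}`);
```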
I've run some tests to iterate several […]
Force-pushed from bf0998c to 07821a7
ci:test:sf
ci:test:relevant
ci:test:sf
Because this PR touches pretty much every module and triggers so many tests in portal, I think it makes sense to forward to BChan once SF is green. The previous unique failure seemed to be flaky, as it passed locally for me.
✔️ ci:test:sf - 1 out of 1 jobs passed in 23 minutes
Base Branch: master
Sender Branch: LPS-163590
1 Successful Job
ci:test:relevant
manual forward at brianchandotcom#124234 |
I additionally spawned this issue, liferay/liferay-frontend-projects#1026, to investigate.
Jenkins Build: test-portal-source-format#520
Jenkins Report: jenkins-report.html
Jenkins Suite: sf
Pull Request: liferay-frontend#2753
Testray Routine: EE Pull Request
Testray Build: [master] ci:test:sf - bryceosterhaus > liferay-frontend - PR#2753 - 2022-10-07[01:00:10]
Testray Importer: publish-testray-report#1611
Jenkins Build: test-portal-acceptance-pullrequest(master)#4140
Jenkins Report: jenkins-report.html
Jenkins Suite: relevant
Pull Request: liferay-frontend#2753
Testray Routine: EE Pull Request
Testray Build: [master] ci:test:relevant - bryceosterhaus > liferay-frontend - PR#2753 - 2022-10-06[23:47:12]
Testray Importer: publish-testray-report#9463
Jenkins Build: test-portal-acceptance-pullrequest(master)#4924
Jenkins Report: jenkins-report.html
Jenkins Suite: relevant
Pull Request: liferay-frontend#2753
Testray Routine: EE Pull Request
Testray Build: [master] ci:test:relevant - bryceosterhaus > liferay-frontend - PR#2753 - 2022-10-07[01:26:43]
Testray Importer: publish-testray-report#12466
~~Reviewed. Test failures are unrelated. ✔️~~
Sorry, this analysis was meant for #2775 |
https://issues.liferay.com/browse/LPS-163590
We previously declared all dependencies in package.json files, but then realized this was causing a significant slowdown in build times because we were re-bundling dependencies that were already provided to the platform. So with this change, we now look to the global npmscripts.config.js for the bundler config and only bundle a dependency if it isn't already provided by another module.
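To illustrate that provider lookup, the global config could map each provider module to the packages it already exposes at runtime. This is a hypothetical sketch: the key names below are illustrative, and the real schema is whatever @liferay/npm-scripts and this PR define.

```js
// npmscripts.config.js — hypothetical sketch; key names are illustrative.
module.exports = {
	build: {
		bundler: {
			// Provider module -> packages it already exposes at runtime.
			// A dependency found in this map is imported from its provider
			// instead of being re-bundled into the consuming module.
			imports: {
				'frontend-js-web': {
					'frontend-js-web': '*',
				},
			},
		},
	},
};
```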
With this change, we can disable and remove the SF rule that removed all of the package.json dependencies.
cc @ling-alan-huang please take a look at this and see if I need to do anything else with the SF rule.
cc @natocesarrego since you originally discovered the slowdown in build times, could you verify here that adding the dependencies back in doesn't slow down the build times again? I tested it myself and it was consistent with current master builds, but I just want you to verify as well, in case you have more sophisticated tools for checking build speeds.