
Feat: parallelize state-keep #5504

Merged: 1 commit merged into kubevela:master on Feb 15, 2023

Conversation

@Somefive (Collaborator) commented Feb 14, 2023

Signed-off-by: Somefive <yd219913@alibaba-inc.com>

Description of your changes

Parallelize state-keep to improve its efficiency.

Effect: in load testing with 3k applications (each containing 1 Deployment, 1 Secret, and 1 ConfigMap), the average reconcile time is reduced by roughly 30%~40% when state-keep is in use.
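
For reviewers unfamiliar with the change: the idea is to run the per-resource state-keep work concurrently instead of sequentially, bounded by a worker limit. The sketch below is a rough illustration of that pattern using golang.org/x/sync/errgroup, not the actual code in pkg/resourcekeeper/statekeep.go; the resource type, keepResourceState, stateKeepParallel, and maxWorkers are all hypothetical names.

```go
// Illustrative sketch only (not the actual KubeVela implementation): the real
// change lives in pkg/resourcekeeper/statekeep.go. Names such as resource,
// keepResourceState, stateKeepParallel, and maxWorkers are hypothetical.
package main

import (
	"context"
	"fmt"

	"golang.org/x/sync/errgroup"
)

// resource is a hypothetical stand-in for a managed resource tracked for an application.
type resource struct{ name string }

// keepResourceState is a hypothetical stand-in for re-applying the desired
// state of a single managed resource (the per-item body of the state-keep loop).
func keepResourceState(ctx context.Context, r resource) error {
	fmt.Println("state-keep:", r.name)
	return nil
}

// stateKeepParallel runs the per-resource state-keep work concurrently with at
// most maxWorkers goroutines in flight, returning the first error encountered.
func stateKeepParallel(ctx context.Context, resources []resource, maxWorkers int) error {
	g, ctx := errgroup.WithContext(ctx)
	g.SetLimit(maxWorkers)
	for _, r := range resources {
		r := r // capture the loop variable for the goroutine (pre-Go 1.22)
		g.Go(func() error {
			return keepResourceState(ctx, r)
		})
	}
	return g.Wait()
}

func main() {
	// Mirrors the load-test shape: each application manages a Deployment,
	// a Secret, and a ConfigMap.
	rs := []resource{{"deployment"}, {"secret"}, {"configmap"}}
	if err := stateKeepParallel(context.Background(), rs, 5); err != nil {
		fmt.Println("state-keep failed:", err)
	}
}
```

Bounding the number of concurrent workers keeps the controller from flooding the API server when an application manages many resources while still removing the sequential bottleneck.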

I have:

  • Read and followed KubeVela's contribution process.
  • Related docs updated properly. For a new feature or configuration option, an update to the documentation is necessary.
  • Run make reviewable to ensure this PR is ready for review.
  • Added backport release-x.y labels to auto-backport this PR if necessary.

How has this code been tested

Special notes for your reviewer

Signed-off-by: Somefive <yd219913@alibaba-inc.com>
codecov bot commented Feb 14, 2023

Codecov Report

Base: 47.04% // Head: 59.58% // Increases project coverage by +12.54% 🎉

Coverage data is based on head (779c541) compared to base (059f248).
Patch coverage: 71.69% of modified lines in pull request are covered.

Additional details and impacted files
@@             Coverage Diff             @@
##           master    #5504       +/-   ##
===========================================
+ Coverage   47.04%   59.58%   +12.54%     
===========================================
  Files         307      305        -2     
  Lines       47023    46954       -69     
===========================================
+ Hits        22123    27979     +5856     
+ Misses      22127    16051     -6076     
- Partials     2773     2924      +151     
Flag                     Coverage Δ
apiserver-e2etests       35.75% <39.62%> (+0.01%) ⬆️
apiserver-unittests      36.31% <ø> (+0.05%) ⬆️
core-unittests           55.36% <71.69%> (?)
e2e-multicluster-test    ?
e2e-rollout-tests        20.99% <39.62%> (+0.02%) ⬆️
e2etests                 26.56% <39.62%> (-0.02%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                         Coverage Δ
pkg/resourcekeeper/statekeep.go        51.11% <67.39%> (+9.86%) ⬆️
pkg/resourcekeeper/cache.go            95.16% <100.00%> (ø)
pkg/resourcekeeper/gc.go               77.41% <100.00%> (+19.94%) ⬆️
pkg/auth/identity.go                   8.45% <0.00%> (-67.61%) ⬇️
pkg/auth/kubeconfig.go                 0.00% <0.00%> (-52.85%) ⬇️
pkg/resourcetracker/tree.go            18.49% <0.00%> (-45.29%) ⬇️
pkg/policy/envbinding/placement.go     40.00% <0.00%> (-36.25%) ⬇️
pkg/auth/privileges.go                 39.32% <0.00%> (-30.06%) ⬇️
pkg/utils/jwt.go                       35.00% <0.00%> (-20.00%) ⬇️
pkg/utils/util/cmd.go                  0.00% <0.00%> (-20.00%) ⬇️
... and 155 more


☔ View full report at Codecov.

@FogDong (Member) left a comment

Should this be backported?

@wonderflow (Collaborator)

No need to backport.

@wonderflow (Collaborator)

But can we have a number for how much the efficiency improves? Do we have any pressure test results?

@Somefive (Collaborator, Author)

But can we have a number for how much the efficiency improves? Do we have any pressure test results?

Added to the description.

@Somefive Somefive merged commit e209b28 into kubevela:master Feb 15, 2023
@Somefive Somefive deleted the feat/parallelize-state-keep branch February 15, 2023 02:01
zhaohuiweixiao pushed a commit to zhaohuiweixiao/kubevela that referenced this pull request Mar 7, 2023
Signed-off-by: Somefive <yd219913@alibaba-inc.com>
Labels: none yet
Projects: none yet
Development: successfully merging this pull request may close these issues.
3 participants