
Conversation

cangeli-orange
Contributor

I know this solution may not be acceptable as-is, but I noticed that rebuilding the token map from scratch, instead of updating it in place, was faster and less CPU-intensive when adding a node to a cluster of more than 40 nodes and 10 keyspaces.
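The observation above is plausible when the map is a sorted, contiguous structure: inserting one node's tokens one by one costs a shift of the whole array per insertion, while a full rebuild pays a single sort. The sketch below is a minimal illustration in Python, not the driver's actual code; the data layout and function names (`rebuild_token_map`, `update_token_map`, the `id`/`tokens` keys) are all assumptions made for the example.

```python
import bisect

# Hypothetical token map: a sorted list of (token, owner) pairs.
# Each node is a dict like {"id": "n0", "tokens": [0, 5]}.

def rebuild_token_map(nodes):
    """Build the map from scratch: gather every token, then sort once.

    Cost: O(T log T) for T total tokens, regardless of cluster size.
    """
    entries = [(t, n["id"]) for n in nodes for t in n["tokens"]]
    entries.sort()
    return entries

def update_token_map(token_map, new_node):
    """Incrementally insert one node's tokens into the existing map.

    Each insort into a contiguous sorted list is O(T) because the tail
    must shift, so adding a node with v tokens costs O(v * T) — this is
    how an "update" can end up slower than a full rebuild.
    """
    for t in new_node["tokens"]:
        bisect.insort(token_map, (t, new_node["id"]))
    return token_map
```

Under these assumptions, both paths produce the same map; the difference is purely in the cost model, and the rebuild's single sort wins once the cluster (and the per-keyspace replica computation that follows) is large enough.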

@datastax-bot

Hi @cangeli-orange, thanks for your contribution!

In order for us to evaluate and accept your PR, we ask that you sign a contribution license agreement. It's all electronic and will take just minutes.

Sincerely,
DataStax Bot.

@datastax-bot

Thank you @cangeli-orange for signing the Contribution License Agreement.

Cheers,
DataStax Bot.

@mpenick
Contributor

mpenick commented Apr 27, 2017

Interesting, thanks for the patch. I'll investigate why updating is slower...

@mpenick
Contributor

mpenick commented May 3, 2017

I found a bug whose fix resolves the performance gap between updating and rebuilding. Patch incoming...

@mpenick
Contributor

mpenick commented May 8, 2017

@mpenick mpenick closed this May 8, 2017
mpenick added a commit that referenced this pull request Dec 10, 2019
3 participants