
When considering the size of shadow replica shards, set size to 0 #17509

Merged
merged 1 commit into from Apr 7, 2016

Conversation

dakrone
Member

@dakrone dakrone commented Apr 4, 2016

Otherwise, when calculating the amount of disk usage after the
shard has been allocated, the calculation incorrectly subtracts the shadow
replica size.

Resolves #17460
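The idea behind the fix can be sketched as follows. This is a hypothetical simplification, not the actual Elasticsearch source: the `ShardSizeEstimator` class, its method name, and its parameters are all illustrative. A shadow replica shares the primary's files on a shared filesystem, so allocating it consumes no additional disk space, and its expected size should be reported as zero instead of being subtracted from the node's free space.

```java
// Hypothetical sketch of the fix (illustrative names, not Elasticsearch code):
// treat shadow replica shards as occupying zero bytes when estimating how
// much disk an allocation will consume on the target node.
class ShardSizeEstimator {

    /** Expected on-disk size of a shard that is about to be allocated. */
    static long getExpectedShardSize(boolean isShadowReplica, long reportedSizeInBytes) {
        if (isShadowReplica) {
            // A shadow replica reads the primary's files from the shared
            // filesystem, so it adds no disk usage of its own.
            return 0L;
        }
        return reportedSizeInBytes;
    }

    public static void main(String[] args) {
        System.out.println(getExpectedShardSize(true, 1024L));   // shadow replica
        System.out.println(getExpectedShardSize(false, 1024L));  // regular shard
    }
}
```

With this approach the disk-threshold logic no longer subtracts a size that was never actually going to be written to the node.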

@dakrone dakrone changed the title from "When considering the size of shadow replica indices, set size to 0" to "When considering the size of shadow replica shards, set size to 0" Apr 4, 2016
@abeyad

abeyad commented Apr 7, 2016

LGTM!

@dakrone dakrone merged commit 8e01b09 into elastic:master Apr 7, 2016
@dakrone dakrone deleted the disk-info-ignore-shadow-size branch May 13, 2016 16:44
@lcawl lcawl added the :Distributed/Distributed label ("A catch all label for anything in the Distributed Area. If you aren't sure, use this one.") and removed the :Allocation label Feb 13, 2018
@clintongormley clintongormley added the :Distributed/Allocation label ("All issues relating to the decision making around placing a shard (both master logic & on the nodes)") and removed the :Distributed/Distributed label Feb 14, 2018
Labels
>bug, :Distributed/Allocation, v2.2.2, v2.3.2, v5.0.0-alpha2

4 participants