
Add ability to drain a pool #11990

Closed
jgspratt opened this issue Apr 7, 2021 · 6 comments


jgspratt commented Apr 7, 2021

A MinIO cluster consists of one or more pools. Over time, it is typical to expand a cluster by adding additional pools.

Consider the following example:

Initial startup command:

minio server http://host{1...4}/export{1...4}

Startup command after adding larger hosts 5...8:

minio server http://host{1...4}/export{1...4} http://host{5...8}/export{1...16}

When it is time to retire hardware, it is important to be able to migrate the data in parallel while keeping the cluster online.

Currently, in the example above, buckets begin to span both the original hosts 1...4 and the new hosts 5...8. There is no way to tell MinIO, "Please empty hosts 1...4 because I intend to retire them soon."

As a solution, consider the following command as a possibility:

mc admin pool drain http://host{1...4}/export{1...4}

Discussion

An argument for data-rate throughput or priority should be available so that the drain can run in the background while continuing to provide high-quality service to consumers.
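To make that concrete, the drain command could accept a rate cap or a priority class. These flags are purely illustrative; neither `--max-rate` nor `--priority` exists in `mc` today:

```shell
# Hypothetical flag sketch -- not real mc options
mc admin pool drain --max-rate 200MiB/s http://host{1...4}/export{1...4}
mc admin pool drain --priority low http://host{1...4}/export{1...4}
```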

MinIO would then either exit with code 0, indicating that the pool has begun draining, or print an error message describing the failure and exit with code 1.

It is important that pool draining persist through cluster reboots (or, perhaps preferably, return as "draining paused", which would be a read-only mode).

It is important that the pool drain is guaranteed to finish: new objects should be directed to the other online pools, so that the drain cannot proceed more slowly than new data fills the pool.

It is important that the pool draining take place in parallel, using all of the network interfaces of the source servers and the destination servers.
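The three requirements above can be sketched as a small state machine. This is hypothetical bookkeeping, not MinIO code: it models that new writes never land on a draining pool (so the drain converges) and that an in-flight drain comes back as "draining paused" (read-only) after a restart:

```python
# Hypothetical drain state machine -- names and states are illustrative.
from enum import Enum

class PoolState(Enum):
    OPERATIONAL = "operational"
    DRAINING = "draining"
    DRAIN_PAUSED = "draining paused"   # read-only, as proposed above

class Cluster:
    def __init__(self, pools):
        # In a real implementation this state would be persisted to disk
        # so that it survives cluster reboots.
        self.state = {p: PoolState.OPERATIONAL for p in pools}

    def writable_pools(self):
        # New objects are only directed to fully operational pools,
        # guaranteeing the drain eventually finishes.
        return [p for p, s in self.state.items() if s is PoolState.OPERATIONAL]

    def start_drain(self, pool):
        self.state[pool] = PoolState.DRAINING

    def pause_drain(self, pool):
        self.state[pool] = PoolState.DRAIN_PAUSED

    def restart(self):
        # After a reboot, an in-flight drain returns as "draining paused"
        # rather than silently resuming.
        for p, s in self.state.items():
            if s is PoolState.DRAINING:
                self.state[p] = PoolState.DRAIN_PAUSED

c = Cluster(["pool1", "pool2"])
c.start_drain("pool1")
assert c.writable_pools() == ["pool2"]   # new data avoids the draining pool
c.restart()
assert c.state["pool1"] is PoolState.DRAIN_PAUSED
```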

In addition, consider the following commands as food for thought:

  • mc admin pool pause-drain http://host{1...4}/export{1...4} - pause the draining process (for example, to mitigate a performance problem)
  • mc admin pool resume-drain http://host{1...4}/export{1...4} - self-explanatory; the arguments for tuning the drain rate could also be accepted here
  • mc admin pool status http://host{1...4}/export{1...4} - show the status of the pool ("operational", "draining for X minutes: Y% (Z GiB/TiB/PiB & ## objects) remaining", "draining paused: Y% (Z GiB/TiB/PiB & ## objects) remaining", or "maintenance mode")
  • mc admin pool enter-maintenance http://host{1...4}/export{1...4} - put the pool into maintenance mode (read-only)
  • mc admin pool exit-maintenance http://host{1...4}/export{1...4} - opposite of enter-maintenance
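As a sketch of the proposed status line, the progress string could be computed as follows. The function name and exact format are illustrative, not an existing MinIO API:

```python
# Illustrative formatting for the proposed "mc admin pool status" output.
def drain_status(minutes, total_bytes, remaining_bytes, remaining_objects):
    # Percentage remaining is relative to the pool's total data at drain start.
    pct_remaining = 100 * remaining_bytes / total_bytes
    gib = remaining_bytes / 2**30
    return (f"draining for {minutes} minutes: {pct_remaining:.0f}% "
            f"({gib:.1f} GiB & {remaining_objects} objects) remaining")

print(drain_status(90, 4 * 2**40, 2**40, 1500000))
# -> draining for 90 minutes: 25% (1024.0 GiB & 1500000 objects) remaining
```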
@harshavardhana
Member

This is currently a work in progress and will be available in future releases.

stale bot commented May 8, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 15 days if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label May 8, 2021
jgspratt (Author) commented May 8, 2021

I would still like this feature.

@daniel-naegele

Any ETA or current status on this? Is there a branch to which contributions are possible?
This feature would help a lot, especially for smaller projects that start small, scale up quickly, and later want to retire old pools, which at that point add nothing but cost to the project.

A workaround would be to start a completely new MinIO cluster and replicate the buckets there. But that requires paying roughly double for server rental, since two separate clusters must stay online for some time.
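The workaround could look roughly like the following, using the real `mc alias set` and `mc mirror` commands. Aliases, endpoints, credentials, and bucket names are placeholders:

```shell
# Stand up a second cluster, then copy each bucket across.
mc alias set old https://old-cluster.example.com ACCESSKEY SECRETKEY
mc alias set new https://new-cluster.example.com ACCESSKEY SECRETKEY
mc mirror old/mybucket new/mybucket   # one-shot copy; --watch keeps syncing
```

Once the copy converges, clients are repointed at the new cluster and the old one is torn down, which is exactly the period of double cost described above.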

@klauspost
Contributor

@Butzlabben #12757

@harshavardhana
Member

This is already added and merged.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jan 23, 2023