
Provides a throttling strategy based on the number of pending blocking tasks. #4073

Open · wants to merge 16 commits into main from issue/3829

Conversation

TheWeaVer (Contributor)

Motivation:

Provide a throttling strategy for blocking requests, to prevent situations that worsen rapidly under load.

Modifications:

  • Add a new throttling strategy based on the given limit size

Result:

codecov bot commented Feb 9, 2022

Codecov Report

Attention: Patch coverage is 12.00%, with 44 lines in your changes missing coverage. Please review.

Project coverage is 74.00%. Comparing base (bc2454a) to head (b94e0f4).
Report is 100 commits behind head on main.

❗ The current head b94e0f4 differs from the pull request's most recent head b9e34a1. Consider uploading reports for the commit b9e34a1 to get more accurate results.

Files Patch % Lines
...ommon/util/DefaultLimitedBlockingTaskExecutor.java 0.00% 41 Missing ⚠️
...meria/common/util/LimitedBlockingTaskExecutor.java 0.00% 2 Missing ⚠️
.../armeria/server/throttling/ThrottlingStrategy.java 0.00% 1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##             main    #4073       +/-   ##
===========================================
+ Coverage        0   74.00%   +74.00%     
- Complexity      0    18099    +18099     
===========================================
  Files           0     1530     +1530     
  Lines           0    67141    +67141     
  Branches        0     8479     +8479     
===========================================
+ Hits            0    49686    +49686     
- Misses          0    13392    +13392     
- Partials        0     4063     +4063     


trustin (Member) commented Mar 14, 2022

This PR has an inevitable race condition where the number of pending tasks can exceed the desired limit. How about providing an API that wraps a given Executor, ExecutorService, or other well-known executor-like interface, like the following:

myWrappedExecutor = ThrottlingStrategy.wrap(myExecutor, 10 /* or myLimitSupplier */)

Server
  .builder()
  .blockingTaskExecutor(myWrappedExecutor)
  .decorator(myWrappedExecutor.asDecorator())
  // or
  .decorator(ThrottlingService.newDecorator(myWrappedExecutor.throttlingStrategy()))
  ...
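The `ThrottlingStrategy.wrap` API above is only a proposal and does not exist yet. As a rough idea of what such a wrapper might do internally, here is a minimal plain-Java sketch; the class name `LimitingExecutor` and the method `hasCapacity` are hypothetical, not part of Armeria:

```java
import java.util.concurrent.Executor;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.IntSupplier;

// Hypothetical sketch: wraps an Executor, counts in-flight tasks, and
// exposes a check a throttling strategy could consult.
final class LimitingExecutor implements Executor {
    private final Executor delegate;
    private final IntSupplier limitSupplier;
    private final AtomicInteger pendingTasks = new AtomicInteger();

    LimitingExecutor(Executor delegate, IntSupplier limitSupplier) {
        this.delegate = delegate;
        this.limitSupplier = limitSupplier;
    }

    /** Returns true when a new request should be accepted. */
    boolean hasCapacity() {
        return pendingTasks.get() < limitSupplier.getAsInt();
    }

    @Override
    public void execute(Runnable command) {
        pendingTasks.incrementAndGet();          // count the task on submission
        delegate.execute(() -> {
            try {
                command.run();
            } finally {
                pendingTasks.decrementAndGet();  // count it down once it has run
            }
        });
    }
}
```

Because both the check and the counter live in the same wrapper, the strategy and the executor cannot drift apart, although the check-then-submit window still allows a small transient overshoot.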

@trustin trustin modified the milestones: 1.15.0, 1.16.0 Mar 22, 2022
TheWeaVer (Author)

Let me try! Thank you @trustin

trustin (Member) commented Apr 7, 2022

@TheWeaVer Great! I'll stay tuned ❤️

@ikhoon ikhoon modified the milestones: 1.16.0, 1.17.0 Apr 15, 2022
@minwoox minwoox modified the milestones: 1.17.0, 1.18.0 Jun 27, 2022
@ikhoon ikhoon modified the milestones: 1.18.0, 1.19.0 Jul 22, 2022
@ikhoon ikhoon modified the milestones: 1.19.0, 1.20.0 Aug 30, 2022
@TheWeaVer TheWeaVer force-pushed the issue/3829 branch 2 times, most recently from 2ec2463 to b94e0f4 Compare September 16, 2022 14:52
@minwoox minwoox modified the milestones: 1.20.0, 1.21.0 Sep 29, 2022
@ikhoon ikhoon modified the milestones: 1.21.0, 1.22.0 Dec 12, 2022
@@ -113,6 +116,18 @@ public static <T extends Request> ThrottlingStrategy<T> rateLimiting(
return new RateLimitingThrottlingStrategy<>(requestsPerSecond, name);
}

/**
* Returns a new {@link ThrottlingStrategy} that provides a throttling strategy based on given
* {@link SettableIntSupplier} by comparing it to the size of the tasks of the {@link BlockingTaskExecutor}.
Member suggested change:
- * {@link SettableIntSupplier} by comparing it to the size of the tasks of the {@link BlockingTaskExecutor}.
+ * {@link IntSupplier} by comparing it to the size of the tasks of the {@link BlockingTaskExecutor}.

* @param limitSupplier the {@link IntSupplier} which indicates limit of the tasks
* @param name the name of the {@link ThrottlingStrategy}
*/
public static <T extends Request> ThrottlingStrategy<T> blockingTaskLimiting(
Member:

Could we also add a version that accepts an int instead of IntSupplier for a simpler use case where a user just wants to enforce a static limit?

TheWeaVer (Author):

I added a new method 😄

@TheWeaVer TheWeaVer requested a review from trustin June 1, 2023 12:43
@ikhoon ikhoon modified the milestones: 1.24.0, 1.25.0 Jun 2, 2023
@jrhee17 jrhee17 modified the milestones: 1.25.0, 1.26.0 Jul 18, 2023
@minwoox minwoox modified the milestones: 1.26.0, 1.27.0 Oct 11, 2023
@ikhoon ikhoon modified the milestones: 1.27.0, 1.28.0 Jan 5, 2024
@github-actions github-actions bot requested a review from jrhee17 March 21, 2024 03:53
jrhee17 (Contributor) left a comment:

Thanks for the initiative!

I think as it stands, it is difficult to keep the executor state and our DefaultBlockingTaskExecutor in sync.

What do you think of simply following micrometer's approach and trying to access the number of queued tasks directly, as suggested here: #4073 (comment)?

For instance, I think it's fine to watch (ThreadPoolExecutor#getQueue#size, ForkJoinPool::getQueuedTaskCount) and throttle requests based on this value.
If an unsupported ScheduledExecutorService is used to throttle requests, I think we can just throw an exception on initialization.

Let me know what you think @line/dx

#4073 (comment)

Nevermind, I saw this comment late.
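For reference, the executor-introspection approach mentioned above (watching `ThreadPoolExecutor#getQueue#size` or `ForkJoinPool::getQueuedTaskCount`) could be probed with a small helper like this; the class and method names here are hypothetical, only the two JDK accessors are real:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.ThreadPoolExecutor;

final class ExecutorQueueSizes {
    private ExecutorQueueSizes() {}

    // Returns the number of queued (not yet started) tasks, or throws for
    // executor types whose internal queue cannot be observed directly.
    static long pendingTasks(ExecutorService executor) {
        if (executor instanceof ThreadPoolExecutor) {
            return ((ThreadPoolExecutor) executor).getQueue().size();
        }
        if (executor instanceof ForkJoinPool) {
            return ((ForkJoinPool) executor).getQueuedTaskCount();
        }
        // As suggested in the review: fail fast on unsupported executor types.
        throw new IllegalArgumentException(
                "Cannot observe the task queue of: " + executor.getClass().getName());
    }
}
```

This avoids any separate bookkeeping, at the cost of rejecting arbitrary ScheduledExecutorService implementations at initialization time.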

try {
command.run();
} finally {
taskCounter.decrementAndGet();
jrhee17 (Contributor):

I think taskCounter will be decremented on each scheduled task invocation, but no increment occurs, so this may not correctly reflect the number of pending tasks.

}
});
submitted = true;
return future;
jrhee17 (Contributor):

If future.cancel is called immediately, without the task being given a chance to run, I think it is possible that taskCounter is incremented without ever being decremented.

I think we may also need to wrap the returned future and check whether cancel was called successfully.

jrhee17 (Contributor) commented Mar 21, 2024:

i.e. We may need something like this:

  class BlockingFuture<T> implements ScheduledFuture<T> {
    // A ScheduledFuture delegate is needed so getDelay()/compareTo() can delegate too.
    private final ScheduledFuture<T> delegate;
    BlockingFuture(ScheduledFuture<T> delegate) {
        this.delegate = delegate;
    }
    @Override
    public boolean cancel(boolean mayInterruptIfRunning) {
        final boolean cancelled = delegate.cancel(mayInterruptIfRunning);
        if (cancelled) {
            taskCounter.decrementAndGet();
        }
        return cancelled;
    }
    // ... the remaining Future/ScheduledFuture methods simply delegate.
  }
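Putting the two review points together (increment at submission, decrement exactly once whether the task ran or was cancelled), a self-contained sketch might look like the following; the class `CountingScheduler` and its methods are hypothetical illustrations, not part of this PR:

```java
import java.util.concurrent.Future;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: keep a pending-task counter balanced whether a task
// completes normally or is cancelled before it ever gets a chance to run.
final class CountingScheduler {
    private final AtomicInteger taskCounter = new AtomicInteger();
    private final ScheduledExecutorService delegate;

    CountingScheduler(ScheduledExecutorService delegate) {
        this.delegate = delegate;
    }

    int pendingTasks() {
        return taskCounter.get();
    }

    ScheduledFuture<?> schedule(Runnable command, long delay, TimeUnit unit) {
        taskCounter.incrementAndGet();               // count the task on submission
        return delegate.schedule(() -> {
            try {
                command.run();
            } finally {
                taskCounter.decrementAndGet();       // count it down once it has run
            }
        }, delay, unit);
    }

    // Cancelling through this helper balances the submission-time increment
    // for a task whose finally block will never execute.
    boolean cancel(Future<?> future) {
        final boolean cancelled = future.cancel(false);
        if (cancelled) {
            taskCounter.decrementAndGet();
        }
        return cancelled;
    }
}
```

This sketch sidesteps wrapping the returned future by funneling cancellation through a helper method; a production version would wrap the `ScheduledFuture` as the review suggests, so that callers cancelling the future directly are also counted correctly.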

jrhee17 (Contributor) commented Mar 21, 2024

Can you also check the CI failures? I think this PR should be almost done once my comments are addressed 🙇

@jrhee17 jrhee17 modified the milestones: 1.28.0, 1.29.0 Apr 2, 2024
@github-actions github-actions bot added the Stale label May 6, 2024
@minwoox minwoox modified the milestones: 1.29.0, 1.30.0 May 21, 2024
@github-actions github-actions bot removed the Stale label Jun 7, 2024
@github-actions github-actions bot added the Stale label Jul 7, 2024
@ikhoon ikhoon removed this from the 1.30.0 milestone Aug 1, 2024
@github-actions github-actions bot removed the Stale label Aug 19, 2024
@github-actions github-actions bot added the Stale label Oct 14, 2024
Labels: new feature, sprint (Issues for OSS Sprint participants), Stale
Projects: None yet

Successfully merging this pull request may close these issues:
  • Provides a throttling strategy based on the number of pending blocking tasks.

5 participants