
Deadlock on ConcurrentRequestQueue.WaitForBelowRequestLimit() #3414

Closed
yyjdelete opened this issue May 22, 2019 · 7 comments
Labels
bug Bug report / Bug fix
Milestone

Comments

@yyjdelete


NLog version: 4.6.3

Platform: .Net 4.7.2/ .NET Core 2

Current NLog config (xml or C#, if relevant)
It happens when setting <default-wrapper xsi:type="AsyncWrapper" overflowAction="Block" forceLockingQueue="false" /> (or on .NET Core 2.0 without forceLockingQueue), and can be reproduced with the code below.

// Three producer threads enqueue 20 events each into a queue with
// RequestLimit = 10 and overflow action Block, while a single consumer
// dequeues in batches of 3 after a short delay.
var queue = new NLog.Targets.Wrappers.ConcurrentRequestQueue(10, NLog.Targets.Wrappers.AsyncTargetWrapperOverflowAction.Block);

for (var j = 0; j < 3; ++j)
{
    int cur = j;
    _ = Task.Run(() =>
    {
        for (int i = 0; i < 20; ++i)
        {
            queue.Enqueue(default(NLog.Common.AsyncLogEventInfo));
            Console.WriteLine($"Thread {cur}: PostEnqueue {i + 1}");
        }
    });
}
Task.Run(() =>
{
    Thread.Sleep(1000);
    int left = 20 * 3;
    while (left > 0)
    {
        var tmp = queue.DequeueBatch(3);
        left -= tmp.Length;
        if (tmp.Length != 0)
            Console.WriteLine($"DequeueBatch left={left}");
    }
    Console.WriteLine("Finished");
}).Wait();

You can see some threads blocked at line 133 (Monitor.Enter(_logEventInfoQueue);) even though the queue is empty at that point. It seems Monitor.Wait() should not change the value of lockTaken at line 141.

private long WaitForBelowRequestLimit()
{
    long currentCount;
    bool lockTaken = false;
    try
    {
        // Attempt to yield using SpinWait
        currentCount = TrySpinWaitForLowerCount();

        // If yield did not help, then wait on a lock
        while (currentCount > RequestLimit)
        {
            if (!lockTaken)
            {
                InternalLogger.Debug("Blocking because the overflow action is Block...");
                Monitor.Enter(_logEventInfoQueue);
                lockTaken = true;
                InternalLogger.Trace("Entered critical section.");
            }
            else
            {
                InternalLogger.Debug("Blocking because the overflow action is Block...");
                if (!Monitor.Wait(_logEventInfoQueue, 100))
                    lockTaken = false;
                else
                    InternalLogger.Trace("Entered critical section.");
            }

            currentCount = Interlocked.Read(ref _count);
        }

        if (lockTaken)
        {
            Monitor.PulseAll(_logEventInfoQueue);
        }
    }
    finally
    {
        if (lockTaken)
            Monitor.Exit(_logEventInfoQueue);
    }
    return currentCount;
}
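To illustrate the failure mode in isolation (a standalone sketch, not NLog code): when Monitor.Wait times out it has still re-acquired the lock before returning, so clearing lockTaken makes the finally block skip Monitor.Exit, and the lock is never released. Any other thread then blocks forever on Monitor.Enter.

```csharp
using System;
using System.Threading;

class LockLeakDemo
{
    static readonly object Gate = new object();

    static void Main()
    {
        bool lockTaken = false;
        try
        {
            Monitor.Enter(Gate);
            lockTaken = true;
            // Buggy pattern from the method above: Wait returns false on
            // timeout, but the lock HAS been re-acquired by then, so
            // clearing lockTaken "leaks" the lock.
            if (!Monitor.Wait(Gate, 50))
                lockTaken = false;
        }
        finally
        {
            if (lockTaken)
                Monitor.Exit(Gate); // skipped: the lock stays held forever
        }
        Console.WriteLine($"Lock still held: {Monitor.IsEntered(Gate)}");
        // Prints: Lock still held: True
        // Any other thread calling Monitor.Enter(Gate) now blocks indefinitely.
    }
}
```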

  • What is the current result?
    It still blocks even though the queue is empty at that point.
  • What is the expected result?
    No blocking after the queue becomes empty.
  • Did you check the internal log?
  • Please post full exception details (message, stacktrace, inner exceptions)
  • Are there any workarounds? yes/no
  • Is there a version in which it did work?
  • Can you help us by writing a unit test?

@snakefoot
Contributor

snakefoot commented May 22, 2019

You are completely right. For some reason I thought Monitor.Wait with a timeout would be a special throttle (and would not have re-acquired the lock when returning false).

But I can see that Monitor.Wait always re-acquires the lock, even if the timeout has expired.
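That semantics is easy to verify with a minimal snippet (hypothetical demo, not from the thread): even when Wait returns false on timeout, the calling thread owns the lock again by the time Wait returns.

```csharp
using System;
using System.Threading;

class WaitTimeoutDemo
{
    static void Main()
    {
        var gate = new object();
        lock (gate)
        {
            // Nobody pulses the monitor, so Wait returns false after the
            // timeout, but only after re-acquiring the lock.
            bool signaled = Monitor.Wait(gate, 100);
            Console.WriteLine($"signaled={signaled}, lockHeld={Monitor.IsEntered(gate)}");
            // Prints: signaled=False, lockHeld=True
        }
    }
}
```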

Very embarrassing. I have created PR #3415

@304NotModified 304NotModified added this to the 4.6.4 milestone May 22, 2019
@304NotModified 304NotModified added the bug Bug report / Bug fix label May 22, 2019
@yyjdelete
Author

yyjdelete commented May 23, 2019

Another edge case: try setting requestLimit in the code above to 2 or less (less than the number of threads doing Enqueue).

It seems the count should be decremented before Wait and incremented again after Wait to avoid that, and all comparisons against RequestLimit in WaitForBelowRequestLimit (>=) and TrySpinWaitForLowerCount (<) should be adjusted to match the changed count.
The timeout on Wait in WaitForBelowRequestLimit is still needed for the case where TryEnter in DequeueBatch does not succeed.

case AsyncTargetWrapperOverflowAction.Block:
    {
        Interlocked.Decrement(ref _count);
        currentCount = WaitForBelowRequestLimit();
        Interlocked.Increment(ref _count);
    }
    break;

BTW:
The docs for ConcurrentRequestQueue/AsyncRequestQueue.DequeueBatch say "(-1 means everything)", but -1 does not actually work and throws an ArgumentOutOfRangeException.

@snakefoot
Contributor

snakefoot commented May 23, 2019 via email

@snakefoot
Contributor

snakefoot commented May 23, 2019

@yyjdelete Updated #3416 to include Interlocked.Decrement and Interlocked.Increment when blocking. Though it is a special edge case, I prefer correct behavior independent of RequestLimit and the number of producer threads.

Thanks again.

@304NotModified
Member

fixed by #3415

@snakefoot
Contributor

@yyjdelete NLog 4.6.4 has been released: https://www.nuget.org/packages/NLog/4.6.4
