
[QUERY] Receiving Unauthorized Access Error on RenewToken Periodically with Azure Service Bus Queue Listener #11619

Closed
tstepanski opened this issue Apr 27, 2020 · 32 comments
Assignees: anqyan
Labels: Client (This issue points to a problem in the data-plane of the library.), customer-reported (Issues that are reported by GitHub users external to the Azure organization.), question (The issue doesn't require a change to the product in order to be resolved.), Service Attention (This issue is the responsibility of the Azure service team.), Service Bus

Comments

@tstepanski

tstepanski commented Apr 27, 2020

I am receiving a Microsoft.Azure.ServiceBus.ServiceBusException (message below, with sensitive information removed) periodically in my queue receiver. The SAS key has Send/Listen access, and the error seems inconsequential, as processing continues as normal. However, the message is creating a signal-to-noise problem in my dashboards (10-70 errors per day). Any ideas on why this is happening? The listener is running in an Azure App Service, but I don't think that matters. I have adjusted my retry logic to use a RetryExponential policy with a 1-second to 1-minute backoff and 5 retries.

Original StackOverflow Question

Packages

.NET Core 3.1

Microsoft.Azure.ServiceBus, Version=4.1.3.0, Culture=neutral, PublicKeyToken=7e34167dcc6d6d8c

Error Message

The link 'xxx;xxx:xxx:xxx:source(address:xxx):xxx' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://xxx.servicebus.windows.net/xxx'.. TrackingId:xxx, SystemTracker:xxx, Timestamp:2020-04-27T09:36:04 The link 'xxx;xxx:xxx:xxx:source(address:xxx):xxx' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://xxx.servicebus.windows.net/xxx'.. TrackingId:xxx, SystemTracker:xxx, Timestamp:2020-04-27T09:36:04

Source

internal delegate TClient ClientFactory<out TClient>(string connectionString, string entityPath,
    RetryPolicy retryPolicy);

internal delegate Task OnMessageCallback<in TMessage>(TMessage message,
    CancellationToken cancellationToken = default) where TMessage : ICorrelative;

internal sealed class ReceiverClientWrapper<TMessage> : IReceiverClientWrapper<TMessage>
    where TMessage : ICorrelative
{
    // ReSharper disable once StaticMemberInGenericType
    private static readonly Regex TransientConnectionErrorRegex =
        new Regex(
            @"(The link '([a-f0-9-]+);([0-9]*:)*source\(address:([a-z0-9_]+)\):([a-z0-9_]+)' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim\(s\) are required to perform this operation. Resource: 'sb:\/\/([a-z0-9-_.\/]+)'.. TrackingId:([a-z0-9_]+), SystemTracker:([a-z0-9]+), Timestamp:([0-9]{4}(-[0-9]{2}){2}T([0-9]{2}:){2}[0-9]{2}) )+",
            RegexOptions.Compiled | RegexOptions.Multiline | RegexOptions.IgnoreCase);

    private readonly IReceiverClient _receiverClient;
    private readonly IMessageConverter<TMessage> _messageConverter;
    private readonly ILogger _logger;
    private readonly int _maximumConcurrency;

    public ReceiverClientWrapper(IReceiverClient receiverClient, IMessageConverter<TMessage> messageConverter,
        ILogger logger, int maximumConcurrency)
    {
        _receiverClient = receiverClient;
        _messageConverter = messageConverter;
        _logger = logger;
        _maximumConcurrency = maximumConcurrency;
    }

    public Task SubscribeAsync(OnMessageCallback<TMessage> onMessageCallback,
        OnFailureCallback onFailureCallback, CancellationToken cancellationToken = default)
    {
        var messageHandlerOptions = CreateMessageHandlerOptions(onFailureCallback, cancellationToken);

        async Task Handler(Message message, CancellationToken token)
        {
            var convertedMessage = _messageConverter.Convert(message);

            await onMessageCallback(convertedMessage, cancellationToken);
            await _receiverClient.CompleteAsync(message.SystemProperties.LockToken);
        }

        _receiverClient.RegisterMessageHandler(Handler, messageHandlerOptions);

        return Task.CompletedTask;
    }

    private MessageHandlerOptions CreateMessageHandlerOptions(OnFailureCallback onFailureCallback,
        CancellationToken cancellationToken)
    {
        async Task HandleExceptionAsync(ExceptionReceivedEventArgs arguments)
        {
            var exception = arguments.Exception;

            if (TransientConnectionErrorRegex.IsMatch(exception.Message))
            {
                _logger.LogWarning(exception, @"Transient connectivity error occurred");

                return;
            }

            await onFailureCallback(exception, cancellationToken);
        }

        return new MessageHandlerOptions(HandleExceptionAsync)
        {
            AutoComplete = false,
            MaxConcurrentCalls = _maximumConcurrency
        };
    }

    public async ValueTask DisposeAsync()
    {
        await _receiverClient.CloseAsync();
    }
}

internal sealed class SenderClientWrapper<TMessage> : ISenderClientWrapper<TMessage> where TMessage : ICorrelative
{
    private readonly ISenderClient _senderClient;
    private readonly IMessageConverter<TMessage> _messageConverter;

    public SenderClientWrapper(ISenderClient senderClient, IMessageConverter<TMessage> messageConverter)
    {
        _senderClient = senderClient;
        _messageConverter = messageConverter;
    }

    public Task SendAsync(TMessage message, CancellationToken cancellationToken = default)
    {
        var internalMessage = _messageConverter.Convert(message);

        return _senderClient.SendAsync(internalMessage);
    }

    public Task SendAsync(IEnumerable<TMessage> messages, CancellationToken cancellationToken = default)
    {
        var internalMessages = messages
            .Select(_messageConverter.Convert)
            .ToImmutableArray();

        return _senderClient.SendAsync(internalMessages);
    }

    public async ValueTask DisposeAsync()
    {
        await _senderClient.CloseAsync();
    }
}

internal abstract class AbstractClientWrapperFactory
{
    private const int MaximumRetryCount = 5;
    private static readonly TimeSpan MinimumRetryBackOff = TimeSpan.FromSeconds(1);
    private static readonly TimeSpan MaximumRetryBackOff = TimeSpan.FromMinutes(1);

    protected AbstractClientWrapperFactory(IOptions<MessageBusConfiguration> options)
    {
        Options = options;
    }

    protected IOptions<MessageBusConfiguration> Options { get; }

    protected static string GetEntityPath<TMessage>() where TMessage : class
    {
        var messageAttribute = typeof(TMessage).GetCustomAttribute<AbstractMessageAttribute>();

        if (messageAttribute == null)
        {
            throw new ArgumentException($@"Message requires {nameof(AbstractMessageAttribute)}");
        }

        return messageAttribute.EntityName;
    }

    protected TClient CreateClientEntity<TMessage, TClient>(ClientFactory<TClient> clientFactory)
        where TMessage : class
    {
        var entityPath = GetEntityPath<TMessage>();
        var retryPolicy = CreateRetryPolicy();

        return clientFactory(Options.Value.ConnectionString, entityPath, retryPolicy);
    }

    protected static IQueueClient QueueClientFactory(string connectionString, string entityPath,
        RetryPolicy retryPolicy)
    {
        return new QueueClient(connectionString, entityPath, retryPolicy: retryPolicy);
    }

    private static RetryPolicy CreateRetryPolicy()
    {
        return new RetryExponential(MinimumRetryBackOff, MaximumRetryBackOff, MaximumRetryCount);
    }
}

internal sealed class SenderClientWrapperFactory : AbstractClientWrapperFactory, ISenderClientWrapperFactory
{
    private readonly IMessageConverterFactory _messageConverterFactory;

    public SenderClientWrapperFactory(IMessageConverterFactory messageConverterFactory,
        IOptions<MessageBusConfiguration> options) : base(options)
    {
        _messageConverterFactory = messageConverterFactory;
    }

    public ISenderClientWrapper<TEvent> CreateTopicClient<TEvent>() where TEvent : class, IEvent
    {
        return CreateWrapper<TEvent, ITopicClient>(TopicClientFactory);
    }

    public ISenderClientWrapper<TRequest> CreateQueueClient<TRequest>() where TRequest : class, IRequest
    {
        return CreateWrapper<TRequest, IQueueClient>(QueueClientFactory);
    }

    private ISenderClientWrapper<TMessage> CreateWrapper<TMessage, TClient>(ClientFactory<TClient> clientFactory)
        where TMessage : class, ICorrelative
        where TClient : ISenderClient
    {
        var clientEntity = CreateClientEntity<TMessage, TClient>(clientFactory);
        var messageConverter = _messageConverterFactory.Create<TMessage>();

        return new SenderClientWrapper<TMessage>(clientEntity, messageConverter);
    }

    private static ITopicClient TopicClientFactory(string connectionString, string entityPath,
        RetryPolicy retryPolicy)
    {
        return new TopicClient(connectionString, entityPath, retryPolicy);
    }
}

internal sealed class ReceiverClientWrapperFactory : AbstractClientWrapperFactory, IReceiverClientWrapperFactory
{
    private readonly IMessageConverterFactory _messageConverterFactory;
    private readonly ILogger<ReceiverClientWrapperFactory> _logger;

    public ReceiverClientWrapperFactory(IOptions<MessageBusConfiguration> options,
        IMessageConverterFactory messageConverterFactory,
        ILogger<ReceiverClientWrapperFactory> logger) : base(options)
    {
        _messageConverterFactory = messageConverterFactory;
        _logger = logger;
    }

    public IReceiverClientWrapper<TEvent> CreateTopicClient<TEvent>() where TEvent : class, IEvent
    {
        return CreateReceiverClientWrapper<TEvent, ISubscriptionClient>(SubscriptionClientFactory);
    }

    public IReceiverClientWrapper<TRequest> CreateQueueClient<TRequest>() where TRequest : class, IRequest
    {
        return CreateReceiverClientWrapper<TRequest, IQueueClient>(QueueClientFactory);
    }

    private IReceiverClientWrapper<TMessage> CreateReceiverClientWrapper<TMessage, TClient>(
        ClientFactory<TClient> clientFactory)
        where TMessage : class, ICorrelative
        where TClient : IReceiverClient
    {
        var clientEntity = CreateClientEntity<TMessage, TClient>(clientFactory);
        var messageConverter = _messageConverterFactory.Create<TMessage>();

        return new ReceiverClientWrapper<TMessage>(clientEntity, messageConverter, _logger,
            Options.Value.MaximumConcurrency);
    }

    private ISubscriptionClient SubscriptionClientFactory(string connectionString, string entityPath,
        RetryPolicy retryPolicy)
    {
        return new SubscriptionClient(connectionString, entityPath, Options.Value.SubscriberName,
            retryPolicy: retryPolicy);
    }
}

internal sealed class RequestService<TRequest> : IRequestService<TRequest> where TRequest : class, IRequest
{
    private readonly Lazy<ISenderClientWrapper<TRequest>> _senderClient;
    private readonly Lazy<IReceiverClientWrapper<TRequest>> _receiverClient;

    public RequestService(ISenderClientWrapperFactory senderClientWrapperFactory,
        IReceiverClientWrapperFactory receiverClientWrapperFactory)
    {
        _senderClient =
            new Lazy<ISenderClientWrapper<TRequest>>(senderClientWrapperFactory.CreateQueueClient<TRequest>,
                LazyThreadSafetyMode.PublicationOnly);

        _receiverClient
            = new Lazy<IReceiverClientWrapper<TRequest>>(receiverClientWrapperFactory.CreateQueueClient<TRequest>,
                LazyThreadSafetyMode.PublicationOnly);
    }

    public Task PublishRequestAsync(TRequest requestMessage, CancellationToken cancellationToken = default)
    {
        return _senderClient.Value.SendAsync(requestMessage, cancellationToken);
    }

    public Task PublishRequestAsync(IEnumerable<TRequest> requestMessages,
        CancellationToken cancellationToken = default)
    {
        return _senderClient.Value.SendAsync(requestMessages, cancellationToken);
    }

    public Task SubscribeAsync(OnRequestCallback<TRequest> onRequestCallback, OnFailureCallback onFailureCallback,
        CancellationToken cancellationToken = default)
    {
        return _receiverClient
            .Value
            .SubscribeAsync((message, token) => onRequestCallback(message, cancellationToken), onFailureCallback,
                cancellationToken);
    }

    public async ValueTask DisposeAsync()
    {
        if (_senderClient.IsValueCreated)
        {
            await _senderClient.Value.DisposeAsync();
        }

        if (_receiverClient.IsValueCreated)
        {
            await _receiverClient.Value.DisposeAsync();
        }
    }

    public Task ThrowIfNotReadyAsync(CancellationToken cancellationToken = default)
    {
        return _senderClient.Value.SendAsync(ImmutableArray<TRequest>.Empty, cancellationToken);
    }
}
ghost added the needs-triage, customer-reported, and question labels on Apr 27, 2020
AlexGhiondea added the Service Attention and Service Bus labels on Apr 27, 2020
@ghost

ghost commented Apr 27, 2020

Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @jfggdl.

ghost removed the needs-triage label on Apr 27, 2020
nemakam assigned anqyan and unassigned nemakam on Apr 28, 2020
@shankarsama

@3437CasaVerde Please take a look at this issue.

@anqyan

anqyan commented Apr 29, 2020

Hi @tstepanski, is it possible to share with us the details of
Resource: 'sb://xxx.servicebus.windows.net/xxx'.. TrackingId:xxx
that you noted in the error message? Basically the resource and tracking ID. Thanks.

@tstepanski
Author

@3437CasaVerde For security purposes, would I be able to PM those details to you?

@anqyan

anqyan commented Apr 29, 2020

@tstepanski Yes please. My work email is anqyan@microsoft.com.

@tstepanski
Author

@3437CasaVerde I have sent you some examples.

I will relay any diagnosis or solution details shared via email back to this GitHub issue after removing any sensitive data.

@anqyan

anqyan commented Apr 30, 2020

Hey @tstepanski, I checked a few examples you emailed me and the logs on my end. It looks like the SAS token expired and Service Bus was trying to re-authorize the connection when it got this exception. I can do a wider and deeper investigation.

Meanwhile, could you please:

  1. In your client code, use the connection string from the Azure Portal: Home > [your Service Bus Namespace] > Shared access policies > RootManageSharedAccessKey > Primary Connection String (see the sketch after this list)
  2. Then restart your App Service
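
For reference, a minimal sketch of what step 1 could look like when wired into the QueueClientFactory pattern from the source above; the namespace, queue name, and key are placeholders, not values from this issue:

using System;
using Microsoft.Azure.ServiceBus;

internal static class RootKeyConnectionExample
{
    // Primary Connection String copied from: Home > [your Service Bus Namespace] >
    // Shared access policies > RootManageSharedAccessKey (placeholder values only).
    private const string ConnectionString =
        "Endpoint=sb://<your-namespace>.servicebus.windows.net/;" +
        "SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>";

    internal static IQueueClient CreateQueueClient(string entityPath)
    {
        // Same retry settings as CreateRetryPolicy() in the source above:
        // 1 second minimum back-off, 1 minute maximum back-off, 5 retries.
        var retryPolicy = new RetryExponential(TimeSpan.FromSeconds(1), TimeSpan.FromMinutes(1), 5);

        return new QueueClient(ConnectionString, entityPath, retryPolicy: retryPolicy);
    }
}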

@tstepanski
Author

@3437CasaVerde Thank you for checking on that, very odd. As you likely saw, I am using a shared access policy I created with Listen/Read privileges, but I can use RootManageSharedAccessKey for the time being. I will do that and check after a few hours to see if any more exceptions have been caught.

@tstepanski
Author

@3437CasaVerde After the move to RootManageSharedAccessKey, I have not gotten any more renewal exceptions, but I am seeing ServiceBusCommunicationExceptions with an inner OperationCanceledException. I don't know whether this is related or not. I have multiple receivers (several topics/queues) and they all seem to disconnect at roughly the same time. If it is an expiration problem, they likely all expire and then reconnect together. This is obviously problematic; I would like it to be handled inside the SDK if it is a known condition of Azure Service Bus. In addition, using RootManageSharedAccessKey is also a long-term security risk, so your investigation is still invaluable. I don't know if this helps or increases confusion.

@AlexGhiondea AlexGhiondea added the Client This issue points to a problem in the data-plane of the library. label May 4, 2020
@tstepanski
Author

@AlexGhiondea Thanks for that!

@anqyan Any updates?

@cpunella

Hi all, we have the same issue, but in our case we cannot use the RootManageSharedAccessKey for security purposes! What can we do?

It seems to be a breaking change between the old client (.NET Full Framework) and the new one. We started getting this error after upgrading.

@anqyan, can I send you the TrackingId so you can verify on your side that you see the same error as in this topic?

Thanks

@anqyan

anqyan commented May 26, 2020

Hi @cpunella, yes, please send the related client-side logs to anqyan@microsoft.com. Thanks.

@marcelvwe

Hi, we are receiving the same error. I've sent the related logs via email.

We are not using the RootManageSharedAccessKey itself, but an alternative access key generated with Manage, Send, and Listen claims.

@slandsaw

slandsaw commented Jun 8, 2020

Some of our devs are experiencing the same issue. I would be very interested in any updates on this issue.

@yuhaii

yuhaii commented Jun 12, 2020

Our customer sends scheduled messages in an Azure Function app and hit a similar issue: "Unauthorized access. 'Send' claim(s) are required to perform this operation".

But after rebooting the function app, the issue was mitigated.

Hello @anqyan, could you please advise? Is it an SDK issue? Thanks.

@wiliambuzatto

wiliambuzatto commented Jun 15, 2020

Hi, I am having the same issue, but it happens every one to two days. When it happens, it shows more than 20 entries in Application Insights (screenshot below).

Another curiosity: I have seven queues in this Service Bus namespace, but only two of them are logging the issue.

[Screenshot: Application Insights error entries]

@edit:
Error Message:
The link '73770380-dde5-4a3f-a6e1-24611250fab1;32:37:38:source(address:myQueueName):MessageReceiver2myQueueName' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://myServiceBusName.servicebus.windows.net/myQueueName'.. TrackingId:ef04de1bd38a4440b4a8f0bc6c7984d7_G19, SystemTracker:gateway7, Timestamp:2020-06-15T11:14:19Reference: ef04de1bd38a4440b4a8f0bc6c7984d7_G19, 6/15/2020 11:14:20 AM The link '73770380-dde5-4a3f-a6e1-24611250fab1;32:37:38:source(address:myQueueName):MessageReceiver2myQueueName' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://myServiceBusName.servicebus.windows.net/myQueueName'.. TrackingId:ef04de1bd38a4440b4a8f0bc6c7984d7_G19, SystemTracker:gateway7, Timestamp:2020-06-15T11:14:19

@anqyan

anqyan commented Jul 15, 2020

Hi @wiliambuzatto @yuhaii @slandsaw @marcelvwe, how often have you/your customers encountered this exception recently?
Is it possible for you to send the exception logs from the past month to anqyan@microsoft.com?

@danmgs

danmgs commented Jul 25, 2020

Hi,

Same here. I have a console app subscribing to messages from Service Bus.
I let the console run for several days, and when I come back I see the same (transient?) exception, "ServiceBusException".

My console .exe uses:

  • Nuget Package Microsoft.Azure.ServiceBus 4.1.3
  • Listen policy token

Below:
2020-07-24T17:43:45.9505394+02:00 => OK, message received on 24 July (expected behaviour)
2020-07-25T01:34:38.6155008+02:00 => ERROR on 25 July.

Logs:

2020-07-24T17:43:45.9505394+02:00 [INF] (ConsoleApp.AzureServicesBus.Subscriber/xxxxx/17) Received message with transaction Id=357bbb40
2020-07-25T01:34:38.6155008+02:00 [ERR] (ConsoleApp.AzureServicesBus.Subscriber/xxxxx/40) Message handler encountered an exception Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link '995xxxxx;61:66:67:source(address:xxxxx/xxxxx/xxxxx):MessageReceiver2xxxxx/xxxxx/xxxxx' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://xxxxx'.. TrackingId:xxxxx, SystemTracker:gateway7, Timestamp:2020-07-24T23:34:38
 ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link '995cad87-b390-4d64-847b-f708946dcd9a;61:66:67:source(address:xxxxx):MessageReceiver2xxxxx/Subscriptions/xxxxx' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://xxxxx'.. TrackingId:xxxxx, SystemTracker:gateway7, Timestamp:2020-07-24T23:34:38
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)
   --- End of inner exception stack trace ---
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass64_0.<<ReceiveAsync>b__0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__11_0().
Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link '995cad87-b390-4d64-847b-f708946dcd9a;61:66:67:source(address:xxxxx):MessageReceiver2xxxxx/Subscriptions/xxxxx' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://xxxxx'.. TrackingId:xxxxx, SystemTracker:gateway7, Timestamp:2020-07-24T23:34:38
 ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link '995cad87-b390-4d64-847b-f708946dcd9a;61:66:67:source(address:xxxxx):MessageReceiver2xxxxx/Subscriptions/xxxxx' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://xxxxx'.. TrackingId:xxxxx, SystemTracker:gateway7, Timestamp:2020-07-24T23:34:38
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)
   --- End of inner exception stack trace ---
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass64_0.<<ReceiveAsync>b__0>d.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)
   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__11_0()
2020-07-25T01:34:38.6516424+02:00 [INF] (ConsoleApp.AzureServicesBus.Subscriber/xxxxx/40) Exception context for troubleshooting:
2020-07-25T01:34:38.6540578+02:00 [INF] (ConsoleApp.AzureServicesBus.Subscriber/xxxxx/40) - Endpoint: xxxxx.servicebus.windows.net
2020-07-25T01:34:38.6568597+02:00 [INF] (ConsoleApp.AzureServicesBus.Subscriber/xxxxx/40) - Entity Path: xxxxx
2020-07-25T01:34:38.6596546+02:00 [INF] (ConsoleApp.AzureServicesBus.Subscriber/xxxxx/40) - Executing Action: Receive

I see nothing in the dead-letter queue though.
It's happening at different times of the day, sporadically.

@anqyan

anqyan commented Aug 19, 2020

Hi all,

The exception is caused by CBS (claims-based security) token re-authorization failures in Service Bus. How CBS re-authorization works in Service Bus: the client sends a CBS token to put security-related access info into the service-side cache. On the service side there is a timer that schedules re-authorization of the client's access info. Normally the client sends the token before the timer fires, so the AMQP link between the client and Service Bus can keep working.

So, in this case, there could be failures or errors on the client side (inside the Microsoft.Azure.ServiceBus SDK) that prevented the token from being sent to the service before the timer fired. Hence the Listen claim of the AMQP link expired and was deleted by the service. To learn more about what happens in the client around this exception, we need to enable tracing in the SDK so that MessagingEventSource can log useful information (an in-process alternative is sketched after the list):

  1. Download PerfView: https://github.com/microsoft/perfview/blob/master/documentation/Downloading.md
  2. Run PerfView.exe /OnlyProviders=*Microsoft-Azure-ServiceBus Collect alongside your application.
  3. Stop the collection of traces once the exception occurs. Given the random nature of this exception, this step will be difficult.
  4. Share the .etl files with anqyan@microsoft.com
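
If running PerfView alongside the application is not practical, a minimal in-process sketch (my own illustration, not an official collection method; the .etl files above are still what is being requested) can subscribe to the same Microsoft-Azure-ServiceBus provider named in step 2 using System.Diagnostics.Tracing.EventListener:

using System;
using System.Diagnostics.Tracing;
using System.Linq;

// Listens to the SDK's "Microsoft-Azure-ServiceBus" EventSource (the provider the
// PerfView command above collects) and writes each event it emits to the console.
internal sealed class ServiceBusEventListener : EventListener
{
    protected override void OnEventSourceCreated(EventSource eventSource)
    {
        if (eventSource.Name == "Microsoft-Azure-ServiceBus")
        {
            EnableEvents(eventSource, EventLevel.Verbose);
        }
    }

    protected override void OnEventWritten(EventWrittenEventArgs eventData)
    {
        var payload = eventData.Payload == null
            ? string.Empty
            : string.Join(", ", eventData.Payload.Select(value => value?.ToString()));

        Console.WriteLine($"{DateTime.UtcNow:O} {eventData.EventName}: {payload}");
    }
}

// Usage: construct a single instance at startup and keep it referenced for the lifetime
// of the process, e.g.
//     private static readonly ServiceBusEventListener TraceListener = new ServiceBusEventListener();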

Meanwhile, please treat this exception as not impacting productivity in your services.

@RemcoBlok

RemcoBlok commented Sep 7, 2020

@anqyan I pm'd you last week with traces I collected using PerfView.

Regarding your comment 'Meanwhile please treat this exception as not productivity impacting in your services.', my experience is that once the error occurs the client is unusable and my application becomes unavailable. I would treat this as a critical issue.

@eriksteinebach

Hi, we are also experiencing this exception. We are using the RootManageSharedAccessKey for the connection. I see it in multiple environments, but more often in our QA environment than in our PROD environment, so maybe it happens more often when the environment is not as busy?

A quick test showed me that the subscription was still active and picking up messages after receiving this error, so for us it doesn't look critical. Maybe it is due to the way we handle the exception?
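
On the handling question, a minimal sketch of one possible approach (my own illustration, not something prescribed in this thread): check ServiceBusException.IsTransient inside the ExceptionReceivedHandler and log such errors as warnings instead of failures; the serialized exceptions posted later in this thread show these RenewToken force-detach errors with IsTransient set to true:

using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;
using Microsoft.Extensions.Logging;

internal sealed class TransientAwareExceptionHandler
{
    private readonly ILogger _logger;

    public TransientAwareExceptionHandler(ILogger logger)
    {
        _logger = logger;
    }

    // Intended to be passed to MessageHandlerOptions as the exceptionReceivedHandler,
    // e.g. new MessageHandlerOptions(handler.HandleAsync) { AutoComplete = false }.
    public Task HandleAsync(ExceptionReceivedEventArgs args)
    {
        // The SDK marks the RenewToken force-detach errors seen in this thread as transient.
        if (args.Exception is ServiceBusException serviceBusException && serviceBusException.IsTransient)
        {
            _logger.LogWarning(serviceBusException, "Transient Service Bus error during {Action}",
                args.ExceptionReceivedContext.Action);
        }
        else
        {
            _logger.LogError(args.Exception, "Service Bus error during {Action}",
                args.ExceptionReceivedContext.Action);
        }

        return Task.CompletedTask;
    }
}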

@MartinDemberger

In the last few days this error has occurred often on all three of our systems.
We use RootManageSharedAccessKey.

@iamandymcinnes

iamandymcinnes commented Nov 9, 2020

I've just updated to .NET Core 3.1 on the queue services we run in AKS and our function apps, and I see this all the time since updating to the new Service Bus library. I also use the RootManageSharedAccessKey, as we use MassTransit in most of our system and then just use the new management client to create some subscriptions and queues for the function apps and the bits that don't use MassTransit. I get three "errors" now that I didn't see before. As far as I can tell these don't stop messages getting processed, but they make your logs look like something is going wrong and don't fill me with confidence that there isn't a problem:

Exception on Receiver during "Receive" The connection was inactive for more than the allowed 60000 milliseconds and is closed by container

Exception on Receiver during "Receive" The link is force detached. Code: RenewToken. Details: Unauthorized access. The access has expired.

The link is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation.

@iamandymcinnes

iamandymcinnes commented Nov 9, 2020

@anqyan Have you managed to see anything in the traces others have already sent you regarding this? My experience so far: I updated from .NET Core 2.2 to .NET Core 3.1 and updated the associated Service Bus libraries. I also updated my function apps to v3 and then reprovisioned with our existing provisioning mechanism. I also trashed our web apps, function apps, AKS pods, and bus, as we repopulate the SAS token as part of the provisioning mechanism. This worked, as I've seen it successfully process messages and all of our smoke tests have passed, but I now see these intermittent issues. To me it feels very much like something has changed, perhaps around the token refresh or connection handling in the SDK, which is making it flake intermittently as we are seeing, or you are now surfacing errors that were previously being swallowed. Perhaps it's around some longer timeout scenario, as others have pointed out; my findings also support this, as some of the message handlers on our bus only run very intermittently, i.e. with gaps of days between them. Here is an example from one of my AKS pods: it successfully receives a message, then we see numerous errors in the following hours:

2020-11-09T10:27:03.515916049Z {"@t":"2020-11-09T10:27:03.5156071Z","@m":"RECEIVE sb://mybus.servicebus.windows.net/BrandSensitivities 3a8d0000-ffdb-0003-f852-08d88499fd2f \"Project.Pricing.Refinement.Messages.GetBrandSensitivitiesMessage\" \"Project.Pricing.Refinement.Worker.BrandSensitivitiesConsumer\"(00:00:03.8436958)","@i":"a4f71ef5","@l":"Debug","InputAddress":"sb://mybus.servicebus.windows.net/BrandSensitivities","MessageId":"3a8d0000-ffdb-0003-f852-08d88499fd2f","MessageType":"Project.Pricing.Refinement.Messages.GetBrandSensitivitiesMessage","ConsumerType":"Project.Pricing.Refinement.Worker.BrandSensitivitiesConsumer","Duration":"00:00:03.8436958","SourceContext":"MassTransit.ReceiveTransport","MemoryUsage":11415904,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":31,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T12:50:40.920099337Z {"@t":"2020-11-09T12:50:40.9176899Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/SubdailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/SubdailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4463840,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":20,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T12:50:40.922044439Z {"@t":"2020-11-09T12:50:40.9205822Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/SubdailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/SubdailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4635072,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":20,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T12:50:40.923672140Z {"@t":"2020-11-09T12:50:40.9224378Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/SubdailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/SubdailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The connection was inactive for more than the allowed 60000 milliseconds and is closed by container 'ee053a7b-a226-4129-94fd-e40d148e6fcb'.","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4798912,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":20,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:40.861460164Z {"@t":"2020-11-09T14:00:40.8588899Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/SubdailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/SubdailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4324560,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":104,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:40.865337267Z {"@t":"2020-11-09T14:00:40.8625839Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/SubdailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/SubdailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4529240,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":104,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:40.868602871Z {"@t":"2020-11-09T14:00:40.8662850Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/SubdailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/SubdailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:370:371:source(address:SubdailyForecast):MessageReceiver7SubdailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/subdailyforecast'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:40","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4717576,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":104,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:43.799076806Z {"@t":"2020-11-09T14:00:43.7970091Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/DailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/DailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":4993496,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":105,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:43.799328207Z {"@t":"2020-11-09T14:00:43.7976612Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/BrandSensitivities during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/BrandSensitivities","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":5066160,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":104,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:43.802085809Z {"@t":"2020-11-09T14:00:43.8008135Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/BrandSensitivities during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/BrandSensitivities","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":5460720,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":104,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:43.803631811Z {"@t":"2020-11-09T14:00:43.8025844Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/BrandSensitivities during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/BrandSensitivities","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:372:373:source(address:BrandSensitivities):MessageReceiver5BrandSensitivities' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/brandsensitivities'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":5657096,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":104,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:43.804809712Z {"@t":"2020-11-09T14:00:43.8000813Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/DailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/DailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":5363880,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":105,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}
2020-11-09T14:00:43.806656714Z {"@t":"2020-11-09T14:00:43.8055375Z","@m":"Exception on Receiver sb://mybus.servicebus.windows.net/DailyForecast during \"Receive\" ActiveDispatchCount(0) ErrorRequiresRecycle(False)","@i":"9bb9e1e5","@l":"Error","@x":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n ---> Microsoft.Azure.ServiceBus.ServiceBusException: The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   --- End of inner exception stack trace ---\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.OnReceiveAsync(Int32 maxMessageCount, TimeSpan serverWaitTime)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.<>c__DisplayClass65_0.<<ReceiveAsync>b__0>d.MoveNext()\n--- End of stack trace from previous location where exception was thrown ---\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.RetryPolicy.RunOperation(Func`1 operation, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(Int32 maxMessageCount, TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.Core.MessageReceiver.ReceiveAsync(TimeSpan operationTimeout)\n   at Microsoft.Azure.ServiceBus.MessageReceivePump.<MessagePumpTaskAsync>b__12_0()","InputAddress":"sb://mybus.servicebus.windows.net/DailyForecast","Action":"Receive","activeDispatch":0,"requiresRecycle":false,"SourceContext":"MassTransit","ExceptionDetail":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","InnerException":{"HResult":-2146233088,"Message":"The link 'c7693fab-a071-42e9-818d-8b47d3ae82da;365:374:375:source(address:DailyForecast):MessageReceiver6DailyForecast' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. Resource: 'sb://mybus.servicebus.windows.net/dailyforecast'.. 
TrackingId:6fff8a567b7f45e3b0febbbb1bb01703_G46, SystemTracker:gateway7, Timestamp:2020-11-09T14:00:43","Source":"Microsoft.Azure.ServiceBus","IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusException"},"IsTransient":true,"ServiceBusNamespace":null,"Type":"Microsoft.Azure.ServiceBus.ServiceBusCommunicationException"},"MemoryUsage":5951064,"MachineName":"refinement-worker-66974f4b56-28rsn","ThreadId":105,"ProcessId":1,"ProcessName":"dotnet","AssemblyName":"Project.Pricing.Refinement.Worker","AssemblyVersion":"1.0.0.0"}

@bandmask

As of Sunday 2020-12-13, we started seeing this as well. We are using the root managed shared access key.
Our services become unavailable and our customers are affected. This is critical for us, and we need to treat it as high impact.
Is there any update on this? @anqyan

@anqyan

anqyan commented Dec 15, 2020

@bandmask could you please send your namespace and a detailed exception stack trace to anqyan@microsoft.com?

@EnricoMassone

EnricoMassone commented Jan 22, 2021

Hi all,

I have followed this issue for a while. Some of the errors reported here are the same ones we are getting in our application. I have opened another GitHub issue (#18035); consider reading the discussion there if you are seeing similar errors.

The good news is that these errors seem to be transient and are handled automatically by the retry policies of the .NET SDK for Azure Service Bus.
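
As an illustration, here is a minimal sketch of how a retry policy can be supplied to a receiver with the older Microsoft.Azure.ServiceBus package; the connection string, queue name, and backoff values are placeholders (the queue name is just taken from the logs above):

using System;
using Microsoft.Azure.ServiceBus;

internal static class ReceiverSetup
{
    public static QueueClient CreateQueueClient()
    {
        // Placeholder values -- substitute your own namespace, key, and queue.
        const string connectionString =
            "Endpoint=sb://mybus.servicebus.windows.net/;SharedAccessKeyName=listen;SharedAccessKey=<key>";
        const string queueName = "BrandSensitivities";

        // Exponential backoff so transient RenewToken detaches are retried by the
        // SDK instead of bubbling up on the first failure.
        var retryPolicy = new RetryExponential(
            TimeSpan.FromSeconds(1),  // minimum backoff
            TimeSpan.FromMinutes(1),  // maximum backoff
            5);                       // maximum retry count

        return new QueueClient(connectionString, queueName, ReceiveMode.PeekLock, retryPolicy);
    }
}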

@anqyan

anqyan commented Feb 9, 2021

Closing the issue since not enough data was gathered from the client SDK. If you experience this again, please follow the instructions to capture SDK-level logs and reopen this issue.

@anqyan anqyan closed this as completed Feb 9, 2021
@andrea-cassioli-maersk

We have just experienced the very same issue for the first time:

Exception thrown: Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'ZZZ:source(address:XXX):MessageReceiver2YYY' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. 

It is not clear why this happens. We are not using tokens to authorise, so we are not sure why there should be a RenewToken error.
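
Even when authenticating with a plain connection string, the SDK internally derives short-lived SAS tokens from the shared access key and renews them periodically over the AMQP link; the RenewToken code presumably refers to that renewal being rejected. A minimal sketch of generating such a token with the older SDK's TokenProvider, where the key name, key, and entity URI are placeholders and the member names are quoted from memory:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus.Primitives;

internal static class SasTokenDemo
{
    public static async Task ShowTokenAsync()
    {
        // Placeholder key name and key -- normally parsed out of the connection string.
        var tokenProvider = TokenProvider.CreateSharedAccessSignatureTokenProvider(
            "RootManageSharedAccessKey",
            "<shared-access-key>");

        // The SDK does something similar internally: it mints a short-lived token scoped
        // to the entity and renews it before expiry (the "RenewToken" operation on the link).
        var token = await tokenProvider.GetTokenAsync(
            "sb://mybus.servicebus.windows.net/BrandSensitivities",
            TimeSpan.FromSeconds(30));

        Console.WriteLine($"Token expires at {token.ExpiresAtUtc:O}");
    }
}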

@EnricoMassone

We have just experienced the very same issue for the first time:

Exception thrown: Microsoft.Azure.ServiceBus.ServiceBusCommunicationException: The link 'ZZZ:source(address:XXX):MessageReceiver2YYY' is force detached. Code: RenewToken. Details: Unauthorized access. 'Listen' claim(s) are required to perform this operation. 

It is not clear why this happens. We are not using tokens to authorise, so we are not sure why there should be a RenewToken error.

Hi @andrea-cassioli-maersk, are you using the root access key (RootManageSharedAccessKey in the Azure portal) to authenticate with the Service Bus?

@andrea-cassioli-maersk

@EnricoMassone we use the primary connection string from RootManageSharedAccessKey

@EnricoMassone

EnricoMassone commented Mar 10, 2021

@EnricoMassone we use the primary connection string from RootManageSharedAccessKey

That's the same for us. That error, based on my understanding, does not make sense because by definition the root access key should have all the permission claims.

We are getting several strange and unclear errors from the .NET SDK in an ASP.NET Core 2.2 application. Our errors are described in this issue. We are not currently able to reproduce them consistently.

I may have some good news.

We have rewritten our code to interact with Azure Service Bus using the new .NET SDK, Azure.Messaging.ServiceBus. We are still testing the new implementation, but it seems the errors have disappeared, so I would suggest evaluating its adoption as a mitigation strategy.
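
A minimal sketch of a receiver built on the new package, assuming a queue named BrandSensitivities and a placeholder connection string:

using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

internal static class NewSdkReceiver
{
    public static async Task RunAsync()
    {
        // Placeholder connection string; the queue name is only an example.
        await using var client = new ServiceBusClient("<connection-string>");
        await using var processor = client.CreateProcessor("BrandSensitivities", new ServiceBusProcessorOptions
        {
            MaxConcurrentCalls = 1,
            AutoCompleteMessages = false
        });

        processor.ProcessMessageAsync += async args =>
        {
            Console.WriteLine($"Received {args.Message.MessageId}");
            await args.CompleteMessageAsync(args.Message);
        };

        // Transient failures are retried internally; only errors that survive the
        // retry policy reach this handler.
        processor.ProcessErrorAsync += args =>
        {
            Console.WriteLine($"Error on {args.EntityPath}: {args.Exception.Message}");
            return Task.CompletedTask;
        };

        await processor.StartProcessingAsync();
        await Task.Delay(TimeSpan.FromMinutes(5)); // run for a while, then shut down cleanly
        await processor.StopProcessingAsync();
    }
}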

If you still want to use the Microsoft.Azure.ServiceBus package (instead of the new one mentioned above), you can follow this guide to programmatically capture all the logs raised by the Microsoft.Azure.ServiceBus code. By doing so, you get verbose logging of all the operations performed by the library. I used these logs to open my own issue. I still haven't understood the root cause of the errors, but at least I now have plenty of exception stack traces. You can use them to open a ticket with Azure support if you like.
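
The guide describes hooking the SDK's EventSource output; below is a generic sketch of that approach with a System.Diagnostics.Tracing.EventListener. The event source name "Microsoft-Azure-ServiceBus" is an assumption about what the older package registers, so verify it against the guide before relying on it.

using System;
using System.Diagnostics.Tracing;

internal sealed class ServiceBusEventListener : EventListener
{
    protected override void OnEventSourceCreated(EventSource eventSource)
    {
        // Assumed name of the event source used by the Microsoft.Azure.ServiceBus package.
        if (eventSource.Name == "Microsoft-Azure-ServiceBus")
        {
            EnableEvents(eventSource, EventLevel.Verbose);
        }
    }

    protected override void OnEventWritten(EventWrittenEventArgs eventData)
    {
        // Forward to your logger of choice; Console is used here for brevity.
        var payload = eventData.Payload is null ? string.Empty : string.Join(", ", eventData.Payload);
        Console.WriteLine($"[{eventData.Level}] {eventData.EventName}: {payload}");
    }
}

Create the listener once at startup (for example, var listener = new ServiceBusEventListener();) and keep the reference alive for the lifetime of the process so it keeps receiving events.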

On our side, we hope that the new Azure.Messaging.ServiceBus SDK has finally solved our troubles. Colleagues from other teams suggested we give it a shot, because they have seen several improvements in their own projects after adopting it.

@github-actions github-actions bot locked and limited conversation to collaborators Mar 28, 2023