I am using this package to develop a publisher service. The service simply takes a CSV data file, reads it line by line, and sends an MQTT message for each line. This works nicely, but after some time logrus shows this error message in the log:
ERRO[4733] Error publishing message error="Packet Identifiers are exhausted"
What happened here? How can I prevent this? Is the message lost, or will the client automatically retry sending it?
EDIT: Restarting the service solves the problem; it then works for a few minutes before the error shows up again.
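For context, the publish loop looks roughly like the sketch below. This is illustrative only: it uses the Eclipse Paho Go client (the package in this issue may differ), and the broker URL, client ID, file name, and topic are placeholders.

```go
package main

import (
	"bufio"
	"log"
	"os"

	mqtt "github.com/eclipse/paho.mqtt.golang"
)

func main() {
	// Placeholder broker URL and client ID.
	opts := mqtt.NewClientOptions().
		AddBroker("tcp://localhost:1883").
		SetClientID("csv-publisher")
	client := mqtt.NewClient(opts)
	if token := client.Connect(); token.Wait() && token.Error() != nil {
		log.Fatal(token.Error())
	}

	f, err := os.Open("data.csv") // placeholder file name
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		// QoS 1: every publish holds a packet identifier until the
		// broker's PUBACK arrives. Firing these off in a tight loop
		// without ever waiting on the token can exhaust the 65535
		// available identifiers.
		_ = client.Publish("csv/lines", 1, false, scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}
```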
So basically it seems that you're hitting the limit of 65535 in-flight packets: MQTT packet identifiers are two-byte, non-zero values, and an entry in the sendingPackets map is only cleaned up once the PUBLISH operation gets an ACK.
Are you sure you are not publishing messages much faster than the MQTT server can handle them (a slow connection to the server, or a server with a slow CPU, for example)? Or is there a bug in the MQTT server where the packets simply never get ACKed? 65535 is quite a generous number of in-flight PUBLISHes to allow, so something must be going horribly wrong, either in what you are doing or in a bug you are hitting.
Are you publishing concurrently from many threads?
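Whatever the client package, one general way out is to cap the number of unacknowledged publishes so the packet-identifier space can never be exhausted. A minimal sketch, again written against the Eclipse Paho Go client for illustration; the in-flight limit of 1000 and the topic name are arbitrary placeholders:

```go
package main

import (
	"bufio"
	"log"

	mqtt "github.com/eclipse/paho.mqtt.golang"
)

// publishBounded publishes each scanned line at QoS 1 while allowing at
// most 1000 publishes to be awaiting an ACK at any moment.
func publishBounded(client mqtt.Client, scanner *bufio.Scanner) error {
	sem := make(chan struct{}, 1000) // in-flight budget

	for scanner.Scan() {
		sem <- struct{}{} // blocks once the budget is used up
		token := client.Publish("csv/lines", 1, false, scanner.Text())
		go func(t mqtt.Token) {
			defer func() { <-sem }() // free a slot when the ACK arrives
			t.Wait()
			if err := t.Error(); err != nil {
				log.Printf("publish failed: %v", err)
			}
		}(token)
	}

	// Fill the semaphore so we return only after every ACK has arrived.
	for i := 0; i < cap(sem); i++ {
		sem <- struct{}{}
	}
	return scanner.Err()
}
```

Waiting on every token individually (no goroutines, no semaphore) also works and is simpler, but serializes on the broker's round-trip time; the bounded-concurrency version keeps throughput up while still making exhaustion impossible.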