Some UDP messages are not written #117
Seems to be related to the length. If I try to log the following as context, it works.
As soon as I add the next entry, it just silently fails.
Any ideas? Anybody?
A few questions:
No error at all. That's the weird part
It seems to fail at about 1452 bytes. The number is not constant, but it is always in this range.
I deactivated chunking for testing. Chunked messages seem not to be received at all; I experimented with all kinds of sizes.
It might be related to the receiving server and/or the network in between. I am currently investigating in this direction, as there is no obvious error from the library.
Since there is no error from the library, I assume that the library does not get any feedback that something went wrong with the UDP send. If I had to guess, I'd say something is wrong with the MTU somewhere along the network path. 1452 feels suspiciously close to the Ethernet default of 1500. Can you enable chunking with a significantly smaller size (e.g. 500) and see if this works? You can also run
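One arithmetic observation (my own, not from the thread): on a 1500-byte MTU, the maximum UDP payload is 1472 bytes over IPv4 but exactly 1452 bytes over IPv6, which matches the reported failure threshold suspiciously well.

```python
# Back-of-the-envelope UDP payload budget on a standard 1500-byte Ethernet MTU.
MTU = 1500
IPV4_HEADER = 20  # minimum IPv4 header, no options
IPV6_HEADER = 40  # fixed IPv6 header
UDP_HEADER = 8    # four 2-byte fields: src port, dst port, length, checksum

print(MTU - IPV4_HEADER - UDP_HEADER)  # 1472: max UDP payload over IPv4
print(MTU - IPV6_HEADER - UDP_HEADER)  # 1452: max UDP payload over IPv6
```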
I was not able to find anything. Meanwhile I switched to TCP, which works for now without issues. If I return to UDP and find something related, I'll update this here just in case.
Hi there,
Turns out the split_length of str_split in sendMessageInChunks is too big. The chunk is too big here because all the magic bytes for UDP GELF are added later on, which results in a chunk that is > chunkSize. A quick hack, reserving 15 bytes for the GELF magic:
Yeah, this is true. It probably didn't surface yet because one could adjust the chunk size manually. But good catch! Thanks. Though I disagree with 15 bytes; I count 12:
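A minimal Python sketch (not the gelf-php implementation) may make the overhead concrete. The GELF chunk header is 2 magic bytes, an 8-byte message id, a 1-byte sequence number, and a 1-byte sequence count, so those 12 bytes have to be reserved *inside* the chunk size rather than added on top of it:

```python
import os

GELF_CHUNK_MAGIC = b"\x1e\x0f"
HEADER_SIZE = 2 + 8 + 1 + 1  # magic + message id + seq number + seq count = 12

def chunk_message(payload: bytes, chunk_size: int) -> list[bytes]:
    """Split payload so each datagram (header + body) fits within chunk_size."""
    body_size = chunk_size - HEADER_SIZE  # reserve room for the header
    message_id = os.urandom(8)            # random id shared by all chunks
    bodies = [payload[i:i + body_size] for i in range(0, len(payload), body_size)]
    count = len(bodies)
    return [
        GELF_CHUNK_MAGIC + message_id + bytes([seq, count]) + body
        for seq, body in enumerate(bodies)
    ]

chunks = chunk_message(b"x" * 3000, 1420)
print(len(chunks), max(len(c) for c in chunks))  # 3 1420
```

The real protocol also caps a message at 128 chunks; this sketch omits that check.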
Please have a look at:
I will reinvestigate. Feels like an off-by-one error. If you want to support the cause, you could add your failing tests in a PR. :-)
Do you know your network's MTU? Because we have to differentiate between two things: a) gelf-php creating too-big packets, where "too big" means that somehow a UDP packet bigger than the configured chunkSize is created; b) the packet not reaching its destination because it is too big for the network path. The initial bug was related to a): not taking the GELF chunk overhead into account. But you may also have a problem with b). Your chunkSize needs to be smaller than or equal to the MTU minus at least the UDP overhead (which is, AFAIR, four 2-byte fields -> 8 bytes).
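The constraint in b) can be written down directly. A sketch, assuming a minimal 20-byte IPv4 header on top of the 8-byte UDP header:

```python
def max_safe_chunk_size(mtu: int, ip_header: int = 20) -> int:
    """Largest UDP payload that fits in one link-layer frame."""
    udp_header = 8  # four 2-byte fields: src port, dst port, length, checksum
    return mtu - ip_header - udp_header

print(max_safe_chunk_size(1500))  # 1472
```

For example, for an MTU of 1419 this gives 1391 bytes as the upper bound for chunkSize.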
Hey @bzikarsky, thanks for looking into it! The MTU is set to 1500. In my case I have changed the chunk size to 1400 and everything looks good so far. Anyway, I am happy with the current implementation, and you can always set your own chunk size through the constructor, so I wouldn't mind. Keep up the good work 👍
Ha! OpenVPN! This will eat into your MTU as well. I guess OpenVPN does not properly know about your network's MTU and will not correctly transfer our packets over the wire. Maybe this article is of interest for you: https://www.sonassi.com/help/troubleshooting/setting-correct-mtu-for-openvpn
Wooow! Awesome info. Turns out the MTU within the VPN is 1419. Obviously! OK, now that makes a lot of sense after all :)
You are very welcome. :)
Trying to send log messages over UDP seems to fail for some messages. The failure seems to be related to context data: if I remove data from the context, the message is sent; adding back some data makes the sending fail.

"Fail" in this case means there is no error that I could see anywhere. I debugged it with Wireshark to check what leaves my machine and what does not.

The easiest way to reproduce this was to add json_encode($_SERVER) as context data, using Monolog. Using other data structures with json_encode works fine, so it might somehow be related to the data itself. Currently I have no idea what goes wrong.
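A quick way to check whether a given context pushes the message past the threshold is to measure the serialized size before sending. A rough Python analogue of the json_encode($_SERVER) reproduction (the helper name is mine; the ~1452-byte threshold is the one reported in the thread):

```python
import json
import os

def serialized_size(context: dict) -> int:
    """Byte length of the JSON-encoded context as it would go on the wire."""
    return len(json.dumps(context).encode("utf-8"))

# The process environment is a rough stand-in for PHP's $_SERVER here.
size = serialized_size(dict(os.environ))
print(size, "bytes:", "needs chunking" if size > 1452 else "fits one datagram")
```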