- Dart version: 3.11.0-90.0.dev
- OS: Linux
When sending datagrams with a RawDatagramSocket, the return value of send is used to determine whether the datagram was sent correctly. It always returns either 0 or the length of the sent datagram, where 0 indicates the datagram was not queued for sending by the OS.
This leads to an ambiguity when sending a datagram with length 0: does a return value of 0 mean the datagram was queued for sending, or does it mean the write failed?
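A minimal sketch of the ambiguity, assuming a socket bound to the loopback interface that sends an empty datagram to itself:

```dart
import 'dart:io';

Future<void> main() async {
  final socket = await RawDatagramSocket.bind(InternetAddress.loopbackIPv4, 0);

  // A zero-length datagram sent to ourselves: this send succeeds, yet
  // send returns 0 -- the same value it returns when a send fails.
  final sent = socket.send(<int>[], InternetAddress.loopbackIPv4, socket.port);
  print('send returned $sent'); // prints 0 on success and on failure

  socket.close();
}
```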
The issue stems from _NativeSocket.send returning 0 when an error is thrown, instead of returning -1 as, for example, Linux does for failed writes.
It might be possible to use the associated error delivered on the _RawDatagramSocket to determine whether the write failed, but this error is only delivered in a later microtask, so associating it with a specific send call is difficult.
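A sketch of why that association is hard, assuming a broadcast send with broadcastEnabled left false is rejected by the OS (as on Linux without SO_BROADCAST):

```dart
import 'dart:io';

Future<void> main() async {
  final socket = await RawDatagramSocket.bind(InternetAddress.anyIPv4, 0);

  // The error surfaces on the socket's event stream, not from send itself.
  socket.listen(
    (RawSocketEvent event) {},
    onError: (Object error) {
      // This runs in a later microtask; by now there is no reliable way
      // to tell which send call the error belongs to.
      print('socket error: $error');
    },
  );

  // With broadcastEnabled left false, this send should be rejected by the
  // OS (EACCES on Linux), yet send still returns 0 rather than -1.
  final sent = socket.send(<int>[1], InternetAddress('255.255.255.255'), 9);
  print('send returned $sent');

  await Future<void>.delayed(const Duration(milliseconds: 100));
  socket.close();
}
```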
Of course, changing the error return value from 0 to -1 would be a breaking change, so perhaps there could be an alternative way of checking whether the last send call succeeded?
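For illustration, one possible shape such an API could take; the lastSendError getter here is purely hypothetical and does not exist in dart:io:

```dart
// Hypothetical sketch only -- dart:io has no lastSendError member today.
final sent = socket.send(<int>[], address, port);
if (sent == 0 && socket.lastSendError != null) { // hypothetical getter
  // 0 meant "the send failed", not "an empty datagram was queued".
  print('send failed: ${socket.lastSendError}');
}
```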