
Move SocketManager's DecodeError to Common #11003

Merged
1 commit merged into dolphin-emu:master on Aug 24, 2022

Conversation

@sepalani (Contributor) commented on Aug 22, 2022

This PR moves SocketManager's decode error function to Common and should fix a thread-safety issue. I plan to use this method in other places not related to Wii sockets, such as the BBA HLE.

Ready to be reviewed & merged.

EDIT: I didn't know that using strerror_r would be this difficult. I don't know if we (or the compiler) did/do/will enable the GNU extension. Here are some references/differences depending on the system:
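To illustrate the difficulty: glibc's GNU-extension `strerror_r` returns `char*` (and may ignore the caller's buffer entirely), while the POSIX/XSI variant returns `int` and writes into the buffer. A hedged sketch of one common way to cope with both at compile time via overload resolution; `DecodeErrno` and the helper names are illustrative, not the PR's actual code:

```cpp
#include <cerrno>
#include <string.h>  // strerror_r
#include <string>

namespace
{
// Selected when strerror_r is the GNU variant (returns char*, which may
// point at an internal static string rather than the buffer).
std::string HandleStrerrorResult(char* result, const char* /*buffer*/)
{
  return result ? result : "Unknown error";
}

// Selected when strerror_r is the XSI variant (returns int, fills the buffer).
std::string HandleStrerrorResult(int result, const char* buffer)
{
  return result == 0 ? buffer : "Unknown error";
}
}  // namespace

// Illustrative wrapper: works regardless of which strerror_r the
// libc/compiler combination exposes, without feature-test-macro checks.
std::string DecodeErrno(int error_code)
{
  char buffer[256] = {};
  return HandleStrerrorResult(strerror_r(error_code, buffer, sizeof(buffer)), buffer);
}
```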

@sepalani sepalani force-pushed the decode-error branch 2 times, most recently from fab9314 to a333074 on August 22, 2022 at 09:26
@AdmiralCurtiss (Contributor)

I think you're probably better off just returning an std::string instead of using a thread_local or static buffer.

@sepalani (Contributor, Author)

> I think you're probably better off just returning an std::string instead of using a thread_local or static buffer.

I'm forced to use a thread_local/static buffer on Windows to retrieve WinSock errors using FormatMessage, and on Linux strerror isn't thread-safe. I could return a std::string built from this buffer, but that seems like overkill.
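For context, a minimal sketch of the thread_local-buffer pattern being described, POSIX side only (the body is illustrative, not the PR's actual code; the Windows FormatMessage side is omitted):

```cpp
#include <cerrno>
#include <string.h>  // strerror_r, strstr

// Illustrative strerror-like API: the returned pointer stays valid per
// thread, with no heap allocation on each call.
const char* DecodeNetworkError(int error_code)
{
  thread_local char buffer[1024];
#if defined(__GLIBC__) && defined(_GNU_SOURCE)
  // GNU strerror_r may return a pointer to an internal static string
  // instead of filling the buffer, so its return value must be used.
  return strerror_r(error_code, buffer, sizeof(buffer));
#else
  // XSI strerror_r fills the caller's buffer and returns 0 on success.
  return strerror_r(error_code, buffer, sizeof(buffer)) == 0 ? buffer : "Unknown error";
#endif
}
```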

@AdmiralCurtiss (Contributor)

Wait, why are you forced into a static buffer on Windows? What's stopping you from just doing

```cpp
std::string DecodeNetworkError(s32 error_code)
{
  char buffer[1024];
  FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS |
                     FORMAT_MESSAGE_MAX_WIDTH_MASK,
                 nullptr, error_code, MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), buffer,
                 sizeof(buffer), nullptr);
  return std::string(buffer);
}
```

@sepalani (Contributor, Author)

> Wait, why are you forced into a static buffer on Windows? What's stopping you from just doing
>
> ```cpp
> std::string DecodeNetworkError(s32 error_code)
> {
>   char buffer[1024];
>   FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS |
>                      FORMAT_MESSAGE_MAX_WIDTH_MASK,
>                  nullptr, error_code, MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), buffer,
>                  sizeof(buffer), nullptr);
>   return std::string(buffer);
> }
> ```

Won't this have a negative impact on performance by creating the buffer each time there is an error, on top of copying it into a std::string? Since we're using non-blocking sockets, this function can be called a lot, especially when connect is called.

@AdmiralCurtiss (Contributor)

The buffer itself is on the stack, so that's free. The conversion to std::string does indeed cost a copy to the heap, but I kinda doubt it's worth caring about. I guess it depends on how likely you think it is that someone calls DecodeNetworkError() twice in a row and then tries to use both results.

If you want both perf and safety, at the cost of slightly less convenience, you can let the user of DecodeNetworkError() pass in the buffer, e.g.

```cpp
void DecodeNetworkError(char* buffer, size_t buffer_size, s32 error_code)
{
  FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS |
                     FORMAT_MESSAGE_MAX_WIDTH_MASK,
                 nullptr, error_code, MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), buffer,
                 buffer_size, nullptr);
}
```

@sepalani (Contributor, Author)

> The buffer itself is on the stack, so that's free. The conversion to std::string does indeed cost a copy to the heap, but I kinda doubt it's worth caring about. I guess it depends on how likely you think it is that someone calls DecodeNetworkError() twice in a row and then tries to use both results.
>
> If you want both perf and safety, at the cost of slightly less convenience, you can let the user of DecodeNetworkError() pass in the buffer, e.g.
>
> ```cpp
> void DecodeNetworkError(char* buffer, size_t buffer_size, s32 error_code)
> {
>   FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS |
>                      FORMAT_MESSAGE_MAX_WIDTH_MASK,
>                  nullptr, error_code, MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT), buffer,
>                  buffer_size, nullptr);
> }
> ```

Most of its uses are in error logs, so changing the prototype to take a buffer would defeat its purpose. It's used a bit like strerror, but it takes into account that Windows doesn't use errno for network errors. I could return a std::string, but I'm fairly confident the impact would be significant for debug builds.
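The call-site argument can be made concrete. With a strerror-like return, the decoded message drops straight into a printf-style log call, while a buffer-out parameter forces every caller to declare and size a buffer first. A hypothetical sketch (`fprintf` stands in for Dolphin's logging macros, which are not shown here, and `Decode` for the POSIX side of DecodeNetworkError):

```cpp
#include <cerrno>
#include <cstdio>
#include <cstring>
#include <string.h>  // strstr

// Hypothetical stand-in for the POSIX side of DecodeNetworkError.
// (strerror itself is not thread-safe; a real implementation would
// use strerror_r behind a thread_local buffer.)
const char* Decode(int error_code)
{
  return std::strerror(error_code);
}

// strerror-like API: one expression at the call site.
void LogConnectError(int error_code)
{
  std::fprintf(stderr, "connect failed: %s\n", Decode(error_code));
}

// Buffer-out API: every call site needs its own buffer boilerplate.
void LogConnectErrorBuffered(int error_code)
{
  char buffer[1024];
  std::snprintf(buffer, sizeof(buffer), "%s", std::strerror(error_code));
  std::fprintf(stderr, "connect failed: %s\n", buffer);
}
```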

@sepalani (Contributor, Author)

@AdmiralCurtiss
I did some testing with Monster Hunter 3, comparing the thread_local char* and std::string implementations. With the log window enabled, the game runs at 270% FPS during the connection process with thread_local char*, while it merely reaches 200% FPS with std::string. The Dolphin builds were run multiple times under the same conditions.

I don't know how precise the FPS counter is, but that seems like a significant performance loss. If you know a better way to measure this, I'm open to suggestions as well.
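One better-isolated option is a micro-benchmark outside the emulator that times only the decode call itself. A hedged sketch, assuming POSIX (`strerror` stands in for the platform decode, the names are illustrative, and absolute numbers will not match in-game FPS):

```cpp
#include <cerrno>
#include <chrono>
#include <cstring>
#include <string>

// Variant A: thread_local buffer, no heap allocation on the hot path.
// (A real implementation would use strerror_r / FormatMessage.)
const char* DecodeBuffered(int error_code)
{
  thread_local char buffer[1024];
  std::strncpy(buffer, std::strerror(error_code), sizeof(buffer) - 1);
  buffer[sizeof(buffer) - 1] = '\0';
  return buffer;
}

// Variant B: returns std::string, paying a heap copy per call for
// messages longer than the small-string-optimization threshold.
std::string DecodeString(int error_code)
{
  return std::strerror(error_code);
}

// Times `iterations` calls of a callable, in nanoseconds.
template <typename F>
long long TimeNs(F&& f, int iterations)
{
  const auto start = std::chrono::steady_clock::now();
  for (int i = 0; i < iterations; ++i)
    f();
  const auto end = std::chrono::steady_clock::now();
  return std::chrono::duration_cast<std::chrono::nanoseconds>(end - start).count();
}
```

Comparing `TimeNs([] { (void)DecodeBuffered(ECONNREFUSED); }, 100000)` against the DecodeString equivalent isolates the decode cost from emulation noise, at the price of losing the realistic call pattern.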

@AdmiralCurtiss (Contributor)

Doesn't make me happy, but yeah, that's significant.

@AdmiralCurtiss AdmiralCurtiss merged commit a7d358a into dolphin-emu:master Aug 24, 2022
11 checks passed
@sepalani sepalani deleted the decode-error branch August 24, 2022 16:09