
SegFault in "nn_usock_handler" at src/aio/usock_posix.inc : Line 601 #1075

Closed
Astra580 opened this issue Apr 12, 2022 · 5 comments

@Astra580

We are facing a segfault in the nanomsg library. We use the PUB/SUB mechanism for communication. The following nanomsg thread routine is causing the crash.

[Backtrace]

```
#0  0x00007efd56091a2b in nn_usock_handler (self=0x56094724f870, src=1, type=1, srcptr=<optimized out>) at /usr/src/debug/git/nanomsg/src/aio/usock_posix.inc:601
        usock = 0x56094724f870
        s = <optimized out>
        rc = <optimized out>
        sz = 9
        srcptr = <optimized out>
        type = 1
        src = 1
        self = 0x56094724f870
        usock = 0x56094724f870
#1  0x00007efd5607709e in nn_worker_routine (arg=0x7efd562c06e0 <self+64>) at /usr/src/debug/git/nanomsg/src/aio/worker_posix.inc:249
        rc = <optimized out>
        self = 0x7efd562c06e0 <self+64>
        pevent = 1
        phndl = 0x56094724f8f0
        thndl = 0x560947506368
        tasks = {head = 0x0, tail = 0x0}
        item = <optimized out>
        task = <optimized out>
        fd = 0x56094724f8e0
        timer = <optimized out>
#2  0x00007efd56078ccd in nn_thread_main_routine (arg=<optimized out>) at /usr/src/debug/git/nanomsg/src/utils/thread_posix.inc:35
        self = <optimized out>
#3  0x00007efd55144477 in start_thread (arg=0x7efd4e9cf700) at /usr/src/debug/glibc/2.27-r0/git/nptl/pthread_create.c:463
        pd = 0x7efd4e9cf700
        now = <optimized out>
        unwind_buf = {cancel_jmp_buf = {{jmp_buf = {139626410735360, -9064765947889294268, 139626582381214, 139626582381215, 8396800, 139626582381216, 9208144196267959364, 9208121042860401732}, mask_was_saved = 0}}, priv = {pad = {0x0, 0x0, 0x0, 0x0}, data = {prev = 0x0, cleanup = 0x0, canceltype = 0}}}
        not_first_call = <optimized out>
#4  0x00007efd54affd9f in clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:95
No locals.
```

Can anyone suggest whether a patch is available for this issue?

@gdamore
Contributor

gdamore commented Apr 12, 2022

This doesn't look familiar to me. I don't see anything specific in this crash that helps me understand it either.

I'd recommend trying out NNG if you're able to.

If you're still working on this, can you tell me which git version you're using? I see you are using Linux. Was there anything else notable about the activity at the time of the crash? If you have a small reproduction case, that would help.

@Astra580
Author

Currently we are using a very old nanomsg version, at git master commit 327a29b.

We don't want to update directly to the latest available version. Below is the crash scenario.

We are using the pub-sub mechanism for communication between processes.

At the receiving end we perform the following operations (a fuller, error-checked sketch follows below).

Initial socket creation for subscription:

```c
int NNSocket = nn_socket(AF_SP, NN_SUB);
nn_setsockopt(NNSocket, NN_SUB, NN_SUB_SUBSCRIBE, "", 0);
nn_connect(NNSocket, "URL_STRING");
```

When data is available to read:

```c
#define POS_MAX_BUFF_SIZE 8192
void *buff = nn_allocmsg(POS_MAX_BUFF_SIZE, 0);
nn_recv(fd, (uint8_t *)buff, POS_MAX_BUFF_SIZE, NN_DONTWAIT);
```
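
For reference, a minimal self-contained version of that receive path looks roughly like the sketch below. The error checks are additions for illustration, and `ipc:///tmp/pos.ipc` is a placeholder address, not our real URL:

```c
#include <errno.h>
#include <stdio.h>
#include <nanomsg/nn.h>
#include <nanomsg/pubsub.h>

#define POS_MAX_BUFF_SIZE 8192

int main(void) {
    /* Create the SUB socket and subscribe to everything (empty prefix). */
    int sock = nn_socket(AF_SP, NN_SUB);
    if (sock < 0) {
        fprintf(stderr, "nn_socket: %s\n", nn_strerror(nn_errno()));
        return 1;
    }
    if (nn_setsockopt(sock, NN_SUB, NN_SUB_SUBSCRIBE, "", 0) < 0) {
        fprintf(stderr, "nn_setsockopt: %s\n", nn_strerror(nn_errno()));
        return 1;
    }
    if (nn_connect(sock, "ipc:///tmp/pos.ipc") < 0) { /* placeholder URL */
        fprintf(stderr, "nn_connect: %s\n", nn_strerror(nn_errno()));
        return 1;
    }

    /* Non-blocking receive into a fixed-size, nn_allocmsg'd buffer,
       mirroring the pattern above. */
    void *buff = nn_allocmsg(POS_MAX_BUFF_SIZE, 0);
    if (buff == NULL)
        return 1;
    int n = nn_recv(sock, buff, POS_MAX_BUFF_SIZE, NN_DONTWAIT);
    if (n >= 0)
        printf("received %d bytes\n", n);
    else if (nn_errno() != EAGAIN)
        fprintf(stderr, "nn_recv: %s\n", nn_strerror(nn_errno()));

    nn_freemsg(buff);
    nn_close(sock);
    return 0;
}
```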

As per our understanding, the crash happens when we call nn_recv(). Unfortunately, we have not been able to reproduce the issue; the crash has been observed only once so far.
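
As an aside, the nn_recv(3) manual also documents a zero-copy variant in which nanomsg allocates the receive buffer itself via NN_MSG, which avoids pairing nn_allocmsg() with a fixed-size read. A sketch of that pattern, reusing the `sock` from the example above:

```c
/* Zero-copy receive: nanomsg allocates a buffer sized to the message,
   which the caller must later release with nn_freemsg(). */
void *buf = NULL;
int n = nn_recv(sock, &buf, NN_MSG, NN_DONTWAIT);
if (n >= 0) {
    /* ... process n bytes at buf ... */
    nn_freemsg(buf);
} else if (nn_errno() != EAGAIN) {
    fprintf(stderr, "nn_recv: %s\n", nn_strerror(nn_errno()));
}
```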

@gdamore
Contributor

gdamore commented Feb 3, 2024

That is an extremely old version of libnanomsg. Please update and let me know if you still have problems.

@gdamore
Contributor

gdamore commented Feb 3, 2024

Essentially, I'm not willing to spend effort diagnosing a problem just because you don't want to update to something newer. This project is already in sustaining mode, and digging into 8-year-old versions of the code is simply not practical.

Having said all that, if you want support for that ancient version, please contact info@staysail.tech and we can discuss an hourly rate.

@gdamore
Contributor

gdamore commented Feb 18, 2024

Closing this for now. If you test a newer version and find a problem, let me know. If you want to proceed with this old version, please reach out for commercial support options.

gdamore closed this as completed Feb 18, 2024