
panic during performance benchmark #16

Closed
zhironghsu opened this issue Nov 8, 2022 · 3 comments
@zhironghsu

Benchmark command:
mqtt-stresser -broker tcps://localhost:18883 -skip-tls-verification -num-clients 10 -num-messages 80000 -rampup-delay 1s -rampup-size 10 -global-timeout 180s -timeout 20s

The server panics when it gets the "An existing connection was forcibly closed by the remote host" error message:

2022/11/07 18:56:43 error writing to buffer io.Writer; write tcp 127.0.0.1:18883->127.0.0.1:60634: wsasend: An existing connection was forcibly closed by the remote host.
2022/11/07 18:56:43 error writing to buffer io.Writer; write tcp 127.0.0.1:18883->127.0.0.1:60633: wsasend: An existing connection was forcibly closed by the remote host.
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xc0000005 code=0x0 addr=0x0 pc=0x46d467]

goroutine 39 [running]:
github.com/wind-c/comqtt/server/internal/circ.(*Writer).Write(0x0, {0xc001557b00, 0x4f, 0xc0000b7d28?})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/internal/circ/writer.go:80 +0x27
github.com/wind-c/comqtt/server/internal/clients.(*Client).WritePacket(_, {{0x4d, 0x3, 0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, {0x0, ...}, ...})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/internal/clients/clients.go:525 +0x17a
github.com/wind-c/comqtt/server.(*Server).writeClient(_, _, {{0x4d, 0x3, 0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, ...})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/server.go:542 +0x5d
github.com/wind-c/comqtt/server.(*Server).publishToSubscribers(_, {{0x4d, 0x3, 0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, {0x0, ...}, ...})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/server.go:665 +0x794
github.com/wind-c/comqtt/server.(*Server).processPublish(_, _, {{0x4d, 0x3, 0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, ...})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/process.go:149 +0x678
github.com/wind-c/comqtt/server.(*Server).processPacket(_, _, {{0x4d, 0x3, 0x0, 0x0, 0x0}, {0x0, 0x0, 0x0}, ...})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/process.go:28 +0x138
github.com/wind-c/comqtt/server/internal/clients.(*Client).Read(0xc0000b0280, 0xc0000b9940)
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/internal/clients/clients.go:401 +0x19d
github.com/wind-c/comqtt/server.(*Server).EstablishConnection(0xc000120370, {0x59cd6b, 0x2}, {0x62af78, 0xc0000a6700}, {0x628a10?, 0x83f3e0?})
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/server.go:362 +0x10f2
github.com/wind-c/comqtt/server/listeners.(*TCP).Serve.func1()
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/listeners/tcp.go:113 +0x3d
created by github.com/wind-c/comqtt/server/listeners.(*TCP).Serve
C:/Users/test/go/pkg/mod/github.com/wind-c/comqtt@v1.2.0/server/listeners/tcp.go:112 +0xea

@wind-c
Owner

wind-c commented Nov 11, 2022

This is a known problem in extreme scenarios. It is triggered when a single client produces a large number of messages at high frequency and hits the per-client circular buffer limit; the circular buffer logic is currently being optimized.
Scenarios with a large number of clients each sending a small number of messages are fine (for example: mqtt-stresser -num-clients 1000 -num-messages 100, or mqtt-stresser -num-clients 5000 -num-messages 10).

@zhironghsu
Author

There are some race conditions in internal/clients/clients.go:

internal/clients/clients.go:88
clients.(*Client).GetAll() has no lock protection.

internal/clients/clients.go:278
clients.(*Client).ClearBuffers() has no lock protection.

internal/clients/clients.go:515
clients.(*Client).WritePacket()...
The cl.W field has no lock protection, so a panic occurs as soon as ClearBuffers() is called concurrently with a write.

@wind-c
Copy link
Owner

wind-c commented Mar 11, 2023

It has been fixed in the 2.0 release.

@wind-c wind-c closed this as completed Mar 11, 2023