
Setting runOnDemandStartTimeout or runOnDemandCloseAfter for a path causes segfault/panic in v1.2.0 #2529

Closed
3 of 13 tasks
mrlt8 opened this issue Oct 19, 2023 · 5 comments · Fixed by #2550
Labels: bug, general

Comments

mrlt8 (Contributor) commented Oct 19, 2023

Which version are you using?

v1.2.0

Which operating system are you using?

  • Linux amd64 standard
  • Linux amd64 Docker
  • Linux arm64 standard
  • Linux arm64 Docker
  • Linux arm7 standard
  • Linux arm7 Docker
  • Linux arm6 standard
  • Linux arm6 Docker
  • Windows amd64 standard
  • Windows amd64 Docker (WSL backend)
  • macOS amd64 standard
  • macOS amd64 Docker
  • Other (please describe)

Describe the issue

Setting runOnDemandStartTimeout and/or runOnDemandCloseAfter for a specific path causes a segfault/panic. The problem first appeared in v1.2.0 and did not occur with v1.1.1.

Describe how to replicate the issue

  1. Set runOnDemandStartTimeout and/or runOnDemandCloseAfter for a specific path, for example via an environment variable
  2. Start the server:

MTX_PATHS_MYPATH_RUNONDEMANDSTARTTIMEOUT=30s ./mediamtx
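
Both options follow the same environment-variable naming pattern; a variant of the command above that also sets runOnDemandCloseAfter (the path name mypath and the 60s value are illustrative, and the exact variable name is assumed from the pattern shown above):

MTX_PATHS_MYPATH_RUNONDEMANDSTARTTIMEOUT=30s MTX_PATHS_MYPATH_RUNONDEMANDCLOSEAFTER=60s ./mediamtx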

Did you attach the server logs?

Yes:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x16aeb7a]

goroutine 1 [running]:
github.com/bluenviron/mediamtx/internal/conf.(*StringDuration).UnmarshalJSON(0x0, {0xc0003748a0, 0x5, 0x8})
	/s/internal/conf/string_duration.go:28 +0x9a
github.com/bluenviron/mediamtx/internal/conf.(*StringDuration).UnmarshalEnv(0x19cc2a0?, {0xc000590158?, 0xc00016b7a0?}, {0xc000042089?, 0x0?})
	/s/internal/conf/string_duration.go:35 +0x5d
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0xc00013bd40?, {0xc00016b7a0, 0x28}, {0x19effe0?, 0xc0002d5380?, 0xc00003b998?})
	/s/internal/conf/env/env.go:33 +0x1202
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0xc00013bd40?, {0xc000374400, 0x10}, {0xc0000b3440?, 0xc0002d5180?, 0xc00058fd90?})
	/s/internal/conf/env/env.go:164 +0x1eb5
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0xc0000b3440?, {0xc000374400, 0x10}, {0xc00013bd40?, 0xc0002d5180?, 0x22d7f80?})
	/s/internal/conf/env/env.go:28 +0x12ef
github.com/bluenviron/mediamtx/internal/conf/env.Load({0xc000374400, 0x10}, {0xc0000b3440?, 0xc0002d5180})
	/s/internal/conf/env/env.go:224 +0x237
github.com/bluenviron/mediamtx/internal/conf.(*OptionalPath).UnmarshalEnv(0xc0002f2430?, {0xc000374400?, 0x10?}, {0x10?, 0x1978c5e?})
	/s/internal/conf/optional_path.go:64 +0x85
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0x19fb660?, {0xc000374400, 0x10}, {0x19efde0?, 0xc0002f2430?, 0xc0005907f0?})
	/s/internal/conf/env/env.go:38 +0x13a4
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0x19efde0?, {0xc000374400, 0x10}, {0x19fb660?, 0xc0002f2430?, 0xc00004206a?})
	/s/internal/conf/env/env.go:28 +0x12ef
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0x19ca380?, {0xc0003743e5, 0x9}, {0xc0000b3300?, 0xc00015ef40?, 0x1067d39?})
	/s/internal/conf/env/env.go:146 +0x19dd
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0x1af4880?, {0xc0003743e5, 0x9}, {0x19ca380?, 0xc00015ef40?, 0xc0003743e0?})
	/s/internal/conf/env/env.go:28 +0x12ef
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0x1af4880?, {0x1af6d24, 0x3}, {0x1a99940?, 0xc00015e800?, 0x195ad00?})
	/s/internal/conf/env/env.go:164 +0x1eb5
github.com/bluenviron/mediamtx/internal/conf/env.loadEnvInternal(0x1a99940?, {0x1af6d24, 0x3}, {0x1af4880?, 0xc00015e800?, 0xc0001b6d18?})
	/s/internal/conf/env/env.go:28 +0x12ef
github.com/bluenviron/mediamtx/internal/conf/env.Load({0x1af6d24, 0x3}, {0x1a99940?, 0xc00015e800})
	/s/internal/conf/env/env.go:224 +0x237
github.com/bluenviron/mediamtx/internal/conf.Load({0x0, 0x0}, {0x22c5120, 0x5, 0x5})
	/s/internal/conf/conf.go:262 +0xb7
github.com/bluenviron/mediamtx/internal/core.New({0xc0000a01e0, 0x0, 0x0})
	/s/internal/core/core.go:138 +0x365
main.main()
	/s/main.go:11 +0x52
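
What the trace boils down to: (*StringDuration).UnmarshalJSON is invoked with a nil receiver (note the 0x0 first argument), so parsing the value loaded from the environment dereferences a nil pointer. Below is a minimal Go sketch of that failure class, with illustrative names rather than the actual mediamtx source:

package main

import (
	"fmt"
	"time"
)

// StringDuration mirrors a duration type that is unmarshaled from a string value.
type StringDuration time.Duration

// UnmarshalEnv parses an environment value into the receiver.
// Calling it on a nil *StringDuration is legal in Go, but the write through
// the receiver below dereferences nil and triggers SIGSEGV.
func (d *StringDuration) UnmarshalEnv(s string) error {
	du, err := time.ParseDuration(s)
	if err != nil {
		return err
	}
	*d = StringDuration(du) // panic: invalid memory address or nil pointer dereference
	return nil
}

func main() {
	// The per-path configuration holding this field was never allocated,
	// so the env loader ends up with a nil pointer.
	var d *StringDuration
	if err := d.UnmarshalEnv("30s"); err != nil {
		fmt.Println(err)
	}
}

The sketch only reproduces the class of panic seen here; the actual fix is in the linked pull request.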

Did you attach a network dump?

No

aler9 (Member) commented Oct 23, 2023

Thanks for reporting the issue; this is fixed by #2550.

aler9 (Member) commented Oct 28, 2023

Added in v1.2.1.

mrlt8 (Contributor, Author) commented Nov 3, 2023

@aler9 Is there a replacement for the deprecated settings?

I'm still getting some panics related to onDemand:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0xc76783]

goroutine 11 [running]:
github.com/bluenviron/mediamtx/internal/core.(*path).onDemandPublisherStop(0xc00001e3c0, {0xebeec8, 0x14})
	/s/internal/core/path.go:869 +0x143
github.com/bluenviron/mediamtx/internal/core.(*path).doOnDemandPublisherCloseTimer(...)
	/s/internal/core/path.go:504
github.com/bluenviron/mediamtx/internal/core.(*path).runInner(0xc00001e3c0)
	/s/internal/core/path.go:405 +0x653
github.com/bluenviron/mediamtx/internal/core.(*path).run(0xc00001e3c0)
	/s/internal/core/path.go:334 +0x325
created by github.com/bluenviron/mediamtx/internal/core.newPath in goroutine 1
	/s/internal/core/path.go:284 +0x676

aler9 (Member) commented Nov 3, 2023

@mrlt8 open another issue and provide details on how to replicate the new, unrelated crash.

github-actions bot commented May 7, 2024

This issue is being locked automatically because it has been closed for more than 6 months.
Please open a new issue in case you encounter a similar problem.

github-actions bot locked and limited conversation to collaborators on May 7, 2024