
SIGTRAP: trace trap on M1 #2218

Closed · splmatto opened this issue Nov 25, 2021 · 11 comments · Labels: bug

splmatto commented Nov 25, 2021

Host operating system: output of uname -a

Darwin m1minimatto01 20.6.0 Darwin Kernel Version 20.6.0: Tue Oct 12 18:33:38 PDT 2021; root:xnu-7195.141.8~1/RELEASE_ARM64_T8101 arm64

node_exporter version: output of node_exporter --version

matto@m1minimatto01 node_exporter % git clean -fdx
Removing collector/fixtures/sys/
Removing node_exporter
matto@m1minimatto01 node_exporter % make
>> checking code style
>> checking license header
>> running yamllint on all YAML files in the repository
yamllint not installed so skipping
>> running check for unused/missing packages in go.mod
GO111MODULE=on go mod tidy
>> building binaries
GO111MODULE=on /Users/matto/go/bin/promu --config .promu-cgo.yml build --prefix /Users/matto/go/src/github.com/prometheus/node_exporter 
 >   node_exporter
>> extracting fixtures
if [ -d collector/fixtures/sys/ ] ; then rm -rf collector/fixtures/sys/ ; fi
./ttar -C collector/fixtures/ -x -f collector/fixtures/sys.ttar
WARNING sed unable to handle null bytes, using Python (slow).
touch collector/fixtures/sys/.unpacked
>> running tests
go test -short  ./...
ok  	github.com/prometheus/node_exporter	(cached)
ok  	github.com/prometheus/node_exporter/collector	(cached)
>> vetting code
GO111MODULE=on go vet  ./...
>> checking metrics for correctness
./checkmetrics.sh /Users/matto/go/bin/promtool collector/fixtures/e2e-output.txt
>> checking rules for correctness
find . -name "*rules*.yml" | xargs -I {} /Users/matto/go/bin/promtool check rules {}
Checking ./example-rules.yml
  SUCCESS: 4 rules found

Checking ./docs/example-16-compatibility-rules.yml
  SUCCESS: 88 rules found

Checking ./docs/example-17-compatibility-rules-new-to-old.yml
  SUCCESS: 1 rules found

Checking ./docs/example-17-compatibility-rules.yml
  SUCCESS: 1 rules found

Checking ./docs/example-16-compatibility-rules-new-to-old.yml
  SUCCESS: 88 rules found

>> SKIP running tests in 32-bit mode: not supported on darwin/arm64
>> SKIP running end-to-end tests on darwin

node_exporter, version 1.3.0 (branch: master, revision: 9fbb56c)
build user: matto@m1minimatto01
build date: 20211125-02:29:18
go version: go1.17.3
platform: darwin/arm64

node_exporter command line flags

./node_exporter

Are you running node_exporter in Docker?

No.

What did you do that produced an error?

After starting the binary, curl'd the metrics endpoint:

curl http://localhost:9100/metrics

What did you expect to see?

node_exporter metrics output

What did you see instead?

ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:182 level=info msg="Starting node_exporter" version="(version=1.3.0, branch=master, revision=9fbb56c9c80d5486d6de5e06a224a1eb343b28e7)"
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:183 level=info msg="Build context" build_context="(go=go1.17.3, user=matto@m1minimatto01, date=20211125-02:29:18)"
ts=2021-11-25T03:33:54.270Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev)($|/)
ts=2021-11-25T03:33:54.270Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^devfs$
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:108 level=info msg="Enabled collectors"
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=boottime
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=cpu
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=diskstats
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=filesystem
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=loadavg
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=meminfo
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=netdev
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=os
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=powersupplyclass
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=textfile
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=thermal
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=time
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:115 level=info collector=uname
ts=2021-11-25T03:33:54.270Z caller=node_exporter.go:199 level=info msg="Listening on" address=:9100
ts=2021-11-25T03:33:54.271Z caller=tls_config.go:195 level=info msg="TLS is disabled." http2=false
SIGTRAP: trace trap
PC=0x18b1d1e00 m=0 sigcode=0

goroutine 0 [idle]:
runtime: unknown pc 0x18b1d1e00
stack: frame={sp:0x16d977880, fp:0x0} stack=[0x16d8f8868,0x16d9778e0)
0x000000016d977780:  0x000000016d9777f0  0xa93000018bae0e60 
0x000000016d977790:  0x00000000000003ec  0xb83e9471697700aa 
0x000000016d9777a0:  0x0000000102f06340  0x0000000000000860 
0x000000016d9777b0:  0x0000000102b366e8  0x0000000102a9db00 
0x000000016d9777c0:  0xffffffffffffffa2  0x0000000000000000 
0x000000016d9777d0:  0x0000000134e0aea0  0x000000016d977848 
0x000000016d9777e0:  0x00000001360049f0  0x00000000e00002f0 
0x000000016d9777f0:  0x000000016d977830  0x984800018d778180 
0x000000016d977800:  0xffffffffffffffa2  0x000000016d977880 
0x000000016d977810:  0x000000010296c210  0x0000000000000003 
0x000000016d977820:  0x0000000102f06340  0x0000000000000860 
0x000000016d977830:  0x000000016d977880  0xea0280010296c1fc 
0x000000016d977840:  0x0000000000000000  0x0000000000000000 
0x000000016d977850:  0xffffffffffffffa2  0x0000000000000008 
0x000000016d977860:  0x000001400028e264  0x0000000000000003 
0x000000016d977870:  0x000000000000007d  0x000001400035ad78 
0x000000016d977880: <0x0000000000000000  0xe25f0001024f613c 
0x000000016d977890:  0x0000014000083860  0x00000000000003b0 
0x000000016d9778a0:  0x00000001024f3e10 <runtime.mstart+0x0000000000000010>  0x0000000102f06340 
0x000000016d9778b0:  0x00000001024f3dd4 <runtime.rt0_go+0x00000000000000f4>  0x000000016d9778f8 
0x000000016d9778c0:  0x0000000102f06340  0x000000016d9778f8 
0x000000016d9778d0:  0x00000001024f3ddc <runtime.rt0_go+0x00000000000000fc>  0x0000000000000000 
runtime: unknown pc 0x18b1d1e00
stack: frame={sp:0x16d977880, fp:0x0} stack=[0x16d8f8868,0x16d9778e0)
0x000000016d977780:  0x000000016d9777f0  0xa93000018bae0e60 
0x000000016d977790:  0x00000000000003ec  0xb83e9471697700aa 
0x000000016d9777a0:  0x0000000102f06340  0x0000000000000860 
0x000000016d9777b0:  0x0000000102b366e8  0x0000000102a9db00 
0x000000016d9777c0:  0xffffffffffffffa2  0x0000000000000000 
0x000000016d9777d0:  0x0000000134e0aea0  0x000000016d977848 
0x000000016d9777e0:  0x00000001360049f0  0x00000000e00002f0 
0x000000016d9777f0:  0x000000016d977830  0x984800018d778180 
0x000000016d977800:  0xffffffffffffffa2  0x000000016d977880 
0x000000016d977810:  0x000000010296c210  0x0000000000000003 
0x000000016d977820:  0x0000000102f06340  0x0000000000000860 
0x000000016d977830:  0x000000016d977880  0xea0280010296c1fc 
0x000000016d977840:  0x0000000000000000  0x0000000000000000 
0x000000016d977850:  0xffffffffffffffa2  0x0000000000000008 
0x000000016d977860:  0x000001400028e264  0x0000000000000003 
0x000000016d977870:  0x000000000000007d  0x000001400035ad78 
0x000000016d977880: <0x0000000000000000  0xe25f0001024f613c 
0x000000016d977890:  0x0000014000083860  0x00000000000003b0 
0x000000016d9778a0:  0x00000001024f3e10 <runtime.mstart+0x0000000000000010>  0x0000000102f06340 
0x000000016d9778b0:  0x00000001024f3dd4 <runtime.rt0_go+0x00000000000000f4>  0x000000016d9778f8 
0x000000016d9778c0:  0x0000000102f06340  0x000000016d9778f8 
0x000000016d9778d0:  0x00000001024f3ddc <runtime.rt0_go+0x00000000000000fc>  0x0000000000000000 

goroutine 61 [syscall]:
runtime.cgocall(0x10296c074, 0x1400035acd8)
	/usr/local/go/src/runtime/cgocall.go:156 +0x50 fp=0x1400035aca0 sp=0x1400035ac60 pc=0x10248d330
github.com/prometheus/node_exporter/collector._Cfunc_CFRelease(0x0)
	_cgo_gotypes.go:481 +0x40 fp=0x1400035acd0 sp=0x1400035aca0 pc=0x10295ead0
github.com/prometheus/node_exporter/collector.fetchCPUPowerStatus.func1.1(...)
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/thermal_darwin.go:121
github.com/prometheus/node_exporter/collector.fetchCPUPowerStatus.func1({0x0, 0xe00002f0, {0x0, 0x0, 0x0, 0x0}})
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/thermal_darwin.go:121 +0x28 fp=0x1400035acf0 sp=0x1400035acd0 pc=0x102962e98
github.com/prometheus/node_exporter/collector.fetchCPUPowerStatus()
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/thermal_darwin.go:125 +0x1a4 fp=0x1400035ad90 sp=0x1400035acf0 pc=0x102962e44
github.com/prometheus/node_exporter/collector.(*thermCollector).Update(0x140000d8380, 0x140000ad0e0)
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/thermal_darwin.go:102 +0x20 fp=0x1400035adf0 sp=0x1400035ad90 pc=0x102962ab0
github.com/prometheus/node_exporter/collector.execute({0x1029720ee, 0x7}, {0x102b3e180, 0x140000d8380}, 0x140000ad0e0, {0x102b3de80, 0x14000034100})
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/collector.go:161 +0x5c fp=0x1400035af40 sp=0x1400035adf0 pc=0x102950fdc
github.com/prometheus/node_exporter/collector.NodeCollector.Collect.func1(0x140000ad0e0, {0x1400009d1d0, {0x102b3de80, 0x14000034100}}, 0x14000024360, {0x1029720ee, 0x7}, {0x102b3e180, 0x140000d8380})
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/collector.go:152 +0x58 fp=0x1400035af90 sp=0x1400035af40 pc=0x102950f48
runtime.goexit()
	/usr/local/go/src/runtime/asm_arm64.s:1133 +0x4 fp=0x1400035af90 sp=0x1400035af90 pc=0x1024f6314
created by github.com/prometheus/node_exporter/collector.NodeCollector.Collect
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/collector.go:151 +0xf0

goroutine 1 [IO wait]:
internal/poll.runtime_pollWait(0x1033fe830, 0x72)
	/usr/local/go/src/runtime/netpoll.go:234 +0xa4
internal/poll.(*pollDesc).wait(0x140000a2b98, 0x72, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x38
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x140000a2b80)
	/usr/local/go/src/internal/poll/fd_unix.go:402 +0x1ec
net.(*netFD).accept(0x140000a2b80)
	/usr/local/go/src/net/fd_unix.go:173 +0x2c
net.(*TCPListener).accept(0x14000091950)
	/usr/local/go/src/net/tcpsock_posix.go:140 +0x2c
net.(*TCPListener).Accept(0x14000091950)
	/usr/local/go/src/net/tcpsock.go:262 +0x34
net/http.(*Server).Serve(0x1400028e0e0, {0x102b48cf8, 0x14000091950})
	/usr/local/go/src/net/http/server.go:3001 +0x37c
github.com/prometheus/exporter-toolkit/web.Serve({0x102b48cf8, 0x14000091950}, 0x1400028e0e0, {0x0, 0x0}, {0x102b3de80, 0x14000034100})
	/Users/matto/go/pkg/mod/github.com/prometheus/exporter-toolkit@v0.7.0/web/tls_config.go:196 +0x174
github.com/prometheus/exporter-toolkit/web.ListenAndServe(0x1400028e0e0, {0x0, 0x0}, {0x102b3de80, 0x14000034100})
	/Users/matto/go/pkg/mod/github.com/prometheus/exporter-toolkit@v0.7.0/web/tls_config.go:188 +0xd4
main.main()
	/Users/matto/go/src/github.com/prometheus/node_exporter/node_exporter.go:201 +0x1668

goroutine 8 [select]:
github.com/prometheus/client_golang/prometheus.(*Registry).Gather(0x140000aee10)
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/registry.go:513 +0x818
github.com/prometheus/client_golang/prometheus.Gatherers.Gather({0x14000098920, 0x2, 0x2})
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/registry.go:719 +0x1c4
github.com/prometheus/client_golang/prometheus/promhttp.HandlerFor.func1({0x1033bc360, 0x1400007a410}, 0x14000318100)
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/promhttp/http.go:126 +0xc8
net/http.HandlerFunc.ServeHTTP(0x14000290150, {0x1033bc360, 0x1400007a410}, 0x14000318100)
	/usr/local/go/src/net/http/server.go:2046 +0x40
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerInFlight.func1({0x1033bc360, 0x1400007a410}, 0x14000318100)
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/promhttp/instrument_server.go:40 +0x9c
net/http.HandlerFunc.ServeHTTP(0x1400009de90, {0x1033bc360, 0x1400007a410}, 0x14000318100)
	/usr/local/go/src/net/http/server.go:2046 +0x40
github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerCounter.func1({0x102b48f08, 0x14000334000}, 0x14000318100)
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/promhttp/instrument_server.go:101 +0xb4
net/http.HandlerFunc.ServeHTTP(0x14000294030, {0x102b48f08, 0x14000334000}, 0x14000318100)
	/usr/local/go/src/net/http/server.go:2046 +0x40
main.(*handler).ServeHTTP(0x14000034200, {0x102b48f08, 0x14000334000}, 0x14000318100)
	/Users/matto/go/src/github.com/prometheus/node_exporter/node_exporter.go:80 +0x43c
net/http.(*ServeMux).ServeHTTP(0x102f05dc0, {0x102b48f08, 0x14000334000}, 0x14000318100)
	/usr/local/go/src/net/http/server.go:2424 +0x18c
net/http.serverHandler.ServeHTTP({0x1400028e0e0}, {0x102b48f08, 0x14000334000}, 0x14000318100)
	/usr/local/go/src/net/http/server.go:2878 +0x444
net/http.(*conn).serve(0x1400030a3c0, {0x102b4c2d8, 0x140002941e0})
	/usr/local/go/src/net/http/server.go:1929 +0xb6c
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:3033 +0x4b8

goroutine 9 [IO wait]:
internal/poll.runtime_pollWait(0x1033fe748, 0x72)
	/usr/local/go/src/runtime/netpoll.go:234 +0xa4
internal/poll.(*pollDesc).wait(0x14000314118, 0x72, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x38
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x14000314100, {0x1400007c371, 0x1, 0x1})
	/usr/local/go/src/internal/poll/fd_unix.go:167 +0x1dc
net.(*netFD).Read(0x14000314100, {0x1400007c371, 0x1, 0x1})
	/usr/local/go/src/net/fd_posix.go:56 +0x44
net.(*conn).Read(0x14000010048, {0x1400007c371, 0x1, 0x1})
	/usr/local/go/src/net/net.go:183 +0x4c
net/http.(*connReader).backgroundRead(0x1400007c360)
	/usr/local/go/src/net/http/server.go:672 +0x50
created by net/http.(*connReader).startBackgroundRead
	/usr/local/go/src/net/http/server.go:668 +0xc4

goroutine 15 [semacquire]:
sync.runtime_Semacquire(0x14000024354)
	/usr/local/go/src/runtime/sema.go:56 +0x38
sync.(*WaitGroup).Wait(0x14000024354)
	/usr/local/go/src/sync/waitgroup.go:130 +0xa4
github.com/prometheus/client_golang/prometheus.(*Registry).Gather.func2(0x14000024354, 0x140000ad0e0, 0x140000ad140)
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/registry.go:463 +0x28
created by github.com/prometheus/client_golang/prometheus.(*Registry).Gather
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/registry.go:462 +0x4e4

goroutine 16 [semacquire]:
sync.runtime_Semacquire(0x14000024368)
	/usr/local/go/src/runtime/sema.go:56 +0x38
sync.(*WaitGroup).Wait(0x14000024360)
	/usr/local/go/src/sync/waitgroup.go:130 +0xa4
github.com/prometheus/node_exporter/collector.NodeCollector.Collect({0x1400009d1d0, {0x102b3de80, 0x14000034100}}, 0x140000ad0e0)
	/Users/matto/go/src/github.com/prometheus/node_exporter/collector/collector.go:156 +0x110
github.com/prometheus/client_golang/prometheus.(*Registry).Gather.func1()
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/registry.go:446 +0xe4
created by github.com/prometheus/client_golang/prometheus.(*Registry).Gather
	/Users/matto/go/pkg/mod/github.com/prometheus/client_golang@v1.11.0/prometheus/registry.go:538 +0xa24

r0      0x0
r1      0x10296c074
r2      0x1400035ac50
r3      0x102f06340
r4      0x3b0
r5      0x140000838f0
r6      0x102f35d1e
r7      0x1
r8      0x1ea1e2000
r9      0x18b3ede9c
r10     0x1031444c0
r11     0x14000020b20
r12     0x1
r13     0x16d977890
r14     0x200
r15     0xb3
r16     0x18b00d698
r17     0xfbf80
r18     0x0
r19     0x1400035ad78
r20     0x7d
r21     0x3
r22     0x1400028e264
r23     0x8
r24     0xffffffffffffffa2
r25     0x102a9db00
r26     0x102b366e8
r27     0x860
r28     0x102f06340
r29     0x16d977880
lr      0x18b00d77c
sp      0x16d977880
pc      0x18b1d1e00
fault   0x18b1d1e00
splmatto (author) commented:

Note: resorted to building from source due to the "EXC_BAD_ACCESS (Code Signature Invalid)" error reported in #2217.

splmatto (author) commented:

Narrowing this down a bit:

./node_exporter --collector.disable-defaults --collector.thermal
ts=2021-11-25T03:55:41.371Z caller=node_exporter.go:182 level=info msg="Starting node_exporter" version="(version=1.3.0, branch=master, revision=9fbb56c9c80d5486d6de5e06a224a1eb343b28e7)"
ts=2021-11-25T03:55:41.371Z caller=node_exporter.go:183 level=info msg="Build context" build_context="(go=go1.17.3, user=matto@m1minimatto01, date=20211125-03:40:37)"
ts=2021-11-25T03:55:41.371Z caller=node_exporter.go:108 level=info msg="Enabled collectors"
ts=2021-11-25T03:55:41.371Z caller=node_exporter.go:115 level=info collector=thermal
ts=2021-11-25T03:55:41.371Z caller=node_exporter.go:199 level=info msg="Listening on" address=:9100
ts=2021-11-25T03:55:41.372Z caller=tls_config.go:195 level=info msg="TLS is disabled." http2=false
SIGTRAP: trace trap
PC=0x18b1d1e00 m=5 sigcode=0
...

Among the other Darwin-specific and platform-independent collectors listed at https://github.com/prometheus/node_exporter#enabled-by-default, thermal appears to be the only one that panics.

SuperQ added the bug label Nov 25, 2021

SuperQ commented Nov 25, 2021

CC @STRRL

kmahyyg commented Nov 25, 2021

At the request of @STRRL, I tested with my M1 MacBook Air, using the latest build of the master branch compiled with CGO_ENABLED=1 make, then ran ./node_exporter and curl http://localhost:9100. All works fine, with no error message.

Go 1.17.2 ARM64
macOS 12.0.1

you06 commented Nov 25, 2021

@kmahyyg I can reproduce it by curl http://localhost:9100/metrics

kmahyyg commented Nov 25, 2021

> @kmahyyg I can reproduce it by curl http://localhost:9100/metrics

Confirmed. Only GET / won't trigger the bug. GET /metrics will.

Log files here:

log1.log

Thanks to @you06


STRRL commented Nov 26, 2021

Hi @SuperQ @splmatto, I am afraid I cannot resolve this bug soon, and I have no Apple M1 device, so I cannot easily profile this issue.

Maybe we could disable this collector by default on darwin/arm64, but I did not find any configuration or compiler flag for doing that. Could you help me with that? Am I missing something?

splmatto (author) commented:

splmatto commented Nov 26, 2021

It looks like on the M1, IOPMCopyCPUPowerStatus returns kIOReturnNotFound and the resulting CFDictionaryRef is nil, so the CFRelease call at [1] causes the panic.

After applying the following:

diff --git a/collector/thermal_darwin.go b/collector/thermal_darwin.go
index 282ca3f..25673dc 100644
--- a/collector/thermal_darwin.go
+++ b/collector/thermal_darwin.go
@@ -47,9 +47,10 @@ import "C"
 import (
        "errors"
        "fmt"
+       "unsafe"
+
        "github.com/go-kit/log"
        "github.com/prometheus/client_golang/prometheus"
-       "unsafe"
 )
 
 type thermCollector struct {
@@ -118,7 +119,9 @@ func (c *thermCollector) Update(ch chan<- prometheus.Metric) error {
 func fetchCPUPowerStatus() (map[string]int, error) {
        cfDictRef, _ := C.FetchThermal()
        defer func() {
-               C.CFRelease(C.CFTypeRef(cfDictRef.ref))
+               if cfDictRef.ref != 0x0 {
+                       C.CFRelease(C.CFTypeRef(cfDictRef.ref))
+               }
        }()
./node_exporter --collector.disable-defaults --collector.thermal
ts=2021-11-26T20:24:16.610Z caller=node_exporter.go:182 level=info msg="Starting node_exporter" version="(version=1.3.0, branch=master, revision=9fbb56c9c80d5486d6de5e06a224a1eb343b28e7)"
ts=2021-11-26T20:24:16.610Z caller=node_exporter.go:183 level=info msg="Build context" build_context="(go=go1.17.3, user=matto@m1minimatto01, date=20211126-20:22:49)"
ts=2021-11-26T20:24:16.610Z caller=node_exporter.go:108 level=info msg="Enabled collectors"
ts=2021-11-26T20:24:16.610Z caller=node_exporter.go:115 level=info collector=thermal
ts=2021-11-26T20:24:16.610Z caller=node_exporter.go:199 level=info msg="Listening on" address=:9100
ts=2021-11-26T20:24:16.610Z caller=tls_config.go:195 level=info msg="TLS is disabled." http2=false
curl -s0 http://localhost:9100/metrics | grep -v '#' | grep node_
node_exporter_build_info{branch="master",goversion="go1.17.3",revision="9fbb56c9c80d5486d6de5e06a224a1eb343b28e7",version="1.3.0"} 1
node_scrape_collector_duration_seconds{collector="thermal"} 0.000246667
node_scrape_collector_success{collector="thermal"} 0

[1]

C.CFRelease(C.CFTypeRef(cfDictRef.ref))
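The guard in the diff above can be modeled in plain Go without cgo (a portable sketch with illustrative names; the real fix guards C.CFRelease on cfDictRef.ref):

```go
package main

import (
	"errors"
	"fmt"
)

// fetchStatus models a copy-style C API that can legitimately return a
// zero handle together with an error, the way IOPMCopyCPUPowerStatus
// returns kIOReturnNotFound with a nil CFDictionaryRef on Apple Silicon.
func fetchStatus(available bool) (uintptr, error) {
	if !available {
		return 0, errors.New("not found")
	}
	return 0x1, nil
}

// release models CFRelease, which traps when handed a NULL reference,
// so callers must guard before calling it.
func release(ref uintptr) {
	if ref == 0 {
		panic("release of nil ref") // this is the SIGTRAP in the issue
	}
}

func fetchCPUPowerStatus() error {
	ref, err := fetchStatus(false) // simulate the M1 behavior
	defer func() {
		if ref != 0 { // the fix: only release a real reference
			release(ref)
		}
	}()
	if err != nil {
		return fmt.Errorf("fetching CPU power status failed: %w", err)
	}
	return nil
}

func main() {
	if err := fetchCPUPowerStatus(); err != nil {
		fmt.Println("collector error:", err)
	}
}
```

With the guard in place the error surfaces as an ordinary collector failure (node_scrape_collector_success 0, as in the output above) instead of killing the whole process.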

splmatto (author) commented:

FWIW

pmset -g therm
Note: No thermal warning level has been recorded
Note: No performance warning level has been recorded
Note: No CPU power status has been recorded

https://opensource.apple.com/source/PowerManagement/PowerManagement-572.50.1/pmset/pmset.c.auto.html


STRRL commented Nov 27, 2021

Thanks to @splmatto ❤️, I just noticed these notes in man pmset:

     -g therm shows thermal conditions that affect CPU speed. Not available on all platforms.
     -g thermlog shows a log of thermal notifications that affect CPU speed. Not available on all platforms.

I am not sure whether "Not available on all platforms" means it cannot work on Apple M1.

The documentation for IOPMCopyCPUPowerStatus mentions kIOReturnNotFound, but I do not yet handle it properly:

https://developer.apple.com/documentation/iokit/1557079-iopmcopycpupowerstatus

I think I will make a PR soon.

SuperQ added a commit that referenced this issue Dec 1, 2021
* [BUGFIX] Handle nil CPU thermal power status on M1 #2218
* [BUGFIX] bsd: Ignore filesystems flagged as MNT_IGNORE. #2227
* [BUGFIX] Sanitize UTF-8 in dmi collector #2229

Signed-off-by: Ben Kochie <superq@gmail.com>
SuperQ mentioned this issue Dec 1, 2021
discordianfish (Member) commented:

Should be closed by #2225 - lemme know if this is still an issue

SuperQ added two commits that referenced this issue Dec 4, 2021, with the same changelog entries as above.
SuperQ mentioned this issue Dec 4, 2021
oblitorum pushed two commits to shatteredsilicon/node_exporter that referenced this issue Apr 9, 2024, with the same changelog entries as above.

6 participants