Replies: 4 comments 19 replies
-
Test on the manager:
Test on the sensor:
Test on both:
-
Tried to fix the problem by updating to 2.4.60. The update works fine, but the issue is the same...
-
The managersearch logs. I skipped the empty logs and duplicated errors.

[user@managersearch]$ sudo so-log-check
Checking container "so-elastalert"
Reading index mapping 'es_mappings/8/elastalert_error.json'
Checking container "so-dockerregistry"
time="2024-04-05T15:16:55.969668304Z" level=info msg="Purge uploads finished. Num deleted=0, num errors=0"
Checking log file /opt/so/log/kratos/kratos.log
{"audience":"audit","error":{"debug":"","message":"request does not have a valid authentication session","reason":"No active session was found in this request.","stack_trace":"\ngithub.com/ory/kratos/session.(*ManagerHTTP).FetchFromRequest\n\t/go/src/github.com/ory/kratos/session/manager_http.go:236\ngithub.com/ory/kratos/session.(*Handler).whoami\n\t/go/src/github.com/ory/kratos/session/handler.go:215\ngithub.com/ory/kratos/x.(*RouterPublic).Handle.NoCacheHandle.func1\n\t/go/src/github.com/ory/kratos/x/nocache.go:21\ngithub.com/julienschmidt/httprouter.(*Router).ServeHTTP\n\t/go/pkg/mod/github.com/julienschmidt/httprouter@v1.3.0/router.go:387\ngithub.com/ory/nosurf.(*CSRFHandler).handleSuccess\n\t/go/pkg/mod/github.com/ory/nosurf@v1.2.7/handler.go:234\ngithub.com/ory/nosurf.(*CSRFHandler).ServeHTTP\n\t/go/pkg/mod/github.com/ory/nosurf@v1.2.7/handler.go:185\ngithub.com/urfave/negroni.(*Negroni).UseHandler.Wrap.func1\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:46\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\ngithub.com/urfave/negroni.middleware.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\ngithub.com/ory/kratos/x.glob..func1\n\t/go/src/github.com/ory/kratos/x/clean_url.go:15\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\ngithub.com/urfave/negroni.middleware.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\ngithub.com/ory/kratos/cmd/daemon.servePublic.func1\n\t/go/src/github.com/ory/kratos/cmd/daemon/serve.go:111\ngithub.com/urfave/negroni.HandlerFunc.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:29\ngithub.com/urfave/negroni.middleware.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerResponseSize.func1\n\t/go/pkg/mod/github.com/prometheus/client_golang@v1.13.0/prometheus/promhttp/instrument_server.go:284\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerCounter.func1\n\t/go/pkg/mod/github.com/prometheus/client_golang@v1.13.0/prometheus/promhttp/instrument_server.go:142\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerDuration.func1\n\t/go/pkg/mod/github.com/prometheus/client_golang@v1.13.0/prometheus/promhttp/instrument_server.go:92\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerDuration.func2\n\t/go/pkg/mod/github.com/prometheus/client_golang@v1.13.0/prometheus/promhttp/instrument_server.go:104\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerRequestSize.func1\n\t/go/pkg/mod/github.com/prometheus/client_golang@v1.13.0/prometheus/promhttp/instrument_server.go:234\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/ory/x/prometheusx.Metrics.Instrument.Metrics.instrumentHandlerStatusBucket.func1\n\t/go/pkg/mod/github.com/ory/x@v0.0.614/prometheusx/metrics.go:115\nnet/http.HandlerFunc.ServeHTTP\n\t/usr/local/go/src/net/http/server.go:2136\ngithub.com/ory/x/prometheusx.(*MetricsManager).ServeHTTP\n\t/go/pkg/mod/github.com/ory/x@v0.0.614/prometheusx/middleware.go:41\ngithub.com/urfave/negroni.middleware.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38\ngithub.com/ory/x/metricsx.(*Service).ServeHTTP\n\t/go/pkg/mod/github.com/ory/x@v0.0.614/metricsx/middleware.go:272\ngithub.com/urfave/negroni.middleware.ServeHTTP\n\t/go/pkg/mod/github.com/urfave/negroni@v1.0.0/negroni.go:38","status":"Unauthorized","status_code":401},"http_request":{"headers":{"accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8","accept-encoding":"gzip, deflate, br","accept-language":"de,en-US;q=0.7,en;q=0.3","connection":"close","cookie":"Value is sensitive and has been redacted. To see the value set config key \"log.leak_sensitive_values = true\" or environment variable \"LOG_LEAK_SENSITIVE_VALUES=true\".","if-modified-since":"Tue, 19 Mar 2024 18:07:37 GMT","sec-fetch-dest":"document","sec-fetch-mode":"navigate","sec-fetch-site":"none","sec-fetch-user":"?1","upgrade-insecure-requests":"1","user-agent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:124.0) Gecko/20100101 Firefox/124.0","x-forwarded-for":"193.196.188.222","x-forwarded-proto":"https","x-real-ip":"193.196.188.222"},"host":"10.100.79.2","method":"GET","path":"/sessions/whoami","query":null,"remote":"172.17.1.1:60654","scheme":"http"},"level":"info","msg":"No valid session found.","service_name":"Ory Kratos","service_version":"v1.1.0","time":"2024-04-11T13:24:34.420818832Z"}
Checking log file /opt/so/log/influxdb/influxdb.log
ts=2024-04-11T10:25:07.757629Z lvl=info msg="http: TLS handshake error from 127.0.0.1:47532: EOF" log_id=0oMRZ5H0000 service=http
Checking log file /opt/so/log/telegraf/telegraf.log
Checking log file /opt/so/log/logstash/logstash.log
[2024-04-11T13:27:00,801][WARN ][logstash.outputs.redis ] Failed to flush outgoing items {:outgoing_count=>1, :exception=>"Redis::CommandError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/redis-4.8.1/lib/redis/client.rb:162:in
-
Can you restart Logstash and share the whole
-
Version
2.4.40
Installation Method
Security Onion ISO image
Description
configuration
Installation Type
Distributed
Location
on-prem with Internet access
Hardware Specs
Exceeds minimum requirements
CPU
8
RAM
32
Storage for /
70 GB
Storage for /nsm
130 GB
Network Traffic Collection
span port
Network Traffic Speeds
1Gbps to 10Gbps
Status
Yes, all services on all nodes are running OK
Salt Status
No, there are no failures
Logs
Yes, there are additional clues in /opt/so/log/ (please provide detail below)
Detail
Hello there, after a reboot I don't receive any alerts.
The sensor logs everything fine; the manager's logs are below. I also provide some additional information in the comment section. The installed system is distributed, with one managersearch and one sensor.
sudo so-log-check (only the log entries with information):
Checking container "so-elastalert":
...
Reading index mapping 'es_mappings/8/elastalert_error.json'
Checking container "so-elastic-fleet"
...
Error: context canceled
Checking container "so-dockerregistry"
...
time="2024-03-20T10:55:59.20758766Z" level=info msg="Purge uploads finished. Num deleted=0, num errors=0"
Checking log file /opt/so/log/influxdb/influxdb.log
...
ts=2024-03-21T09:33:02.176099Z lvl=info msg="http: TLS handshake error from 127.0.0.1:54448: EOF" log_id=0o1ambp0000 service=http
Checking log file /opt/so/log/telegraf/telegraf.log
2024-03-21T00:01:30Z E! [inputs.exec] Error in plugin: metric parse error: expected field at 1:19: "influxsize kbytes="
2024-03-21T00:02:00Z E! [inputs.exec] Error in plugin: metric parse error: expected field at 1:19: "influxsize kbytes="
2024-03-21T08:59:30Z E! [inputs.logstash] Error in plugin: Get "http://localhost:9600/_node/stats/pipelines": dial tcp 127.0.0.1:9600: connect: connection refused
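An aside on the two `metric parse error` lines: Telegraf is rejecting InfluxDB line protocol that has an empty field value. A line must look like `measurement field=value`, and `influxsize kbytes=` stops exactly where the value should appear. A crude shape check illustrating this (`check_line` is my own helper for illustration, not Telegraf's actual parser):

```shell
# check_line: rough line-protocol shape test (measurement, space, key=value
# pairs). Not the real parser; it just shows why "influxsize kbytes=" fails:
# the regex requires a non-empty value after each '='.
check_line() {
  echo "$1" | grep -Eq '^[^ ]+ [^=, ]+=[^=, ]+(,[^=, ]+=[^=, ]+)*$' \
    && echo valid || echo invalid
}

check_line "influxsize kbytes="      # no value after '=' -> invalid
check_line "influxsize kbytes=1024"  # well-formed -> valid
```

In practice this means whatever produces the `influxsize` measurement emitted an empty number at that moment; the `connection refused` from `inputs.logstash` just says Logstash's stats API was down when Telegraf polled it.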
Checking log file /opt/so/log/logstash/logstash.log
[2024-03-21T09:33:41,305][WARN ][logstash.outputs.redis ] Failed to flush outgoing items {:outgoing_count=>1, :exception=>"Redis::CommandError", :backtrace=>[
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/redis-4.8.1/lib/redis/client.rb:162:in `call'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/redis-4.8.1/lib/redis.rb:270:in `block in send_command'",
  "org/jruby/ext/monitor/Monitor.java:82:in `synchronize'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/redis-4.8.1/lib/redis.rb:269:in `send_command'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/redis-4.8.1/lib/redis/commands/lists.rb:86:in `rpush'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-redis-5.0.0/lib/logstash/outputs/redis.rb:152:in `flush'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/stud-0.0.23/lib/stud/buffer.rb:221:in `block in buffer_flush'",
  "org/jruby/RubyHash.java:1587:in `each'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/stud-0.0.23/lib/stud/buffer.rb:216:in `buffer_flush'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/stud-0.0.23/lib/stud/buffer.rb:159:in `buffer_receive'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-redis-5.0.0/lib/logstash/outputs/redis.rb:209:in `send_to_redis'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-codec-json-3.1.1/lib/logstash/codecs/json.rb:69:in `encode'",
  "/usr/share/logstash/logstash-core/lib/logstash/codecs/delegator.rb:48:in `block in encode'",
  "org/logstash/instrument/metrics/AbstractSimpleMetricExt.java:74:in `time'",
  "org/logstash/instrument/metrics/AbstractNamespacedMetricExt.java:68:in `time'",
  "/usr/share/logstash/logstash-core/lib/logstash/codecs/delegator.rb:47:in `encode'",
  "/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-output-redis-5.0.0/lib/logstash/outputs/redis.rb:123:in `receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:104:in `block in multi_receive'",
  "org/jruby/RubyArray.java:1987:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:104:in `multi_receive'",
  "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121:in `multi_receive'",
  "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:304:in `block in start_workers'"]}
[2024-03-21T09:33:41,305][WARN ][logstash.outputs.redis ] Failed to send backlog of events to Redis {:identity=>"redis://@seconion:6379/0 list:logstash:unparsed", :exception=>#<Redis::CommandError: OOM command not allowed when used memory > 'maxmemory'.>, :backtrace=>[same frames as the warning above]}
Any idea what's going on?
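[Editorial note] The second warning is the most telling line: Redis rejected Logstash's RPUSH with "OOM command not allowed when used memory > 'maxmemory'", so events cannot be queued on the manager, which would explain seeing no alerts. A couple of read-only checks, sketched under the assumption that Redis runs in a container named "so-redis" (as in Security Onion 2.4) and using the queue key `logstash:unparsed` taken from the warning's `:identity`:

```shell
# Compare current Redis memory usage against the configured ceiling; if
# used_memory sits at maxmemory, writes fail with the OOM error above.
sudo docker exec so-redis redis-cli info memory | grep -E 'used_memory_human|maxmemory_human'

# Length of the unparsed-event queue that Logstash pushes into
# (key name taken from the warning message itself).
sudo docker exec so-redis redis-cli llen logstash:unparsed
```

If usage is pinned at the ceiling, raising Redis's maxmemory (Security Onion exposes Redis settings via its configuration pillars; check the current docs for the exact key) or draining the backlog should let events flow again.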
Kind regards
Guidelines