This test was done using 9.2.0-SNAPSHOT with the sub-process runtime, Elastic Agent self-monitoring, and the system integration enabled on Ubuntu 24.04.
The memory query in Fleet is not including all components, possibly because Elastic Agent is not reporting memory usage for all of them.
I see 239 MB in Fleet for an agent with monitoring enabled and the system integration installed.
Linux shows 503.4 MB as the correct amount:
ubuntu@oriented-leafcutter:~/elastic-agent-9.2.0-SNAPSHOT-linux-arm64$ systemctl status elastic-agent
● elastic-agent.service - Elastic Agent is a unified agent to observe, monitor and protect your system.
Loaded: loaded (/etc/systemd/system/elastic-agent.service; enabled; preset: enabled)
Active: active (running) since Thu 2025-10-02 16:30:06 EDT; 17min ago
Main PID: 78095 (elastic-agent)
Tasks: 50 (limit: 1059)
Memory: 503.4M (peak: 771.8M)
CPU: 15.144s
CGroup: /system.slice/elastic-agent.service
├─78095 elastic-agent
├─78167 /opt/Elastic/Agent/data/elastic-agent-9.2.0-SNAPSHOT-e917e0/components/agentbeat filebeat -E se>
├─78174 /opt/Elastic/Agent/data/elastic-agent-9.2.0-SNAPSHOT-e917e0/components/agentbeat metricbeat -E >
├─78186 /opt/Elastic/Agent/data/elastic-agent-9.2.0-SNAPSHOT-e917e0/components/agentbeat metricbeat -E >
├─78192 /opt/Elastic/Agent/data/elastic-agent-9.2.0-SNAPSHOT-e917e0/components/agentbeat metricbeat -E >
└─78197 /opt/Elastic/Agent/data/elastic-agent-9.2.0-SNAPSHOT-e917e0/components/agentbeat filebeat -E se>
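As a rough cross-check of the systemd figure, summing the RSS of the agent process tree lands in the same ballpark. This is only a sketch: the PIDs are the ones from the output above, and the cgroup number also counts kernel memory and page cache, so it will not match exactly:
# Sum the resident set size (KiB) of elastic-agent and its agentbeat children, printed in MiB.
ps -o rss= -p 78095,78167,78174,78186,78192,78197 | awk '{sum+=$1} END {printf "%.1f MB\n", sum/1024}'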
Looking at the agent metrics dashboards, there are no memory values reported for the beat/metrics-monitoring or http/metrics-monitoring components, which I believe explains this.
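To confirm that from the raw monitoring data rather than the dashboard, an aggregation along these lines should only return memory values for the components that actually report them. The index pattern and field names are my assumption of what the Fleet memory query reads (metrics-elastic_agent.* and system.process.memory.size), so treat this as a sketch:
# Average reported memory per component for this agent. ES_URL, ES_USER and ES_PASS are
# placeholders; the field names are assumed, adjust as needed.
curl -s -u "$ES_USER:$ES_PASS" "$ES_URL/metrics-elastic_agent.*/_search" \
  -H 'Content-Type: application/json' -d '{
  "size": 0,
  "query": { "term": { "elastic_agent.id": "5ebb2ff5-36e8-42bb-92b9-c12c970cf13e" } },
  "aggs": {
    "components": {
      "terms": { "field": "component.id" },
      "aggs": { "avg_memory": { "avg": { "field": "system.process.memory.size" } } }
    }
  }
}'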
Looking at the underlying data, the http/metrics and beat/metrics inputs are healthy, but scraping the metrics endpoint is failing to collect data:
{
  "_source": {
    "@timestamp": "2025-10-02T20:40:18.916Z",
    "agent": {
      "ephemeral_id": "a8158123-4ae9-4d25-bcf5-2da6ab67f966",
      "id": "5ebb2ff5-36e8-42bb-92b9-c12c970cf13e",
      "name": "oriented-leafcutter",
      "type": "metricbeat",
      "version": "9.2.0"
    },
    "component": {
      "binary": "metricbeat",
      "id": "http/metrics-monitoring"
    },
    "data_stream": {
      "dataset": "elastic_agent.metricbeat",
      "namespace": "default",
      "type": "metrics"
    },
    "ecs": {
      "version": "8.0.0"
    },
    "elastic_agent": {
      "id": "5ebb2ff5-36e8-42bb-92b9-c12c970cf13e",
      "process": "metricbeat",
      "snapshot": true,
      "version": "9.2.0"
    },
    "error": {
      "message": "error making http request: Get \"http://unix/stats\": dial unix /opt/Elastic/Agent/data/tmp/akSPbdqgaHaTY0_J01-dsfYK6JpMz2zn.sock: connect: no such file or directory"
    }
  }
},
{
  "_source": {
    "@timestamp": "2025-10-02T20:39:18.926Z",
    "agent": {
      "ephemeral_id": "a8158123-4ae9-4d25-bcf5-2da6ab67f966",
      "id": "5ebb2ff5-36e8-42bb-92b9-c12c970cf13e",
      "name": "oriented-leafcutter",
      "type": "metricbeat",
      "version": "9.2.0"
    },
    "component": {
      "binary": "metricbeat",
      "id": "beat/metrics-monitoring"
    },
    "data_stream": {
      "dataset": "elastic_agent.metricbeat",
      "namespace": "default",
      "type": "metrics"
    },
    "ecs": {
      "version": "8.0.0"
    },
    "elastic_agent": {
      "id": "5ebb2ff5-36e8-42bb-92b9-c12c970cf13e",
      "process": "metricbeat",
      "snapshot": true,
      "version": "9.2.0"
    },
    "error": {
      "message": "error making http request: Get \"http://unix/stats\": dial unix /opt/Elastic/Agent/data/tmp/Hk6rvk9TDibMPcDvpl0jkLE-qDsHWVYL.sock: connect: no such file or directory"
    }
  }
}
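Both failures point at the monitoring Metricbeat scraping the component's local HTTP endpoint over a unix socket that does not exist on disk. That can be checked on the host by listing the sockets under the agent's tmp directory and replaying the scrape with curl (socket names are the ones from the errors above and are regenerated per run):
# Check whether the sockets the monitoring config points at actually exist.
sudo ls -l /opt/Elastic/Agent/data/tmp/
# Replay the scrape Metricbeat attempts; curl 7.40+ supports --unix-socket.
sudo curl --unix-socket /opt/Elastic/Agent/data/tmp/akSPbdqgaHaTY0_J01-dsfYK6JpMz2zn.sock http://unix/stats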