Memory leak -> OOM when serving aptly api #1323

Closed · russelltg opened this issue Aug 2, 2024 · 7 comments · Fixed by #1324

Assignees: neolynx
Labels: bug, please confirm resolved (We believe the issue is resolved! If so, please close the issue, thanks ;-))

Comments

russelltg (Contributor) commented Aug 2, 2024

Detailed Description

The OOM killer killed aptly on our server last night, with this line:

Tasks state (memory values in pages):
[  pid  ]   uid  tgid total_vm      rss pgtables_bytes swapents oom_score_adj name
...
[   1371]  1000  1371  7363621  5559445 55930880   941578             0 aptly
...
Out of memory: Killed process 1371 (aptly) total-vm:29454484kB, anon-rss:22237780kB, file-rss:0kB, shmem-rss:0kB, UID:1000 pgtables:54620kB oom_score_adj:0

By my reading, this means the RSS of aptly was 21.2 GiB (anon-rss: 22237780 kB)! It had been running since July 23, so not terribly long.

I've attached the full logs from the aptly run. I'll check back in a few days to see what the memory usage is to see if it's ongoing.

Let me know if there is any extra info I can provide, thanks!

aptly_oom_log.txt

Your Environment

Ubuntu 22.04, x86_64
aptly version: '1.5.0+162+g8029305d'

neolynx self-assigned this Aug 2, 2024
neolynx added the bug label Aug 2, 2024
neolynx (Member) commented Aug 2, 2024

I have a suspicion: it might be that the tasks are never cleared. Do you see tasks with curl /api/tasks?
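
(For reference, a minimal Go sketch of that check; the listen address below is an assumption, so adjust it to wherever aptly api serve is bound, and any authentication is deployment-specific.)

// check_tasks.go: count the entries returned by aptly's /api/tasks endpoint.
// The base URL is an assumption; point it at your own aptly API instance.
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

func main() {
	resp, err := http.Get("http://localhost:8080/api/tasks")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Decode the response as a generic JSON array; a list that only ever
	// grows here would be consistent with tasks never being cleared.
	var tasks []map[string]interface{}
	if err := json.NewDecoder(resp.Body).Decode(&tasks); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%d task(s) still recorded\n", len(tasks))
}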

neolynx (Member) commented Aug 3, 2024

Indeed, aptly was not deleting internal tasks in sync mode... see the fix in #1324.
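
(For context, a minimal hypothetical sketch in Go of this class of leak, not the actual code from #1324: in synchronous mode every API call records a task, and if finished entries are never removed, the map grows for the lifetime of the process. All names here are made up for illustration.)

// Hypothetical illustration of the leak class, not aptly's actual task code.
package main

import (
	"fmt"
	"sync"
)

type Task struct {
	ID     int
	Output []byte // buffered task output can be large, so retained tasks add up
}

type TaskList struct {
	mu    sync.Mutex
	tasks map[int]*Task
}

func NewTaskList() *TaskList {
	return &TaskList{tasks: make(map[int]*Task)}
}

// RunSync executes fn as a task and, crucially, deletes the record again
// once the result has been handed back to the synchronous API caller.
func (l *TaskList) RunSync(id int, fn func(*Task)) {
	t := &Task{ID: id}
	l.mu.Lock()
	l.tasks[id] = t
	l.mu.Unlock()

	fn(t)

	// Without this delete, every synchronous API request leaks one Task.
	l.mu.Lock()
	delete(l.tasks, id)
	l.mu.Unlock()
}

func main() {
	list := NewTaskList()
	list.RunSync(1, func(t *Task) { t.Output = []byte("publish done") })
	fmt.Println("tasks still retained:", len(list.tasks))
}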

neolynx (Member) commented Aug 3, 2024

Releasing the fix...

neolynx reopened this Aug 3, 2024
neolynx (Member) commented Aug 3, 2024

Please test version 1.5.0+198+g37a9fbe5; you should not see any tasks in /api/tasks anymore.

neolynx added the please confirm resolved label Aug 12, 2024
NeroBurner (Contributor) commented Aug 14, 2024

Tried with aptly_1.5.0+206+g95915480_amd64.deb. The OOM issue is fixed for me and the /api/tasks list is empty.

Note: publishing is done through the API using aptly-publisher 0.12.12-1.

The status of the aptly server is generated with the following commands on the server running aptly-api:

echo "  - date: $(date --iso=seconds)"
echo "  - aptly publishes: $(aptly publish list -raw | wc -l)"
echo "  - aptly snapshots: $(aptly snapshot list -raw | wc -l)"
echo "  - *.ldb files: $(ls -l /var/lib/aptly/db/*.ldb | wc -l)"
echo "  - proc/fd files: $(ls /proc/$(pidof aptly)/fd/ 2>/dev/null | wc -l)"
echo "  - mem used: $(free -h | grep Mem: | awk '{print $3}')"
  • after installing 1.5.0+206+g95915480 we start with 0.6 GiB used memory at 15:52 local time (24-hour format HH:MM)
    • date: 2024-08-14T15:54:14+02:00
    • aptly publishes: 24
    • aptly snapshots: 787
    • *.ldb files: 154
    • proc/fd files: 8
    • mem used: 671Mi
  • after new packages/snapshots and 3 publish 0.6 GiB
    • date: 2024-08-14T15:58:01+02:00
    • aptly publishes: 24
    • aptly snapshots: 835
    • *.ldb files: 154
    • proc/fd files: 8
    • mem used: 672Mi

Previously I was running aptly 1.5.0+162+g8029305d; there the memory usage rose by about 0.1 GiB for every 2 "publishes" and was reset to a normal level by restarting the aptly-api systemd service:

  • running for about half a day with hourly publishes (or more), 1.2 GiB RAM in use
  • after new packages/snapshots and 3 publish 2.1 GiB 13:15
    • aptly snapshots: 691
    • *.ldb files: 113
    • proc/fd files: 8
    • dropped down to 1.4 GiB 13:20
  • restart aptly-api systemd service 0.6 GiB 13:37
    • aptly snapshots: 691
    • *.ldb files: 113
    • proc/fd files: 8
  • after new packages/snapshots and 3 publish 0.8 GiB 14:15
    • aptly snapshots: 739
    • *.ldb files: 153
    • proc/fd files: 8
    • dropped down to 0.7 GiB 14:19
  • after new packages/snapshots and 3 publish 1.0 GiB 15:15
    • aptly snapshots: 787
    • *.ldb files: 154
    • proc/fd files: 8
    • dropped down to 0.8 GiB 15:19

neolynx (Member) commented Aug 14, 2024

Hi @NeroBurner,

thanks for the testing and stats!

The numbers look OK, don't they? Go has a garbage collector, so unused allocated memory is freed periodically; that might explain the variations...
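
(As a generic illustration of that behaviour, not something aptly itself exposes: the Go runtime's own statistics distinguish heap that is actually live from memory the process still holds from the OS, and the gap between the two is why RSS drifts up and down between garbage-collection cycles without indicating a leak.)

// Generic Go example: compare live heap with memory retained from the OS.
package main

import (
	"fmt"
	"runtime"
)

func main() {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	fmt.Printf("heap in use:      %d MiB\n", m.HeapInuse/1024/1024)
	fmt.Printf("heap idle:        %d MiB\n", m.HeapIdle/1024/1024)
	fmt.Printf("obtained from OS: %d MiB\n", m.Sys/1024/1024)
}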

russelltg (Contributor, Author) commented

Looking good to me. After a few weeks I'm only at ~600MiB usage.
