Too much RAM consumption on specific ARM hardware (odroid-hc2 from hardkernel) #893

Closed
MHCraftbeer opened this issue Apr 26, 2020 · 63 comments


@MHCraftbeer

Bug Report Details

Describe the bug
Memory consumption in the Docker container keeps growing until synchronization no longer works.

Application and Operating System Details:

  • OS: Armbian Buster Server (4.14) with openmediavault 5 on odroid-hc2 from hardkernel
  • headless system with onedrive in docker (default dockerfile-rpi)
  • OneDrive Account Type: personal
  • Application configuration: default
  • fs.inotify.max_user_watches increased to 100,000
  • Curl Version: curl 7.64.0 (arm-unknown-linux-gnueabihf) libcurl/7.64.0 OpenSSL/1.1.1d zlib/1.2.11 libidn2/2.0.5 libpsl/0.20.2 (+libidn2/2.0.5) libssh2/1.8.0 nghttp2/1.36.0 librtmp/2.3
    Release-Date: 2019-02-06
    Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp
    Features: AsynchDNS IDN IPv6 Largefile GSS-API Kerberos SPNEGO NTLM NTLM_WB SSL libz TLS-SRP HTTP2 UnixSockets HTTPS-proxy PSL
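For reference, the inotify watch limit mentioned above can be checked (and raised) like this; the 100,000 value is the one from this report, not a recommendation:

```shell
# Check the current inotify watch limit:
cat /proc/sys/fs/inotify/max_user_watches
# To raise it persistently (requires root; value is illustrative):
#   echo 'fs.inotify.max_user_watches=100000' | sudo tee -a /etc/sysctl.conf
#   sudo sysctl -p
```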

To Reproduce
Steps to reproduce the behavior if not causing an application crash:

  1. Docker build
  2. Docker run
  3. Watch the feast on memory

Additional context
The odroid-hc2 only has 2 GB of memory. After a few hours the onedrive Docker container uses 500 MB of RAM. After 24 hours it uses 1 GB and keeps slowly increasing until synchronization no longer works. In this time only a few small files have been synchronized.
I have 40 GB of OneDrive space with 4,000 directories and 30,000 files. Synchronization works until too much RAM is used.

Any ideas?
I tried to reduce the usable memory of the Docker container with -m, but that only shortened the time until synchronization stopped working.
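For context, limiting container memory as tried above looks roughly like this (the image and container names are examples, not necessarily what the reporter used):

```shell
# Hard-cap the container at 500 MB and disable extra swap, so the kernel
# OOM-kills the client rather than letting it thrash (names are examples):
docker run -d --name onedrive -m 500m --memory-swap 500m driveone/onedrive
# Watch per-container memory from the host:
docker stats --no-stream onedrive
```

Note that `-m` does not stop a leak; it only bounds how far it can grow before the container is killed or starts failing, which matches the shortened time-to-failure observed here.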

@abraunegg
Owner

@MHCraftbeer
Please can you run a process profiler (see https://github.com/abraunegg/onedrive/wiki/Generate-system-performance-data-for-performance-related-issues for ideas) to see which process is consuming memory, as this does sound rather odd. The information you have posted above also does not indicate which application version you are running. Can you please provide that.

I also do not use Docker myself.

@nrandon, @jkt628, @pauliacomi, @mnagaku - any suggestions / ideas here to diagnose / troubleshoot?
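As a minimal stand-in for a full profiler, resident memory can be sampled periodically with `ps`. The loop below samples a placeholder `sleep` process for demonstration; substituting the onedrive client's PID gives a growth curve over time:

```shell
# Append timestamp,pid,rss(KB) samples to a CSV; the sleep process is a
# stand-in -- substitute the onedrive client's PID in practice.
sleep 30 &
PID=$!
for i in 1 2 3; do
    RSS=$(ps -o rss= -p "$PID" | tr -d ' ')
    echo "$(date +%s),$PID,$RSS" >> onedrive-rss.csv
    sleep 1
done
kill "$PID" 2>/dev/null
cat onedrive-rss.csv
```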

@MHCraftbeer
Author

MHCraftbeer commented Apr 26, 2020

Thank you very much for your reply!

  • I am using the current application version, since I built the Docker image a few days ago
  • I currently run the container with reduced memory (-m 500m):
MemFree:           47732 kB
MemAvailable:     753516 kB
Buffers:           37848 kB
Cached:           698184 kB
SwapCached:            0 kB
Active:          1151748 kB
Inactive:         606228 kB
Active(anon):    1049356 kB
Inactive(anon):    37032 kB
Active(file):     102392 kB
Inactive(file):   569196 kB
Unevictable:        1472 kB
Mlocked:            1472 kB
HighTotal:       1288192 kB
HighFree:           2216 kB
LowTotal:         755628 kB
LowFree:           45516 kB
SwapTotal:             0 kB
SwapFree:              0 kB
Dirty:                44 kB
Writeback:             0 kB
AnonPages:       1023612 kB
Mapped:            70144 kB
Shmem:             64552 kB
Slab:             185044 kB
SReclaimable:      95016 kB
SUnreclaim:        90028 kB
KernelStack:        3528 kB
PageTables:         6964 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:     1021908 kB
Committed_AS:    2684076 kB
VmallocTotal:     245760 kB
VmallocUsed:           0 kB
VmallocChunk:          0 kB
CmaTotal:         131072 kB
CmaFree:             744 kB

[Screenshots: Docker memory usage graphs]

  • since yesterday evening there hasn't been any synchronization
  • unfortunately I can't run perf, because apt-get installs version 4.9 while I am running kernel 4.14

I hope this helps.

@abraunegg
Owner

@MHCraftbeer
A timeout indicates that you have had Internet connectivity issues. The service 'should' re-establish itself.

It also looks like it is not doing much. You may have to increase the verbosity (update the Docker init scripts to include --verbose) to show what it is actually doing. I strongly recommend doing so until you sort out what is going on.

The images you have posted are 'interesting' but also do not help as they are just of the Docker process and not the processes inside Docker. You most likely need to log into the Docker container itself, whilst the client is running, and perform actions from inside the container to look at what the memory utilisation is actually doing and what is consuming what.

This most likely will require you to rebuild a Docker container so that you can use something like valgrind or similar to inspect the running process inside Docker + the onedrive client at the same time. I have no idea however if this is going to be possible given your hardware limited resources.
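Short of a full valgrind rebuild, `/proc` already gives a per-process view from inside the container without extra tooling. A sketch (the `docker exec` target name is an example, and `$$` stands in for the client's PID):

```shell
# From the host, open a shell inside the running container (name is an example):
#   docker exec -it onedrive sh
# Then inspect the client's memory breakdown via /proc ($$ is a placeholder
# for the onedrive client's PID):
PID=$$
grep -E '^(VmSize|VmRSS|VmData)' "/proc/$PID/status" | tee proc-mem.txt
```

VmRSS is the resident set, and VmData the heap/data segment; a steadily growing VmData with stable mappings usually points at an allocation leak rather than library bloat.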

What I will do however, is get a similar file set organised, and run that in monitor mode so that you have at least some comparison available from a resource usage perspective.

@MHCraftbeer
Author

Thanks for your reply!
Yes, the timeout is perfectly fine.
However, since yesterday evening there has been no activity at all, and no synchronization since then.
I uploaded a few test files over the last few hours and none of them has been synchronized.

I will try to rebuild the container with verbose enabled.

It would be very interesting to see how much memory the client normally requires.

@abraunegg
Owner

@MHCraftbeer
Below is from the client after running for ~10 mins and starting to upload 50k files:

[alex@centos7full ~]$ pmap 14584
14584:   ./onedrive --confdir ~/.config/onedrive-business/ --monitor --verbose
0000000000400000   3516K r-x-- onedrive
000000000096e000    308K r---- onedrive
00000000009bb000    492K rw--- onedrive
0000000000a36000      8K rw---   [ anon ]
000000000279c000   2740K rw---   [ anon ]
00007f18f4000000    132K rw---   [ anon ]
00007f18f4021000  65404K -----   [ anon ]
00007f18fc000000    132K rw---   [ anon ]
00007f18fc021000  65404K -----   [ anon ]
00007f1900de3000    160K r-x-- libnsspem.so
00007f1900e0b000   2048K ----- libnsspem.so
00007f190100b000      4K r---- libnsspem.so
00007f190100c000      4K rw--- libnsspem.so
00007f190100d000      8K r-x-- libnsssysinit.so
00007f190100f000   2044K ----- libnsssysinit.so
00007f190120e000      4K r---- libnsssysinit.so
00007f190120f000      4K rw--- libnsssysinit.so
00007f1901210000    524K r-x-- libfreeblpriv3.so
00007f1901293000   2044K ----- libfreeblpriv3.so
00007f1901492000      8K r---- libfreeblpriv3.so
00007f1901494000      4K rw--- libfreeblpriv3.so
00007f1901495000     16K rw---   [ anon ]
00007f1901499000    240K r-x-- libsoftokn3.so
00007f19014d5000   2048K ----- libsoftokn3.so
00007f19016d5000      4K r---- libsoftokn3.so
00007f19016d6000      4K rw--- libsoftokn3.so
00007f19016d7000     20K r-x-- libnss_dns-2.17.so
00007f19016dc000   2048K ----- libnss_dns-2.17.so
00007f19018dc000      4K r---- libnss_dns-2.17.so
00007f19018dd000      4K rw--- libnss_dns-2.17.so
00007f19018de000     48K r-x-- libnss_files-2.17.so
00007f19018ea000   2044K ----- libnss_files-2.17.so
00007f1901ae9000      4K r---- libnss_files-2.17.so
00007f1901aea000      4K rw--- libnss_files-2.17.so
00007f1901aeb000     24K rw---   [ anon ]
00007f1901af1000      4K -----   [ anon ]
00007f1901af2000   8192K rw---   [ anon ]
00007f19022f2000      4K -----   [ anon ]
00007f19022f3000  12288K rw---   [ anon ]
00007f1902ef3000      8K r-x-- libfreebl3.so
00007f1902ef5000   2044K ----- libfreebl3.so
00007f19030f4000      4K r---- libfreebl3.so
00007f19030f5000      4K rw--- libfreebl3.so
00007f19030f6000     32K r-x-- libcrypt-2.17.so
00007f19030fe000   2044K ----- libcrypt-2.17.so
00007f19032fd000      4K r---- libcrypt-2.17.so
00007f19032fe000      4K rw--- libcrypt-2.17.so
00007f19032ff000    184K rw---   [ anon ]
00007f190332d000     16K r-x-- libuuid.so.1.3.0
00007f1903331000   2044K ----- libuuid.so.1.3.0
00007f1903530000      4K r---- libuuid.so.1.3.0
00007f1903531000      4K rw--- libuuid.so.1.3.0
00007f1903532000    240K r-x-- libblkid.so.1.1.0
00007f190356e000   2044K ----- libblkid.so.1.1.0
00007f190376d000     12K r---- libblkid.so.1.1.0
00007f1903770000      4K rw--- libblkid.so.1.1.0
00007f1903771000      4K rw---   [ anon ]
00007f1903772000    112K r-x-- libsasl2.so.3.0.0
00007f190378e000   2044K ----- libsasl2.so.3.0.0
00007f190398d000      4K r---- libsasl2.so.3.0.0
00007f190398e000      4K rw--- libsasl2.so.3.0.0
00007f190398f000     12K r-x-- libkeyutils.so.1.5
00007f1903992000   2044K ----- libkeyutils.so.1.5
00007f1903b91000      4K r---- libkeyutils.so.1.5
00007f1903b92000      4K rw--- libkeyutils.so.1.5
00007f1903b93000     56K r-x-- libkrb5support.so.0.1
00007f1903ba1000   2048K ----- libkrb5support.so.0.1
00007f1903da1000      4K r---- libkrb5support.so.0.1
00007f1903da2000      4K rw--- libkrb5support.so.0.1
00007f1903da3000   2260K r-x-- libcrypto.so.1.0.2k
00007f1903fd8000   2048K ----- libcrypto.so.1.0.2k
00007f19041d8000    112K r---- libcrypto.so.1.0.2k
00007f19041f4000     52K rw--- libcrypto.so.1.0.2k
00007f1904201000     16K rw---   [ anon ]
00007f1904205000    412K r-x-- libssl.so.1.0.2k
00007f190426c000   2048K ----- libssl.so.1.0.2k
00007f190446c000     16K r---- libssl.so.1.0.2k
00007f1904470000     28K rw--- libssl.so.1.0.2k
00007f1904477000    256K r-x-- libmount.so.1.1.0
00007f19044b7000   2048K ----- libmount.so.1.1.0
00007f19046b7000      4K r---- libmount.so.1.1.0
00007f19046b8000      4K rw--- libmount.so.1.1.0
00007f19046b9000      4K rw---   [ anon ]
00007f19046ba000     88K r-x-- libresolv-2.17.so
00007f19046d0000   2044K ----- libresolv-2.17.so
00007f19048cf000      4K r---- libresolv-2.17.so
00007f19048d0000      4K rw--- libresolv-2.17.so
00007f19048d1000      8K rw---   [ anon ]
00007f19048d3000    144K r-x-- libselinux.so.1
00007f19048f7000   2044K ----- libselinux.so.1
00007f1904af6000      4K r---- libselinux.so.1
00007f1904af7000      4K rw--- libselinux.so.1
00007f1904af8000      8K rw---   [ anon ]
00007f1904afa000    384K r-x-- libpcre.so.1.2.0
00007f1904b5a000   2048K ----- libpcre.so.1.2.0
00007f1904d5a000      4K r---- libpcre.so.1.2.0
00007f1904d5b000      4K rw--- libpcre.so.1.2.0
00007f1904d5c000     28K r-x-- libffi.so.6.0.1
00007f1904d63000   2044K ----- libffi.so.6.0.1
00007f1904f62000      4K r---- libffi.so.6.0.1
00007f1904f63000      4K rw--- libffi.so.6.0.1
00007f1904f64000    164K r-x-- libpng15.so.15.13.0
00007f1904f8d000   2048K ----- libpng15.so.15.13.0
00007f190518d000      4K r---- libpng15.so.15.13.0
00007f190518e000      4K rw--- libpng15.so.15.13.0
00007f190518f000     12K r-x-- libgmodule-2.0.so.0.5600.1
00007f1905192000   2044K ----- libgmodule-2.0.so.0.5600.1
00007f1905391000      4K r---- libgmodule-2.0.so.0.5600.1
00007f1905392000      4K rw--- libgmodule-2.0.so.0.5600.1
00007f1905393000     84K r-x-- libz.so.1.2.7
00007f19053a8000   2044K ----- libz.so.1.2.7
00007f19055a7000      4K r---- libz.so.1.2.7
00007f19055a8000      4K rw--- libz.so.1.2.7
00007f19055a9000    328K r-x-- libldap-2.4.so.2.10.7
00007f19055fb000   2048K ----- libldap-2.4.so.2.10.7
00007f19057fb000      8K r---- libldap-2.4.so.2.10.7
00007f19057fd000      4K rw--- libldap-2.4.so.2.10.7
00007f19057fe000     56K r-x-- liblber-2.4.so.2.10.7
00007f190580c000   2044K ----- liblber-2.4.so.2.10.7
00007f1905a0b000      4K r---- liblber-2.4.so.2.10.7
00007f1905a0c000      4K rw--- liblber-2.4.so.2.10.7
00007f1905a0d000     12K r-x-- libcom_err.so.2.1
00007f1905a10000   2044K ----- libcom_err.so.2.1
00007f1905c0f000      4K r---- libcom_err.so.2.1
00007f1905c10000      4K rw--- libcom_err.so.2.1
00007f1905c11000    196K r-x-- libk5crypto.so.3.1
00007f1905c42000   2044K ----- libk5crypto.so.3.1
00007f1905e41000      8K r---- libk5crypto.so.3.1
00007f1905e43000      4K rw--- libk5crypto.so.3.1
00007f1905e44000    868K r-x-- libkrb5.so.3.3
00007f1905f1d000   2044K ----- libkrb5.so.3.3
00007f190611c000     56K r---- libkrb5.so.3.3
00007f190612a000     12K rw--- libkrb5.so.3.3
00007f190612d000    296K r-x-- libgssapi_krb5.so.2.2
00007f1906177000   2048K ----- libgssapi_krb5.so.2.2
00007f1906377000      4K r---- libgssapi_krb5.so.2.2
00007f1906378000      8K rw--- libgssapi_krb5.so.2.2
00007f190637a000    232K r-x-- libnspr4.so
00007f19063b4000   2044K ----- libnspr4.so
00007f19065b3000      4K r---- libnspr4.so
00007f19065b4000      8K rw--- libnspr4.so
00007f19065b6000      8K rw---   [ anon ]
00007f19065b8000     16K r-x-- libplc4.so
00007f19065bc000   2044K ----- libplc4.so
00007f19067bb000      4K r---- libplc4.so
00007f19067bc000      4K rw--- libplc4.so
00007f19067bd000     12K r-x-- libplds4.so
00007f19067c0000   2044K ----- libplds4.so
00007f19069bf000      4K r---- libplds4.so
00007f19069c0000      4K rw--- libplds4.so
00007f19069c1000    160K r-x-- libnssutil3.so
00007f19069e9000   2048K ----- libnssutil3.so
00007f1906be9000     28K r---- libnssutil3.so
00007f1906bf0000      4K rw--- libnssutil3.so
00007f1906bf1000   1168K r-x-- libnss3.so
00007f1906d15000   2048K ----- libnss3.so
00007f1906f15000     20K r---- libnss3.so
00007f1906f1a000      8K rw--- libnss3.so
00007f1906f1c000      8K rw---   [ anon ]
00007f1906f1e000    144K r-x-- libsmime3.so
00007f1906f42000   2044K ----- libsmime3.so
00007f1907141000     12K r---- libsmime3.so
00007f1907144000      4K rw--- libsmime3.so
00007f1907145000    308K r-x-- libssl3.so
00007f1907192000   2044K ----- libssl3.so
00007f1907391000     16K r---- libssl3.so
00007f1907395000      4K rw--- libssl3.so
00007f1907396000      4K rw---   [ anon ]
00007f1907397000    160K r-x-- libssh2.so.1.0.1
00007f19073bf000   2048K ----- libssh2.so.1.0.1
00007f19075bf000      4K r---- libssh2.so.1.0.1
00007f19075c0000      4K rw--- libssh2.so.1.0.1
00007f19075c1000    200K r-x-- libidn.so.11.6.11
00007f19075f3000   2044K ----- libidn.so.11.6.11
00007f19077f2000      4K r---- libidn.so.11.6.11
00007f19077f3000      4K rw--- libidn.so.11.6.11
00007f19077f4000   1800K r-x-- libc-2.17.so
00007f19079b6000   2048K ----- libc-2.17.so
00007f1907bb6000     16K r---- libc-2.17.so
00007f1907bba000      8K rw--- libc-2.17.so
00007f1907bbc000     20K rw---   [ anon ]
00007f1907bc1000     84K r-x-- libgcc_s-4.8.5-20150702.so.1
00007f1907bd6000   2044K ----- libgcc_s-4.8.5-20150702.so.1
00007f1907dd5000      4K r---- libgcc_s-4.8.5-20150702.so.1
00007f1907dd6000      4K rw--- libgcc_s-4.8.5-20150702.so.1
00007f1907dd7000     28K r-x-- librt-2.17.so
00007f1907dde000   2044K ----- librt-2.17.so
00007f1907fdd000      4K r---- librt-2.17.so
00007f1907fde000      4K rw--- librt-2.17.so
00007f1907fdf000   1028K r-x-- libm-2.17.so
00007f19080e0000   2044K ----- libm-2.17.so
00007f19082df000      4K r---- libm-2.17.so
00007f19082e0000      4K rw--- libm-2.17.so
00007f19082e1000     92K r-x-- libpthread-2.17.so
00007f19082f8000   2044K ----- libpthread-2.17.so
00007f19084f7000      4K r---- libpthread-2.17.so
00007f19084f8000      4K rw--- libpthread-2.17.so
00007f19084f9000     16K rw---   [ anon ]
00007f19084fd000      8K r-x-- libdl-2.17.so
00007f19084ff000   2048K ----- libdl-2.17.so
00007f19086ff000      4K r---- libdl-2.17.so
00007f1908700000      4K rw--- libdl-2.17.so
00007f1908701000   1104K r-x-- libglib-2.0.so.0.5600.1
00007f1908815000   2044K ----- libglib-2.0.so.0.5600.1
00007f1908a14000      4K r---- libglib-2.0.so.0.5600.1
00007f1908a15000      4K rw--- libglib-2.0.so.0.5600.1
00007f1908a16000      4K rw---   [ anon ]
00007f1908a17000    316K r-x-- libgobject-2.0.so.0.5600.1
00007f1908a66000   2044K ----- libgobject-2.0.so.0.5600.1
00007f1908c65000      4K r---- libgobject-2.0.so.0.5600.1
00007f1908c66000      4K rw--- libgobject-2.0.so.0.5600.1
00007f1908c67000   1620K r-x-- libgio-2.0.so.0.5600.1
00007f1908dfc000   2048K ----- libgio-2.0.so.0.5600.1
00007f1908ffc000     20K r---- libgio-2.0.so.0.5600.1
00007f1909001000     12K rw--- libgio-2.0.so.0.5600.1
00007f1909004000      8K rw---   [ anon ]
00007f1909006000    152K r-x-- libgdk_pixbuf-2.0.so.0.3612.0
00007f190902c000   2048K ----- libgdk_pixbuf-2.0.so.0.3612.0
00007f190922c000      4K r---- libgdk_pixbuf-2.0.so.0.3612.0
00007f190922d000      4K rw--- libgdk_pixbuf-2.0.so.0.3612.0
00007f190922e000     28K r-x-- libnotify.so.4.0.0
00007f1909235000   2044K ----- libnotify.so.4.0.0
00007f1909434000      4K r---- libnotify.so.4.0.0
00007f1909435000      4K rw--- libnotify.so.4.0.0
00007f1909436000    708K r-x-- libsqlite3.so.0.8.6
00007f19094e7000   2044K ----- libsqlite3.so.0.8.6
00007f19096e6000      8K r---- libsqlite3.so.0.8.6
00007f19096e8000     12K rw--- libsqlite3.so.0.8.6
00007f19096eb000    408K r-x-- libcurl.so.4.3.0
00007f1909751000   2044K ----- libcurl.so.4.3.0
00007f1909950000      8K r---- libcurl.so.4.3.0
00007f1909952000      4K rw--- libcurl.so.4.3.0
00007f1909953000      4K rw---   [ anon ]
00007f1909954000    136K r-x-- ld-2.17.so
00007f1909a31000     32K rw-s- items.sqlite3-shm
00007f1909a39000     32K rw-s- items.sqlite3-shm
00007f1909a41000     28K r--s- gconv-modules.cache
00007f1909a48000   1116K rw---   [ anon ]
00007f1909b63000     72K rw---   [ anon ]
00007f1909b75000      4K r---- ld-2.17.so
00007f1909b76000      4K rw--- ld-2.17.so
00007f1909b77000      4K rw---   [ anon ]
00007ffc40d5f000    132K rw---   [ stack ]
00007ffc40dbd000      8K r-x--   [ anon ]
ffffffffff600000      4K r-x--   [ anon ]
 total           282496K
top - 19:34:37 up 12 days,  3:24,  4 users,  load average: 0.00, 0.03, 0.05
Tasks: 224 total,   2 running, 222 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.0 us,  0.0 sy,  0.0 ni, 99.9 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem :  2046420 total,    83944 free,   408976 used,  1553500 buff/cache
KiB Swap:  4194300 total,  4124668 free,    69632 used.  1313220 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                                                                                                                  
14584 alex      20   0  282752  14740   9260 S   0.7  0.7   0:08.13 onedrive    

Very lightweight thus far.

@abraunegg
Owner

@MHCraftbeer
After ~2hrs:

top - 22:20:49 up 12 days,  6:10,  4 users,  load average: 0.08, 0.04, 0.05
Tasks: 222 total,   1 running, 221 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.1 us,  0.1 sy,  0.0 ni, 99.8 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem :  2046420 total,    84424 free,   405080 used,  1556916 buff/cache
KiB Swap:  4194300 total,  4124412 free,    69888 used.  1325772 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                                                                                                                  
14584 alex      20   0  285952  18012  10156 S   1.3  0.9   1:52.89 onedrive 
[alex@centos7full ~]$ pmap 14584 | tail -n 1
 total           285960K
[alex@centos7full ~]$ 

@MHCraftbeer
Author

MHCraftbeer commented Apr 26, 2020

@abraunegg
Thank you very much.

After 2 hours:
top - 14:22:01 up 5 days, 12:15,  2 users,  load average: 0.06, 0.27, 0.35
Tasks: 206 total,   1 running, 204 sleeping,   0 stopped,   1 zombie
%Cpu(s):  1.0 us,  0.6 sy,  0.0 ni, 98.1 id,  0.1 wa,  0.0 hi,  0.1 si,  0.0 st
MiB Mem :   1995.9 total,     69.7 free,    841.1 used,   1085.2 buff/cache
MiB Swap:      0.0 total,      0.0 free,      0.0 used.   1036.2 avail Mem

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
26056 markus 20 0 211752 173304 1752 S 5.6 8.5 10:58.40 onedrive

root@odroidxu4:~# pmap 26056 | tail -n 1
 total   246572K


Starting 2 hours ago at ~50 MB, now at ~200 MB.


Is it possible that all network traffic gets stuck in RAM?

@MHCraftbeer
Author

MHCraftbeer commented Apr 26, 2020

@abraunegg
BTW I have the openmediavault plugin flashmemory (folder2ram) installed.
https://github.com/OpenMediaVault-Plugin-Developers/openmediavault-flashmemory

It "manages temporary filesystems across reboots, to decrease writes on permanent storage. This allows the installation of OMV on [...] SD cards"

Could this be the issue?

@MHCraftbeer
Author

MHCraftbeer commented Apr 26, 2020

@abraunegg
I think these are the directories that go directly to RAM due to the flashmemory plugin:

#<type>         <mount point>           <options>
tmpfs           /var/log
tmpfs           /var/tmp
tmpfs           /var/lib/openmediavault/rrd
tmpfs           /var/spool
tmpfs           /var/lib/rrdcached/
tmpfs           /var/lib/monit
tmpfs           /var/cache/samba

@MHCraftbeer
Author

MHCraftbeer commented Apr 26, 2020

@abraunegg
Update after 4 hours, now at ~400 MB:

top - 16:16:10 up 5 days, 14:09, 3 users, load average: 1.83, 0.67, 0.50
Tasks: 1 total, 0 running, 1 sleeping, 0 stopped, 0 zombie
%Cpu(s): 3.5 us, 3.2 sy, 0.0 ni, 93.0 id, 0.0 wa, 0.0 hi, 0.3 si, 0.0 st
MiB Mem : 1995.9 total, 51.5 free, 1092.9 used, 851.6 buff/cache
MiB Swap: 0.0 total, 0.0 free, 0.0 used. 782.0 avail Mem

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
26056 markus 20 0 469508 382944 1784 S 0.0 18.7 19:47.21 onedrive


root@odroidxu4:~# pmap 26056
26056:   /usr/local/bin/onedrive --verbose --monitor --confdir /onedrive/conf --syncdir /onedrive/data
0043f000   1792K r-x-- onedrive
00600000     24K r-x-- onedrive
00606000    336K rwx-- onedrive
0065a000   5096K rwx--   [ anon ]
9ac27000 435044K rwx--   [ anon ]
b5500000    132K rwx--   [ anon ]
b5521000    892K -----   [ anon ]
b5612000    328K rwx--   [ anon ]
b5664000     32K rw-s- items.sqlite3-shm
b566c000     12K r-x-- libnss_dns-2.24.so
b566f000     60K ----- libnss_dns-2.24.so
b567e000      4K r-x-- libnss_dns-2.24.so
b567f000      4K rwx-- libnss_dns-2.24.so
b5680000     24K r-x-- libnss_files-2.24.so
b5686000     64K ----- libnss_files-2.24.so
b5696000      4K r-x-- libnss_files-2.24.so
b5697000      4K rwx-- libnss_files-2.24.so
b5698000     24K rwx--   [ anon ]
b569e000      4K -----   [ anon ]
b569f000  13312K rwx--   [ anon ]
b639f000     20K r-x-- libffi.so.6.0.4
b63a4000     60K ----- libffi.so.6.0.4
b63b3000      4K r-x-- libffi.so.6.0.4
b63b4000      4K rwx-- libffi.so.6.0.4
b63b5000     48K r-x-- libgpg-error.so.0.21.0
b63c1000     60K ----- libgpg-error.so.0.21.0
b63d0000      4K r-x-- libgpg-error.so.0.21.0
b63d1000      4K rwx-- libgpg-error.so.0.21.0
b63d2000     44K r-x-- libtasn1.so.6.5.3
b63dd000     60K ----- libtasn1.so.6.5.3
b63ec000      4K r-x-- libtasn1.so.6.5.3
b63ed000      4K rwx-- libtasn1.so.6.5.3
b63ee000    184K r-x-- libidn.so.11.6.16
b641c000     64K ----- libidn.so.11.6.16
b642c000      4K r-x-- libidn.so.11.6.16
b642d000      4K rwx-- libidn.so.11.6.16
b642e000    216K r-x-- libp11-kit.so.0.2.0
b6464000     64K ----- libp11-kit.so.0.2.0
b6474000     20K r-x-- libp11-kit.so.0.2.0
b6479000      4K rwx-- libp11-kit.so.0.2.0
b647a000     64K r-x-- libsasl2.so.2.0.25
b648a000     60K ----- libsasl2.so.2.0.25
b6499000      4K r-x-- libsasl2.so.2.0.25
b649a000      4K rwx-- libsasl2.so.2.0.25
b649b000     52K r-x-- libresolv-2.24.so
b64a8000     60K ----- libresolv-2.24.so
b64b7000      4K r-x-- libresolv-2.24.so
b64b8000      4K rwx-- libresolv-2.24.so
b64b9000      8K rwx--   [ anon ]
b64bb000      8K r-x-- libkeyutils.so.1.5
b64bd000     60K ----- libkeyutils.so.1.5
b64cc000      4K r-x-- libkeyutils.so.1.5
b64cd000      4K rwx-- libkeyutils.so.1.5
b64ce000     24K r-x-- libkrb5support.so.0.1
b64d4000     60K ----- libkrb5support.so.0.1
b64e3000      4K r-x-- libkrb5support.so.0.1
b64e4000      4K rwx-- libkrb5support.so.0.1
b64e5000    604K r-x-- libgcrypt.so.20.1.6
b657c000     60K ----- libgcrypt.so.20.1.6
b658b000      4K r-x-- libgcrypt.so.20.1.6
b658c000     16K rwx-- libgcrypt.so.20.1.6
b6590000    288K r-x-- libgmp.so.10.3.2
b65d8000     60K ----- libgmp.so.10.3.2
b65e7000      4K r-x-- libgmp.so.10.3.2
b65e8000      4K rwx-- libgmp.so.10.3.2
b65e9000    180K r-x-- libnettle.so.6.3
b6616000     64K ----- libnettle.so.6.3
b6626000      4K r-x-- libnettle.so.6.3
b6627000      4K rwx-- libnettle.so.6.3
b6628000    156K r-x-- libhogweed.so.4.3
b664f000     60K ----- libhogweed.so.4.3
b665e000      4K r-x-- libhogweed.so.4.3
b665f000      4K rwx-- libhogweed.so.4.3
b6660000   1116K r-x-- libgnutls.so.30.13.1
b6777000     60K ----- libgnutls.so.30.13.1
b6786000     32K r-x-- libgnutls.so.30.13.1
b678e000      4K rwx-- libgnutls.so.30.13.1
b678f000      4K rwx--   [ anon ]
b6790000    952K r-x-- libunistring.so.0.1.2
b687e000     64K ----- libunistring.so.0.1.2
b688e000      8K r-x-- libunistring.so.0.1.2
b6890000      4K rwx-- libunistring.so.0.1.2
b6891000     68K r-x-- libz.so.1.2.8
b68a2000     60K ----- libz.so.1.2.8
b68b1000      4K r-x-- libz.so.1.2.8
b68b2000      4K rwx-- libz.so.1.2.8
b68b3000    192K r-x-- libldap_r-2.4.so.2.10.7
b68e3000     64K ----- libldap_r-2.4.so.2.10.7
b68f3000      4K r-x-- libldap_r-2.4.so.2.10.7
b68f4000      4K rwx-- libldap_r-2.4.so.2.10.7
b68f5000      4K rwx--   [ anon ]
b68f6000     32K r-x-- liblber-2.4.so.2.10.7
b68fe000     60K ----- liblber-2.4.so.2.10.7
b690d000      4K r-x-- liblber-2.4.so.2.10.7
b690e000      4K rwx-- liblber-2.4.so.2.10.7
b690f000      8K r-x-- libcom_err.so.2.1
b6911000     60K ----- libcom_err.so.2.1
b6920000      4K r-x-- libcom_err.so.2.1
b6921000      4K rwx-- libcom_err.so.2.1
b6922000    144K r-x-- libk5crypto.so.3.1
b6946000     60K ----- libk5crypto.so.3.1
b6955000      4K r-x-- libk5crypto.so.3.1
b6956000      4K rwx-- libk5crypto.so.3.1
b6957000      4K rwx--   [ anon ]
b6958000    512K r-x-- libkrb5.so.3.3
b69d8000     64K ----- libkrb5.so.3.3
b69e8000     24K r-x-- libkrb5.so.3.3
b69ee000      8K rwx-- libkrb5.so.3.3
b69f0000    160K r-x-- libgssapi_krb5.so.2.2
b6a18000     64K ----- libgssapi_krb5.so.2.2
b6a28000      4K r-x-- libgssapi_krb5.so.2.2
b6a29000      4K rwx-- libgssapi_krb5.so.2.2
b6a2a000   1084K r-x-- libcrypto.so.1.0.2
b6b39000     64K ----- libcrypto.so.1.0.2
b6b49000     56K r-x-- libcrypto.so.1.0.2
b6b57000     36K rwx-- libcrypto.so.1.0.2
b6b60000     12K rwx--   [ anon ]
b6b63000    216K r-x-- libssl.so.1.0.2
b6b99000     64K ----- libssl.so.1.0.2
b6ba9000      8K r-x-- libssl.so.1.0.2
b6bab000     16K rwx-- libssl.so.1.0.2
b6baf000     44K r-x-- libpsl.so.5.1.1
b6bba000     60K ----- libpsl.so.5.1.1
b6bc9000      4K r-x-- libpsl.so.5.1.1
b6bca000      4K rwx-- libpsl.so.5.1.1
b6bcb000    108K r-x-- libssh2.so.1.0.1
b6be6000     64K ----- libssh2.so.1.0.1
b6bf6000      4K r-x-- libssh2.so.1.0.1
b6bf7000      4K rwx-- libssh2.so.1.0.1
b6bf8000     76K r-x-- librtmp.so.1
b6c0b000     60K ----- librtmp.so.1
b6c1a000      4K r-x-- librtmp.so.1
b6c1b000      4K rwx-- librtmp.so.1
b6c1c000    124K r-x-- libidn2.so.0.1.4
b6c3b000     64K ----- libidn2.so.0.1.4
b6c4b000      4K r-x-- libidn2.so.0.1.4
b6c4c000      4K rwx-- libidn2.so.0.1.4
b6c4d000     88K r-x-- libnghttp2.so.14.12.3
b6c63000     60K ----- libnghttp2.so.14.12.3
b6c72000      4K r-x-- libnghttp2.so.14.12.3
b6c73000      8K rwx-- libnghttp2.so.14.12.3
b6c75000    868K r-x-- libc-2.24.so
b6d4e000     60K ----- libc-2.24.so
b6d5d000      8K r-x-- libc-2.24.so
b6d5f000      4K rwx-- libc-2.24.so
b6d60000     12K rwx--   [ anon ]
b6d63000     96K r-x-- libgcc_s.so.1
b6d7b000     60K ----- libgcc_s.so.1
b6d8a000      4K r-x-- libgcc_s.so.1
b6d8b000      4K rwx-- libgcc_s.so.1
b6d8c000    412K r-x-- libm-2.24.so
b6df3000     60K ----- libm-2.24.so
b6e02000      4K r-x-- libm-2.24.so
b6e03000      4K rwx-- libm-2.24.so
b6e04000     68K r-x-- libpthread-2.24.so
b6e15000     60K ----- libpthread-2.24.so
b6e24000      4K r-x-- libpthread-2.24.so
b6e25000      4K rwx-- libpthread-2.24.so
b6e26000      8K rwx--   [ anon ]
b6e28000     20K r-x-- librt-2.24.so
b6e2d000     60K ----- librt-2.24.so
b6e3c000      4K r-x-- librt-2.24.so
b6e3d000      4K rwx-- librt-2.24.so
b6e3e000      8K r-x-- libdl-2.24.so
b6e40000     60K ----- libdl-2.24.so
b6e4f000      4K r-x-- libdl-2.24.so
b6e50000      4K rwx-- libdl-2.24.so
b6e51000    616K r-x-- libsqlite3.so.0.8.6
b6eeb000     60K ----- libsqlite3.so.0.8.6
b6efa000      8K r-x-- libsqlite3.so.0.8.6
b6efc000      4K rwx-- libsqlite3.so.0.8.6
b6efd000      4K rwx--   [ anon ]
b6efe000    312K r-x-- libcurl.so.4.4.0
b6f4c000     60K ----- libcurl.so.4.4.0
b6f5b000      8K r-x-- libcurl.so.4.4.0
b6f5d000      4K rwx-- libcurl.so.4.4.0
b6f5e000     96K r-x-- ld-2.24.so
b6f78000     32K rwx--   [ anon ]
b6f83000      8K rwx--   [ anon ]
b6f85000      4K r-x-- ld-2.24.so
b6f86000      4K rwx-- ld-2.24.so
bef82000    132K rw---   [ stack ]
befc2000      4K r-x--   [ anon ]
befc3000      4K r----   [ anon ]
befc4000      4K r-x--   [ anon ]
ffff0000      4K r-x--   [ anon ]
 total   469512K

@MHCraftbeer
Author

MHCraftbeer commented Apr 26, 2020

@abraunegg
Little update:
It seems that every time this happens:

2020-04-26T16:52:37.847138217Z Uploading new items of .
2020-04-26T16:53:00.841153181Z Applying changes of Path ID: 2AAD7C5690710C29!107
2020-04-26T16:56:49.825227488Z Sync with OneDrive is complete

with a few minutes between "Applying changes..." and "Sync complete", the used RAM increases by ~30 MB.

Any ideas?

@abraunegg
Owner

BTW I have the openmediavault plugin flashmemory (folder2ram) installed.
https://github.com/OpenMediaVault-Plugin-Developers/openmediavault-flashmemory

It "manages temporary filesystems across reboots, to decrease writes on permanent storage. This allows the installation of OMV on [...] SD cards"

Could this be the issue?

Sorry, no idea. You would be best to ask openmediavault.

@abraunegg
Owner

@MHCraftbeer
After leaving it running until this morning (and still uploading):

top - 06:17:00 up 12 days, 14:06,  4 users,  load average: 0.02, 0.02, 0.05
Tasks: 223 total,   1 running, 222 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.2 us,  0.1 sy,  0.0 ni, 99.6 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem :  2046420 total,    70624 free,   438028 used,  1537768 buff/cache
KiB Swap:  4194300 total,  4124412 free,    69888 used.  1313996 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                                                                                                                  
14584 alex      20   0  288512  20584  12716 S   1.0  1.0   6:57.63 onedrive       
[alex@centos7full ~]$ pmap 14584 | tail -n 1
 total           288488K

@MHCraftbeer
Author

@abraunegg do you still have the client running? How much RAM does it use now after >10 hours?

@abraunegg
Owner

@abraunegg
Little update:
It seems that every time this happens:

2020-04-26T16:52:37.847138217Z Uploading new items of .
2020-04-26T16:53:00.841153181Z Applying changes of Path ID: 2AAD7C5690710C29!107
2020-04-26T16:56:49.825227488Z Sync with OneDrive is complete

with a few minutes between "Applying changes..." and "Sync complete", the used RAM increases by ~30 MB.

Any ideas?

At this point no. Will wait for my upload to complete and see what is happening during the same stage for similar data set.

@abraunegg
Owner

@abraunegg do you still have the client running? How much RAM does it use now after >10 hours?

I started the client ~7pm my time; it has now been running for ~11 hrs:

[alex@centos7full ~]$ ps -aufx | grep onedrive
alex     14584  1.0  1.0 288516 20584 pts/1    Sl+  Apr26   7:00  |           \_ ./onedrive --confdir ~/.config/onedrive-business/ --monitor --verbose
alex     21084  0.0  0.0 112708   988 pts/3    S+   06:22   0:00              \_ grep --color=auto onedrive
[alex@centos7full ~]$ date
Mon Apr 27 06:22:18 AEST 2020
[alex@centos7full ~]$ 

@MHCraftbeer
Copy link
Author

MHCraftbeer commented Apr 26, 2020

@abraunegg
This is my log after 7 hours:
(I restarted the docker a few times today)

root@odroidxu4:~# ps -aufx | grep onedrive
markus    9386 12.8 24.1 640852 494452 pts/0   Ds+  16:57  42:42      \_ /usr/local/bin/onedrive --verbose --monitor --confdir /onedrive/conf --syncdir /onedrive/data
root     10575  0.0  0.0   5944   592 pts/0    S+   22:28   0:00          \_ grep onedrive
root@odroidxu4:~#

root@odroidxu4:~# date
Sun Apr 26 22:32:36 CEST 2020
root@odroidxu4:~#

@MHCraftbeer
Copy link
Author

MHCraftbeer commented Apr 26, 2020

Update:
Now the 500 MB limit of the container is full. Please take a look at the timestamps of the log. It is now very slow:

2020-04-26T20:37:18.150171378Z The file has not changed
2020-04-26T20:37:23.191563916Z Processing WP_20150520_001.jpg
2020-04-26T20:37:48.026085389Z The file has not changed
2020-04-26T20:37:51.087211858Z Processing WP_20150517_13_56_08_Pro.jpg
2020-04-26T20:38:20.913390499Z The file has not changed

The CPU is now also >50% occupied.

@abraunegg
Copy link
Owner

abraunegg commented Apr 26, 2020

@MHCraftbeer
I think what you need to do is strip this back to basics to start determining where the fault lies.

  1. Remove all the layers you have doing other 'work' / stuff - you talked about the 'flashmemory' plugin. You need to start stripping things back to see if a change is seen.
  2. Remove running the client inside Docker - what happens when it is not running inside Docker - does it exhibit the same problem?

I will wait to see what my client starts behaving like once all the data is uploaded - and see if I get a similar 30Mb memory increase per cycle. If I do not see any sort of similar situation - avenues of potential action for you are:

  1. Upgrade / Downgrade Docker. Maybe there is a memory leak in that version you are using when compiled for your hardware that is not triggered elsewhere.
  2. Do you have any other hardware to test / try with (non-arm)? Does the same situation occur with that when you are running the client within Docker?

@pauliacomi - As the original author of Dockerfile-rpi can you please provide some details around your Docker images / memory usage for reference.

@MHCraftbeer
Copy link
Author

@abraunegg thanks for the Roadmap and all the effort.
I'll try a docker container on a x86-64 system this week and keep you updated!

@abraunegg
Copy link
Owner

@abraunegg thanks for the Roadmap and all the effort.
I'll try a docker container on a x86-64 system this week and keep you updated!

I would test without Docker first to confirm what the application will do before then testing with Docker - do not jump straight to testing with Docker if you don't know how it is going to act / operate.

@pauliacomi
Copy link

pauliacomi commented Apr 26, 2020

Hi @MHCraftbeer, just a quick comment regarding my setup:

  • running docker on a raspberry pi B2.

  • multiple docker containers with various functions, onedrive is one of them.

  • I have a ~400 Gb onedrive sync with medium use (files modified 30-100 times/day but generally not in bulk). There are small files (1-2 kb), as well as large ones (5-10 Gb).

  • I do not upload files from the client (download only).

  • Currently at a 5 day uptime, memory usage is comfortably around 230 Mb / 1 Gb.

My docker run command is

docker run -it --restart unless-stopped --name onedrive -v /home/pi/.config/onedrive:/onedrive/conf -v "/mnt/exthdd:/onedrive/data" -e ONEDRIVE_VERBOSE=1 -e ONEDRIVE_RESYNC=1 local-onedrive-rpi

so i can use a previously-created config file with some slight changes from default:

download_only="true"
monitor_interval=600
enable_logging="true"
skip_symlinks="true"

I also had no problems with the initial sync, everything hummed along nicely until it was finished after a few days. I've not seen any memory leaks. Admittedly I have not updated the client since I started using docker (if it isn't broke...), so if it is related to any recent changes to the codebase I would not know.

I would definitely try to go with what @abraunegg suggested and try to remove as many variables out of the equation:

  • run the client on another (non-ARM) machine

  • run it non-dockerized

  • disable upload of files

  • do a dry-run only

Hope it helps!

@abraunegg
Copy link
Owner

@MHCraftbeer
Currently syncing every 45 seconds, 50k file dataset, ~30Gb of random data. Current memory usage:

 total           293368K

After another 10 sync process:

 total           293368K

I am not seeing any memory increase, or anything even close to a 30Mb increase per cycle.

I am going to mark this as a local environment issue & unable to replicate.

@MHCraftbeer
Copy link
Author

MHCraftbeer commented Apr 27, 2020

@pauliacomi thank you very much for this insight.

My docker build command was
docker build . -t local-ondrive-stretch -f contrib/docker/Dockerfile-rpi

My docker run command was

docker run -it -m 500M --name=onedrive -e ONEDRIVE_UID=1000 -e ONEDRIVE_GID=100 -e ONEDRIVE_VERBOSE=1 -v /srv/dev-disk-by-label-WD8TB/Markus/Onedrive:/onedrive/data -v /srv/dev-disk-by-label-WD8TB/common/.Appdata/onedrive:/onedrive/conf --restart unless-stopped local-ondrive-stretch:latest

Before I headed off to work I ran the container using your settings. I'll report in the afternoon and start removing variables when I have more time.

@MHCraftbeer
Copy link
Author

MHCraftbeer commented May 4, 2020

In regards to the back ticks - you need to type them in manually 3 x on the first line, then 3 x after the last line.

@abraunegg thank you very much, it works!

What version of the sqlite libraries have you installed?

it seems i don't have any sqlite library installed? since i did not build the client yet?

root@odroidxu4:/srv/dev-disk-by-label-WD8TB/common/Downloads# sqlite

Command 'sqlite' not found, but can be installed with:

apt install sqlite

root@odroidxu4:

should i install it?

@abraunegg
Copy link
Owner

abraunegg commented May 4, 2020

@MHCraftbeer
What about these items:

  • Ensure you are running v2.4.1. Whilst there are no 'memory leak fixes', it is the latest release / codebase, no reason not to be using it based on the 30+ bug fixes between v2.4.0 and v2.4.1
  • Remove the use of 'flashmemory' plugin - ensure it is disabled, removed, not in use.
  • Run the client on a non ARM server - x86_64 with the same data set - clean install, clean configuration.

The next aspect is to run 'valgrind' against a debug version of the client. To do this:

  1. Install 'valgrind' for your OS, and install the appropriate debug symbols for libcurl and libsqlite
  2. Enable debug symbols via adding --enable-debug to ./configure process ... example:
./configure --enable-debug; make;
checking for a BSD-compatible install... /usr/bin/install -c
checking for pkg-config... /usr/bin/pkg-config
checking pkg-config is at least version 0.9.0... yes
checking for dmd... dmd
checking version of D compiler... 2.091.1
checking for curl... yes
checking for sqlite... yes
configure: creating ./config.status
config.status: creating Makefile
config.status: creating contrib/pacman/PKGBUILD
config.status: creating contrib/spec/onedrive.spec
config.status: creating onedrive.1
config.status: creating contrib/systemd/onedrive.service
config.status: creating contrib/systemd/onedrive@.service
if [ -f .git/HEAD ] ; then \
        git describe --tags > version ; \
else \
        echo v2.4.2-dev > version ; \
fi
dmd  -w -g -O -J. -debug -gs -L-lcurl -L-lsqlite3  -L-ldl src/config.d src/itemdb.d src/log.d src/main.d src/monitor.d src/onedrive.d src/qxor.d src/selective.d src/sqlite.d src/sync.d src/upload.d src/util.d src/progress.d -ofonedrive
  3. Run the debug version of the client using 'valgrind' - example below:
valgrind --leak-check=full \
         --show-leak-kinds=all \
         --track-origins=yes \
         --verbose \
         --log-file=valgrind-out.txt \
         ./onedrive/onedrive --confdir '~/.config/onedrive-personal/' --synchronize --verbose

Once 'valgrind' has fully completed, provide the txt file via email.

@abraunegg
Copy link
Owner

abraunegg commented May 4, 2020

@MHCraftbeer

In regards to the back ticks - you need to type them in manually 3 x on the first line, then 3 x after the last line.

@abraunegg thank you very much, it works!

What version of the sqlite libraries have you installed?

it seems i don't have any sqlite library installed? since i did not build the client yet?

root@odroidxu4:/srv/dev-disk-by-label-WD8TB/common/Downloads# sqlite

Command 'sqlite' not found, but can be installed with:

apt install sqlite

root@odroidxu4:

should i install it?

You need to refer to https://github.com/abraunegg/onedrive/blob/master/docs/INSTALL.md and install the dependencies for your OS as per these requirements.

Whilst you are at it, install the debug symbols too.

@MHCraftbeer
Copy link
Author

What about these items:
Ensure you are running v2.4.1. Whilst there are no 'memory leak fixes', it is the latest release / codebase, no reason not to be using it based on the 30+ bug fixes between v2.4.0 and v2.4.1
Remove the use of 'flashmemory' plugin - ensure it is disabled, removed, not in use.
Run the client on a non ARM server - x86_64 with the same data set - clean install, clean configuration.

@abraunegg sorry, up to now i was fully occupied with the previous tasks:

Please provide an output of pmap -X .. that's a capital X as well
Run through the following (see: https://unix.stackexchange.com/questions/36450/how-can-i-find-a-memory-leak-of-a-running-process)
Capture /proc/PID/smaps to a file (before)
Wait some time - maybe a whole sync? 10 mins? 1 hr?
Capture /proc/PID/smaps to a file (after)
Run a diff between the two files diff -u before after
Look for where 'Anonymous' increased and note the memory address
Use GDB to dump the memory process
See what strings are visible in dump file

Tomorrow I will build the client from the current version

@MHCraftbeer
Copy link
Author

MHCraftbeer commented May 5, 2020

@abraunegg I am trying to compile the client on my armhf debian buster server using:

first Dependencies: Raspbian (ARMHF)

sudo apt-get install libcurl4-openssl-dev
sudo apt-get install libsqlite3-dev
sudo apt-get install libxml2
sudo apt-get install pkg-config
wget https://github.com/ldc-developers/ldc/releases/download/v1.16.0/ldc2-1.16.0-linux-armhf.tar.xz
tar -xvf ldc2-1.16.0-linux-armhf.tar.xz

which works perfectly fine.

and second ARMHF Architecture

git clone https://github.com/abraunegg/onedrive.git
cd onedrive
./configure DC=~/ldc2-1.16.0-linux-armhf/bin/ldmd2
make clean; make
sudo make install

Which results in an error regarding the compiler:

root@odroidxu4:~# git clone https://github.com/abraunegg/onedrive.git
Cloning into 'onedrive'...
remote: Enumerating objects: 61, done.
remote: Counting objects: 100% (61/61), done.
remote: Compressing objects: 100% (43/43), done.
remote: Total 4093 (delta 36), reused 26 (delta 18), pack-reused 4032
Receiving objects: 100% (4093/4093), 2.61 MiB | 1.40 MiB/s, done.
Resolving deltas: 100% (2832/2832), done.
root@odroidxu4:~# cd onedrive
root@odroidxu4:~/onedrive# ./configure DC=~/ldc2-1.16.0-linux-armhf/bin/ldmd2
checking for a BSD-compatible install... /usr/bin/install -c
checking for pkg-config... /usr/bin/pkg-config
checking pkg-config is at least version 0.9.0... yes
checking for dmd... /root/ldc2-1.16.0-linux-armhf/bin/ldmd2
checking version of D compiler... /root/ldc2-1.16.0-linux-armhf/bin/ldmd2: error while loading shared libraries: libtinfo.so.5: cannot open shared object file: No such file or directory

configure: error: Compiler version insufficient, current compiler version , minimum version 1.12.0

Do you know why this happens? Am i missing an important step?

@MHCraftbeer
Copy link
Author

Ensure you are running v2.4.1. Whilst there are no 'memory leak fixes', it is the latest release / codebase, no reason not to be using it based on the 30+ bug fixes between v2.4.0 and v2.4.1

Remove the use of 'flashmemory' plugin - ensure it is disabled, removed, not in use.

I was able to check these two tasks by running version 2.4.1 dockerized with flashmemory removed.
It only took 2 hours for the ram to reach 500mb.

@abraunegg
Copy link
Owner

@MHCraftbeer

checking version of D compiler... /root/ldc2-1.16.0-linux-armhf/bin/ldmd2: error while loading shared libraries: libtinfo.so.5: cannot open shared object file: No such file or directory

Your system is missing a symbolic link to this file somewhere. To fix you may need to do the following:

locate libtinfo.so
sudo ln -s /path/to/libtinfo.so.5 /path/to/libtinfo.so
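If only libtinfo.so.6 is present (common on Debian buster), a frequently used workaround is to point the old soname at the newer library. This is an assumption and an unsupported shim, not an official fix; it may mask real ABI differences:

```shell
# Unsupported compatibility shim (assumption): Debian buster ships only
# libtinfo.so.6, while the prebuilt ldc2/ldmd2 binary links against
# libtinfo.so.5. Creating the old soname as a symlink usually satisfies
# the loader for terminfo usage.
sudo ln -s /lib/arm-linux-gnueabihf/libtinfo.so.6 \
           /lib/arm-linux-gnueabihf/libtinfo.so.5
sudo ldconfig   # refresh the dynamic linker cache
```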

@abraunegg
Copy link
Owner

abraunegg commented May 5, 2020

@MHCraftbeer

Ensure you are running v2.4.1. Whilst there are no 'memory leak fixes', it is the latest release / codebase, no reason not to be using it based on the 30+ bug fixes between v2.4.0 and v2.4.1

Remove the use of 'flashmemory' plugin - ensure it is disabled, removed, not in use.

I was able to check these two tasks by running version 2.4.1 dockerized with flashmemory removed.
It only took 2 hours for the ram to reach 500Mb.

OK .. but what does it do without Docker? Hopefully you can compile and get that issue sorted.
500Mb usage on Docker on your hardware 'might' be normal. For a similar data set on a number of different OS (all x86_64), memory usage hovers around 200-300Mb and does not dramatically increase in the way that you're seeing. I cannot test on ARM hardware as I do not have any.

What does it do on other hardware that you own - x86_64 / i686 ?

@MHCraftbeer
Copy link
Author

@abraunegg

Your system is missing a symbolic link to this file somewhere. To fix you may need to do the following:

locate libtinfo.so
sudo ln -s /path/to/libtinfo.so.5 /path/to/libtinfo.so

Unfortunately it does not find "libtinfo.so";

it only finds "libtinfo.so.#" files:

root@odroidxu4:~# locate libtinfo.so
/lib/arm-linux-gnueabihf/libtinfo.so.6
/lib/arm-linux-gnueabihf/libtinfo.so.6.1
/var/lib/docker/overlay2/4254e8be634c047adc051fdae21587f691ca2b3a3ff7bdca3fced12e431c7301/diff/lib/arm-linux-gnueabihf/libtinfo.so.5
/var/lib/docker/overlay2/4254e8be634c047adc051fdae21587f691ca2b3a3ff7bdca3fced12e431c7301/diff/lib/arm-linux-gnueabihf/libtinfo.so.5.9
/var/lib/docker/overlay2/8c7988fef585358c28bf891b1b4a432bedef1e0ab2e40bdbdba40eac7acbd2b0/merged/lib/arm-linux-gnueabihf/libtinfo.so.5
/var/lib/docker/overlay2/8c7988fef585358c28bf891b1b4a432bedef1e0ab2e40bdbdba40eac7acbd2b0/merged/lib/arm-linux-gnueabihf/libtinfo.so.5.9
/var/lib/docker/overlay2/a809d6366177ba0b3b0d09a932376417e1dab5165c041a8105a04d401334a449/merged/lib/arm-linux-gnueabihf/libtinfo.so.5
/var/lib/docker/overlay2/a809d6366177ba0b3b0d09a932376417e1dab5165c041a8105a04d401334a449/merged/lib/arm-linux-gnueabihf/libtinfo.so.5.9
/var/lib/docker/overlay2/a83eab9858d9b7916d278ec50fb9e790272f364ca39ac52e7efa468450e98a56/diff/lib/arm-linux-gnueabihf/libtinfo.so.5
/var/lib/docker/overlay2/a83eab9858d9b7916d278ec50fb9e790272f364ca39ac52e7efa468450e98a56/diff/lib/arm-linux-gnueabihf/libtinfo.so.5.9
/var/lib/docker/overlay2/c8e66b0c1e28554b86848a8577ae5bdde2a7e1344a4b4f2dacd096db70ad509f/diff/lib/arm-linux-gnueabihf/libtinfo.so.5
/var/lib/docker/overlay2/c8e66b0c1e28554b86848a8577ae5bdde2a7e1344a4b4f2dacd096db70ad509f/diff/lib/arm-linux-gnueabihf/libtinfo.so.5.9
/var/lib/docker/overlay2/d0426382c6dab749e111b1469865c7d8a5ad5803bf090e53175d7826e6f8b2e9/diff/lib/arm-linux-gnueabihf/libtinfo.so.5
/var/lib/docker/overlay2/d0426382c6dab749e111b1469865c7d8a5ad5803bf090e53175d7826e6f8b2e9/diff/lib/arm-linux-gnueabihf/libtinfo.so.5.9
root@odroidxu4:~# 

what should i do?
Thank you so much btw!

@abraunegg
Copy link
Owner

@MHCraftbeer
You need to sort this out with whoever supports your Operating System - sorry I cant help there.

@LordPato
Copy link

LordPato commented May 5, 2020

My current status.

Home Server : Intel(R) Core(TM) i7-3635QM CPU @ 2.40GHz
RAM : 16gb
OS : Ubuntu Server 19.10
Using Docker image : driveone/onedrive:latest
Docker version 19.03.8, build afacb8b7f0

For context, I handle a mix of small files (a few kb) and large files (a few gb; maybe this is the reason for the high cache).
Also this particular account it syncs over 800gb.

[screenshot: Capture44]

Last 30 minutes:
[screenshot: Capture45]

Datadog stats since this morning (8 hrs history):
[screenshot: Capture46]

@MHCraftbeer
Copy link
Author

@LordPato thank you very much for the information. Your cache is indeed very high, but at least the used memory is in the range of everyone except me.

@abraunegg thank you so much for your help so far. However, the last few days it became clear to me that I lack the knowledge and the skills to pursue this issue further.

500Mb usage on Docker on your hardware 'might' be normal.

While that might be true, I am not willing to sacrifice much more ram for synchronizing my files.

For now my workaround is to increase the full sync interval, since it is mainly responsible for the memory leak.
Additionally, I will set up a cron job to restart the Docker container every few hours.

I know this result is not satisfying at all (especially for me), but I am afraid more is not possible for me.

@reinketelaars
Copy link

reinketelaars commented May 12, 2020

In my installation there are no real memory issues, although its memory use gradually increases over time (see below).
onedrive v2.4.1, no Docker image, though.
Debian 10.4 - 4GB RAM
I started a 'free -h' every 2 minutes first. Then the client ran for 45 minutes in total. The last entry is after the onedrive client finished. No other cron jobs in the meantime.

rein@mistress8:~$ while true; do free -h; sleep 120; done
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.0Gi       1.2Gi        40Mi       1.3Gi       2.2Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       988Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       972Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       973Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       970Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       971Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       969Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       970Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       967Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       954Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       953Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       953Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       953Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       954Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       962Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       956Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       952Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       951Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       951Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       951Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       951Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       945Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       942Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.1Gi       943Mi        40Mi       1.5Gi       2.1Gi
Swap:         2.4Gi        62Mi       2.4Gi
              total        used        free      shared  buff/cache   available
Mem:          3.5Gi       1.0Gi       995Mi        40Mi       1.5Gi       2.2Gi
Swap:         2.4Gi        62Mi       2.4Gi
MiB Mem :   3607.6 total,    951.0 free,   1116.0 used,   1540.5 buff/cache
MiB Swap:   2504.0 total,   2441.5 free,     62.5 used.   2172.0 avail Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND
 6289 rein      20   0  211116  55492  14368 S   8.3   1.5   6:23.29 onedrive

some 20 minutes later:

MiB Mem :   3607.6 total,    942.3 free,   1122.2 used,   1543.1 buff/cache
MiB Swap:   2504.0 total,   2441.5 free,     62.5 used.   2165.8 avail Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND
 6289 rein      20   0  227500  61236  14368 S  20.9   1.7  10:52.85 onedrive

Hope this helps....

@abraunegg
Copy link
Owner

@MHCraftbeer

Please can you validate the following PR to further diagnose your issue:

git clone https://github.com/abraunegg/onedrive.git
cd onedrive
git fetch origin pull/910/head:pr910
git checkout pr910
./configure; make clean; make;

You will need to then either 'install' or run the updated application binary from the PR folder to validate the fix, or build a specific Docker instance using this PR version.

It would be great to obtain some feedback as to whether this PR improves your situation or not.

@LordPato
Given you are also running ARM + Docker as well, it would be great to get some feedback from yourself as well.

@LordPato
Copy link

LordPato commented May 13, 2020

Sorry. I'm not running ARM. I'm using Intel Core i7

@abraunegg
Copy link
Owner

@LordPato
Whoops .. my bad :) are you able to test however?

@abraunegg
Copy link
Owner

@MHCraftbeer
Any update as to being able to test PR #910 ?

@MHCraftbeer
Copy link
Author

@abraunegg i ran the test PR #910 with fresh docker images:

git clone https://github.com/abraunegg/onedrive.git
cd onedrive
git fetch origin pull/910/head:pr910
git checkout pr910
docker build . -t local-onedrive-stretch -f contrib/docker/Dockerfile-rpi
docker run -it -m 666M --name=onedrive -e ONEDRIVE_UID=1000 -e ONEDRIVE_GID=100 -v /srv/dev-disk-by-label-WD8TB/Markus/Onedrive:/onedrive/data -v /srv/dev-disk-by-label-WD8TB/Markus/.Appdata/onedrive:/onedrive/conf --restart unless-stopped local-onedrive-stretch:latest

Using this config file:

#This file contains the list of supported configuration fields
#with their default values.
#All values need to be enclosed in quotes
#When changing a config option below, remove the '#' from the start of the line
#For explanations of all config options below see docs/USAGE.md or the man pag>
#
# sync_dir = "~/OneDrive"
#skip_file = "~|.~|*.tmp"
#monitor_interval = "600"
#skip_dir = ""
log_dir = "/onedrive/conf/"
#drive_id = ""
#upload_only = "false"
#check_nomount = "false"
#check_nosync = "false"
#download_only = "false"
disable_notifications = "true"
#disable_upload_validation = "false"
#enable_logging = "true"
#force_http_11 = "false"
#force_http_2 = "false"
#local_first = "false"
#no_remote_delete = "false"
skip_symlinks = "true"
#debug_https = "false"
#skip_dotfiles = "false"
#dry_run = "false"
#min_notify_changes = "5"
#monitor_log_frequency = "5"
monitor_fullscan_frequency = "10"
#sync_root_files = "false"
#classify_as_big_delete = "1000"
#user_agent = ""
#remove_source_files = "false"
#skip_dir_strict_match = "false"
#application_id = ""

This is the docker container running:
[screenshot]

This is the ramlog I made using a cron job:

\\2020-05-17 \11:30:01:  total   101540K
\\2020-05-17 \11:35:01:  total   124680K
\\2020-05-17 \11:40:01:  total   179916K
\\2020-05-17 \11:45:01:  total   179916K
\\2020-05-17 \11:50:01:  total   211660K
\\2020-05-17 \11:55:01:  total   211948K
\\2020-05-17 \12:00:01:  total   247776K
\\2020-05-17 \12:05:01:  total   328124K
\\2020-05-17 \12:10:01:  total   328124K
\\2020-05-17 \12:15:01:  total   328124K
\\2020-05-17 \12:20:01:  total   328124K
\\2020-05-17 \12:25:01:  total   372156K
\\2020-05-17 \12:30:01:  total   420924K
\\2020-05-17 \12:35:01:  total   420924K
\\2020-05-17 \12:40:01:  total   420924K
\\2020-05-17 \12:45:01:  total   472684K
\\2020-05-17 \12:50:01:  total   472684K
\\2020-05-17 \12:55:01:  total   525932K
\\2020-05-17 \13:00:01:  total   525932K
\\2020-05-17 \13:05:01:  total   584144K
\\2020-05-17 \13:10:01:  total   584144K
\\2020-05-17 \13:15:01:  total   584144K
\\2020-05-17 \13:20:01:  total   643536K

It restarted a few minutes ago on its own, since the assigned memory limit was reached.
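A logger producing output in the shape of the ramlog above could be driven from cron roughly like this (a sketch with assumed names and paths; the author's actual script is not shown in the thread):

```shell
#!/bin/sh
# Sketch of a cron-driven RAM logger (assumed names and paths).
# Crontab entry, e.g.:
#   */5 * * * * /usr/local/bin/onedrive-ramlog.sh
# (note: '%' must be escaped as '\%' inside a crontab command line).
pid="$(pgrep -o -x onedrive)" || exit 0      # nothing to log if stopped
printf '%s: %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" \
    "$(pmap "$pid" | tail -n 1)" >> /var/log/onedrive-ram.log
```

Note that `pmap` reports the total mapped (virtual) size, the same figure used earlier in this thread; resident usage can differ.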

Unfortunately there is still no improvement. However, thank you so much for your effort @abraunegg!

@norbusan
Copy link
Collaborator

Hmm, that is strange. I have been running onedrive from the respective PR for quite a few hours now, and there is an increase in memory (as reported by onedrive itself):

memory usedSize = ...
1497136
369408
1850912
        ERROR: OneDrive returned an error with the following message:
           Error Message: SSL peer certificate or SSH remote key was not OK on handle 55A4B752>                                
1850992    
        communication reestablished 
1850912 (back to the value before)                                              
        Downloading file Pictures.....
        (many of them)
1851168 

These are all the different numbers I see in my log besides intermediate peaks which occur during the periodic sync, but drop back to the value from before immediately. This is all in monitor mode.

Systemd reports Memory: 764.7M for my OneDrive (13764 files, 1953 directories, 69G)

There is some increase when files are downloaded, though.

So something is strange when running in a docker image it seems.

@tsarna
Copy link

tsarna commented May 18, 2020

So something is strange when running in a docker image it seems.

I can't think of any good explanation for why running in a container would matter, except for differences in libraries from the base image. Alpine for example uses musl instead of glibc, and so has a completely different malloc implementation, etc. You could well expect to see differences in behavior from a program in an alpine-based image and one running outside of a container on glibc.
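One quick data point in that direction is to check which C library is in use, both on the host and inside the container, and compare. A sketch (the container name is a placeholder):

```shell
# Report the C library in the current environment. Run once on the
# host and once inside the container (e.g. `docker exec -it onedrive sh`).
# glibc prints something like "glibc 2.28"; on musl-based images
# (Alpine) getconf is usually absent or fails, hence the fallback.
getconf GNU_LIBC_VERSION 2>/dev/null || echo "not glibc (possibly musl)"
```

If host and container report different libcs (or very different glibc versions), allocator behavior alone could account for divergent memory profiles.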

If both the image and OS being compared are glibc based, there could still be differences from version to version or in any other libraries used.

@abraunegg
Copy link
Owner

@tsarna
Agree with your insights. Unfortunately @MHCraftbeer has been unable to run the application without Docker at this stage. This issue only appears to be limited to:

  • Very specific ARM hardware (odroid-hc2 from hardkernel)
  • Very specific OS + OpenMediaVault + flashmemory (folder2ram)

Given other folk who use Docker (on any architecture) are not seeing this sort of behavior, it leads me to think that there is something 100% environmental on that system which is the contributing factor.

Also, from chatting with folk off here, another potential reason could be OS corruption - given the issues around not being able to run the application outside of Docker, this is a real possibility.

Also, I don't have ARM hardware to test with, but I 100% would like to get to the bottom of this and have folk test as much as possible to provide more data points.

@abraunegg abraunegg changed the title Too much RAM consumption Too much RAM consumption on ARM hardware (odroid-hc2 from hardkernel) which only has 2GB memory May 20, 2020
@abraunegg abraunegg changed the title Too much RAM consumption on ARM hardware (odroid-hc2 from hardkernel) which only has 2GB memory Too much RAM consumption on specific ARM hardware (odroid-hc2 from hardkernel) May 22, 2020
@abraunegg
Copy link
Owner

@MHCraftbeer
Any update from you on this issue?

No other memory issues are being raised - so would like to understand what you are seeing, otherwise I will close this issue ticket as a local environment issue that cannot be replicated.

Please can you advise.

@abraunegg
Copy link
Owner

Closing issue

@github-actions
Copy link

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators May 27, 2021