Since version 0.105 the scan is unbearably slow #590

Open · martin-ms opened this issue May 21, 2022 · 46 comments
@martin-ms commented May 21, 2022

A full scan of $HOME now takes 98 minutes with version 0.105:

----------- SCAN SUMMARY -----------
Known viruses: 8616359
Engine version: 0.105.0
Scanned directories: 6240
Scanned files: 98280
Infected files: 3
Total errors: 8
Data scanned: 22403.29 MB
Data read: 12333.45 MB (ratio 1.82:1)
Time: 5897.640 sec (98 m 17 s)
Start Date: 2022:05:14 10:17:19
End Date:   2022:05:14 11:55:36

After downgrading to 0.104 and performing a scan of the same folder a few minutes later, it completed in 26 minutes:

----------- SCAN SUMMARY -----------
Known viruses: 8616428
Engine version: 0.104.2
Scanned directories: 6240
Scanned files: 97797
Infected files: 3
Data scanned: 17019.32 MB
Data read: 12226.76 MB (ratio 1.39:1)
Time: 1569.143 sec (26 m 9 s)
Start Date: 2022:05:14 09:42:41
End Date:   2022:05:14 10:08:50

That is about four times faster, and the normal duration experienced in the past. The number of files is almost the same between the runs, but the "Data scanned" value (whatever that means) is remarkably different.

With version 0.105 I now also get a "Can't parse data ERROR" on various PNG and PDF files; the same files can be processed with version 0.104. -> will be handled in separate report #593

Any suggestion for getting back the old speed, as known until version 0.104, is appreciated.

@teoberi (Contributor) commented May 22, 2022

I noticed the same thing!
For now, ClamAV remains only a second-choice solution.

@ragusaa (Contributor) commented May 23, 2022

Thank you for letting us know. Is there a particular set of files/file types that is causing the issue? Any sample files you could provide to demonstrate the issue would help in fixing it.

Thanks,
Andy

@martin-ms (Author):

As described, the problem occurs when scanning all of $HOME. I can't tell you which of the >98,000 files is causing the problem; my guess is all of them. The log also doesn't show how long the scan of each file took, but only gives overall statistics, so I can't provide a sample file either. The overall performance is significantly worse with version 0.105 than with version 0.104; I'm sorry, but that's all I can say.

@alext commented May 24, 2022

I also get now a "Can't parse data ERROR" on different PNG and PDF files while scanning with version 0.105, the same files can be processed with version 0.104.

On this point, I've hit what looks to be the same issue, so I've opened a separate issue for it - #593. I'm unable to attach the PDF I'm experiencing the issue with due to sensitive contents.

@martin-ms, if you have any example files that aren't sensitive and are reporting "Can't parse data ERROR", could you attach them to that ticket to help with diagnosis?

@martin-ms (Author):

@alext done

@ragusaa (Contributor) commented May 24, 2022

Thank you for the updates. I understand that you cannot determine which files are causing the issues. I attempted scanning the sample in #593, and it scanned much quicker with 0.105 than with 0.104, so they don't appear related.

I'll let you know when I am able to reproduce the issue.

@micahsnyder (Contributor):

When I was reading this earlier, I initially thought the scan time might be longer because we're now calculating fuzzy hashes for image files. But then we realized there's a much more obvious reason: in 0.105 we increased the default max file-size, max scan-size, etc.

Specifically:

  • MaxFileSize 25M -> 100M
  • MaxScanSize 100M -> 400M
  • StreamMaxLength 25M -> 100M
  • MaxEmbeddedPE 10M -> 40M
  • MaxHTMLNormalize 10M -> 40M
  • MaxHTMLNoTags 2M -> 8M
  • MaxScriptNormalize 5M -> 20M
  • PCREMaxFileSize 25M -> 100M

Ref: #489

@martin-ms what scan options do you use? If you're scanning with the defaults, then it would make a lot of sense that 0.105 is significantly slower. 0.105 will be scanning a lot more files, and a lot more data in those files.
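
For reference, the old values could be pinned in clamd.conf for clamd-based scanning. A minimal sketch using the pre-0.105 defaults listed above (these are all standard clamd.conf options; sizes accept the M suffix):

MaxFileSize 25M
MaxScanSize 100M
StreamMaxLength 25M
MaxEmbeddedPE 10M
MaxHTMLNormalize 10M
MaxHTMLNoTags 2M
MaxScriptNormalize 5M
PCREMaxFileSize 25M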

@martin-ms (Author):

Thank you for taking care of the issue.

I had been using the default settings, but I have now changed the mentioned variables back to the old values. clamconf reports the following non-default values:

Config file: clamd.conf
-----------------------
LogFile = "/var/log/clamav/clamd.log"
LogTime = "yes"
PidFile = "/run/clamav/clamd.pid"
TemporaryDirectory = "/tmp"
LocalSocket = "/run/clamav/clamd.ctl"
StreamMaxLength = "26214400"
User = "clamav"
MaxScanSize = "104857600"
MaxFileSize = "26214400"
MaxEmbeddedPE = "10485760"
MaxHTMLNormalize = "10485760"
MaxHTMLNoTags = "2097152"
MaxScriptNormalize = "5242880"
PCREMaxFileSize = "26214400"

but unfortunately it doesn't change the behavior; the scan still runs much longer than with version 0.104:

----------- SCAN SUMMARY -----------
Known viruses: 8617579
Engine version: 0.105.0
Scanned directories: 6387
Scanned files: 114546
Infected files: 3
Total errors: 8
Data scanned: 22744.92 MB
Data read: 12572.25 MB (ratio 1.81:1)
Time: 5998.697 sec (99 m 58 s)
Start Date: 2022:06:07 10:33:11
End Date:   2022:06:07 12:13:10

Here, for comparison, is the same task a few minutes later with the same settings under version 0.104:

----------- SCAN SUMMARY -----------
Known viruses: 8617620
Engine version: 0.104.2
Scanned directories: 6387
Scanned files: 111993
Infected files: 3
Data scanned: 17411.08 MB
Data read: 12474.23 MB (ratio 1.40:1)
Time: 1710.669 sec (28 m 30 s)
Start Date: 2022:06:07 12:15:43
End Date:   2022:06:07 12:44:13

I don't know if it's important, but with v0.104 I got several
LibClamAV Warning: cli_scanxz: decompress file size exceeds limits - only scanning 27262976 bytes
warnings, which did not appear with v0.105 despite the same settings. Does v0.105 perhaps scan more than defined in the settings, or does it not respect some of them? The "Data scanned" values also differ.

@micahsnyder (Contributor):

Apologies, I should've shared the options for use with clamscan. clamd.conf does not affect the behavior of clamscan; it only affects clamd, in combination with clamdscan and clamonacc.

To get a similar effect with clamscan, you can do something like this:
clamscan --max-filesize=25M --max-scansize=100M --max-embeddedpe=10M --max-htmlnormalize=10M --max-htmlnotags=2M --max-scriptnormalize=5M --pcre-max-filesize=25M /path/to/scan

@martin-ms (Author):

I tried it with the given command-line parameters, but it didn't get significantly faster:

----------- SCAN SUMMARY -----------
Known viruses: 8617586
Engine version: 0.105.0
Scanned directories: 6389
Scanned files: 112078
Infected files: 3
Total errors: 8
Data scanned: 17362.63 MB
Data read: 12453.06 MB (ratio 1.39:1)
Time: 5437.267 sec (90 m 37 s)
Start Date: 2022:06:08 09:29:25
End Date:   2022:06:08 11:00:02

@micahsnyder (Contributor):

@martin-ms Interesting. I'm not sure what to say. We do a bit of performance profiling/monitoring on a selection of file types, but I think we will have to extend that and compare older and newer versions to understand what's going on.

@Devstellar (Contributor):

Abstract booklet CNIC Inflammation Day.pdf

This seems weird.
The uploaded PDF takes 120 seconds in 0.105.1 with defaults. Note the scanned data: 810 MB for a file only 17 MB in size...

root:~# clamscan Abstract\ booklet\ CNIC\ Inflammation\ Day.pdf 
Loading:    10s, ETA:   0s [========================>]    8.64M/8.64M sigs       
Compiling:   3s, ETA:   0s [========================>]       41/41 tasks 

/root/Abstract booklet CNIC Inflammation Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8637607
Engine version: 0.105.1
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 810.32 MB
Data read: 17.62 MB (ratio 45.99:1)
Time: 120.556 sec (2 m 0 s)
Start Date: 2022:09:29 12:49:59
End Date:   2022:09:29 12:51:59

But it only takes 19 seconds in 0.104.4, or in 0.105.1 with the same limits.

root:~# clamscan --max-filesize=25M --max-scansize=100M --max-embeddedpe=10M --max-htmlnormalize=10M --max-htmlnotags=2M --max-scriptnormalize=5M --pcre-max-filesize=25M Abstract\ booklet\ CNIC\ Inflammation\ Day.pdf 
Loading:    10s, ETA:   0s [========================>]    8.64M/8.64M sigs       
Compiling:   3s, ETA:   0s [========================>]       41/41 tasks 

/root/Abstract booklet CNIC Inflammation Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8637607
Engine version: 0.105.1
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 90.37 MB
Data read: 17.62 MB (ratio 5.13:1)
Time: 19.047 sec (0 m 19 s)
Start Date: 2022:09:29 12:49:25
End Date:   2022:09:29 12:49:44
root:~# clamscan Abstract\ booklet\ CNIC\ Inflammation\ Day.pdf 
Loading:    10s, ETA:   0s [========================>]    8.64M/8.64M sigs       
Compiling:   3s, ETA:   0s [========================>]       41/41 tasks 

/root/Abstract booklet CNIC Inflammation Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8637648
Engine version: 0.104.4
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 90.37 MB
Data read: 17.62 MB (ratio 5.13:1)
Time: 19.072 sec (0 m 19 s)
Start Date: 2022:09:29 12:56:19
End Date:   2022:09:29 12:56:38

@teoberi (Contributor) commented Dec 3, 2022

Any news on this issue?
It does not depend on the particular files being scanned; anything larger takes a long time.
Example: Joomla_4.2.5-Stable-Full_Package.tar.gz (size = 24 MB)

clamdscan -m Joomla_4.2.5-Stable-Full_Package.tar.gz
/tmp/Joomla_4.2.5-Stable-Full_Package.tar.gz: OK

----------- SCAN SUMMARY -----------
Infected files: 0
Time: 120.016 sec (2 m 0 s)
Start Date: 2022:12:03 10:44:21
End Date: 2022:12:03 10:46:21

Another antivirus for comparison, with the note that it must load its virus definitions before scanning (that period is included in the total scan time).

SAVScan virus detection utility
Version 5.90.0 [Linux/AMD64]
Virus data version 5.97, November 2022
Includes detection for 79322720 viruses, Trojans and worms
Copyright (c) 1989-2022 Sophos Limited. All rights reserved.

System time 10:46:57 AM, System date 03 December 2022
Command line qualifiers are: -sc -f -di -c -b -all -rec -remove -archive -mime -oe -tnef -pua

Full Scanning

1 file scanned in 27 seconds.
No viruses were discovered.
No PUAs were discovered.
End of Scan.

Pay attention to the number of virus definitions!
ClamAV -> 8815934 signatures
Sophos -> 79322720 signatures

@net1 commented Dec 3, 2022

I have a problem with slow scan times on big PDF files, and I just found that these two options or settings have the most significant effect on scan time.

  • PCREMatchLimit (--pcre-match-limit in command line) default value: 100000?
  • PCRERecMatchLimit (--pcre-recmatch-limit in command line) default value: 5000?
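
For clamd-based setups, the same two limits can be set in clamd.conf; a sketch using the default values quoted above:

PCREMatchLimit 100000
PCRERecMatchLimit 5000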

The scanned file is an email file; I created a signature with fuzzyimg from a JPG image inside the attached PDF:

$ fuzzyimg /tmp/20221203_165750-3-1670055877.msg.8d83562918/3-1670055877.msg.eb940341a8/xxxx.pdf.283ff5d12e/pdf-tmp.35d594783a/pdf01

pdf01: f0e00b0fef9689cc

You can see the different scan times with different scan option adjustments.

Scan with default settings:

$ clamscan 5-1670056633.msg 
Loading:    24s, ETA:   0s [========================>]    8.82M/8.82M sigs       
Compiling:   4s, ETA:   0s [========================>]       42/42 tasks 

5-1670056633.msg: Fuzzy.Spam.PDF.UNOFFICIAL FOUND

----------- SCAN SUMMARY -----------
Known viruses: 8815090
Engine version: 1.0.0
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 1.75 MB
Data read: 0.85 MB (ratio 2.06:1)
Time: 76.165 sec (1 m 16 s)
Start Date: 2022:12:03 17:56:22
End Date:   2022:12:03 17:57:38

Scan with the default values specified explicitly:

$ clamscan --pcre-recmatch-limit=5000 --pcre-match-limit=100000 5-1670056633.msg 
Loading:    28s, ETA:   0s [========================>]    8.82M/8.82M sigs       
Compiling:   4s, ETA:   0s [========================>]       42/42 tasks 

5-1670056633.msg: Fuzzy.Spam.PDF.UNOFFICIAL FOUND

----------- SCAN SUMMARY -----------
Known viruses: 8815090
Engine version: 1.0.0
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 1.75 MB
Data read: 0.85 MB (ratio 2.06:1)
Time: 87.619 sec (1 m 27 s)
Start Date: 2022:12:03 18:01:32
End Date:   2022:12:03 18:02:36

Scan with half of the default values:

$ clamscan --pcre-recmatch-limit=2500 --pcre-match-limit=50000 5-1670056633.msg 
Loading:    26s, ETA:   0s [========================>]    8.82M/8.82M sigs       
Compiling:   4s, ETA:   0s [========================>]       42/42 tasks 

5-1670056633.msg: Fuzzy.Spam.PDF.UNOFFICIAL FOUND

----------- SCAN SUMMARY -----------
Known viruses: 8815090
Engine version: 1.0.0
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 1.75 MB
Data read: 0.85 MB (ratio 2.06:1)
Time: 35.717 sec (0 m 35 s)
Start Date: 2022:12:03 18:02:53
End Date:   2022:12:03 18:03:01

However, I also tested with the bigger PDF file from above (Abstract.booklet.CNIC.Inflammation.Day.pdf).
I was surprised that the PCREMatchLimit / PCRERecMatchLimit settings did not affect the scan time there, so there might be other relevant settings, or the actual scan time was capped by the time limit setting:

LibClamAV debug: cli_unzip: Time limit reached (max: 120000)
LibClamAV debug: Exceeded scan time limit while evaluating logical and yara signatures (max: 120000)
LibClamAV debug: Descriptor[4]: halting after file scan because: Exceeded time limit
LibClamAV debug: Descriptor[3]: halting after file scan because: Exceeded time limit

$ clamscan Abstract.booklet.CNIC.Inflammation.Day.pdf 
Loading:    29s, ETA:   0s [========================>]    8.82M/8.82M sigs       
Compiling:   4s, ETA:   0s [========================>]       42/42 tasks 

Abstract.booklet.CNIC.Inflammation.Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8815090
Engine version: 1.0.0
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 261.20 MB
Data read: 17.62 MB (ratio 14.82:1)
Time: 156.588 sec (2 m 36 s)
Start Date: 2022:12:03 18:35:42
End Date:   2022:12:03 18:38:19
$ clamscan --pcre-recmatch-limit=2500 --pcre-match-limit=50000 Abstract.booklet.CNIC.Inflammation.Day.pdf 
Loading:    24s, ETA:   0s [========================>]    8.82M/8.82M sigs       
Compiling:   7s, ETA:   0s [========================>]       42/42 tasks 

Abstract.booklet.CNIC.Inflammation.Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8815080
Engine version: 1.0.0
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 255.83 MB
Data read: 17.62 MB (ratio 14.52:1)
Time: 154.080 sec (2 m 34 s)
Start Date: 2022:12:03 18:32:05
End Date:   2022:12:03 18:34:39
$ clamscan  --max-filesize=25M --max-scansize=100M --max-embeddedpe=10M --max-htmlnormalize=10M --max-htmlnotags=2M --max-scriptnormalize=5M --pcre-max-filesize=25M --pcre-recmatch-limit=2500 --pcre-match-limit=50000 Abstract.booklet.CNIC.Inflammation.Day.pdf 
Loading:    25s, ETA:   0s [========================>]    8.82M/8.82M sigs       
Compiling:   5s, ETA:   0s [========================>]       42/42 tasks 

Abstract.booklet.CNIC.Inflammation.Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8815090
Engine version: 1.0.0
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 90.37 MB
Data read: 17.62 MB (ratio 5.13:1)
Time: 78.892 sec (1 m 18 s)
Start Date: 2022:12:03 19:14:01
End Date:   2022:12:03 19:15:20

@teoberi (Contributor) commented Dec 4, 2022

The problem affects any service that uses ClamAV, for example Amavis or Squid eCAP; the slow scans easily bring them to a standstill.
What should be done? Should I go back to 0.103.7 or simply disable ClamAV?

@teoberi (Contributor) commented Dec 8, 2022

Scanning using Sophos Protection for Linux (avscanner), the new replacement for Sophos Antivirus for Linux:

time avscanner -ai Abstract.booklet.CNIC.Inflammation.Day.pdf
[10:41:45] Logger av configured for level: INFO

[10:41:45] Archive scanning enabled: yes
[10:41:45] Image scanning enabled: yes
[10:41:45] Following symlinks: no
[10:41:45] Scanning /tmp/Abstract.booklet.CNIC.Inflammation.Day.pdf
[10:41:45] End of Scan Summary:
[10:41:45] 1 file scanned in less than a second.
[10:41:45] 0 files out of 1 were infected.

real 0m0.317s
user 0m0.022s
sys 0m0.019s

@teoberi (Contributor) commented Jan 9, 2023

@martin-ms Interesting. I'm not sure what to say. We do a bit of performance profiling/monitoring on a a selection of file types but I think we will have to extend that and compare older and newer versions to understand what's going on.

Any news on that?

@teoberi (Contributor) commented Jan 11, 2023

From https://www.linuxquestions.org/ Slackware forum:

You are right that scanning of PDFs is slow. A clamscan of the 18 MB AbstractDay.pdf took 3 m 21 s (the first 1 m 22 s for loading the databases). I could see the PDF was extracted as six 28 MB "raw" noname files in the /tmp directory. clamd.conf can be set to disable unpacking (but not scanning) of PDFs.
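
(For reference, that unpacking switch is the ScanPDF option in clamd.conf, with --scan-pdf=yes/no as the clamscan equivalent; a minimal sketch:

ScanPDF no

Note that this skips the PDF-specific parsing and unpacking, while the file itself is still scanned as raw data.)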

@martin-ms (Author):

I tried again with the current version 1.0.0, but I got the same result.

Scanning directories with version 1.0.0:

----------- SCAN SUMMARY -----------
Known viruses: 8651048
Engine version: 1.0.0
Scanned directories: 8009
Scanned files: 107468
Infected files: 3
Data scanned: 26533.77 MB
Data read: 14757.64 MB (ratio 1.80:1)
Time: 6347.488 sec (105 m 47 s)
Start Date: 2023:02:04 09:24:51
End Date:   2023:02:04 11:10:38

Then the same directories with the same options and settings with version 0.104.2:

----------- SCAN SUMMARY -----------
Known viruses: 8651044
Engine version: 0.104.2
Scanned directories: 8009
Scanned files: 107548
Infected files: 3
Data scanned: 20594.58 MB
Data read: 14766.02 MB (ratio 1.39:1)
Time: 1988.306 sec (33 m 8 s)
Start Date: 2023:02:04 11:12:41
End Date:   2023:02:04 11:45:50

It's still more than three times slower. I'll stay with 0.104.2 for now, but may have to look for something else as I can't work with an outdated version forever.

@teoberi (Contributor) commented Feb 4, 2023

Now ClamAV 1.0.0 can only reasonably be used for small files (perhaps under 1 MB); is this by design?
It's fine if you can use another antivirus solution, but in my case, because of systemd, I don't have another option for production servers (e.g. mail, proxy).

@teoberi (Contributor) commented Mar 7, 2023

A commercial alternative that is compatible with ClamAV is IKARUS scan.server.
A quick start guide is available here.

ClamAV interface

Starting with version 1.7.0, IKARUS scan.server supports a ClamAV compatible TCP socket that mimics clamd (default TCP port: 3310). It only supports the scanning of single files and buffers. For further information regarding the use of the interface directly, please read the ClamAV documentation at https://www.clamav.net/documents/scanning#clamd .

It also works as a Unix socket, and version 6.1.7 includes an option to configure socket permissions (very useful).

@teoberi (Contributor) commented May 2, 2023

Still no improvement to the existing problem in version 1.1.0!

@m-sola (Contributor) commented May 4, 2023

To those affected: could you please provide a flamegraph showing where ClamAV is spending more time? This is the best way to show us what you're seeing on your system so we can figure out a fix.

Instructions here: https://docs.clamav.net/manual/Development/performance-profiling.html?highlight=flame#flame-graph-profiling

Ideally, we'd like a flamegraph of the older, more performant version and the latest so we can compare the two.
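
For anyone unfamiliar with the process, the linked instructions boil down to roughly the following (a sketch assuming Linux perf and the stackcollapse-perf.pl / flamegraph.pl scripts from Brendan Gregg's FlameGraph repository; the clamscan arguments are placeholders):

$ perf record -F 100 -g -- clamscan /path/to/scan
$ perf script > out.perf
$ ./stackcollapse-perf.pl out.perf > out.folded
$ ./flamegraph.pl out.folded > flamegraph.svg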

@teoberi (Contributor) commented May 5, 2023

I'm sorry, but I don't have the necessary hardware to perform those operations.
What I and others have noticed is that "Data scanned" has increased, and recent versions of ClamAV only scan small files (under 1 MB) quickly. For any larger file type, the scanning time is very long.
I should mention that I cannot use LLVM for compilation (I have the latest version, 16.0.3) due to compatibility problems.

@martin-ms (Author) commented May 5, 2023

Instructions here: https://docs.clamav.net/manual/Development/performance-profiling.html?highlight=flame#flame-graph-profiling

LibClamAV Error: cl_load(): No such file or directory: clamav.hdb

I don't know where to obtain the missing file; it is not part of the distributed installation package.

[UPDATE]
OK... I found the file in the source archive now and issued
perf record -F 100 -g -- clamscan -d clamav.hdb --allmatch ./test/

and the results are for 0.104.2
[flamegraph image: out-0.104.2]

and for 1.0.1
[flamegraph image: out-1.0.1]

The performance was the same both times. I can't imagine what this test is good for; it doesn't bring any new insights.

@Devstellar (Contributor):

I compiled perf, recompiled ClamAV (1.0.1) with debug symbols, and made some attempts.

root:~# perf script > /tmp/out.perf
dso__load_sym: failed to find program header for symbol: _etext st_value: 0x277c5
root:~# perf record -F 100 -g -- clamscan Abstract\ booklet\ CNIC\ Inflammation\ Day.pdf
Loading:     9s, ETA:   0s [========================>]    8.67M/8.67M sigs       
Compiling:   2s, ETA:   0s [========================>]       41/41 tasks 

/root/Abstract booklet CNIC Inflammation Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8665703
Engine version: 1.0.1
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 472.84 MB
Data read: 17.62 MB (ratio 26.83:1)
Time: 132.078 sec (2 m 12 s)
Start Date: 2023:05:06 20:16:35
End Date:   2023:05:06 20:18:47
[ perf record: Woken up 4 times to write data ]
[ perf record: Captured and wrote 0.875 MB perf.data (13206 samples) ]
root:~# perf script > /tmp/out.perf
dso__load_sym: failed to find program header for symbol: _etext st_value: 0x277c5

I don't know about that symbol error.
I generated the SVG, but I'm not sure if it is worth much.

[flamegraph image]

@martin-ms (Author):

As I already said in a previous comment, I don't know how to handle the -d ./unit_tests/clamav.hdb option from the given example, so I just used

perf record -F 100 -g -- /usr/bin/clamscan -ir $HOME

as the command line with the following results:

----------- SCAN SUMMARY -----------
Known viruses: 8665707
Engine version: 0.104.2
Scanned directories: 8457
Scanned files: 145265
Infected files: 3
Data scanned: 21981.75 MB
Data read: 15944.07 MB (ratio 1.38:1)
Time: 2221.037 sec (37 m 1 s)
Start Date: 2023:05:07 13:22:45
End Date:   2023:05:07 13:59:46
[ perf record: Woken up 55 times to write data ]
[ perf record: Captured and wrote 13,965 MB perf.data (218477 samples) ]

[flamegraph image: v0.104.2]

----------- SCAN SUMMARY -----------
Known viruses: 8665703
Engine version: 1.0.1
Scanned directories: 8462
Scanned files: 145753
Infected files: 3
Data scanned: 28681.03 MB
Data read: 15988.41 MB (ratio 1.79:1)
Time: 6552.762 sec (109 m 12 s)
Start Date: 2023:05:07 14:15:39
End Date:   2023:05:07 16:04:52
[ perf record: Woken up 175 times to write data ]
[ perf record: Captured and wrote 44,184 MB perf.data (656843 samples) ]

[flamegraph image: v1.0.1]

For me it's all useless stuff & wasted time, but for those who like it...

@rma-x commented Jun 21, 2023

This seems weird. The uploaded PDF takes 120 seconds in 0.105.1 with defaults. Note the scanned data: 810 MB for a file only 17 MB in size...

I did some more tests with this PDF file and found that it seems to keep clamscan busy until one of the limits is hit, which can be shown by adding the --alert-exceeds-max=yes switch to the command line.

In 0.103.8 the default MaxFileSize gets hit pretty quickly, and that's why the scan appears to be so fast. When increasing the size limits, the scan runs longer until it hits MaxScanTime.

With newer versions and their higher default size limits, the MaxScanTime limit gets hit when running the engine with defaults, but when increasing the time limit, one of the size limits gets hit as well.

I tried this with size limits of up to 1000 MB and time limits of up to 200 seconds, and the scan never finished regularly with either version.

I think the ClamAV team should have a closer look at this file to see why it is always driving ClamAV to its limits.

[Update] See my correction in the next post [/Update]

$ clamscan --alert-exceeds-max=yes Abstract.booklet.CNIC.Inflammation.Day.pdf
Loading:    12s, ETA:   0s [========================>]    8.67M/8.67M sigs       
Compiling:   3s, ETA:   0s [========================>]       41/41 tasks 

/root/Abstract.booklet.CNIC.Inflammation.Day.pdf: Heuristics.Limits.Exceeded.MaxScanTime FOUND

----------- SCAN SUMMARY -----------
Known viruses: 8669437
Engine version: 1.1.0
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 714.02 MB
Data read: 17.62 MB (ratio 40.52:1)
Time: 136.619 sec (2 m 16 s)
Start Date: 2023:06:21 14:37:08
End Date:   2023:06:21 14:39:25
$ clamscan --alert-exceeds-max=yes Abstract.booklet.CNIC.Inflammation.Day.pdf
/tmp/Abstract.booklet.CNIC.Inflammation.Day.pdf: Heuristics.Limits.Exceeded.MaxFileSize FOUND

----------- SCAN SUMMARY -----------
Known viruses: 8669401
Engine version: 0.103.8
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 168.30 MB
Data read: 17.62 MB (ratio 9.55:1)
Time: 25.319 sec (0 m 25 s)
Start Date: 2023:06:21 14:47:11
End Date:   2023:06:21 14:47:37
$ clamscan --alert-exceeds-max=yes --max-filesize=1000M --max-scansize=1000M Abstract.booklet.CNIC.Inflammation.Day.pdf 
LibClamAV Error: pdf_find_and_extract_objs: Timeout reached in the PDF parser while extracting objects.
/tmp/Abstract.booklet.CNIC.Inflammation.Day.pdf: Heuristics.Limits.Exceeded.MaxScanTime FOUND

----------- SCAN SUMMARY -----------
Known viruses: 8669401
Engine version: 0.103.8
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 758.67 MB
Data read: 17.62 MB (ratio 43.05:1)
Time: 136.556 sec (2 m 16 s)
Start Date: 2023:06:21 14:48:24
End Date:   2023:06:21 14:50:40

@rma-x commented Jun 21, 2023

I have to correct myself regarding the assumed never-ending scan of that file. It turned out that with just a little more resources than I had tried before, the scan does come to an end in both versions, with comparable timings:

$ clamscan --alert-exceeds-max=yes --max-scantime=200000 --max-filesize=1200M --max-scansize=1200M Abstract.booklet.CNIC.Inflammation.Day.pdf 
Loading:    12s, ETA:   0s [========================>]    8.67M/8.67M sigs       
Compiling:   3s, ETA:   0s [========================>]       41/41 tasks 

/root/Abstract.booklet.CNIC.Inflammation.Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8669437
Engine version: 1.1.0
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 1001.11 MB
Data read: 17.62 MB (ratio 56.81:1)
Time: 181.161 sec (3 m 1 s)
Start Date: 2023:06:21 15:22:59
End Date:   2023:06:21 15:26:01
$ clamscan --alert-exceeds-max=yes --max-scantime=200000 --max-filesize=1200M --max-scansize=1200M Abstract.booklet.CNIC.Inflammation.Day.pdf 
/root/Abstract.booklet.CNIC.Inflammation.Day.pdf: OK

----------- SCAN SUMMARY -----------
Known viruses: 8669401
Engine version: 0.103.8
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 1001.11 MB
Data read: 17.62 MB (ratio 56.81:1)
Time: 174.856 sec (2 m 54 s)
Start Date: 2023:06:21 15:33:56
End Date:   2023:06:21 15:36:51

@micahsnyder (Contributor):

@martin-ms thank you for making the flamegraph. Sadly it lacks the debug symbols required to give real insight into what's going on. A debug build of ClamAV would be required (i.e. building with the -g CFLAG).

As @rma-x identified, it seems this particular file is very slow for both versions if you crank up the scan limits.

I'll see if I can do something similar with a flamegraph while scanning the provided Abstract booklet CNIC Inflammation Day.pdf. Maybe it will shed some light on why this file is so slow to scan.
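
For anyone attempting this: with ClamAV's CMake-based build system, something like the following should produce optimized binaries that keep debug symbols (a sketch; RelWithDebInfo is a standard CMake build type that adds -g on GCC/Clang):

$ cmake .. -DCMAKE_BUILD_TYPE=RelWithDebInfo
$ cmake --build .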

@jekv2 commented Aug 18, 2023

I just started using ClamAV yesterday on Windows 10 x64, version 1.1.1.

It takes 55 minutes to scan 62 GB on an M.2 3500 MB/s SSD, with an i5-10600K overclocked to 5 GHz and 32 GB of DDR4 RAM (F4-4000C16-16GTZRA) overclocked to 4,266 MHz; a very fast system, which is why I listed my components.

55 minutes for a 62 GB OS. I have a 22 TB HDD with 8 TB free that needs scanning; imagine how long it would take ClamAV to scan that 22 TB HDD.

I am not trying it.

I am just here to inform you: ClamAV is awfully, unbearably slow.

Here is the command I am using:

clamdscan --multiscan --infected --move /quarantine_directory --log /clamav-directory

@teoberi (Contributor) commented Aug 29, 2023

Nothing new yet, not even in version 1.2.0.

@rma-x commented Oct 9, 2023

As @rma-x identified, it seems this particular file is very slow for both versions if you crank up the scan limits.

I'll see if I can do something similar with a flamegraph while scanning the provided Abstract booklet CNIC Inflammation Day.pdf. Maybe it will shed some light on why this file is so slow to scan.

I had a closer look at that PDF file and found that it consists of 32 images of 2623 × 3651 pixels each, two for each of the 16 pages: one containing the actual content and one the borders and cut marks. When I unpack all these images to 24-bit PPM files, they take up a total of about 900 MB, which might explain why clamscan reports about 1 GB of "Data scanned".

@micahsnyder I hope this can be useful for your further analysis of what ClamAV does with that file that takes so long and whether that's actually needed for proper scanning of the file.
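
(The image inventory can be reproduced with poppler's pdfimages, assuming poppler-utils is installed:

$ pdfimages -list Abstract.booklet.CNIC.Inflammation.Day.pdf

The arithmetic also checks out: 2623 × 3651 pixels at 3 bytes per pixel is roughly 28.7 MB per decoded image, and 32 of those come to roughly 919 MB.)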

@rma-x commented Oct 9, 2023

Additional observations: when I use poppler's pdfseparate to split the PDF into 16 one-page documents, scanning them all together produces results similar to scanning the original file (in time and scan size). But when I use pdfimages followed by convert (ImageMagick) to compose a new PDF from the same images, that new file is about ten times as large as the original, yet has a much smaller scan size and gets scanned much faster:

Data scanned: 198.15 MB
Data read: 186.05 MB (ratio 1.07:1)
Time: 18.512 sec (0 m 18 s)

BTW, the original file is PDF-1.4 whereas convert produces a PDF-1.3 file, if that matters.
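
A sketch of the recomposition steps described above (assuming poppler-utils and ImageMagick; the output names are placeholders):

$ pdfseparate Abstract.booklet.CNIC.Inflammation.Day.pdf page-%d.pdf
$ pdfimages Abstract.booklet.CNIC.Inflammation.Day.pdf img
$ convert img-*.ppm recomposed.pdf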

@martin-ms (Author):

For me it makes no difference if I scan PDFs or not. If I exclude them and all graphics using

--exclude=pdf$ --exclude=jpg$ --exclude=jpeg$ --exclude=png$

then the result with 0.104 is

----------- SCAN SUMMARY -----------
Known viruses: 8674404
Engine version: 0.104.2
Scanned directories: 8386
Scanned files: 154510
Infected files: 3
Data scanned: 10937.06 MB
Data read: 11960.04 MB (ratio 0.91:1)
Time: 1446.615 sec (24 m 6 s)
Start Date: 2023:10:09 16:44:25
End Date:   2023:10:09 17:08:32

and with 1.2.0

---------- SCAN SUMMARY -----------
Known viruses: 8674481
Engine version: 1.2.0
Scanned directories: 8386
Scanned files: 155970
Infected files: 3
Data scanned: 16714.86 MB
Data read: 12012.11 MB (ratio 1.39:1)
Time: 2695.635 sec (44 m 55 s)
Start Date: 2023:10:09 17:13:49
End Date:   2023:10:09 17:58:45

The execution time with version 1.2.0 is still about twice as long as with version 0.104.

Adding to all the misery today is the fact that I'm getting a lot of "LibClamAV Error: cli_html_normalise: style chunk size underflow" error messages that didn't exist in the past. Instead of getting better, it keeps getting worse.

I reported the problem as soon as it occurred. I have little understanding for the developers' inability to work out what the differences are between versions 0.104 and 0.105. They need to know what they changed between the two versions and should start their investigation there. Instead, we as users have been looking for the causes for over a year, testing everything possible, sometimes with enormous effort, but without any tangible results.

I'm too tired for more attempts; my strategy now is to continue using version 0.104 as long as possible and then to avoid ClamAV altogether, because I can't expect any support from the developers. As far as I'm concerned, we can close the bug report; there won't be anything more.

@teoberi (Contributor) commented Oct 10, 2023

For me it makes no difference if I scan PDFs or not. [...] As far as I'm concerned, we can close the bug report; there won't be anything more.

ClamAV seems to have become a difficult project to manage, maybe because of a lack of human resources, financial resources, or both!
The development team contains only six members (Scott Hutton, Andy Ragusa, Dave Raynor, Micah Snyder, Mickey Sola and Ravi Sundriyal)?
Anyway, it has been difficult to manage since the departure of Tomasz Kojm's initial team.

@rma-x commented Oct 10, 2023

For me it makes no difference if I scan PDFs or not. If I exclude them and all graphics using

--exclude=pdf$ --exclude=jpg$ --exclude=jpeg$ --exclude=png$

then the result with 0.104 is

Data scanned: 10937.06 MB

and with 1.2.0

Data scanned: 16714.86 MB

The execution time with version 1.2.0 is still about twice as long as with version 0.104.0.

1.2.0 apparently checks a much larger portion of your data, which of course takes more time, but also has a chance of finding malware that wouldn't have been found before. Did you do these test runs with identical limits for the two versions or with the respective defaults?

BTW, if you want to stick with an older version for now, it might be better to stay with LTS version 0.103 which will be supported for another 11 months, whereas 0.104 and 0.105 are already EOL.

@martin-ms (Author):

Well, the reason for the performance degradation between 0.104 (and older) and 0.105 (and newer) is well known. It is because some default limits got increased:

I am aware of this, and as you can see from #590 (comment), I had set the limits to the values recommended by Micah Snyder. But nothing changed in the result, so that can't be the problem.

Did you do these test runs with identical limits for the two versions or with the respective defaults?

Yes, I only swapped the program version between the runs

BTW, if you want to stick with an older version for now, it might be better to stay with LTS version 0.103 which will be supported for another 11 months, whereas 0.104 and 0.105 are already EOL.

My provider only offers the current program version. I backed up version 0.104 and reinstalled it because it was the last working version before 0.105. I would have to build the installation package for LTS version 0.103 myself from the current sources and would also have to constantly monitor changes. And in 11 months I'll face the same problem again.

@rma-x commented Oct 11, 2023

Well, the reason for the performance degradation between 0.104 (and older) and 0.105 (and newer) is well known. It is because some default limits got increased:

I am aware of this, and as you can see from #590 (comment), I had set the limits to the values recommended by Micah Snyder. But nothing changed in the result, so that can't be the problem.

Yep, I noticed that after my post and therefore revoked it.

Did you do these test runs with identical limits for the two versions or with the respective defaults?

Yes, I only swapped the program version between the runs

Sorry, but it is still not clear to me if you went with the (differing) default limits or forced both versions to use the same limits in your latest performance comparison.

[...] And in 11 months I'll face the same problem again.

Well, I would hope that in 11 months either the issues with the newer versions will have been sorted out, or support for version 0.103 will be extended once more.

@martin-ms (Author):

Sorry, but it is still not clear to me if you went with the (differing) default limits or forced both versions to use the same limits in your latest performance comparison.

In the latest comparison I swapped only the program version and used the respective default limits, but I can repeat it with the values mentioned by Micah Snyder for both runs if that might be useful. But since the first try on Jun 8, 2022 didn't improve the performance, I doubt it would change anything now.

@rma-x commented Oct 11, 2023

Well, using different limits will definitely result in different scan times if a significant number of files in your workload exceed one of the limits from the lower set. So, if you want to show that there are performance differences that are not caused by the limit changes, you always have to use identical limits for the two runs you want to compare.

BTW, the --exclude options you posted above only cover the lower-case variant of the file extensions you want to skip. Have you made sure that there are no uppercase PDF, JPG, etc. files in the tree you are scanning?
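
If it helps, one bracket-expression pattern per extension would cover both cases in a single run (an untested sketch; this avoids relying on any case-insensitivity flag in clamscan's regex handling):

--exclude='[pP][dD][fF]$' --exclude='[jJ][pP][gG]$' --exclude='[jJ][pP][eE][gG]$' --exclude='[pP][nN][gG]$'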

@martin-ms (Author):

OK… then here are the results for both runs with the same limits:

--exclude=pdf$ --exclude=jpg$ --exclude=jpeg$ --exclude=png$ --max-filesize=25M --max-scansize=100M --max-embeddedpe=10M --max-htmlnormalize=10M --max-htmlnotags=2M --max-scriptnormalize=5M --pcre-max-filesize=25M

----------- SCAN SUMMARY -----------
Known viruses: 8674691
Engine version: 0.104.2
Scanned directories: 8386
Scanned files: 154456
Infected files: 3
Data scanned: 10787.44 MB
Data read: 11790.80 MB (ratio 0.91:1)
Time: 1445.337 sec (24 m 5 s)
Start Date: 2023:10:11 13:08:10
End Date:   2023:10:11 13:32:16
----------- SCAN SUMMARY -----------
Known viruses: 8674775
Engine version: 1.2.0
Scanned directories: 8386
Scanned files: 155398
Infected files: 3
Data scanned: 10885.03 MB
Data read: 11824.57 MB (ratio 0.92:1)
Time: 1798.677 sec (29 m 58 s)
Start Date: 2023:10:11 13:34:17
End Date:   2023:10:11 14:04:16

This is not bad at all, only about 5 minutes more. And this is the result with the uppercase file extensions excluded as well, plus the same limits:

--exclude=pdf$ --exclude=jpg$ --exclude=jpeg$ --exclude=png$ --exclude=PDF$ --exclude=JPG$ --exclude=JPEG$ --exclude=PNG$ --max-filesize=25M --max-scansize=100M --max-embeddedpe=10M --max-htmlnormalize=10M --max-htmlnotags=2M --max-scriptnormalize=5M --pcre-max-filesize=25M

---------- SCAN SUMMARY -----------
Known viruses: 8674775
Engine version: 1.2.0
Scanned directories: 8383
Scanned files: 156240
Infected files: 3
Data scanned: 10524.87 MB
Data read: 11482.78 MB (ratio 0.92:1)
Time: 1722.340 sec (28 m 42 s)
Start Date: 2023:10:11 14:06:59
End Date:   2023:10:11 14:35:42

Scanning all files in $HOME without excluding any file types, but with the max limits as above:

----------- SCAN SUMMARY -----------
Known viruses: 8674775
Engine version: 1.2.0
Scanned directories: 8383
Scanned files: 189700
Infected files: 3
Data scanned: 23685.84 MB
Data read: 19768.35 MB (ratio 1.20:1)
Time: 5423.161 sec (90 m 23 s)
Start Date: 2023:10:11 14:41:16
End Date:   2023:10:11 16:11:40

and the same with 0.104:

----------- SCAN SUMMARY -----------
Known viruses: 8674691
Engine version: 0.104.2
Scanned directories: 8383
Scanned files: 190731
Infected files: 3
Data scanned: 23810.29 MB
Data read: 19840.73 MB (ratio 1.20:1)
Time: 2536.135 sec (42 m 16 s)
Start Date: 2023:10:11 17:16:42
End Date:   2023:10:11 17:58:58

So it looks like, after excluding graphics and PDFs, the scanning time is almost the same, but across all files the scan takes about twice as long, and setting the limits has no effect in that case.

In the meantime I was also able to build an installation package from the sources of 0.103.10 LTS, but I haven't tested it for functionality yet.

@tigerfoot:

to follow up.

@teoberi (Contributor) commented Feb 29, 2024

No change in scan speed after removing the bytecode signatures here.

@micahsnyder (Contributor):

By the way, in 1.4 we'll be adding options to disable image scanning and image fuzzy hashing.

For clamscan:

--scan-image[=yes(*)/no]
--scan-image-fuzzy-hash[=yes(*)/no]

for clamd.conf:

ScanImage yes(*)/no
ScanImageFuzzyHash yes(*)/no

Ref: https://github.com/Cisco-Talos/clamav/releases/tag/clamav-1.4.0-rc

This may help for those whole-hard-drive scans where image fuzzy hashing, added in 0.105, was slowing down scans.

You may also want to adjust the max-filesize or max-scansize limits to match 0.104 and older versions.
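
For example, a recursive home-directory scan with image handling disabled and the older limits might look like this (a sketch combining the new 1.4 options with the limit flags from earlier in this thread):

clamscan --scan-image=no --max-filesize=25M --max-scansize=100M -ir $HOME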

@teoberi (Contributor) commented Aug 28, 2024

The scanning speed depends a little on the hardware used, but it is still low. On the old server, scanning the .pdf from here takes 2 minutes; on the new one, it takes 1 minute and 30 seconds.
