
[20.8.0] Unexpected interrupted return code #335

Closed
tuxmaster5000 opened this issue Sep 11, 2020 · 72 comments
@tuxmaster5000

tuxmaster5000 commented Sep 11, 2020

The scanner itself exits fine, but ospd-openvas interprets this as an error.
ospd-openvas log:

(ospd.ospd) aae6b036-7a1c-47d4-8866-0e07c8799b48: Host scan finished.
(ospd.ospd) aae6b036-7a1c-47d4-8866-0e07c8799b48: Scan interrupted.
(ospd.ospd) aae6b036-7a1c-47d4-8866-0e07c8799b48: Scan stopped with errors.
(ospd.ospd) aae6b036-7a1c-47d4-8866-0e07c8799b48: Scan interrupted.

openvas log:

Vulnerability scan d667271d-11ef-4e68-bc0c-39fa0659e778 finished for host XXXXX in 993.56 seconds
Vulnerability scan d667271d-11ef-4e68-bc0c-39fa0659e778 finished in 1000 seconds: 1 hosts

No error was logged by openvas.

openvas --version

OpenVAS 20.8.0
gvm-libs 20.8.0

ospd version:

20.8.1

Most new code since 2005: (C) 2020 Greenbone Networks GmbH
Nessus origin: (C) 2004 Renaud Deraison deraison@nessus.org
License GPLv2: GNU GPL version 2
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

@jjnicola
Member

Hi @tuxmaster5000,

could you please provide some more information about the scanner configuration? Is it a Full & fast scan config? Any special options for the scanner? Is there anything special about the target (e.g. behind a firewall)? Did you see any plugin timeout errors or closed ports in the report or in the log?

@tuxmaster5000
Author

No, I don't use any special options; it is a simple Full & fast scan with only the default scanner settings. The only noticeable thing is that it doesn't happen on all hosts. No real error is shown on the error page of the report.
Only the "universal error":

Scan process failure.  
Scan process failure.  
Task interrupted unexpectedly

When I run openvas in debug mode, I only see the typical error/warning messages about "Not launching because a mandatory key is missing (this is not an error)" on some tests. But the string "Scan process failure" does not occur in the openvas debug log.

Running grep over the debug log doesn't show any real error either, only things like:

lib nasl: DEBUG:2020-09-11 05h57.22 utc:323: Request => /_errors/
sd main:MESSAGE:2020-09-11 06h17.16 utc:2302: Plugin Policy/policy_BSI-TR-03116-4_error.nasl is deprecated. It will neither be loaded nor launched.
sd main:MESSAGE:2020-09-11 06h27.36 utc:2330: Launching 2014/gb_ajenti_respond_error_mult_xss_vuln.nasl (1.3.6.1.4.1.25623.1.0.804654) against XXXX [30826]
sd main:MESSAGE:2020-09-11 06h27.37 utc:2330: 2014/gb_ajenti_respond_error_mult_xss_vuln.nasl (1.3.6.1.4.1.25623.1.0.804654) [30826] finished its job in 0.147 seconds

If a real error occurred, I would expect openvas to log it in debug mode.

@rolf-d2i

I am experiencing something similar with ospd-openvas. It looks like the same error as above. I don't see how the error below is related to a configuration error. I haven't looked into this in detail, but set_scan_total_hosts looks like a missing method being called from somewhere, which points to a source code issue rather than a configuration error.

OSPD[30973] 2020-10-20 08:30:06,334: ERROR: (ospd.ospd) While scanning: 164a83fe-17e9-42f9-b502-081e3c5b96b4
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/ospd-20.8.1-py3.7.egg/ospd/ospd.py", line 571, in start_scan
self.exec_scan(scan_id)
File "/usr/local/lib/python3.7/dist-packages/ospd_openvas-20.8.0-py3.7.egg/ospd_openvas/daemon.py", line 1351, in exec_scan
self.report_openvas_results(kbdb, scan_id, "")
File "/usr/local/lib/python3.7/dist-packages/ospd_openvas-20.8.0-py3.7.egg/ospd_openvas/daemon.py", line 1123, in report_openvas_results
self.set_scan_total_hosts(scan_id, count_total)
AttributeError: 'OSPDopenvas' object has no attribute 'set_scan_total_hosts'
OSPD[30973] 2020-10-20 08:30:06,336: INFO: (ospd.ospd) 164a83fe-17e9-42f9-b502-081e3c5b96b4: Scan interrupted.

@CipherMonger

Same issue here as well. Details are as follows:

  • OpenVAS version 20.8.0
  • Port List: All IANA assigned TCP
  • Alive Test: Scan Config Default
  • Scanner: OpenVAS Default
  • Scan Config: Full and fast

The report shows some NVT timeouts followed by "Scan Process Failure" and "Task interrupted unexpectedly", but the openvas.log output says:

sd main:MESSAGE:2020-10-26 21h42.55 utc:2791: Vulnerability scan fb1569fe-58ae-443e-8677-1379516cfbc8 finished in 26825 seconds: 2536 hosts

No indication of errors in the log.

@rolf-d2i

I managed to work around my error by patching the source code in ospd_openvas/daemon.py, replacing the except statement on line 1124 with a bare `except:`.
After patching the source code I could actually scan some targets.

@tuxmaster5000
Author

@rolf-d2i, can you post a patch for it? On line 1124 I don't see an except statement:

1120             )
1121 
1122         return total_results
1123 
1124     def report_openvas_timestamp_scan_host(
1125         self, scan_db: ScanDB, scan_id: str, host: str
1126     ):
1127         """ Get start and end timestamp of a host scan from redis kb. """
1128         timestamp = scan_db.get_host_scan_end_time()
1129         if timestamp:
1130             self.add_scan_log(
1131                 scan_id, host=host, name='HOST_END', value=timestamp

@rolf-d2i

The issue I found is related to the release 20.8.1 code processing HOSTS_COUNT (starting at line 1119 in daemon.py), which doesn't work in that branch and version. Ignoring errors from this part of the code was good enough for me; it is more of a hack than a patch.

        # To update total host count
        if msg[0] == 'HOSTS_COUNT':
            try:
                count_total = int(msg[4])
                self.set_scan_total_hosts(scan_id, count_total)
            except:
                logger.debug('Error processing total host count')
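For what it's worth, a narrower version of this workaround would avoid the bare `except:` (which also swallows unrelated bugs) and instead check for the missing method explicitly. The sketch below is hypothetical, under the assumption that `set_scan_total_hosts` may or may not exist depending on the installed ospd version; `handle_hosts_count` is an illustrative name, not a real ospd-openvas function:

```python
import logging

logger = logging.getLogger(__name__)


def handle_hosts_count(self, scan_id, msg):
    """Hypothetical HOSTS_COUNT handler tolerating an older ospd.

    `msg` mirrors the internal message tuple from the snippet above,
    where msg[0] is the message type and msg[4] the host count.
    """
    if msg[0] != 'HOSTS_COUNT':
        return
    try:
        count_total = int(msg[4])
    except (IndexError, ValueError):
        logger.debug('Malformed HOSTS_COUNT message: %r', msg)
        return
    # Released ospd versions (<= 20.8.1) do not provide this method;
    # only the ospd-20.08 branch does.
    if hasattr(self, 'set_scan_total_hosts'):
        self.set_scan_total_hosts(scan_id, count_total)
    else:
        logger.debug('ospd lacks set_scan_total_hosts; skipping update')
```

As noted further down in the thread, the proper fix is to match the ospd and ospd-openvas branches rather than to patch around the missing attribute.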

@bjoernricks
Contributor

set_scan_total_hosts is only used in the ospd-openvas-20.08 branch and is not included in any release yet. So if you use an unreleased version from the ospd-openvas-20.08 branch, you need to use the ospd-20.08 branch for ospd too.

The patch from @rolf-d2i is wrong. It won't work and just results in a different issue.

@rolf-d2i

No, this is wrong, @bjoernricks. I was building OpenVAS in a Docker container (see below). 20.08 is supposed to be the stable released version (going by the documentation and GitHub); if it is not a stable released version, then the clone command is picking up the wrong version of OpenVAS for some reason.

   RUN git clone -b ospd-openvas-20.08 https://github.com/greenbone/ospd-openvas.git /root/source/ospd-openvas

@jjnicola
Member

Hi @rolf-d2i ,
as @bjoernricks explained, your issue is not related to the original one reported by @tuxmaster5000. The patch causing your issue was added with PR #343, while this issue was reported before that.
You are using the current state of the 20.08 branch, which includes unreleased patches. Please use the latest release for your installation.
Regards

@rolf-d2i

As I mentioned before, @jjnicola, I built the version from source using what is documented as the latest stable version and ran into these "Unexpected interrupted return" issues. If there is an issue with the latest stable version, then there should at least be a new issue for building the latest stable version. I built everything from scratch using the same version.
I don't think there is anything in the documentation stating which 20.08.X version is considered stable, or whether there are unstable 20.08.X versions.

@bjoernricks
Contributor

@rolf-d2i, again you are mixing things up. A release can be found on the releases page of a repository on GitHub, e.g. https://github.com/greenbone/ospd-openvas/releases. Each release refers to a git tag. If you want to use the latest release, you have to use https://github.com/greenbone/ospd-openvas/tree/v20.8.0 for ospd-openvas and https://github.com/greenbone/ospd/tree/v20.8.1 for ospd. New bugfix releases will be created from the release branches, e.g. from https://github.com/greenbone/ospd/tree/ospd-20.08. If you are using a release branch of ospd-openvas, you must use the release branch of ospd too. You can't combine the 20.8.1 release of ospd with the ospd-openvas 20.8 branch. The latest stable version is not contained in a release branch; the release branch contains the next version that is going to be released.

@bjoernricks
Contributor

And also again your issue has nothing to do with the original reported one here.

@rolf-d2i

@bjoernricks, as I mentioned before, which versions should be used together is poorly documented. Certainly the page https://community.greenbone.net/t/gvm-20-08-stable-initial-release-2020-08-12/6312 doesn't clearly say anything about 20.08.X versions. It is unexpected behaviour that the 20.08.1 release breaks with 20.08.0; you would commonly expect major.minor.patch version numbering. The page lists, for instance, OSPd 20.8.1 together with 20.8.0 components, implying you should use the latest 20.08.X version. This is honestly a very confusing versioning scheme / git management policy you are using.

@bjoernricks
Contributor

Yes, this might be confusing from the outside, and this specific error will not happen very often. But to be clear: even if we used semantic versioning (which we don't), it wouldn't indicate that the unreleased ospd-openvas needs an unreleased version of ospd, or that a 21.1.0 version of ospd-openvas doesn't work with 20.11.4. This is a task for the dependency management. The dependencies are set correctly for poetry, but there seems to be a missing change in the setup.py file.

In most situations it should be fine to mix the latest releases with release branches. But personally I would never do that, and I would not advise doing so. I don't even advise users to build from git checkouts. Even the release announcement says:

GVM is published as regularly updated and tested source code releases.

So my general advice is to use the latest released versions. If you are familiar with fixing issues by yourself, especially with debugging error messages like the one you got, it is fine to build from git.

@zenire

zenire commented Nov 4, 2020

It looks like this issue occurs after the scan itself has finished. For some reason the process keeps hanging while the status is Done/Finished and progress is 100%. I get this error after the NMAP scan and NASL tests are done.

@zenire

zenire commented Nov 5, 2020

@tuxmaster5000 @CipherMonger Could you tell us whether you set <exclude_hosts />, or share your create-target XML?

@zenire

zenire commented Nov 5, 2020

I repeatedly started scans with the following results:
<exclude_hosts>192.168.1.1</exclude_hosts> succeeds
<exclude_hosts>192.168.100.200</exclude_hosts> succeeds
<exclude_hosts>192.168.1.1,192.168.2.5</exclude_hosts> fails
<exclude_hosts>192.168.1.0/24</exclude_hosts> fails

A fail returns:

Host scan finished.
Scan stopped with errors.
Scan interrupted
Scan stopped with errors.
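One thing worth noting about these results: a CIDR exclusion expands to many more hosts than a comma-separated list, which changes the host totals that feed the progress calculation. The following sketch is only a guess at how an exclusion string might be counted (ospd's actual parsing may differ):

```python
# Illustration of why the size of the exclude list matters for the progress
# denominator. The parsing (comma-separated hosts or CIDR blocks) is an
# assumption, not ospd's actual implementation.
import ipaddress


def count_excluded(exclude_hosts: str) -> int:
    total = 0
    for entry in exclude_hosts.split(','):
        entry = entry.strip()
        if not entry:
            continue
        if '/' in entry:
            # A CIDR block expands to every usable address in the network
            # (network and broadcast addresses excluded).
            net = ipaddress.ip_network(entry, strict=False)
            total += net.num_addresses - 2 if net.num_addresses > 2 else net.num_addresses
        else:
            total += 1
    return total


count_excluded('192.168.1.1')              # 1 host
count_excluded('192.168.1.1,192.168.2.5')  # 2 hosts
count_excluded('192.168.1.0/24')           # 254 hosts
```

If the excluded-host count and the actually scanned-host count disagree, the computed progress can end up below 100% even though every reachable host was scanned, which would match the failure pattern above.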

@CipherMonger

Yes, I have this set. Here is that section of my target XML:
<exclude_hosts>192.168.200.59, 192.168.200.161</exclude_hosts>

In my case, I get Interrupted at 99%.

@zenire

zenire commented Nov 5, 2020

Yes, I have this set. Here is that section of my target XML:
<exclude_hosts>192.168.200.59, 192.168.200.161</exclude_hosts>

In my case, I get Interrupted at 99%.

Could you try a scan where you only exclude one host/IP instead of two, and see if that works?

@jjnicola
Member

jjnicola commented Nov 6, 2020

Hi @zenire @CipherMonger
ospd calculates the scan progress taking into account the total number of hosts, dead hosts, excluded hosts and finished hosts. Additionally, if openvas is not able to resolve a host name, the host will neither be scanned nor considered dead. If a scan was not stopped but ended with its progress below 100%, it is considered "interrupted". It seems there is some issue with this calculation, where the scanner finishes but the progress is still below 100%.
With the PRs greenbone/openvas-scanner#606, #343 and greenbone/ospd#332 this should be solved (these patches are not included in any stable release yet), as openvas now reports how many hosts will finally be scanned and how many will not, simplifying the calculation for ospd.

Although this ends in the same "interrupted scan", it does not seem to be the same issue reported by @tuxmaster5000. IIUC, the original issue is a Full & fast scan with a single host, so an unscanned host due to exclusion, name-resolution failure, etc. is not possible there.
I could reproduce the issue with non-resolved or excluded hosts (fixed, as mentioned), but not the original one reported by @tuxmaster5000.
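The rule described above can be sketched roughly as follows. This is an illustration of the logic, not ospd's actual code; the exact weighting of dead, excluded and unresolved hosts is an assumption:

```python
# Rough sketch of the progress/status rule described above: a scan counts as
# "interrupted" when the scan process ends while the computed progress is
# still below 100%. Function names and the weighting are illustrative.

def scan_progress(total_hosts, finished, dead, excluded, unresolved):
    """Percentage of hosts accounted for, out of those actually scannable."""
    # Hosts that will never be scanned shrink the denominator.
    scannable = total_hosts - excluded - unresolved
    if scannable <= 0:
        return 100
    return int(100 * (finished + dead) / scannable)


def final_status(process_ended, stopped_by_user, progress):
    if stopped_by_user:
        return 'STOPPED'
    if process_ended and progress < 100:
        return 'INTERRUPTED'  # the state reported in this issue
    return 'FINISHED' if progress >= 100 else 'RUNNING'
```

In the failing scans, the process ends while the computed progress is still short of 100%, so the status flips to INTERRUPTED even though openvas itself logged a clean finish.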

@tuxmaster5000
Author

@jjnicola, yes, in my case the scan does not stop at 99% or so. It stops randomly between 39% and 69%.

@lyhistory

Same issue. It would be better if GVM provided a more detailed error message instead of just saying "stopped". Here are the screenshots:
[screenshot]
[screenshot]

@bechavod

Same here.
[screenshot]

@Virsacer

Virsacer commented Jan 8, 2021

I have the same problem:
Interrupted at 76 %
20.8.0 Release

==> /gvm/var/log/gvm/ospd-openvas.log <==
OSPD[48] 2021-01-08 00:30:14,317: INFO: (ospd.command.command) Scan 6e9d313f-ab6e-44a7-91c0-0c0e8a166134 added to the queue in position 1.
OSPD[48] 2021-01-08 00:30:15,178: INFO: (ospd.ospd) Currently 1 queued scans.
OSPD[48] 2021-01-08 00:30:15,231: INFO: (ospd.ospd) Starting scan 6e9d313f-ab6e-44a7-91c0-0c0e8a166134.

==> /gvm/var/log/gvm/gvmd.log <==
event task:MESSAGE:2021-01-08 00h30.19 CET:2318: Status of task xxxxx (1efb9532-bda3-4935-9773-319ba47f0417) has changed to Running
WARNING: cipher_setiv: ivlen=15 blklen=16

==> /gvm/var/log/gvm/ospd-openvas.log <==
OSPD[48] 2021-01-08 02:11:29,582: INFO: (ospd.ospd) 6e9d313f-ab6e-44a7-91c0-0c0e8a166134: Host scan finished.
OSPD[48] 2021-01-08 02:11:29,584: INFO: (ospd.ospd) 6e9d313f-ab6e-44a7-91c0-0c0e8a166134: Scan interrupted.

==> /gvm/var/log/gvm/gvmd.log <==
event task:MESSAGE:2021-01-08 02h11.30 CET:2318: Status of task xxxxx (1efb9532-bda3-4935-9773-319ba47f0417) has changed to Interrupted

==> /gvm/var/log/gvm/ospd-openvas.log <==
OSPD[48] 2021-01-08 02:11:30,300: INFO: (ospd.ospd) 6e9d313f-ab6e-44a7-91c0-0c0e8a166134: Scan stopped with errors.
OSPD[48] 2021-01-08 02:11:30,301: INFO: (ospd.ospd) 6e9d313f-ab6e-44a7-91c0-0c0e8a166134: Scan interrupted.
OSPD[48] 2021-01-08 02:11:30,404: INFO: (ospd.ospd) 6e9d313f-ab6e-44a7-91c0-0c0e8a166134: Scan stopped with errors.
OSPD[48] 2021-01-08 02:11:30,404: INFO: (ospd.ospd) 6e9d313f-ab6e-44a7-91c0-0c0e8a166134: Scan interrupted.

In most situations it should be fine to mix the latest releases with release branches. But personally I would never do that, and I would not advise doing so. I don't even advise users to build from git checkouts. Even the release announcement says:

GVM is published as regularly updated and tested source code releases.

So my general advice is to use the latest released versions. If you are familiar with fixing issues by yourself, especially with debugging error messages like the one you got, it is fine to build from git.

And when are the 31+100+91+36+221+167 commits in the 20.08-branches released as stable?

@wisukind

wisukind commented Jan 8, 2021 via email

@jjnicola
Member

Hello,
I added more debug-level log messages to ospd and ospd-openvas. I am still not able to reproduce the issue, although I tried with many different setups. I hope the new log messages can help to find the issue.
It would be nice if you could set the debug log level and share the new logs. Also, if possible, please provide the scan config, target config and task config, as well as the environment you are scanning (master/sensor setup, through a firewall, etc.). Does it always happen with the same target? Maybe you can find a segmentation fault in some system log file (syslog, kernel.log, debug, messages, etc.). Any information is helpful.

greenbone/ospd#352
#375

@wisukind

wisukind commented Jun 8, 2021

Note also that in my case, although the task is set as finished on the openvas side, if I resume the task after the interruption it will resume where it left off before crashing (67%), and after some time it will be set to the Interrupted state again. Restarting the task from scratch doesn't seem to help.

@Kraemii
Member

Kraemii commented Jun 15, 2021

Hi!
Thank you for your report on this topic. I have been working on this for the past week and, as before, no one on my team was able to reproduce it. What bothers me is that the times in the logs you posted do not match.
You said you are able to reproduce the issue systematically; does that mean that on one specific task you always run into the same error with an interrupted scan?
I am not quite sure: which versions of ospd-openvas and openvas did you use?
I added a log message to narrow down a specific case in which the scan could possibly get set as interrupted even though openvas is not finished. This is in the newest versions of the 20.08, 21.04 and master branches.

Again thank you for your participation in hunting this bug!

@wisukind

wisukind commented Jun 15, 2021 via email

@wisukind

I've installed the latest 21.04 and am currently re-running the failing task with it. I'll let you know.

gvm@ov-slave-monterrey:~$ openvas --version
OpenVAS 21.4.1~git-7d6eff81-openvas-21.04
GIT revision ~git-7d6eff81-openvas-21.04
gvm-libs 21.4.1~git-714a5ea2-gvm-libs-21.04
Most new code since 2005: (C) 2021 Greenbone Networks GmbH
Nessus origin: (C) 2004 Renaud Deraison <deraison@nessus.org>
License GPLv2: GNU GPL version 2
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

gvm@ov-slave-monterrey:~$ /opt/gvm/bin/ospd-scanner/bin/ospd-openvas --version
OSP Server for openvas: 21.4.1
OSP: 21.4.1
OSPd OpenVAS: 21.4.1


@wisukind

This is weird. The problem is gone on this particular task! So far it had happened systematically, but the last scan done with the patched version worked without problems and was super fast.

The only thing I did was update the OS to the latest patches and reboot.

I'm thinking it's perhaps a problem which occurs only with certain targets & probes.

Nevertheless, I will roll out this version to all my slaves. We are running dozens of scans per week, so sooner or later the same problem will arise again and I'll be able to report more debug information.

@bjoernricks
Contributor

Yes, this issue is really, really weird. We are still not able to reproduce it reliably. Therefore we are very thankful for every bit of feedback.

@wisukind

wisukind commented Jun 19, 2021

OK, I was able to reproduce the problem with your patched version. See below the latest logs just before the task was set to Interrupted. Hope that helps.

ospd Logs:

OSPD[241864] 2021-06-19 21:07:49,048: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,049: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,050: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,124: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 17 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,126: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,127: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,128: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,129: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,197: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 16 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,199: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,200: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,201: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,201: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,281: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 17 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,283: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,284: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,285: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,286: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,367: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 20 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,369: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,370: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,371: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,372: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,460: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 20 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,463: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,464: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,465: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,466: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,573: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 21 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,574: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Host 10.234.37.104 has progress: 100
OSPD[241864] 2021-06-19 21:07:49,576: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,579: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,583: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,583: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: ['10.234.37.104']
OSPD[241864] 2021-06-19 21:07:49,584: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Remove the following hosts from the target list, as they are already finished or are dead: ['10.234.37.104']
OSPD[241864] 2021-06-19 21:07:49,609: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:49,609: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,609: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Check scan process:
OSPD[241864] 2021-06-19 21:07:49,609: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:49,610: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,610: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:49,611: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Results sent successfully to the client. Cleaning temporary result list.
OSPD[241864] 2021-06-19 21:07:49,661: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:49,661: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,661: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Check scan process:
OSPD[241864] 2021-06-19 21:07:49,661: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Inserting 11 results into scan collection table
OSPD[241864] 2021-06-19 21:07:49,663: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:49,663: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,666: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:49,666: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,670: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,672: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,672: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:49,715: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Results sent successfully to the client. Cleaning temporary result list.
OSPD[241864] 2021-06-19 21:07:49,726: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:49,727: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:49,728: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:49,729: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:50,733: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:50,734: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:50,734: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as dead: []
OSPD[241864] 2021-06-19 21:07:50,735: DEBUG: (ospd.scan) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Setting the following hosts as finished: []
OSPD[241864] 2021-06-19 21:07:50,735: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Target is finished
OSPD[241864] 2021-06-19 21:07:50,735: DEBUG: (ospd_openvas.daemon) cf376b64-9aee-4f2e-81d3-a67a086d95a4: End Target. Release main database
OSPD[241864] 2021-06-19 21:07:50,736: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Host scan finished.
OSPD[241864] 2021-06-19 21:07:50,736: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: RUNNING,
OSPD[241864] 2021-06-19 21:07:50,736: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:50,737: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:50,738: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:50,738: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Host scan finished. Progress: 21, Status: RUNNING
OSPD[241864] 2021-06-19 21:07:50,738: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Set scan status INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:50,738: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Scan interrupted.
OSPD[241864] 2021-06-19 21:07:50,738: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:50,739: DEBUG: (root) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current progress:
OSPD[241864] 2021-06-19 21:07:50,781: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:50,782: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:50,782: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Scan process is dead and its progress is 21
OSPD[241864] 2021-06-19 21:07:50,782: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Set scan status INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:50,783: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Scan interrupted.
OSPD[241864] 2021-06-19 21:07:50,783: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Check scan process:
OSPD[241864] 2021-06-19 21:07:50,783: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:50,783: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:50,783: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:50,784: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Results sent successfully to the client. Cleaning temporary result list.
OSPD[241864] 2021-06-19 21:07:51,533: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:51,533: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan progress: 21,
OSPD[241864] 2021-06-19 21:07:51,534: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Scan process is dead and its progress is 21
OSPD[241864] 2021-06-19 21:07:51,534: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Set scan status INTERRUPTED,
OSPD[241864] 2021-06-19 21:07:51,535: INFO: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Scan interrupted.
OSPD[241864] 2021-06-19 21:07:51,535: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Check scan process:
OSPD[241864] 2021-06-19 21:07:51,536: DEBUG: (ospd.ospd) cf376b64-9aee-4f2e-81d3-a67a086d95a4: Current scan status: INTERRUPTED,

openvas.log

sd main:MESSAGE:2021-06-19 10h26.30 utc:156222: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.39.68 in 1886.43 seconds
sd main:MESSAGE:2021-06-19 10h30.38 utc:37166: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.42.144 in 1247.39 seconds
sd main:MESSAGE:2021-06-19 10h36.47 utc:139012: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.39.25 in 2717.57 seconds
sd main:MESSAGE:2021-06-19 10h37.12 utc:3117: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.42.131 in 1834.90 seconds
sd main:MESSAGE:2021-06-19 10h37.14 utc:95393: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.39.2 in 1234.85 seconds
sd main:MESSAGE:2021-06-19 10h39.18 utc:34893: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.36.117 in 1780.30 seconds
sd main:MESSAGE:2021-06-19 10h39.29 utc:12480: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.37.1 in 1894.58 seconds
sd main:MESSAGE:2021-06-19 10h40.42 utc:8236: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.42.135 in 2004.86 seconds
sd main:MESSAGE:2021-06-19 10h42.40 utc:108227: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.38.217 in 3447.68 seconds
sd main:MESSAGE:2021-06-19 10h48.16 utc:116816: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.39.1 in 1721.25 seconds
sd main:MESSAGE:2021-06-19 10h49.08 utc:103345: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.37.94 in 3987.17 seconds
sd main:MESSAGE:2021-06-19 10h49.25 utc:105167: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.42.166 in 1894.44 seconds
sd main:MESSAGE:2021-06-19 10h52.02 utc:121885: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.36.40 in 1891.72 seconds
sd main:MESSAGE:2021-06-19 10h53.58 utc:43319: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.36.15 in 2605.61 seconds
sd main:MESSAGE:2021-06-19 11h02.00 utc:37180: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.43.111 in 5491.19 seconds
sd main:MESSAGE:2021-06-19 19h07.49 utc:60285: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished for host 10.234.37.104 in 34385.34 seconds
sd main:MESSAGE:2021-06-19 19h07.50 utc:242198: Vulnerability scan cf376b64-9aee-4f2e-81d3-a67a086d95a4 finished in 38595 seconds: 82 alive hosts of 1710

This is weird, as both ospd and openvas confirm the scan task finished, but instead of reporting 100% progress and setting the task as finished, it immediately put it into the Interrupted state without further details. I don't know what the patch was supposed to do, but it seems to me it failed, since I see no difference in the logs compared to the previous unpatched version.
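For context, the decision visible in the ospd log above ("Scan process is dead and its progress is 21" followed by "Set scan status INTERRUPTED") boils down to a check like the following. This is a simplified sketch with illustrative names, not ospd's actual code:

```python
# Simplified sketch of the status decision seen in the ospd log above.
# Names (ScanStatus, check_scan_process) are illustrative, not ospd's real API.
from enum import Enum


class ScanStatus(Enum):
    RUNNING = 1
    FINISHED = 2
    INTERRUPTED = 3


def check_scan_process(process_alive: bool, progress: int) -> ScanStatus:
    """If the scan process has exited before progress reached 100,
    ospd considers the scan interrupted -- even if the scanner itself
    logged a clean 'finished' message."""
    if process_alive:
        return ScanStatus.RUNNING
    if progress == 100:
        return ScanStatus.FINISHED
    return ScanStatus.INTERRUPTED


# The race discussed in this thread: openvas finishes cleanly, but ospd
# still sees a stale progress value (e.g. 21) once the process is dead.
print(check_scan_process(False, 21))   # stale progress  -> INTERRUPTED
print(check_scan_process(False, 100))  # final progress  -> FINISHED
```

This illustrates why the two logs disagree: openvas can report "finished" while ospd, reading a stale progress value, still classifies the dead process as interrupted.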

@tuxmaster5000
Author

I have an idea for it. I saw these messages last week, when no Redis connection was available anymore because too many parallel jobs were running. After allowing Redis more parallel databases, the problem was gone. I have set databases to 512 in the redis.conf
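The workaround described above is a one-line change in redis.conf (512 is the value from the comment; the right value depends on how many scans run in parallel):

```
# redis.conf -- raise the number of logical databases so that many
# parallel openvas scans can each get their own Redis DB
databases 512
```

Redis reads this directive at startup, so it must be restarted for the change to take effect.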

@wisukind

@tuxmaster5000 not my case here, unfortunately. My Redis databases value is already set to 65535...

@Virsacer

I have one job scheduled every day and 4 jobs spread over the weekends.

With 20.8.0 and 21.4.0 the daily job only completed about every ten days or so.
Two of the weekend jobs worked most of the time, and the other two failed to complete most of the time.

On Thursday I updated to the latest from the 21.04 branch, and all jobs have completed successfully since then :)

@wisukind

wisukind commented Jun 21, 2021

@Virsacer that's the problem. This issue is completely erratic. Sometimes it happens all the time; sometimes everything runs without issues. Right now I have two tasks that constantly end up in the Interrupted state, while using the latest 21.04 version. But I don't see anything in the logs explaining why a running task, right after being set to "finished" by ospd, is put into the Interrupted state. That's completely weird.

@Kraemii
Member

Kraemii commented Jul 8, 2021

To give you a short update: I put quite some work into restructuring the code and eliminating some possible race conditions. These changes are contained in the master and 21.04 branches. Thanks again for all the feedback we are getting on this topic!

@wisukind

wisukind commented Jul 8, 2021

Thanks Kraemii. Will give it a try and let you know. Regards

@jjnicola
Member

jjnicola commented Aug 6, 2021

Hi everyone! I was able to reproduce the issue for a few scans, even with the patch from @Kraemii, since the problem was in openvas-scanner. This gave me some hints. The issue should be fixed with greenbone/openvas-scanner#832.
Please let me know if you can still reproduce the issue after applying the patch.
Regards,
Juan

@wisukind

wisukind commented Aug 6, 2021

Thanks, guys, for the update and for trying to fix this problem. I applied the patch from Kraemii a couple of weeks ago and have since run dozens of scans on targets usually prone to this problem, and so far I haven't been able to reproduce the issue. So even if it doesn't fix the problem completely, it seems to help. I'll continue to report any new findings here.

@wisukind

All,

I had the problem again, but now the situation is a bit different. Basically, what happens now is:

  1. Ospd / openvas finishes the scan at 100%
  2. For whatever reason, gvmd is stuck at various percentage levels and isn't progressing anymore
  3. The situation stays like this until the admin stops the task. Stopping the task yields the following error messages in gvmd:

event task:MESSAGE:2021-08-11 08h11.21 UTC:1788: Status of task Cedar Rapids, VOIP (c54c7ddb-a119-4417-8bbd-7952f6f2faab) has changed to Stop Requested
event task:MESSAGE:2021-08-11 08h11.22 UTC:1788: Status of task Cedar Rapids, VOIP (c54c7ddb-a119-4417-8bbd-7952f6f2faab) has changed to Stopped
event task:MESSAGE:2021-08-11 08h12.15 UTC:1918: Status of task Cedar Rapids, VOIP (c54c7ddb-a119-4417-8bbd-7952f6f2faab) has changed to Requested
md manage:WARNING:2021-08-11 08h12.15 UTC:2083: OSP start_scan c54c7ddb-a119-4417-8bbd-7952f6f2faab: Couldn't send stop_scan command to scanner
event task:MESSAGE:2021-08-11 08h12.15 UTC:1918: Task Cedar Rapids, VOIP (c54c7ddb-a119-4417-8bbd-7952f6f2faab) has been resumed by tatooin
event task:MESSAGE:2021-08-11 08h12.17 UTC:1937: Status of task Cedar Rapids, VOIP (c54c7ddb-a119-4417-8bbd-7952f6f2faab) has changed to Done
event task:MESSAGE:2021-08-11 08h12.34 UTC:1937: Status of task Cedar Rapids, VOIP (c54c7ddb-a119-4417-8bbd-7952f6f2faab) has changed to Interrupted

While on ospd side, you have:

OSPD[31890] 2021-08-11 03:06:17,001: DEBUG: (ospd.ospd) 353e53cf-550d-458e-aee5-a2da966c2fe4: Current scan progress: 100,
OSPD[31890] 2021-08-11 03:06:17,001: DEBUG: (ospd.ospd) 353e53cf-550d-458e-aee5-a2da966c2fe4: Current scan status: FINISHED,
OSPD[31890] 2021-08-11 03:06:17,002: DEBUG: (ospd.ospd) 353e53cf-550d-458e-aee5-a2da966c2fe4: Results sent successfully to the client. Cleaning temporary result list.
  4. The task is set to Stopped, then to Done, and almost immediately switches from Done to Interrupted.

Step 2 should ring some bells for old-timers. We had exactly this problem with GVM 11, and that bug was closed as too old. But it seems it hasn't been resolved either...

There is really something broken in the communication protocol between ospd and gvmd. How come gvmd has so many issues tracking scan progress and status, while everything seems to work fine on the ospd side?

On my end, it seems I can narrow the problem down to one scanner on some tasks in particular. I'll update ospd/openvas with #882 and see if it happens again.
Regards

@wisukind

Same problem with patch #832.

Scans continue to run on the ospd side, while frozen and then stopped on the gvmd side. Trying to resume the task will actually:

  1. Start a new scan on the scanner side (note it's a NEW scan, so it will run concurrently with the initial one, which is still running!)
  2. After a few seconds, on the gvmd side, the scan status will change to the Interrupted state
  3. Both scans continue to run on the ospd side

@jjnicola
Member

Hi @wisukind, thanks for reporting.
As far as I understand, everything is working as expected on the scanner side, since openvas and ospd-openvas are now finishing the scan without issues. But the problem is now on the gvmd client side, which for some reason is setting the scan as interrupted. IIRC, you have a master-sensor setup via TLS sockets. Do you see some communication problem there?
In those cases you can see some messages about a broken pipe.
Is this happening only on tasks which run on a sensor?

@wisukind

wisukind commented Aug 13, 2021

Yes, it's indeed a different issue. Here ospd / openvas are working fine. On the gvmd side I had several connection-lost log lines when the issue occurred, so that part is explained. But my question is mostly about the resuming part: why does gvmd start a new scan when the connection is back, instead of resuming the already finished one (and setting the state to Done)? And why does gvmd immediately put the resumed task into the Interrupted state?

Overall, it's always the same story: gvmd does not reliably maintain the communication with ospd and loses track of scan tasks too easily. It should be able to resume tasks where it left off, and also to get up-to-date scan status directly from the scanner, even after the connection has dropped for some time.

Should I open a new bug report for this?

@jjnicola
Member

Yes, what you are experiencing now is a completely different issue. Feel free to open a new issue for gvmd.

Regarding the current issue: although I think the main problem here (ospd setting the scan as interrupted) is already addressed, I will keep it open for some more time, waiting for more feedback about openvas/ospd-openvas behaviour.

@jjnicola
Member

@wisukind And as always, thank you very much for your feedback! ;)

@wisukind

wisukind commented Aug 13, 2021

Yes, please leave this bug open for now. I am running dozens of scans per month, so this usually happens very frequently. If it doesn't happen again by, let's say, mid-September, then you should be fine to close it.

Also I've created bug #1668 on gvmd side.

@wisukind

wisukind commented Sep 6, 2021

Hello,

After a few weeks of testing on targets prone to this bug, I can confirm the bug has just moved to issue #1668. Basically, tasks are no longer put into the Interrupted state, but into the Stopped state, for no reason. On some of my tasks it's systematic. It turns out, however, that the bug is now on the gvmd side, as on ospd the task appears as finished.

@jjnicola
Member

jjnicola commented Sep 6, 2021

I am glad to read that last comment and can close this issue. Thanks a lot to everyone who tested the different patches and gave feedback. So, closing here.

@jjnicola jjnicola closed this as completed Sep 6, 2021
ArnoStiefvater pushed a commit to ArnoStiefvater/ospd-openvas that referenced this issue Oct 25, 2021
Use empty string instead of None for credentials value
@wisukind

wisukind commented Feb 8, 2022

Hello. This bug should unfortunately be re-opened. I am seeing the problem again on some tasks. Much less frequently, but it still happens in the 21.4.3 final release. :-(

@jjnicola
Member

Hello @wisukind,
I think it is not the same issue, but one with the same symptoms, since this issue was fixed with greenbone/openvas-scanner#832.
I would suggest opening a new issue and adding more information:

  • about the target, if possible
  • the alive detection methods
  • does it always happen for the same task?
  • do you still use the master-sensor setup?
  • did you check the gvmd logs? Is there any broken pipe message?
  • is ospd also finishing as expected, or only openvas?

Please also set the log level to debug for OSPD and OpenVAS, add some logs if possible, and open a new issue.
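On the debug-logging point above, one common way to do this is via the ospd-openvas configuration file. The path, section, and key names below are assumptions based on typical ospd-openvas setups, not taken from this thread; verify them against your installation's documentation:

```
# /etc/gvm/ospd-openvas.conf -- assumed location and key names;
# check the ospd-openvas documentation for your version
[OSPD - openvas]
log_level = DEBUG
```

Restart the ospd-openvas service afterwards so the new log level takes effect.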

Thanks in advance.
