
Tenable.io integration missing vulnerabilities found close to time of export request #5762

Closed
meganlear opened this issue Mar 31, 2023 · 4 comments · Fixed by #7696

Comments

@meganlear

Currently, the Tenable.io integration uses the last_found parameter to export only vulnerabilities found since the time of the previous export request. Tenable appears to set the last_found value to the time the scan finished, but after a scan finishes the results still need to be processed: the vulnerabilities aren't included in any export request until Tenable has finished publishing them. As a result, data can be missed if Tenable is still processing the results of a scan at the time of an export request.

For example, suppose you are exporting data every hour and a scan finishes at 11:55, so the vulnerabilities have a last_found value of 11:55. If Tenable takes 10 minutes to publish the results, then an export request at 12:00 won't include the vulnerabilities from that scan. But an export request at 13:00 also won't include the vulnerabilities, even though they were published at 12:10, because it is only looking for vulnerabilities with a last_found value after 12:00.

If instead the integration set the last_found parameter to a few minutes before the time of the previous request, it could get data about vulnerabilities that would otherwise be missed. In the example, setting the last_found parameter to 10 minutes before the previous request would mean that the 13:00 export request would be looking for vulnerabilities found after 11:50, which would include the results of the 11:55 scan. The ingest pipeline sets the document _id to a hash of event.original so this shouldn't create any duplicate documents.
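The proposed lookback can be sketched as follows. This is an illustrative Python sketch, not the integration's actual code; the function name, the 10-minute window, and the Unix-timestamp return type are all assumptions for the example.

```python
from datetime import datetime, timedelta, timezone

# Assumed publishing delay to cover; the issue's example uses 10 minutes.
LOOKBACK = timedelta(minutes=10)

def last_found_filter(previous_request: datetime) -> int:
    """Return the Unix timestamp to use as the last_found filter.

    Instead of filtering on the exact time of the previous export
    request, subtract a lookback window so that results Tenable was
    still publishing at request time are picked up by the next export.
    """
    return int((previous_request - LOOKBACK).timestamp())
```

Walking through the example above: the 13:00 export request filters on the previous request time (12:00) minus the lookback, i.e. 11:50, which includes the scan that finished at 11:55.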

@elasticmachine

Pinging @elastic/security-external-integrations (Team:Security-External Integrations)

@LaZyDK
Contributor

LaZyDK commented Sep 7, 2023

@meganlear is this still an issue for you? We are getting all of the needed data.

@meganlear
Author

@LaZyDK Right now we are getting all of the data, but I think that's because something changed in the Tenable API. It seems to return data on all vulnerabilities regardless of the time set in the since parameter, so we keep receiving data on the same vulnerabilities. That does fix the problem of missing data, but it means we are storing multiple duplicate events for the same vulnerabilities. The pipeline you shared in #7671 would solve that problem.
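The deduplication idea mentioned earlier in this thread (setting the document _id to a hash of event.original) can be sketched like this. This is an illustrative Python sketch, not the actual pipeline from #7671; the function name, the choice of SHA-256, and the sample event are assumptions.

```python
import hashlib
import json

def doc_id(event_original: str) -> str:
    """Derive a stable document _id from the raw event text.

    Re-exported duplicates of the same event hash to the same _id,
    so indexing them overwrites the existing document instead of
    creating a new one.
    """
    return hashlib.sha256(event_original.encode("utf-8")).hexdigest()

# Hypothetical raw event; sort_keys keeps the serialization stable.
event = json.dumps({"asset": "host-1", "plugin_id": 19506}, sort_keys=True)
```

Calling doc_id on the same raw event always yields the same _id, which is what makes re-ingesting duplicate export results safe.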

@LaZyDK
Contributor

LaZyDK commented Sep 7, 2023

This is why I asked. Thank you!
