
Reusing dynamically extracted values as iterators in http request #1288

Merged
5 commits merged into dev on Dec 2, 2021

Conversation

Ice3man543
Member

This PR adds support to the http module for iterating over dynamically extracted data from extractors
and reusing it in subsequent requests. This allows nuclei to follow links on pages, perform operations
on multiple versions of the same extracted value, and more.

Proposed changes

Checklist

  • Pull request is created against the dev branch
  • All checks passed (lint, unit/integration/regression tests etc.) with my changes
  • I have added tests that prove my fix is effective or that my feature works
  • I have added necessary documentation (if appropriate)

@Ice3man543 Ice3man543 self-assigned this Nov 24, 2021
@Ice3man543 Ice3man543 added the Status: Review Needed The issue has a PR attached to it which needs to be reviewed label Nov 24, 2021
@Ice3man543 Ice3man543 linked an issue Nov 24, 2021 that may be closed by this pull request
1 task
@Ice3man543
Member Author

Ice3man543 commented Nov 29, 2021

Example Template

id: valid-robotstxt-endpoints

info:
  name: Iterate robots.txt and request endpoints
  author: pdteam
  severity: info

requests:
  - raw:
      - |
        GET /robots.txt HTTP/1.1
        Host: {{Hostname}}

      - |
        GET {{endpoint}} HTTP/1.1
        Host: {{Hostname}}

    iterate-all: true
    extractors:
      - part: body
        name: endpoint
        internal: true
        type: regex
        regex:
          - "(?m)/([a-zA-Z0-9-_/\\\\]+)"

    matchers:
      - type: status
        status:
          - 200
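The template's flow can be sketched outside nuclei: the internal extractor captures every endpoint from the first response, and `iterate-all` repeats the second templated request once per captured value. A minimal illustration (not nuclei's actual Go implementation; the sample robots.txt body and helper names here are hypothetical):

```python
import re

# Hypothetical robots.txt body standing in for the first response.
ROBOTS_BODY = """User-agent: *
Disallow: /search/advanced
Disallow: /account-login
Allow: /docs
"""

# Same pattern as the template's extractor; YAML's "\\\\" is a single
# escaped backslash in the compiled regex.
ENDPOINT_RE = re.compile(r"(?m)/([a-zA-Z0-9-_/\\]+)")

def extract_endpoints(body: str) -> list[str]:
    # An `internal: true` extractor feeds later requests in the chain
    # instead of being printed as scan output.
    return [m.group(0) for m in ENDPOINT_RE.finditer(body)]

def build_requests(hostname: str, body: str) -> list[str]:
    # With `iterate-all: true`, the templated raw request is issued
    # once per extracted value of {{endpoint}}.
    return [
        f"GET {endpoint} HTTP/1.1\r\nHost: {hostname}\r\n\r\n"
        for endpoint in extract_endpoints(body)
    ]

requests = build_requests("github.com", ROBOTS_BODY)
```

Each generated request then passes through the template's matchers, which is why the run below reports one line per endpoint that returned a 200.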

To test:

./nuclei -t test.yaml -u https://github.com

                     __     _
   ____  __  _______/ /__  (_)
  / __ \/ / / / ___/ / _ \/ /
 / / / / /_/ / /__/ /  __/ /
/_/ /_/\__,_/\___/_/\___/_/   2.5.4-dev

		projectdiscovery.io

[WRN] Use with caution. You are responsible for your actions.
[WRN] Developers assume no liability and are not responsible for any misuse or damage.
[INF] Using Nuclei Engine 2.5.4-dev (development)
[INF] Using Nuclei Templates 8.6.8 (latest)
[INF] Using Interactsh Server https://interactsh.com
[INF] Templates added in last update: 0
[INF] Templates loaded for scan: 1
[valid-robotstxt-endpoints] [http] [info] https://github.com/robots.txt
[valid-robotstxt-endpoints] [http] [info] https://github.com/search/advanced
[valid-robotstxt-endpoints] [http] [info] https://github.com/account-login
[valid-robotstxt-endpoints] [http] [info] https://github.com/stargazers
[valid-robotstxt-endpoints] [http] [info] https://github.com/archive/
[valid-robotstxt-endpoints] [http] [info] https://github.com/tarball/
[valid-robotstxt-endpoints] [http] [info] https://github.com/contributors
[valid-robotstxt-endpoints] [http] [info] https://github.com/pulse
[valid-robotstxt-endpoints] [http] [info] https://github.com/tags
[valid-robotstxt-endpoints] [http] [info] https://github.com/zipball/
[valid-robotstxt-endpoints] [http] [info] https://github.com/Explodingstuff/
[valid-robotstxt-endpoints] [http] [info] https://github.com/forks
[valid-robotstxt-endpoints] [http] [info] https://github.com/search
[valid-robotstxt-endpoints] [http] [info] https://github.com/watchers
[valid-robotstxt-endpoints] [http] [info] https://github.com/ekansa/Open-Context-Data
[valid-robotstxt-endpoints] [http] [info] https://github.com//docs
[valid-robotstxt-endpoints] [http] [info] https://github.com/download
[valid-robotstxt-endpoints] [http] [info] https://github.com/revisions

@Mzack9999 Mzack9999 (Member) left a comment

Looks good:

  • The Interactsh test failure can be ignored
  • Marking as "request changes" due to merge conflicts

@Mzack9999 Mzack9999 self-requested a review December 2, 2021 10:09
@ehsandeep ehsandeep merged commit 3b68c29 into dev Dec 2, 2021
@ehsandeep ehsandeep deleted the dynamic-value-reuse-http branch December 2, 2021 10:59
@ehsandeep ehsandeep added Status: Completed Nothing further to be done with this issue. Awaiting to be closed. and removed Status: Review Needed The issue has a PR attached to it which needs to be reviewed labels Dec 2, 2021
Labels
Status: Completed Nothing further to be done with this issue. Awaiting to be closed.
Development

Successfully merging this pull request may close these issues.

Reusing dynamic extractors to iterate over raw requests
3 participants