Spyder-HTML

A tool made to facilitate the analysis of html code.

INSTALL (git clone):

git clone https://github.com/LucasDSilva200/Spyder-ml
cd Spyder-ml
python setup.py install

INSTALL (pip):

pip install spyder-ml

USAGE:

spyderml [-h] [-t TARGET] [--tr TR] [--update] [--tags TAGS | --comments | --attribs ATTRIBS | --getjs | --techs | --geturls | --html | --jsr] [-o OUTPUT] [-C COOKIE] [-A AGENT] [-hf HEADERSFILE] [-S] [-w WORKERS] [--domain DOMAIN] [--cache] [--proxy PROXY] [-D DATA [DATA ...]]

A tool made to facilitate the analysis of html code.

options:
-h, --help            Show this help message and exit.
-t TARGET, --target TARGET
                      Parameter that defines the target URL, e.g. http://example.com/index.html
--tr TR               Type of request (POST or GET; GET is the default).
--update              Flag responsible for updating the database.
--tags TAGS           Flag that defines which tags the program will fetch.
--comments            Flag that fetches the HTML comments.
--attribs ATTRIBS     Flag that defines which attributes the application will look for.
--getjs               Flag that fetches all JS files from the page.
--techs               Flag that tries to discover the technologies used by the page.
--geturls             Flag that takes all the target's URLs and tries to access them.
--html                Flag that returns all of the page's HTML code.
--jsr                 Makes a request that returns JSON.
-o OUTPUT, --output OUTPUT
                      Flag that defines the file in which the command output will be saved.
-C COOKIE, --cookie COOKIE
                      Cookie to send with the request.
-A AGENT, --agent AGENT
                      User-Agent to send with the request.
-hf HEADERSFILE, --headersfile HEADERSFILE
                      Parameter that passes an HTTP request header file to be scanned.
-S, --spider          Flag that runs the spider (web crawler).
-w WORKERS, --workers WORKERS
                      Defines the number of workers.
--domain DOMAIN       Defines the domain of the web crawler.
--cache               Defines whether to create a cache or not (default: false).
--proxy PROXY         Defines the proxy to use ("tor" or "burpsuite" can be passed to select these two default proxies).
-D DATA [DATA ...], --data DATA [DATA ...]
                      Data to send with the request, in the format key1:value1 key2:value2 key3:value3 ...
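For illustration, the request-related flags above (-t, --tr, -C, -A, -D) map onto an ordinary HTTP request. The following is a minimal stdlib sketch of that mapping, not Spyder-ml's actual implementation; the target URL, cookie, agent, and key:value pairs are made-up example values:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical equivalent of:
#   spyderml -t http://example.com/ --tr POST -C "sid=abc" -A "MyAgent" \
#            -D key1:value1 key2:value2
target = "http://example.com/"
data_args = ["key1:value1", "key2:value2"]  # -D pairs, split on the first ':'
body = urlencode(dict(pair.split(":", 1) for pair in data_args)).encode()

req = Request(
    target,
    data=body,  # attaching a body makes urllib issue a POST (the --tr POST case)
    headers={"Cookie": "sid=abc", "User-Agent": "MyAgent"},
)
print(req.get_method())              # POST
print(req.get_header("User-agent"))  # MyAgent
print(req.data)                      # b'key1=value1&key2=value2'
```

Omitting `data` would leave the method as GET, matching the tool's default.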

Functionality: the tool searches the HTML document for specific elements (tags, attributes, comments, scripts, and URLs).
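Conceptually, the --tags and --comments flags perform the kind of extraction sketched below. This is a minimal stdlib illustration under assumed behavior, not Spyder-ml's real parser, and the sample HTML is invented:

```python
from html.parser import HTMLParser

class TagAndCommentScanner(HTMLParser):
    """Collects occurrences of one tag plus all HTML comments."""

    def __init__(self, wanted_tag):
        super().__init__()
        self.wanted_tag = wanted_tag
        self.tags = []      # (tag, attrs) pairs for the requested tag
        self.comments = []  # every HTML comment found in the document

    def handle_starttag(self, tag, attrs):
        if tag == self.wanted_tag:
            self.tags.append((tag, dict(attrs)))

    def handle_comment(self, data):
        self.comments.append(data.strip())

html = '<a href="/next"><!-- TODO: remove debug link --><img src="x.png"></a>'
scanner = TagAndCommentScanner("a")
scanner.feed(html)
print(scanner.tags)      # [('a', {'href': '/next'})]
print(scanner.comments)  # ['TODO: remove debug link']
```

Stray comments like the one above are exactly what the --comments flag is useful for surfacing during an assessment.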
