This project lets you run web crawlers and save the results in various formats.
- git clone https://github.com/your/repository.git
- poetry install
To install the project's dependencies into a virtual environment via Poetry.
- poetry shell
To activate the virtual environment.
Available crawlers:
- VultrCrawler
- HostgatorCrawler
Available output arguments:
- save_json
- save_csv
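As a rough sketch of the idea behind a crawler class (all names and fields here are illustrative assumptions, not the project's actual code), something like VultrCrawler would fetch a provider's page and return rows of plan data:

```python
# Hypothetical sketch of a crawler interface; the real VultrCrawler /
# HostgatorCrawler implementations in this project may differ.
from dataclasses import dataclass


@dataclass
class Plan:
    name: str
    price: str


class VultrCrawler:
    """Illustrative crawler: a real implementation would download and
    parse the provider's pricing page; here we return static sample rows."""

    def crawl(self) -> list[Plan]:
        # Real code would perform an HTTP request and HTML parsing here.
        return [Plan("Cloud Compute", "$5/mo"), Plan("Bare Metal", "$120/mo")]


if __name__ == "__main__":
    for plan in VultrCrawler().crawl():
        print(f"{plan.name}: {plan.price}")
```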
- python -m crawley.cli (Crawler) (argument)
- python -m crawley.cli (Crawler) print
This prints the information crawled from the website.
- python -m crawley.cli (Crawler) (argument) --filename (output).json
The .json file will be saved in the current directory.
- You can choose the name of your file by replacing (output).
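A minimal sketch of what a JSON-saving step could look like internally (the function name and signature are assumptions for illustration, not the project's actual API):

```python
import json
from pathlib import Path


def save_json(rows: list[dict], filename: str) -> Path:
    """Write crawled rows to a .json file; a relative filename lands
    in the current working directory."""
    path = Path(filename)
    with path.open("w", encoding="utf-8") as fh:
        json.dump(rows, fh, indent=2)
    return path


if __name__ == "__main__":
    out = save_json([{"name": "Cloud Compute", "price": "$5/mo"}], "output.json")
    print(f"Saved {out.resolve()}")
```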
- python -m crawley.cli (Crawler) (argument) --filename (output).csv
The .csv file will be saved in the current directory.
- You can choose the name of your file by replacing (output).
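Similarly, a CSV-saving step might be sketched with the standard library's csv.DictWriter (again, the name save_csv and its signature are assumptions, not the project's confirmed API):

```python
import csv
from pathlib import Path


def save_csv(rows: list[dict], filename: str) -> Path:
    """Write crawled rows to a .csv file in the current directory,
    using the first row's keys as the header."""
    path = Path(filename)
    fieldnames = list(rows[0]) if rows else []
    # newline="" is required so csv handles line endings itself
    with path.open("w", encoding="utf-8", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
    return path


if __name__ == "__main__":
    save_csv([{"name": "Cloud Compute", "price": "$5/mo"}], "output.csv")
```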
That's it. I hope this application lives up to the standards of a good Python implementation.