Ruby Capstone Project for Microverse, in which students have to complete a real-world-like project within 72 hours according to the project specifications.
This web scraper is a freelance job scraper that searches three websites — freelancer.com, guru.com, and peopleperhour.com — for keywords taken from user input and exports the results to three separate CSV files.
- Ruby
- Nokogiri gem
- HTTParty gem
├── README.md
├── bin
│   └── main.rb
├── lib
│   ├── scraper.rb
│   └── csv_exporter.rb
└── rspec
    ├── scraper_spec.rb
    ├── csv_exporter_spec.rb
    └── spec_helper.rb
Feel free to check out this link for a 5-minute video walkthrough :)
- Git clone this repo and cd into the `freelance-job-scraper` directory.
- Run `bundle install` in the command line to install the Nokogiri and HTTParty gems.
- Run `ruby bin/main.rb`.
- Input keywords on separate lines. Press the Enter key on a new line to begin the search.
- Tada! `freelance.com-jobs.csv`, `guru.com-jobs.csv`, and `peopleperhour.com-jobs.csv` will be created in the root directory :)
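The export step can be sketched with Ruby's standard `CSV` library. The file name, headers, and job fields below are illustrative assumptions, not necessarily the project's actual columns:

```ruby
require 'csv'

# Hypothetical sketch of the CSV export: write scraped jobs to a
# site-specific file, one row per job. Headers and fields here are
# placeholders for illustration.
jobs = [
  { title: 'Ruby developer needed', url: 'https://example.com/job/1' },
  { title: 'Fix a Rails bug',       url: 'https://example.com/job/2' }
]

CSV.open('example-jobs.csv', 'w') do |csv|
  csv << %w[Title URL]
  jobs.each { |job| csv << [job[:title], job[:url]] }
end
```

The real project writes one such file per site, which is why three CSVs appear after a run.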
- Git clone this repo and cd into the `freelance-job-scraper` directory.
- Install RSpec with `gem install rspec`.
- Run `rspec` in the command line. `15 examples, 0 failures` will be shown on the screen.
- Let the user select which sites to scrape
- Add more sites for scraping
- Let the user specify how many jobs to scrape
👤 Tirthajyoti Ghosh
- Github: @tirthajyoti-ghosh
- Twitter: @terrific_ghosh
- Linkedin: Tirthajyoti Ghosh
Contributions, issues and feature requests are welcome!
Feel free to check the issues page.
Give a ⭐️ if you like this project!
- Microverse
- Nokogiri gem
- HTTParty gem
- Freelancer.com
- Guru.com
- Peopleperhour.com
This project is MIT licensed.