This is a simple CLI application built as a hackathon submission. It scrapes a hackathon listing site (Devpost is used as the reference) for hackathons and stores the results in a JSON file in the project directory.
- Python (main programming language)
- Selenium (for web scraping, since Devpost renders its content dynamically)
- Beautiful Soup (to parse the Selenium-fetched HTML into a readable format)
- Git and GitHub
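The Selenium-plus-Beautiful-Soup flow described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the project's actual code: the CSS class names (`hackathon-tile`, `prize`) and the output filename `hackathons.json` are hypothetical, and in the real app the HTML would come from Selenium's `driver.page_source` after loading the Devpost page, rather than from a hard-coded string.

```python
import json
from bs4 import BeautifulSoup

# In the real app this HTML would come from Selenium, e.g.:
#   driver.get("https://devpost.com/hackathons")
#   html = driver.page_source
# A static snippet stands in for it here; the class names are hypothetical.
html = """
<div class="hackathon-tile"><h3>AI Builders Hack</h3><span class="prize">$5,000</span></div>
<div class="hackathon-tile"><h3>Open Data Jam</h3><span class="prize">$1,000</span></div>
"""

def parse_hackathons(page_html: str) -> list[dict]:
    """Parse hackathon tiles out of the rendered page HTML."""
    soup = BeautifulSoup(page_html, "html.parser")
    results = []
    for tile in soup.select("div.hackathon-tile"):
        results.append({
            "name": tile.h3.get_text(strip=True),
            "prize": tile.select_one("span.prize").get_text(strip=True),
        })
    return results

hackathons = parse_hackathons(html)

# Store the scraped results as JSON in the project directory.
with open("hackathons.json", "w") as f:
    json.dump(hackathons, f, indent=2)
```

The key design point is the split: Selenium only renders the dynamic page, while Beautiful Soup does the actual extraction from the resulting HTML.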
Git served as the main version control system: I used it to keep my development history in clean, readable, understandable commits. Because Git records every change to the files, it is easy to step back through the code to locate bugs, and if the local codebase is ever lost, the commit history can be recovered from the remote. GitHub served as the platform for hosting the repository, where I can view and read through all the changes in the project. I also used GitHub Actions to set up a linter that lints my files whenever I push to production.
- Ensure Python is installed; if not, install it from the official Python website.
- Install uv. Open your terminal and run:

  ```shell
  pip install uv
  ```

- Clone the repository:

  ```shell
  git clone "git@github.com:Towbee05/hackathon-scraper.git"
  ```

- Change into the working directory:

  ```shell
  cd hackathon-scraper
  ```

- Create a virtual environment:

  ```shell
  uv venv .venv
  ```

- Activate the virtual environment:

  ```shell
  # Windows
  .venv\Scripts\activate
  # Linux/macOS
  source .venv/bin/activate
  ```

- Install all dependencies:

  ```shell
  uv pip install -r requirements.txt
  ```

- Run the project:

  ```shell
  uv run app/main.py
  ```
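After a run completes, the saved results can be inspected with a few lines of Python. Note that the filename `hackathons.json` is an assumption for illustration; check the project directory for the actual file the run produces.

```python
import json
from pathlib import Path

# Hypothetical output filename; adjust to whatever the run actually writes.
out = Path("hackathons.json")

# Stand-in record written only if no previous run exists, so this snippet
# is self-contained; a real run's file would be used instead.
if not out.exists():
    out.write_text(json.dumps([{"name": "Example Hack", "prize": "$1,000"}]))

hackathons = json.loads(out.read_text())
print(f"Found {len(hackathons)} hackathon(s)")
for h in hackathons:
    print(h)
```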
This web scraper is intended for learning and practice purposes only, not for any misuse.



