This project is an RSS web crawler written in Go that checks the selected URLs for new feed posts every hour and stores the data in a database.
Name | Version | Notes | Mandatory |
---|---|---|---|
golang | >= go1.18 | Main programming language | true |
make | any | Run shortcuts | false |
The mandatory tools/libs to run this web crawler:
Name | Notes | command |
---|---|---|
gofeed | Parses RSS/Atom feeds | go get github.com/mmcdole/gofeed |
Name | Version | Notes | command |
---|---|---|---|
sqlite | any stable version | This repo uses the go-sqlite3 driver | go get github.com/mattn/go-sqlite3 |
From the root folder, run:

```sh
make s # Run the script in sync mode
```

or

```sh
make a # Run the script with goroutines
```