(draft) A very scalable and easy-to-use crawler built with Node.js, MongoDB and Redis

itemsapi/crawler


Scalable crawler made easy

At the moment the project is in draft mode. I've built many crawlers already, and I'd like to share that experience soon and improve this project so more people can use it.

Technologies

  • Node.js
  • MongoDB
  • Redis

Desired features

  • easy to scale (by adding new workers on remote machines)
  • easy to deploy (DigitalOcean, AWS, etc.)
  • intuitive (very easy to get started)
  • easy data management (import / export)
  • crawling many different websites at the same time
  • CLI
  • easy status monitoring (to see what's going on)
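The scaling model behind the first feature can be sketched as a shared work queue: every worker, local or remote, pulls URLs from the same queue, so adding capacity just means starting more worker processes. Below is a minimal sketch of that pattern. The names (`queue`, `fetchPage`, `runWorker`) are illustrative, not the project's actual API, and a plain in-memory array stands in for the Redis list a real deployment would use (LPUSH to enqueue, BRPOP in each worker).

```javascript
// In-memory stand-in for a shared Redis list such as "crawl:queue".
// In the real system, LPUSH would enqueue URLs and each worker would BRPOP.
const queue = ['https://example.com/a', 'https://example.com/b'];
const results = [];

// Stub fetch; a real worker would download the page and store it in MongoDB.
function fetchPage(url) {
  return { url, status: 200 };
}

// One worker drains the queue. Scaling out means starting more of these
// processes on other machines, all pointed at the same Redis instance.
function runWorker() {
  while (queue.length > 0) {
    const url = queue.shift(); // stands in for BRPOP
    results.push(fetchPage(url));
  }
}

runWorker();
console.log(`processed ${results.length} pages`);
```

Because the queue is the only shared state, workers need no knowledge of each other, which is what makes "add another machine" the whole scaling story.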

Inspirations
