chiplibrary-data

The spider script and dump files used to populate the chiplibrary database.

Preamble

The spider uses Scrapy, a web scraping framework written in Python, to collect battle chip library data for the six original Mega Man Battle Network games. The XML dump generated by the spider is meant to be used together with chiplibrary; however, I am releasing it as a separate project in the hope that someone may find it useful for purposes beyond those intended for chiplibrary.

Because of the incomplete nature of the information this spider gathers, not all relevant in-game data is available yet; however, I am working to eventually make this a complete, extensive, and accurate reference.

Requirements

- Python 3
- Scrapy

Using the spiders

scrapy runspider spider.py

After running the spider, an XML file called chips.xml will be generated in your dumps folder. This file can be used alongside the initialize_chiplibrary_db script in chiplibrary, or for your own purposes. Note that in order to use a file with chiplibrary, it must be in the XML format generated here.
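If you want to use the dump for your own purposes, it can be read with Python's standard library XML parser. This is a minimal sketch only: the element names (`chip`, `name`, `game`) and the inline sample below are assumptions for illustration, and may not match the actual structure of chips.xml.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment mimicking what the spider might emit;
# the real chips.xml may use different element names.
sample = """
<chips>
  <chip><name>Cannon</name><game>bn1</game></chip>
  <chip><name>Sword</name><game>bn1</game></chip>
</chips>
"""

root = ET.fromstring(sample)

# Collect the text of each assumed <name> child under each <chip> element.
names = [chip.findtext("name") for chip in root.findall("chip")]
print(names)
```

To read the actual dump instead of the inline sample, replace `ET.fromstring(sample)` with `ET.parse("dumps/chips.xml").getroot()` and adjust the element names to match the real output.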

Credits

All of the information gathered here is primarily for use in the chiplibrary project. For the full list of contributors, see the official website:

[http://chips.nachtara.com/](http://chips.nachtara.com/)