Cleaning up readme
dado3212 committed May 22, 2018
1 parent 43affea commit c387928
Showing 1 changed file with 3 additions and 1 deletion.
README.md: 4 changes (3 additions & 1 deletion)
@@ -14,7 +14,7 @@ I kept having difficulty figuring out what classes to take, and alternating betw
* Filter out previous classes

### Setup
- First, you'll need to set up the MySQL database. The `Create_Tables.sql` file in the `scrapers` folder contains the SQL code to create the tables. Then, you'll need to create a file called `secret.php` in the `php` folder. All that folder has is a function called "createConnection" which will need to create a PDO connection to the database with the tables you previously created.
+ First, you'll need to set up the MySQL database. The `Create_Tables.sql` file in the `scrapers` folder contains the SQL code to create the tables. Then, you'll need to create a file called `secret.php` in the `php` folder. All that file has is a function called "createConnection", which creates a PDO connection to the database with the tables you previously created. You'll also need to create a file called `secrets.py` in the `scrapers` folder. All that file has is a function called "mysql_connect", which returns a `MySQLdb.connect` call with user/password information.
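For concreteness, here's a minimal sketch of what `secret.php` might look like. The host, database name, and credentials are placeholders; the only thing the README specifies is a `createConnection` function that produces a PDO connection:

```php
<?php
// php/secret.php: a minimal sketch. The DSN, user, and password below
// are placeholders; substitute your own database details.
function createConnection() {
    $pdo = new PDO(
        'mysql:host=localhost;dbname=your_database;charset=utf8',
        'your_user',
        'your_password'
    );
    // Surface database errors as exceptions instead of silent failures.
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    return $pdo;
}
```

And a matching sketch of `secrets.py`, again with placeholder credentials; the README only specifies a `mysql_connect` function that returns a `MySQLdb.connect` call:

```python
# scrapers/secrets.py: a minimal sketch with placeholder credentials.
import MySQLdb

def mysql_connect():
    return MySQLdb.connect(
        host='localhost',
        user='your_user',
        passwd='your_password',
        db='your_database',  # the database created from Create_Tables.sql
    )
```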

Then you'll need to populate the tables with the scrapers. Information on running the scrapers can be found under the 'scrapers' subsection. You can then upload the generated CSV files using the `Upload.sql` file, which contains the SQL code to truncate or update the existing tables from a local CSV. Finally, add the new term to the `util.php` file as the last option, which will be the default.
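The contents of `Upload.sql` aren't shown here, but the truncate-and-reload pattern it describes usually looks something like this in MySQL (table and file names are hypothetical):

```sql
-- Hypothetical example of the truncate-and-reload pattern: wipe a
-- table, then repopulate it from a locally generated CSV.
TRUNCATE TABLE courses;
LOAD DATA LOCAL INFILE 'courses.csv'
INTO TABLE courses
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the CSV header row, if present
```

Note that `LOAD DATA LOCAL INFILE` requires the `local_infile` option to be enabled on both the client and the server.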

@@ -33,6 +33,8 @@ The project relies on various scrapers to create a MySQL database that is querie
</li>
</ul>

+ The second two scrapers can be run automatically with Python 3 via `python3 update.py`. The script automatically checks for new terms and uploads them accordingly, a check that's run on my server daily.
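The daily check isn't included in the repository itself; on a typical Linux server it could be scheduled with a cron entry along these lines (the path is hypothetical):

```
# Hypothetical crontab entry: check for new terms every day at 5 AM.
0 5 * * * cd /path/to/scrapers && python3 update.py
```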

---

* HTML
