Uses the "TwitterAPI" Python module to collect tweets matching a search query and geocode, and writes the results to *.json files.
TwitterCrawler.py makes use of TwitterAPI and AlchemyAPI. In order to use TwitterCrawler, you must have valid API keys for each service.
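TwitterCrawler's source isn't reproduced here, but as a rough sketch of the kind of request it issues through the TwitterAPI module (the helper names and coordinate values below are illustrative, not taken from the project):

```python
def load_credentials(line):
    """Split one credentials.txt line (comma-separated, no spaces) into its four keys."""
    return tuple(line.strip().split(','))

def geocode_param(lat, lon, radius_km):
    """Build the 'latitude,longitude,radiuskm' string Twitter's search endpoint expects."""
    return '{},{},{}km'.format(lat, lon, radius_km)

# With valid keys in credentials.txt, a geocoded search would then look roughly like:
#   from TwitterAPI import TwitterAPI
#   api = TwitterAPI(*load_credentials(open('credentials.txt').readline()))
#   r = api.request('search/tweets',
#                   {'q': 'CANDIDATE_NAME',
#                    'geocode': geocode_param(39.9612, -82.9988, 50)})
```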
- Create a file called "credentials.txt" in the same directory as TwitterCrawler.py. This file contains your TwitterAPI keys.
- Create a new Twitter App here.
- Once you've created your app, go to the "Keys and Access Tokens" tab.
- Click "Generate my access token"
- Record your Consumer Key, Consumer Secret, Access Token, and Access Token Secret.
- In credentials.txt, place your four keys on a single line in the following format: separated by commas, with no spaces.
- If you have more than one Twitter account, you may place additional API keys on subsequent lines to circumvent the limit on API calls per account.
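For example, a credentials.txt covering two Twitter accounts might look like this (placeholder values; the keys are assumed to appear in the order they were recorded above):

```
CONSUMER_KEY,CONSUMER_SECRET,ACCESS_TOKEN,ACCESS_TOKEN_SECRET
CONSUMER_KEY_2,CONSUMER_SECRET_2,ACCESS_TOKEN_2,ACCESS_TOKEN_SECRET_2
```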
- Create a file called "api_key.txt" in the same directory as TwitterCrawler.py. This file contains your AlchemyAPI keys.
- Register for an Alchemy API key here.
- In "api_key.txt" place your Alchemy API key on the first line.
- If you have more than one AlchemyAPI key, you may place additional API keys on subsequent lines to circumvent the limit on API calls per account.
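Likewise, an api_key.txt with two keys would simply be one key per line (placeholder values):

```
YOUR_ALCHEMY_API_KEY
YOUR_SECOND_ALCHEMY_API_KEY
```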
- Run the following command to install the necessary Python packages.
pip install TwitterAPI
- Run the crawler from within the project directory, e.g.:
python TwitterCrawler.py
The "candidates.txt" file may be edited to add or remove candidates to search for. Each candidate must be written on a separate line. By default, this file contains the 2016 US Presidential candidates as of September 2015.
The "state-coords.txt" file may be edited to add or remove locations to search. Each line has the following format:
LOCATION_NAME LATITUDE,LONGITUDE RADIUS
where LATITUDE and LONGITUDE are in decimal degrees and RADIUS is in kilometers.
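For instance, a line such as the following (illustrative values) would search a 50 km radius around Columbus, Ohio:

```
Columbus 39.9612,-82.9988 50
```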