UBC's own search engine
Please see CONTRIBUTING for guidelines on how to contribute to this repo.
- Clone this repository and the Sleuth frontend into the same directory
- Install Docker
$ docker-compose up --build
- Once the containers have started, you can open a shell in the web container and configure a Django admin user.
$ docker-compose exec web bash
# Create a Django admin user
root@57d91373cdca:/home/sleuth# python3 manage.py createsuperuser
- To access your Solr admin interface, go to http://localhost:8983/solr.
- To query a core with the name "test", go to http://localhost:8983/solr/#/test/query.
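The same query page can be reproduced programmatically. A minimal sketch of building a URL for Solr's standard `/select` handler (the handler and parameters are standard Solr; the core name `test` is just the example above):

```python
from urllib.parse import urlencode

SOLR_BASE = "http://localhost:8983/solr"

def solr_query_url(core, query="*:*", rows=10):
    """Build a URL for Solr's standard /select query handler."""
    params = urlencode({"q": query, "rows": rows, "wt": "json"})
    return "{}/{}/select?{}".format(SOLR_BASE, core, params)

# Query everything in the "test" core
print(solr_query_url("test"))
```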
- The base url for your Django instance should be http://localhost:8000.
- To access the Django admin interface, make sure you have completed the steps listed above, then go to http://localhost:8000/admin.
- To test the backend API, go to http://localhost:8000/api/[ENDPOINT]/?[PARAMS]
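A sketch of how the `/api/[ENDPOINT]/?[PARAMS]` pattern expands into a concrete URL; note that `search` and `q` below are placeholders for illustration, not Sleuth's actual endpoint or parameter names:

```python
from urllib.parse import urlencode

API_BASE = "http://localhost:8000/api"

def api_url(endpoint, **params):
    """Build a URL following the /api/[ENDPOINT]/?[PARAMS] pattern."""
    return "{}/{}/?{}".format(API_BASE, endpoint, urlencode(params))

# Hypothetical example -- "search" and "q" stand in for a real endpoint and params
print(api_url("search", q="ubc"))
```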
Accessing the Sleuth Front-end App
- Go to http://localhost:8080
The Sleuth front-end lives in a separate repository.
Adding Test Data
Once you have started your containers, you can populate the "test" core in Solr with some test data by running:
$ bash scripts/populate.sh
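A script like this presumably posts documents to Solr's update handler. As a hedged sketch of what such a request looks like, here is the standard Solr JSON update payload; the document field names are invented for illustration and are not Sleuth's actual schema:

```python
import json

# Hypothetical documents -- field names are illustrative, not Sleuth's schema
docs = [
    {"id": "1", "title": "UBC Computer Science", "url": "https://cs.ubc.ca"},
    {"id": "2", "title": "UBC Library", "url": "https://library.ubc.ca"},
]

# Solr accepts a JSON array of documents POSTed to /solr/<core>/update
payload = json.dumps(docs)
update_url = "http://localhost:8983/solr/test/update?commit=true"
print(payload)
```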
For live data, you can currently run the BroadCrawler, which scrapes a few thousand pages and pipelines them into the appropriate cores based on their type.
$ bash sleuth_crawler/run_crawlers.sh
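"Pipelines them into the appropriate cores based on their type" suggests a type-to-core routing step. A minimal sketch of that idea, with invented type and core names (Sleuth's actual names may differ):

```python
# Hypothetical mapping from scraped-page type to Solr core name
CORE_FOR_TYPE = {
    "course": "courses",
    "redditPost": "redditPosts",
}
DEFAULT_CORE = "genericPages"

def core_for(page_type):
    """Pick the Solr core a scraped page should be indexed into."""
    return CORE_FOR_TYPE.get(page_type, DEFAULT_CORE)

print(core_for("course"))    # -> courses
print(core_for("blogPost"))  # unknown type falls back to the default core
```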
To empty a core, go to:
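As a general note, Solr cores are conventionally emptied with a delete-by-query POST to the update handler; a sketch of that request body (this is the standard Solr API, not a Sleuth-specific endpoint):

```python
import json

# Standard Solr delete-by-query: "*:*" matches every document in the core
delete_body = json.dumps({"delete": {"query": "*:*"}})
update_url = "http://localhost:8983/solr/test/update?commit=true"
print(delete_body)
```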