Artificial Intelligence and Machine Learning to help find scientific research and filter relevant content







Gregory is an AI system that uses Machine Learning and Natural Language Processing to track clinical research and identify papers that improve the wellbeing of patients.

Sources for research can be added by RSS feed or manually.

The output can be viewed on a static site or via the API provided by the Django Rest Framework.

The docker compose file also includes a Metabase container, which is used to build dashboards and manage notifications.

Sources can also be added to monitor Clinical Trials, in which case Gregory can notify a list of email subscribers.

For other integrations, the Django app provides RSS feeds with a live update of relevant research and newly posted clinical trials.


  1. Machine Learning to identify relevant research
  2. Configure RSS feeds to gather search results from PubMed and other websites
  3. Configure searches on any public website
  4. Integration with Mailgun to send emails
  5. Automatic emails to the admin team with results from the last 48 hours
  6. Subscriber management
  7. Configure email lists for different stakeholders
  8. Public and private API to integrate with other software solutions and websites
  9. Configure categories to organize search results based on keywords in the title
  10. Configure different “subjects” to keep different research areas segmented
  11. Identify authors and their ORCID
  12. Generate different RSS feeds

Current Use Case for Multiple Sclerosis

Running in Production

Server Requirements

Installing Gregory

1. Clone and Install

  1. Clone the repository:
    git clone <repository_url>
    cd <repository_directory>
    docker compose up -d 
    docker exec admin python manage.py makemigrations
    docker exec admin python manage.py migrate

2. Setup DNS for api.domain.etc

  1. Log in to your DNS provider.
  2. Add a new A record for api.domain.etc pointing to your server's IP address.

3. Setup DNS for Mailgun mg.domain.etc

  1. Log in to your DNS provider.
  2. Add the following DNS records provided by Mailgun for mg.domain.etc:
    • TXT record
    • MX record
    • CNAME record

4. Get Mailgun API Keys and Add to .env

  1. Log in to your Mailgun account.
  2. Navigate to API Keys.
  3. Copy the private API key.
  4. Add the key to your .env file.
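To check that the key works, you can exercise Mailgun's HTTP API directly. The sketch below builds the request using only the Python standard library, assuming Mailgun's documented v3 messages endpoint; the domain, sender, and recipient values are placeholders:

```python
import base64
import urllib.parse
import urllib.request

def build_mailgun_request(domain, api_key, sender, to, subject, text):
    # Mailgun's HTTP API accepts form-encoded POSTs to the v3 messages
    # endpoint, authenticated with HTTP Basic auth: user "api", password
    # set to the private API key.
    url = f"https://api.mailgun.net/v3/{domain}/messages"
    data = urllib.parse.urlencode(
        {"from": sender, "to": to, "subject": subject, "text": text}
    ).encode()
    request = urllib.request.Request(url, data=data, method="POST")
    token = base64.b64encode(f"api:{api_key}".encode()).decode()
    request.add_header("Authorization", f"Basic {token}")
    return request
```

Passing the result to `urllib.request.urlopen` would send the message.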

5. Get ORCID API Keys and Add to .env

  1. Log in to your ORCID account.
  2. Navigate to Developer Tools and create an API client.
  3. Copy the client ID and client secret.
  4. Add the following to your .env file:
5.1. Make sure your .env file is complete
# Set this to the subdomain you configured with Mailgun. Example:
# The SMTP server and credentials you are using. For example:
# These variables are only needed if you plan to send notification emails
# We use Mailgun by default on the newsletters, input your API key here
# Where you cloned the repository
# Set your postgres DB and credentials
SECRET_KEY='Yeah well, you know, that is just, like, your DJANGO SECRET_KEY, man' # you should set this manually
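As a sketch, a complete .env might look like the following. The variable names here are illustrative assumptions only; check env.example in the repository for the authoritative list:

```
# Illustrative values only -- the variable names are assumptions, see env.example
MAILGUN_DOMAIN='mg.domain.etc'
MAILGUN_API_KEY='key-xxxxxxxx'
EMAIL_HOST='smtp.mailgun.org'
EMAIL_USER='postmaster@mg.domain.etc'
EMAIL_PASSWORD='changeme'
GREGORY_DIR='/home/user/gregory'
POSTGRES_DB='gregory'
POSTGRES_USER='gregory'
POSTGRES_PASSWORD='changeme'
SECRET_KEY='a-long-random-string'
```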

6. Configure Server

6.1. Nginx
  1. Install Nginx:
    sudo apt-get update
    sudo apt-get install nginx
  2. Configure Nginx for your application:
    sudo nano /etc/nginx/sites-available/default
    • Add your server block configuration.
  3. Test and restart Nginx:
    sudo nginx -t
    sudo systemctl restart nginx
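A minimal server block for the api subdomain might look like the sketch below. The upstream port 8000 is an assumption; check which port the admin container publishes in docker-compose.yml:

```nginx
server {
    server_name api.domain.etc;

    location / {
        # Proxy to the admin (Django) container; adjust the port as needed.
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```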
6.2. Certbot
  1. Install Certbot:
    sudo apt-get install certbot python3-certbot-nginx
  2. Obtain and install SSL certificate:
    sudo certbot --nginx -d domain.etc -d www.domain.etc
6.3. Firewall
  1. Allow necessary ports:
    sudo ufw allow 'Nginx Full'
    sudo ufw enable

7. Configure Gregory

7.1. Create a Site
  1. Log in to the Gregory dashboard.
  2. Navigate to Sites and click Create Site.
7.2. Create a Team
  1. Navigate to Teams and click Create Team.
7.3. Add a User to the Team
  1. Navigate to Teams, select the team, and click Add User.
  2. Enter the user's email and assign a role.
7.4. Add a Source, such as PubMed
  1. Navigate to Sources and click Add Source.
  2. Select RSS method and provide the necessary configuration.

8. Add cronjobs to run the pipeline and send emails

# Every 2 days at 8:00
0 8 */2 * * /usr/bin/docker exec admin python manage.py send_admin_summary

# Every Tuesday at 8:05
5 8 * * 2 /usr/bin/docker exec admin python manage.py send_weekly_summary

# Every 12 hours, at minute 25
25 */12 * * * /usr/bin/flock -n /tmp/pipeline /usr/bin/docker exec admin ./manage.py pipeline
  1. Execute python3

The script checks that you have all the requirements and runs the steps to help you set up the containers.

Once finished, log in at https://api.DOMAIN.TLD/admin, or wherever your reverse proxy is listening.

  1. Go to the admin dashboard and change the site to match your domain
  2. Go to custom settings and set the Site and Title fields.
  3. Configure your RSS Sources in the Django admin page.
  4. Set up database maintenance tasks. Gregory needs to run a series of tasks to fetch missing information before applying the machine learning algorithm. For that, we are using django-cron. Add the following to your crontab:
*/3 * * * * /usr/bin/docker exec -t admin ./manage.py runcrons
#*/10 * * * * /usr/bin/docker exec -t admin ./manage.py get_takeaways
*/5 * * * * /usr/bin/flock -n /tmp/get_takeaways /usr/bin/docker exec admin ./manage.py get_takeaways

How everything fits together


Most of the logic lives inside Django: the admin container provides the Django Rest Framework, manages subscriptions, and sends emails.

The following subscriptions are available:

Admin digest

This is sent every 48 hours with the latest articles and their machine learning predictions. It gives the admin an Edit link where each article can be reviewed and tagged as relevant.

Weekly digest

This is sent every Tuesday and lists the relevant articles discovered in the last week.

Clinical Trials

This is sent every 12 hours if a new clinical trial was posted.

The footer title for these emails needs to be set in the Custom Settings section of the admin backoffice.

Django also allows you to add new sources from where to fetch articles. Take a look at /admin/gregory/sources/



Emails are sent from the admin container using Mailgun.

To enable them, you will need a Mailgun account, or you can swap in another method of sending email.

You need to configure the relevant Mailgun variables in your .env file for this to work.


As an alternative, you can configure Django to use any other email server.

RSS feeds and API

Gregory has the concept of a 'subject'. Currently, Multiple Sclerosis is the only subject configured. A subject is a group of sources and their respective articles. There are also categories: a category is a group of articles whose title matches at least one keyword in the list for that category. Categories can include articles across subjects.
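The category rule described above (a title matching at least one keyword) can be sketched in Python. The function name and case-insensitive matching are assumptions for illustration, not Gregory's actual implementation:

```python
def article_matches_category(title, keywords):
    # A category collects articles whose title contains at least one of the
    # category's keywords; matching is done case-insensitively here.
    lowered = title.lower()
    return any(keyword.lower() in lowered for keyword in keywords)
```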

Lists of articles can be filtered by category or subject using the formats articles/category/<category> and articles/subject/<subject>, where <category> and <subject> are the lowercase name with spaces replaced by dashes.
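A minimal sketch of that slug rule (lowercase, spaces to dashes); the helper name is hypothetical:

```python
def to_slug(name):
    # Lowercase the name and replace spaces with dashes, per the URL format.
    return name.lower().replace(" ", "-")

# e.g. "Multiple Sclerosis" -> articles/subject/multiple-sclerosis
subject_path = "articles/subject/" + to_slug("Multiple Sclerosis")
```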

Available RSS feeds

  1. Latest articles, /feed/latest/articles/
  2. Latest articles by subject, /feed/articles/subject/<subject>/
  3. Latest articles by category, /feed/articles/category/<category>/
  4. Latest clinical trials, /feed/latest/trials/
  5. Latest relevant articles by Machine Learning, /feed/machine-learning/
  6. Twitter feed, /feed/twitter/. This includes all relevant articles, selected manually or by machine learning prediction. It's read by Zapier so that we can post on Twitter automatically.
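Any of these feeds can be consumed with a standard RSS parser. A minimal sketch using only the Python standard library, run here against an inline sample document (the sample's field values are illustrative, not real feed output):

```python
import xml.etree.ElementTree as ET

# Inline sample standing in for e.g. /feed/latest/articles/ (illustrative values).
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Gregory - latest articles</title>
    <item>
      <title>Example article</title>
      <link>https://example.org/articles/1</link>
    </item>
  </channel>
</rss>"""

def feed_titles(xml_text):
    # Collect the title of every <item> in an RSS 2.0 document.
    root = ET.fromstring(xml_text)
    return [item.findtext("title") for item in root.iter("item")]
```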

How to update the Machine Learning Algorithms

This is not working right now; there is a pull request to set up an automatic process that keeps the machine learning models up to date.

It's useful to re-train the machine learning models once you have a good number of articles flagged as relevant.

  1. cd docker-python; source .venv/bin/activate
  2. python3
  3. python3

Running for local development

Edit the env.example file to fit your configuration and rename it to .env.

sudo docker-compose up -d
python3 -m venv env
source env/bin/activate
pip install -r requirements.txt

Thank you to

  • @Antoniolopes for helping with the Machine Learning script.
  • @Chbm for help in keeping the code secure.
  • @Jneves for help with the build script.
  • @Malduarte for help with the migration from sqlite to postgres.
  • @Melo for showing me Hugo.
  • @Nurv for the suggestion of using
  • @Rcarmo for showing me Node-RED.

And the Lobsters at One Over Zero