
Baryon

Baryon is a package index for SuperCollider plugins. These come in two flavors: quarks, which extend the standard library (and thereby sclang), and extensions, which add new DSP functionality to the server via UGens.

github.com/supercollider-quarks/quarks serves as the index for quarks, but since no comparable index exists for extensions, extensions.yml tries to collect them. The entries are kept in a file rather than a database because a file allows decentralized contributions and guarantees that no information is stored exclusively in the database.

If you want an extension to be tracked by Baryon, please open a PR which adds it to ./baryon/extensions.yml.
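For illustration, an entry could look like the sketch below. The field names are hypothetical; the actual schema is defined by the existing entries in ./baryon/extensions.yml, which should be used as a template.

# hypothetical entry - field names are assumptions, copy an
# existing entry from extensions.yml for the actual schema
- name: MyUGens
  url: https://github.com/example/my-ugens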

Development

In order to verify the integrity of each commit it is necessary to install and set up pre-commit.
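pre-commit can be installed via pip and hooked into the local repository like this:

pip install pre-commit
pre-commit install

# optional: run all hooks once against the entire codebase
pre-commit run --all-files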

Run local dev server

In order to boot up the local development server, use

make local

Test types

In order to run static type analysis, use

make test-types

Deployment

The service is deployed on a server via Docker, which exposes the web server on port 8080. On the host machine, an nginx reverse proxy is used to expose this service under the desired URL, with a configuration similar to

server {
    server_name baryon.supercollider.online;

    # adjust listen to 80 / 443

    location / {
        # add_header Access-Control-Allow-Origin *;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        # pass redirects from the backend through unmodified
        proxy_redirect off;

        proxy_http_version 1.1;

        proxy_pass http://127.0.0.1:8080;
    }
}

Use certbot to obtain an SSL certificate for the website.
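Assuming certbot's nginx plugin is installed, this can be done via

sudo certbot --nginx -d baryon.supercollider.online

which obtains a certificate for the domain and adjusts the nginx configuration accordingly.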

As the scraping needs to run at regular intervals, it is necessary to create a recurring job which executes the Django command

python manage.py scrape_projects

inside the backend Docker container.
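Assuming the stack is managed via Docker Compose and the container belongs to a service named backend, such an invocation could look like

docker compose exec backend python manage.py scrape_projects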

systemd can be used to declare a service which executes this command, together with a timer which triggers it at regular intervals. Assuming the service is deployed as the service user baryon under the directory /home/baryon/baryon, the provided systemd unit files can be linked and activated with the commands further below.
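For reference, the two units are typically structured as in the following sketch. The unit files shipped in the repository are authoritative; the schedule and the ExecStart line here are assumptions.

# baryon-scraper.service (sketch)
[Unit]
Description=Scrape projects for the Baryon index

[Service]
Type=oneshot
User=baryon
WorkingDirectory=/home/baryon/baryon
# assumes a Docker Compose service named backend
ExecStart=/usr/bin/docker compose exec -T backend python manage.py scrape_projects

# baryon-scraper.timer (sketch)
[Unit]
Description=Trigger baryon-scraper at regular intervals

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target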

sudo ln -s /home/baryon/baryon/baryon-scraper.service /etc/systemd/system/baryon-scraper.service
sudo ln -s /home/baryon/baryon/baryon-scraper.timer /etc/systemd/system/baryon-scraper.timer

sudo systemctl daemon-reload

sudo systemctl start baryon-scraper.timer
sudo systemctl enable baryon-scraper.timer
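Whether the timer is active and when it will fire next can be checked via

systemctl list-timers baryon-scraper.timer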

A scraping run can also be triggered manually via the command

sudo systemctl start baryon-scraper

# check status
sudo systemctl status baryon-scraper

License

AGPL-3.0