
Hedy Development Process

boryanagoncharenko edited this page Jun 6, 2024 · 40 revisions

This page will help you get started programming on Hedy. Make sure that before you do start programming, you understand the Hedy product and project a bit better. You can find that info on the Wiki homepage.

Running Hedy to make changes

The easiest way to get a working development environment to work on Hedy is through Github Codespaces.

Open in GitHub Codespaces

This will open a VS Code instance in the browser and allow you to work on the Hedy code. This is really useful for quickly making small and trivial changes without needing to install anything. The first thing you need to do, before making any changes, is to switch from the main branch to the branch you are working on. You can find how to do this in the documentation, along with how to commit your changes. If you cannot find the existing branch you want to switch to, you might first need to run python3 app.py and stop it again (using Ctrl-C) to make the different branches appear.

Github Codespaces is only free for a certain amount of CPU-hours each month, so if you want to work on Hedy regularly, it might be better to run Hedy on your machine.

Run Hedy on your machine using Docker and develop using VS Code

Important

If you check out and edit Hedy code on your host machine (for example, Windows) while running it under Linux inside a Docker container, you will have to use VS Code if you want debugging support, and you need to be aware of git configuration details like line endings. Not many people on the development team choose this setup. If you pick it, you will be mostly on your own trying to get it to work, so only choose this if you are already comfortable with the components involved.

If you want to run the website locally and would prefer to use Docker you can build a container with:

docker build -t hedy .

and then you can run the docker container with:

docker run -it --rm -p 8080:8080 --mount type=bind,source="$(pwd)",target=/app --name hedy hedy

After that, you can access bash inside the container with:

docker exec -it hedy bash

VS Code has a great Dev Containers extension that allows you to connect the IDE to a development (docker) container. More info can be found on https://code.visualstudio.com/docs/devcontainers/containers

After opening this repo in VS Code, it will ask whether you want to open the folder in a container. Accept, and you will have a working environment in which you can develop Hedy.

Local installation (without Docker)

To run Hedy on your own computer, you need to have the following bits of software installed:

  • Python 3.9 or higher
  • (Optional) Node.js, if you plan to work on the front-end

Windows machine

Important

If you use Git on Windows it's extremely important that when you install it you disable the feature where git changes line endings (also known as "auto crlf"). If you have already installed Git in the past and you don't remember what you configured, have a look at this StackOverflow post to confirm.

On a Windows machine, you have two options to develop on Hedy:

  • Install all dependencies and edit natively on Windows. This allows you to use any IDE.
    • Python (use the distribution from the python.org website, not from the Microsoft Application Store)
    • Node.js
    • bash must be available on your machine (this usually comes installed with Git for Windows. To check, press Win+R, run cmd, then run bash).
    • You will probably need to install local build tools from Microsoft Visual C++ 14.0 or higher.
  • Run a Linux machine on Windows using either Docker or WSL2. The same caveats as the Docker solution apply: you must pay attention to line endings in your git configuration, and you must use VS Code if you want debugging support. When using WSL, it is better to clone the repository within the Ubuntu file system to prevent line-ending and performance problems, and to use the npm and node installed in Ubuntu rather than the ones pointing to Windows.
    • Visual Studio Code has seamless integration with WSL. Install it and add extensions for GitHub, Python and YAML. At the left-bottom of your VSCode window, you'll find a green >< sign to connect to WSL. Select the distro you created for Hedy development in the WSL installation. Clone the hedy repository and open the project. For the rest you can follow the main path.

Preparing environment

Then, here's how to get started once you have downloaded or cloned the code:

$ python3 -m venv .env
$ source .env/bin/activate
(.env) $ pip install -r requirements.txt

# On Windows, the activate command looks like this
C:\...> .env\scripts\activate

If you want to run the website version locally, run:

# Run this once to reset the development database with some default users.
(.env) $ doit run devdb

# Run all backend build steps and run the development server
(.env) $ doit run devserver

Your local Hedy version should be available at http://localhost:8080/.

Development database

The development database contains some local users and test data which you can use to test functionality. If you reset the database while the web server is running, you have to restart the web server (or Docker container), as the database is loaded into memory on startup. You can log in using e.g. teacher1, student1, user or admin. All these users have the password 123456:

Username   Password
teacher1   123456
student1   123456
user       123456
admin      123456

Environment variables

  • fix_for_weblate -> set this to override broken code snippets to English so Weblate can be merged. See 4544

Working on Hedy

Running the server

Preparing the Hedy server requires a couple of build steps. To perform these build steps, you need to run the following command at least once on the command-line, in the virtual environment:

(.env) $ doit run backend

Once is enough to get started; afterwards you can run the server from your IDE by starting app.py, or by running:

(.env) $ python3 app.py

You may need to run doit run backend again if you change something about the grammar or translations, or add new translatable strings.

You can also run doit devserver, which will perform the backend build steps if necessary and then start the server. If you want to step through your code with the debugger, you must start app.py directly from your IDE.

(.env) $ doit run devserver

Working on the front-end: TypeScript/JavaScript code

Part of the code base of Hedy is written in Python, which runs on the server. The parts that run in the browser are written in TypeScript, and are compiled to JavaScript.

So that most people won't have to install special tools, the generated JavaScript code is checked in. However, if you are working on the browser code, you need to edit the TypeScript source files and regenerate the JavaScript bundle by running:

# You only need to run 'npm ci' once to install the tools
$ npm ci

# Compile all front-end files once
(.env) $ doit run frontend

# Continuously watch all files and recompile if necessary
(.env) $ doit watch frontend

The watch command makes the command keep looking for changes and automatically updating the files. It's a good idea to keep it running while you are working on the front-end code.

Make sure to reload your browser (and work in incognito mode) to see the changes. These files are also automatically generated on deploy, so don't worry if you forget to generate them.

Working on the front-end: Tailwind styles

All the styling in our front-end HTML templates is done using the Tailwind library. This library has generated classes for styling which we can apply to HTML elements.

You normally do not need to think about this. During development, a multi-megabyte CSS file will be served that contains most classes. During deployment, a minimized CSS file is automatically produced.

You may need to regenerate the development CSS file if you want to do one of the following things:

  • Use a conditional Tailwind class (for example, a class that starts with hover:). Write the class in the HTML, then regenerate the CSS.
  • Add custom classes to styles.css.

Run the following command to regenerate the development CSS file:

# Only regenerate the CSS files
(.env) $ doit run tailwind

# Regenerate any of the files necessary for the front-end (CSS, JavaScript, etc) 
(.env) $ doit run frontend

For all possible styling classes and more, take a look at the Tailwind website.

If you want to combine different Tailwind classes into one class or one element, you can do this in the /build-tool/heroku/tailwind/styles.css file. With the @apply directive we can assign Tailwind classes to other selectors. For example, we styled the <h1> element with multiple Tailwind classes like this:

h1 {
  @apply font-extralight text-4xl;
}

If you want to use styling without running a Tailwind build and without using Tailwind classes, add it to the static/css/additional.css file. But please try to use the Tailwind classes as much as possible, as these are optimized and keep our code base consistent and readable.

Also, please refrain from using inline CSS styling, as this makes the templates hard to read, maintain and alter.

Testing Admin facing features locally

For some things, like creating classes, you need a teacher's account, which you might want to test locally. For that you can use the account teacher1, which is stored in the local database.

If you want to try Admin features locally (for example, marking accounts as teacher or updating tags), you have to run Hedy with the environment variable ADMIN_USER set to your username, e.g. ADMIN_USER=teacher1. This works a bit differently in each IDE; this is what it looks like for PyCharm:

(screenshot: PyCharm run configuration with the ADMIN_USER environment variable set)
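The effect of ADMIN_USER can be sketched in a few lines of Python; this is an illustrative sketch, not Hedy's actual implementation:

```python
import os

def is_admin(username):
    """Treat the user named in the ADMIN_USER environment variable as
    the admin. Minimal sketch; Hedy's real check may differ."""
    admin = os.environ.get("ADMIN_USER")
    return admin is not None and username == admin

os.environ["ADMIN_USER"] = "teacher1"
print(is_admin("teacher1"))  # True
print(is_admin("student1"))  # False
```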

Running tests

Running the unit tests

To run the unit tests:

(.env) $ python -m pytest -n auto

The -n auto is optional but uses all CPUs on your computer and speeds up testing considerably.

This runs tests for the Hedy language and some backend components. See below for information on how to run the front-end tests, that test parts of the website.

Important

To reduce the execution time of unit tests, Hedy uses a caching mechanism both locally and on GitHub. Currently, Hedy has over 30000 unit tests with the following suites being particularly slow: test_level, test_translation_level and test_snippets. To improve the speed, every unit test generates a unique hash and, upon success, stores its hash for future executions. The hash of each test is a combination of the test local variables and the source code of the following files: grammars/*, hedy.py and hedy_*.py. If there have been no changes to the test itself or the source code of Hedy, and there is an existing cache record, the test will succeed without being executed. However, if there has been a change in the test or the source code of Hedy, or there isn't an existing cache record, the test will be executed.

Running the front-end tests

We use Cypress as our front end testing tool. You must have Node.js installed. To run the tests:

# Clean your local database and reset it with test users
$ doit run devdb

# (Re-)build the frontend and backend components, and start a dev server
$ doit run frontend devserver

# In a separate terminal, run one of these:
# Run all Cypress tests on the command-line
$ npm run cypress

# Selectively run some Cypress tests from a GUI
$ npm run cypress-gui

When you run the GUI, you will see the Cypress Launchpad in which you should choose to open the End2End testing panel. Afterwards you are able to run all the tests configured in the test suite, as well as adding your own according to the documentation of Cypress.

You can also run a particular set of tests on the command line with the following command:

$ npx cypress run --spec "[path to test(s)]"

# Example: run tests for the login page
$ npx cypress run --spec "cypress/e2e/login_page/*"

Do note, you have to set your app to English as some tests rely on exact (translated) label texts.

If you want to connect Cypress to the online dashboard, use:

npx cypress run --record --key <key here>

To check the front end test coverage, you can run the script:

./tests/get-code-coverage

And then go open the index.html file located in tests/coverage/lcov-report, for more information about how this all works you can go (here)[https://docs.cypress.io/guides/tooling/code-coverage]

The script will only do its job if all the tests pass successfully! So take that into account.

Documentation of Routes

Auth

  • POST /auth/login

    • This route creates a new session for an existing user.
    • Requires a body of the form {username: STRING, password: STRING}. Otherwise, the route returns 400.
    • If username contains an @, it is considered an email.
    • username is stripped of any whitespace at the beginning or the end; it is also lowercased.
    • If successful, the route returns 200 and a cookie header containing the session. Otherwise, the route returns 403.
  • POST /auth/signup

    • This route creates a new user.
    • Requires a body of the form {username: STRING, password: STRING, email: STRING, country: STRING|UNDEFINED, birth_year: INTEGER|UNDEFINED, gender: m|f|o|UNDEFINED, subscribe: true|false|UNDEFINED}. Otherwise, the route returns 400.
    • If present, country must be a valid ISO 3166 Alpha-2 country code.
    • If present, birth_year must be an integer between 1900 and the current calendar year.
    • If present, gender must be either m, f or o.
    • If present, subscribe must be either boolean or undefined and if true, indicates that the user wants to subscribe to the Hedy newsletter.
    • email must be a valid email.
    • password must be at least six characters long.
    • If username contains an @, it is considered an email.
    • Both username and email are stripped of any whitespace at the beginning or the end; they are also lowercased.
    • Both username and email should not be in use by an existing user. Otherwise, the route returns 403.
    • The trimmed username must be at least three characters long.
    • If successful, the route returns 200. It will also send a verification email to the provided email.
  • GET /auth/verify?username=USERNAME&token=TOKEN

    • This route verifies ownership of the email address of a new user.
    • If the query parameters username or token are missing, the route returns 400.
    • If the token doesn't correspond to username, the route returns 403.
    • If successful, the route returns a 302 redirecting to /.
  • POST /auth/logout

    • This route destroys the current session.
    • This route is always successful and returns 200. It will only destroy a session if a valid cookie is set.
  • POST /auth/destroy

    • This route destroys the user's account.
    • This route requires a session, otherwise it returns 403.
    • If successful, the route returns 200.
  • POST /auth/change_password

    • This route changes the user's password.
    • Requires a body of the form {old_password: STRING, new_password: STRING}. Otherwise, the route returns 400.
    • new_password must be at least six characters long.
    • If successful, the route returns 200.
  • POST /auth/recover

    • This route sends a password recovery email to the user.
    • Requires a body of the form {username: STRING}. Otherwise, the route returns 400.
    • If username contains an @, it is considered an email.
    • username or email must belong to an existing user. Otherwise, the route returns 403.
    • username is stripped of any whitespace at the beginning or the end; it is also lowercased.
    • If successful, the route returns 200 and sends a recovery password email to the user.
  • POST /auth/reset

    • This route allows a user to set a new password using a password recovery token.
    • Requires a body of the form {username: STRING, token: STRING, password: STRING}. Otherwise, the route returns 400.
    • If username contains an @, it is considered an email.
    • username is stripped of any whitespace at the beginning or the end; it is also lowercased.
    • password must be at least six characters long.
    • If the username/token combination is not correct, the route returns 403.
    • If successful, the route returns 200 and sends an email to notify the user that their password has been changed.
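The username handling shared by the routes above can be sketched in Python; this is an illustrative sketch of the documented rules, not Hedy's actual code, and the status-code mapping shown is an assumption:

```python
def normalize_username(raw):
    """Strip surrounding whitespace, lowercase, and treat values
    containing an '@' as email addresses, as described above."""
    value = raw.strip().lower()
    return value, "@" in value

def check_signup(username, password):
    """Illustrative checks only; the real route validates more fields
    (email, country, birth_year, gender) and returns 403 for usernames
    or emails that are already in use."""
    username, _is_email = normalize_username(username)
    if len(username) < 3:   # trimmed username too short
        return 400
    if len(password) < 6:   # password too short
        return 400
    return 200

print(normalize_username("  Teacher1 "))   # ('teacher1', False)
print(check_signup("ab", "123456"))        # 400
print(check_signup("teacher1", "123456"))  # 200
```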

Profile

  • GET /profile

    • This route allows the user to retrieve their profile.
    • This route requires a session, otherwise it returns 403.
    • If successful, this route returns 200 with a body of the shape {username: STRING, email: STRING, birth_year: INTEGER|UNDEFINED, country: STRING|UNDEFINED, gender: m|f|o|UNDEFINED, verification_pending: UNDEFINED|true, session_expires_at: INTEGER, student_classes: [...]}.
  • POST /profile

    • This route allows the user to change their email, birth_year, gender and/or country.
    • This route requires a session, otherwise it returns 403.
    • Requires a body of the form {email: STRING|UNDEFINED, country: STRING|UNDEFINED, birth_year: INTEGER|UNDEFINED, gender: m|f|o|UNDEFINED}. Otherwise, the route returns 400.
    • If present, country must be a valid ISO 3166 Alpha-2 country code.
    • If present, birth_year must be an integer between 1900 and the current calendar year.
    • If present, gender must be either m, f or o.
    • If present, email must be a valid email.
    • email should not be in use by an existing user. Otherwise, the route returns 403.
    • If email is present and different from the existing email, the route will also send a verification email to the provided email.
    • If successful, the route returns 200.
  • GET /admin

    • This route allows the admin user to retrieve a list of all the users in the system, as well as a program count.
    • If there's no session or the logged-in user is not the admin user, it returns 403.
    • If successful, the route will return a template containing a table with all the users in the system and a total count of saved programs. The users will be sorted by creation date, most recent first.
  • POST /admin/markAsTeacher

    • This route allows the admin user to mark a user as a teacher, which allows them to access someone else's program by link.
    • The body of the request should be of the shape {username: STRING, is_teacher: BOOLEAN}.

Programs

  • GET /programs/delete/ID

    • This route requires a session, otherwise it returns 403.
    • This route deletes the program with id ID as long as it belongs to the user performing the request.
  • POST /programs

    • This route requires a session, otherwise it returns 403.
    • Body must be of the shape {level: INT, name: STRING, code: STRING}.

Classes

  • GET /classes

    • This route requires a session of a user that is marked as teacher, otherwise it returns 403.
    • Returns a list of classes, each with the form {'date': INTEGER, 'id': ID, 'link': STRING, 'name': STRING, 'students': [ID, ...], 'teacher': ID}.
  • GET /class/ID

    • This route requires a session of a user that is marked as teacher, otherwise it returns 403.
    • The class must be owned by the user, otherwise it returns 404.
    • Returns a template containing a table, filled with the following information: {id: STRING, name: STRING, link: STRING, students: [{username: STRING, last_login: INTEGER|UNDEFINED, programs: INTEGER, highest_level: INTEGER|UNDEFINED, latest_shared: PROGRAM|UNDEFINED}, ...]}.
  • POST /class

    • This route requires a session of a user that is marked as teacher, otherwise it returns 403.
    • Body must be of the shape {name: STRING}.
  • PUT /class/ID

    • This route requires a session of a user that is marked as teacher, otherwise it returns 403.
    • The class must be owned by the user, otherwise it returns 404.
    • Body must be of the shape {name: STRING}.
  • DELETE /class/ID

    • This route requires a session of a user that is marked as teacher, otherwise it returns 403.
    • The class must be owned by the user, otherwise it returns 404.
  • GET /class/ID/join/LINK

    • This route requires a session of a user, otherwise it returns 403.
    • The route adds the user as a student of the class.
    • The route returns a 302 to redirect the user that joined to /profile.
  • DELETE /class/ID/student/STUDENT_ID

    • This route requires a session of a user that is marked as teacher, otherwise it returns 403.
    • The class must be owned by the user, otherwise it returns 404.
    • The route removes a student from a class. This action can only be done by the teacher who owns the class.
  • GET /hedy/l/LINK_ID

    • If there's a class with a LINK_ID, this route will redirect you with a 302 to the full URL for prejoining a class.

Database

Hedy uses DynamoDB. If you're not used to this database, read this page first; it explains the basics of DynamoDB and what you need to keep in mind when adding tables/indexes.

Tables

Current tables are:

table users:
    username:             STRING (main index)
    password:             STRING (not the original password, but a bcrypt hash of it)
    email:                STRING (secondary index)
    birth_year:           INTEGER|UNDEFINED
    country:              STRING|UNDEFINED
    gender:               m|f|o|UNDEFINED
    created:              INTEGER (epoch milliseconds)
    last_login:           INTEGER|UNDEFINED (epoch milliseconds)
    heard_about:          UNDEFINED|['from_another_teacher'|'social_media'|'from_video'|'from_magazine_website'|'other_source']
    prog_experience:      UNDEFINED|'yes'|'no'
    experience_languages: UNDEFINED|['scratch'|'other_block'|'python'|'other_text']
    classes:              UNDEFINED|[STRING, ...] (ids of the classes of which the user is a member)

table tokens:
    id:       STRING (main index; for password reset tokens, id is the username)
    username: STRING|UNDEFINED (only set for session tokens)
    token:    STRING|UNDEFINED (only set for password reset tokens)
    ttl:      INTEGER (epoch seconds)

table programs:
    id:           STRING (main index)
    date:         INTEGER (sort index; milliseconds)
    username:     STRING (secondary index)
    name:         STRING (secondary index)
    session:      STRING
    level:        INTEGER
    lang:         STRING
    code:         STRING
    version:      STRING

table classes:
    id:       STRING (main index)
    date:     INTEGER
    teacher:  STRING (secondary index)
    link:     STRING (secondary index)
    name:     STRING
    students: [STRING, ...]
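As a concrete illustration of the tokens table above, here is a sketch of a session-token item; the field names follow the table listing, while the function name and the 30-day lifetime are assumptions for illustration:

```python
import time

def session_token_item(token_id, username, lifetime_days=30):
    """Build an item for the 'tokens' table as described above;
    ttl is in epoch seconds, so expired items can be purged."""
    return {
        "id": token_id,                                   # main index
        "username": username,                             # only set for session tokens
        "ttl": int(time.time()) + lifetime_days * 86400,  # epoch seconds
    }

item = session_token_item("abc123", "teacher1")
print(sorted(item))  # ['id', 'ttl', 'username']
```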

Working with translations

For our multilingual web structure we use a combination of YAML files and Babel to deliver language-dependent content. The content you see in the tabs, mail-templates, achievements, puzzles and quizzes are all stored using YAML files. All our front-end UI strings, error messages and other "small" translations are stored using Babel. To help translating any of these, please follow the explanation in TRANSLATING.md.

Adding new translation keys

When adding new content or implementing a feature that requires new translations you need to manually add these translation keys.

When adding YAML translations, please add these to the corresponding YAML file in the /content folder. Make sure that you conform to the existing YAML structure. As English is the fallback language, the translation should always be available in the English YAML file. Feel free to manually add the translation to as many languages as you know, but don't worry if you can't: the rest will be translated by other contributors through Weblate.

When adding new Babel translations the implementation is a bit more complex, but don't worry! It should all work fine with the following steps:

  1. First we add the translation "placeholder" to either the front-end or back-end
    • When on the front-end (in a .html template) we do this like this: {{ _('test') }}
    • Notice that the {{ }} characters are Jinja2 template placeholders for variables
    • When on the back-end we do this like this: gettext('test')
  2. Next we run the following command to let Babel search for keys and update the .po files:
    • doit run extract
  3. All keys will be automatically stored in the /translations folder
  4. Search for the .po files for the languages you know (at least do English!) and find the empty msgstr for your added key(s)
  5. Add your translations there; the other translations will hopefully be picked up quickly by other translators
  6. The translations will be visible locally next time you run:
    • doit run devserver
  7. This step is also always run on deployment to make sure the translations are up-to-date

Pull Requests

Creating a Pull Request

If you are new to GitHub or new to working with repositories that are not your own, do note you cannot push to the Hedy repository. You will have to create your own fork, push there and then create a PR. See these GitHub docs for details.

Pull Request Review process

For creators

If you created a pull request and no one has looked at it after a few days, let us know in #programmers-general on Discord and we will try to find a reviewer.

For reviewers

If you have looked at a PR, please:

  • Add the Under review label so we know it is under review, even if you have not done a formal review in GitHub.
  • Assign yourself so we know who is reviewing
  • Once you have done an initial review and requested changes, please keep an eye on the PR and see it through.

Python code styling

As this project is growing and multiple people are working on it, we want to move to a more uniformly styled code base. We stick to the PEP8 guidelines, with the exception of a maximum line length of 120 characters instead of 79. To ensure your code adheres to these guidelines, you can install the pre-commit configuration to automatically check modified code when you make a commit. Installing this pre-commit hook has to be done manually (for security reasons) and can be done using the following command, once the packages from requirements.txt are installed:

(.env) $ pre-commit install

After this, every modification you commit will be linted by flake8 according to the configuration in setup.cfg. If there are any issues with your code, you can fix these manually using the output, or alternatively use autopep8 to solve these issues automatically (although autopep8 can't fix some issues). If you want to do this, install autopep8 using pip install autopep8 and run autopep8 --in-place --max-line-length=120 [your-file].

If you want, you can bypass the pre-commit check by adding the --no-verify flag: git commit -m "your message" --no-verify

When you push code to the repository or make a pull request, a GitHub Actions workflow will also automatically check your code. At the moment, failing this check does not prevent merging, as there is still some work to do to make the entire codebase compliant. However, it is appreciated if your modifications and new code follow the PEP8 styling guidelines. Keep the Boy Scout Rule in mind: always leave the code better than you found it!

Solving common merge conflicts

When working on an issue in a branch, it might happen that the main branch is updated before your contribution is finished. If you create a Pull Request, it is possible that GitHub reports merge conflicts: you've worked on the same code as the updated part of main, and GitHub is uncertain about which code to keep when merging. Always make sure that there are no merge conflicts when setting your PR to Ready for Review. In this section we describe the most common merge conflicts and how to solve them:

  • Conflict with generated.css
  • Conflict with some (or all of the) .po files
  • Conflicts with appbundle.js and appbundle.js.map

Conflict with generated.css

A merge conflict in the generated.css file is probably the result of you working on CSS code and updating files with the Tailwind script, while the file was also updated on the main branch. In this case you can simply accept your own branch when a conflict occurs. If your PR still needs a review, make sure to run the Tailwind script again after the conflicts are solved. Don't worry if you make a mistake here: the files are regenerated on deploy, so they are always up-to-date on the live server.

Conflict with some (or all of the) .po files

A merge conflict in (some of) the .po files is probably the result of working with the Babel translations. When a new translatable string is added, all .po files are updated, including the _Last revision_ header of each file. Since Weblate updates these files as well, another branch merged into main may have triggered Weblate, resulting in merge conflicts in your branch. These headers have no influence on the functionality, but it is good practice to keep the main branch's header when solving these conflicts. The .po files are not generated on deploy, so we must be careful to merge them correctly.

Conflict with appbundle.js and appbundle.js.map

A merge conflict in the appbundle files is probably the result of you working on TypeScript code and regenerating the files, while they were also updated on the main branch. In this case you can simply accept your own branch when a conflict occurs. If your PR still needs a review, make sure to run the TypeScript build again after the conflicts are solved. Don't worry if you make a mistake here: the files are regenerated on deploy, so they are always up-to-date on the live server.

How to access query logs?

Prerequisites

Ask someone in the team for the credentials of the hedy-logs-viewer IAM user. Add the following to the ~/.aws/credentials file:

[hedy-logs-viewer]
aws_access_key_id = AKIA**********
aws_secret_access_key = ***********

Install RecordStream (recs) using one of the methods described here. lnav is not the best tool, but it's usable for now.

Install the AWS CLI.

Usage

Run:

$ tools/view-logs <APP> <YYYY-MM-DD>

# Example:
$ tools/view-logs hedy-beta 2021-05-10

NOTE: Time stamps will be in UTC.

The view-logs tool will give you a RecordStream command line to paste into your shell.

Directly pulling S3

We store programs on S3 for logging purposes. If you want to access the logs, you can use this command (if you have AWS access; mainly this is a note to self for Felienne!):

aws s3 sync s3://hedy-parse-logs/hedy-beta/ .

Likely you will have to first set your AWS credentials using:

aws configure

You can fetch these credentials here: https://console.aws.amazon.com/iam/home?#security_credential

Server configuration

A place to start recording all the special config that needs to be set to make this server work.

Config via environment variables

AWS credentials and setup:

AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_DYNAMODB_TABLE_PREFIX

JSONbin credentials and setup:

JSONBIN_COLLECTION_ID
JSONBIN_SECRET_KEY

HTTP redirect:

REDIRECT_HTTP_TO_HTTPS

Email:

MAILCHIMP_AUDIENCE_ID
MAILCHIMP_API_KEY
BASE_URL

A/B testing:

PROXY_TO_TEST_HOST
PROXY_TO_TEST_PROPORTION
IS_TEST_ENV

App secret (needs to be the same to share cookies between instances):

SECRET_KEY

To determine if this is the production environment (to avoid requests from e2e tests being counted as production traffic, and to avoid any sort of security loopholes):

IS_PRODUCTION

To set the port of the app through an env variable (helpful for starting multiple instances of the app locally):

PORT

To turn off development mode:

NO_DEBUG_MODE

Test environment

If the PROXY_TO_TEST_HOST environment is set, some requests will be sent to the specified test environment (specified by host prefix). These requests are reverse proxied to the test environment, which means that the main environment fetches the data from the test environment and then gives the result back to the client.

PROXY_TO_TEST_HOST should look like https://host.com (no trailing /).

The main environment passes the session_id to the test environment so that the test environment can use that session_id for logging. The session variables set by the test environment are read by the main environment by parsing the cookie header returned by the test environment. Other session variables set by the main environment will be available to the test environment since they will be also present in the session cookie sent by the main environment to the test environment.

The auth routes are never reverse proxied, to keep all cookie setting within the scope of the main environment. The test environment, however, needs access to the same tables as the main environment to make sure that the cookies forwarded by the main environment are indeed valid. In other words, the test environment must be able to read and validate cookies. To do this, the test environment should have the same value for the environment variable AWS_DYNAMODB_TABLE_PREFIX as the main environment.
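The proxying decision described above can be sketched as follows; this is an illustrative sketch (the function name is hypothetical), not Hedy's actual routing logic:

```python
import random

def should_proxy(host, proportion):
    """Decide whether to reverse-proxy a request to the test
    environment. `host` is PROXY_TO_TEST_HOST (must look like
    https://host.com, no trailing /) and `proportion` is
    PROXY_TO_TEST_PROPORTION."""
    if not host or host.endswith("/"):
        return False  # unset or misconfigured host: never proxy
    return random.random() < float(proportion)

print(should_proxy("https://host.com", 1.0))   # True: random() is always < 1.0
print(should_proxy("https://host.com", 0.0))   # False
print(should_proxy("https://host.com/", 1.0))  # False: trailing slash rejected
```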

Whenever enabling a test, please make sure of the following:

  1. All the code deployed in the production environment is also merged and deployed to the test environment.
  2. The AWS_DYNAMODB_TABLE_PREFIX configuration variable is the same for both the production and the test environment.

Heroku Metadata

This app depends on some environment variables that require Heroku dyno metadata.

Only when deploying Hedy to a new Heroku environment, enable those variables (once) using the Heroku CLI:

$ heroku labs:enable runtime-dyno-metadata -a <app name>

Updating Python versions

To update to Python 3.12, follow these steps:

  • If you are using PyCharm, uninstall PyCharm and install the latest one (2023.3).
  • Install the new Python version.
    • On Mac: use Homebrew
    • On Windows: download the correct Python version from python.org.
    • On Linux: use your distribution's package manager.
  • Open a terminal window where the virtualenv is NOT active (the prompt does not start with (venv)) and do the following:
    • Remove the virtualenv directory (venv, .venv, .env, whatever you decided to call it)
    • Using the Python version you just installed, create a new virtual env.
      • First run python3 --version to make sure you have the right version. If this gives the wrong version, you may need to use the full path to the python interpreter, or use a command like python3.12.
      • python3.12 -m venv .venv -- the last part is the venv name; it can be .venv, venv, .env or anything you fancy. If you are using PyCharm, use the same name you were using before.
  • You can now restart your IDE, have it pick up the new virtualenv and have it install the dependencies in there. If you want to install dependencies by hand, do it like this:
    • source .venv/bin/activate
    • pip install -r requirements.txt