Auth integration flows #18

Open

Description

@MartinKolarik

We have the basic token functionality, but it hasn't been integrated into any of the services and clients. I'll describe the flows I expect to use in each integration, and if we agree it's all good, I'll then create the issues in the affected repos.

jsDelivr Purge / Globalping homepage GUI app

We'll configure Directus to set its cookies for jsdelivr.com so that any subdomain can read them. The website will then be able to detect whether the user is logged in to the dashboard and make API calls on their behalf. This way, our front-end code on the website can request a token for the specific API and use it without the user having to do anything.

Tokens created this way will have a special flag in the DB and will not be visible in the user's dashboard. For security reasons, they will also have a short TTL (1 day). This doesn't impact the UX in any way since the UI can always request a new token in the background as long as the user is signed in to the dashboard (and prompt them to sign in if not).
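
As a rough illustration, the front-end helper could look something like the sketch below. The `/api/gp-tokens` endpoint name, the response shape, and the error handling are assumptions for the example, not the actual dashboard API.

```ts
// Sketch of the background token flow on the website. The /api/gp-tokens
// endpoint name and response shape are assumptions, not the real API.
const DASHBOARD_ORIGIN = 'https://dashboard.jsdelivr.com';
const TOKEN_TTL = 24 * 60 * 60 * 1000; // short TTL (1 day), per the flow above

let token: string | null = null;
let tokenExpiresAt = 0;

// Returns a usable token, requesting a new one in the background when needed,
// or null if the user isn't signed in to the dashboard.
async function getApiToken (): Promise<string | null> {
	if (token && Date.now() < tokenExpiresAt - 60_000) {
		return token;
	}

	// The Directus session cookie is set for jsdelivr.com, so it is sent
	// automatically from any subdomain when using credentials: 'include'.
	const response = await fetch(`${DASHBOARD_ORIGIN}/api/gp-tokens`, {
		method: 'POST',
		credentials: 'include',
	});

	if (!response.ok) {
		// Not signed in (or the request failed) => prompt the user to sign in.
		return null;
	}

	const body = await response.json();
	token = body.token;
	tokenExpiresAt = Date.now() + TOKEN_TTL;
	return token;
}
```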

Globalping CLI

A token must be created and added manually, probably set via either an env var or a config file. A new command can also be added that:

  1. Prints a link like https://dashboard.jsdelivr.com/tokens/new/?name=Globalping%20CLI.... which brings the user to the correct page with values pre-filled.
  2. Accepts the token as input and saves it to the correct file.

Of course, if the user already has a token, they skip step 1 and simply paste it.
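
For illustration, the new command could look roughly like this (sketched in TypeScript for readability; the config path, prompts, and link handling are assumptions, and any additional pre-filled link parameters are omitted):

```ts
// Illustrative sketch of the proposed command; paths and prompts are
// assumptions, not the final design.
import { createInterface } from 'node:readline/promises';
import { mkdir, writeFile } from 'node:fs/promises';
import { homedir } from 'node:os';
import path from 'node:path';

async function authCommand () {
	// Step 1: print a link that opens the token creation page with values pre-filled.
	console.log('Create a token at: https://dashboard.jsdelivr.com/tokens/new/?name=Globalping%20CLI');
	console.log('If you already have a token, just paste it below.');

	// Step 2: accept the token as input and save it to the config file.
	const rl = createInterface({ input: process.stdin, output: process.stdout });
	const token = (await rl.question('Token: ')).trim();
	rl.close();

	const configDir = path.join(homedir(), '.globalping');
	await mkdir(configDir, { recursive: true });
	await writeFile(path.join(configDir, 'config.json'), JSON.stringify({ token }, null, 2) + '\n');
	console.log(`Token saved to ${path.join(configDir, 'config.json')}`);
}

authCommand().catch(console.error);
```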

Globalping Discord / Slack

A token must be created and added manually via a new command. It is then stored for that app installation in our DB. The flow can be similar to the CLI, or we can omit the link here (step 1) if we don't want it to be "too easy", as we previously discussed.
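
A rough sketch of the storage side; the table and column names are made up for illustration and the DB interface is just a placeholder:

```ts
// Placeholder DB interface for the sketch.
interface Db {
	insert (table: string, row: Record<string, unknown>): Promise<void>;
}

// Handler for a hypothetical token-setting bot command: the token is stored
// per app installation (Discord guild / Slack workspace), not per end user.
async function saveInstallationToken (db: Db, installationId: string, token: string): Promise<void> {
	await db.insert('bot_installation_tokens', {
		installation_id: installationId,
		token,
		created_at: new Date(),
	});
}
```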

jsDelivr Purge / Globalping API

Requirements:

  • new tokens must work instantly, without any delay
  • deleted / revoked tokens should stop working reasonably fast
  • don't query the DB on every single request

Suggestion:

  • query the DB once a minute for all tokens; for each token found, store it in memory as valid for the next two minutes
  • if a request comes with a token that isn't in memory, query the DB for that specific token and store the result (valid/invalid) for the next two minutes

Note that this applies only to requests not using credits. If a user exceeds their time-based request quota and has credits, that will be handled separately in a later step.
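
A minimal sketch of the suggested caching strategy; the `Db` interface and names are illustrative only, not the real schema or code:

```ts
// Placeholder DB interface for the sketch.
interface Db {
	getAllTokens (): Promise<string[]>;
	isTokenValid (token: string): Promise<boolean>;
}

type CacheEntry = { valid: boolean; expiresAt: number };

const CACHE_TTL = 2 * 60 * 1000; // cache entries live for two minutes
const tokenCache = new Map<string, CacheEntry>();

// Runs once a minute: load all tokens and mark them valid for the next two minutes.
async function refreshAllTokens (db: Db): Promise<void> {
	const tokens = await db.getAllTokens();
	const expiresAt = Date.now() + CACHE_TTL;

	for (const token of tokens) {
		tokenCache.set(token, { valid: true, expiresAt });
	}
}

// Per-request check: serve from memory when possible; otherwise fall back to a
// single-token DB lookup so brand-new tokens work instantly, and cache the
// result (valid or invalid) for two minutes.
async function isRequestTokenValid (db: Db, token: string): Promise<boolean> {
	const cached = tokenCache.get(token);

	if (cached && cached.expiresAt > Date.now()) {
		return cached.valid;
	}

	const valid = await db.isTokenValid(token);
	tokenCache.set(token, { valid, expiresAt: Date.now() + CACHE_TTL });
	return valid;
}

// Usage: setInterval(() => refreshAllTokens(db).catch(console.error), 60_000);
// and call isRequestTokenValid(db, token) in the auth middleware.
```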
