HappyFunCorp/gitgratitude

Overview

This is the master repo for gitgratitude.com.

It’s deployed on a Kubernetes cluster built as described in:

  1. Setting up knative
  2. Serving services on the root
  3. NextJS Knative service with Prisma

Current Status

In active development

Development

Database

docker run -d -e POSTGRES_PASSWORD=awesome_password -p 5432:5432 postgres
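
With that container running, a plausible connection string for local development (assuming the postgres image's default postgres user and database, which this repo doesn't spell out) is:

export DATABASE_URL=postgresql://postgres:awesome_password@localhost:5432/postgres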

Deployment

Script

There’s a kproj script that handles pushing out new versions; a rough sketch of the equivalent commands follows the list.

  1. It builds the images with --platform linux/amd64
  2. Pushes the images to Docker Hub
  3. Runs kn service update to activate the new version
  4. Sets up a local tunnel to the postgres instance and runs npx prisma migrate to sync up the database
  5. Optionally runs npx prisma studio over a similar tunnel
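
A minimal sketch of those steps, using the watcher stanza's image and service names and assuming the in-cluster postgres service is called postgres-postgresql (kproj reads the real values from deployment.toml):

# Build and push the image for the cluster's architecture
docker build --platform linux/amd64 -t wschenk/watcher .
docker push wschenk/watcher

# Activate the new revision
kn service update watcher --image wschenk/watcher

# Tunnel to the in-cluster postgres and run the stanza's migrate command
# (npx prisma migrate deploy for the Prisma-based services)
kubectl port-forward svc/postgres-postgresql 5432:5432 &
npx prisma migrate deploy

# Optionally, browse the data over the same kind of tunnel
npx prisma studio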

Usage

There’s a database section and a global section, which are managed automatically. To add or configure a service, add a stanza with the same name as a directory. If that directory has a Dockerfile in it, it will be considered a deployable service.

Example:

[watcher]
binding = true
config_maps = ["postgres-postgresql-watcher","s3-access"]
events = ["url.watch"]
image = "wschenk/watcher"
migrate = "rake db:migrate"
schema = "watcher"
service = "watcher"

Name         Description
binding      Set up a K_SINK to send events to the default broker
config_maps  Which config maps should be injected into the environment
events       List of events that trigger this service
image        Tag of the image to build locally and deploy remotely
migrate      Command to use to update the database
schema       Database schema to use; a config map will be created with the correct DATABASE_URL
service      The name of the remote service
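
With a stanza like the one above in place, the service would presumably be deployed with:

./kproj up watcher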

Commands are:

./kproj help
Commands:
  kproj build                   # Builds a service
  kproj db                      # Database functions
  kproj doctor                  # Sync the state of the cluster
  kproj help [COMMAND]          # Describe available commands or one specific command
  kproj info name               # queries the cluster
  kproj trigger_add name event  # add a trigger to the service
  kproj up name                 # Deploys a service
  kproj update                  # Pushes configuration to server

./kproj db help
Commands:
  kproj db database_url db schema  # set the database secret
  kproj db dbs                     # List databases
  kproj db help [COMMAND]          # Describe subcommands or one specific subcommand
  kproj db job                     # Create the backup job
  kproj db migrate name            # runs the db migration
  kproj db password                # get the password of a database
  kproj db schemas                 # List defined schemas

Config

deployment.toml lists the services along with the data describing how they are deployed.

The kubernetes directory contains other services that are not deployed with kn.

PGAdmin

kubectl apply -f kubernetes/pgadmin.yaml

Get the password:

kubectl get secret --namespace default postgres-postgresql -o jsonpath="{.data.postgresql-password}" | base64 --decode | pbcopy

Then start up a port forwarder:

kubectl port-forward svc/pgadmin-service 4000:80

PGAdmin is then available at http://localhost:4000. Once you are done, you can kill pgadmin:

kubectl delete -f kubernetes/pgadmin.yaml

JSON Schemas

Project Lookup (eco-*)

GET /?package=name

export type Project = {
  name: string;
  homepage?: string;
  description?: string;
  git?: string;
  keywords?: string[];
  participants?: Participant[];
  licenses?: string[];
  releases: Release[];
};

export type Release = {
  version: string;
  released?: string;
  summary?: string;
  description?: string;
  licenses?: string[];
  download_count?: number;
  major?: number;
  minor?: number;
  patch?: number;
  suffix?: string;
  prerelease?: boolean;
};

export type Participant = {
  username?: string;
  name?: string;
  email: string;
  type: string;
};
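
As an illustration, a lookup against one of the eco-* gateways might look like the following, using the ECO_NPM_URL value listed in the frontend's Required ENV section below (the package name is just an example):

curl -s "$ECO_NPM_URL/?package=react"

The response body is JSON matching the Project type above.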

Lockfile parsing (lock-*)

Services

backup

A cronjob, defined in kubernetes/backup.yaml, that copies the databases into S3 buckets.

A bit hacked together as far as kproj is concerned; database urls are manually added if service = job.

Run ./kproj up backup after you create a new schema to regenerate the yaml file and apply it.

Required ENV

AWS_ACCESS_KEY_ID      AWS access key id
AWS_SECRET_ACCESS_KEY  AWS secret
AWS_END_POINT          AWS endpoint
BUCKET_NAME            Bucket name, e.g. storage.gitgratitude.com
*_DATABASE_URL         The job loops over each entry in the environment matching this pattern
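
A minimal sketch of the backup loop, assuming bash, pg_dump, and the aws CLI are available in the job image (the actual job is defined in kubernetes/backup.yaml):

# Dump every database whose URL is exposed as *_DATABASE_URL and copy it into the bucket
for var in $(env | grep '_DATABASE_URL=' | cut -d= -f1); do
  name=$(echo "$var" | sed 's/_DATABASE_URL$//' | tr '[:upper:]' '[:lower:]')
  pg_dump "${!var}" | gzip > "/tmp/${name}.sql.gz"
  aws s3 cp "/tmp/${name}.sql.gz" "s3://${BUCKET_NAME}/${name}.sql.gz" --endpoint-url "$AWS_END_POINT"
done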

eco-cocoapods – not implemented

ruby-based gateway to the cocoapods ecosystem

eco-rubygems

ruby-based gateway to the rubygems ecosystem

Uses the Project schema as a response

Required ENV

none

Events

none

eco-npm

node-based gateway to the npm ecosystem

Uses the Project schema as a response

Required ENV

none

Events

none

forge-github

Issues and other project level data

lock-gemfile

ruby-based parsing of Gemfile.lock

Required ENV

none

Events

none

lock-packagelock

node-based parsing of package-lock.json

lock-podlock

ruby-based parsing of Podfile.lock

lock-yarnlock

node-based parsing of yarn.lock

ruby service to handle sending emails and slack notifications

NextJS app that contains the front end as well as the projects database

Required ENV

Mostly referenced in lib/ecosystem.ts

DATABASE_URL          Connect string to postgres
ECO_RUBYGEMS_URL      URL to eco-rubygems
ECO_NPM_URL           URL to eco-npm
ECO_COCOAPODS_URL     URL to eco-cocoapods
LOCK_GEMFILE_URL      URL to lock-gemfile
LOCK_YARNLOCK_URL     URL to lock-yarnlock
LOCK_PACKAGELOCK_URL  URL to lock-packagelock
LOCK_PODLOCK_URL      URL to lock-podlock

NextJS app that contains the frontend to the repositories

Required ENV

DATABASE_URL  Connect string to postgres
K_SINK        URL to post events to

Events

git.done     receives  A repository has been updated
url.watch    sends     Watch for changes on this url
git.process  sends     Process a repository
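
Events travel as CloudEvents over HTTP. As a sketch of how a service might emit url.watch through K_SINK (the JSON payload shape here is an assumption, not taken from this repo):

curl -X POST "$K_SINK" \
  -H "Ce-Specversion: 1.0" \
  -H "Ce-Type: url.watch" \
  -H "Ce-Source: repo" \
  -H "Ce-Id: $(uuidgen)" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://github.com/HappyFunCorp/gitgratitude"}'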

Downloading and analyzing the repo

Work outstanding

  • [ ] look for special files to fingerprint the repo (code of conduct, etc)
  • [ ] private repositories

Required ENV

K_SINK                 URL to post events to
AWS_ACCESS_KEY_ID      AWS access key id
AWS_SECRET_ACCESS_KEY  AWS secret
AWS_END_POINT          AWS endpoint
BUCKET_NAME            Bucket name, e.g. storage.gitgratitude.com
STORAGE_URL            External URL to the bucket, e.g. https://storage.gitgratitude.com

Events

git.process  receives  Pulls the repository and creates the database
git.done     sends     Finished processing the repository

watcher

Service to monitor URLs and trigger recalculations

Work outstanding

  • [ ] specify in url.watch if we want to store in S3 or not
  • [ ] trigger poll based on ping events
  • [ ] better history of frequency changes

Required ENV

K_SINK                 URL to post events to
AWS_ACCESS_KEY_ID      AWS access key id
AWS_SECRET_ACCESS_KEY  AWS secret
AWS_END_POINT          AWS endpoint
BUCKET_NAME            Bucket name, e.g. storage.gitgratitude.com

Events

url.watch    receives  The url to watch
url.changed  sends     A watched url has changed; data uploaded into S3

About

Find out who is building your software
