Merge pull request #52 from adamcohenhillel/supabase-migrations
Add Supabase migrations, and docs
adamcohenhillel committed Feb 14, 2024
2 parents 9c313da + e87f5d0 commit d30de98
Showing 3 changed files with 97 additions and 17 deletions.
15 changes: 6 additions & 9 deletions docs/getting_started.md
@@ -46,19 +46,16 @@ We will use Supabase as our database (with vector search, pgvector), authenticat…

<img src="../images/supabase_new_user.png" width="200">

6. By now, you should have 4 things: `email` & `password` for your Supabase user, and the `Supabase URL` and `API Anon Key`.

7. If so, go to your terminal, and cd into the supabase folder: `cd ./supabase`

8. Install Supabase and set up the CLI. You should follow their [guide here](https://supabase.com/../guides/cli/getting-started?platform=macos#installing-the-supabase-cli), but in short:
    - run `brew install supabase/tap/supabase` to install the CLI (or [check other options](https://supabase.com/../guides/cli/getting-started))
    - Install [Docker Desktop](https://www.docker.com/products/docker-desktop/) on your computer (we won't use it directly; we just need the Docker daemon running in the background for deploying Supabase functions)
9. Now that we have the CLI, log in with your Supabase account by running `supabase login` - this should pop up a browser window that prompts you through the auth flow.
10. Link the Supabase CLI to your newly created project by running `supabase link --project-ref <your-project-id>` (you can find the project id in the Supabase web UI, or by running `supabase projects list` - it will be under "reference id"). You can skip (enter) the database password; it's not needed.
11. Now we need to apply the Adeus DB schema to our newly created, empty database. We can do this by simply running `supabase db push`. Verify it worked by going to the Supabase project -> Tables and checking that the new tables were created.
12. Now let's deploy our functions! ([see guide for more details](https://supabase.com/../guides/functions/deploy)): `supabase functions deploy --no-verify-jwt` (see [issue re: security](https://github.com/adamcohenhillel/AdDeus/issues/3))
13. If you're planning to use OpenAI as your foundation model provider, you'll also need to run the following command so the functions have everything they need to run properly: `supabase secrets set OPENAI_API_KEY=<your-openai-api-key>` (an Ollama setup guide is coming soon)
14. If you want access to tons of AI models, both open and closed source, set up your OpenRouter API key: go to [OpenRouter](https://openrouter.ai/) to get your API key, then run `supabase secrets set OPENROUTER_API_KEY=<your-openrouter-api-key>`.
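The CLI steps above boil down to the following command sequence. This is a sketch, not a script to run blindly: placeholders like `<your-project-id>` must be replaced with your own values, and it assumes Homebrew, the Docker daemon, and network access to your Supabase project.

```shell
# One-time local setup (macOS; see the Supabase CLI guide for other platforms)
brew install supabase/tap/supabase

cd ./supabase

# Authenticate and attach the CLI to your project
supabase login
supabase link --project-ref <your-project-id>

# Apply the Adeus schema to the empty database
supabase db push

# Deploy the edge functions (see the linked security issue re: --no-verify-jwt)
supabase functions deploy --no-verify-jwt

# Provide the secrets the functions need
supabase secrets set OPENAI_API_KEY=<your-openai-api-key>
supabase secrets set OPENROUTER_API_KEY=<your-openrouter-api-key>
```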
79 changes: 79 additions & 0 deletions docs/guides/make_db_migration.md
@@ -0,0 +1,79 @@
---
title: Make a DB Migration
description: Step-by-step guide to creating and applying Supabase DB migrations
layout: default
parent: How to Guides
---

# Make a DB Migration
{: .no_toc }

## Table of contents
{: .no_toc .text-delta }

1. TOC
{:toc}

---

## Intro
If you're working on a new feature that requires changes to the database, you need to generate a migration file for those changes, so that when your feature is merged into the main branch and used by other people, they can update their databases accordingly.

This guide provides step-by-step instructions for creating a migration file from your Supabase database changes.


## Create the migration

Let's say you edited the database in your Supabase project - for example, you added a column `new_data` to one of the tables.

Now you need to make sure others will get that column as well.


1. Go to the supabase folder in your local cloned repo
```bash
cd supabase
```

2. Make sure you're linked to the right Supabase project:
```bash
supabase link --project-ref <YOUR_REMOTE_SUPABASE_PROJECT_ID>
```

3. Create a new migration from the remote Supabase instance:
```bash
supabase db pull
```

This will generate a new file in the `supabase/migrations` folder named `<timestamp>_remote_commit.sql`.


Add it to your branch, and push it with the rest of the feature code to your PR.
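The generated filename follows the CLI's timestamp-prefix convention, which keeps migrations ordered. A minimal illustrative sketch of how such a name is composed (this only mimics the naming, it does not create a real migration):

```shell
# Compose a migration filename the way the Supabase CLI names pulled migrations:
# a 14-digit UTC timestamp prefix followed by a descriptive suffix.
ts=$(date -u +%Y%m%d%H%M%S)
fname="${ts}_remote_commit.sql"
echo "$fname"
```

Because the prefix sorts lexicographically, `supabase db push` can apply pending migrations in creation order.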


## Sync your database with all existing migrations

In case there are new migrations for Adeus, and you need to sync your own database with the latest migrations, follow these instructions:


1. Go to the supabase folder in your local cloned repo
```bash
cd supabase
```

2. Make sure you're linked to the right Supabase project:
```bash
supabase link --project-ref <YOUR_REMOTE_SUPABASE_PROJECT_ID>
```

3. Do a dry run:

```bash
supabase db push --dry-run
```
This will tell you which migrations would run, without executing them - a useful way to see upfront what the migration changes are.

4. Push to prod!
```bash
supabase db push
```

@@ -26,7 +26,7 @@ CREATE EXTENSION IF NOT EXISTS "uuid-ossp" WITH SCHEMA "extensions";

CREATE EXTENSION IF NOT EXISTS "vector" WITH SCHEMA "extensions";

CREATE OR REPLACE FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) RETURNS TABLE(id integer, raw_text text, similarity double precision)
LANGUAGE "sql" STABLE
AS $$
select
@@ -39,16 +39,16 @@ CREATE OR REPLACE FUNCTION "public"."match_records_embeddings_similarity"("query
limit match_count;
$$;

ALTER FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) OWNER TO "postgres";

SET default_tablespace = '';

SET default_table_access_method = "heap";

CREATE TABLE IF NOT EXISTS "public"."conversations" (
"id" bigint NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"context" json DEFAULT '[]'::json
);

ALTER TABLE "public"."conversations" OWNER TO "postgres";
@@ -64,9 +64,9 @@ ALTER TABLE "public"."conversations" ALTER COLUMN "id" ADD GENERATED BY DEFAULT

CREATE TABLE IF NOT EXISTS "public"."records" (
"id" bigint NOT NULL,
"created_at" timestamp with time zone DEFAULT now() NOT NULL,
"raw_text" text,
"embeddings" extensions.vector
);

ALTER TABLE "public"."records" OWNER TO "postgres";
@@ -86,7 +86,7 @@ ALTER TABLE ONLY "public"."conversations"
ALTER TABLE ONLY "public"."records"
ADD CONSTRAINT "records_pkey" PRIMARY KEY ("id");

CREATE POLICY "Enable access for all authed" ON "public"."conversations" TO authenticated USING (true);

ALTER TABLE "public"."conversations" ENABLE ROW LEVEL SECURITY;

@@ -95,6 +95,10 @@ GRANT USAGE ON SCHEMA "public" TO "anon";
GRANT USAGE ON SCHEMA "public" TO "authenticated";
GRANT USAGE ON SCHEMA "public" TO "service_role";

GRANT ALL ON FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) TO "anon";
GRANT ALL ON FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) TO "authenticated";
GRANT ALL ON FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) TO "service_role";

GRANT ALL ON TABLE "public"."conversations" TO "anon";
GRANT ALL ON TABLE "public"."conversations" TO "authenticated";
GRANT ALL ON TABLE "public"."conversations" TO "service_role";
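For intuition about what `match_records_embeddings_similarity` does: pgvector's `<=>` operator is cosine distance, and Supabase's common template computes `similarity = 1 - distance`, filters by the threshold, and returns the top matches. A minimal Python sketch of that ranking logic over in-memory data - hypothetical records and function names, not the real database API:

```python
import math

def cosine_distance(a, b):
    # Mirrors pgvector's <=> operator: 1 - cosine similarity.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def match_records(records, query_embedding, match_threshold, match_count):
    # records: list of (id, raw_text, embedding) tuples.
    # Score every row, keep those above the threshold,
    # order by similarity descending, and limit the result count.
    scored = [
        (rid, text, 1.0 - cosine_distance(emb, query_embedding))
        for rid, text, emb in records
    ]
    hits = [row for row in scored if row[2] > match_threshold]
    hits.sort(key=lambda row: row[2], reverse=True)
    return hits[:match_count]
```

The real function runs this ranking inside Postgres, so only the top `match_count` rows ever leave the database.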
