Learning PostgreSQL by leaning on AI assistants instead of traditional textbooks or video courses. This repo tracks the prompts we use, the study plan generated from them, and the lightweight local environment needed to experiment.
- Build a repeatable study workflow that relies on ChatGPT, Perplexity, and Warp for guidance, research, and exploratory questions.
- Focus on skills relevant to running PostgreSQL in Docker locally while preparing for a managed PostgreSQL deployment later.
- Keep all guidance/versioned notes in Markdown so progress is easy to review or adapt.
- `docs/` – all Markdown guidance (overview, local setup, classic tutorials).
- `prompts/` – AI prompt definitions.
- `migrations/` – SQL files consumed by the `migrate` service (`init.sql`, etc.).
- `learnings/` – notes and insights from interactive study sessions.
- project root – runtime files (`docker-compose.yml`, `.env`, `.env.local`, README).
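For orientation, a migration in `migrations/` might look like the following sketch (the schema and table names are illustrative, not taken from the repo's actual `init.sql`):

```sql
-- Hypothetical example of what migrations/init.sql could contain
-- (names are made up for illustration):
CREATE SCHEMA IF NOT EXISTS practice;

CREATE TABLE IF NOT EXISTS practice.notes (
    id         bigint      GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    title      text        NOT NULL,
    created_at timestamptz NOT NULL DEFAULT now()
);
```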
- `docs/overview.md` – high-level plan, key links, and focus topics.
- `docs/local_setup.md` – Docker Compose instructions (Postgres 17, `.env` driven).
- `docs/classic_tutorials.md` – list of traditional resources (ignored by AI agents unless explicitly requested).
- `docs/study_plan.md` – study plan generated from `prompts/create_plan.md` (paste ChatGPT response here).
- `learnings/` – folder for notes and insights from interactive, non-linear study sessions (create individual Markdown files as needed).
- `prompts/create_plan.md` – master prompt used to generate/refresh the learning roadmap.
- `.env.example`, `.env.local`, `docker-compose.yml` – local runtime configuration.
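The real Compose file lives at the project root; as a rough sketch only (service names, volume names, and the `migrate` entrypoint are assumptions here, not copied from the repo), a Postgres 17 setup with a one-shot migration service could look like:

```yaml
services:
  db:
    image: postgres:17
    env_file: .env.local          # assumed; cross-check docs/local_setup.md
    ports:
      - "5432:5432"
    volumes:
      - pgdata:/var/lib/postgresql/data
  migrate:
    image: postgres:17
    env_file: .env.local
    depends_on:
      - db
    volumes:
      - ./migrations:/migrations:ro
    # $$ escapes Compose interpolation so the container shell expands the vars
    entrypoint: ["sh", "-c", "PGPASSWORD=$$POSTGRES_PASSWORD psql -h db -U $$POSTGRES_USER -d $$POSTGRES_DB -f /migrations/init.sql"]
volumes:
  pgdata:
```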
- ChatGPT: primary planning and deep-dive questions (prompt stored in `prompts/create_plan.md`).
- Perplexity: topic research and cross-checking facts.
- Warp / Cursor / Antigravity: interactive shells/IDEs with inline AI to try queries, capture answers, and keep the learning loop tight.
- The agent treats files under `docs/` as the canonical narrative (overview, setup, classic resources). Reference them by filename or section when giving instructions; no need to paste the content into chat unless you are pointing out a specific snippet.
- `prompts/create_plan.md` is the authoritative master prompt. When you want the agent to regenerate the study plan, mention this file ("run the plan from `prompts/create_plan.md`") so it knows exactly which instructions to follow. Paste the response into `docs/study_plan.md`.
- Document interactive learnings (from Warp, Cursor, etc.) by creating files in `learnings/` as you discover new concepts.
- For files marked as "ignore" (e.g., `docs/classic_tutorials.md`), explicitly tell the agent if you want to override that behavior.
- Update prompts or docs in `docs/` as you iterate on the learning plan.
- When generating a new study plan from `prompts/create_plan.md`, paste the ChatGPT response into `docs/study_plan.md`.
- Document learnings from interactive sessions (Warp, Cursor, etc.) by creating files in `learnings/`.
- Capture other noteworthy AI outputs (from Perplexity, ChatGPT deep-dives, etc.) as new Markdown files in `learnings/`.
- Run `docker compose down` when finished hacking to keep the environment clean.
- Commit and push changes once the docs/environment updates are in place.
- Copy `.env.example` to `.env.local` (keep `.env.local` untracked, or customize credentials there).
- Start PostgreSQL: `docker compose up -d`.
- Run migrations when needed: `docker compose run --rm migrate`.
- Connect with `pgcli` or `psql` using the credentials defined in `.env.local`.
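As an illustration only (the values below are placeholders; use your own credentials and cross-check the variable names against `.env.example`), `.env.local` might contain:

```shell
# Hypothetical .env.local contents - placeholder values only
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me
POSTGRES_DB=pgdev
```

The `POSTGRES_USER`, `POSTGRES_PASSWORD`, and `POSTGRES_DB` names match the variables the official Postgres Docker image reads at first startup.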
Aliases for convenience (add to your shell RC):

```shell
alias pgdev-start='docker compose up -d'
alias pgdev-stop='docker compose down'
alias pgdev-migrate='docker compose run --rm migrate'
alias pgdev-reset='docker compose down -v'
# Export environment variables from .env.local
alias pgsetenv='set -a; source .env.local; set +a'
# Connect with psql
alias pgconnect='psql -h localhost -p 5432 -U "${POSTGRES_USER}" -d "${POSTGRES_DB}"'
# Connect with pgcli
alias pgcliconnect='pgcli -h localhost -p 5432 -U "${POSTGRES_USER}" -d "${POSTGRES_DB}"'
```
The `pgsetenv` alias exports environment variables from `.env.local` into your current shell session. Add this to your `~/.zshrc`:

```shell
alias pgsetenv='set -a; source .env.local; set +a'
```

Run `pgsetenv` in your terminal before using the `pgconnect` or `pgcliconnect` aliases.
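To see why the `set -a` / `set +a` pair works: `set -a` marks every subsequent variable assignment for export, so sourcing the file makes its variables visible to child processes such as `psql`. A self-contained demo using a throwaway file (not your real `.env.local`):

```shell
# Write a throwaway env file (demo values only)
cat > /tmp/demo.env <<'EOF'
POSTGRES_USER=demo_user
POSTGRES_DB=demo_db
EOF

set -a            # auto-export every assignment from here on
. /tmp/demo.env   # same effect as `source` in zsh/bash
set +a            # stop auto-exporting

# Child processes now inherit the variables:
sh -c 'echo "$POSTGRES_USER/$POSTGRES_DB"'   # prints demo_user/demo_db
```

Without `set -a`, sourcing the file would set the variables in the current shell only, and `psql`/`pgcli` (child processes) would not see them.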
- Both aliases live in `~/.zshrc` and rely on the environment variables `POSTGRES_USER` and `POSTGRES_DB` being exported in your shell session.
- Before using the aliases, run `pgsetenv` in your terminal to export the variables from `.env.local`.
- Then run `pgconnect` (psql) or `pgcliconnect` (pgcli); the shell substitutes `${POSTGRES_USER}`/`${POSTGRES_DB}` automatically using the values from your environment.
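The reason the connect aliases pick up whatever `pgsetenv` exported is that their bodies are single-quoted: `${POSTGRES_USER}` is stored literally and only expanded each time the alias runs. A POSIX sketch of the same mechanism using `eval` (aliases are disabled by default in non-interactive shells, so the demo stores the command text in a variable instead):

```shell
# Store the command text literally - ${POSTGRES_USER} is NOT expanded yet,
# just like the single-quoted alias bodies above.
cmd='echo "hello ${POSTGRES_USER}"'

POSTGRES_USER=alice
eval "$cmd"    # prints: hello alice

POSTGRES_USER=bob
eval "$cmd"    # prints: hello bob
```

Had the alias bodies used double quotes instead, the variables would be frozen at definition time (when `~/.zshrc` is read), and re-running `pgsetenv` with new credentials would have no effect.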