diff --git a/README.md b/README.md index d2cbb92..c918408 100644 --- a/README.md +++ b/README.md @@ -1,16 +1,16 @@ -# Scenario : Advanced AI Pair Programming Challenge +# Scenario: Spec-Driven Development Programming Challenge -Developers today agree that AI code assistant tools are useful for easy, repeatable tasks. JSON to Class, quick templated code, writing simple scripts. But most still think it's bad at more complex tasks. This exercise is here to show you that assumption is no longer as true as you might think! +Developers today agree that AI code assistant tools are useful for easy, repeatable tasks: JSON to Class, quick templated code, writing simple scripts. But most still think they're bad at more complex tasks. This exercise is here to show you that assumption is no longer as true as you might think! Spec-driven development takes your skills as a Software Engineer and augments your AI Coding Assistant to build software *your* way. ## Prerequisites -- Clone the AI Assisted Coding Framework repository: https://github.com/ChrisMcKee1/AI-Assisted-Coding -- [Install & start Docker Desktop](https://docs.docker.com/get-started/get-docker/) +- Clone the Spec Driven Coding Framework repository: https://github.com/ChrisMcKee1/AI-Assisted-Coding -or - -- [Install & Start Podman](https://podman.io/docs/installation) +- [NodeJS 16+](https://nodejs.org/) for MCP servers +- **Container Platform**: An OCI-compliant container runtime is recommended for .NET and Python developers + - [Docker Desktop](https://www.docker.com/products/docker-desktop) + - [Podman](https://podman.io/) [^1] @@ -23,7 +23,7 @@ or
For Java Developers -- Clone the Pet Clinic repository: https://github.com/azure-samples/spring-petclinic-microservices +- Clone the Pet Clinic repository: https://github.com/spring-projects/spring-petclinic/
@@ -131,3 +131,5 @@ This project has adopted the [Microsoft Open Source Code of Conduct](https://ope ### Trademarks This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/legal/intellectualproperty/trademarks/usage/general). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party's policies. + +[^1]: For more information, see [Container Runtime](https://learn.microsoft.com/en-us/dotnet/aspire/fundamentals/setup-tooling?tabs=linux%2Cunix&pivots=dotnet-cli#container-runtime) \ No newline at end of file diff --git a/conportal-python.md b/conportal-python.md deleted file mode 100644 index 60b62ec..0000000 --- a/conportal-python.md +++ /dev/null @@ -1,54 +0,0 @@ -# Context Portal MCP in Python - -## Prerequisites - -Before you begin, ensure you have the following installed: - -- **Python:** Version 3.8 or higher is recommended. - - [Download Python](https://www.python.org/downloads/) - - Ensure Python is added to your system's PATH during installation (especially on Windows). -- **uv:** (Highly Recommended) A fast Python environment and package manager. Using `uv` significantly simplifies virtual environment creation and dependency installation. - - [Install uv](https://github.com/astral-sh/uv#installation) - -## Installation and Configuration - -The recommended way to install and run ConPort is by using `uvx` to execute the package directly from PyPI. This method avoids the need to manually create and manage virtual environments. - - -### `uvx` Configuration - -In the `mcp.json` file of your `.vscode` folder, change the conport configuration to the following: - -```json -{ - ... 
- "conport": { - "command": "uvx", - "type": "stdio", - "args": [ - "--from", - "context-portal-mcp", - "conport-mcp", - "--mode", - "stdio", - "--workspace_id", - "${workspaceFolder}", - "--log-file", - "./logs/conport.log", - "--log-level", - "INFO" - ] - } - ... -} -``` - -- **`command`**: `uvx` handles the environment for you. -- **`args`**: Contains the arguments to run the ConPort server. -- `${workspaceFolder}`: This IDE variable is used to automatically provide the absolute path of the current project workspace. -- `--log-file`: Optional: Path to a file where server logs will be written. If not provided, logs are directed to `stderr` (console). Useful for persistent logging and debugging server behavior. -- `--log-level`: Optional: Sets the minimum logging level for the server. Valid choices are `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`. Defaults to `INFO`. Set to `DEBUG` for verbose output during development or troubleshooting. - -### Update the `copilot-instructions.md` - -Finally, you'll need to update the copilot instructions file. We currently hardcode the Workspace Id for ConPort to be `/app/workspace` as that is the workspace we use within the Docker container to connect to your code. Since you are running the code locally, you will need to change this to `${workspaceFolder}`. So do a find-replace of `/app/workspace` and replace with `${workspaceFolder}`. This should allow VS Code to automatically send the correct workspace ID to your ConPort process. \ No newline at end of file diff --git a/goals/3-tips.md b/goals/3-tips.md index 3fb6a2a..705f35d 100644 --- a/goals/3-tips.md +++ b/goals/3-tips.md @@ -16,11 +16,11 @@ ## Workflow-Specific Tips for This Framework - **Start with Requirements in Markdown** - Before using the `Plan` or `Act` commands, write detailed requirements and architectural notes in markdown files (e.g., `projectBrief.md`, `docs/`). 
This gives Copilot and the AI agent the context needed for accurate planning and implementation. + Before using the `/create-spec` or `/execute-tasks` commands, write detailed requirements and architectural notes in markdown files (e.g., `projectBrief.md`, `docs/`). This gives Copilot and the AI agent the context needed for accurate planning and implementation. - **Follow the Core Workflow Commands** - Use the provided commands (`Plan`, `Act`, `Research`, etc.) as described in the README to trigger structured workflows. This ensures Copilot leverages both live documentation and persistent project memory. -- **Sync ConPort Regularly** - After major changes or at the end of a session, run the `conport_sync_routine` to persist new knowledge, decisions, and context. This keeps the AI's memory up to date and prevents context loss. + Use the provided commands (`/create-spec`, `/execute-tasks`, `/browser-test`, etc.) as described in the README to trigger structured workflows. This ensures Copilot leverages both live documentation and persistent project memory. +- **Sync Regularly** + After major changes or at the end of a session, run `/generate-report` to persist new knowledge, decisions, and context. This keeps the AI's memory up to date and prevents context loss. - **Document Decisions and Patterns** When you make architectural or implementation decisions, log them using the appropriate workflow or markdown files. This helps Copilot and the agent avoid repeating past mistakes and improves future suggestions. 
- **Use Semantic Search for Context** diff --git a/goals/dotnet/1-setup.md b/goals/dotnet/1-setup.md index 7c2a3f3..c9f0e8a 100644 --- a/goals/dotnet/1-setup.md +++ b/goals/dotnet/1-setup.md @@ -1,86 +1,99 @@ ## 🚀 Setting up the AI Assisted Coding Framework in your project -This framework integrates two powerful MCP (Model Context Protocol) servers to supercharge your development workflow: +This framework integrates several powerful MCP (Model Context Protocol) tools to supercharge your development workflow: - **Context7 MCP**: Provides live documentation and code snippet retrieval for authoritative technical references -- **ConPort MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Memory MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Sequential Thinking MCP**: Assists the LLM with ordering tasks and breaking down complex ideas +- **Microsoft.Learn MCP**: Gives your LLM access to the entire Microsoft Learn knowledge base! Together, they transform GitHub Copilot into an intelligent development assistant that remembers project context, tracks architectural decisions, and maintains comprehensive project knowledge across sessions. 
## 📋 Prerequisites -- **Docker** or **Podman** (for ConPort MCP) -- **Node.js 16+** (for Context7 MCP server) - Optional, see step 2 below -- **VS Code** with GitHub Copilot extension +- **Node.js 16+** (for MCP tools) +- **Container Platform**: An OCI-compliant container runtime, such as: + - [Docker Desktop](https://www.docker.com/products/docker-desktop) + - [Podman](https://podman.io/) [^1] +- **IDE** with GitHub Copilot[^2] - **Git** for version control ## 🛠️ Installation & Setup -### Step 1: Clone and Copy Framework Files +### Step 1: Clone the Framework Repository #### Windows Terminal: ```powershell # Clone this repository git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace 'your-project-path' with the actual path to your project -robocopy . "C:\path\to\your-project" /E /XD .git ``` -#### Linux / OSX Terminal: -```bash -git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace '/path/to/your-project' with the actual path to your project -rsync -av --exclude='.git' . /path/to/your-project/ -``` +We will use the files in this repository once you've set up your local workspace. Keep them somewhere easy to access; we recommend a folder such as `C:\github\` or `~/github` on Linux & OSX. ### Step 2 (optional): Change the MCP Configuration -By default, we utilize the local Contex7 MCP server for caching and speed. However, if you don't have NodeJS installed, and do not wish to use NodeJS, you can instead utilize iether the public Context7 MCP server, or build and run the Docker version of the Context7 MCP tool. The `.vscode/mcp.json` file has commented out configuration options for these two paths. +By default, we utilize the local MCP tools for caching and speed. 
However, if you don't have NodeJS installed and do not wish to use it, you can instead use Docker versions of each tool. To build and run the Docker versions of the tools, see the documentation for each to set up the Docker config: See the [Context7 Docker Readme](../../context7-docker.md) for instructions on how to use Docker to host Context7 locally. -Also, we are using Docker for hosting the Context Portal MCP server. We recommend you pull the image locally before you start up VS Code. To pull the image: +See the [Playwright MCP Readme](https://github.com/microsoft/playwright-mcp) for details on advanced configuration and Docker support for Playwright. + +See the [Sequential Thinking MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking) for details on how to configure it using Docker. + +See the [Memory MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) for details on how to configure it using Docker. + +> [!IMPORTANT] +> If you intend to do this, please do so before coming to the workshop, and ensure the tools are set up and configured. This can take time to set up and troubleshoot, so be prepared. Our recommendation is to use the NodeJS versions of the tools. -`docker pull seiggy/context-portal-mcp:0.2.18` ### Step 3: Clone the eShop Repository -If you are using Podman, replace `docker` with `podman` in both the command, and in the `mcp.json` file. If you do not have Docker installed, you can run the Context Portal MCP server locally using Python and UV. For further details on how to use the ConPort MCP server with Python, see the [Context Portal Python Readme](../../conportal-python.md). +You'll be building this challenge using the popular eShop demo repository from Microsoft. Clone the [eShop](https://github.com/dotnet/eShop/) repository from the dotnet team to an easy-to-access location on your machine. 
-### Step 3: Open Project in VS Code +### Step 4: Copy the Framework to the eShop repository +Copy the `.github` and `.vscode` folders from the AI-Assisted-Coding repository to the root of your eShop repository. An example script below assumes you cloned both repositories to the root of `C:\github` or `~/github` respectively. + +#### PowerShell ```powershell -# Open your project in VS Code -code . +cd c:\github\AI-Assisted-Coding +robocopy . "C:\github\eShop" /E /XD .git +``` + +#### Bash +```bash +cd ~/github/AI-Assisted-Coding +rsync -av --exclude='.git' . ~/github/eShop ``` -### Step 4: Update Your Project Brief -Before proceeding, overwrite the `projectBrief.md` in your project root that you copied from the AI-Assisted-Coding repository, with the `projectBrief.md` file we provided in this repository. -### Step 5: Verify GitHub Copilot Integration +### Step 5: Run the Analyze Workflow -1. Ensure GitHub Copilot extension is installed and activated -2. The framework will automatically detect the `copilot-instructions.md` file +Open the eShop Solution using VS Code or your preferred IDE. +Run the `/analyze-product` prompt in Copilot Agent Mode to analyze your workspace and generate the documents that help the spec-driven development workflow operate smoothly. -### Step 6: Initialize ConPort MCP +> [!NOTE] +> If you are using Visual Studio 2022, you will need to manually reference custom instruction files. You will need to swap to "folder view" in Visual Studio 2022, and then you can type `#analyze-product.instructions.md` to find and reference the file in the chat window. -After verifying Copilot integration, initialize the ConPort MCP memory system: +> [!NOTE] +> If you are using JetBrains Rider, you will need to manually reference the file by dragging it from the file system window, or by right-clicking and using the context menu to reference the file in chat. -1. In the Copilot chat or comments, type: - ``` - Initialize ConPort - ``` -1. Follow the prompts to complete setup. 
- This step creates the persistent memory database and loads your project context before you start coding. +![Rider Context Menu](../../screenshots/rider_context_menu.png) -### Step 7: Have the AI conduct Architecture Review +### Step 6: Verify Framework setup is completed -Have Copilot Create an Architecture Review of the repo. Take a look at the `README.md` in the AI Assisted Coding framework for help on how to conduct the Architecture review. When it completes, you should end up with a new folder named `architectureDiagrams` that contains 5 ore more markdown files with charts, documentation, and more about the eShop Application! Make sure you pass the #codebase token in your command, so that VS Code will give the AI access to the codebase for the context! +1. Answer any questions that the AI prompts you with, and wait for it to complete. +2. The framework will automatically generate the `copilot-instructions.md` file +3. The framework should also generate a series of files in a folder named `.docs` +4. Review these documents for accuracy, and fix any problems you see. + + +### Step 7: Follow the instructions from the `README.md` file to run eShop locally + +Follow the instructions in the `README.md` file at the base of the eShop repository to set up the application locally. You'll want to run the application using Aspire and Docker so that you can debug the application. Ensure that you can reach the Blazor Customer Portal, and that you can add an item from the catalog to your cart before you continue. 
## Next challenge -Now that you've setup the AI to be able to better understand and work within your repository, it's time to [understand how to write requirements!](./2-requirements.md) \ No newline at end of file +Now that you've set up the AI to better understand and work within your repository, it's time to [understand how to write requirements!](./2-requirements.md) + +[^1]: For more information, see [Container Runtime](https://learn.microsoft.com/en-us/dotnet/aspire/fundamentals/setup-tooling?tabs=linux%2Cunix&pivots=dotnet-cli#container-runtime) +[^2]: _VS Code supports all features; Visual Studio 2022 17.14.13 and JetBrains IDEs require workarounds noted throughout the workshop_ \ No newline at end of file diff --git a/goals/dotnet/2-requirements.md b/goals/dotnet/2-requirements.md index a9cf79a..76cbb0c 100644 --- a/goals/dotnet/2-requirements.md +++ b/goals/dotnet/2-requirements.md @@ -16,7 +16,7 @@ Your mission is to **design and document a new feature** for the eShop reference Before you begin, familiarize yourself with the existing eShop architecture by reviewing: -- **Architecture Diagrams**: `architectureDiagrams/*.md` - Architecture documentation created by GitHub Copilot +- **AI Generated documentation**: `.docs/*.md` - Project and Architecture documentation created by GitHub Copilot - **Current Services**: Identity, Catalog, Basket, Ordering, Webhooks, Mobile.BFF - **Technology Stack**: .NET 9, Aspire, PostgreSQL, Redis, RabbitMQ, Blazor @@ -39,161 +39,15 @@ Before you begin, familiarize yourself with the existing eShop architecture by r ### 🎨 **Your Own Idea** Create something unique that fits the e-commerce domain and showcases modern software engineering practices. -## 📝 Requirements Template +## 📝 Requirements Generation -Create a new markdown file named `feature-[your-feature-name].md` in a directory named `/backlog`. Here's a sample structure that can help! 
(also, see our example feature in the `exercise-files` folder!): +Pass your idea, with as much detail as you'd like, to the `/create-spec` prompt. The AI will then generate a `.docs/specs` folder with a specification detailing what it understands of your requirements. Go through all of these documents and correct any mistakes, add anything it missed, or remove features and functionality you don't want it to implement. -```markdown -# Feature: [Your Feature Name] - -## 📖 Executive Summary - -### Business Value -- [ ] **Problem Statement**: What business problem does this solve? - [ ] **Target Users**: Who will use this feature? - [ ] **Success Metrics**: How will you measure success? - [ ] **Business Impact**: Revenue, user engagement, operational efficiency? - -### Technical Alignment -- [ ] **Architecture Fit**: How does this align with the microservices architecture? - [ ] **Service Boundaries**: Which services will be affected or created? - [ ] **Data Ownership**: Which service owns the feature's data? - -## 🎯 Feature Requirements - -### Functional Requirements -1. **[Requirement ID]**: [Clear, testable requirement] - - **Acceptance Criteria**: - - Given [context] - - When [action] - - Then [expected outcome] - - **Priority**: Must Have / Should Have / Could Have / Won't Have - -2. **[Next Requirement]**... - -### Non-Functional Requirements -- **Performance**: Response time, throughput expectations -- **Scalability**: Expected load, growth patterns -- **Security**: Authentication, authorization, data protection -- **Reliability**: Availability, error handling, recovery -- **Usability**: User experience considerations - -## 🏗️ Technical Design - -### Service Architecture -- **New Services**: What new microservices need to be created? -- **Modified Services**: Which existing services need changes? -- **Service Communication**: How will services communicate? 
(sync/async) -- **Data Flow**: Map the data flow through your feature - -### Database Design -- **New Tables/Collections**: Schema design -- **Data Relationships**: How does your data relate to existing entities? -- **Migration Strategy**: How will you handle schema changes? - -### Event Design -- **Domain Events**: What events will your feature publish? -- **Integration Events**: How will you integrate with other services? -- **Event Handlers**: What background processing is needed? - -### API Design - -# Example API endpoints - -GET /api/[service]/[resource] -POST /api/[service]/[resource] -PUT /api/[service]/[resource]/{id} -DELETE /api/[service]/[resource]/{id} - -## πŸ”„ Implementation Roadmap - -### Phase 1: Foundation (Week 1-2) -- [ ] Service scaffolding -- [ ] Database schema -- [ ] Basic CRUD operations - -### Phase 2: Core Features (Week 3-4) -- [ ] Business logic implementation -- [ ] Event integration -- [ ] API development - -### Phase 3: Integration (Week 5-6) -- [ ] Frontend integration -- [ ] Service communication -- [ ] End-to-end testing - -### Phase 4: Polish (Week 7-8) -- [ ] Performance optimization -- [ ] Security hardening -- [ ] Documentation - -## πŸ§ͺ Testing Strategy - -### Unit Testing -- [ ] Service layer tests -- [ ] Domain logic tests -- [ ] Repository tests - -### Integration Testing -- [ ] API endpoint tests -- [ ] Database integration tests -- [ ] Event handler tests - -### End-to-End Testing -- [ ] User workflow tests -- [ ] Cross-service integration tests -- [ ] Performance tests - -## πŸ“Š Monitoring & Observability - -### Metrics -- [ ] Business metrics to track -- [ ] Technical metrics to monitor -- [ ] SLA/SLO definitions - -### Logging -- [ ] Structured logging requirements -- [ ] Log correlation across services -- [ ] Security audit logging - -### Alerting -- [ ] Critical alerts -- [ ] Performance degradation alerts -- [ ] Business metric alerts - -## 🚨 Risk Assessment - -### Technical Risks -- **Risk**: [Description] - - 
**Probability**: High/Medium/Low - - **Impact**: High/Medium/Low - - **Mitigation**: [Strategy] - -### Business Risks -- **Risk**: [Description] - - **Mitigation**: [Strategy] - -## 🎓 Decision Log - -Use this section to document key decisions made during feature design: - -### Decision 1: [Title] -- **Context**: Why was this decision needed? -- **Options Considered**: What alternatives were evaluated? -- **Decision**: What was chosen? -- **Rationale**: Why was this the best choice? -- **Consequences**: What are the implications? - -## 📚 References - -- [ ] External APIs or services referenced -- [ ] Design patterns used -- [ ] Industry best practices followed -- [ ] Performance benchmarks -``` + > [!IMPORTANT] > Try to be specific! Don't just tell the AI: "add a rewards system", give it details, such as: `create a customer loyalty program where for every $1 spent on the store, the customer earns 10 points. And for each 1,000 points the customer can redeem those points for $10 off their next purchase. Show the customer's point balance on their profile. And show the number of points earned under each item on the product page` > [!TIP] - > Try using Copilot to have it write your requirements for you! Copy the template, and surround it with <|TEMPLATE_START|><|TEMPLATE_END|>, and describe to the AI the Feature you'd like to plan! Then review and edit the generated markdown file to ensure that it matches what you want! + > Super-pro tip! You can create your requirements in a markdown file and then reference it for the AI in the same way we've referenced other files. We have a sample `requirements-template.md` file in this repository that has a great starter template to use for feeding detailed requirements to GitHub Copilot! [Check it out!](../../requirements-template.md) ## ✅ Success Criteria @@ -248,23 +102,23 @@ Your feature requirements document should demonstrate: Remember: **Good requirements are the foundation of great software**. 
Take your time to think through the problem space before jumping into solutions. The eShop application is a reference for modern software engineering practices; your feature should exemplify the same level of thoughtfulness and technical excellence. -**Start by creating your feature file and begin documenting your thinking process. Use Chat mode to help you if you're stuck!** +**Once you have a well-defined specification, you can start work!** -When you're created your requirements file, prompt Copilot in Agent Mode: +When you've created your specs, prompt Copilot in Agent Mode: -`Plan #feature-[your-feature-name].md` +`/execute-tasks` -This should trigger the planning process. The AI should validate and produce an implementation plan, and ask if you're ready to start! When you're ready to continue: +You can run it alone, or reference the specific task id from the `tasks.md` file you'd like to start with. Remember to help the AI keep track of completed tasks, so that you can easily start new conversations as context windows fill up. You can have the AI pick up a specific task id and sub-task with the same command: -`Act: #feature-[your-feature-name].md` +`/execute-task Resume 03bd0240-fdcc-48a9-832c-71c44193a375 task 1.10` From here, the AI will begin implementing your feature. Ensure you interact with the AI often: run unit tests, build and validate its progress, and provide feedback. Continue to use this process as you go until your feature is completed! ```mermaid flowchart LR -    Plan[Plan<br/>#story] --> Act[Act<br/>#story] -    Act --> Status[Status<br/>#story] -    Status --> Debug[Debug<br/>#story] -    Debug --> Plan +    Plan[Plan<br/>/create-spec] --> Act[Act<br/>/execute-tasks] +    Act --> Status[Status<br/>/generate-report] +    Status --> Debug[Debug<br/>/startDebugging] +    Debug --> Act style Plan stroke:#4F8EF7,stroke-width:2px style Act stroke:#4F8EF7,stroke-width:2px style Status stroke:#4F8EF7,stroke-width:2px diff --git a/goals/dotnet/projectBrief.md b/goals/dotnet/projectBrief.md deleted file mode 100644 index 74b321c..0000000 --- a/goals/dotnet/projectBrief.md +++ /dev/null @@ -1,40 +0,0 @@ -# Project Brief: eShop (.NET Reference Application) - -## Overview -The **eShop** project is a reference .NET application developed by the .NET team to demonstrate modern application architecture and development practices using the latest .NET technologies. It implements a full-featured e-commerce website and serves as a practical example for developers building cloud-native, scalable, and maintainable applications. - -## Purpose -The primary goals of the eShop project are to: -- Showcase best practices in .NET application development. -- Demonstrate the use of **.NET Aspire**, a new stack for building distributed applications. -- Provide a hands-on learning tool for developers exploring microservices, containerization, and modern frontend/backend integration. - -## Key Features -- **Architecture**: Services-based architecture using .NET Aspire. -- **Frontend**: Built with Blazor WebAssembly and ASP.NET Core. -- **Backend**: Includes APIs, microservices, and integration with databases and messaging systems. -- **DevOps Ready**: Includes CI/CD pipelines and Docker support. -- **AI Integration**: Features an AI-powered chatbot to assist users in product discovery and shopping. -- **Performance Optimizations**: - - Uses `MapStaticAssets` for optimized static file handling. - - Implements Brotli compression and fingerprinted file names for aggressive caching. - -## Technology Stack -- .NET 9 (latest version) -- ASP.NET Core -- Blazor -- Docker -- Visual Studio 2022+ -- Optional: .NET MAUI for cross-platform client apps - -## Getting Started -To run the eShop project: -1. 
Clone the repository: [github.com/dotnet/eshop](https://github.com/dotnet/eshop) -2. Install prerequisites: - - .NET 9 SDK - - Visual Studio 2022 (with ASP.NET and .NET Aspire workloads) - - Docker Desktop -3. Run the solution using Visual Studio or configure your environment using the provided PowerShell scripts. - -## Internal Use and Demonstrations -The eShop app has been featured in internal presentations such as the [dotnet conf 2024 Keynote - Copy 3](https://microsoft.sharepoint.com/teams/DevRelTeam/_layouts/15/Doc.aspx?sourcedoc=%7B15D653EE-CFB6-4C32-8669-EF8B2367FE95%7D&file=dotnet%20conf%202024%20Keynote%20-%20Copy%203.pptx&action=edit&mobileredirect=true&DefaultItemOpen=1&EntityRepresentationId=421e3d07-d5fd-4cc9-b4ee-9dac667e0f4b), where it was used to demonstrate .NET 9 \ No newline at end of file diff --git a/goals/java/1-setup.md b/goals/java/1-setup.md index 7c2a3f3..a856e26 100644 --- a/goals/java/1-setup.md +++ b/goals/java/1-setup.md @@ -1,85 +1,86 @@ ## 🚀 Setting up the AI Assisted Coding Framework in your project -This framework integrates two powerful MCP (Model Context Protocol) servers to supercharge your development workflow: +This framework integrates several powerful MCP (Model Context Protocol) tools to supercharge your development workflow: - **Context7 MCP**: Provides live documentation and code snippet retrieval for authoritative technical references -- **ConPort MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Memory MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Sequential Thinking MCP**: Assists the LLM with ordering tasks and breaking down complex ideas +- **Microsoft.Learn MCP**: Gives your LLM access to the entire Microsoft Learn knowledge base! 
Together, they transform GitHub Copilot into an intelligent development assistant that remembers project context, tracks architectural decisions, and maintains comprehensive project knowledge across sessions. ## 📋 Prerequisites -- **Docker** or **Podman** (for ConPort MCP) -- **Node.js 16+** (for Context7 MCP server) - Optional, see step 2 below -- **VS Code** with GitHub Copilot extension +- **Node.js 16+** (for MCP tools) +- **IDE** with GitHub Copilot[^2] +- **Java** 17+ +- **Gradle 7.5+** or **Maven 3.5+** - **Git** for version control ## 🛠️ Installation & Setup -### Step 1: Clone and Copy Framework Files +### Step 1: Clone the Framework Repository #### Windows Terminal: ```powershell # Clone this repository git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace 'your-project-path' with the actual path to your project -robocopy . "C:\path\to\your-project" /E /XD .git ``` -#### Linux / OSX Terminal: -```bash -git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace '/path/to/your-project' with the actual path to your project -rsync -av --exclude='.git' . /path/to/your-project/ -``` +We will use the files in this repository once you've set up your local workspace. Keep them somewhere easy to access; we recommend a folder such as `C:\github\` or `~/github` on Linux & OSX. ### Step 2 (optional): Change the MCP Configuration -By default, we utilize the local Contex7 MCP server for caching and speed. However, if you don't have NodeJS installed, and do not wish to use NodeJS, you can instead utilize iether the public Context7 MCP server, or build and run the Docker version of the Context7 MCP tool. The `.vscode/mcp.json` file has commented out configuration options for these two paths. +By default, we utilize the local MCP tools for caching and speed. 
However, if you don't have NodeJS installed and do not wish to use it, you can instead use Docker versions of each tool. To build and run the Docker versions of the tools, see the documentation for each to set up the Docker config: See the [Context7 Docker Readme](../../context7-docker.md) for instructions on how to use Docker to host Context7 locally. -Also, we are using Docker for hosting the Context Portal MCP server. We recommend you pull the image locally before you start up VS Code. To pull the image: +See the [Playwright MCP Readme](https://github.com/microsoft/playwright-mcp) for details on advanced configuration and Docker support for Playwright. + +See the [Sequential Thinking MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking) for details on how to configure it using Docker. -`docker pull seiggy/context-portal-mcp:0.2.18` +See the [Memory MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) for details on how to configure it using Docker. -If you are using Podman, replace `docker` with `podman` in both the command, and in the `mcp.json` file. If you do not have Docker installed, you can run the Context Portal MCP server locally using Python and UV. For further details on how to use the ConPort MCP server with Python, see the [Context Portal Python Readme](../../conportal-python.md). +> [!IMPORTANT] +> If you intend to do this, please do so before coming to the workshop, and ensure the tools are set up and configured. This can take time to set up and troubleshoot, so be prepared. Our recommendation is to use the NodeJS versions of the tools. -### Step 3: Open Project in VS Code +### Step 3: Clone the Spring Petclinic Repository +You'll be building this challenge using the popular Petclinic demo repository from Spring. 
Clone the [petclinic](https://github.com/spring-projects/spring-petclinic/) repository from the Spring team to an easy-to-access location on your machine. + +### Step 4: Copy the Framework to the petclinic repository + +Copy the `.github` and `.vscode` folders from the AI-Assisted-Coding repository to the root of your petclinic repository. The example scripts below assume you cloned both repositories to the root of `C:\github` or `~/github`, respectively. + +#### PowerShell ```powershell -# Open your project in VS Code -code . +cd c:\github\AI-Assisted-Coding +robocopy . "C:\github\spring-petclinic" /E /XD .git +``` + +#### Bash +```bash +cd ~/github +rsync -av --exclude='.git' . ~/github/spring-petclinic ``` -### Step 4: Update Your Project Brief -Before proceeding, overwrite the `projectBrief.md` in your project root that you copied from the AI-Assisted-Coding repository, with the `projectBrief.md` file we provided in this repository. -### Step 5: Verify GitHub Copilot Integration +### Step 5: Run the Analyze Workflow -1. Ensure GitHub Copilot extension is installed and activated -2. The framework will automatically detect the `copilot-instructions.md` file +Open the petclinic project using VS Code or your preferred IDE. +Run the `/analyze-product` prompt in Copilot Agent Mode to analyze your workspace and generate the documents that help the spec-driven development workflow operate smoothly. -### Step 6: Initialize ConPort MCP +### Step 6: Verify the framework setup is complete -After verifying Copilot integration, initialize the ConPort MCP memory system: +1. Answer any questions that the AI prompts you with, and wait for it to complete. +2. The framework will automatically generate the `copilot-instructions.md` file +3. The framework should also generate a series of files in a folder named `.docs` +4. Review these documents for accuracy, and fix any problems you see. -1. In the Copilot chat or comments, type: - ``` - Initialize ConPort - ``` -1. 
Follow the prompts to complete setup. - This step creates the persistent memory database and loads your project context before you start coding. -### Step 7: Have the AI conduct Architecture Review +### Step 7: Follow the instructions from the `README.md` file to run petclinic locally -Have Copilot Create an Architecture Review of the repo. Take a look at the `README.md` in the AI Assisted Coding framework for help on how to conduct the Architecture review. When it completes, you should end up with a new folder named `architectureDiagrams` that contains 5 ore more markdown files with charts, documentation, and more about the eShop Application! Make sure you pass the #codebase token in your command, so that VS Code will give the AI access to the codebase for the context! +Follow the instructions in the `README.md` file at the base of the petclinic repository to set up the application locally. You'll want to ensure that you can reach the application, search for owners, and see a list of veterinarians. ## Next challenge diff --git a/goals/java/2-requirements.md index f090b5c..4913711 100644 --- a/goals/java/2-requirements.md +++ b/goals/java/2-requirements.md @@ -1,8 +1,8 @@ -# Pet Clinic Feature Development Challenge +# Spring PetClinic Feature Development Challenge ## 🎯 Challenge Overview -Your mission is to **design and document a new feature** for the Pet Clinic reference application. This challenge will teach you how to write comprehensive feature requirements that integrate seamlessly with the ConPort workflow and architectural patterns we've created. +Your mission is to **design and document a new feature** for the PetClinic reference application. This challenge will teach you how to write comprehensive feature requirements that integrate seamlessly with the spec-driven development workflow and architectural patterns we've created. 
## πŸ“‹ What You'll Learn @@ -14,186 +14,36 @@ Your mission is to **design and document a new feature** for the Pet Clinic refe ## πŸ—οΈ Architecture Context -Before you begin, familiarize yourself with the existing Pet Clinic architecture by reviewing: +Before you begin, familiarize yourself with the existing PetClinic architecture by reviewing: -- **Architecture Diagrams**: `architectureDiagrams/*.md` - Architecture documentation created by GitHub Copilot +- **AI-generated documentation**: `.docs/*.md` - Project and Architecture documentation created by GitHub Copilot - **Current Services**: Identity, Catalog, Basket, Ordering, Webhooks, Mobile.BFF - **Technology Stack**: .NET 9, Aspire, PostgreSQL, Redis, RabbitMQ, Blazor ## 🎲 Feature Ideas (Choose One or Create Your Own) ### πŸ’‘ **Beginner Level** -- **Pet Images**: Add images to pet profiles for easier identification -- **Pet Details**: Enhance the pet profile to add more detailed information. Breed, weight, etc. -- **Integrate Social Login**: Add OIDC support using Entra, Auth0, or another social login provider +- **Pet Profile**: Add the ability to upload a photo for a customer's pet ### πŸ”₯ **Intermediate Level** -- **Pet Visit Enhanced Details**: Add ability to upload x-rays, details around diagnostics, tagging, and vet's notes. Add the ability to select which vet the pet saw as well. 
-- **Customer Portal**: Customer portal where a customer can edit their data and manage their pet's profiles +- **Veterinarian Schedule**: Veterinarian appointment schedule view +- **Loyalty Program**: Points-based rewards system with tier benefits ### πŸš€ **Advanced Level** -- **Pet Adoption System**: Pet adoptions with a public view of adotable pet profiles, and management page in the existing UI for managing pet's available for adoption -- **Veterinarian Schedule**: Add a full schedule system for scheduling visits -- **Customer Portal**: Add scheduling to customer portal with calendar view +- **Real-time Notifications**: Notify the customer via SMS when their next appointment is coming up! ### 🎨 **Your Own Idea** -Create something unique that fits the Veterinary Services domain and showcases modern software engineering practices. +Create something unique that fits the veterinarian domain and showcases modern software engineering practices. -## πŸ“ Requirements Template +## πŸ“ Requirements Generation -Create a new markdown file named `feature-[your-feature-name].md` in a directory named `/backlog`. Here's a sample structure that can help! (also, see our example feature in the `exercise-files` folder!): +Pass your idea with as much detail as you'd like to the `/create-spec` prompt. The AI will then generate a `.docs/specs` folder with a specification detailing what it understands of your requirements. You will want to go through all of these documents and correct any mistakes, add anything it missed, or remove features and functionality you don't want it to implement. -```markdown -# Feature: [Your Feature Name] - -## πŸ“– Executive Summary - -### Business Value -- [ ] **Problem Statement**: What business problem does this solve? - [ ] **Target Users**: Who will use this feature? - [ ] **Success Metrics**: How will you measure success? - [ ] **Business Impact**: Revenue, user engagement, operational efficiency? 
- -### Technical Alignment -- [ ] **Architecture Fit**: How does this align with the microservices architecture? -- [ ] **Service Boundaries**: Which services will be affected or created? -- [ ] **Data Ownership**: Which service owns the feature's data? - -## 🎯 Feature Requirements - -### Functional Requirements -1. **[Requirement ID]**: [Clear, testable requirement] - - **Acceptance Criteria**: - - Given [context] - - When [action] - - Then [expected outcome] - - **Priority**: Must Have / Should Have / Could Have / Won't Have - -2. **[Next Requirement]**... - -### Non-Functional Requirements -- **Performance**: Response time, throughput expectations -- **Scalability**: Expected load, growth patterns -- **Security**: Authentication, authorization, data protection -- **Reliability**: Availability, error handling, recovery -- **Usability**: User experience considerations - -## πŸ—οΈ Technical Design - -### Service Architecture -- **New Services**: What new microservices need to be created? -- **Modified Services**: Which existing services need changes? -- **Service Communication**: How will services communicate? (sync/async) -- **Data Flow**: Map the data flow through your feature - -### Database Design -- **New Tables/Collections**: Schema design -- **Data Relationships**: How does your data relate to existing entities? -- **Migration Strategy**: How will you handle schema changes? - -### Event Design -- **Domain Events**: What events will your feature publish? -- **Integration Events**: How will you integrate with other services? -- **Event Handlers**: What background processing is needed? 
- -### API Design - -# Example API endpoints - -GET /api/[service]/[resource] -POST /api/[service]/[resource] -PUT /api/[service]/[resource]/{id} -DELETE /api/[service]/[resource]/{id} - -## πŸ”„ Implementation Roadmap - -### Phase 1: Foundation (Week 1-2) -- [ ] Service scaffolding -- [ ] Database schema -- [ ] Basic CRUD operations - -### Phase 2: Core Features (Week 3-4) -- [ ] Business logic implementation -- [ ] Event integration -- [ ] API development - -### Phase 3: Integration (Week 5-6) -- [ ] Frontend integration -- [ ] Service communication -- [ ] End-to-end testing - -### Phase 4: Polish (Week 7-8) -- [ ] Performance optimization -- [ ] Security hardening -- [ ] Documentation - -## πŸ§ͺ Testing Strategy - -### Unit Testing -- [ ] Service layer tests -- [ ] Domain logic tests -- [ ] Repository tests - -### Integration Testing -- [ ] API endpoint tests -- [ ] Database integration tests -- [ ] Event handler tests - -### End-to-End Testing -- [ ] User workflow tests -- [ ] Cross-service integration tests -- [ ] Performance tests - -## πŸ“Š Monitoring & Observability - -### Metrics -- [ ] Business metrics to track -- [ ] Technical metrics to monitor -- [ ] SLA/SLO definitions - -### Logging -- [ ] Structured logging requirements -- [ ] Log correlation across services -- [ ] Security audit logging - -### Alerting -- [ ] Critical alerts -- [ ] Performance degradation alerts -- [ ] Business metric alerts - -## 🚨 Risk Assessment - -### Technical Risks -- **Risk**: [Description] - - **Probability**: High/Medium/Low - - **Impact**: High/Medium/Low - - **Mitigation**: [Strategy] - -### Business Risks -- **Risk**: [Description] - - **Mitigation**: [Strategy] - -## πŸŽ“ Decision Log - -Use this section to document key decisions made during feature design: - -### Decision 1: [Title] -- **Context**: Why was this decision needed? -- **Options Considered**: What alternatives were evaluated? -- **Decision**: What was chosen? -- **Rationale**: Why was this the best choice? 
-- **Consequences**: What are the implications? - -## πŸ“š References - -- [ ] External APIs or services referenced -- [ ] Design patterns used -- [ ] Industry best practices followed -- [ ] Performance benchmarks -``` + > [!IMPORTANT] + > Try to be specific! Don't just tell the AI: "add a rewards system", give it details, such as: `create a customer loyalty program where for every $1 spent on services, the customer earns 10 points. And for each 1,000 points the customer can redeem those points for $10 off their next visit. Show the customer's point balance on their profile. And show the number of points earned under each visit in their pet's record` > [!TIP] - > Try using Copilot to have it write your requirements for you! Copy the template, and surround it with <|TEMPLATE_START|><|TEMPLATE_END|>, and describe to the AI the Feature you'd like to plan! Then review and edit the generated markdown file to ensure that it matches what you want! + > Super-pro tip! You can create your requirements in a markdown file and then reference it to the AI in the same way we've referenced other files. We have a sample requirements-template.md file in this repository that has a great starter template to use for feeding detailed requirements to GitHub Copilot! [Check it out!](../../requirements-template.md) ## βœ… Success Criteria @@ -246,25 +96,25 @@ Your feature requirements document should demonstrate: ## πŸŽ‰ Ready to Begin? -Remember: **Good requirements are the foundation of great software**. Take your time to think through the problem space before jumping into solutions. The Pet Clinic application is a reference for modern software engineering practicesβ€”your feature should exemplify the same level of thoughtfulness and technical excellence. +Remember: **Good requirements are the foundation of great software**. Take your time to think through the problem space before jumping into solutions. 
The PetClinic application is a reference for modern software engineering practicesβ€”your feature should exemplify the same level of thoughtfulness and technical excellence. -**Start by creating your feature file and begin documenting your thinking process. Use Chat mode to help you if you're stuck!** +**Once you have a well-defined specification, you can start work!** -When you're created your requirements file, prompt Copilot in Agent Mode: +When you've created your specs, prompt Copilot in Agent Mode: -`Plan #feature-[your-feature-name].md` +`/execute-tasks` -This should trigger the planning process. The AI should validate and produce an implementation plan, and ask if you're ready to start! When you're ready to continue: +You can run it with no arguments, or reference the specific task id from the `tasks.md` file you'd like to start with. Remember to help the AI keep track of completed tasks, so that you can easily start new conversations as context windows fill up. You can have the AI pick up a specific task id and sub-task with the same command: -`Act: #feature-[your-feature-name].md` +`/execute-task Resume 03bd0240-fdcc-48a9-832c-71c44193a375 task 1.10` From here, the AI will begin implementing your feature. Ensure you interact with the AI often, running unit tests, building and validating its progress, and providing feedback. Continue to use this process as you go until your feature is completed! ```mermaid flowchart LR - Plan[Plan
#story] --> Act[Act
#story] - Act --> Status[Status
#story] - Status --> Debug[Debug
#story] - Debug --> Plan + Plan[Plan
/create-spec] --> Act[Act
/execute-tasks] + Act --> Status[Status
/generate-report] + Status --> Debug[Debug
/startDebugging] + Debug --> Act style Plan stroke:#4F8EF7,stroke-width:2px style Act stroke:#4F8EF7,stroke-width:2px style Status stroke:#4F8EF7,stroke-width:2px diff --git a/goals/java/projectBrief.md b/goals/java/projectBrief.md deleted file mode 100644 index e669361..0000000 --- a/goals/java/projectBrief.md +++ /dev/null @@ -1,41 +0,0 @@ -# Project Brief: Spring Petclinic Microservices - -## Overview -The **Spring Petclinic Microservices** project is a reference application based on the well-known Spring Petclinic sample, re-architected as a set of microservices. Developed and maintained by Azure Samples, it demonstrates how to build and deploy cloud-native Java applications on the Microsoft Azure platform. The application implements a full-featured veterinary clinic management system and serves as a practical example for developers. - -## Purpose -The primary goals of this project are to: -- Showcase best practices for building and deploying microservices with Spring Boot and Spring Cloud on Azure. -- Demonstrate the use of **Azure Spring Apps** for hosting scalable and resilient Java applications. -- Provide a hands-on learning tool for developers exploring microservices, Infrastructure as Code (IaC), and modern DevOps practices on Azure. -- Serve as an **Azure Developer CLI (`azd`)** template for rapid, repeatable environment provisioning and application deployment. - -## Key Features -- **Architecture**: A distributed microservices architecture using Spring Boot, including services for customers, vets, and visits, managed via a discovery server and an API gateway. -- **Frontend**: Built with AngularJS and Bootstrap, providing a dynamic user interface. -- **Backend**: A suite of microservices developed with Java and the Spring Framework. -- **DevOps Ready**: Includes CI/CD pipeline examples using GitHub Actions and Infrastructure as Code definitions with Bicep. 
-- **Cloud Integration**: Leverages key Azure services such as **Azure Spring Apps**, **Azure Database for MySQL**, and **Azure Key Vault** for comprehensive cloud-native capabilities. - -## Technology Stack -- Java 17+ -- Spring Boot & Spring Cloud -- AngularJS & Bootstrap -- Maven -- Docker -- Azure Spring Apps -- Azure Database for MySQL -- Azure Developer CLI (`azd`) -- Bicep - -## Getting Started -To run the Spring Petclinic project on Azure: -1. Clone the repository: `git clone https://github.com/Azure-Samples/spring-petclinic-microservices.git` -2. Install prerequisites: - - Azure Developer CLI (`azd`) - - Java 17 SDK - - Azure CLI -3. Run the solution on Azure using the `azd up` command, which handles provisioning and deployment. - -## Use as a Reference Sample -This application is used as a public reference sample in various Microsoft tutorials, learning modules, and documentation to demonstrate how to effectively build and deploy modern Java applications on Azure. \ No newline at end of file diff --git a/goals/python/1-setup.md b/goals/python/1-setup.md index 7c2a3f3..b4375e6 100644 --- a/goals/python/1-setup.md +++ b/goals/python/1-setup.md @@ -1,85 +1,85 @@ ## πŸš€ Setting up the AI Assisted Coding Framework in your project -This framework integrates two powerful MCP (Model Context Protocol) servers to supercharge your development workflow: +This framework integrates several powerful MCP (Model Context Protocol) tools to supercharge your development workflow: - **Context7 MCP**: Provides live documentation and code snippet retrieval for authoritative technical references -- **ConPort MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Memory MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Sequential Thinking MCP**: Assists the LLM with ordering tasks, and breaking down complex ideas +- **Microsoft.Learn MCP**: Give your LLM access to the entire Microsoft 
Learn knowledge base! Together, they transform GitHub Copilot into an intelligent development assistant that remembers project context, tracks architectural decisions, and maintains comprehensive project knowledge across sessions. ## πŸ“‹ Prerequisites -- **Docker** or **Podman** (for ConPort MCP) -- **Node.js 16+** (for Context7 MCP server) - Optional, see step 2 below -- **VS Code** with GitHub Copilot extension +- **Node.js 16+** (for MCP tools) +- **IDE** with GitHub Copilot[^1] +- **Docker** is recommended, or you can use the devcontainer. - **Git** for version control ## πŸ› οΈ Installation & Setup -### Step 1: Clone and Copy Framework Files +### Step 1: Clone The Framework Repository #### Windows Terminal: ```powershell # Clone this repository git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace 'your-project-path' with the actual path to your project -robocopy . "C:\path\to\your-project" /E /XD .git ``` -#### Linux / OSX Terminal: -```bash -git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace '/path/to/your-project' with the actual path to your project -rsync -av --exclude='.git' . /path/to/your-project/ -``` +We will use the files in this repository once you've set up your local workspace. Keep them somewhere easy to access; we recommend a folder such as `C:\github\` or `~/github` on Linux & OSX. ### Step 2 (optional): Change the MCP Configuration -By default, we utilize the local Contex7 MCP server for caching and speed. However, if you don't have NodeJS installed, and do not wish to use NodeJS, you can instead utilize iether the public Context7 MCP server, or build and run the Docker version of the Context7 MCP tool. The `.vscode/mcp.json` file has commented out configuration options for these two paths. 
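For orientation, the NodeJS-based server entries in `.vscode/mcp.json` typically look something like the sketch below. This is a minimal hand-written example, not the framework's shipped file: the `servers` layout follows VS Code's `mcp.json` schema, the npm package names come from each tool's public readme, and the Microsoft Learn entry assumes its documented HTTP endpoint β€” verify all of these against the repository's actual `mcp.json`.

```json
{
  "servers": {
    "memory": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "sequential-thinking": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
    },
    "context7": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    },
    "playwright": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@playwright/mcp@latest"]
    },
    "microsoft-learn": {
      "type": "http",
      "url": "https://learn.microsoft.com/api/mcp"
    }
  }
}
```

Switching a server to its Docker variant usually means replacing `command` and `args` with the `docker run -i --rm ...` invocation from that tool's readme; the server name key stays the same.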
+By default, we utilize the local MCP tools for caching and speed. However, if you don't have NodeJS installed and do not wish to use it, you can instead utilize Docker versions of each tool. To build and run the Docker versions of the tools, see the documentation for each one to set up the Docker config: See the [Context7 Docker Readme](../../context7-docker.md) for instructions on how to use Docker to host Context7 locally. -Also, we are using Docker for hosting the Context Portal MCP server. We recommend you pull the image locally before you start up VS Code. To pull the image: +See the [Playwright MCP Readme](https://github.com/microsoft/playwright-mcp) for details on advanced configuration and Docker support for Playwright. + +See the [Sequential Thinking MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking) for details on how to configure it using Docker. -`docker pull seiggy/context-portal-mcp:0.2.18` +See the [Memory MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) for details on how to configure it using Docker. -If you are using Podman, replace `docker` with `podman` in both the command, and in the `mcp.json` file. If you do not have Docker installed, you can run the Context Portal MCP server locally using Python and UV. For further details on how to use the ConPort MCP server with Python, see the [Context Portal Python Readme](../../conportal-python.md). +> [!IMPORTANT] +> If you intend to do this, please do it before coming to the workshop, and ensure the tools are set up and configured. This can take time to set up and troubleshoot, so be prepared. Our recommendation is to use the NodeJS versions of the tools. -### Step 3: Open Project in VS Code +### Step 3: Clone the Inventree Repository +You'll be building this challenge using the popular Inventree demo repository. 
Clone the [Inventree](https://github.com/inventree/InvenTree) repository from GitHub to an easy-to-access location on your machine. + +### Step 4: Copy the Framework to the inventree repository + +Copy the `.github` and `.vscode` folders from the AI-Assisted-Coding repository to the root of your inventree repository. The example scripts below assume you cloned both repositories to the root of `C:\github` or `~/github`, respectively. + +#### PowerShell ```powershell -# Open your project in VS Code -code . +cd c:\github\AI-Assisted-Coding +robocopy . "C:\github\inventree" /E /XD .git +``` + +#### Bash +```bash +cd ~/github +rsync -av --exclude='.git' . ~/github/inventree ``` -### Step 4: Update Your Project Brief -Before proceeding, overwrite the `projectBrief.md` in your project root that you copied from the AI-Assisted-Coding repository, with the `projectBrief.md` file we provided in this repository. -### Step 5: Verify GitHub Copilot Integration +### Step 5: Run the Analyze Workflow -1. Ensure GitHub Copilot extension is installed and activated -2. The framework will automatically detect the `copilot-instructions.md` file +Open the inventree project using VS Code or your preferred IDE. +Run the `/analyze-product` prompt in Copilot Agent Mode to analyze your workspace and generate the documents that help the spec-driven development workflow operate smoothly. -### Step 6: Initialize ConPort MCP +### Step 6: Verify the framework setup is complete -After verifying Copilot integration, initialize the ConPort MCP memory system: +1. Answer any questions that the AI prompts you with, and wait for it to complete. +2. The framework will automatically generate the `copilot-instructions.md` file +3. The framework should also generate a series of files in a folder named `.docs` +4. Review these documents for accuracy, and fix any problems you see. -1. In the Copilot chat or comments, type: - ``` - Initialize ConPort - ``` -1. Follow the prompts to complete setup. 
- This step creates the persistent memory database and loads your project context before you start coding. -### Step 7: Have the AI conduct Architecture Review +### Step 7: Follow the instructions from the `README.md` file to run inventree locally -Have Copilot Create an Architecture Review of the repo. Take a look at the `README.md` in the AI Assisted Coding framework for help on how to conduct the Architecture review. When it completes, you should end up with a new folder named `architectureDiagrams` that contains 5 ore more markdown files with charts, documentation, and more about the eShop Application! Make sure you pass the #codebase token in your command, so that VS Code will give the AI access to the codebase for the context! +Follow the instructions in the `README.md` file at the base of the inventree repository to set up the application locally. You'll want to ensure that you can reach the application, search inventory, and manage suppliers. ## Next challenge diff --git a/goals/python/2-requirements.md index 301f1f4..da2b6f7 100644 --- a/goals/python/2-requirements.md +++ b/goals/python/2-requirements.md @@ -16,7 +16,7 @@ Your mission is to **design and document a new feature** for the InvenTree refer Before you begin, familiarize yourself with the existing InvenTree architecture by reviewing: -- **Architecture Diagrams**: `architectureDiagrams/*.md` - Architecture documentation created by GitHub Copilot +- **AI-generated documentation**: `.docs/*.md` - Project and Architecture documentation created by GitHub Copilot - **Current Services**: Parts, Stock, Build, Orders, Reports, Admin - **Technology Stack**: Python 3.11, Django, React @@ -34,161 +34,15 @@ Before you begin, familiarize yourself with the existing InvenTree architecture ### 🎨 **Your Own Idea** Create something unique that fits the inventory management application and showcases modern software engineering practices. 
-## πŸ“ Requirements Template +## πŸ“ Requirements Generation -Create a new markdown file named `feature-[your-feature-name].md` in a directory named `/backlog`. Here's a sample structure that can help! (also, see our example feature in the `exercise-files` folder!): +Pass your idea with as much detail as you'd like to the `/create-spec` prompt. The AI will then generate a `.docs/specs` folder, with a specification detailing what it understands of your requirements. You will want to go through all of these documents, and correct any mistakes, add anything it missed, or take off features and functionality you don't want it to implement. -```markdown -# Feature: [Your Feature Name] - -## πŸ“– Executive Summary - -### Business Value -- [ ] **Problem Statement**: What business problem does this solve? -- [ ] **Target Users**: Who will use this feature? -- [ ] **Success Metrics**: How will you measure success? -- [ ] **Business Impact**: Revenue, user engagement, operational efficiency? - -### Technical Alignment -- [ ] **Architecture Fit**: How does this align with the microservices architecture? -- [ ] **Service Boundaries**: Which services will be affected or created? -- [ ] **Data Ownership**: Which service owns the feature's data? - -## 🎯 Feature Requirements - -### Functional Requirements -1. **[Requirement ID]**: [Clear, testable requirement] - - **Acceptance Criteria**: - - Given [context] - - When [action] - - Then [expected outcome] - - **Priority**: Must Have / Should Have / Could Have / Won't Have - -2. **[Next Requirement]**... 
- -### Non-Functional Requirements -- **Performance**: Response time, throughput expectations -- **Scalability**: Expected load, growth patterns -- **Security**: Authentication, authorization, data protection -- **Reliability**: Availability, error handling, recovery -- **Usability**: User experience considerations - -## πŸ—οΈ Technical Design - -### Service Architecture -- **New Services**: What new microservices need to be created? -- **Modified Services**: Which existing services need changes? -- **Service Communication**: How will services communicate? (sync/async) -- **Data Flow**: Map the data flow through your feature - -### Database Design -- **New Tables/Collections**: Schema design -- **Data Relationships**: How does your data relate to existing entities? -- **Migration Strategy**: How will you handle schema changes? - -### Event Design -- **Domain Events**: What events will your feature publish? -- **Integration Events**: How will you integrate with other services? -- **Event Handlers**: What background processing is needed? 
- -### API Design - -# Example API endpoints - -GET /api/[service]/[resource] -POST /api/[service]/[resource] -PUT /api/[service]/[resource]/{id} -DELETE /api/[service]/[resource]/{id} - -## πŸ”„ Implementation Roadmap - -### Phase 1: Foundation (Week 1-2) -- [ ] Service scaffolding -- [ ] Database schema -- [ ] Basic CRUD operations - -### Phase 2: Core Features (Week 3-4) -- [ ] Business logic implementation -- [ ] Event integration -- [ ] API development - -### Phase 3: Integration (Week 5-6) -- [ ] Frontend integration -- [ ] Service communication -- [ ] End-to-end testing - -### Phase 4: Polish (Week 7-8) -- [ ] Performance optimization -- [ ] Security hardening -- [ ] Documentation - -## πŸ§ͺ Testing Strategy - -### Unit Testing -- [ ] Service layer tests -- [ ] Domain logic tests -- [ ] Repository tests - -### Integration Testing -- [ ] API endpoint tests -- [ ] Database integration tests -- [ ] Event handler tests - -### End-to-End Testing -- [ ] User workflow tests -- [ ] Cross-service integration tests -- [ ] Performance tests - -## πŸ“Š Monitoring & Observability - -### Metrics -- [ ] Business metrics to track -- [ ] Technical metrics to monitor -- [ ] SLA/SLO definitions - -### Logging -- [ ] Structured logging requirements -- [ ] Log correlation across services -- [ ] Security audit logging - -### Alerting -- [ ] Critical alerts -- [ ] Performance degradation alerts -- [ ] Business metric alerts - -## 🚨 Risk Assessment - -### Technical Risks -- **Risk**: [Description] - - **Probability**: High/Medium/Low - - **Impact**: High/Medium/Low - - **Mitigation**: [Strategy] - -### Business Risks -- **Risk**: [Description] - - **Mitigation**: [Strategy] - -## πŸŽ“ Decision Log - -Use this section to document key decisions made during feature design: - -### Decision 1: [Title] -- **Context**: Why was this decision needed? -- **Options Considered**: What alternatives were evaluated? -- **Decision**: What was chosen? -- **Rationale**: Why was this the best choice? 
-- **Consequences**: What are the implications? - -## πŸ“š References - -- [ ] External APIs or services referenced -- [ ] Design patterns used -- [ ] Industry best practices followed -- [ ] Performance benchmarks -``` + > [!IMPORTANT] + > Try to be specific! Don't just tell the AI: "add a rewards system", give it details, such as: `create a customer loyalty program where for every $1 spent on the store, the customer earns 10 points. And for each 1,000 points the customer can redeem those points for $10 off their next purchase. Show the customer's point balance on their profile. And show the number of points earned under each item on the product page` > [!TIP] - > Try using Copilot to have it write your requirements for you! Copy the template, and surround it with <|TEMPLATE_START|><|TEMPLATE_END|>, and describe to the AI the Feature you'd like to plan! Then review and edit the generated markdown file to ensure that it matches what you want! + > Super-pro tip! You can create your requirements in a markdown file and then reference it to the AI in the same way we've referenced other files. We have a sample requirements-template.md file in this repository that has a great starter template to use for feeding detailed requirements to GitHub Copilot! [Check it out!](../../requirements-template.md) ## βœ… Success Criteria @@ -243,23 +97,23 @@ Your feature requirements document should demonstrate: Remember: **Good requirements are the foundation of great software**. Take your time to think through the problem space before jumping into solutions. The InvenTree application is a reference for modern software engineering practicesβ€”your feature should exemplify the same level of thoughtfulness and technical excellence. -**Start by creating your feature file and begin documenting your thinking process. 
Use Chat mode to help you if you're stuck!** +**Once you have a well-defined specification, you can start work!** -When you're created your requirements file, prompt Copilot in Agent Mode: +When you've created your specs, prompt Copilot in Agent Mode: -`Plan #feature-[your-feature-name].md` +`/execute-tasks` -This should trigger the planning process. The AI should validate and produce an implementation plan, and ask if you're ready to start! When you're ready to continue: +You can run it on its own, or reference the specific task id from the `tasks.md` file you'd like to start with. Remember to help the AI keep track of completed tasks, so that you can easily start new conversations as context windows fill up. You can have the AI pick up a specific task id and sub-task with the same command: -`Act: #feature-[your-feature-name].md` +`/execute-task Resume 03bd0240-fdcc-48a9-832c-71c44193a375 task 1.10` From here, the AI will begin implementing your feature. Ensure you interact with the AI often: run unit tests, build and validate its progress, and provide feedback. Continue to use this process as you go until your feature is completed! ```mermaid flowchart LR - Plan[Plan
#story] --> Act[Act
#story] - Act --> Status[Status
#story] - Status --> Debug[Debug
#story] - Debug --> Plan + Plan[Plan
/create-spec] --> Act[Act
/execute-tasks] + Act --> Status[Status
/generate-report] + Status --> Debug[Debug
/startDebugging] + Debug --> Act style Plan stroke:#4F8EF7,stroke-width:2px style Act stroke:#4F8EF7,stroke-width:2px style Status stroke:#4F8EF7,stroke-width:2px diff --git a/goals/python/projectBrief.md b/goals/python/projectBrief.md deleted file mode 100644 index 9d1e892..0000000 --- a/goals/python/projectBrief.md +++ /dev/null @@ -1,40 +0,0 @@ -# Project Brief: eShop (.NET Reference Application) - -## Overview -The **eShop** project is a reference .NET application developed by the .NET team to demonstrate modern application architecture and development practices using the latest .NET technologies. It implements a full-featured e-commerce website and serves as a practical example for developers building cloud-native, scalable, and maintainable applications. - -## Purpose -The primary goals of the eShop project are to: -- Showcase best practices in .NET application development. -- Demonstrate the use of **.NET Aspire**, a new stack for building distributed applications. -- Provide a hands-on learning tool for developers exploring microservices, containerization, and modern frontend/backend integration. - -## Key Features -- **Architecture**: Services-based architecture using .NET Aspire. -- **Frontend**: Built with Blazor WebAssembly and ASP.NET Core. -- **Backend**: Includes APIs, microservices, and integration with databases and messaging systems. -- **DevOps Ready**: Includes CI/CD pipelines and Docker support. -- **AI Integration**: Features an AI-powered chatbot to assist users in product discovery and shopping. -- **Performance Optimizations**: - - Uses `MapStaticAssets` for optimized static file handling. - - Implements Brotli compression and fingerprinted file names for aggressive caching. - -## Technology Stack -- .NET 9 (latest version) -- ASP.NET Core -- Blazor -- Docker -- Visual Studio 2022+ -- Optional: .NET MAUI for cross-platform client apps - -## Getting Started -To run the eShop project: -1. 
Clone the repository: [github.com/dotnet/eshop](https://github.com/dotnet/eshop) -2. Install prerequisites: - - .NET 9 SDK - - Visual Studio 2022 (with ASP.NET and .NET Aspire workloads) - - Docker Desktop -3. Run the solution using Visual Studio or configure your environment using the provided PowerShell scripts. - -## Internal Use and Demonstrations -The eShop app has been featured in internal presentations such as the `[dotnet conf 2024 Keynote - Copy 3](https://microsoft.sharepoint.com/teams/DevRelTeam/_layouts/15/Doc.aspx?sourcedoc=%7B15D653EE-CFB6-4C32-8669-EF8B2367FE95%7D&file=dotnet%20conf%202024%20Keynote%20-%20Copy%203.pptx&action=edit&mobileredirect=true&DefaultItemOpen=1&EntityRepresentationId=421e3d07-d5fd-4cc9-b4ee-9dac667e0f4b)`, where it was used to demonstrate .NET 9 \ No newline at end of file diff --git a/goals/typescript/1-setup.md b/goals/typescript/1-setup.md index 7c2a3f3..d64df0a 100644 --- a/goals/typescript/1-setup.md +++ b/goals/typescript/1-setup.md @@ -1,85 +1,85 @@ ## πŸš€ Setting up the AI Assisted Coding Framework in your project -This framework integrates two powerful MCP (Model Context Protocol) servers to supercharge your development workflow: +This framework integrates several powerful MCP (Model Context Protocol) tools to supercharge your development workflow: - **Context7 MCP**: Provides live documentation and code snippet retrieval for authoritative technical references -- **ConPort MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Memory MCP**: Delivers persistent project memory, decision tracking, and knowledge graph capabilities +- **Sequential Thinking MCP**: Assists the LLM with ordering tasks, and breaking down complex ideas +- **Microsoft.Learn MCP**: Give your LLM access to the entire Microsoft Learn knowledgebase! 
Together, they transform GitHub Copilot into an intelligent development assistant that remembers project context, tracks architectural decisions, and maintains comprehensive project knowledge across sessions. ## πŸ“‹ Prerequisites -- **Docker** or **Podman** (for ConPort MCP) -- **Node.js 16+** (for Context7 MCP server) - Optional, see step 2 below -- **VS Code** with GitHub Copilot extension +- **Node.js 16+** (for MCP tools) +- **IDE** with GitHub Copilot[^1] +- **Docker** is recommended, or you can use the devcontainer. - **Git** for version control ## πŸ› οΈ Installation & Setup -### Step 1: Clone and Copy Framework Files +### Step 1: Clone the Framework Repository #### Windows Terminal: ```powershell # Clone this repository git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace 'your-project-path' with the actual path to your project -robocopy . "C:\path\to\your-project" /E /XD .git ``` -#### Linux / OSX Terminal: -```bash -git clone https://github.com/ChrisMcKee1/AI-Assisted-Coding.git -cd AI-Assisted-Coding - -# Copy all framework files to your project's root directory -# Replace '/path/to/your-project' with the actual path to your project -rsync -av --exclude='.git' . /path/to/your-project/ -``` +We will use the files in this repository once you've set up your local workspace. Keep them somewhere easy to access; we recommend a folder such as `C:\github\` on Windows or `~/github` on Linux & macOS. ### Step 2 (optional): Change the MCP Configuration -By default, we utilize the local Contex7 MCP server for caching and speed. However, if you don't have NodeJS installed, and do not wish to use NodeJS, you can instead utilize iether the public Context7 MCP server, or build and run the Docker version of the Context7 MCP tool. The `.vscode/mcp.json` file has commented out configuration options for these two paths.
+By default, we utilize the local MCP tools for caching and speed. However, if you don't have NodeJS installed and do not wish to use NodeJS, you can instead use Docker versions of each tool. To build and run the Docker versions of the tools, see the documentation for each to set up the Docker config: See the [Context7 Docker Readme](../../context7-docker.md) for instructions on how to use Docker to host Context7 locally. -Also, we are using Docker for hosting the Context Portal MCP server. We recommend you pull the image locally before you start up VS Code. To pull the image: +See the [Playwright MCP Readme](https://github.com/microsoft/playwright-mcp) for details on advanced configuration and Docker support for Playwright. + +See the [Sequential Thinking MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking) for details on how to configure it using Docker. -`docker pull seiggy/context-portal-mcp:0.2.18` +See the [Memory MCP Readme](https://github.com/modelcontextprotocol/servers/tree/main/src/memory) for details on how to configure it using Docker. -If you are using Podman, replace `docker` with `podman` in both the command, and in the `mcp.json` file. If you do not have Docker installed, you can run the Context Portal MCP server locally using Python and UV. For further details on how to use the ConPort MCP server with Python, see the [Context Portal Python Readme](../../conportal-python.md). +> [!IMPORTANT] +> If you intend to do this, please do it before coming to the workshop, and ensure the tools are set up and configured. This can take time to set up and troubleshoot, so be prepared. Our recommendation is to use the NodeJS versions of the tools. -### Step 3: Open Project in VS Code +### Step 3: Clone the ngLibrary Repository +You'll be building this challenge using the ngLibrary demo repository.
Clone the [ngLibrary](https://github.com/mrWh1te/ngLibrary/) repository from GitHub to an easy-to-access location on your machine. + +### Step 4: Copy the Framework to the ngLibrary repository + +Copy the `.github` and `.vscode` folders from the AI-Assisted-Coding repository to the root of your ngLibrary repository. The example scripts below assume you cloned both repositories to the root of `C:\github` or `~/github`, respectively. + +#### PowerShell ```powershell -# Open your project in VS Code -code . +cd C:\github\AI-Assisted-Coding +robocopy . "C:\github\ngLibrary" /E /XD .git +``` + +#### Bash +```bash +cd ~/github/AI-Assisted-Coding +rsync -av --exclude='.git' . ~/github/ngLibrary +``` -### Step 4: Update Your Project Brief -Before proceeding, overwrite the `projectBrief.md` in your project root that you copied from the AI-Assisted-Coding repository, with the `projectBrief.md` file we provided in this repository. -### Step 5: Verify GitHub Copilot Integration +### Step 5: Run the Analyze Workflow -1. Ensure GitHub Copilot extension is installed and activated -2. The framework will automatically detect the `copilot-instructions.md` file +Open the ngLibrary solution using VS Code or your preferred IDE. +Run the `/analyze-product` prompt in Copilot Agent Mode to analyze your workspace and generate the documents that help the spec-driven development workflow operate smoothly. -### Step 6: Initialize ConPort MCP +### Step 6: Verify the Framework Setup Is Complete -After verifying Copilot integration, initialize the ConPort MCP memory system: +1. Answer any questions that the AI prompts you with, and wait for it to complete. +2. The framework will automatically generate the `copilot-instructions.md` file. +3. The framework should also generate a series of files in a folder named `.docs`. +4. Review these documents for accuracy, and fix any problems you see. -1. In the Copilot chat or comments, type: - ``` - Initialize ConPort - ``` -1. Follow the prompts to complete setup.
- This step creates the persistent memory database and loads your project context before you start coding. -### Step 7: Have the AI conduct Architecture Review +### Step 7: Follow the instructions from the `README.md` file to run ngLibrary locally -Have Copilot Create an Architecture Review of the repo. Take a look at the `README.md` in the AI Assisted Coding framework for help on how to conduct the Architecture review. When it completes, you should end up with a new folder named `architectureDiagrams` that contains 5 ore more markdown files with charts, documentation, and more about the eShop Application! Make sure you pass the #codebase token in your command, so that VS Code will give the AI access to the codebase for the context! +Follow the instructions in the `README.md` file at the base of the ngLibrary repository to set up the application locally. You'll want to ensure that you can reach the application, search inventory, and manage suppliers. ## Next challenge diff --git a/goals/typescript/2-requirements.md b/goals/typescript/2-requirements.md index abc82a0..03f880b 100644 --- a/goals/typescript/2-requirements.md +++ b/goals/typescript/2-requirements.md @@ -16,7 +16,7 @@ Your mission is to **design and document a new feature** for the ngLibrary refer Before you begin, familiarize yourself with the existing ngLibrary architecture by reviewing: -- **Architecture Diagrams**: `architectureDiagrams/*.md` - Architecture documentation created by GitHub Copilot +- **AI Generated documentation**: `docs/*.md` - Project and Architecture documentation created by GitHub Copilot - **Current Modules**: App, Books, Cart, Checkout, Layouts, Core - **Technology Stack**: Typescript 4, Angular 11 @@ -38,161 +38,15 @@ Before you begin, familiarize yourself with the existing ngLibrary architecture ### 🎨 **Your Own Idea** Create something unique that fits the library management domain and showcases modern software engineering practices.
-## πŸ“ Requirements Template +## πŸ“ Requirements Generation -Create a new markdown file named `feature-[your-feature-name].md` in a directory named `/backlog`. Here's a sample structure that can help! (also, see our example feature in the `exercise-files` folder!): +Pass your idea with as much detail as you'd like to the `/create-spec` prompt. The AI will then generate a `.docs/specs` folder, with a specification detailing what it understands of your requirements. You will want to go through all of these documents, and correct any mistakes, add anything it missed, or take off features and functionality you don't want it to implement. -```markdown -# Feature: [Your Feature Name] - -## πŸ“– Executive Summary - -### Business Value -- [ ] **Problem Statement**: What business problem does this solve? -- [ ] **Target Users**: Who will use this feature? -- [ ] **Success Metrics**: How will you measure success? -- [ ] **Business Impact**: Revenue, user engagement, operational efficiency? - -### Technical Alignment -- [ ] **Architecture Fit**: How does this align with the microservices architecture? -- [ ] **Service Boundaries**: Which services will be affected or created? -- [ ] **Data Ownership**: Which service owns the feature's data? - -## 🎯 Feature Requirements - -### Functional Requirements -1. **[Requirement ID]**: [Clear, testable requirement] - - **Acceptance Criteria**: - - Given [context] - - When [action] - - Then [expected outcome] - - **Priority**: Must Have / Should Have / Could Have / Won't Have - -2. **[Next Requirement]**... 
- -### Non-Functional Requirements -- **Performance**: Response time, throughput expectations -- **Scalability**: Expected load, growth patterns -- **Security**: Authentication, authorization, data protection -- **Reliability**: Availability, error handling, recovery -- **Usability**: User experience considerations - -## πŸ—οΈ Technical Design - -### Service Architecture -- **New Services**: What new microservices need to be created? -- **Modified Services**: Which existing services need changes? -- **Service Communication**: How will services communicate? (sync/async) -- **Data Flow**: Map the data flow through your feature - -### Database Design -- **New Tables/Collections**: Schema design -- **Data Relationships**: How does your data relate to existing entities? -- **Migration Strategy**: How will you handle schema changes? - -### Event Design -- **Domain Events**: What events will your feature publish? -- **Integration Events**: How will you integrate with other services? -- **Event Handlers**: What background processing is needed? 
- -### API Design - -# Example API endpoints - -GET /api/[service]/[resource] -POST /api/[service]/[resource] -PUT /api/[service]/[resource]/{id} -DELETE /api/[service]/[resource]/{id} - -## πŸ”„ Implementation Roadmap - -### Phase 1: Foundation (Week 1-2) -- [ ] Service scaffolding -- [ ] Database schema -- [ ] Basic CRUD operations - -### Phase 2: Core Features (Week 3-4) -- [ ] Business logic implementation -- [ ] Event integration -- [ ] API development - -### Phase 3: Integration (Week 5-6) -- [ ] Frontend integration -- [ ] Service communication -- [ ] End-to-end testing - -### Phase 4: Polish (Week 7-8) -- [ ] Performance optimization -- [ ] Security hardening -- [ ] Documentation - -## πŸ§ͺ Testing Strategy - -### Unit Testing -- [ ] Service layer tests -- [ ] Domain logic tests -- [ ] Repository tests - -### Integration Testing -- [ ] API endpoint tests -- [ ] Database integration tests -- [ ] Event handler tests - -### End-to-End Testing -- [ ] User workflow tests -- [ ] Cross-service integration tests -- [ ] Performance tests - -## πŸ“Š Monitoring & Observability - -### Metrics -- [ ] Business metrics to track -- [ ] Technical metrics to monitor -- [ ] SLA/SLO definitions - -### Logging -- [ ] Structured logging requirements -- [ ] Log correlation across services -- [ ] Security audit logging - -### Alerting -- [ ] Critical alerts -- [ ] Performance degradation alerts -- [ ] Business metric alerts - -## 🚨 Risk Assessment - -### Technical Risks -- **Risk**: [Description] - - **Probability**: High/Medium/Low - - **Impact**: High/Medium/Low - - **Mitigation**: [Strategy] - -### Business Risks -- **Risk**: [Description] - - **Mitigation**: [Strategy] - -## πŸŽ“ Decision Log - -Use this section to document key decisions made during feature design: - -### Decision 1: [Title] -- **Context**: Why was this decision needed? -- **Options Considered**: What alternatives were evaluated? -- **Decision**: What was chosen? -- **Rationale**: Why was this the best choice? 
-- **Consequences**: What are the implications? - -## πŸ“š References - -- [ ] External APIs or services referenced -- [ ] Design patterns used -- [ ] Industry best practices followed -- [ ] Performance benchmarks -``` + > [!IMPORTANT] + > Try to be specific! Don't just tell the AI "add a rewards system"; give it details, such as: `create a customer loyalty program where for every $1 spent in the store, the customer earns 10 points. And for each 1,000 points the customer can redeem those points for $10 off their next purchase. Show the customer's point balance on their profile. And show the number of points earned under each item on the product page` > [!TIP] - > Try using Copilot to have it write your requirements for you! Copy the template, and surround it with <|TEMPLATE_START|><|TEMPLATE_END|>, and describe to the AI the Feature you'd like to plan! Then review and edit the generated markdown file to ensure that it matches what you want! + > Super-pro tip! You can create your requirements in a markdown file and then reference it to the AI in the same way we've referenced other files. We have a sample `requirements-template.md` file in this repository with a great starter template for feeding detailed requirements to GitHub Copilot! [Check it out!](../../requirements-template.md) ## βœ… Success Criteria @@ -247,23 +101,23 @@ Your feature requirements document should demonstrate: Remember: **Good requirements are the foundation of great software**. Take your time to think through the problem space before jumping into solutions. The ngLibrary application is a reference for modern software engineering practicesβ€”your feature should exemplify the same level of thoughtfulness and technical excellence. -**Start by creating your feature file and begin documenting your thinking process.
Use Chat mode to help you if you're stuck!** +**Once you have a well-defined specification, you can start work!** -When you're created your requirements file, prompt Copilot in Agent Mode: +When you've created your specs, prompt Copilot in Agent Mode: -`Plan #feature-[your-feature-name].md` +`/execute-tasks` -This should trigger the planning process. The AI should validate and produce an implementation plan, and ask if you're ready to start! When you're ready to continue: +You can run it on its own, or reference the specific task id from the `tasks.md` file you'd like to start with. Remember to help the AI keep track of completed tasks, so that you can easily start new conversations as context windows fill up. You can have the AI pick up a specific task id and sub-task with the same command: -`Act: #feature-[your-feature-name].md` +`/execute-task Resume 03bd0240-fdcc-48a9-832c-71c44193a375 task 1.10` From here, the AI will begin implementing your feature. Ensure you interact with the AI often: run unit tests, build and validate its progress, and provide feedback. Continue to use this process as you go until your feature is completed! ```mermaid flowchart LR - Plan[Plan
#story] --> Act[Act
#story] - Act --> Status[Status
#story] - Status --> Debug[Debug
#story] - Debug --> Plan + Plan[Plan
/create-spec] --> Act[Act
/execute-tasks] + Act --> Status[Status
/generate-report] + Status --> Debug[Debug
/startDebugging] + Debug --> Act style Plan stroke:#4F8EF7,stroke-width:2px style Act stroke:#4F8EF7,stroke-width:2px style Status stroke:#4F8EF7,stroke-width:2px diff --git a/goals/typescript/projectBrief.md b/goals/typescript/projectBrief.md deleted file mode 100644 index 0fd51e2..0000000 --- a/goals/typescript/projectBrief.md +++ /dev/null @@ -1,40 +0,0 @@ -# Project Brief: ngLibrary (Angular Reference Application) - -## Overview -The **ngLibrary** project is a reference Angular application that demonstrates modern web application architecture and development practices. It implements a full-featured e-commerce website for a bookstore and serves as a practical example for developers building scalable and maintainable applications with Angular. - -## Purpose -The primary goals of the ngLibrary project are to: -- Showcase best practices in Angular application development. -- Demonstrate the use of **NgRx** for state management in a real-world application. -- Provide a hands-on learning tool for developers exploring reactive programming, component-based architecture, and comprehensive testing strategies. - -## Key Features -- **Architecture**: Feature-based, reactive architecture using NgRx for state management. It follows a clear separation of concerns with smart (container) and UI (presentational) components. -- **Frontend**: Built with **Angular 11** and TypeScript. -- **Backend Integration**: Fetches book data from the public **Open Library API**. -- **Comprehensive Testing**: Includes unit tests with **Jest** and end-to-end tests with **Cypress**. -- **Component-Driven Development**: Uses **Storybook** to develop and showcase UI components in isolation. -- **Performance Optimizations**: - - Implements **lazy loading** for feature modules to improve initial load time. - - Uses `PreloadAllModules` strategy to preload lazy-loaded modules in the background. 
- -## Technology Stack -- Angular 11 -- NgRx (for state management) -- RxJS -- TypeScript -- Jest (for unit testing) -- Cypress (for end-to-end testing) -- Storybook (for UI component development) - -## Getting Started -To run the ngLibrary project: -1. Clone the repository: `git clone https://github.com/mrWh1te/ngLibrary.git` -2. Install prerequisites: - - Node.js - - Angular CLI -3. Install dependencies: `npm install` -4. Run the development server: `npm start` -5. To run tests: `npm test` -6. To view components in Storybook: `npm run storybook` \ No newline at end of file diff --git a/requirements-template.md b/requirements-template.md new file mode 100644 index 0000000..44f757f --- /dev/null +++ b/requirements-template.md @@ -0,0 +1,146 @@ +# Feature: [Your Feature Name] + +## πŸ“– Executive Summary + +### Business Value +- [ ] **Problem Statement**: What business problem does this solve? +- [ ] **Target Users**: Who will use this feature? +- [ ] **Success Metrics**: How will you measure success? +- [ ] **Business Impact**: Revenue, user engagement, operational efficiency? + +### Technical Alignment +- [ ] **Architecture Fit**: How does this align with the microservices architecture? +- [ ] **Service Boundaries**: Which services will be affected or created? +- [ ] **Data Ownership**: Which service owns the feature's data? + +## 🎯 Feature Requirements + +### Functional Requirements +1. **[Requirement ID]**: [Clear, testable requirement] + - **Acceptance Criteria**: + - Given [context] + - When [action] + - Then [expected outcome] + - **Priority**: Must Have / Should Have / Could Have / Won't Have + +2. **[Next Requirement]**... 
+ +### Non-Functional Requirements +- **Performance**: Response time, throughput expectations +- **Scalability**: Expected load, growth patterns +- **Security**: Authentication, authorization, data protection +- **Reliability**: Availability, error handling, recovery +- **Usability**: User experience considerations + +## πŸ—οΈ Technical Design + +### Service Architecture +- **New Services**: What new microservices need to be created? +- **Modified Services**: Which existing services need changes? +- **Service Communication**: How will services communicate? (sync/async) +- **Data Flow**: Map the data flow through your feature + +### Database Design +- **New Tables/Collections**: Schema design +- **Data Relationships**: How does your data relate to existing entities? +- **Migration Strategy**: How will you handle schema changes? + +### Event Design +- **Domain Events**: What events will your feature publish? +- **Integration Events**: How will you integrate with other services? +- **Event Handlers**: What background processing is needed? 
+ +### API Design + +# Example API endpoints + +GET /api/[service]/[resource] +POST /api/[service]/[resource] +PUT /api/[service]/[resource]/{id} +DELETE /api/[service]/[resource]/{id} + +## πŸ”„ Implementation Roadmap + +### Phase 1: Foundation (Week 1-2) +- [ ] Service scaffolding +- [ ] Database schema +- [ ] Basic CRUD operations + +### Phase 2: Core Features (Week 3-4) +- [ ] Business logic implementation +- [ ] Event integration +- [ ] API development + +### Phase 3: Integration (Week 5-6) +- [ ] Frontend integration +- [ ] Service communication +- [ ] End-to-end testing + +### Phase 4: Polish (Week 7-8) +- [ ] Performance optimization +- [ ] Security hardening +- [ ] Documentation + +## πŸ§ͺ Testing Strategy + +### Unit Testing +- [ ] Service layer tests +- [ ] Domain logic tests +- [ ] Repository tests + +### Integration Testing +- [ ] API endpoint tests +- [ ] Database integration tests +- [ ] Event handler tests + +### End-to-End Testing +- [ ] User workflow tests +- [ ] Cross-service integration tests +- [ ] Performance tests + +## πŸ“Š Monitoring & Observability + +### Metrics +- [ ] Business metrics to track +- [ ] Technical metrics to monitor +- [ ] SLA/SLO definitions + +### Logging +- [ ] Structured logging requirements +- [ ] Log correlation across services +- [ ] Security audit logging + +### Alerting +- [ ] Critical alerts +- [ ] Performance degradation alerts +- [ ] Business metric alerts + +## 🚨 Risk Assessment + +### Technical Risks +- **Risk**: [Description] + - **Probability**: High/Medium/Low + - **Impact**: High/Medium/Low + - **Mitigation**: [Strategy] + +### Business Risks +- **Risk**: [Description] + - **Mitigation**: [Strategy] + +## πŸŽ“ Decision Log + +Use this section to document key decisions made during feature design: + +### Decision 1: [Title] +- **Context**: Why was this decision needed? +- **Options Considered**: What alternatives were evaluated? +- **Decision**: What was chosen? +- **Rationale**: Why was this the best choice? 
+- **Consequences**: What are the implications? + +## πŸ“š References + +- [ ] External APIs or services referenced +- [ ] Design patterns used +- [ ] Industry best practices followed +- [ ] Performance benchmarks \ No newline at end of file diff --git a/screenshots/rider_context_menu.png b/screenshots/rider_context_menu.png new file mode 100644 index 0000000..76ced88 Binary files /dev/null and b/screenshots/rider_context_menu.png differ