This project demonstrates a generative AI use case using AWS Lambda, CloudFront, and streaming responses. It leverages a serverless architecture to provide real-time chat capabilities.
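As a rough illustration of the streaming piece, the sketch below shows what a Lambda response-streaming handler can look like in TypeScript. It assumes the Lambda Node.js runtime's `awslambda.streamifyResponse` global; the request shape, handler body, and word-by-word token source are placeholders for illustration, not this project's actual `apps/server` code.

```ts
import type { Writable } from "node:stream";

// `awslambda` is a global injected by the Lambda Node.js runtime; declare
// only the piece used here so the file compiles without extra type packages.
declare const awslambda: {
  streamifyResponse(
    fn: (event: { body?: string }, responseStream: Writable) => Promise<void>
  ): unknown;
};

export const handler = awslambda.streamifyResponse(async (event, responseStream) => {
  // Parse the chat prompt from the request body (request shape assumed for illustration).
  const { prompt = "" } = event.body ? JSON.parse(event.body) : {};

  // Placeholder token source: a real handler would stream tokens from a model provider.
  for (const token of `You said: ${prompt}`.split(" ")) {
    responseStream.write(token + " ");           // flush each chunk to the client as it is produced
    await new Promise((r) => setTimeout(r, 50)); // simulate generation latency
  }
  responseStream.end();
});
```

Response streaming of this kind is typically exposed through a Lambda function URL configured with the `RESPONSE_STREAM` invoke mode, which CloudFront then fronts as an origin.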
- `apps/server`: A Node.js/TypeScript server application containerized for AWS Lambda.
- `infra`: Terraform configuration and scripts for deploying the AWS infrastructure (CloudFront, Lambda, ECR, ACM).
- `notebook`: Jupyter notebooks for experimentation and testing.
- `docs`: Documentation for the project.
- Infrastructure: Details on the AWS resources and Terraform configuration.
- Lambda Streaming: Specifics on the streaming chat Lambda and its deployment.
- Components: Details on the application components (Server, CloudFront Functions).
- Infrastructure Scripts:
  - Full Deployment: `make deploy`
  - Test Streaming: `make test` (a client-side sketch of a streaming check follows this list)
- CloudFront Function: `infra/functions/auth.js`
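As referenced above, a streaming check can be as simple as reading the HTTP response body incrementally and confirming that chunks arrive before the response finishes. The endpoint URL, `CHAT_ENDPOINT` variable, and request body below are assumptions for illustration; the repository's `make test` target may work differently.

```ts
// Placeholder endpoint: substitute the CloudFront distribution URL for this deployment.
const ENDPOINT = process.env.CHAT_ENDPOINT ?? "https://example.cloudfront.net/chat";

async function testStreaming(): Promise<void> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ prompt: "Hello!" }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`Request failed with status ${res.status}`);
  }

  // Read the body incrementally; with streaming enabled, chunks arrive
  // before the full response has finished generating.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
  process.stdout.write("\n");
}

testStreaming().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The script relies on the global `fetch` available in Node 18+.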
Created by Warike technologies