SahithiAekka/Lex_Bot
Lex Bot — AI-Powered Professional Assistant


An intelligent conversational AI assistant built on AWS that answers questions about my professional background using natural language processing and generative AI. Built to demonstrate serverless AI integration at conferences and networking events.

Architecture

```mermaid
flowchart LR
    User["👤 Recruiter / Contact"] -->|"Asks a question"| Lex["Amazon Lex V2\n(Chatbot)"]

    Lex -->|"Triggers"| Lambda["AWS Lambda\n(Python 3.11)"]

    Lambda -->|"1. Analyze entities"| Comprehend["Amazon\nComprehend\n(NLP)"]
    Lambda -->|"2. Generate response"| Bedrock["Amazon Bedrock\n(Titan Text Express)"]
    Lambda -->|"Logs"| CW["CloudWatch\nLogs"]

    Comprehend -->|"Entities detected"| Lambda
    Bedrock -->|"AI-generated answer"| Lambda
    Lambda -->|"Professional response"| Lex
    Lex -->|"Reply"| User

    subgraph IAM ["IAM Security"]
        Role["LambdaExecRole"]
        Role -.-> P1["AWSLambdaBasicExecutionRole"]
        Role -.-> P2["ComprehendReadOnly"]
        Role -.-> P3["BedrockInvokePolicy\n(least-privilege)"]
    end

    Lambda --- Role

    style Lex fill:#ff9900,color:#fff
    style Lambda fill:#ff9900,color:#fff
    style Comprehend fill:#3b48cc,color:#fff
    style Bedrock fill:#7b2d8e,color:#fff
    style CW fill:#ff4f8b,color:#fff
```

Problem → Solution → Result

- **Problem:** Networking at conferences means repeating the same career summary hundreds of times, often missing the chance to highlight relevant experience for each person's specific question.
- **Solution:** A conversational AI chatbot powered by Amazon Lex, Lambda, Comprehend (entity detection), and Bedrock (generative AI) that dynamically answers recruiter questions based on resume context.
- **Result:** Context-aware, professional responses generated in real time: the bot detects what the person is asking about and tailors the answer to highlight relevant skills and experience.

How It Works

  1. User asks a question → Amazon Lex V2 captures the intent and input
  2. Lambda analyzes the message → Amazon Comprehend extracts entities (skills, job titles, companies)
  3. AI generates a response → Amazon Bedrock (Titan Text Express) crafts a professional, context-aware answer using the detected entities + resume summary
  4. Response returned → Natural, conversational reply sent back through Lex
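The four steps above can be sketched as a Lambda handler. This is a minimal, illustrative version, not the repo's `lambda_function.py`: the resume summary, prompt wording, and generation config are assumptions, while the Comprehend/Bedrock calls and the Lex V2 event/response shapes follow the standard AWS formats.

```python
import json

# In the real function the resume context comes from an environment variable;
# this literal is an illustrative stand-in.
RESUME_SUMMARY = "Cloud engineer experienced with AWS serverless and GenAI."


def build_prompt(question, entities):
    """Combine the user's question, detected entities, and resume context."""
    skills = ", ".join(e["Text"] for e in entities) or "general background"
    return (
        f"Context: {RESUME_SUMMARY}\n"
        f"The user asked about: {skills}\n"
        f"Question: {question}\n"
        "Answer professionally in 2-3 sentences."
    )


def close_response(intent_name, message):
    """Lex V2 fulfillment response that closes the dialog with a message."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }


def lambda_handler(event, context):
    import boto3  # imported here so the pure helpers above are testable offline

    question = event["inputTranscript"]
    intent_name = event["sessionState"]["intent"]["name"]

    # 1-2. Entity detection with Comprehend
    comprehend = boto3.client("comprehend")
    entities = comprehend.detect_entities(Text=question, LanguageCode="en")["Entities"]

    # 3. Response generation with Bedrock (Titan Text Express)
    bedrock = boto3.client("bedrock-runtime")
    body = json.dumps({
        "inputText": build_prompt(question, entities),
        "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
    })
    result = bedrock.invoke_model(
        modelId="amazon.titan-text-express-v1", body=body
    )
    answer = json.loads(result["body"].read())["results"][0]["outputText"]

    # 4. Hand the answer back through Lex
    return close_response(intent_name, answer)
```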

Project Structure

```
.
├── lambda_function.py     # Core logic: Comprehend → Bedrock → Lex response
├── bedrock-policy.json    # Least-privilege IAM policy for Bedrock access
├── trust-policy.json      # Lambda execution role trust policy
├── commands.txt           # Full AWS CLI deployment commands with outputs
├── arch.txt               # Architecture decision notes
└── lex_bot_v1.png         # Bot configuration screenshot
```

Deployment

All infrastructure was deployed via AWS CLI (documented in commands.txt):

```bash
# 1. Create IAM role with Lambda trust policy
aws iam create-role --role-name LambdaExecRole \
  --assume-role-policy-document file://trust-policy.json

# 2. Attach required policies
aws iam attach-role-policy --role-name LambdaExecRole \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
aws iam attach-role-policy --role-name LambdaExecRole \
  --policy-arn arn:aws:iam::aws:policy/ComprehendReadOnly

# 3. Deploy Lambda function
aws lambda create-function --function-name LexChatbotFunction \
  --runtime python3.11 --role arn:aws:iam::<ACCOUNT_ID>:role/LambdaExecRole \
  --handler lambda_function.lambda_handler \
  --zip-file fileb://lambda_function.zip --timeout 30

# 4. Configure Lex V2 bot via Console
#    - Create bot with FallbackIntent
#    - Attach Lambda for fulfillment
#    - Build and test
```
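The Lex-to-Lambda invoke grant from step 4 can also be scripted. A sketch, assuming the bot and function share an account; the statement ID and the region, account, bot, and alias placeholders are illustrative:

```bash
# Allow the Lex V2 service principal to invoke the fulfillment function
aws lambda add-permission --function-name LexChatbotFunction \
  --statement-id lex-invoke \
  --action lambda:InvokeFunction \
  --principal lexv2.amazonaws.com \
  --source-arn arn:aws:lex:<REGION>:<ACCOUNT_ID>:bot-alias/<BOT_ID>/<ALIAS_ID>
```

Scoping `--source-arn` to the specific bot alias keeps the grant from applying to any other Lex bot in the account.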

Security Design

  • Least-privilege IAM: Bedrock policy scoped to specific model ARNs only (titan-text-express-v1)
  • No hardcoded credentials: Lambda uses IAM role, resume context via environment variable
  • Lex V2 permissions: Explicit lambda:InvokeFunction grant scoped to Lex service principal
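A policy scoped the way the first bullet describes would look roughly like the following. This is an illustrative sketch, not necessarily the exact contents of the repo's `bedrock-policy.json`; the statement ID and region placeholder are assumptions, while the foundation-model ARN format is AWS's standard one:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockInvokeTitanOnly",
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:<REGION>::foundation-model/amazon.titan-text-express-v1"
    }
  ]
}
```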

Tech Stack

| Component | Technology | Purpose |
|---|---|---|
| Chatbot | Amazon Lex V2 | Intent recognition and conversation flow |
| Compute | AWS Lambda (Python 3.11) | Serverless orchestration logic |
| NLP | Amazon Comprehend | Entity detection in user messages |
| GenAI | Amazon Bedrock (Titan Text Express) | Context-aware response generation |
| Security | IAM Roles + Policies | Least-privilege access control |
| Logging | CloudWatch Logs | Debugging and monitoring |
