🍷 Sophisticated, persona-driven AI assistant built with Gemini 3 Flash. Features stateless history management, dynamic context injection, and modular prompt engineering

🍷 Giulia: Professional Executive AI Assistant

Enterprise-Grade Architecture | Model-Agnostic Design | Advanced Steerability

Raptile Bytez Logo
Giulia AI Cyber Jungle Banner

⚡ Quick Summary (Recruiter View)

  • What it is: A robust AI framework demonstrating high-precision persona control and model-agnostic architecture.
  • Core Skills: Advanced Python, LLM-Ops, System Integration, and Adapter-Pattern Architecture.
  • Seniority: Built by a PLM Expert with 15+ years of Enterprise experience, bridging the gap between legacy systems and AI.

📋 Executive Summary

Giulia is a technical reference implementation for transitioning Enterprise PLM expertise into AI Engineering. This project solves the "Vendor Lock-in" problem by using a decoupled architecture, allowing businesses to switch between AI providers (Gemini, OpenAI, Llama) without rewriting their core logic.

Business Relevance: Companies need AI that follows strict corporate guidelines. Giulia demonstrates that LLMs can be steered to adhere strictly to bilingual protocols and hard length constraints.


πŸ› οΈ Skills Demonstrated

  • System Architecture: Provider & Adapter Patterns for model-agnosticism.
  • Data Persistence: Normalized, vendor-neutral JSON state management.
  • AI Steerability: Precise control of tone, language, and logic through structured prompting.
  • Modern Tooling: High-performance dependency management via uv.
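The README describes the persisted state as normalized, vendor-neutral JSON but does not show the schema. A minimal sketch of what one such normalized record could look like (the field names and the `normalize_turn` helper are assumptions for illustration, not the project's actual schema):

```python
import json
from datetime import datetime, timezone


def normalize_turn(role: str, text: str, provider: str) -> dict:
    """Map a vendor-specific message onto one neutral, auditable record."""
    return {
        "role": role,              # "user" or "assistant", regardless of vendor naming
        "content": text,
        "provider": provider,      # e.g. "gemini" or "mock"
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }


# Every record serializes to plain JSON, so history files stay provider-agnostic.
record = normalize_turn("user", "Ciao, Giulia!", "gemini")
print(json.dumps(record, indent=2))
```

Because the record contains only plain JSON types, any provider's response can be flattened into it and replayed against a different provider later.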

🧠 Engineering Highlights

  • Decoupled AI Core: Implementation of an AIModelInterface. Switch LLM providers via CLI without touching the orchestrator logic.
  • Universal History Manager: Automatically normalizes vendor-specific data (e.g., Gemini Content Objects) into a standard schema for long-term auditability.
  • Hierarchical Prompting: Managed asset structure (core, tasks, library) to optimize LLM performance while keeping code clean.
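The highlights above name an `AIModelInterface` without showing it. A minimal sketch of how such a decoupled core could be wired, assuming an abstract base class plus a registry keyed by CLI flag (the method names and the `build_provider` helper are assumptions):

```python
from abc import ABC, abstractmethod


class AIModelInterface(ABC):
    """Abstract contract every provider adapter must fulfil."""

    @abstractmethod
    def generate(self, prompt: str, history: list[dict]) -> str:
        """Return the model's reply for the given prompt and history."""


class MockProvider(AIModelInterface):
    """Zero-cost stand-in used for development and testing."""

    def generate(self, prompt: str, history: list[dict]) -> str:
        return f"[mock] echo: {prompt}"


def build_provider(name: str) -> AIModelInterface:
    """Map a CLI flag to a concrete adapter; the orchestrator never changes."""
    providers = {"mock": MockProvider}
    return providers[name]()


bot = build_provider("mock")
print(bot.generate("Ciao, Giulia!", history=[]))
```

A real Gemini adapter would be a second entry in the `providers` dict, keeping the switch a one-line configuration change rather than a code rewrite.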

🎭 Persona Philosophy: Professional Context-Awareness

Instead of a generic chatbot, Giulia acts as a High-Context Executive Assistant:

  • Steerability Proof: Demonstrates how to maintain a sophisticated, charismatic, yet strictly professional tone over long conversations.
  • Operational Constraints: Hard-enforced 60-word reply limits and automatic bilingual language detection.
  • Enterprise Identity: She is programmed to recognize and augment the user's specific background in PLM and System Architecture.
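The repository does not show how the 60-word cap is enforced; one simple option is a post-processing guard applied to every reply. A minimal sketch under that assumption (the function name and truncation strategy are illustrative, not the project's actual mechanism):

```python
def enforce_word_limit(reply: str, limit: int = 60) -> str:
    """Truncate a model reply to the persona's hard word budget."""
    words = reply.split()
    if len(words) <= limit:
        return reply
    return " ".join(words[:limit])
```

Enforcing the limit in code, rather than trusting the prompt alone, turns a soft behavioral instruction into a guarantee.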

🚀 Getting Started

Installation

  1. Install uv: curl -LsSf https://astral.sh/uv/install.sh | sh
  2. Setup Environment:
    git clone https://github.com/RaptileBytez/giulia-ai.git
    cd giulia-ai
    uv sync
    echo "GEMINI_API_KEY=your_key_here" > .env

Running Giulia

uv run main.py              # Launch Standard Session
uv run main.py --mock       # Developer Test Mode (Zero Cost)

📂 Project Structure

├── data/
│   ├── chat_history/           # Session JSON files (Git-ignored)
│   └── logs/                   # Application and API logs (Git-ignored)
├── prompts/                    # New Hierarchical Structure
│   ├── core/                   # Identity (Giulia persona, wrappers)
│   ├── tasks/                  # Active production prompts (categorized)
│   └── library/                # Research & model-optimized assets (OpenAI, etc.)
├── utils/
│   ├── ai/                     # AI Core Subpackage
│   │   ├── __init__.py         # Central exports
│   │   ├── model_interface.py  # Abstract base classes
│   │   ├── model_provider.py   # Gemini & Mock implementations
│   │   ├── prompt_loader.py    # Path-based templating engine
│   │   └── history_manager.py  # Persistence logic
│   └── logger.py               # Unified logging system
├── chatbot.py                  # Refactored Orchestrator using Interface
└── main.py                     # Entry point with argparse support
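The tree lists prompt_loader.py as a "path-based templating engine". Its internals are not shown in the README; a minimal sketch of how such a loader could work, using Python's standard-library `string.Template` (the function names, `$`-placeholder syntax, and `base_dir` parameter are assumptions):

```python
from pathlib import Path
from string import Template


def render_prompt(template_text: str, **variables: str) -> str:
    """Fill $-style placeholders; unknown placeholders are left intact."""
    return Template(template_text).safe_substitute(variables)


def load_prompt(base_dir: Path, relative_path: str, **variables: str) -> str:
    """Resolve a prompt asset (e.g. under prompts/core/) and render it."""
    text = (base_dir / relative_path).read_text(encoding="utf-8")
    return render_prompt(text, **variables)


print(render_prompt("You are $name, an executive assistant.", name="Giulia"))
```

Using `safe_substitute` means a prompt file with a stray `$` never crashes the loader; missing variables simply pass through for later inspection.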

👤 About the Author: Jesco Wurm

Behind the creative label Raptile Bytez is a seasoned Enterprise Systems Expert and PLM Consultant with 15+ years of experience in the Oracle Agile e6 ecosystem.

With a degree in Business Information Systems (2009), I focus on bringing "Industrial-Grade" stability to the world of AI Engineering. I build systems that don't just "talk," but integrate into complex professional workflows.

πŸ› οΈ Tech Expertise & Interests

  • Enterprise: Oracle Agile e6, PLM Architecture, System Integration.
  • AI & Automation: LLM Orchestration (Gemini, OpenAI), Prompt Engineering, Python.
  • Philosophy: Clean Code, Modular Design, and Stateless Architecture.

🤝 Let's Connect

I am currently building my network in the AI space. Whether you are an AI enthusiast, a fellow developer, or a recruiter looking for a consultant with both business logic and AI-coding skills: let's connect!

LinkedIn GitHub Follow


βš–οΈ License

MIT License - See LICENSE for details.
