An intelligent interview simulation system powered by LLMs that automatically generates relevant questions based on a candidate's CV, conducts interviews, and handles clarifications when needed.
This project implements an automated interview agent using LangChain and LangGraph. The system analyzes a candidate's CV to extract keywords, generates tailored interview questions, and conducts a natural conversation flow with clarification capabilities.
- Keyword Extraction: Automatically identifies key skills and experiences from CV text
- Dynamic Question Generation: Creates relevant technical questions based on extracted keywords
- Interactive Interview Flow: Simulates a real interview experience with follow-up questions
- Clarification Handling: Intelligently responds when candidates ask for clarification
- Conversation Management: Maintains context throughout the interview process
The system is built as a state machine using LangGraph with the following components:
- State Management: Tracks CV data, questions, answers, and conversation history
- Interviewer Agent: Handles keyword extraction, question generation, and conversation flow
- Decision Logic: Determines whether to continue asking questions, provide clarification, or end the interview
A visualization of the graph workflow is generated as `graph.png` when running the application.
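The state and decision logic described above can be sketched in plain Python (illustrative only; the actual project wires these steps as LangGraph nodes, and the routing rules shown here are assumptions):

```python
from typing import List, TypedDict

class InterviewState(TypedDict):
    # Mirrors the tracked state: CV data, questions, answers, history
    cv_text: str
    keywords: List[str]
    questions: List[str]
    answers: List[str]
    history: List[str]

def decide(state: InterviewState, last_answer: str, max_questions: int = 5) -> str:
    """Route the next step: clarify, continue, or end (simplified rules)."""
    if "?" in last_answer:
        # Candidate asked something back, so provide clarification
        return "clarify"
    if len(state["answers"]) >= max_questions:
        # Question budget exhausted, wrap up the interview
        return "end"
    return "continue"
```

In the real graph this function would back a conditional edge, with each branch pointing at the corresponding node.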
- Clone the repository
- Create a virtual environment:
```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```
- Install dependencies:
```bash
pip install -r requirements.txt
```
- Set up your environment variables in a `.env` file
- Ensure you have Ollama installed and the required LLM model (`llama3.2:1b`) available
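If the model is not already present locally, it can be downloaded with Ollama's CLI:

```shell
# Fetch the model used by the agent, then confirm it appears locally
ollama pull llama3.2:1b
ollama list
```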
- Create prompt files in the `prompts/` directory: `clarification_node.txt`, `decider_node.txt`, `fetch_keyword.txt`, `generate_question.txt`
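As an illustration, a keyword-extraction prompt such as `fetch_keyword.txt` might look like the following (hypothetical content, including the `{cv_text}` placeholder name; adapt it to your needs):

```text
Extract the key technical skills and experiences from the following CV.
Return them as a comma-separated list of keywords, nothing else.

CV:
{cv_text}
```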
- Run the application:
```bash
python main.py
```
You can customize the interview agent by:
- Modifying the CV data in `main.py`
- Adjusting the prompt templates in the `prompts/` directory
- Changing the LLM model in the `Interviewer` initialization
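For example, swapping the model becomes a one-line change if the `Interviewer` accepts a model name at construction (the constructor shape below is an assumption, not the project's actual signature):

```python
# Hypothetical constructor shape; the real Interviewer class may differ.
class Interviewer:
    def __init__(self, model: str = "llama3.2:1b"):
        # The model name would be passed to the Ollama chat client internally
        self.model = model

# Swap in a larger model for better answers, at the cost of speed and memory
interviewer = Interviewer(model="llama3.1:8b")
```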
- Python 3.8+
- LangChain and LangGraph libraries
- Ollama for local LLM inference
- Additional dependencies listed in `requirements.txt`
This project is open source and available under the MIT License.