This project provides a web-based frontend to convert natural language queries into Robot Framework test cases using a Large Language Model (LLM) like Google's Gemini.
- `frontend/`: Contains the HTML, CSS, and JavaScript for the user interface.
- `backend/`: Contains the Python Flask server that handles requests, interacts with the LLM, and generates Robot Framework code.
- `backend/documentation/`: Contains curated snippets of Robot Framework documentation used to provide context to the LLM.
- Node.js and npm (for the frontend)
- Python 3.x and pip (for the backend)
- Access to the Google Gemini API and an API key (for live LLM integration; currently mocked)
- Navigate to the `backend` directory:

  ```bash
  cd backend
  ```

- Create and activate a Python virtual environment:

  ```bash
  python3 -m venv .venv
  source .venv/bin/activate  # On Windows use: .venv\Scripts\activate
  ```

- Install Python dependencies:

  ```bash
  pip install -r requirements.txt
  ```
To enable interaction with the Google Gemini API, set your API key as an environment variable. This is required for the backend to initialize correctly.
Replace `"YOUR_API_KEY"` with your actual key:

```bash
export GOOGLE_API_KEY="YOUR_API_KEY"  # On Windows use: set GOOGLE_API_KEY="YOUR_API_KEY"
```
Note: If this key is not set, the application will print a warning and API calls will likely fail.
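As a sketch of that warning behavior, a startup check along the following lines can surface a missing key early. The function name is illustrative, not the project's actual code:

```python
import os

def check_api_key() -> bool:
    """Warn at startup if GOOGLE_API_KEY is missing (illustrative sketch)."""
    if not os.environ.get("GOOGLE_API_KEY"):
        print("WARNING: GOOGLE_API_KEY is not set; Gemini API calls will fail.")
        return False
    return True
```

Checking once at startup gives a clear message instead of an opaque authentication error on the first request.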
- Run the Flask development server:

  ```bash
  python app.py
  ```

  The backend server will start on `http://localhost:5000`.
- In a new terminal window/tab, navigate to the `frontend` directory:

  ```bash
  cd frontend
  ```

- Install frontend development dependencies (including TypeScript):

  ```bash
  npm install
  ```

- Compile the TypeScript code:

  ```bash
  npm run build
  ```

  This will compile `src/script.ts` to `dist/script.js`, which is used by `index.html`.
1. Ensure the backend Flask server is running (see Backend Setup, make sure `GOOGLE_API_KEY` is set).
2. Ensure you have built the frontend TypeScript: navigate to the `frontend` directory and run `npm run build` if you haven't already, or if you have made changes to `src/script.ts`.
3. Open the `frontend/index.html` file directly in your web browser.
4. Type your natural language query into the text area (e.g., "Create a test that logs 'Hello World' to the console").
5. Click the "Generate Test" button.
- The generated Robot Framework code (currently mocked) will appear in the output area.
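To exercise the backend without the UI, a request to the `/generate-test` endpoint can be constructed as below. Note this is a sketch: the `{"query": ...}` payload shape is an assumption based on this README, not confirmed by the backend code, so check `backend/app.py` before relying on it.

```python
import json
import urllib.request

API_URL = "http://localhost:5000/generate-test"

def build_generate_request(query: str) -> urllib.request.Request:
    # NOTE: the {"query": ...} field name is assumed, not taken from the
    # actual backend implementation.
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send it (requires the Flask server to be running):
# with urllib.request.urlopen(build_generate_request("Log 'Hello World'")) as resp:
#     print(resp.read().decode("utf-8"))
```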
- The user enters a natural language query in the frontend.
- The frontend sends this query to the backend API endpoint (`/generate-test`).
- The backend Flask application receives the query.
- It constructs a prompt for the Gemini LLM, including the user's query and relevant snippets from the Robot Framework documentation (from `backend/documentation/`).
- (Currently Mocked) The backend would send this prompt to the Gemini API.
- (Currently Mocked) Gemini processes the prompt and returns generated Robot Framework code.
- The backend sends this generated code back to the frontend.
- The frontend displays the code to the user.
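The flow above can be sketched in plain Python. The function names are illustrative, and the LLM call is mocked, mirroring the project's current state:

```python
def build_prompt(query: str, doc_snippets: list[str]) -> str:
    """Combine the user's query with documentation context for the LLM."""
    context = "\n\n".join(doc_snippets)
    return (
        "You are an expert in Robot Framework.\n\n"
        f"Relevant documentation:\n{context}\n\n"
        f"User request: {query}\n\n"
        "Return only valid Robot Framework code."
    )

def call_llm(prompt: str) -> str:
    """Mocked stand-in for the Gemini API call."""
    return (
        "*** Test Cases ***\n"
        "Generated Test\n"
        "    Log    Hello World\n"
    )

def generate_test(query: str, doc_snippets: list[str]) -> str:
    return call_llm(build_prompt(query, doc_snippets))
```

Keeping prompt construction separate from the LLM call makes it easy to swap the mock for a real Gemini client later.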
- Integrate with a live Gemini API.
- Improve Robot Framework documentation context (e.g., dynamic retrieval, vector database).
- More sophisticated error handling and user feedback.
- Add options for different Robot Framework output formats or settings.