✅ Add Test Suite for Core Modules (No Token Usage)
Overview
Set up a test framework using pytest to ensure core system components behave as expected without relying on OpenAI API calls. All model interactions (e.g. GPT completions and embeddings) will be mocked for speed, determinism, and zero token usage.
📋 Plan of Action
1. Create /tests Directory
Structure tests by module:
/tests
    test_indexing.py
    test_query_zotero.py
    test_query_pubmed.py
    test_prompts.py
    test_database.py
    test_synthesis.py
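A shared tests/conftest.py can hold fixtures used across these modules. A minimal sketch, assuming a throwaway-SQLite approach for the database tests (the fixture name and approach are hypothetical, not existing project code):

# tests/conftest.py
import sqlite3

import pytest

@pytest.fixture
def temp_db(tmp_path):
    # Throwaway SQLite database so tests never touch real log data (hypothetical fixture).
    conn = sqlite3.connect(tmp_path / "test.db")
    yield conn
    conn.close()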
2. Add Tests for Deterministic Components
- ✅ BibTeX parsing and metadata structure
- ✅ FAISS index and retrieval dimensions
- ✅ Prompt template loading and placeholders
- ✅ SQLite logging functions
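For illustration, a deterministic FAISS check might look like the sketch below; the 1536-dimension value is an assumption and should match the embedding model the project actually uses:

# tests/test_indexing.py (sketch)
import faiss
import numpy as np

def test_faiss_index_and_retrieval_dimensions():
    dim = 1536  # assumed embedding size; match the project's embedding model
    index = faiss.IndexFlatL2(dim)
    vectors = np.random.default_rng(0).random((10, dim), dtype=np.float32)
    index.add(vectors)
    assert index.ntotal == 10
    distances, ids = index.search(vectors[:1], 3)
    assert distances.shape == (1, 3)
    assert ids.shape == (1, 3)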
3. Mock GPT Calls
Use unittest.mock or pytest's built-in monkeypatch fixture to override (see the sketch at the end of this step):
client.chat.completions.create
client.embeddings.create
This ensures tests:
- Use no tokens
- Run offline
- Are suitable for CI/CD (e.g., GitHub Actions)
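A minimal sketch of the mocking approach, assuming a hypothetical synthesis module that exposes a module-level OpenAI client and a summarize() helper (adjust the names to the real code):

# tests/test_synthesis.py (sketch)
from types import SimpleNamespace

def fake_chat_create(**kwargs):
    # Mimic the shape of an OpenAI chat completion response.
    message = SimpleNamespace(content="mocked summary")
    return SimpleNamespace(choices=[SimpleNamespace(message=message)])

def fake_embeddings_create(**kwargs):
    # Mimic the shape of an OpenAI embeddings response.
    return SimpleNamespace(data=[SimpleNamespace(embedding=[0.0] * 1536)])

def test_summarize_without_tokens(monkeypatch):
    import synthesis  # hypothetical project module

    monkeypatch.setattr(synthesis.client.chat.completions, "create", fake_chat_create)
    monkeypatch.setattr(synthesis.client.embeddings, "create", fake_embeddings_create)

    assert synthesis.summarize("some abstract text") == "mocked summary"

This keeps the tests offline and token-free while still exercising the project's own parsing and assembly logic.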
4. Add pytest to Dev Requirements
If not already present:
# environment.yaml or requirements-dev.txt
- pytest
5. Optional: Add CI Hook
Later, set up GitHub Actions to run tests on push or PR:
# .github/workflows/test.yml
name: tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -el {0}  # login shell so the conda environment is activated
    steps:
      - uses: actions/checkout@v2
      - uses: conda-incubator/setup-miniconda@v2
        with:
          environment-file: environment.yaml
      - run: pytest tests/