A stateless microservice that provides a unified API for querying Bible verses, performing word searches, and interacting with LLMs for biblical context. Designed for serverless platforms like Google Cloud Run.
- Verse Retrieval: Fetch verses by reference (e.g., John 3:16) with formatting preserved.
- Word Search: Find verses by keywords.
- LLM Integration: Ask questions or provide instructions (e.g., "Summarize", "Cross-reference") using various LLM providers (OpenAI, Gemini, DeepSeek, OpenRouter, custom OpenAI-compatible endpoints).
- Smart Routing: Routes queries based on whether they are verse lookups, word searches, or LLM prompts.
- Feature Flags: Dynamic configuration via GitHub-hosted feature flags.
For detailed API documentation, see the OpenAPI specification.
- Clone the repository:

  ```sh
  git clone https://github.com/julwrites/BibleAIAPI.git
  cd BibleAIAPI
  ```

- Set up Environment Variables: Create a `.env` file or export these variables.

  - `API_KEYS`: JSON string for local auth (e.g., `{"local": "secret"}`).
  - `LLM_CONFIG`: JSON object mapping provider names to model names (e.g., `{"deepseek": "deepseek-chat", "openai": "gpt-4o", "gemini": "gemini-1.5-pro", "openrouter": "x-ai/grok-4.1-fast"}`). If not set, falls back to `LLM_PROVIDERS` (deprecated).
  - `OPENAI_API_KEY`: Required if using OpenAI.
  - `GEMINI_API_KEY`: Required if using Gemini.
  - `DEEPSEEK_API_KEY`: Required if using DeepSeek.
  - `OPENROUTER_API_KEY`: Required if using OpenRouter.
  - `OPENAI_CUSTOM_API_KEY` and `OPENAI_CUSTOM_BASE_URL`: Required if using a custom OpenAI-compatible endpoint.
  - `GCP_PROJECT_ID`: (Optional) Required if you want to test Google Secret Manager integration; otherwise, the service falls back to environment variables.
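Concretely, a minimal `.env` for local development might look like the sketch below. Every value is a placeholder, and the model names are only examples; substitute your own keys and whichever providers you configure.

```shell
# Minimal local .env sketch — all values are placeholders, not real credentials.
API_KEYS='{"local": "secret"}'
LLM_CONFIG='{"openai": "gpt-4o", "gemini": "gemini-1.5-pro"}'

# Only the API keys for providers listed in LLM_CONFIG are needed.
OPENAI_API_KEY=replace-with-your-openai-key
GEMINI_API_KEY=replace-with-your-gemini-key
```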
- Run the service:

  ```sh
  go run cmd/server/main.go
  ```

  The server starts on port `8080`.
- Test a Request:

  ```sh
  curl -X POST http://localhost:8080/query \
    -H "X-API-KEY: secret" \
    -d '{"query": {"verses": ["John 3:16"]}}'
  ```

  Note: For verse queries, the `context` object is not allowed.

  LLM Prompt Request:

  ```sh
  curl -X POST http://localhost:8080/query \
    -H "X-API-KEY: secret" \
    -d '{"query": {"prompt": "Explain this verse"}, "context": {"verses": ["John 3:16"], "user": {"version": "ESV"}}}'
  ```

  LLM Prompt Request with Word Search Context:

  ```sh
  curl -X POST http://localhost:8080/query \
    -H "X-API-KEY: secret" \
    -d '{"query": {"prompt": "Summarize the verses containing this word"}, "context": {"words": ["Grace"], "user": {"version": "ESV"}}}'
  ```
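For clients written in Go, the request bodies used in the curl examples above can be built with typed structs instead of hand-written JSON. This is a sketch only: the struct shapes and field names are inferred from the curl examples in this README, not taken from the actual API schema.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Query mirrors the "query" object from the curl examples:
// exactly one of Verses, Words, or Prompt is expected to be set.
type Query struct {
	Verses []string `json:"verses,omitempty"`
	Words  []string `json:"words,omitempty"`
	Prompt string   `json:"prompt,omitempty"`
}

// User carries per-user settings such as the preferred Bible version.
type User struct {
	Version string `json:"version,omitempty"`
}

// Context mirrors the optional "context" object (disallowed for verse queries).
type Context struct {
	Verses []string `json:"verses,omitempty"`
	Words  []string `json:"words,omitempty"`
	User   *User    `json:"user,omitempty"`
}

// Request is the full POST /query body.
type Request struct {
	Query   Query    `json:"query"`
	Context *Context `json:"context,omitempty"`
}

// buildVerseRequest marshals a verse lookup (no context allowed).
func buildVerseRequest(verses []string) ([]byte, error) {
	return json.Marshal(Request{Query: Query{Verses: verses}})
}

// buildPromptRequest marshals an LLM prompt with verse context.
func buildPromptRequest(prompt string, verses []string, version string) ([]byte, error) {
	return json.Marshal(Request{
		Query:   Query{Prompt: prompt},
		Context: &Context{Verses: verses, User: &User{Version: version}},
	})
}

func main() {
	verse, _ := buildVerseRequest([]string{"John 3:16"})
	prompt, _ := buildPromptRequest("Explain this verse", []string{"John 3:16"}, "ESV")
	fmt.Println(string(verse))
	fmt.Println(string(prompt))
}
```

The `omitempty` tags keep unused fields out of the payload, so a verse request carries no `context` object at all, matching the note above.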
You can use the provided script to automatically build, run, and verify the API service:
```sh
./scripts/test_api.sh
```

This script will:

- Build the server binary.
- Start the server in the background (configuring a dummy `API_KEYS` if needed).
- Run `curl` requests for Verse lookup (Prose & Poetry) and Word search.
- Skip LLM tests if the relevant API keys (`OPENAI_API_KEY`, etc.) are not found in the environment.
```sh
docker build -t bible-api-service .
docker run -p 8080:8080 --env-file .env bible-api-service
```

- Feature Flags: Managed via `go-feature-flag`. The service retrieves flags from the GitHub repository by default, falling back to `configs/flags.yaml` locally.
- Secrets: The service attempts to fetch secrets from Google Secret Manager. If unavailable (e.g., local dev), it falls back to environment variables.
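As a rough illustration, a local `configs/flags.yaml` entry in go-feature-flag's flag format could look like the following. The flag name `enable-llm` is hypothetical, not a flag this service actually defines.

```yaml
# Hypothetical boolean flag in go-feature-flag format.
enable-llm:
  variations:
    enabled: true
    disabled: false
  defaultRule:
    variation: enabled
```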
This project uses a strict task documentation system. All work is tracked in docs/tasks/.
- View Tasks: Use `./scripts/tasks list`
- Pick a Task: Use `./scripts/tasks next`
- Create a Task: Use `./scripts/tasks create`
See AGENTS.md and docs/tasks/GUIDE.md for details.
- `cmd/server`: Main entry point.
- `internal/biblegateway`: Scraper logic (Prose/Poetry handling).
- `internal/llm`: LLM provider implementations.
- `internal/handlers`: Request routing and processing.
- `internal/secrets`: Secret management (GSM/Env).
- `docs`: Architecture, deployment, and task documentation.
- `docs/tasks`: The source of truth for all ongoing work and task history.
- `scripts`: Utility scripts, including the Task Manager (`scripts/tasks`).