An API-backed system that allows registered users to submit neologisms, which are then evaluated by multiple LLM providers and reviewed for conflicts.
- ✅ User registration and authentication
- ✅ Neologism submission and management
- ✅ Integration with multiple LLM providers (OpenAI, Anthropic, Google)
- ✅ Automated conflict detection and resolution requests
- ✅ JSON-templated response format for word definitions
- ✅ SQLite database with proper schema
- ✅ Token-based authentication
- ✅ RESTful API design
Start the server:

```shell
python neologe_server.py
```

Try the example client:

```shell
python example_client.py
```

Run the tests:

```shell
python test_api.py
```
- `POST /auth/register` - Register a new user
- `POST /auth/login` - User login
- `POST /neologisms/` - Submit a new neologism
- `GET /neologisms/` - List the user's neologisms
- `GET /neologisms/{id}` - Get neologism details
- `POST /neologisms/{id}/resolve` - Resolve conflicts for a neologism
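All endpoints accept and return JSON. As a standard-library sketch of how a request to them can be assembled (the `Bearer` header scheme and the field names are assumptions, not confirmed by the repo):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # default server address used in the examples below

def make_request(path, payload, token=None):
    """Build a JSON POST request; pass the result to urllib.request.urlopen()."""
    headers = {"Content-Type": "application/json"}
    if token:
        # Header scheme is an assumption; check API_DOCUMENTATION.md for the real one.
        headers["Authorization"] = f"Bearer {token}"
    return urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# Example (requires a running server):
# resp = urllib.request.urlopen(make_request("/auth/register",
#     {"username": "wordsmith", "email": "user@example.com", "password": "password"}))
```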
1. User submits a neologism with a definition and optional context
2. Three LLM providers (OpenAI, Anthropic, Google) analyze the word independently
3. Each provider responds with a standardized JSON format containing:
   - Definition, part of speech, and etymology
   - Word variations (plural, adjective forms, etc.)
   - Usage examples and a confidence score
4. A single LLM evaluates the three responses for conflicts
5. If conflicts exist, the user is notified and can resolve them
6. The final status is updated based on the evaluation results
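The conflict check in the last steps could look roughly like this heuristic (illustrative only; the actual system delegates the comparison to an LLM, and the 0.15 tolerance is an arbitrary choice):

```python
def find_conflicts(responses, tolerance=0.15):
    """Compare standardized provider responses and list disagreements.

    `responses` is a list of dicts in the standardized JSON format.
    """
    conflicts = []
    # Disagreement on the basic grammatical category is always a conflict.
    parts = {r["part_of_speech"] for r in responses}
    if len(parts) > 1:
        conflicts.append(f"part_of_speech disagreement: {sorted(parts)}")
    # A wide spread in confidence suggests the providers see the word differently.
    scores = [r["confidence"] for r in responses]
    if max(scores) - min(scores) > tolerance:
        conflicts.append("confidence scores diverge beyond tolerance")
    return conflicts
```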
```json
{
  "word": "technophilic",
  "definition": "Having a strong affinity for technology",
  "part_of_speech": "adjective",
  "etymology": "From Greek 'techno-' (art, skill) + 'philic' (loving)",
  "variations": {
    "noun": "technophile",
    "adverb": "technophilically"
  },
  "usage_examples": ["She has a technophilic approach to problem-solving."],
  "confidence": 0.92
}
```

```python
from neologe_client import NeologeClient

# Initialize client
client = NeologeClient("http://localhost:8000")

# Register and log in
client.register("wordsmith", "user@example.com", "password")
client.login("wordsmith", "password")

# Submit a neologism
result = client.submit_neologism(
    word="digitality",
    definition="The quality or state of being digital",
    context="Used in discussions about digital transformation"
)

# Check the status and resolve conflicts if needed
if result['status'] == 'conflict':
    client.resolve_conflict(result['id'], "accept_consensus")
```

- See API_DOCUMENTATION.md for detailed API documentation
- Run the server and visit `http://localhost:8000/` for an endpoint reference
- Backend: Python with built-in libraries (http.server, sqlite3, json)
- Database: SQLite for development (easily replaceable with PostgreSQL/MySQL)
- Authentication: HMAC-based tokens with password hashing
- LLM Integration: HTTP clients for OpenAI, Anthropic, and Google APIs
- Conflict Resolution: Automated analysis with user override capabilities
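The HMAC-based token scheme mentioned above might be sketched like this (the token layout, secret handling, and TTL are assumptions, not the repo's actual implementation):

```python
import hashlib
import hmac
import time

SECRET = b"change-me"  # server-side secret; placeholder, not from the repo

def issue_token(username, ttl=3600):
    """Create a signed token of the assumed form user:expiry:signature."""
    expiry = str(int(time.time()) + ttl)
    msg = f"{username}:{expiry}".encode("utf-8")
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{username}:{expiry}:{sig}"

def verify_token(token):
    """Return the username if the signature is valid and unexpired, else None."""
    try:
        username, expiry, sig = token.rsplit(":", 2)
    except ValueError:
        return None
    msg = f"{username}:{expiry}".encode("utf-8")
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking the signature via timing differences.
    if hmac.compare_digest(sig, expected) and int(expiry) > time.time():
        return username
    return None
```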
- `pending` - Initial submission
- `evaluated` - LLM analysis complete, no conflicts
- `conflict` - Conflicts detected, user resolution required
- `resolved` - User has resolved conflicts
- `llm_error` - Error during LLM processing
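These statuses form a small state machine; a sketch of the transitions implied by the lifecycle (whether `llm_error` is retryable is not specified, so it is treated as terminal here):

```python
# Allowed status transitions inferred from the lifecycle description.
TRANSITIONS = {
    "pending": {"evaluated", "conflict", "llm_error"},
    "conflict": {"resolved"},
    "evaluated": set(),   # terminal
    "resolved": set(),    # terminal
    "llm_error": set(),   # retry behavior unspecified; assumed terminal
}

def can_transition(current, new):
    """Return True if moving from `current` to `new` is an allowed transition."""
    return new in TRANSITIONS.get(current, set())
```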
This implementation uses Python's built-in libraries for maximum compatibility. For production use, consider:
- Replace `http.server` with a production-grade application server (Gunicorn for WSGI, Uvicorn for ASGI)
- Use a production database (PostgreSQL, MySQL)
- Configure real LLM API keys in the `.env` file
- Add rate limiting, logging, and monitoring
- Implement proper CORS and security headers