This is an open source web interface for use with LM Studio's server API when running local models.
- Multiple chats
- Automatic chat naming
- Markdown rendering for code
- Model details
- Remote and local access detection
- Stop-generating button
- Animations
- And more
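As a rough sketch of how the interface talks to LM Studio, the snippet below builds a request for LM Studio's OpenAI-compatible chat endpoint. The port `1234` and the `/v1/chat/completions` path are LM Studio's local-server defaults, not values taken from this project; adjust them to your own setup, and treat `buildChatRequest` as a hypothetical helper name.

```javascript
// Hypothetical helper: builds a chat request for LM Studio's
// OpenAI-compatible endpoint. Port 1234 is LM Studio's default,
// not necessarily this project's configured value.
const LOCAL_API_URL = "http://localhost:1234"; // placeholder

function buildChatRequest(prompt) {
  return {
    url: `${LOCAL_API_URL}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        messages: [{ role: "user", content: prompt }],
        stream: false, // streaming enables features like a stop button
      }),
    },
  };
}

// Sending it is then a single call:
// const res = await fetch(req.url, req.options);
```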
You must assign your local and remote IPs to `LOCAL_API_URL` and `PUBLIC_API_URL`.
Each is marked with a `your-[local/remote]-ip-goes-here` placeholder.
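Filling in the placeholders might look like the following; the IP addresses and port here are illustrative only, substitute your machine's actual addresses:

```javascript
// Example values only -- replace with your own addresses and port.
const LOCAL_API_URL  = "http://192.168.1.42:1234"; // was: your-local-ip-goes-here
const PUBLIC_API_URL = "http://203.0.113.7:1234";  // was: your-remote-ip-goes-here
```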
Run a Python server, or simply use `RunServer.bat`, then browse to your local or public IP followed by the port.
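On systems without the batch file, one way to serve the project is Python's built-in HTTP server, roughly as sketched below. The port `8000` is an assumption; match whatever `RunServer.bat` uses in your setup.

```python
# Minimal sketch: serve the project directory over HTTP.
# Port 8000 is an assumption, not taken from RunServer.bat.
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(port=8000):
    # Bind on all interfaces so both the local and the public IP work.
    return HTTPServer(("0.0.0.0", port), SimpleHTTPRequestHandler)

if __name__ == "__main__":
    server = make_server()
    print(f"Serving on port {server.server_address[1]}")
    server.serve_forever()
```

The equivalent one-liner is `python -m http.server 8000` run from the project directory.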
This project is structured so that setting up Firebase or other services is easy.