E-Mail Analyzer with Task Extraction from the Source Mail and Non-Literal Language Detection

Now with an OpenAI cloud-based mode.

You can skip the setup for the local LLM if you only want to use the cloud-based mode; the local mode will then not work.
When using the cloud-based LLM, be aware of:
- the privacy implications of sending potentially sensitive mails to an OpenAI server
- that you need to provide your own API key via `changeModel.sh -> apikey`

- Download, install, and start Docker
  - on Windows you have to enable WSL2 so that Docker and the shell helper scripts work
  - optional: download and install Ollama if you do not want to use the Ollama container (e.g. for Apple Silicon GPU support)
- Run the Docker Compose file
  - this starts both the Flask and the LLM containers
  - (on first start, the model download may take a while)
  - if you try to query the model before the download has finished, it will show as not reachable
- The standard first start is in Docker mode, but this can be changed with `changeModel.sh`
- Add the API key via `changeModel.sh -> apikey` for the cloud-based mode to work
- Electron App
- Browser: on the host device, open http://127.0.0.1:5000; in the network, use the IP address of the host device instead

FYI: On first startup, or after a model change, a model gets downloaded. Depending on your network connection and the model size this may take some time; until then the application will not work.
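Once the containers are up, reachability can be checked from the command line as well. A minimal sketch using standard `curl` flags and the port named above (while the model is still downloading, the probe will report the app as not reachable):

```shell
# Probe the Flask app; -w '%{http_code}' prints the HTTP status,
# --max-time 2 keeps the check short while the model is still downloading.
URL="http://127.0.0.1:5000"
STATUS="$(curl -s -o /dev/null -w '%{http_code}' --max-time 2 "$URL" 2>/dev/null)"
STATUS="${STATUS:-000}"   # curl missing or connection refused -> 000
case "$STATUS" in
  200) echo "app reachable" ;;
  *)   echo "app not reachable yet (HTTP $STATUS)" ;;
esac
```

For a machine on the same network, substitute the host device's IP address for `127.0.0.1`.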
- Task Extraction: task extraction/summary for an input mail
- Non-Literal Meaning Analysis: extraction of non-literal meanings from an input mail
- Email answer templates: generate an answer to an input mail
- Can detect irony, sayings, and non-literal language in general
- Highlighting of non-literal meanings and linking between the analysis result and the original mail text
You can change the underlying local LLM used for analysis, and whether to run fully dockerized or with native/bare-metal Ollama.

- Run `changeModel.sh`: `sudo /bin/bash /Users/autarkvui/PycharmProjects/MailSupport/scripts/changeModel.sh`
- Choose `local` and input a model name from here
  - no guarantee of model compatibility for every model; template changes may be needed for optimal answer quality
- Choose between:
  - `docker`: fully containerized
  - `native`: bare-metal mode; Ollama runs on the host system without a container, only the Flask app runs in a container
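For illustration, the docker/native switch can be modeled as writing a small mode file that other scripts read back. This is a hypothetical sketch: `MODE_FILE` and `set_mode` are made-up names for illustration, not the actual mechanism inside `changeModel.sh`.

```shell
# Hypothetical sketch of a docker/native mode switch (not the project's real code).
MODE_FILE="$(mktemp)"   # stand-in for wherever the mode would be persisted

set_mode() {
  case "$1" in
    docker|native) printf '%s\n' "$1" > "$MODE_FILE" ;;
    *) echo "unknown mode '$1', keeping docker"
       printf 'docker\n' > "$MODE_FILE" ;;
  esac
}

set_mode native
MODE="$(cat "$MODE_FILE")"
echo "mode: $MODE"
```

Validating against a fixed whitelist (`docker|native`) and falling back to a safe default mirrors how the application handles wrong inputs elsewhere.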
The API key for the cloud-based model has to be set via `set_apiKey.sh`.
Should you want to edit the default/fallback options the application uses in case of missing/wrong inputs or files, edit these files:

- `entrypoint_production.sh`: change `DEFAULT_MODEL`
- `llmInteraction`: change the defaults in `get_mode()` and `get_model_name()`
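The shell side of such a fallback is typically a one-line parameter expansion. A sketch of what `entrypoint_production.sh` likely does — the exact variable handling in the real script may differ:

```shell
# Use DEFAULT_MODEL from the environment if set, otherwise fall back to the
# shipped default. ":-" is POSIX parameter expansion, so this works in /bin/sh.
DEFAULT_MODEL="${DEFAULT_MODEL:-llama3.1:8b}"
echo "starting with model: $DEFAULT_MODEL"
```

With this pattern, `DEFAULT_MODEL=mistral:7b ./entrypoint_production.sh` would override the default without editing the file.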
- Model: Llama3.1:8b
- Ollama and Flask App in separate Docker containers
- Model: Llama3.1:8b
- Mode: Native
- mac_autostartup_on_boot.sh:
  - if there are startup errors, check here: `cat /tmp/com.myproject.docker.err.log`
  - normal logging: `cat /tmp/com.myproject.docker.out.log`
- Docker
- DockerDesktop
- startup.command: executes `mac_autostartup_on_boot.sh`
- switch.command: switches to native mode after startup
- Safari in Focus: Automator script to open the application in Safari
- /etc/sudoers was appended to enable passwordless startup: `autarkvui ALL = (ALL) NOPASSWD: ALL`
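If you need to reproduce that sudoers change, appending through `visudo` is the safe route, since it syntax-checks the file before saving; a broken /etc/sudoers can lock you out of sudo entirely. A sketch using the rule quoted above:

```shell
# The NOPASSWD rule from this README (username is this project's example):
RULE='autarkvui ALL = (ALL) NOPASSWD: ALL'
echo "$RULE"

# To apply it for real (requires root), append via visudo so the syntax is
# validated before the file is written:
#   echo "$RULE" | sudo EDITOR='tee -a' visudo
```

Note that `NOPASSWD: ALL` grants passwordless sudo for every command; scoping the rule to the specific startup script would be more restrictive.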
| OS | GPU | Support |
|---|---|---|
| MacOS/Windows/Linux | CPU-only | ✅ |
| MacOS/Windows | Nvidia GPU | ✅ in theory but not fully implemented/tested |
| MacOS/Windows | AMD | If you get it working congrats :) |
| MacOS | Apple Silicon GPU | ✅ only native / not in docker |
| Linux | any GPU | not tested |
```
MailSupport/
├── app/                   # Flask app
│   ├── templates/         # HTML templates
│   ├── static/            # CSS & JavaScript
│   └── app.py             # main application
├── backend/               # backend logic
└── docker-compose.yml     # Docker config
```
Do these one after another should something not work or crash:

- take down all containers: `docker-compose -f /Users/autarkvui/PycharmProjects/MailSupport/docker-compose.yml down`
- if in native mode, use `changeModel.sh` to switch back to Docker mode + llama3.1:8b
- delete the Docker volume data (Ollama will pull the default model again at startup): `docker volume rm $(docker volume ls -q)`
- run Docker Compose again via the console: `docker-compose -f /Users/autarkvui/PycharmProjects/MailSupport/docker-compose.yml up -d`

Alternatively, do the above steps via the PyCharm GUI.
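The recovery steps above can be collected into one script. This sketch defaults to a dry run that only prints each command; set `DRY_RUN=0` to actually execute them (the compose-file path is the absolute path used throughout this README):

```shell
# Collected recovery steps; DRY_RUN=1 prints commands instead of running them.
COMPOSE="/Users/autarkvui/PycharmProjects/MailSupport/docker-compose.yml"
DRY_RUN=1

run() {
  if [ "$DRY_RUN" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run docker-compose -f "$COMPOSE" down
# quote the substitution so it is only evaluated when actually executed:
run sh -c 'docker volume rm $(docker volume ls -q)'
run docker-compose -f "$COMPOSE" up -d
```

Note that `docker volume rm $(docker volume ls -q)` removes all Docker volumes on the machine, not just this project's, so the dry run is worth inspecting first.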
Should Docker or Ollama itself malfunction or not start, try these:

Start Docker:
`open -a Docker`

Restart Docker:
`pkill -f com.docker.backend`
`open -a Docker`

Restart Ollama:
`pkill -f ollama`
`ollama serve`
See LICENSE for details.