This is a Streamlit app for brain tumor detection that can automatically detect brain tumors in images from an MRI machine.
The model is hosted on GCS (Google Cloud Storage), and this app uses the Google Cloud client library to connect to GCS.
- Within the project root directory, create the file `.streamlit/secrets.toml`. This is only for local development; refer to here.
- Add the following to `secrets.toml`:
```toml
[connections.gcs]
type="XXX"
project_id="XXX"
private_key_id="XXX"
private_key="XXX"
client_email="XXX"
client_id="XXX"
auth_uri="XXX"
token_uri="XXX"
auth_provider_x509_cert_url="XXX"
client_x509_cert_url="XXX"
universe_domain="XXX"
```
Where `XXX` are the values to be filled in. These come from a Google service account, which is used to connect to GCS (Google Cloud Storage), where the deep learning model is hosted. These values are exposed to the Streamlit app as secrets.
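Once the secrets are in place, the app can build a GCS client from them. Below is a minimal sketch, assuming the field names listed above and the `google_cloud_storage` package; `missing_fields` and `gcs_client` are hypothetical helper names, not functions from the actual app:

```python
from typing import Mapping

# Service-account fields the [connections.gcs] section must provide
# (the same keys listed in secrets.toml above).
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "auth_uri", "token_uri",
    "auth_provider_x509_cert_url", "client_x509_cert_url",
    "universe_domain",
}

def missing_fields(gcs_secrets: Mapping) -> set:
    """Return the required service-account keys absent from the secrets."""
    return REQUIRED_FIELDS - set(gcs_secrets)

def gcs_client(gcs_secrets: Mapping):
    """Build a GCS client from the [connections.gcs] secrets."""
    from google.cloud import storage  # provided by google_cloud_storage
    absent = missing_fields(gcs_secrets)
    if absent:
        raise ValueError(f"secrets.toml is missing: {sorted(absent)}")
    return storage.Client.from_service_account_info(dict(gcs_secrets))
```

In the app this would be called as `gcs_client(st.secrets["connections"]["gcs"])`; the returned client can then fetch the model file, e.g. via `client.bucket(...).blob(...).download_to_filename(...)`.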
- Add the `.streamlit/secrets.toml` file to `.gitignore`, as it contains sensitive data.
- Add the content of `.streamlit/secrets.toml` to the Streamlit app's secrets on the cloud. Refer to here.
Make sure you have completed the Setup info for connecting GCS step.
- Open a terminal and `cd` to this directory
- Create a Python environment
```shell
python -m venv venv
```
- Activate the environment
```shell
venv\scripts\activate
```
- Install dependencies
```shell
pip install tensorflow-cpu==2.12.0 streamlit opencv-python-headless imutils google_cloud_storage pipreqs
```
- Start Streamlit
```shell
streamlit run tumor_detector_app.py
```
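For reference, a minimal sketch of the shape such an app can take. The function names, input size, and upload flow here are illustrative assumptions, not the actual contents of `tumor_detector_app.py`:

```python
import numpy as np

def to_model_input(img: np.ndarray, size: int = 224) -> np.ndarray:
    """Resize an MRI image array with nearest-neighbour indexing,
    scale pixels to [0, 1], and add a batch dimension, giving a
    model-ready array of shape (1, size, size, 3)."""
    h, w = img.shape[:2]
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    resized = img[rows][:, cols]
    return resized[np.newaxis, ...].astype("float32") / 255.0

def main() -> None:
    """Hypothetical Streamlit flow: upload an MRI image, prepare it."""
    import cv2              # opencv-python-headless
    import streamlit as st
    uploaded = st.file_uploader("Upload an MRI image", type=["jpg", "jpeg", "png"])
    if uploaded is None:
        return
    arr = cv2.imdecode(np.frombuffer(uploaded.read(), np.uint8), cv2.IMREAD_COLOR)
    batch = to_model_input(arr)
    st.write("Model input shape:", batch.shape)
```

In a real script, `main()` would be called at module level, since `streamlit run` executes the file top to bottom; the prepared batch would then be passed to the TensorFlow model downloaded from GCS.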
Make sure you have completed the Setup info for connecting GCS step.
- Follow the Local development steps
- Run the command
```shell
pipreqs --force .
```
- In `requirements.txt`, remove any packages that are not in the following list and add `google_cloud_storage==3.0.0`:
  - imutils
  - opencv_python_headless
  - tensorflow_cpu
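The pruning step above can also be scripted. A small sketch, assuming `pipreqs` wrote package lines in `name==version` form; the helper name `clean_requirements` is made up for illustration:

```python
# Packages allowed in requirements.txt for Streamlit deployment
# (the list from the step above).
ALLOWED = {"imutils", "opencv_python_headless", "tensorflow_cpu",
           "google_cloud_storage"}

def clean_requirements(lines: list) -> list:
    """Drop packages outside ALLOWED and pin google_cloud_storage."""
    def name(line: str) -> str:
        # normalise "opencv-python-headless==4.8.0" -> "opencv_python_headless"
        return line.split("==")[0].strip().lower().replace("-", "_")
    kept = [ln for ln in lines if name(ln) in ALLOWED]
    if not any(name(ln) == "google_cloud_storage" for ln in kept):
        kept.append("google_cloud_storage==3.0.0")
    return kept
```

For example, feeding it the raw `pipreqs` output drops entries such as `numpy==...` while keeping the four packages above.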
Note:
- We must use the opencv-python-headless package instead of opencv-python when deploying to Streamlit; otherwise, the following error appears when the app starts:
```
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
```
The same error was discussed here.
- Python 3.9 is recommended