Streamline Sales Suite is a comprehensive platform designed for data analysis and visualization, powered by Python and advanced deep learning techniques. This suite includes a Convolutional Neural Network (CNN) model tailored for item classification, enabling precise categorization and insightful data analysis.
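The CNN itself is built with TensorFlow/Keras (see the training step below). As a rough sketch only, and not the project's actual architecture, a small Keras image classifier might look like this; the function name, layer sizes, input shape, and class count are all placeholders:

```python
# Illustrative CNN for item classification; layer sizes, input shape, and the
# number of classes are placeholders, not the configuration used in this repo.
from tensorflow import keras
from tensorflow.keras import layers

def build_classifier(num_classes: int, input_shape=(128, 128, 3)) -> keras.Model:
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),              # normalize pixel values to [0, 1]
        layers.Conv2D(32, 3, activation="relu"),  # learn low-level image features
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),  # learn higher-level features
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # one probability per item class
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```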
To set up and use this repository, follow these steps:
- Clone the Repository: `git clone <repository_url>`
- Install Poetry (if not already installed): `pip install poetry`
- Install Dependencies: Set up the Python virtual environment and install all necessary dependencies with `poetry install`.
- Acquire Training Images: Obtain the images required to train the classification model with `poetry run python ./src/data_acquisition.py`.
- Train the Classification Model: Train the CNN model with the acquired images using `poetry run python ./src/training_model.py`.
- Run the Application: Launch the complete project using Streamlit with `poetry run streamlit run ./src/1_π _Home.py`.
- Docker Deployment: A Dockerfile and Docker Compose configuration are included for containerizing the application, which is particularly useful for deployment after the model is trained.
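Assuming the included Compose file defines the application service, a typical post-training workflow is `docker compose up --build` (or `docker-compose up --build` with the standalone binary), which rebuilds the image and starts the Streamlit app.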
Note: The data analysis component relies on a private dataset and may not be functional without it. However, the project can be adapted to work with other datasets.
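For instance, pointing the analysis at a different dataset mostly comes down to swapping in your own loader. The sketch below is purely hypothetical: the file path and column names are placeholders, not the project's real schema.

```python
# Hypothetical loader for a substitute dataset; the path and the "date", "item",
# and "quantity" columns are placeholders, not the project's actual schema.
import pandas as pd

def load_sales_data(path: str = "data/sales.csv") -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["date"])    # parse the date column up front
    return df.dropna(subset=["item", "quantity"])   # drop rows missing key fields
```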
This project leverages the following tools and libraries:

- Streamlit: For building and deploying the interactive web application.
- Docker: To containerize the application for easy deployment.
- Poetry: For dependency management and virtual environment setup.
- Black: To maintain consistent code formatting.
- Pandas & NumPy: For handling data processing and ETL tasks.
- Loguru: For efficient logging and monitoring of application processes.
- TensorFlow & Keras: For developing and training the deep learning model.
- MLflow: For tracking and managing the machine learning lifecycle (see the sketch after this list).
- OpenCV: For image processing tasks.
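As an illustration of how MLflow typically hooks into a training script, the generic pattern below logs parameters, metrics, and an artifact; the names and values are placeholders rather than what ./src/training_model.py actually records.

```python
# Generic MLflow tracking pattern (illustrative; parameter names, metric values,
# and the artifact file are placeholders, not the project's real ones).
import mlflow

with mlflow.start_run(run_name="cnn-training"):
    mlflow.log_param("epochs", 10)
    mlflow.log_param("batch_size", 32)
    # ... model training would happen here ...
    mlflow.log_metric("val_accuracy", 0.93)  # example value
    mlflow.log_artifact("model.keras")       # log the saved model file
```

Logged runs can then be browsed locally with `mlflow ui`.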
Contributions are highly encouraged! Whether you have new tools, models, or techniques to share, your input is welcome. Please feel free to submit a pull request or open an issue to discuss your ideas.
This project is licensed under the MIT License, allowing you to freely use, modify, and distribute the code.
To utilize Nvidia GPUs within Docker, follow these steps:
- Ensure that the Nvidia drivers are installed on your host machine.
- Install the Nvidia Container Toolkit by following the official guide.
- Restart Docker to apply the changes: `sudo systemctl restart docker`
- Verify the GPU setup with the following command: `sudo docker run --rm --gpus all nvidia/cuda:11.0.3-base-ubuntu20.04 nvidia-smi`
- Build the Docker image for the project (note that Docker image names must be lowercase): `sudo docker build -t sls-tf-image -f tensorflow.dockerfile .`
- Run the Docker container: `sudo docker run --gpus all -p 8501:8501 --name sls-tf-container sls-tf-image`
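Once the container is running, a quick way to confirm that TensorFlow inside it can see the GPU (for example via `docker exec -it sls-tf-container python`) is:

```python
# Sanity check: list the GPUs visible to TensorFlow inside the container.
import tensorflow as tf

print(tf.config.list_physical_devices("GPU"))  # expect at least one GPU entry
```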