This repository details work done on:
- Building a segmentation model prototype end-to-end (link).
- Optimising the performance of the baseline model (link).
- Best practices for model evaluation (link).
W&B tools used:
- Tables
- Artifacts
- Experiments
- Reports
- Model registry
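The sketch below shows, in broad strokes, how these tools typically fit together in a training script. It is illustrative only: the project name, file names, and logged values are placeholders and are not taken from this repository.
```python
import numpy as np
import wandb

# Experiments: each script execution is tracked as a W&B run.
run = wandb.init(project="semantic-segmentation-demo", job_type="train")

# Tables: log example predictions so they can be inspected in the W&B UI.
table = wandb.Table(columns=["image", "predicted_class"])
table.add_data(wandb.Image(np.zeros((64, 64, 3), dtype=np.uint8)), "road")
run.log({"predictions": table})

# Artifacts: version an output file (a placeholder standing in for model weights).
with open("weights_placeholder.txt", "w") as f:
    f.write("placeholder")
artifact = wandb.Artifact("segmentation-model", type="model")
artifact.add_file("weights_placeholder.txt")
run.log_artifact(artifact)

run.finish()
# Reports and the Model Registry are driven from the W&B UI on top of the
# runs and artifacts logged above.
```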
Dataset used:
- The BDD1K dataset, used for semantic segmentation.
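If the dataset is versioned as a W&B artifact, it would typically be pulled into a run as sketched below. The artifact and project names are hypothetical placeholders, not confirmed by this repository.
```python
import wandb

# "bdd1k" is a hypothetical artifact name; substitute the dataset artifact
# actually published for this project.
run = wandb.init(project="semantic-segmentation-demo", job_type="download-data")
dataset_dir = run.use_artifact("bdd1k:latest").download()
print(f"Dataset files downloaded to {dataset_dir}")
run.finish()
```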
Further W&B resources:
Setup:
```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
wandb login
```
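After `wandb login`, one quick way to confirm everything is wired up is to start and finish a throwaway run; the project name below is just a placeholder.
```python
import wandb

# Smoke test: create and finish a minimal run to confirm the API key and
# network access are working.
run = wandb.init(project="setup-check")
run.log({"setup_ok": 1})
run.finish()
```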