Depth estimation for garbage containers in an industrial setting.
This repository contains code for setting up the model and its training pipeline, including loss functions, dataset generation and loading, image augmentation, etc. The code is intended to run on a TensorFlow-compatible device, preferably with CUDA support.
Uses the virtual normal loss from https://github.com/YvanYin/VNL_Monocular_Depth_Prediction
Master's Thesis Project 2021 by
- Jonas Jungåker
- Victor Hanefors
link coming soon
Necessary dependencies:
- Python 3.8.x
- tensorflow 2.4.x
- tensorflow-addons
- matplotlib
- opencv-python
- numpy
- tensorflow-datasets
Recommended dependencies:
- CUDA/GPU support for TensorFlow
The model was first pretrained on NYUDv2, then trained on data gathered using an Intel RealSense 3D camera; code for that process can be found here. The dataset should contain pairs of RGB images and depth maps, stored together in a directory. The data can be separated into several subfolders as long as each pair is in the same folder. The script will automatically split the input data randomly into training and validation sets, and save them as separate tfrecords.
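The pairing and random splitting described above could be sketched roughly as follows. This is an illustration only: the `_rgb`/`_depth` file-naming scheme, the 80/20 ratio, and the helper names are assumptions, not necessarily what the repository's script does.

```python
import random
from pathlib import Path

def find_pairs(root):
    """Collect (rgb, depth) file pairs that live in the same folder.
    Assumes a hypothetical naming scheme: foo_rgb.png / foo_depth.png."""
    pairs = []
    for rgb in Path(root).rglob("*_rgb.png"):
        depth = rgb.with_name(rgb.name.replace("_rgb", "_depth"))
        if depth.exists():
            pairs.append((rgb, depth))
    return pairs

def train_val_split(pairs, val_fraction=0.2, seed=42):
    """Shuffle the pairs and split them into training and validation lists."""
    pairs = list(pairs)
    random.Random(seed).shuffle(pairs)
    n_val = int(len(pairs) * val_fraction)
    return pairs[n_val:], pairs[:n_val]
```

Each resulting list would then be written out as its own tfrecord, so training and validation data stay separated on disk.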
To use NYUDv2 in training, simply load the dataset using load_nyudv2.
To use a self-gathered dataset, first convert the data to a tfrecord (for easier handling) using write_tfrecord, then load it using load_tfrecord_dataset.
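A minimal sketch of what such a tfrecord round trip can look like, using TensorFlow's standard `tf.train.Example` API. The feature names (`rgb`, `depth`) and function names here are illustrative, not the signatures of the repository's write_tfrecord / load_tfrecord_dataset:

```python
import tensorflow as tf

def write_pairs(pairs, path):
    """Serialize (rgb, depth) array pairs into a single tfrecord file."""
    with tf.io.TFRecordWriter(path) as writer:
        for rgb, depth in pairs:
            feature = {
                "rgb": tf.train.Feature(bytes_list=tf.train.BytesList(
                    value=[tf.io.serialize_tensor(rgb).numpy()])),
                "depth": tf.train.Feature(bytes_list=tf.train.BytesList(
                    value=[tf.io.serialize_tensor(depth).numpy()])),
            }
            example = tf.train.Example(features=tf.train.Features(feature=feature))
            writer.write(example.SerializeToString())

def load_pairs(path):
    """Parse the tfrecord back into a tf.data.Dataset of (rgb, depth) tensors."""
    spec = {
        "rgb": tf.io.FixedLenFeature([], tf.string),
        "depth": tf.io.FixedLenFeature([], tf.string),
    }
    def parse(record):
        parsed = tf.io.parse_single_example(record, spec)
        rgb = tf.io.parse_tensor(parsed["rgb"], tf.uint8)
        depth = tf.io.parse_tensor(parsed["depth"], tf.float32)
        return rgb, depth
    return tf.data.TFRecordDataset(path).map(parse)
```

The returned `tf.data.Dataset` can then be shuffled, batched, and fed to the training loop like any other TensorFlow dataset.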
Our dataset for training and validation will be available for download, coming soon.
An example of this process can be seen in src/main.py.
- Generate a model; we recommend the softmax model sm_model() for easy integration with the loss functions.
- If continuing training of an existing model, load it using load_model() instead.
- Decide on the type of training you wish to do, which dataset, what parameters, which loss functions etc.
- The dataset, learning rate, and number of epochs are passed as parameters to training_loop().
- The loss functions, and their respective weights in the total loss, can be altered in the custom_loss() function.
- Run training_loop() and wait for the training to complete.
- When training is completed, the model and training history will be saved.
(Optional)
- When the model has finished training, it can be tested by passing validation images to test_model().
- The model history can be visualized using plot_history() to help determine if overfitting issues are present.
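The workflow above can be sketched in miniature as follows. This is a toy stand-in, not the repository's code: a small dense model replaces sm_model(), `weighted_loss` mimics the idea of custom_loss() combining weighted loss terms, and `training_sketch` plays the role of training_loop(); all names and defaults here are assumptions.

```python
import numpy as np
import tensorflow as tf

def weighted_loss(alpha=0.5):
    """Toy analogue of custom_loss(): a weighted sum of two loss terms."""
    mae = tf.keras.losses.MeanAbsoluteError()
    mse = tf.keras.losses.MeanSquaredError()
    def loss(y_true, y_pred):
        return alpha * mae(y_true, y_pred) + (1.0 - alpha) * mse(y_true, y_pred)
    return loss

def training_sketch(epochs=2, lr=1e-3):
    """Toy analogue of training_loop(): build, compile, fit, return history."""
    x = np.random.rand(16, 8).astype("float32")   # dummy inputs
    y = np.random.rand(16, 1).astype("float32")   # dummy targets
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(8,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss=weighted_loss(0.5))
    history = model.fit(x, y, validation_split=0.25, epochs=epochs, verbose=0)
    return history.history
```

The returned history dict holds per-epoch `loss` and `val_loss` curves; plotting them side by side (as plot_history() does for the real model) is what lets you spot a validation loss that diverges from the training loss, the usual sign of overfitting.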
Check out our thesis at: (link coming soon)
Cite our work with the following bib-file: (file coming soon)
Thanks to Scania Smart Factory Lab for making this work possible.
Currently no license