BNN - Vegetation classification with ConvNets
Complete source of work
Set of all the work I did at the UND OpenOrbiter REU program. I was assigned to work on detection and classification of different types of vegetation in mosaics. Two ConvNets were used: one based on Stanford's cs231n class (stored in cs231n), the other built on Keras (a TensorFlow wrapper). The results were dismal, achieving at best ~40% accuracy on multi-class classification and 60-70% on binary classification.
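The binary classification numbers above come from training one classifier per vegetation class. As a minimal sketch (not code from this repo), per-class binary labels can be built one-vs-rest from the multi-class labels like this:

```python
import numpy as np

def binary_labels(labels, target_class):
    """Collapse multi-class labels into a one-vs-rest binary vector:
    1 = target class, 0 = everything else."""
    labels = np.asarray(labels)
    return (labels == target_class).astype(np.int64)

# Example: 4 vegetation classes, one binary problem per class
y = [0, 2, 1, 2, 3, 0]
for cls in range(4):
    print(cls, binary_labels(y, cls))
```

Each resulting label vector feeds one binary training run, which is why per-class accuracy can look much better than the full multi-class number.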
All mosaics scanned:
Layout of project:
- cache - Pickles and other files used for operations
- cs231n - Source of the Stanford-based ConvNet, called by VegSolver.py. Has CPU and GPU versions.
- data_prep - Scripts to extract and prepare data for the ConvNets
- tif_utils.py - TIFF-related functions
- moz_utils.py - Mosaic-related functions
- img_extract.py - Extraction of the images from TIFFs
- data_utils.py - Manages the images after extraction, e.g. sorting or pickling data
- lbl_utils.py - Extraction of label data from Excel files and related functions
- Practice - Testing new things
- big_draw.py - Draws each extracted image on its mosaic as a red box and each point as purple. Good for visually debugging image extraction.
- logs - cs231n checkpoint logs from each training session
- mnist - Testing against the MNIST dataset
- output - Extracted and converted files such as a csv of the points
- Practice - Testing some random general scripts
- tflow - Early TensorFlow training against Veg data
- tflow-testing - Later TensorFlow work, training against many different datasets including Veg data.
- visual - Mosaics with the extracted images drawn on by big_draw.py; .txt files are point locations/pixels
- keras_binary.py - TensorFlow backend; trains a binary classifier against each class of the dataset and stores results in a txt file.
- keras_training.py - TensorFlow backend; general classification of the veg dataset.
- prep_data.py - CLI for quickly extracting, sorting, and pickling veg data. Very basic at the moment; needs to be more advanced.
- Veg_solver.py - Main for the cs231n ConvNet, training it against the Veg dataset.
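The cache/data_utils/prep_data workflow above boils down to pickling the extracted image arrays so later runs can skip re-extraction. A minimal sketch of that idea, assuming the images are already NumPy arrays keyed by class name (the function and file names here are hypothetical, not the repo's actual API):

```python
import os
import pickle
import numpy as np

def cache_dataset(images_by_class, cache_path):
    """Pickle a dict of class-name -> stacked image arrays for reuse."""
    os.makedirs(os.path.dirname(cache_path) or ".", exist_ok=True)
    with open(cache_path, "wb") as f:
        pickle.dump(images_by_class, f, protocol=pickle.HIGHEST_PROTOCOL)

def load_dataset(cache_path):
    """Load the pickled dataset back into memory."""
    with open(cache_path, "rb") as f:
        return pickle.load(f)

# Example: two classes of dummy 32x32 RGB patches
data = {
    "grass": np.zeros((5, 32, 32, 3), dtype=np.uint8),
    "shrub": np.ones((3, 32, 32, 3), dtype=np.uint8),
}
cache_dataset(data, "cache/veg_data.pkl")
restored = load_dataset("cache/veg_data.pkl")
print(sorted(restored))
```

Pickling keeps the slow TIFF extraction step separate from the repeated training runs, which is the role the cache directory plays above.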