ML Phenotype Data
cp-weiland edited this page Sep 15, 2017
- Plant biodiversity data can only be used to their fullest potential when they are discoverable, effectively mobilised, referenceable, transparently cross-linked and employable for different purposes.
- The prospect of this task is to automatically extract trait data, link these data to ontologies and vocabularies, and apply them to the representation and integration of plant traits.
- Employ Machine Learning to extract trait data from a plant image collection (ImageCLEF/LifeCLEF 2014 contest data)
- Avoid training an entire deep learning net from scratch by using a ConvNet pretrained on a generic, sufficiently large and feature-rich data set (e.g. the ImageNet contest data)
- For a demo within the technical limitations of the BioHackathon, use _lightweight_ architectures like ResNet-18 and SqueezeNet, which have a comparatively small number of parameters, as provided by PyTorch's torchvision.models package
```
(layer4): Sequential (
  (0): BasicBlock (
    (conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
    [...]
(avgpool): AvgPool2d (size=7, stride=7, padding=0, ceil_mode=False, count_include_pad=True)
(fc): Linear (512 -> 6)
)
```
The low-layer feature vectors of the pre-trained ConvNet provide the basic building blocks of feature detection.
ImageCLEF 2014 plant trait data: distribution after removal of scans:

```
('Entire', Counter({'Flower': 264, 'Leaf': 136, 'Entire': 98, 'Fruit': 37, 'Stem': 26, 'Branch': 18}))
('Flower', Counter({'Entire': 41, 'Stem': 33, 'Flower': 31, 'Leaf': 5, 'Branch': 3, 'Fruit': 2}))
('Leaf', Counter({'Fruit': 44, 'Entire': 33, 'Flower': 33, 'Stem': 15, 'Leaf': 11, 'Branch': 5}))
('Stem', Counter({'Flower': 94, 'Entire': 81, 'Leaf': 47, 'Stem': 20, 'Fruit': 9, 'Branch': 8}))
('Fruit', Counter({'Leaf': 24, 'Flower': 14, 'Stem': 12, 'Entire': 6, 'Branch': 6, 'Fruit': 5}))
('Branch', Counter({'Flower': 290, 'Leaf': 217, 'Fruit': 109, 'Entire': 98, 'Stem': 79, 'Branch': 46}))
```
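Per-class tallies like the ones above can be produced with `collections.Counter`; a minimal sketch over hypothetical (category, label) pairs (the example records below are made up, not taken from the contest data):

```python
from collections import Counter, defaultdict

# Hypothetical (category, label) records standing in for the image metadata
records = [
    ("Entire", "Flower"), ("Entire", "Flower"), ("Entire", "Leaf"),
    ("Flower", "Entire"), ("Flower", "Stem"), ("Flower", "Stem"),
]

# Group the labels by category and count them
per_class = defaultdict(Counter)
for category, label in records:
    per_class[category][label] += 1

for category, counts in per_class.items():
    print((category, counts))
```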
Now retrain the model and evaluate it on a GPU for 24 epochs.
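A minimal retraining loop along these lines might look as follows; the optimizer choice and learning rate are assumptions of this sketch, and `loader` stands for any DataLoader yielding (image, label) batches from the plant data:

```python
import torch
import torch.nn as nn
import torch.optim as optim

def retrain(model, loader, num_epochs=24, device="cuda"):
    """Retrain `model` on batches from `loader`, reporting the loss per epoch."""
    model = model.to(device)
    criterion = nn.CrossEntropyLoss()
    # SGD with momentum is an assumption; the page does not name the optimizer
    optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
    for epoch in range(num_epochs):
        model.train()
        running_loss = 0.0
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
            running_loss += loss.item() * images.size(0)
        print(f"epoch {epoch + 1}/{num_epochs}: "
              f"loss {running_loss / len(loader.dataset):.4f}")
```

Passing `device="cpu"` keeps the same loop usable on machines without a GPU.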