This repository contains homework notebooks completed during the MIPT computer vision course. The code is not particularly clean or well structured, but the notebooks cover a number of non-obvious math problems and contain code snippets that are occasionally worth reusing.
-
HW_2_Linear_models.ipynb - A sandbox of classic ML models with custom implementations of common ML tools: a batch generator, logistic regression, gradient descent, etc. Its practical value is minimal, but it greatly improved my understanding of the math behind those tools.
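The tools mentioned above can be sketched roughly as follows. This is a minimal NumPy sketch, not the notebook's actual code; the function names and hyperparameters are illustrative:

```python
import numpy as np

def batch_generator(X, y, batch_size, shuffle=True, seed=0):
    """Yield (X_batch, y_batch) pairs over one pass through the data."""
    idx = np.arange(len(X))
    if shuffle:
        np.random.default_rng(seed).shuffle(idx)
    for start in range(0, len(X), batch_size):
        sl = idx[start:start + batch_size]
        yield X[sl], y[sl]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, epochs=100, batch_size=32):
    """Logistic regression trained with mini-batch gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for epoch in range(epochs):
        # reshuffle each epoch by varying the generator's seed
        for Xb, yb in batch_generator(X, y, batch_size, seed=epoch):
            p = sigmoid(Xb @ w + b)
            grad_w = Xb.T @ (p - yb) / len(yb)  # gradient of the log-loss
            grad_b = np.mean(p - yb)
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b
```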
-
Churn prediction.ipynb - A classic ML prediction task from Kaggle. Imbalanced classes and correlated features required some data tuning. The best score was achieved with gradient boosting and grid-searched hyperparameters.
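A grid search over gradient boosting hyperparameters typically looks like the sketch below. The synthetic data and the parameter grid are illustrative, not the ones used in the notebook:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Stand-in for the churn data: 10% positives mimics class imbalance
X, y = make_classification(n_samples=400, n_features=10,
                           weights=[0.9, 0.1], random_state=42)

param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [2, 3],
    "learning_rate": [0.05, 0.1],
}
search = GridSearchCV(
    GradientBoostingClassifier(random_state=42),
    param_grid,
    scoring="roc_auc",  # AUC is less sensitive to class imbalance than accuracy
    cv=3,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Scoring by ROC AUC rather than accuracy matters here: with 90% negatives, a constant "no churn" predictor already scores 0.9 accuracy.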
-
Simpsons-summary.ipynb - A solution to the "Journey to Springfield" contest: classification of Simpsons characters from an imbalanced image dataset. Surprisingly, rebalancing did not improve the results. Another valuable discovery was the variety of possible CNN implementations, which I experimented with; however, the best score was achieved with a transfer-learned AlexNet.
-
HW6_semantic_segmentation - The dataset consisted of images of various skin lesions. The task was to produce masks for those lesions with image segmentation models. The crux of this notebook was implementing encoder-decoder architectures and some segmentation-specific loss functions.
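One common segmentation-specific loss is the soft Dice loss; the notebook's actual losses may differ, but a minimal NumPy version illustrates the idea:

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss for binary masks: 1 - 2|A∩B| / (|A| + |B|).
    `pred` holds probabilities in [0, 1], `target` is a 0/1 mask;
    `eps` guards against division by zero on empty masks."""
    pred, target = pred.ravel(), target.ravel()
    intersection = np.sum(pred * target)
    return 1.0 - (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)
```

Unlike per-pixel cross-entropy, Dice loss scores the overlap of the predicted and true masks directly, which helps when the lesion occupies a small fraction of the image.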