Practical examples, exercises, and (toy) experiments prepared for a Machine Learning (crash) Course.
The notebook LassoRegularisation.ipynb describes the properties of the so-called L1 regularization and shows how to determine how many input features are retained (i.e., not penalized to zero). An exercise at the end demonstrates the benefit of feature selection to avoid overfitting.
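The core idea can be sketched in a few lines (a minimal example, assuming scikit-learn is available as in the notebooks; the data here is synthetic):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
# The target depends only on the first two features; the other eight are noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=100)

# L1 regularization drives some coefficients exactly to zero,
# acting as implicit feature selection.
model = Lasso(alpha=0.5).fit(X, y)
retained = int(np.sum(model.coef_ != 0))  # features kept (not penalized to zero)
print(f"{retained} of {X.shape[1]} features retained")
```

Increasing `alpha` strengthens the penalty and zeroes out more coefficients.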
The notebook CrossValidation_ModelSelection.ipynb shows examples of underfitting/overfitting and the benefits of implementing cross-validation.
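The idea behind the notebook can be sketched as follows (a minimal example assuming scikit-learn; polynomial degrees 1, 4, and 15 stand in for underfitting, a reasonable fit, and a model prone to overfitting):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(x).ravel() + 0.2 * rng.normal(size=80)

# 5-fold cross-validation scores each candidate on held-out data,
# penalizing both underfitting and overfitting.
scores = {}
for degree in (1, 4, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    scores[degree] = cross_val_score(model, x, y, cv=5).mean()  # mean R^2
print(scores)
```

The degree-1 model underfits the sine curve, so its cross-validated score is clearly below the degree-4 model's.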
Ex1_EnergyEfficiency.ipynb is an example notebook that shows the full Data Driven Pipeline for a regression task.
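The skeleton of such a pipeline can be sketched as below (a minimal example assuming scikit-learn; a synthetic dataset stands in for the Energy Efficiency data, and Ridge is just one possible regressor):

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

# 1) Data (synthetic here); 2) train/test split; 3) scaling + model; 4) evaluation.
X, y = make_regression(n_samples=200, n_features=8, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

pipe = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
pipe.fit(X_train, y_train)
r2 = r2_score(y_test, pipe.predict(X_test))
print(f"test R^2 = {r2:.3f}")
```

Wrapping the scaler and the model in a single pipeline ensures the scaling parameters are learned on the training folds only, avoiding data leakage.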
Ex2_AirfoilSelfNoise.ipynb is an exercise: it inherits the structure of the previous example but is not complete, and the assigned tasks are to complete the notebook with the most appropriate functions/modules. A possible solution can be found in the notebook Ex2_AirfoilSelfNoise-Solution.ipynb.
Ex3_SteelPlates.ipynb is an example notebook that shows the full Data Driven Pipeline for a classification task.

Ex4_FlowMeterDiagnostic.ipynb is an exercise: it inherits the structure of the previous example but is not complete, and the assigned tasks are to complete the notebook with the most appropriate functions/modules. A possible solution can be found in the notebook Ex4_FlowMeterDiagnostic-Solution.ipynb.
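A classification pipeline follows the same steps, with a classifier and a classification metric at the end (a minimal sketch assuming scikit-learn; synthetic data stands in for the Steel Plates / Flow Meter datasets, and the random forest is just one reasonable choice):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(
    n_samples=300, n_features=10, n_informative=5, random_state=0
)
# Stratify so both classes keep the same proportions in train and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

clf = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy = {acc:.2f}")
```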
Clustering_Example.ipynb is an example notebook that shows the implementation of clustering algorithms within a ML pipeline.

Anomaly_Detection_Example.ipynb is an example notebook that shows the implementation of anomaly detection algorithms within a ML pipeline.

DL_FFNN_Example.ipynb is an example notebook that shows the implementation of a feed-forward neural network for an image recognition task, plus a simple exercise at the end.
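As a taste of the unsupervised part, clustering within a pipeline can be sketched as follows (a minimal example assuming scikit-learn; synthetic blobs stand in for real data, and k-means with the silhouette score is one common combination):

```python
from sklearn.datasets import make_blobs
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Three well-separated synthetic clusters.
X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.8, random_state=0)
X = StandardScaler().fit_transform(X)  # scale features before clustering

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
sil = silhouette_score(X, labels)  # close to 1 means compact, separated clusters
print(f"silhouette = {sil:.2f}")
```

In practice, one would vary `n_clusters` and use the silhouette score (or the elbow method) to choose it.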