By: Ritchie Kwan
We will use our favorite Titanic dataset to run XGBoost.
We will run the classification models we've already learned, grid search them, and then see how XGBoost compares in terms of accuracy and speed. In particular, we will grid search GradientBoosting and XGBoost over the same parameter grid to compare their computational speed and predictive power, as sketched below.
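As a rough sketch of that comparison (not the exact notebook code), assuming we load the Titanic data from seaborn's built-in dataset, pick a small illustrative feature set, and use a hypothetical shared parameter grid, the head-to-head grid search might look like this:

```python
import time

import pandas as pd
import seaborn as sns
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

# Load Titanic and keep a few numeric/encoded features (illustrative choice).
titanic = sns.load_dataset("titanic").dropna(subset=["age"])
X = pd.get_dummies(titanic[["pclass", "sex", "age", "fare"]], drop_first=True)
y = titanic["survived"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Same parameter grid for both boosters so they are tuned on equal footing.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [3, 5],
    "learning_rate": [0.05, 0.1],
}

for name, model in [
    ("GradientBoosting", GradientBoostingClassifier(random_state=42)),
    ("XGBoost", XGBClassifier(random_state=42, eval_metric="logloss")),
]:
    grid = GridSearchCV(model, param_grid, cv=5, scoring="accuracy", n_jobs=-1)
    start = time.time()
    grid.fit(X_train, y_train)           # fit every parameter combination with 5-fold CV
    elapsed = time.time() - start
    print(
        f"{name}: best CV accuracy = {grid.best_score_:.3f}, "
        f"test accuracy = {grid.score(X_test, y_test):.3f}, "
        f"fit time = {elapsed:.1f}s"
    )
```

Timing each `fit` call while holding the grid and cross-validation settings fixed is what lets us attribute any speed difference to the libraries themselves rather than to the search setup.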