- Decision Tree (30 pts): Implement functions for an impurity measure (Gini or entropy), split selection, tree building up to a specified maximum depth, and prediction.
- Interpretation (10 pts): Identify and discuss the top three predictors for high sales. Provide your discussion in 3–5 sentences.
- Pruning (20 pts): Implement either pre-pruning (e.g., minimum samples per split, maximum depth) or post-pruning (e.g., reduced-error pruning). Compare the performance of the tree before and after pruning.
- Random Forest (30 pts): Implement a random forest using bagging of decision trees. Evaluate the model using accuracy, precision, recall, and F1-score. Discuss how the random forest performs compared to a single decision tree.
- Comparison (10 pts): Compare the three models (the baseline decision tree, the pruned tree, and the random forest) in terms of performance metrics and interpretability.
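The impurity, splitting, and tree-building pieces of the first task could be sketched as follows. This is a minimal illustration, not the assignment's required implementation: it assumes `X` is a list of numeric feature rows and `y` a parallel list of labels, uses Gini impurity, and folds the pre-pruning controls (`max_depth`, `min_samples`) into the builder. All names are illustrative.

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(X, y):
    """Scan every (feature, threshold) pair and return the one that
    minimizes the weighted Gini impurity of the two children."""
    best = None  # (weighted impurity, feature index, threshold)
    n = len(y)
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            left = [y[i] for i in range(n) if X[i][j] <= t]
            right = [y[i] for i in range(n) if X[i][j] > t]
            if not left or not right:
                continue
            w = (len(left) * gini(left) + len(right) * gini(right)) / n
            if best is None or w < best[0]:
                best = (w, j, t)
    return best

def build_tree(X, y, depth=0, max_depth=3, min_samples=2):
    """Recursive builder with pre-pruning: stop at max_depth, at nodes
    smaller than min_samples, or when the node is pure."""
    if depth >= max_depth or len(y) < min_samples or len(set(y)) == 1:
        return {"label": Counter(y).most_common(1)[0][0]}
    split = best_split(X, y)
    if split is None:  # no valid split (e.g. duplicate rows)
        return {"label": Counter(y).most_common(1)[0][0]}
    _, j, t = split
    li = [i for i in range(len(y)) if X[i][j] <= t]
    ri = [i for i in range(len(y)) if X[i][j] > t]
    return {"feat": j, "thr": t,
            "left": build_tree([X[i] for i in li], [y[i] for i in li],
                               depth + 1, max_depth, min_samples),
            "right": build_tree([X[i] for i in ri], [y[i] for i in ri],
                                depth + 1, max_depth, min_samples)}

def predict(node, row):
    """Walk from the root to a leaf and return its majority label."""
    while "label" not in node:
        node = node["left"] if row[node["feat"]] <= node["thr"] else node["right"]
    return node["label"]
```

Entropy (`-sum(p * log2(p))`) could be swapped in for `gini` without changing any other part of the sketch.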
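For the post-pruning option, reduced-error pruning can be sketched as a bottom-up pass over a held-out validation set: a subtree is collapsed to a leaf whenever the leaf makes no more validation errors than the subtree did. The nested-dict node format (`feat`/`thr`/`label` keys) and the choice of the validation-set majority as the leaf label are illustrative assumptions, not a prescribed interface.

```python
from collections import Counter

def predict(node, row):
    """Walk from the root to a leaf and return its label."""
    while "label" not in node:
        node = node["left"] if row[node["feat"]] <= node["thr"] else node["right"]
    return node["label"]

def errors(node, X, y):
    """Count misclassifications of (X, y) by the (sub)tree."""
    return sum(predict(node, row) != label for row, label in zip(X, y))

def reduced_error_prune(node, X_val, y_val):
    """Bottom-up reduced-error pruning: replace a subtree with a
    majority leaf if that does not increase validation error."""
    if "label" in node or not y_val:
        return node
    j, t = node["feat"], node["thr"]
    li = [i for i in range(len(y_val)) if X_val[i][j] <= t]
    ri = [i for i in range(len(y_val)) if X_val[i][j] > t]
    node["left"] = reduced_error_prune(node["left"],
                                       [X_val[i] for i in li],
                                       [y_val[i] for i in li])
    node["right"] = reduced_error_prune(node["right"],
                                        [X_val[i] for i in ri],
                                        [y_val[i] for i in ri])
    leaf = {"label": Counter(y_val).most_common(1)[0][0]}
    if errors(leaf, X_val, y_val) <= errors(node, X_val, y_val):
        return leaf
    return node
```

The before/after comparison the task asks for then reduces to evaluating the same test set once against the original tree and once against the pruned one.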
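The bagging idea behind the random forest task can be shown compactly: fit each base learner on a bootstrap sample (drawn with replacement, same size as the training set) over a random feature subset, then predict by majority vote. To keep this sketch short, depth-1 stumps stand in for full decision trees; the function names, the stump base learner, and the default parameters are all assumptions for illustration.

```python
import random
from collections import Counter

def fit_stump(X, y, n_feats):
    """Depth-1 tree: best (feature, threshold) on a random feature
    subset, minimizing misclassification error in the two children."""
    best = None  # (error, feature, threshold, left label, right label)
    for j in random.sample(range(len(X[0])), n_feats):
        for t in {row[j] for row in X}:
            left = [y[i] for i in range(len(y)) if X[i][j] <= t]
            right = [y[i] for i in range(len(y)) if X[i][j] > t]
            if not left or not right:
                continue
            lm = Counter(left).most_common(1)[0][0]
            rm = Counter(right).most_common(1)[0][0]
            err = sum(v != lm for v in left) + sum(v != rm for v in right)
            if best is None or err < best[0]:
                best = (err, j, t, lm, rm)
    if best is None:  # degenerate sample: fall back to a constant stump
        maj = Counter(y).most_common(1)[0][0]
        best = (len(y), 0, float("inf"), maj, maj)
    return best

def fit_forest(X, y, n_trees=25, n_feats=1, seed=0):
    """Bagging: fit each stump on a bootstrap sample of the rows."""
    random.seed(seed)
    forest = []
    for _ in range(n_trees):
        idx = [random.randrange(len(y)) for _ in range(len(y))]
        forest.append(fit_stump([X[i] for i in idx],
                                [y[i] for i in idx], n_feats))
    return forest

def predict_forest(forest, row):
    """Majority vote over the individual stumps' predictions."""
    votes = [lm if row[j] <= t else rm for _, j, t, lm, rm in forest]
    return Counter(votes).most_common(1)[0][0]
```

Replacing the stump with a full tree builder turns this into the usual random forest; the bagging and voting logic stays the same.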
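The evaluation metrics named in the random forest task (accuracy, precision, recall, F1) have standard binary-classification definitions that are easy to compute directly; a small helper like the following (hypothetical name, positive class assumed to be `1`) could serve all three model comparisons:

```python
def evaluate(y_true, y_pred, positive=1):
    """Return (accuracy, precision, recall, F1) for binary labels."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return accuracy, precision, recall, f1
```

Defining the zero-denominator cases as 0.0 is one common convention; stating whichever convention is used makes the model comparison unambiguous.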
VigneshDev16/MachineLearning