sydney-machine-learning/ensemble-learning-tutorial

Ensemble learning tutorial using Python and Sklearn

Rohitash Chandra, 2026

Coverage

  • Provide average classification and regression performance over N independent experimental runs using AdaBoost, Gradient Boosting Machine, XGBoost, and Random Forests.
  • Investigate hyperparameters of the different models, e.g. the depth of the tree for tree-based models.
  • Compare the results with simple neural networks (Adam/SGD) and Decision Trees.
  • Report the mean and standard deviation of results for a selected classification and regression dataset.
  • Regression - Energy data: https://archive.ics.uci.edu/dataset/242/energy+efficiency
  • Classification - Pima data: https://www.kaggle.com/kumargh/pimaindiansdiabetescsv
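A minimal sketch of the experimental setup above: mean and standard deviation of test accuracy over N independent runs for three sklearn ensemble methods. It uses `make_classification` as a synthetic stand-in for the Pima data (same shape: 768 samples, 8 features); XGBoost is omitted since it needs the separate `xgboost` package. The split ratio and run count here are illustrative assumptions, not values from the tutorial.

```python
# Sketch: mean +/- std test accuracy over N independent experimental runs.
# Synthetic stand-in data (768 samples, 8 features, like Pima diabetes).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier,
                              GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

N_RUNS = 10  # number of independent experimental runs (assumed)

X, y = make_classification(n_samples=768, n_features=8, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier,
    "GradientBoosting": GradientBoostingClassifier,
    "RandomForest": RandomForestClassifier,
}

results = {}
for name, Model in models.items():
    scores = []
    for run in range(N_RUNS):
        # A fresh random train/test split per run gives independent runs.
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.4, random_state=run)
        clf = Model(random_state=run).fit(X_tr, y_tr)
        scores.append(accuracy_score(y_te, clf.predict(X_te)))
    results[name] = (np.mean(scores), np.std(scores))
    print(f"{name}: {results[name][0]:.3f} +/- {results[name][1]:.3f}")
```

The same loop applies to the Energy regression dataset by swapping in the regressor classes (e.g. `RandomForestRegressor`) and a regression metric such as RMSE.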

Installation

Tutorial

Resources

About

Tutorial comparing ensemble learning: bagging and boosting methods
