Week 4
Introduction
This week will introduce tree-based methods for supervised learning, including decision trees, random forests, and gradient boosting.
Learning Objectives
By the end of this week, you will be able to:
- Understand the design and training of decision trees.
- Understand the principle of ensemble methods, including bagging and boosting.
- Understand the design and strengths of random forests and gradient boosting machines.
- Apply tree-based methods from appropriate libraries (random forest from scikit-learn and XGBoost from the XGBoost package).
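As a quick preview of the last objective, the snippet below is a minimal sketch of fitting two tree ensembles with scikit-learn on a toy dataset. It uses scikit-learn's own `GradientBoostingClassifier` so it is self-contained; XGBoost's `XGBClassifier` offers the same `fit`/`predict` interface. The dataset and hyperparameters here are illustrative, not the ones used in the practical.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy dataset for illustration only (the practical uses its own data).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Bagging-style ensemble: many decorrelated trees, averaged.
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(X_train, y_train)

# Boosting-style ensemble: trees fitted sequentially to residual errors.
gb = GradientBoostingClassifier(n_estimators=100, random_state=0)
gb.fit(X_train, y_train)

print("Random forest accuracy:", rf.score(X_test, y_test))
print("Gradient boosting accuracy:", gb.score(X_test, y_test))
```

Both models follow the standard scikit-learn estimator API, so swapping in `xgboost.XGBClassifier` requires only changing the constructor.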
Lecture
- To access the lecture notes: Lecture
- Updated lecture notes with more explanation on decision trees & impurity: UPDATED Lecture
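As a quick illustration of the impurity measure discussed in the updated notes, Gini impurity for a node with class proportions $p_k$ is $1 - \sum_k p_k^2$. The helper below is a minimal sketch (the function name is ours, not from any library):

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions.

    0.0 for a pure node; maximal when classes are evenly mixed.
    """
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

print(gini([0, 0, 1, 1]))  # 0.5: a 50/50 split of two classes
print(gini([0, 0, 0, 0]))  # 0.0: a pure node
```

A decision tree chooses the split that most reduces this impurity, weighted by the size of each child node.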
Quiz
To access the quiz, please check the Moodle page.
Practical
Note
To save a copy of the notebook to your own GitHub repo: follow the GitHub link, click on Raw, and then Save File As... to save it to your own computer. Make sure to change the extension from .ipynb.txt (which will probably be the default) to .ipynb before adding the file to your GitHub repository.
To access the practical: