Exploring how ensemble methods create strong learners

From the course: Supervised Learning Essential Training

- [Instructor] Ensemble methods, like random forests, combine several decision trees to get better performance on test data than any single tree. The idea is to train multiple models with the same learning algorithm and combine their predictions to achieve better results. The two most common ensemble techniques are bagging and boosting; random forests are built with bagging. Bagging, or bootstrap aggregation, is used when the goal is to reduce the variance of a decision tree. Decision trees often suffer from high variance because small variations in the training data can produce a completely different tree. Bagging addresses this by drawing random subsets of the training data in parallel, sampling with replacement, so each observation has the same probability of appearing in any given subset. Each subset is then used to train a decision tree, resulting in an ensemble of different trees. Finally, the predictions of all those different trees are averaged. This produces a more robust…
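To make the bagging idea concrete, here is a minimal sketch, not taken from the course itself, using scikit-learn's BaggingClassifier to bag decision trees; the make_classification dataset is a synthetic stand-in for whatever data the course uses.

# Minimal bagging sketch (assumes scikit-learn is installed; data is synthetic).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier

# Synthetic stand-in data, split into train and test sets.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# A single decision tree: prone to high variance.
tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# Bagging: 100 trees, each trained on a bootstrap sample of the training
# data (sampled with replacement); predictions are combined across trees.
bagging = BaggingClassifier(
    DecisionTreeClassifier(), n_estimators=100, random_state=42
).fit(X_train, y_train)

print("Single tree accuracy:", tree.score(X_test, y_test))
print("Bagged trees accuracy:", bagging.score(X_test, y_test))

On most runs the bagged ensemble scores higher on the test set than the single tree, illustrating the variance reduction described above.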
