From the course: R Essential Training Part 2: Modeling Data
Creating ensemble models with random forest classification
- [Instructor] In our final demonstration of working in R to analyze and classify data, I want to show you how to do something called a random forest. In a previous video, I talked about decision trees, where you take your data and make a whole series of yes/no decisions that split it into branches until you have a final model, which can be portrayed graphically, which is wonderful. A random forest is essentially a large collection of decision trees. It's also an introduction to what's known as ensemble modeling. Instead of relying on just one model, you use the wisdom of the crowd, the idea that two heads are better than one: build many models and combine them, because the averaged predictions of several different models are typically more accurate than the predictions of any one model. Now, to show you how this works, I'm going to load a few packages, including…
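The instructor's specific packages and dataset aren't shown in this excerpt, so here is a minimal sketch of the idea in R, assuming the `randomForest` package and the built-in `iris` data as stand-ins. Each of the 500 trees is fit on a bootstrap sample of the data with a random subset of predictors considered at each split, and the forest's prediction is the vote of all trees combined.

```r
# A minimal random forest sketch (assumed packages/data, not the course's own).
library(randomForest)

set.seed(42)  # make the "random" forest reproducible

# Grow an ensemble of 500 decision trees predicting Species
# from all other columns of the iris data.
fit <- randomForest(Species ~ ., data = iris, ntree = 500)

print(fit)       # confusion matrix and out-of-bag (OOB) error estimate
importance(fit)  # which predictors mattered most across the whole forest
```

The out-of-bag error printed by `print(fit)` is the ensemble's built-in cross-validation: each tree is tested on the rows left out of its bootstrap sample, so you get an honest accuracy estimate without a separate test set.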
Contents
- Grouping cases with hierarchical clustering (10m 58s)
- Grouping cases with k-means clustering (7m 54s)
- Classifying cases with k-nearest neighbors (11m 57s)
- Classifying cases with decision tree analysis (9m 13s)
- Creating ensemble models with random forest classification (9m 20s)