AdaBoost is one of those machine learning methods that seems much more confusing than it really is. At its core, it's just a simple twist on decision trees and random forests.
NOTE: This video assumes you already know about Decision Trees...
[ Link ]
...and Random Forests....
[ Link ]
For a complete index of all the StatQuest videos, check out:
[ Link ]
Sources:
The original AdaBoost paper by Robert E. Schapire and Yoav Freund
[ Link ]
And a follow-up co-authored by Schapire:
[ Link ]
The idea of using the weights to resample the original dataset comes from the book Boosting: Foundations and Algorithms, by Robert E. Schapire and Yoav Freund:
[ Link ]
Lastly, Chris McCormick's tutorial was super helpful:
[ Link ]
If you'd like to support StatQuest, please consider...
Buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF - [ Link ]
Paperback - [ Link ]
Kindle eBook - [ Link ]
Patreon: [ Link ]
...or...
YouTube Membership: [ Link ]
...a cool StatQuest t-shirt or sweatshirt:
[ Link ]
...buying one or two of my songs (or go large and get a whole album!)
[ Link ]
...or just donating to StatQuest!
[ Link ]
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
[ Link ]
0:00 Awesome song and introduction
0:56 The three main ideas behind AdaBoost
3:30 Review of the three main ideas
3:58 Building a stump with the GINI index
6:27 Determining the Amount of Say for a stump
10:45 Updating sample weights
14:47 Normalizing the sample weights
15:32 Using the normalized weights to make the second stump
19:06 Using stumps to make classifications
19:51 Review of the three main ideas behind AdaBoost
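The three main ideas from the video (stumps as weak learners, an Amount of Say for each stump, and sample weights that get updated and normalized after each stump) can be sketched in a few lines of Python. This is just an illustrative sketch, not StatQuest's code: for simplicity it searches all single-feature thresholds for each stump instead of using the GINI index, and all the function names and the toy data are made up.

```python
import numpy as np

def adaboost_stumps(X, y, n_stumps=5):
    """Train AdaBoost with one-feature threshold stumps as weak learners.
    y must contain only -1 and +1."""
    n_samples, n_features = X.shape
    w = np.full(n_samples, 1.0 / n_samples)   # start with equal sample weights
    stumps = []
    for _ in range(n_stumps):
        # Pick the (feature, threshold, polarity) stump with the lowest
        # weighted error under the current sample weights.
        best = None
        for j in range(n_features):
            for t in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = np.where(X[:, j] <= t, polarity, -polarity)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, polarity)
        err, j, t, polarity = best
        err = np.clip(err, 1e-10, 1 - 1e-10)        # avoid log(0)
        say = 0.5 * np.log((1 - err) / err)         # Amount of Say
        pred = np.where(X[:, j] <= t, polarity, -polarity)
        # Raise the weights of misclassified samples, lower the rest,
        # then normalize so the weights again add up to 1.
        w = w * np.exp(-say * y * pred)
        w = w / w.sum()
        stumps.append((say, j, t, polarity))
    return stumps

def predict(stumps, X):
    # Each stump votes with its Amount of Say; the sign of the total wins.
    total = sum(say * np.where(X[:, j] <= t, polarity, -polarity)
                for say, j, t, polarity in stumps)
    return np.sign(total)
```

On a toy dataset like X = [[1], [2], [3], [4]] with y = [1, 1, -1, -1], the first stump splits at x <= 2 and the weighted vote classifies every sample correctly.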
Correction:
10:18: The Amount of Say for Chest Pain = (1/2)*log((1 - (3/8))/(3/8)) = (1/2)*log((5/8)/(3/8)) = (1/2)*log(5/3) = 0.25, not 0.42.
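The corrected value is easy to double-check in Python (the Amount of Say formula uses the natural log):

```python
import math

total_error = 3 / 8  # Chest Pain stump: 3 of the 8 sample weights are wrong

# Amount of Say = (1/2) * log((1 - Total Error) / Total Error)
amount_of_say = 0.5 * math.log((1 - total_error) / total_error)
print(f"{amount_of_say:.4f}")  # 0.2554, i.e. roughly the 0.25 given above
```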
#statquest #adaboost