Weak classifiers (or weak learners) are classifiers that perform only slightly better than a random classifier. They thus have some clue on how to predict the right labels, but not as much as strong classifiers like, e.g., Naive Bayes, Neural Networks or SVMs.
One of the simplest weak classifiers is the Decision Stump, which is a one-level Decision Tree. It selects a threshold for one feature and splits the data on that threshold. AdaBoost will then train an army of these Decision Stumps, each focusing on one part of the characteristics of the data.
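To make this concrete, here is a minimal sketch of a decision stump trained against weighted samples, the way AdaBoost uses it. The function names (`train_stump`, `stump_predict`) and the dictionary layout are my own illustration, not a standard API; labels are assumed to be in {-1, +1}.

```python
import numpy as np

def train_stump(X, y, weights):
    """Exhaustive search over (feature, threshold, polarity) for the
    split with the lowest weighted classification error.
    Assumes labels y are in {-1, +1}."""
    n_samples, n_features = X.shape
    best = {"error": float("inf")}
    for feature in range(n_features):
        for threshold in np.unique(X[:, feature]):
            for polarity in (1, -1):
                # Predict -1 on one side of the threshold, +1 on the other;
                # polarity decides which side is which.
                pred = np.where(polarity * X[:, feature] < polarity * threshold, -1, 1)
                # Weighted error: sum of weights of misclassified samples.
                error = weights[pred != y].sum()
                if error < best["error"]:
                    best = {"error": error, "feature": feature,
                            "threshold": threshold, "polarity": polarity}
    return best

def stump_predict(stump, X):
    p, f, t = stump["polarity"], stump["feature"], stump["threshold"]
    return np.where(p * X[:, f] < p * t, -1, 1)
```

On perfectly separable one-feature data, e.g. `X = [[1], [2], [3], [4]]` with `y = [-1, -1, 1, 1]` and uniform weights, the search finds a threshold between 2 and 3 with zero weighted error. In a full AdaBoost round, the sample weights would then be updated so the next stump focuses on the points this one got wrong.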
When I used AdaBoost, my weak classifiers were basically thresholds on each data attribute. Those thresholds need an accuracy of more than 50%; otherwise the classifier would be no better than random guessing.
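The 50% requirement shows up directly in AdaBoost's formula for a weak learner's vote, alpha = (1/2) ln((1 - err) / err): the vote is positive only when the weighted error is below 0.5. A small sketch (the epsilon guard against division by zero is my addition):

```python
import math

def stump_weight(weighted_error):
    """AdaBoost's weight (alpha) for a weak learner.
    Positive only when the weighted error is below 0.5,
    i.e. when the learner beats random guessing."""
    eps = 1e-10  # guard against log(0) / division by zero at the extremes
    return 0.5 * math.log((1 - weighted_error + eps) / (weighted_error + eps))
```

A learner with 30% weighted error gets a positive vote, one at exactly 50% gets zero weight, and one worse than 50% gets a negative weight (its prediction is effectively inverted).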
Here is a good presentation about AdaBoost and how to calculate those weak classifiers: https://user.ceng.metu.edu.tr/~tcan/ceng734_f1112/Schedule/adaboost.pdf