Another thing to add: there is a paper by Schapire called "An Introduction to Boosting". In it he not only explains how AdaBoost works and why it works, but also states that it performs well on text input. In some cases it even outperforms Random Forest when your base classifier is a decision tree (for example C4.5), because of the way it learns. In the same paper he states that AdaBoost can certainly over-fit, and that this all depends on the shape of your data.
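The re-weighting idea behind AdaBoost can be sketched from scratch with decision stumps as the weak learner. This is only a minimal illustration of the technique, not Schapire's implementation; the 1-D dataset and candidate thresholds are made up for the example:

```python
import math

# Toy 1-D dataset (hypothetical example data): points and +/-1 labels.
X = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
y = [1, 1, 1, -1, -1, 1, -1, -1]
THRESHOLDS = [0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.85]

def stump(threshold, polarity, x):
    # Weak learner: predict +polarity below the threshold, -polarity above.
    return polarity if x < threshold else -polarity

def best_stump(weights):
    # Pick the stump with the lowest weighted error under the current weights.
    best = None
    for t in THRESHOLDS:
        for p in (1, -1):
            err = sum(w for w, xi, yi in zip(weights, X, y)
                      if stump(t, p, xi) != yi)
            if best is None or err < best[0]:
                best = (err, t, p)
    return best

def adaboost(rounds):
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, t, p = best_stump(weights)
        err = min(max(err, 1e-12), 1 - 1e-12)  # keep the log finite
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, p))
        # Boost the weight of misclassified points so the next stump
        # focuses on them, then renormalise.
        weights = [w * math.exp(-alpha * yi * stump(t, p, xi))
                   for w, xi, yi in zip(weights, X, y)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    # Final classifier: weighted vote of all stumps.
    score = sum(a * stump(t, p, x) for a, t, p in ensemble)
    return 1 if score >= 0 else -1

ensemble = adaboost(rounds=3)
preds = [predict(ensemble, xi) for xi in X]
print(preds)  # after a few rounds the ensemble fits this toy set exactly
```

In practice you would of course reach for a library implementation (e.g. scikit-learn's `AdaBoostClassifier` with a decision-tree base estimator) rather than hand-rolling this, but the sketch shows why over-fitting is possible: the weights keep concentrating on the hardest points.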
//EDIT
I don't know why there is no "Like" button for articles, only for comments :|
//EDIT2
Found it...
Thanks! I believe it's this one? rob.schapire.net/papers/Schapire99... It certainly remains a fascinating idea.
Yes, this one. I liked it a lot, although in my opinion it explains the sampling of the dataset poorly. In that paper he shows some results on a "letter dataset", which I never found on the internet. If anyone finds the dataset, I would like a link to it :)