AdaBoost, short for Adaptive Boosting, is a machine learning ensemble technique that combines multiple weak classifiers into a single strong classifier. Introduced by Yoav Freund and Robert Schapire in 1995, AdaBoost works by iteratively increasing the weights of misclassified training instances so that each subsequent classifier concentrates on these challenging cases.
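To make the reweighting mechanism concrete, here is a minimal from-scratch sketch of the classic binary AdaBoost loop. It assumes labels encoded as -1/+1 and uses depth-1 scikit-learn decision trees (stumps) as weak learners; the function names `adaboost_fit` and `adaboost_predict` are illustrative, not a standard API.

```python
# Minimal sketch of binary AdaBoost (labels assumed to be -1/+1).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform instance weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)  # weak learner sees current weights
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)  # this round's vote weight
        # Adaptive step: misclassified points are scaled up by exp(+alpha),
        # correctly classified points are scaled down by exp(-alpha).
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                      # renormalize to a distribution
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    # Final strong classifier: weighted majority vote of the weak learners.
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```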
Key Features:
- Adaptive Weighting: AdaBoost assigns higher weights to misclassified instances, prompting subsequent classifiers to focus on these harder-to-classify examples.
- Versatility: While commonly used with simple models like decision stumps (single-level decision trees), AdaBoost can also enhance the performance of more complex base learners, as in the example after this list.
- Reduced Overfitting: In many empirical settings, AdaBoost is less susceptible to overfitting than other learning algorithms, though it remains sensitive to noisy data and outliers.
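In practice, scikit-learn's `AdaBoostClassifier` packages this procedure. A hedged usage sketch follows; note that the base-learner argument is named `estimator` in scikit-learn 1.2 and later (earlier releases call it `base_estimator`), and the dataset here is synthetic, chosen only for illustration.

```python
# Usage sketch: AdaBoost with decision stumps on a synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # decision stump
    n_estimators=100,
)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```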