
Information
- Boosting is a machine learning technique that combines multiple weak learners to create a strong learner.
- Weak learners are models that perform only slightly better than random guessing, such as decision trees with limited depth or simple linear models.
- Boosting iteratively trains a sequence of weak learners, where each subsequent model focuses on the samples that the previous models misclassified.
- During each iteration, the weights of the misclassified samples are increased, so that the next model focuses more on these difficult samples.
- The final prediction is a weighted combination of the predictions of all the weak learners, with higher weights given to the more accurate models.
- Boosting can be used for both classification and regression problems.
- AdaBoost (Adaptive Boosting) is one of the most popular boosting algorithms; at each round it increases the weights of misclassified samples and decreases the weights of correctly classified samples (see the first sketch after this list).
- Gradient Boosting is another popular boosting algorithm, which trains each model to predict the negative gradient of the loss function with respect to the current ensemble's predictions (see the second sketch after this list).
- Gradient Boosting can use different loss functions, such as mean squared error, cross-entropy, or Huber loss, depending on the type of problem.
- Gradient Boosting can also use different regularization techniques, such as shrinkage, subsampling, and tree pruning, to prevent overfitting and improve generalization.
- Boosting algorithms have achieved state-of-the-art performance on many machine learning benchmarks and have been widely used in various applications, such as computer vision, natural language processing, and recommendation systems.
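
To make the reweighting mechanism concrete, here is a minimal from-scratch sketch of discrete AdaBoost for binary classification with labels in {-1, +1}. It uses scikit-learn decision stumps as the weak learners; the function names and hyperparameters are illustrative, not part of any particular library's API.

```python
# Minimal discrete AdaBoost sketch: reweight misclassified samples each
# round, then combine stumps with accuracy-based weights (alphas).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)            # uniform initial sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)   # weighted error rate
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # model weight
        # Increase weights of misclassified samples, decrease the rest.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    # Final prediction: sign of the alpha-weighted vote of all stumps.
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(votes)

X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1                           # map {0, 1} -> {-1, +1}
stumps, alphas = adaboost_fit(X, y)
acc = np.mean(adaboost_predict(X, stumps, alphas) == y)
print(f"training accuracy: {acc:.3f}")
```

Note how alpha plays both roles described above: it scales the weight update for the samples and also serves as the model's weight in the final vote, so more accurate stumps contribute more.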
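
And here is a minimal sketch of gradient boosting for regression with squared error loss: each new tree fits the negative gradient of the loss, which for MSE is simply the residual y - F(x). Shrinkage (a learning rate) and row subsampling are included to illustrate the regularization techniques mentioned above; all names and hyperparameter values are illustrative.

```python
# Minimal gradient boosting sketch for regression with MSE loss.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_rounds=100, learning_rate=0.1, subsample=0.8):
    rng = np.random.default_rng(0)
    f0 = y.mean()                       # initial constant prediction
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_rounds):
        residual = y - pred             # negative gradient of 1/2 * MSE
        # Subsampling: fit each tree on a random fraction of the rows.
        idx = rng.choice(len(y), size=int(subsample * len(y)), replace=False)
        tree = DecisionTreeRegressor(max_depth=3)
        tree.fit(X[idx], residual[idx])
        # Shrinkage: scale each tree's contribution by the learning rate.
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def gradient_boost_predict(X, f0, trees, learning_rate=0.1):
    return f0 + learning_rate * sum(t.predict(X) for t in trees)

# Toy usage: learn a noisy sine curve.
X = np.linspace(0, 6, 300).reshape(-1, 1)
y = np.sin(X).ravel() + np.random.default_rng(1).normal(0, 0.1, 300)
f0, trees = gradient_boost_fit(X, y)
mse = np.mean((gradient_boost_predict(X, f0, trees) - y) ** 2)
print(f"training MSE: {mse:.4f}")
```

Swapping in a different loss function only changes the residual line: the negative gradient of the chosen loss replaces y - pred, which is what makes gradient boosting applicable to both regression and classification.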