Ensemble learning

Computer-Nerd 2023. 3. 15.

Information

  • Ensemble learning combines the predictions of multiple models to achieve better accuracy than a single model.
  • There are three main types of ensemble learning: bagging, boosting, and stacking.
  • Bagging (bootstrap aggregating) involves training multiple models on different bootstrap subsets of the training data and then averaging (for regression) or voting on (for classification) their predictions.
  • Boosting involves training multiple models in sequence, with each model focusing on the examples that the previous model misclassified.
  • Stacking uses several different models to make predictions on a set of data, then feeds the outputs of these models as input features to a meta-model, which makes the final prediction.
  • Ensemble learning can help reduce overfitting, improve generalization, and increase model robustness.
  • However, it can also increase model complexity, training time, and computational resources required.
  • Ensemble learning is widely used in a variety of machine learning applications, including image and speech recognition, natural language processing, and recommender systems.
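The bagging idea above can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation: the dataset, the decision-stump learner, and the number of models are all hypothetical choices made for the example. Each stump is trained on a bootstrap resample of the data, and the ensemble's prediction is a majority vote over the stumps.

```python
import random
from collections import Counter

# Toy 1D dataset (hypothetical): the true rule is "label 1 when x > 5".
X = [float(i) for i in range(10)]
y = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1]

def train_stump(xs, ys):
    """Fit a one-threshold classifier: pick the threshold t that
    maximizes accuracy of the rule "predict 1 when x > t"."""
    best_t, best_acc = xs[0], -1.0
    for t in xs:
        acc = sum((x > t) == bool(lab) for x, lab in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

random.seed(0)
stumps = []
for _ in range(25):
    # Bagging step: draw a bootstrap resample (sampling with replacement)
    # and train one weak model on it.
    idx = [random.randrange(len(X)) for _ in range(len(X))]
    stumps.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))

def predict(x):
    # Aggregation step: majority vote over all stumps
    # (the classification analogue of averaging predictions).
    votes = Counter(int(x > t) for t in stumps)
    return votes.most_common(1)[0][0]

print(predict(1.0), predict(8.0))  # points on either side of the boundary
```

Because each stump sees a slightly different resample, their individual thresholds vary, but the vote smooths out that variance; this is the mechanism by which bagging reduces overfitting for high-variance learners.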
