Weights of each sample remain the same for each subsequent weak learner in an AdaBoost model.

a. True
b. False

1 Answer


Final answer:

The statement is false; in an AdaBoost model, weights of samples are updated after each iteration to focus subsequent learners on misclassified samples.

Step-by-step explanation:

The statement 'Weights of each sample remain the same for each subsequent weak learner in an AdaBoost model' is false. In the AdaBoost algorithm, the sample weights are updated after every iteration based on whether the current weak learner classified each sample correctly. If a sample is misclassified, its weight is increased; correctly classified samples effectively lose weight, since the weights are renormalized to sum to one after each round. As a result, subsequent weak learners focus more on the difficult cases. This iterative reweighting is what drives the ensemble to improve on the most challenging samples.
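To make the reweighting step concrete, here is a minimal sketch of one classic (binary, labels in {-1, +1}) AdaBoost weight update. The function name and variable names are illustrative, not from any particular library:

```python
import numpy as np

def adaboost_weight_update(weights, y_true, y_pred):
    """One round of the classic binary AdaBoost weight update.

    weights : current sample weights (non-negative, summing to 1)
    y_true, y_pred : true labels and the weak learner's predictions, in {-1, +1}
    Returns the renormalized weights and the learner's vote alpha.
    """
    miss = y_true != y_pred                     # mask of misclassified samples
    err = np.sum(weights[miss])                 # weighted error of this learner
    alpha = 0.5 * np.log((1.0 - err) / err)     # learner weight (assumes 0 < err < 0.5)
    # Misclassified samples are scaled up by exp(alpha), correct ones down by exp(-alpha)
    new_w = weights * np.exp(alpha * np.where(miss, 1.0, -1.0))
    return new_w / new_w.sum(), alpha           # renormalize to sum to 1

# Example: 4 equally weighted samples, the last one misclassified
w = np.full(4, 0.25)
y = np.array([1, 1, -1, -1])
pred = np.array([1, 1, -1, 1])
w_new, alpha = adaboost_weight_update(w, y, pred)
```

After this round the misclassified sample's weight rises from 0.25 to 0.5, so the next weak learner is trained with twice as much emphasis on it, which is exactly why the statement in the question is false.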

answered by Finisinfinitatis (8.6k points)