Answer:
Simply put, AdaBoost is best used when you want to boost the performance of decision trees on binary classification problems. In other words, AdaBoost extends a problem you would otherwise solve with decision trees, and the same idea applies to XGBoost: both are ensembles of boosted trees.
Step-by-step explanation:
1. AdaBoost trains a sequence of weak learners, typically shallow decision trees ("stumps").
2. After each round, it increases the weights of the training examples the previous learner misclassified, so the next learner focuses on the hard cases.
3. The final prediction is a weighted vote of all the weak learners, which usually performs much better than any single tree.
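As a minimal sketch of this idea, here is AdaBoost applied to a synthetic binary classification problem using scikit-learn (assuming scikit-learn is installed; the dataset is generated just for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (illustrative only)
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# AdaBoost uses shallow decision trees (stumps) as its default weak learners
clf = AdaBoostClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)

score = clf.score(X_test, y_test)
print(round(score, 2))
```

Swapping in `xgboost.XGBClassifier` on the same data would follow the same fit/score pattern, since both are boosted-tree ensembles.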