A Decision Tree is a flowchart-like structure used for both classification and regression. Each internal node represents a test on an attribute (e.g., "is the customer older than 30?"), each branch represents an outcome of that test, and each leaf node holds a class label (or, for regression, a predicted value). Decision trees are intuitive and easy to interpret, mimicking human decision-making.
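To make the internal-node "test" concrete, here is a minimal from-scratch sketch of how a learner might pick a single split by minimizing Gini impurity. The helper names `gini` and `best_split` are hypothetical illustrations, not any specific library's API; real implementations (e.g., CART) apply this search recursively.

```python
# Hypothetical sketch: choosing one decision-tree split by Gini impurity.
from collections import Counter

def gini(labels):
    """Gini impurity of a list of class labels (0 = pure node)."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(rows, labels):
    """Return the (feature, threshold) test that most reduces weighted Gini impurity."""
    best, best_score = None, gini(labels)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            if not left or not right:
                continue  # a split must send samples both ways
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
            if score < best_score:
                best, best_score = (f, t), score
    return best

# Toy data: feature 0 is customer age; label is whether they bought.
X = [[25], [28], [35], [42], [19], [31]]
y = ["no", "no", "yes", "yes", "no", "yes"]
print(best_split(X, y))  # → (0, 28), i.e. the test "age <= 28?"
```

A full tree would repeat this search on each resulting subset until the leaves are pure enough or a depth limit is hit.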
Decision tree learning algorithms like ID3 (1986) and C4.5 (1993) were developed by Ross Quinlan. CART (Classification and Regression Trees) was introduced by Breiman et al. in 1984.
While simple decision trees are prone to overfitting, they are the building blocks of powerful ensemble methods like Random Forests and Gradient Boosting Machines (XGBoost), which are dominant in tabular data competitions.
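The bagging idea behind Random Forests can be sketched with one-split "stump" trees trained on bootstrap resamples and combined by majority vote. The helper names `fit_stump` and `bagged_predict` are hypothetical, and a real Random Forest also subsamples features at each split; this only illustrates how averaging many high-variance trees dampens overfitting.

```python
# Hypothetical bagging sketch: many stumps on bootstrap samples, majority vote.
import random
from collections import Counter

def majority(labels):
    """Most common label in a list."""
    return Counter(labels).most_common(1)[0][0]

def fit_stump(X, y):
    """Fit a one-split tree on feature 0: (threshold, left_label, right_label)."""
    best, best_acc = None, -1.0
    for t in sorted({x[0] for x in X}):
        left = [yi for xi, yi in zip(X, y) if xi[0] <= t]
        right = [yi for xi, yi in zip(X, y) if xi[0] > t]
        if not left or not right:
            continue
        ll, rl = majority(left), majority(right)
        acc = sum((ll if xi[0] <= t else rl) == yi
                  for xi, yi in zip(X, y)) / len(y)
        if acc > best_acc:
            best, best_acc = (t, ll, rl), acc
    return best

def bagged_predict(stumps, x):
    """Majority vote across the ensemble of stumps."""
    return majority([ll if x[0] <= t else rl for t, ll, rl in stumps])

random.seed(0)
X = [[19], [25], [28], [31], [35], [42]]       # toy ages
y = ["no", "no", "no", "yes", "yes", "yes"]    # toy purchase labels
stumps = []
for _ in range(25):
    idx = [random.randrange(len(X)) for _ in X]  # bootstrap resample
    s = fit_stump([X[i] for i in idx], [y[i] for i in idx])
    if s:                                        # skip degenerate resamples
        stumps.append(s)
print(bagged_predict(stumps, [22]), bagged_predict(stumps, [40]))
```

Any single stump here can latch onto quirks of its resample, but the vote across 25 of them is far more stable; gradient boosting instead grows the trees sequentially, each one fitting the residual errors of the ensemble so far.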