- Model: XGBoost gradient boosting with comprehensive feature engineering
- Training data: 3 seasons of data
- Features: after encoding
- Primary target: points prediction
- Metric: improvement over the per-stat baseline MAE
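A minimal sketch of how one per-stat regressor could be trained under these assumptions. The file name, column names, split, and hyperparameters below are illustrative placeholders, not the exact V5 configuration:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

# Assumed input: one row per player-game with already-encoded engineered
# features plus the target column ("pts" in this sketch).
games = pd.read_csv("player_games_3_seasons.csv")  # hypothetical file name

feature_cols = [c for c in games.columns if c not in {"pts", "player_id", "game_date"}]
X_train, X_val, y_train, y_val = train_test_split(
    games[feature_cols], games["pts"], test_size=0.2, shuffle=False  # time-ordered split
)

# Illustrative hyperparameters; the real V5 settings are not listed in this section.
model = XGBRegressor(
    n_estimators=500,
    learning_rate=0.05,
    max_depth=6,
    subsample=0.8,
    colsample_bytree=0.8,
)
model.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)
val_pred = model.predict(X_val)
```

The same setup would be repeated per stat (reb, ast, stl, blk, tov, fg3m), swapping the target column.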
| Stat | Val MAE | Val RMSE | Val R² | Baseline MAE | Improvement | Status |
|---|---|---|---|---|---|---|
| pts | 4.80 | 6.40 | 0.650 | 5.60 | +14.3% | Strong |
| reb | 2.10 | 2.80 | 0.580 | 2.40 | +12.5% | Good |
| ast | 1.80 | 2.50 | 0.620 | 2.10 | +14.3% | Strong |
| stl | 0.70 | 0.90 | 0.350 | 0.80 | +12.5% | Weak |
| blk | 0.60 | 0.80 | 0.400 | 0.70 | +14.3% | Good |
| tov | 1.00 | 1.40 | 0.450 | 1.20 | +16.7% | Good |
| fg3m | 0.90 | 1.30 | 0.500 | 1.10 | +18.2% | Good |
MAE = Mean Absolute Error (average error in the stat's own units, e.g. points off per game). RMSE = Root Mean Square Error (penalizes large errors more heavily). R² = coefficient of determination (1.0 = perfect fit). Improvement = percentage reduction in MAE relative to the baseline.
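For reference, a short sketch of how the validation metrics and the Improvement column can be computed with scikit-learn, assuming arrays `y_val` and `val_pred` from the model above and a `baseline_mae` value such as the 5.60-point baseline in the table (the helper name is hypothetical):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def summarize(y_val, val_pred, baseline_mae):
    """Return MAE, RMSE, R², and percent MAE improvement over the baseline."""
    mae = mean_absolute_error(y_val, val_pred)
    rmse = np.sqrt(mean_squared_error(y_val, val_pred))
    r2 = r2_score(y_val, val_pred)
    improvement_pct = (baseline_mae - mae) / baseline_mae * 100.0
    return {"mae": mae, "rmse": rmse, "r2": r2, "improvement_pct": improvement_pct}

# Example from the points row: MAE 4.80 vs. baseline 5.60 gives roughly +14.3%.
```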
A head-to-head comparison will be added once V5 predictions are generated.