
XGBoost python custom objective function

According to the documentation (http://xgboost.readthedocs.io/en/latest/python/python_api.html), if we want to define a custom objective function, it should have the signature

objective(y_true, y_pred) -> grad, hess

where

hess: array_like of shape [n_samples]
The value of the second derivative for each sample point

But if we have a loss function that depends on N variables, the matrix of second derivatives should be an NxN matrix, whereas hess has shape only Nx1. Should we exclude the "cross-variable" derivatives? Or what else?
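For context, here is a quick numerical sketch (my own illustration, not from the original post) showing why an [n_samples] array can be enough: when the loss is a sum of independent per-sample terms, the off-diagonal ("cross-variable") second derivatives are zero, so only the diagonal carries information. The MSE-style loss and finite-difference Hessian below are illustrative choices:

```python
import numpy as np

# Hypothetical per-sample loss: L(x) = sum_i (x_i - y_i)**2,
# where x are the per-sample scores and y the targets.
y = np.array([1.0, 2.0, 3.0])

def loss(x):
    return np.sum((x - y) ** 2)

def numeric_hessian(f, x, eps=1e-4):
    """Finite-difference Hessian of f with respect to the vector x."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            x_pp = x.copy(); x_pp[i] += eps; x_pp[j] += eps
            x_pm = x.copy(); x_pm[i] += eps; x_pm[j] -= eps
            x_mp = x.copy(); x_mp[i] -= eps; x_mp[j] += eps
            x_mm = x.copy(); x_mm[i] -= eps; x_mm[j] -= eps
            H[i, j] = (f(x_pp) - f(x_pm) - f(x_mp) + f(x_mm)) / (4 * eps ** 2)
    return H

H = numeric_hessian(loss, np.array([0.5, 0.5, 0.5]))
# For a per-sample loss the off-diagonal entries vanish (up to numerical
# noise), so an [n_samples] array of diagonal values loses nothing.
```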

Evgeniy1089 asked Apr 22 '26 15:04

1 Answer

I think the derivative you have to take is with respect to the score that is returned by the Booster. It therefore results in one value per training example (= sample), giving a [n_samples] array. The score is what goes into your objective function, i.e. x in (x-m)**2 for MSE or 1/(1+exp(-x)) for the logistic function.
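As a concrete sketch (my own illustration, using the logistic loss mentioned above): grad and hess are the per-sample first and second derivatives of the loss with respect to each sample's raw score x, so both come out with shape [n_samples]:

```python
import numpy as np

def logistic_objective(y_true, y_pred):
    """Binary-logistic objective in the documented
    objective(y_true, y_pred) -> grad, hess form.

    y_pred holds the raw scores x returned by the booster;
    p = 1 / (1 + exp(-x)) is the predicted probability.
    For L = -[y*log(p) + (1-y)*log(1-p)]:
        dL/dx   = p - y
        d2L/dx2 = p * (1 - p)
    Each sample's derivatives depend only on its own score,
    hence both arrays have shape [n_samples].
    """
    p = 1.0 / (1.0 + np.exp(-y_pred))
    grad = p - y_true
    hess = p * (1.0 - p)
    return grad, hess

# Shape check: one value per sample, not an n x n matrix.
y = np.array([0.0, 1.0, 1.0])
scores = np.array([-0.5, 0.2, 1.3])
grad, hess = logistic_objective(y, scores)
```

In recent versions of the sklearn wrapper a callable like this can be passed as the objective (e.g. XGBClassifier(objective=logistic_objective)); the native xgb.train API instead expects the obj(preds, dtrain) signature, so the function would need a thin adapter there.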

Mischa Lisovyi answered Apr 24 '26 06:04