
Transform SHAP values from raw to native units with lightgbm Tweedie objective?

The utility of Shapley Additive Explanations (SHAP values) is to understand how each feature contributes to a model's prediction. For some objectives, such as regression with RMSE as an objective function, SHAP values are in the native units of the label values. For example, SHAP values could be expressed as USD if estimating housing costs. As you will see below, this is not the case for all objective functions. In particular, Tweedie regression objectives do not yield SHAP values in native units. This is a problem for interpretation, as we would want to know how housing costs are impacted by features in terms of +/- dollars.

Given this information, my question is: How do we transform the SHAP values of each individual feature into the data space of the target labels when explaining models with a Tweedie regression objective?

I'm not aware of any package that currently implements such a transformation. This remains unresolved in the package put out by the shap authors themselves.

I illustrate the finer points of this question with the R implementation of lightgbm in the following:

library(tweedie)
library(lightgbm)

set.seed(123)

tweedie_variance_power <- 1.2

labels <- rtweedie(1000, mu = 1, phi = 1, power = tweedie_variance_power)
hist(labels)

feat1 <- labels + rnorm(1000) #good signal for label with some noise
feat2 <- rnorm(1000) #garbage feature
feat3 <- rnorm(1000) #garbage feature

features <- cbind(feat1, feat2, feat3)

dTrain <- lgb.Dataset(data = features,
                      label = labels)

params <- c(objective = 'tweedie',
            tweedie_variance_power = tweedie_variance_power)

mod <- lgb.train(data = dTrain,
                 params = params,
                 nrounds = 100)

#Predictions in the native units of the labels
predsNative <- predict(mod, features, rawscore = FALSE)
#Predictions in the raw format
predsRaw <- predict(mod, features, rawscore = TRUE)

#We do not expect these values to be equal
all.equal(predsNative, predsRaw)
"Mean relative difference: 1.503072"

#We expect values to be equal if raw scores are exponentiated
all.equal(predsNative, exp(predsRaw))
"TRUE" #... our expectations are correct

#SHAP values 
shapNative <- predict(mod, features, rawscore = FALSE, predcontrib = TRUE)
shapRaw <- predict(mod, features, rawscore = TRUE, predcontrib = TRUE)
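#(the shap output has 4 columns: one contribution per feature,
# plus the base value in the last column)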

#Are there differences between shap values when rawscore is TRUE or FALSE?
all.equal(shapNative, shapRaw)
"TRUE" #outputs are identical, that is surprising!

#So are the shap values in raw or native formats?
#To answer this question we can sum them

#testing the raw case first
all.equal(rowSums(shapRaw), predsRaw)
"TRUE" 

#from this we can conclude that shap values are not in native units,
#regardless of whether rawscore is TRUE or FALSE

#Test native scores just to prove the point
all.equal(rowSums(shapNative), predsNative)
"Mean relative difference: 1.636892" # reaffirms that shap values are not in native units

#However, we can perform this operation on the raw shap scores
#to get the prediction in the native value
all.equal(exp(rowSums(shapRaw)), predsNative)
"TRUE"
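#(lightgbm's tweedie objective models the target on the log scale,
# so exp() of the raw score gives the prediction in native units)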

#reversing the operations does not yield the same result
all.equal(rowSums(exp(shapRaw)), predsNative)
"Mean relative difference: 0.7662481"

#The last line is relevant because it implies that the relationship
#between native predictions and exponentiated shap values is not linear
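#(e.g., exp(a + b) != exp(a) + exp(b); exponentiation does not distribute
# over sums, so summing exp(shap) values cannot reproduce the native prediction)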

#So, given that the point of SHAP is to understand how each
#feature impacts the prediction in its native units,
#the raw shap values are not as useful as they could be

#Thus, how would we convert each of these
#four raw shap value elements to native units,
#so that we can understand their contributions to the prediction
#in the currency of the native units?
shapRaw[1,]
-0.15429227  0.04858757 -0.27715359 -0.48454457

ORIGINAL POST AND EDIT

My understanding of SHAP values is that they are in the native units of the labels/response when conducting regression, and that the sum of the SHAP values approximates the model's prediction.

I am trying to extract SHAP values from the LightGBM package, with a Tweedie regression objective, but find that the SHAP values are not in the native units of the labels and that they do not sum to the predicted values.

It appears that they must be exponentiated; is this correct?

Side note: I understand that the final column of the SHAP values matrix represents the base prediction, and must be added.

Reproducible example:

library(tweedie)
library(caret)
library(lightgbm)

set.seed(123)

tweedie_variance_power <- 1.2

labels <- rtweedie(1000, mu = 1, phi = 1, power = tweedie_variance_power)
hist(labels)

feat1 <- labels + rnorm(1000) #good signal for label with some noise
feat2 <- rnorm(1000) #garbage feature
feat3 <- rnorm(1000) #garbage feature

features <- cbind(feat1, feat2, feat3)

dTrain <- lgb.Dataset(data = features,
                      label = labels)

params <- c(objective = 'tweedie',
            tweedie_variance_power = tweedie_variance_power)

mod <- lgb.train(data = dTrain,
                 params = params,
                 nrounds = 100)

preds <- predict(mod, features)

plot(preds, labels,
     main = paste('RMSE =', 
                  RMSE(pred = preds, obs = labels)))

#shap values are summing to negative values?
shap_vals <- predict(mod, features, predcontrib = TRUE, rawscore = FALSE)
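#(the final column of shap_vals is the base value; it is the same for
# every row and is included in the rowSums below)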
shaps_sum <- rowSums(shap_vals)
plot(shaps_sum, labels, 
     main = paste('RMSE =', 
                  RMSE(pred = shaps_sum, obs = labels)))

#maybe we need to exponentiate?
shap_vals_exp <- exp(shap_vals)
shap_vals_exp_sum <- rowSums(shap_vals_exp)
#still looks a little weird, overpredicting 
plot(shap_vals_exp_sum, labels,
     main = paste('RMSE =',
                  RMSE(pred = shap_vals_exp_sum, obs = labels)))

EDIT

The order of operations is to sum first and then exponentiate the SHAP values, which will give you the predictions in native units. Though I am still unclear on how to transform the feature-level values to the native response units.

shap_vals_sum_exp <- exp(shaps_sum)
plot(shap_vals_sum_exp, labels,
     main = paste('RMSE =',
                  RMSE(pred = shap_vals_sum_exp, obs = labels)))
asked Sep 05 '25 by kdoherty


1 Answer

I will show how to reconcile shap values and model predictions in Python, both in raw scores and in original units. Hopefully this will help you understand what is going on in your R code.

Step 1. Generate dataset

# pip install tweedie
import numpy as np
import tweedie

y = tweedie.tweedie(1.2, 1, 1).rvs(size=1000)
X = np.random.randn(1000, 3)

Step 2. Fit model

from lightgbm.sklearn import LGBMRegressor
lgb = LGBMRegressor(objective = 'tweedie')
lgb.fit(X,y)

Step 3. Understand what shap values are.

Shap values for 0th data point

shap_values = lgb.predict(X, pred_contrib=True)
shap_values[0]
array([ 0.36841812, -0.15985678,  0.28910617, -0.27317984])

The first 3 are the feature contributions relative to the baseline, i.e. the shap values themselves:

shap_values[0,:3].sum()
0.4976675073764354

The 4th is the baseline in raw scores:

shap_values[0,3]
-0.2731798364061747

Their sum adds up to the model prediction in raw scores:

shap_values[0,:3].sum() + shap_values[0,3]
0.22448767097026068

Let's check against raw model predictions:

preds = lgb.predict(X, raw_score=True)
preds[0]
0.2244876709702609

EDIT. Conversion between raw scores and original units

To convert between raw scores and original units for the Tweedie (and also the Poisson and Gamma) distributions, you need to be aware of 2 facts:

  1. Original is exp of raw
  2. exp of sum is product of exps

Demo:

  1. 0th prediction in original units:
lgb.predict([X[0,:]])
array([0.39394102])
  2. Shap values for 0th row in raw score space:
shap_values = lgb.predict(X, pred_contrib=True, raw_score=True)
shap_values[0]
array([-0.77194274, -0.08343294,  0.22740536, -0.30358374])
  3. Conversion of shap values to original units (product of exponents):
np.prod(np.exp(shap_values[0]))
0.3939410249402226

Looks similar to me again.
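
Following from the same two facts, the per-feature contributions can also be read in original units as multiplicative factors rather than additive terms. A minimal sketch, reusing shap_values from the raw-score demo above (the factors name is just for illustration):

# exponentiating each raw shap value (including the baseline term)
# gives per-feature multiplicative factors in original units
factors = np.exp(shap_values[0])

# their product is the prediction in original units shown above
np.prod(factors)
0.3939410249402226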

answered Sep 07 '25 by Sergey Bushmanov