optuna_integration.ShapleyImportanceEvaluator
- class optuna_integration.ShapleyImportanceEvaluator(*, n_trees=64, max_depth=64, seed=None)[source]
Shapley (SHAP) parameter importance evaluator.
This evaluator fits a random forest regression model that predicts the objective values of COMPLETE trials given their parameter configurations. Feature importances are then computed as the mean absolute SHAP values.
Note
This evaluator requires the sklearn Python package and SHAP. The model for the SHAP calculation is based on sklearn.ensemble.RandomForestRegressor.
- Parameters:
n_trees (int) – Number of trees in the random forest.
max_depth (int) – The maximum depth of each tree in the random forest.
seed (int | None) – Seed for the random forest.
Note
Added in v3.0.0 as an experimental feature. The interface may change in newer versions without prior notice. See https://github.com/optuna/optuna/releases/tag/v3.0.0.
Methods
evaluate(study[, params, target])
Evaluate parameter importances based on completed trials in the given study.
- evaluate(study, params=None, *, target=None)[source]
Evaluate parameter importances based on completed trials in the given study.
Note
This method is not meant to be called by library users.
See also
Please refer to
get_param_importances()
for how a concrete evaluator should implement this method.
- Parameters:
study (Study) – An optimized study.
params (list[str] | None) – A list of names of parameters to assess. If None, all parameters that are present in all of the completed trials are assessed.
target (Callable[[FrozenTrial], float] | None) – A function to specify the value to evaluate importances. If it is None and study is being used for single-objective optimization, the objective values are used. Can also be used for other trial attributes, such as the duration, like target=lambda t: t.duration.total_seconds().
Note
Specify this argument if study is being used for multi-objective optimization. For example, to get the hyperparameter importance of the first objective, use target=lambda t: t.values[0] for the target parameter.
- Returns:
A dict where the keys are parameter names and the values are assessed importances.
- Return type:
dict[str, float]