API Reference for Optuna-Integration
The Optuna-Integration package contains classes used to integrate Optuna with external machine learning frameworks.
All of these classes can be imported in two ways. One is “from optuna.integration import xxx”, as if the class were a module inside Optuna itself, and the other is “from optuna_integration import xxx”, as a module of the Optuna-Integration package. The former is provided for backward compatibility.
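For example, both of the following statements import the same MLflowCallback class (any class listed below could be substituted); a minimal sketch:

```python
# Both import paths resolve to the same class; the optuna.integration path
# is kept only for backward compatibility with older code.
from optuna.integration import MLflowCallback   # backward-compatible path
from optuna_integration import MLflowCallback   # Optuna-Integration package path
```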
For most of the ML frameworks supported by Optuna, the corresponding integration class only implements a callback object and functions, compliant with the framework’s specific callback API, that are invoked at each intermediate step of model training. The functionality implemented in these callbacks across the different ML frameworks includes the following (see the sketch after this list):
- reporting intermediate model scores back to the Optuna trial using optuna.trial.Trial.report,
- pruning the current model by raising optuna.TrialPruned, depending on the result of optuna.trial.Trial.should_prune, and
- reporting intermediate Optuna data, such as the current trial number, back to the framework, as done in MLflowCallback.
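The following is a minimal sketch of the report-and-prune pattern that these callbacks implement on the user's behalf, written here directly inside an objective function; the quadratic "score" is only a stand-in for a real training metric:

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0.0, 10.0)
    score = 0.0
    for step in range(100):
        # Stand-in for one training step and its intermediate validation score.
        score = (step + 1) / 100.0 * 1.0 / (1.0 + (x - 2.0) ** 2)
        # Report the intermediate score back to the trial ...
        trial.report(score, step)
        # ... and prune the trial if the pruner judges it unpromising.
        if trial.should_prune():
            raise optuna.TrialPruned()
    return score


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)
```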
For scikit-learn, an integrated OptunaSearchCV estimator is available that combines scikit-learn BaseEstimator functionality with access to a class-level Study object.
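A minimal sketch of OptunaSearchCV with a scikit-learn SVC; the dataset, search ranges, and trial budget are illustrative only:

```python
import optuna
from optuna_integration import OptunaSearchCV
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Search spaces are given as Optuna distributions rather than the lists or
# scipy distributions used by scikit-learn's own search estimators.
param_distributions = {
    "C": optuna.distributions.FloatDistribution(1e-10, 1e10, log=True),
    "gamma": optuna.distributions.FloatDistribution(1e-10, 1e10, log=True),
}

search = OptunaSearchCV(SVC(), param_distributions, n_trials=20, cv=3)
search.fit(X, y)

print(search.best_params_)
print(search.best_score_)
print(search.study_.best_trial)  # the underlying Study object is also exposed
```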
AllenNLP
AllenNLPExecutor: AllenNLP extension to use Optuna with a Jsonnet config file.
dump_best_config: Save a JSON config file with environment variables and the best performing hyperparameters.
AllenNLPPruningCallback: AllenNLP callback to prune unpromising trials.
BoTorch
BoTorchSampler: A sampler that uses BoTorch, a Bayesian optimization library built on top of PyTorch.
ehvi_candidates_func: Expected Hypervolume Improvement (EHVI).
logei_candidates_func: Log Expected Improvement (LogEI).
qei_candidates_func: Quasi MC-based batch Expected Improvement (qEI).
qnei_candidates_func: Quasi MC-based batch Noisy Expected Improvement (qNEI).
qkg_candidates_func: Quasi MC-based batch Knowledge Gradient (qKG).
qehvi_candidates_func: Quasi MC-based batch Expected Hypervolume Improvement (qEHVI).
qnehvi_candidates_func: Quasi MC-based batch Noisy Expected Hypervolume Improvement (qNEHVI).
qparego_candidates_func: Quasi MC-based extended ParEGO (qParEGO) for constrained multi-objective optimization.
qhvkg_candidates_func: Quasi MC-based batch Hypervolume Knowledge Gradient (qHVKG).
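A minimal sketch of using BoTorchSampler for a single-objective study; it requires the botorch package to be installed, and the objective function and trial budget below are illustrative only:

```python
import optuna
from optuna_integration import BoTorchSampler


def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    y = trial.suggest_float("y", -10.0, 10.0)
    return (x - 2.0) ** 2 + (y + 3.0) ** 2


# The first n_startup_trials trials are sampled independently before the
# BoTorch surrogate model and acquisition function take over.
sampler = BoTorchSampler(n_startup_trials=5)
study = optuna.create_study(sampler=sampler)
study.optimize(objective, n_trials=30)
print(study.best_params)
```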
CatBoost
CatBoostPruningCallback: Callback for CatBoost to prune unpromising trials.
Chainer
ChainerPruningExtension: Chainer extension to prune unpromising trials.
ChainerMNStudy: A wrapper of Study to incorporate Optuna with ChainerMN.
Comet
CometCallback: A callback for logging Optuna study trials to a Comet ML Experiment.
Dask
DaskStorage: Dask-compatible storage class.
fast.ai
FastAIV2PruningCallback: FastAI callback to prune unpromising trials for fastai.
FastAIPruningCallback: alias of FastAIV2PruningCallback.
Keras
KerasPruningCallback: Keras callback to prune unpromising trials.
LightGBM
LightGBMPruningCallback: Callback for LightGBM to prune unpromising trials.
train: Wrapper of the LightGBM Training API to tune hyperparameters.
LightGBMTuner: Hyperparameter tuner for LightGBM.
LightGBMTunerCV: Hyperparameter tuner for LightGBM with cross-validation.
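A minimal sketch of LightGBMPruningCallback inside an objective function; the dataset, metric, and search space below are illustrative only:

```python
import lightgbm as lgb
import optuna
from optuna_integration import LightGBMPruningCallback
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def objective(trial):
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)
    dtrain = lgb.Dataset(X_train, label=y_train)
    dvalid = lgb.Dataset(X_valid, label=y_valid, reference=dtrain)

    params = {
        "objective": "binary",
        "metric": "auc",
        "verbosity": -1,
        "num_leaves": trial.suggest_int("num_leaves", 2, 256),
    }

    # Reports the "auc" value of the first validation set after every boosting
    # round and prunes the trial if it is unpromising.
    pruning_callback = LightGBMPruningCallback(trial, "auc")
    booster = lgb.train(
        params,
        dtrain,
        num_boost_round=100,
        valid_sets=[dvalid],
        callbacks=[pruning_callback],
    )
    preds = booster.predict(X_valid)
    return roc_auc_score(y_valid, preds)


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)
```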
MLflow
MLflowCallback: Callback to track Optuna trials with MLflow.
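A minimal sketch of attaching MLflowCallback to a study; it requires mlflow to be installed, and the tracking URI, metric name, and study name below are illustrative only:

```python
import optuna
from optuna_integration import MLflowCallback

# Logs each finished trial as an MLflow run under the given tracking URI.
mlflc = MLflowCallback(tracking_uri="./mlruns", metric_name="objective_value")


def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2


study = optuna.create_study(study_name="mlflow-demo")
study.optimize(objective, n_trials=10, callbacks=[mlflc])
```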
MXNet
MXNetPruningCallback: MXNet callback to prune unpromising trials.
pycma
PyCmaSampler: A Sampler using the cma library as the backend.
PyTorch
PyTorchIgnitePruningHandler: PyTorch Ignite handler to prune unpromising trials.
PyTorchLightningPruningCallback: PyTorch Lightning callback to prune unpromising trials.
TorchDistributedTrial: A wrapper of Trial to incorporate Optuna with PyTorch distributed.
SHAP
ShapleyImportanceEvaluator: Shapley (SHAP) parameter importance evaluator.
sklearn
OptunaSearchCV: Hyperparameter search with cross-validation.
skorch
SkorchPruningCallback: Skorch callback to prune unpromising trials.
TensorBoard
TensorBoardCallback: Callback to track Optuna trials with TensorBoard.
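A minimal sketch of TensorBoardCallback; the log directory and metric name below are illustrative only:

```python
import optuna
from optuna_integration import TensorBoardCallback

# Writes each trial's hyperparameters and objective value to TensorBoard logs.
tbc = TensorBoardCallback("logs/", metric_name="value")


def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2


study = optuna.create_study()
study.optimize(objective, n_trials=10, callbacks=[tbc])
```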
TensorFlow
TFKerasPruningCallback: tf.keras callback to prune unpromising trials.
Weights & Biases
WeightsAndBiasesCallback: Callback to track Optuna trials with Weights & Biases.
XGBoost
XGBoostPruningCallback: Callback for XGBoost to prune unpromising trials.
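A minimal sketch of XGBoostPruningCallback inside an objective function; the dataset, metric, and search space below are illustrative only:

```python
import optuna
import xgboost as xgb
from optuna_integration import XGBoostPruningCallback
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split


def objective(trial):
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)
    dtrain = xgb.DMatrix(X_train, label=y_train)
    dvalid = xgb.DMatrix(X_valid, label=y_valid)

    params = {
        "objective": "binary:logistic",
        "eval_metric": "auc",
        "max_depth": trial.suggest_int("max_depth", 2, 10),
        "eta": trial.suggest_float("eta", 1e-3, 0.3, log=True),
    }

    # Reports the "validation-auc" entry of the evals list after each boosting
    # round and prunes the trial if it is unpromising.
    pruning_callback = XGBoostPruningCallback(trial, "validation-auc")
    booster = xgb.train(
        params,
        dtrain,
        num_boost_round=100,
        evals=[(dvalid, "validation")],
        callbacks=[pruning_callback],
    )
    preds = booster.predict(dvalid)
    return roc_auc_score(y_valid, preds)


study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)
```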