XGBoost

XGBoost is a popular Gradient Boosting library with a Python interface. eli5 supports eli5.explain_weights() and eli5.explain_prediction() for XGBClassifier and XGBRegressor estimators. It is tested for xgboost >= 0.6a2.

eli5.explain_weights() uses feature importances. Additional arguments for XGBClassifier and XGBRegressor:

  • importance_type defines how feature importance is computed (see the sketch below). Possible values are:
    • ‘gain’ - the average gain of the feature when it is used in trees (default)
    • ‘weight’ - the number of times a feature is used to split the data across all trees
    • ‘cover’ - the average coverage of the feature when it is used in trees

target_names and targets arguments are ignored.
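
A minimal sketch of switching importance_type (the iris dataset, the variable names and the 50-tree setting are illustrative assumptions, not part of eli5's API):

    from sklearn.datasets import load_iris
    from xgboost import XGBClassifier
    import eli5

    X, y = load_iris(return_X_y=True)
    clf = XGBClassifier(n_estimators=50).fit(X, y)

    # Rank features by how often they are used to split the data,
    # instead of the default average gain.
    expl = eli5.explain_weights(clf, importance_type='weight')
    print(eli5.format_as_text(expl))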

For eli5.explain_prediction() eli5 uses an approach based on ideas from http://blog.datadive.net/interpreting-random-forests/ : feature weights are calculated by following decision paths in trees of an ensemble. Each node of a tree has an output score, and the contribution of a feature on the decision path is how much the score changes from parent to child.
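For example, a minimal sketch that explains a single prediction (reusing the same illustrative iris setup as above):

    from sklearn.datasets import load_iris
    from xgboost import XGBClassifier
    import eli5

    X, y = load_iris(return_X_y=True)
    clf = XGBClassifier(n_estimators=50).fit(X, y)

    # eli5 follows the decision path of each tree for this one sample
    # and attributes parent-to-child score changes to features.
    expl = eli5.explain_prediction(clf, X[0])
    print(eli5.format_as_text(expl))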

Additional eli5.explain_prediction() keyword arguments supported for XGBClassifier and XGBRegressor:

  • vec is a vectorizer instance used to transform raw features to the input of the estimator xgb (e.g. a fitted CountVectorizer instance); you can pass it instead of feature_names.
  • vectorized is a flag which tells eli5 if doc should be passed through vec or not. By default it is False, meaning that if vec is not None, vec.transform([doc]) is passed to the estimator. Set it to True if you’re passing vec, but doc is already vectorized.
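
A minimal sketch of both arguments on a tiny text classification task (the two-phrase corpus and model settings are made up for illustration):

    from sklearn.feature_extraction.text import CountVectorizer
    from xgboost import XGBClassifier
    import eli5

    docs = ['good movie, great acting', 'bad movie, awful acting'] * 10
    labels = [1, 0] * 10

    vec = CountVectorizer()
    X = vec.fit_transform(docs)
    clf = XGBClassifier(n_estimators=20).fit(X, labels)

    # Default: vectorized=False, so eli5 calls vec.transform([doc]) itself.
    expl = eli5.explain_prediction(clf, 'great movie', vec=vec)
    print(eli5.format_as_text(expl))

    # If doc is already transformed, set vectorized=True so eli5 skips
    # the transform step but still uses vec's feature names.
    doc_vec = vec.transform(['great movie'])
    expl = eli5.explain_prediction(clf, doc_vec, vec=vec, vectorized=True)
    print(eli5.format_as_text(expl))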

See the tutorial for a more detailed usage example.