XGBoost is a popular gradient boosting library with a Python interface. eli5 supports explanations
for XGBClassifier and XGBRegressor estimators. It is tested for
xgboost >= 0.6a2.
importance_type is an argument which controls how feature importance is computed. Possible values are:
- ‘gain’ - the average gain of the feature when it is used in trees (default)
- ‘weight’ - the number of times a feature is used to split the data across all trees
- ‘cover’ - the average coverage of the feature when it is used in trees
Target-related arguments are ignored.
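The three importance_type values above can be illustrated with a small sketch. The split records and the helper below are made up for illustration; they mimic the definitions ('weight' counts splits, 'gain' and 'cover' average over splits) rather than xgboost's real internals.

```python
from collections import defaultdict

def feature_importances(splits, importance_type="gain"):
    """splits: list of (feature, gain, cover) tuples, one per tree split.

    Toy stand-in for the documented importance_type semantics:
    'weight' counts splits, 'gain'/'cover' average per-split values.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for feature, gain, cover in splits:
        counts[feature] += 1
        if importance_type == "weight":
            totals[feature] += 1      # number of times used to split
        elif importance_type == "gain":
            totals[feature] += gain
        elif importance_type == "cover":
            totals[feature] += cover
    if importance_type in ("gain", "cover"):
        # average over the splits where the feature was used
        return {f: totals[f] / counts[f] for f in totals}
    return dict(totals)

splits = [("f0", 2.0, 10), ("f0", 4.0, 30), ("f1", 1.0, 5)]
print(feature_importances(splits, "weight"))  # {'f0': 2.0, 'f1': 1.0}
print(feature_importances(splits, "gain"))    # {'f0': 3.0, 'f1': 1.0}
```

Here 'f0' is used in two splits, so its 'weight' is 2 and its 'gain' is the average of 2.0 and 4.0.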
For eli5.explain_prediction(), feature weights are calculated by following decision paths in the trees
of an ensemble. Each node of a tree has an output score, and the
contribution of a feature on the decision path is how much the score changes
from parent to child.
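The parent-to-child score deltas can be sketched in a few lines. The path representation and the node scores below are invented for illustration (not eli5's actual data structures): each path entry holds the feature its split tests and the node's output score.

```python
def path_contributions(path):
    """path: list of (split_feature, node_score) from root to leaf;
    the leaf's feature is None. A feature's contribution is the score
    change from each parent node to the child the sample moves to.
    """
    contributions = {}
    for (feature, parent_score), (_, child_score) in zip(path, path[1:]):
        delta = child_score - parent_score  # score change caused by this split
        contributions[feature] = contributions.get(feature, 0.0) + delta
    return contributions

# root score 0.0, split on 'age' -> 0.25, split on 'income' -> leaf 0.75
path = [("age", 0.0), ("income", 0.25), (None, 0.75)]
print(path_contributions(path))  # {'age': 0.25, 'income': 0.5}
```

Note that the contributions sum to the leaf score minus the root score, which is why the per-feature weights add up to the estimator's output.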
vec is a vectorizer instance used to transform raw features to the input of the estimator
xgb (e.g. a fitted CountVectorizer instance); you can pass it instead of explicit feature names.

vectorized is a flag which tells eli5 whether doc should be passed through
vec or not. By default it is False, meaning that if vec is not None,
vec.transform([doc]) is passed to the estimator. Set it to True if you're passing vec but
doc is already vectorized.
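The interaction of vec and vectorized can be summarized with a minimal sketch (not eli5's actual code); DummyVec is a hypothetical stand-in for a fitted vectorizer:

```python
def prepare_doc(doc, vec=None, vectorized=False):
    """Mirror the documented behaviour: if vec is given and vectorized is
    False, transform doc with vec; otherwise doc is assumed to be ready."""
    if vec is not None and not vectorized:
        return vec.transform([doc])
    return doc

class DummyVec:
    # stand-in for a fitted vectorizer such as CountVectorizer
    def transform(self, docs):
        return [len(d) for d in docs]  # one toy "feature" per document

print(prepare_doc("hello", vec=DummyVec()))                     # [5]
print(prepare_doc([1, 0, 3], vec=DummyVec(), vectorized=True))  # [1, 0, 3]
```

With vectorized=True the vectorizer is skipped and doc is handed to the estimator unchanged.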
See the tutorial for a more detailed usage example.