Gradient Boosted Trees (GBT) is a generalized boosting algorithm introduced by
Jerome Friedman: http://www.salfordsystems.com/doc/GreedyFuncApproxSS.pdf .
In contrast to the AdaBoost.M1 algorithm, GBT can deal with both multiclass
classification and regression problems. Moreover, it can use any
differentiable loss function, and some popular ones are implemented.
Using decision trees (CvDTree) as base learners allows processing of both
ordered and categorical variables.
The Gradient Boosted Trees model represents an ensemble of single regression trees built in a greedy fashion. The training procedure is an iterative process similar to numerical optimization via the gradient descent method. The summary loss on the training set depends only on the current model predictions for the training samples, in other words $\sum_{i=1}^{N} L(y_i, F(x_i)) \equiv \mathcal{L}(F(x_1), F(x_2), \dots, F(x_N)) \equiv \mathcal{L}(F)$. The gradient of $\mathcal{L}(F)$ can be computed as follows:

$$\mathrm{grad}(\mathcal{L}(F)) = \left( \frac{\partial L(y_1, F(x_1))}{\partial F(x_1)}, \frac{\partial L(y_2, F(x_2))}{\partial F(x_2)}, \dots, \frac{\partial L(y_N, F(x_N))}{\partial F(x_N)} \right).$$
At every training step, a single regression tree is built to predict the antigradient vector components. The step length is computed according to the loss function and separately for every region determined by a tree leaf; it can be eliminated by changing the values of the leaves directly.
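For example, with the squared loss defined below, $L(y, F(x)) = \frac{1}{2}(y - F(x))^2$, each antigradient component reduces to an ordinary residual:

$$-\frac{\partial L(y_i, F(x_i))}{\partial F(x_i)} = y_i - F(x_i),$$

so in that case every boosting iteration simply fits a regression tree to the residuals of the current model.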
See below the main scheme of the training process:

1. Find the best constant model.
2. For each $i$ in $[1, M]$:
   1. Compute the antigradient.
   2. Grow a regression tree to predict the antigradient components.
   3. Set the leaf values.
   4. Add the tree to the model.
The following loss functions are implemented for regression problems:

- Squared loss (CvGBTrees::SQUARED_LOSS):
  $L(y, f(x)) = \dfrac{1}{2}(y - f(x))^2$

- Absolute loss (CvGBTrees::ABSOLUTE_LOSS):
  $L(y, f(x)) = |y - f(x)|$

- Huber loss (CvGBTrees::HUBER_LOSS):
  $L(y, f(x)) = \begin{cases} \delta \cdot \left( |y - f(x)| - \dfrac{\delta}{2} \right) & : |y - f(x)| > \delta \\ \dfrac{1}{2}(y - f(x))^2 & : |y - f(x)| \le \delta \end{cases}$,

  where $\delta$ is the $\alpha$-quantile estimation of $|y - f(x)|$. In the current implementation $\alpha = 0.2$.
The following loss functions are implemented for classification problems:

- Deviance or cross-entropy loss (CvGBTrees::DEVIANCE_LOSS):
  $K$ functions are built, one function for each output class, and
  $L(y, f_1(x), \dots, f_K(x)) = -\sum_{k=0}^{K} \mathbb{1}(y = k) \ln p_k(x)$,

  where $p_k(x) = \dfrac{\exp f_k(x)}{\sum_{i=1}^{K} \exp f_i(x)}$ is the estimation of the probability of $y = k$.

As a result, you get the following model:
$$f(x) = f_0 + \nu \cdot \sum_{i=1}^{M} T_i(x),$$

where $f_0$ is the initial guess (the best constant model) and $\nu$ is a regularization parameter from the interval $(0, 1]$, further called shrinkage.
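For instance, with $M = 3$ trees and shrinkage $\nu = 0.1$, the resulting model is

$$f(x) = f_0 + 0.1 \cdot \left( T_1(x) + T_2(x) + T_3(x) \right),$$

so every tree contributes only a tenth of its raw prediction; smaller shrinkage values slow learning down and typically require a larger weak_count.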
To get the GBT model prediction, you need to compute the sum of responses of all the trees in the ensemble. For regression problems, it is the answer. For classification problems, the result is $\arg\max_{i=1..K} f_i(x)$.
CvGBTreesParams : public CvDTreeParams

GBT training parameters.
The structure contains parameters for each single decision tree in the ensemble, as well as the whole model characteristics. The structure is derived from CvDTreeParams, but not all of the decision tree parameters are supported: cross-validation, pruning, and class priorities are not used.
CvGBTreesParams::CvGBTreesParams()

CvGBTreesParams::CvGBTreesParams(int loss_function_type, int weak_count, float shrinkage, float subsample_portion, int max_depth, bool use_surrogates)

Parameters:

- loss_function_type – Type of the loss function used for training (see above). It must be one of CvGBTrees::SQUARED_LOSS, CvGBTrees::ABSOLUTE_LOSS, CvGBTrees::HUBER_LOSS, or CvGBTrees::DEVIANCE_LOSS. The first three types are used for regression problems, and the last one for classification.
- weak_count – Count of boosting algorithm iterations. weak_count*K is the total count of trees in the GBT model, where K is the output class count (equal to one in case of a regression).
- shrinkage – Regularization parameter (see the model formula above).
- subsample_portion – Portion of the whole training set used for each algorithm iteration. The subset is generated randomly.
- max_depth – Maximal depth of each decision tree in the ensemble (see CvDTree).
- use_surrogates – If true, surrogate splits are built (see CvDTree).
By default, the following constructor is used:

CvGBTreesParams(CvGBTrees::SQUARED_LOSS, 200, 0.01f, 0.8f, 3, false)
    : CvDTreeParams( 3, 10, 0, false, 10, 0, false, false, 0 )
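As an illustration, a non-default configuration for a classification problem might look as follows; the concrete values are arbitrary examples, not tuned recommendations:

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    // Illustrative, untuned values for a classification problem.
    CvGBTreesParams params(
        CvGBTrees::DEVIANCE_LOSS, // loss_function_type: the classification loss
        500,                      // weak_count: boosting iterations (500*K trees in total)
        0.05f,                    // shrinkage: regularization parameter from (0, 1]
        0.7f,                     // subsample_portion: 70% of the set per iteration
        2,                        // max_depth: shallow trees
        false );                  // use_surrogates: no surrogate splits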
CvGBTrees : public CvStatModel

The class implements the Gradient Boosted Trees model as described in the beginning of this section.
Default and training constructors.

CvGBTrees::CvGBTrees()

CvGBTrees::CvGBTrees(const Mat& trainData, int tflag, const Mat& responses, const Mat& varIdx=Mat(), const Mat& sampleIdx=Mat(), const Mat& varType=Mat(), const Mat& missingDataMask=Mat(), CvGBTreesParams params=CvGBTreesParams())

CvGBTrees::CvGBTrees(const CvMat* trainData, int tflag, const CvMat* responses, const CvMat* varIdx=0, const CvMat* sampleIdx=0, const CvMat* varType=0, const CvMat* missingDataMask=0, CvGBTreesParams params=CvGBTreesParams())

cv2.GBTrees([trainData, tflag, responses[, varIdx[, sampleIdx[, varType[, missingDataMask[, params]]]]]]) → <GBTrees object>

The constructors follow conventions of CvStatModel::CvStatModel(). See CvStatModel::train() for parameter descriptions.
Trains a Gradient Boosted Trees model.

bool CvGBTrees::train(const Mat& trainData, int tflag, const Mat& responses, const Mat& varIdx=Mat(), const Mat& sampleIdx=Mat(), const Mat& varType=Mat(), const Mat& missingDataMask=Mat(), CvGBTreesParams params=CvGBTreesParams(), bool update=false)

bool CvGBTrees::train(const CvMat* trainData, int tflag, const CvMat* responses, const CvMat* varIdx=0, const CvMat* sampleIdx=0, const CvMat* varType=0, const CvMat* missingDataMask=0, CvGBTreesParams params=CvGBTreesParams(), bool update=false)

bool CvGBTrees::train(CvMLData* data, CvGBTreesParams params=CvGBTreesParams(), bool update=false)

cv2.GBTrees.train(trainData, tflag, responses[, varIdx[, sampleIdx[, varType[, missingDataMask[, params[, update]]]]]]) → retval

The first train method follows the common template (see CvStatModel::train()).
Both tflag values (CV_ROW_SAMPLE, CV_COL_SAMPLE) are supported. trainData must be of the CV_32F type. responses must be a matrix of type CV_32S or CV_32F. In both cases it is converted into a CV_32F matrix inside the training procedure. varIdx and sampleIdx must be a list of indices (CV_32S) or a mask (CV_8U or CV_8S). update is a dummy parameter.
The second form of the CvGBTrees::train() function uses CvMLData as a data set container. update is still a dummy parameter.

All parameters specific to the GBT model are passed into the training function as a CvGBTreesParams structure.
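A minimal training sketch on synthetic one-dimensional regression data; the data and parameter values are illustrative assumptions, not part of the API description:

    #include <cmath>
    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    int main()
    {
        // Synthetic regression set: 100 samples of y = sin(x), one feature per row.
        cv::Mat trainData( 100, 1, CV_32F );  // must be CV_32F
        cv::Mat responses( 100, 1, CV_32F );  // CV_32F or CV_32S
        for( int i = 0; i < trainData.rows; i++ )
        {
            float x = 0.1f * i;
            trainData.at<float>(i, 0) = x;
            responses.at<float>(i, 0) = std::sin( x );
        }

        CvGBTreesParams params( CvGBTrees::SQUARED_LOSS, 100, 0.1f, 0.8f, 3, false );

        CvGBTrees gbt;
        // CV_ROW_SAMPLE: each row of trainData is one training sample.
        gbt.train( trainData, CV_ROW_SAMPLE, responses,
                   cv::Mat(), cv::Mat(), cv::Mat(), cv::Mat(), params );
        return 0;
    }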
Predicts a response for an input sample.

float CvGBTrees::predict(const Mat& sample, const Mat& missing=Mat(), const Range& slice=Range::all(), int k=-1) const

float CvGBTrees::predict(const CvMat* sample, const CvMat* missing=0, CvMat* weakResponses=0, CvSlice slice=CV_WHOLE_SEQ, int k=-1) const

cv2.GBTrees.predict(sample[, missing[, slice[, k]]]) → retval

Parameters:

- sample – Input feature vector that has the same format as every training set element. If not all the variables were actually used during training, sample contains forged values at the appropriate places.
- missing – Missing values mask, which is a matrix of the same size as sample having the CV_8U type. 1 corresponds to a missing value in the same position in the sample vector. If there are no missing values in the feature vector, an empty matrix can be passed instead of the missing mask.
- weakResponses – Matrix used to obtain predictions of all the trees. The matrix has K rows, where K is the count of output classes (1 for the regression case), and as many columns as the slice length.
- slice – Parameter defining the part of the ensemble used for prediction. If slice = Range::all(), all trees are used. Use this parameter to get predictions of GBT models with different ensemble sizes, learning only one model.
- k – Number of the tree ensemble used in case of a classification problem. Use this parameter to change the output to the sum of the trees' predictions in the k-th ensemble only. To get the total GBT model prediction, k must be -1. For regression problems, k is also equal to -1.
The method predicts the response corresponding to the given sample (see Predicting with the GBT Model). The result is either the class label or the estimated function value. The CvGBTrees::predict() method enables using the parallel version of the GBT model prediction if OpenCV is built with the TBB library. In this case, predictions of single trees are computed in parallel.
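A short usage sketch, assuming the trained regression model gbt from the training example above; the sample value and slice length are arbitrary:

    cv::Mat sample( 1, 1, CV_32F );
    sample.at<float>(0, 0) = 2.5f;

    // Full ensemble prediction; k = -1 is implied for regression.
    float fullPrediction = gbt.predict( sample );

    // Prediction using only the first 50 trees of the same model,
    // e.g. to compare ensembles of different sizes without retraining.
    float partialPrediction = gbt.predict( sample, cv::Mat(), cv::Range( 0, 50 ) );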
Clears the model.

void CvGBTrees::clear()

cv2.GBTrees.clear() → None

The function deletes the data set information and all the weak models and sets all internal variables to the initial state. The function is called in CvGBTrees::train() and in the destructor.
Calculates a training or testing error.

float CvGBTrees::calc_error(CvMLData* _data, int type, std::vector<float>* resp=0)

Parameters:

- _data – Data set.
- type – Parameter defining the error that should be computed: train (CV_TRAIN_ERROR) or test (CV_TEST_ERROR).
- resp – If non-zero, a vector of predictions on the corresponding data set is returned.
If the CvMLData data is used to store the data set, CvGBTrees::calc_error() can be used to get a training/testing error easily and (optionally) all predictions on the training/testing set. If the Intel TBB library is used, the error is computed in a parallel way, namely, predictions for different samples are computed at the same time. In case of a regression problem, a mean squared error is returned. For classification problems, the result is the misclassification error in percent.
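A sketch of this workflow, assuming a CSV file data.csv whose first column holds the responses; the file name, layout, and split portion are placeholders:

    #include <vector>
    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    int main()
    {
        CvMLData data;
        data.read_csv( "data.csv" );        // placeholder file name
        data.set_response_idx( 0 );         // responses are stored in column 0
        CvTrainTestSplit split( 0.8f );     // 80% train / 20% test split
        data.set_train_test_split( &split );

        CvGBTrees gbt;
        gbt.train( &data, CvGBTreesParams() );  // the CvMLData form of train()

        std::vector<float> testPredictions;
        float trainError = gbt.calc_error( &data, CV_TRAIN_ERROR );
        float testError  = gbt.calc_error( &data, CV_TEST_ERROR, &testPredictions );
        (void)trainError; (void)testError;  // use or print the errors as needed
        return 0;
    }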