
An Example of the AdaBoost Ensemble Learning Algorithm with Pruned Classifiers

Boosting algorithms aggregate a number of weak classifiers in order to build a powerful one. They assign a weight to every labeled sample: when a weak classifier fails to correctly classify a sample, the weight of that sample is boosted, and the next weak classifier is trained against the updated weights.
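As a toy illustration of this reweighting idea (hypothetical values, not from the source), a single exponential update raises the weight of the one misclassified sample:

```python
import numpy as np

y    = np.array([ 1, -1,  1])        # true labels
pred = np.array([ 1,  1,  1])        # a weak classifier got sample 1 wrong
w    = np.full(3, 1.0 / 3)           # uniform initial weights

theta = 0.5                          # illustrative weight of this weak classifier
w = w * np.exp(-theta * pred * y)    # misclassified sample is multiplied by exp(+theta)
w /= w.sum()                         # renormalize
print(w)                             # ~[0.21, 0.58, 0.21]: the failed sample is boosted
```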

Let's take AdaBoost with pruned classifiers as an example; a runnable sketch of the full loop follows the steps below.

For the training set $\{(x_i, y_i)\}_{i=1}^n$, initialize the sample weights $\{w_i\}_{i=1}^n$ to $1/n$, and let $f \leftarrow 0$.

For $j = 1, \dots, b$:

Based on the current sample weights $\{w_i\}_{i=1}^n$, pick the classifier with the smallest weighted error rate $r$: $\varphi_j = \operatorname*{argmin}_{\varphi} r(\varphi)$, where $r(\varphi) = \sum_{i=1}^{n} \frac{w_i}{2}\left(1 - \varphi(x_i)\, y_i\right)$.

Calculate the weight of classifier $\varphi_j$: $\theta_j = \frac{1}{2} \log \frac{1 - r(\varphi_j)}{r(\varphi_j)}$.

Update the aggregated classifier $f$: $f \leftarrow f + \theta_j \varphi_j$.

Update the sample weights $\{w_i\}_{i=1}^n$: $w_i \leftarrow \dfrac{\exp(-f(x_i)\, y_i)}{\sum_{k=1}^{n} \exp(-f(x_k)\, y_k)}, \quad \forall i = 1, 2, \dots, n$.
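Putting these steps together, here is a minimal runnable sketch in Python with NumPy. It uses axis-aligned decision stumps as the pruned weak classifiers and assumes labels $y_i \in \{-1, +1\}$; the exhaustive stump search, the `b=20` default, and the names `train_stump`, `adaboost`, and `predict_score` are illustrative assumptions, not from the source.

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustively pick the decision stump (axis-aligned threshold rule)
    with the smallest weighted error r = sum_i (w_i / 2) * (1 - phi(x_i) y_i)."""
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)                    # (error, feature, threshold, sign)
    for feat in range(d):
        for thr in np.unique(X[:, feat]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, feat] > thr, 1, -1)
                r = np.sum(w * (1 - pred * y) / 2)   # weighted 0/1 error
                if r < best[0]:
                    best = (r, feat, thr, sign)
    return best

def predict_score(X, stumps, thetas):
    """Aggregated score f(x) = sum_j theta_j * phi_j(x)."""
    f = np.zeros(X.shape[0])
    for (feat, thr, sign), theta in zip(stumps, thetas):
        f += theta * sign * np.where(X[:, feat] > thr, 1, -1)
    return f

def adaboost(X, y, b=20):
    """The AdaBoost loop described above; y must take values in {-1, +1}."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                       # initialize weights to 1/n
    stumps, thetas = [], []
    for _ in range(b):
        r, feat, thr, sign = train_stump(X, y, w)
        r = np.clip(r, 1e-10, 1.0 - 1e-10)        # guard against log(0)
        theta = 0.5 * np.log((1.0 - r) / r)       # classifier weight theta_j
        stumps.append((feat, thr, sign))
        thetas.append(theta)
        f = predict_score(X, stumps, thetas)      # current aggregated classifier
        w = np.exp(-f * y)                        # reweight the samples...
        w /= w.sum()                              # ...and normalize
    return stumps, thetas

if __name__ == "__main__":
    X = np.array([[1.0], [2.0], [3.0], [4.0]])
    y = np.array([1, 1, -1, -1])
    stumps, thetas = adaboost(X, y, b=5)
    print(np.sign(predict_score(X, stumps, thetas)))   # [ 1.  1. -1. -1.]
```

Note that normalizing $w_i$ against the full aggregated $f$ at every round, as in the last step above, is equivalent to multiplying the previous weights by $\exp(-\theta_j \varphi_j(x_i) y_i)$ and renormalizing, since the product of the per-round exponentials is the exponential of the sum.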

