TY - JOUR
T1 - Coarse-to-fine multiclass learning and classification for time-critical domains
AU - Susnjak, Teo
AU - Barczak, Andre
AU - Reyes, Napoleon
AU - Hawick, Ken
N1 - Funding Information:
This research was in part funded by the New Zealand Tertiary Education Commission.
PY - 2013/6/1
Y1 - 2013/6/1
AB - This paper presents a coarse-to-fine learning algorithm for multiclass problems. The algorithm is applied to ensemble-based learning by using boosting to construct cascades of classifiers. The goal is to address the training and detection runtime complexities found in an increasing number of classification domains. This research applies a separate-and-conquer strategy with respect to class labels in order to realize efficiency in both the training and detection phases under limited computational resources, without compromising accuracy. The paper demonstrates how popular, non-cascaded algorithms like AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC can be converted into robust cascaded classifiers. Additionally, a new multiclass weak learner is proposed that is custom-designed for cascaded training. Experiments were conducted on 18 publicly available datasets and showed that the cascaded algorithms achieved considerable speed-ups over the original AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC in both training and detection runtimes. The cascaded classifiers did not exhibit significant compromises in their generalization ability and in fact produced evidence of improved accuracies on datasets with biased class distributions.
UR - http://www.scopus.com/inward/record.url?scp=84893643386&partnerID=8YFLogxK
U2 - 10.1016/j.patrec.2013.01.011
DO - 10.1016/j.patrec.2013.01.011
M3 - Article
AN - SCOPUS:84893643386
SN - 0167-8655
VL - 34
SP - 884
EP - 894
JO - Pattern Recognition Letters
JF - Pattern Recognition Letters
IS - 8
ER -