Coarse-to-fine multiclass learning and classification for time-critical domains

Teo Susnjak*, Andre Barczak, Napoleon Reyes, Ken Hawick

*Corresponding author for this work

Research output: Contribution to journal › Article › Research › peer-review

Abstract

This paper presents a coarse-to-fine learning algorithm for multiclass problems. The algorithm is applied to ensemble-based learning by using boosting to construct cascades of classifiers. The goal is to address the training and detection runtime complexities found in an increasing number of classification domains. This research applies a separate-and-conquer strategy with respect to class labels, in order to achieve efficiency in both the training and detection phases under limited computational resources, without compromising accuracy. The paper demonstrates how popular, non-cascaded algorithms like AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC can be converted into robust cascaded classifiers. Additionally, a new multiclass weak learner is proposed that is custom-designed for cascaded training. Experiments were conducted on 18 publicly available datasets and showed that the cascaded algorithms achieved considerable speed-ups over the original AdaBoost.M2, AdaBoost.OC and AdaBoost.ECC in both training and detection runtimes. The cascaded classifiers did not exhibit significant compromises in their generalization ability and in fact showed evidence of improved accuracy on datasets with biased class distributions.
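
The abstract describes the coarse-to-fine cascade only at a high level. The sketch below is not the paper's algorithm; it is a minimal Python illustration of one way a separate-and-conquer cascade over class labels could look, assuming scikit-learn's AdaBoostClassifier as a stand-in for each boosted stage and an early-exit prediction rule. The class name CoarseToFineCascade and the peel-off-the-most-frequent-class-first ordering are illustrative choices, not taken from the paper.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split


class CoarseToFineCascade:
    """Toy coarse-to-fine cascade: each stage boosts a binary classifier that
    peels one class label off the problem; later stages only see the rest.
    Illustrative sketch only, not the algorithm proposed in the paper."""

    def __init__(self, n_estimators=50):
        self.n_estimators = n_estimators
        self.stages = []            # list of (class_label, fitted AdaBoost stage)
        self.fallback_label = None  # label assigned when no stage claims a sample

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        # Peel classes off from most to least frequent (an assumption made here).
        remaining = sorted(np.unique(y), key=lambda c: -np.sum(y == c))
        mask = np.ones(len(y), dtype=bool)
        while len(remaining) > 1:
            target = remaining.pop(0)
            # One-vs-rest boosted stage trained only on still-unclaimed samples.
            clf = AdaBoostClassifier(n_estimators=self.n_estimators)
            clf.fit(X[mask], (y[mask] == target).astype(int))
            self.stages.append((target, clf))
            mask &= (y != target)   # remove the peeled-off class from later stages
        self.fallback_label = remaining[0]
        return self

    def predict(self, X):
        X = np.asarray(X)
        preds = np.full(len(X), self.fallback_label)
        active = np.ones(len(X), dtype=bool)
        for label, clf in self.stages:
            if not active.any():
                break                          # early exit: everything classified
            hit = clf.predict(X[active]) == 1
            claimed = np.where(active)[0][hit]
            preds[claimed] = label             # claimed samples leave the cascade
            active[claimed] = False
        return preds


if __name__ == "__main__":
    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = CoarseToFineCascade(n_estimators=50).fit(X_tr, y_tr)
    print("test accuracy:", np.mean(model.predict(X_te) == y_te))

At detection time, samples claimed by an early stage skip all later stages, and later training stages see progressively fewer samples and classes; this is, in spirit, where a cascaded design obtains the training and detection speed-ups the abstract refers to.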

Original language: English
Pages (from-to): 884-894
Number of pages: 11
Journal: Pattern Recognition Letters
Volume: 34
Issue number: 8
DOIs
Publication status: Published - 1 Jun 2013
Externally published: Yes
