Accelerated classifier training using the PSL cascading structure

Teo Susnjak*, Andre L.C. Barczak

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

This paper addresses the problem of excessively long classifier training times associated with using the AdaBoost algorithm within the framework of a cascade of boosted ensembles (CoBE). We present new test results confirming the acceleration of the training phase and the robustness of the Parallel Strong classifier within the same Layer (PSL) training structure recently proposed in [1]. The findings demonstrate a speed-up of an order of magnitude over current training methods without a compromise in accuracy. We also present a modified version of the PSL training structure that further decreases the duration of the training phase while preserving accuracy.
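The PSL structure itself is detailed in the paper; as background, the conventional sequential cascade-of-boosted-ensembles training that the abstract identifies as the bottleneck can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' method: all function names, the decision-stump weak learner, and the toy data are assumptions made for the example.

```python
import math

def train_stump(X, y, w):
    """Pick the weighted-error-minimising decision stump (threshold, polarity)."""
    best = None
    for thr in sorted(set(X)):
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if (pol if xi >= thr else -pol) != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, rounds):
    """Standard discrete AdaBoost over decision stumps (labels in {-1, +1})."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)                      # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1.0 - err) / err)  # weak-learner weight
        ensemble.append((alpha, thr, pol))
        # Re-weight: misclassified samples gain weight for the next round.
        w = [wi * math.exp(-alpha * yi * (pol if xi >= thr else -pol))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(alpha * (pol if x >= thr else -pol) for alpha, thr, pol in ensemble)
    return 1 if score >= 0 else -1

def train_cascade(X, y, stages=3, rounds=3):
    """Train boosted layers one after another; each layer only sees the samples
    the earlier layers did not already reject (the CoBE idea)."""
    cascade, Xc, yc = [], list(X), list(y)
    for _ in range(stages):
        layer = adaboost(Xc, yc, rounds)
        cascade.append(layer)
        # Keep the positives plus the false positives that survived this layer.
        kept = [(xi, yi) for xi, yi in zip(Xc, yc)
                if yi == 1 or predict(layer, xi) == 1]
        Xc, yc = [p[0] for p in kept], [p[1] for p in kept]
        if all(yi == 1 for yi in yc):
            break  # no false positives left to train the next layer on
    return cascade

def cascade_predict(cascade, x):
    # A sample is accepted only if every layer accepts it.
    return 1 if all(predict(layer, x) == 1 for layer in cascade) else -1
```

Because each layer can only start training after every earlier layer has finished, the layers form a strictly sequential pipeline; it is this serialisation that makes CoBE training slow, and it is the stage the PSL structure restructures to obtain the reported order-of-magnitude speed-up.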

Original language: English
Title of host publication: Advances in Neuro-Information Processing - 15th International Conference, ICONIP 2008, Revised Selected Papers
Editors: Mario Köppen, Nikola Kasabov, George Coghill
Publisher: Springer
Pages: 945-952
Number of pages: 8
Volume: PART 1
ISBN (Print): 3642024890, 9783642024894
DOIs
Publication status: Published - 2009
Externally published: Yes
Event: 15th International Conference on Neuro-Information Processing, ICONIP 2008 - Auckland, New Zealand
Duration: 25 Nov 2008 - 28 Nov 2008
Conference number: 15th

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 5506 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 15th International Conference on Neuro-Information Processing, ICONIP 2008
Abbreviated title: ICONIP 2008
Country/Territory: New Zealand
City: Auckland
Period: 25/11/08 - 28/11/08
