Multi-objective Evolutionary Optimization of Exemplar-based Classifiers: A PNN Test Case
| Title | Multi-objective Evolutionary Optimization of Exemplar-based Classifiers: A PNN Test Case |
| Publication Type | Conference Paper |
| Year of Publication | 2011 |
| Authors | Rubio T, Zhang T, Georgiopoulos M, Kaylani A |
| Conference Name | The 2011 International Joint Conference on Neural Networks (IJCNN'11) |
| Conference Location | San Jose, CA |
This paper articulates the major principles for effectively designing a parameter-less, multi-objective evolutionary algorithm that optimizes a population of probabilistic neural network (PNN) classifier models; the PNN is an example of an exemplar-based classifier. These design principles are distilled from the experiences, discussed in this paper, that guided the creation of the parameter-less multi-objective evolutionary algorithm named MO-EPNN (multi-objective evolutionary probabilistic neural network). They are further corroborated by the similar principles used in an earlier design of a parameter-less, multi-objective genetic algorithm, named MO-GART (multi-objective genetically optimized ART), which optimizes a population of ART (adaptive resonance theory) models; the ART classifier is another example of an exemplar-based classifier model. MO-EPNN's performance is compared to that of other popular classifier models, such as SVM (support vector machines) and CART (classification and regression trees), as well as to an alternative competitive method for genetically optimizing the PNN. These comparisons indicate that MO-EPNN's performance (generalization on unseen data and model size) compares favorably to the aforementioned classifier models and to the alternative genetically optimized PNN approach. MO-EPNN's good performance, together with MO-GART's earlier reported good performance, both designs relying on the same principles, lends credence to the design principles delineated in this paper.
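For readers unfamiliar with the exemplar-based model the paper optimizes, the following is a minimal sketch of a plain PNN classifier (a Parzen-window density estimate per class). It assumes a single shared Gaussian smoothing parameter `sigma`; the paper's evolutionary algorithm would instead tune such parameters and the exemplar set itself, which is not reproduced here.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Classify each test point by the class whose exemplars yield the
    highest average Gaussian-kernel activation (Parzen density estimate)."""
    classes = np.unique(y_train)
    scores = np.empty((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        exemplars = X_train[y_train == c]
        # squared distances from every test point to this class's exemplars
        d2 = ((X_test[:, None, :] - exemplars[None, :, :]) ** 2).sum(-1)
        scores[:, j] = np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]

# toy usage: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[0.05, 0.05], [1.0, 0.95]])))  # → [0 1]
```

Since every training exemplar becomes a pattern-layer unit, the network's size grows with the training set, which is why the paper treats model size as an explicit objective alongside generalization.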
Acceptance rate 75% (468/620).