
Multitask Classification Hypothesis Space With Improved Generalization Bounds

Title: Multitask Classification Hypothesis Space With Improved Generalization Bounds
Publication Type: Journal Article
Year of Publication: 2015
Authors: Li C, Georgiopoulos M, Anagnostopoulos GC
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 26
Issue: 7
Pagination: 1468-1479
ISSN: 2162-237X
Keywords: Context, Hilbert space, Kernel, Learning systems, Machine learning, pattern recognition, statistical learning, supervised learning, Support vector machines, Training, Upper bound
Abstract

This paper presents a pair of hypothesis spaces (HSs) of vector-valued functions intended to be used in the context of multitask classification. While both are parameterized on the elements of reproducing kernel Hilbert spaces and impose a feature mapping that is common to all tasks, one of them assumes this mapping to be fixed, while the more general one learns the mapping via multiple kernel learning. For these new HSs, empirical Rademacher complexity-based generalization bounds are derived and shown to be tighter than the bound of a particular HS that has appeared recently in the literature, leading to improved performance. In fact, the latter HS is shown to be a special case of ours. Based on an equivalence to Group-Lasso type HSs, the proposed HSs are utilized toward corresponding support vector machine-based formulations. Finally, experimental results on multitask learning problems underline the quality of the derived bounds and validate this paper's analysis.
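For orientation, a minimal sketch of the kind of shared-feature-map multitask hypothesis space the abstract describes is given below. The notation (phi, w_t, theta_m, M kernels) is assumed here for illustration and is not taken from the article itself.

% Illustrative sketch only; symbols are assumed, not the paper's notation.
% Each task t = 1, ..., T has its own weight vector w_t, while all tasks
% share the feature map phi induced by a common reproducing kernel k.
\[
  f_t(x) \;=\; \langle w_t, \phi(x) \rangle_{\mathcal{H}}, \qquad t = 1, \dots, T .
\]
% In the multiple kernel learning variant, the shared kernel is itself learned
% as a conic combination of M pre-specified base kernels k_1, ..., k_M:
\[
  k(x, x') \;=\; \sum_{m=1}^{M} \theta_m \, k_m(x, x'), \qquad \theta_m \ge 0 .
\]

Under a shared structure of this kind, a group-norm penalty on the per-task weights is what the Group-Lasso equivalence mentioned in the abstract would typically give rise to, which in turn yields the support vector machine-based formulations.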

DOI: 10.1109/TNNLS.2014.2347054
