A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems
Title | A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems
Publication Type | Journal Article |
Year of Publication | 2014
Authors | Li C, Georgiopoulos M, Anagnostopoulos GC |
Journal | IEEE Transactions on Neural Networks and Learning Systems
Volume | 25 |
Issue | 7 |
Pagination | 1287-1297 |
Date Published | July |
ISSN | 2162-237X |
Keywords | Algorithm design and analysis, Closed-form solutions, Kernel, Learning systems, Machine learning, Optimization, optimization methods, pattern recognition, supervised learning, Support vector machines, support vector machines (SVMs), Vectors
Abstract | Over the past few years, multiple kernel learning (MKL) has received significant attention among data-driven feature selection techniques in the context of kernel-based learning. MKL formulations have been devised and solved for a broad spectrum of machine learning problems, including multitask learning (MTL). Solving different MKL formulations usually involves designing algorithms that are tailored to the problem at hand, which is, typically, a nontrivial accomplishment. In this paper we present a general multitask multiple kernel learning (MT-MKL) framework that subsumes well-known MT-MKL formulations, as well as several important MKL approaches on single-task problems. We then derive a simple algorithm that can solve the unifying framework. To demonstrate the flexibility of the proposed framework, we formulate a new learning problem, namely partially-shared common space MT-MKL, and demonstrate its merits through experimentation. |
DOI | 10.1109/TNNLS.2013.2291772 |
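For readers unfamiliar with the multiple kernel learning (MKL) setting the abstract refers to, the following is a minimal illustrative sketch of the general idea: learning a convex combination of base kernels alongside a kernel machine. It is not the paper's MT-MKL algorithm; the base kernels, the heuristic weight-update rule, and all variable names are assumptions made purely for illustration.

```python
# Toy MKL sketch (illustration only, not the algorithm from this paper).
# Alternates between (1) training an SVM on the current combined kernel and
# (2) re-weighting the base kernels, then renormalizing onto the simplex
# (an l1-style constraint, as in many classical MKL variants).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0).astype(int)

def rbf_gram(X, gamma):
    # Full RBF Gram matrix on the training set.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Base kernels: RBF kernels at a few bandwidths plus a linear kernel (assumed choices).
base_kernels = [rbf_gram(X, g) for g in (0.1, 1.0, 10.0)] + [X @ X.T]
theta = np.full(len(base_kernels), 1.0 / len(base_kernels))  # combination weights

for _ in range(10):
    K = sum(t * Km for t, Km in zip(theta, base_kernels))
    svm = SVC(C=1.0, kernel="precomputed").fit(K, y)
    alpha = np.zeros(len(y))
    alpha[svm.support_] = svm.dual_coef_.ravel()  # signed dual coefficients
    # Heuristic update: weight each kernel by its quadratic response alpha' K_m alpha,
    # then project back onto the simplex. Real MKL solvers use principled updates.
    resp = np.array([alpha @ Km @ alpha for Km in base_kernels])
    theta = np.maximum(resp, 1e-12)
    theta /= theta.sum()

print("learned kernel weights:", np.round(theta, 3))
```

The simplex normalization of the weights mirrors the sparsity-inducing l1 constraint common in MKL formulations; the unified framework in the paper generalizes this kind of regularization across tasks, and its actual optimization procedure is given in the article itself.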