
A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems

Title: A Unifying Framework for Typical Multitask Multiple Kernel Learning Problems
Publication Type: Journal Article
Year of Publication: 2013
Authors: Li C, Georgiopoulos M, Anagnostopoulos GC
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 25
Issue: 7
Pagination: 1287-1297
Date Published: July
ISSN: 2162-237X
Keywords: Algorithm design and analysis, Closed-form solutions, Kernel, Learning systems, Machine learning, Optimization, optimization methods, pattern recognition, supervised learning, Support vector machines, support vector machines (SVMs), Vectors
Abstract

Over the past few years, multiple kernel learning (MKL) has received significant attention among data-driven feature selection techniques in the context of kernel-based learning. MKL formulations have been devised and solved for a broad spectrum of machine learning problems, including multitask learning (MTL). Solving different MKL formulations usually involves designing algorithms that are tailored to the problem at hand, which is typically a nontrivial accomplishment. In this paper, we present a general multitask multiple kernel learning (MT-MKL) framework that subsumes well-known MT-MKL formulations, as well as several important MKL approaches on single-task problems. We then derive a simple algorithm that can solve the unifying framework. To demonstrate the flexibility of the proposed framework, we formulate a new learning problem, namely partially-shared common space MT-MKL, and show its merits through experimentation.

DOI: 10.1109/TNNLS.2013.2291772
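
As background for the MKL setup referenced in the abstract, the sketch below illustrates the generic multiple kernel learning idea: a kernel machine is trained on a convex combination of base kernels. The dataset, kernel choices, weights, and the SVC classifier are illustrative assumptions only; this is not the paper's unifying MT-MKL algorithm or its partially-shared common space formulation.

```python
# Minimal illustrative sketch of the generic MKL idea: a learner operates on a
# convex combination K = sum_m beta_m * K_m of base Gram matrices.
# NOTE: kernels, weights, data, and classifier below are assumptions for
# illustration; they do not reproduce the authors' MT-MKL algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.svm import SVC

# Toy binary classification data.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Base kernels K_m computed on the same samples.
base_kernels = [
    rbf_kernel(X, X, gamma=0.1),
    rbf_kernel(X, X, gamma=1.0),
    polynomial_kernel(X, X, degree=2),
]

# Convex combination weights beta_m >= 0 with sum(beta) = 1 (uniform here;
# MKL methods learn these weights jointly with the kernel machine).
beta = np.full(len(base_kernels), 1.0 / len(base_kernels))
K = sum(b * Km for b, Km in zip(beta, base_kernels))

# Any kernel machine can consume the combined Gram matrix directly.
clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
print("training accuracy:", clf.score(K, y))
```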
