Multi-task Joint Feature Selection for Multi-label Classification

Multi-label learning deals with instances that may each be associated with a set of class labels simultaneously. We propose a novel multi-label classification approach named MFSM (Multi-task joint Feature Selection for Multi-label classification). MFSM first computes an asymmetric label correlation matrix in the label space. The multi-label learning problem is then formulated as a joint optimization problem with two regularization terms: one exploits the label correlations, and the other selects a sparse set of features shared among the multiple classification tasks (one task per label). The model can be reformulated as an equivalent smooth convex optimization problem, which we solve with Nesterov's method. Experiments on sixteen benchmark multi-label data sets demonstrate that our method outperforms state-of-the-art multi-label learning algorithms.
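To make the joint feature-selection idea concrete, the sketch below shows a simplified variant of the kind of objective the abstract describes: a least-squares multi-task loss plus an l2,1-norm regularizer that zeroes out whole rows of the weight matrix, so features are selected jointly across all label tasks. It is solved with FISTA, an accelerated proximal-gradient scheme in the Nesterov family. This is an illustrative assumption, not the actual MFSM formulation: the label-correlation regularization term from the paper is omitted, and all names (`prox_l21`, `fista_l21`, `lam`) are hypothetical.

```python
import numpy as np

def prox_l21(W, tau):
    """Row-wise group soft-thresholding: the proximal operator of
    tau * ||W||_{2,1}. Shrinks entire rows of W toward zero, so a
    feature is kept or discarded jointly for all tasks (labels)."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return W * scale

def fista_l21(X, Y, lam, n_iter=200):
    """Accelerated proximal gradient (FISTA, a Nesterov-type method) for
    min_W 0.5 * ||X W - Y||_F^2 + lam * ||W||_{2,1},
    where X is (n_samples, n_features) and Y is (n_samples, n_labels)."""
    d, k = X.shape[1], Y.shape[1]
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    W = np.zeros((d, k))
    Z, t = W.copy(), 1.0                   # extrapolation point and momentum
    for _ in range(n_iter):
        grad = X.T @ (X @ Z - Y)
        W_new = prox_l21(Z - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Z = W_new + ((t - 1.0) / t_new) * (W_new - W)
        W, t = W_new, t_new
    return W
```

On synthetic data where only the first few features generate the labels, the recovered weight matrix has large norms on those rows and (near-)zero norms elsewhere, which is the joint sparsity pattern the shared-feature regularizer is designed to produce.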

Inspec keywords: pattern classification; optimisation; feature selection; learning (artificial intelligence)

Other keywords: joint optimization problem; multitask joint feature selection; equivalent smooth convex optimization problem; Nesterov's method; multilabel classification approach; multilabel learning problem

Subjects: Pattern recognition; Optimisation techniques; Learning in AI (theory)

http://iet.metastore.ingenta.com/content/journals/10.1049/cje.2015.04.009