Online metric/similarity learning has been widely used in data mining, information retrieval, and computer vision, mainly owing to its high efficiency and scalability to large-scale datasets. Unlike most existing batch methods, which learn a metric model offline from all training samples, online learning exploits one sample or a group of samples at a time to update the metric model iteratively, and is ideal for tasks in which data arrive sequentially.
However, most state-of-the-art online metric learning models can only learn from a fixed, predefined set of t (t > 0) metric learning tasks and cannot accommodate new tasks.
In a study published in IEEE Transactions on Cybernetics, CONG Yang's team at the Shenyang Institute of Automation of the Chinese Academy of Sciences proposed a lifelong metric learning (LML) framework to mimic "human learning", i.e., endowing the learned metric with a new capability for a new task from newly arriving online samples while incorporating previous experience and knowledge.
LML maintains a common subspace shared by all learned metrics, called a lifelong dictionary. It transfers knowledge from this common subspace to learn each new metric learning task with its task-specific idiosyncrasy, and refines the common subspace over time to maximize performance across all metric tasks.
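To make the idea concrete, the sketch below illustrates the general pattern of sharing a common dictionary across metric tasks; it is a minimal toy illustration, not the paper's algorithm. The dictionary `L`, the weight vector `s`, the learning rate, and the update rule are all assumptions for the sake of the example: each task's Mahalanobis metric is composed from shared atoms in `L`, and only small task-specific weights `s` are learned online from labeled sample pairs.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 3  # feature dimension, dictionary size (hypothetical values)

# Shared "lifelong dictionary": columns of L span a common subspace
# reused by every task's metric (an assumption for this sketch).
L = rng.standard_normal((d, k))

def task_metric(s):
    """Compose a task-specific Mahalanobis metric from the shared atoms."""
    return L @ np.diag(s) @ L.T

def learn_task(pairs, labels, s=None, lr=0.05, epochs=50):
    """Toy online update of task-specific weights s from (x, y) pairs.

    labels[i] is True for a similar pair, False for a dissimilar one.
    """
    if s is None:
        s = np.ones(k)
    for _ in range(epochs):
        for (x, y), same in zip(pairs, labels):
            z = L.T @ (x - y)          # project the difference once
            # Gradient of the squared Mahalanobis distance w.r.t. s is z**2;
            # pull similar pairs closer, push dissimilar pairs apart.
            grad = (1.0 if same else -1.0) * z**2
            s = np.maximum(s - lr * grad, 0.0)  # keep the metric PSD
    return s
```

A new task only re-estimates its small weight vector against the shared dictionary, which is what makes this style of lifelong learning cheap per task; the actual LML method also updates the dictionary itself as new tasks arrive.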
Through extensive experiments on several multitask datasets, the research team verified that the proposed framework is well suited to the lifelong learning problem and delivers prominent performance in both effectiveness and efficiency.
Demonstration of the LML (Image by CONG Yang)