Abstract: Domain adaptation learning (DAL), also known as cross-domain learning, aims to learn a robust classifier for a target domain with no or few labeled samples by leveraging labeled samples from a source (auxiliary) domain. The key challenge in DAL is minimizing the distribution distance between the domains. To address the considerable discrepancy between the feature distributions of different domains, this paper proposes a three-stage multiple kernel local learning-based domain adaptation (MKLDA) scheme: 1) MKLDA simultaneously learns a reproducing multiple-kernel Hilbert space and an initial support vector machine (SVM) by minimizing both the structural risk functional and the maximum mean discrepancy (MMD) between the domains, thus achieving an initial separation of patterns in the target domain; 2) employing the idea of local learning, MKLDA predicts the label of each data point in the target domain from its neighbors and their labels in the kernel Hilbert space learned in 1); and 3) MKLDA learns a robust kernel classifier, trained on the target-domain data labeled in 2), to classify unseen target-domain data. Experimental results on real-world problems show that the proposed approach outperforms or is comparable to related approaches.