The L2-kernel classifier does not explicitly consider the classification margin when approximating the difference of densities (DoD) under the integrated squared error (ISE) criterion of probability densities, which limits the attainable classification performance to a certain extent. Moreover, its weights are obtained by solving the corresponding quadratic programming (QP) problem, which leads to comparatively slow training and is impractical especially for large datasets. To overcome these drawbacks, this paper proposes a new classification method, called the maximum margin logistic vector machine (MMLVM), which maximizes the DoD-based classification margin and finds the corresponding weight vector by solving a logistic optimization problem via gradient descent. Theoretical analyses of the globally optimal weights, the generalization error bound, and the computational complexity of MMLVM are provided. Experimental results on artificial, UCI, PIE, and USPS data sets demonstrate that the proposed approach effectively overcomes the drawbacks above.
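To make the gradient-descent step concrete, the minimal Python sketch below fits a kernel-expansion weight vector by plain gradient descent on a logistic loss over classification margins. It is only an illustration of that general recipe, not the authors' MMLVM objective (the DoD/ISE-based margin is not reproduced here), and every name in it (rbf_kernel_matrix, fit_logistic_margin, the gamma and lr parameters) is hypothetical.

import numpy as np

def rbf_kernel_matrix(X, Z, gamma=1.0):
    # Pairwise RBF kernel values between rows of X and Z (illustrative choice of kernel).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_logistic_margin(K, y, lr=0.1, n_iter=500):
    # Minimize the average logistic loss over margins y_i * (K w)_i by
    # plain gradient descent; w plays the role of the kernel weight vector.
    n = K.shape[0]
    w = np.zeros(n)
    for _ in range(n_iter):
        margins = y * (K @ w)
        # Gradient of (1/n) * sum_i log(1 + exp(-margin_i)) with respect to w.
        grad = -(K.T @ (y / (1.0 + np.exp(margins)))) / n
        w -= lr * grad
    return w

# Toy usage: two Gaussian blobs labeled -1 / +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
K = rbf_kernel_matrix(X, X, gamma=0.5)
w = fit_logistic_margin(K, y)
print("training accuracy:", (np.sign(K @ w) == y).mean())

Because the logistic loss is smooth and convex in w, first-order updates of this kind converge to a global minimizer, which is the general property the abstract appeals to when contrasting gradient descent with solving a QP.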