Abstract: The extreme learning machine (ELM) is not only an effective classifier in supervised learning but can also be applied to semi-supervised learning. However, the semi-supervised ELM (SS-ELM) is merely a shallow learning algorithm, similar to the Laplacian smooth twin support vector machine (Lap-STSVM). Deep learning has the advantages of approximating complicated functions and alleviating the optimization difficulty associated with deep models. The multi-layer extreme learning machine (ML-ELM) was developed by combining the ideas of deep learning and the extreme learning machine: it stacks extreme learning machine based auto-encoders (ELM-AE) to create a multi-layer neural network. ML-ELM not only approximates complicated functions but also avoids iteration during the training process, and thus exhibits high learning efficiency. In this article, manifold regularization is introduced into the ML-ELM model and a Laplacian ML-ELM (Lap-ML-ELM) is put forward. Furthermore, to address the overfitting problem of ELM-AE, weight uncertainty is introduced into ELM-AE to form a weight uncertainty ELM-AE (WU-ELM-AE), which can learn more robust features. Finally, a weight uncertainty Laplacian ML-ELM (WUL-ML-ELM) is proposed based on the above two algorithms; it stacks WU-ELM-AEs to create a deep network and uses the manifold regularization framework to obtain the output weights. Lap-ML-ELM and WUL-ML-ELM achieve better classification performance than SS-ELM without requiring excessive training time. Experimental results show that Lap-ML-ELM and WUL-ML-ELM are efficient semi-supervised learning algorithms.
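The efficiency claim above rests on the basic ELM training procedure: hidden-layer weights are drawn randomly and only the output weights are computed, in closed form. The following minimal sketch (our own illustration, not the paper's implementation; the layer size, sigmoid activation, and regularization constant `C` are illustrative assumptions) shows why no iterative training is needed.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elm_fit(X, T, n_hidden=50, C=1.0):
    """Basic supervised ELM: random hidden weights, closed-form output weights.

    X: (n_samples, n_features) inputs; T: (n_samples, n_classes) one-hot targets.
    """
    n_features = X.shape[1]
    # Hidden-layer parameters are random and never updated.
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = sigmoid(X @ W + b)  # hidden-layer output matrix
    # Output weights via ridge-regularized least squares:
    # beta = (H^T H + I / C)^{-1} H^T T
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return sigmoid(X @ W + b) @ beta

# Toy two-class problem to exercise the sketch.
X = rng.standard_normal((200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
T = np.eye(2)[y]
W, b, beta = elm_fit(X, T)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
acc = (pred == y).mean()
```

Because the only learned parameters (`beta`) come from one linear solve, training cost is a single matrix factorization rather than many gradient steps; the semi-supervised variants discussed in the paper add a manifold (Laplacian) regularization term to this same closed-form solution.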