Abstract: In recent years, deep neural networks (DNNs) have achieved outstanding performance on many AI tasks, such as computer vision (CV) and natural language processing (NLP). However, network design relies heavily on expert knowledge and is therefore time-consuming and error-prone. Consequently, neural architecture search (NAS), an important sub-field of automated machine learning (AutoML), has attracted increasing attention; it aims to design high-performance deep neural networks automatically. This study reviews the development of NAS in detail and summarizes it systematically. First, the overall research framework of NAS is presented, and the role of each research component is analyzed. Next, existing methods are divided into four categories according to their development stage, and the characteristics of each stage are described in detail. Then, the datasets commonly used to evaluate NAS methods are introduced, and normalized evaluation criteria for the NAS field are summarized, so as to ensure fair experimental comparison and promote the long-term development of the field. Finally, open challenges in NAS research are identified and discussed.