Abstract: As a sub-task of combinatory categorial grammar (CCG) parsing, category tagging can improve parsing efficiency and accuracy. While the traditional maximum entropy model solves this problem by designing meaningful feature templates, a neural network can extract features automatically from distributed representations. This paper proposes a neural category tagging model with two improvements. First, the word embedding layer is extended with a part-of-speech embedding layer and a category embedding layer, so that all three distributed representations are learned jointly by the back-propagation algorithm. Second, beam search is used during decoding to capture the dependencies among tags. These two improvements make the proposed model more accurate than the state-of-the-art maximum-entropy-based tagger (by up to 1%).
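
As a rough illustration of the first improvement, the sketch below concatenates word, part-of-speech, and category embeddings into a single input representation whose lookup tables are all trained jointly by back-propagation. The PyTorch framework, the vocabulary sizes, and the embedding dimensions are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class TaggerEmbeddings(nn.Module):
    """Concatenates word, POS, and category embeddings into one input
    vector per token; all three tables receive gradients, so their
    distributed representations are learned jointly."""

    def __init__(self, n_words=50000, n_pos=50, n_cats=425,
                 d_word=100, d_pos=25, d_cat=25):
        super().__init__()
        self.word = nn.Embedding(n_words, d_word)
        self.pos = nn.Embedding(n_pos, d_pos)
        # Embeds previously predicted categories (hypothetical size).
        self.cat = nn.Embedding(n_cats, d_cat)

    def forward(self, word_ids, pos_ids, prev_cat_ids):
        # Each argument is a LongTensor of indices with the same shape;
        # the three lookups are concatenated along the feature dimension.
        return torch.cat([self.word(word_ids),
                          self.pos(pos_ids),
                          self.cat(prev_cat_ids)], dim=-1)
```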
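
For the second improvement, a minimal left-to-right beam decoder might look as follows. The scoring hook `score_fn(position, history)` is a hypothetical interface standing in for the tagger's output layer; it returns per-tag log-probabilities conditioned on the tag history of one hypothesis, which is what lets the beam capture dependencies among tags that greedy decoding would ignore.

```python
import math

def beam_search(score_fn, seq_len, n_tags, beam_size=4):
    """Keeps the beam_size best partial tag sequences; each new tag's
    score can condition on the tags already assigned in that hypothesis.
    score_fn(pos, history) -> list of n_tags log-probabilities."""
    beam = [([], 0.0)]  # (tag history, cumulative log-probability)
    for pos in range(seq_len):
        candidates = []
        for history, logp in beam:
            scores = score_fn(pos, history)
            for tag in range(n_tags):
                candidates.append((history + [tag], logp + scores[tag]))
        # Prune to the beam_size highest-scoring hypotheses.
        candidates.sort(key=lambda h: h[1], reverse=True)
        beam = candidates[:beam_size]
    return beam[0][0]  # best complete tag sequence
```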