This paper considers the problem of quality assessment of image fusion results. A novel similarity-based measure of image fusion performance is proposed. Motivated by characteristics of the human visual system, the measure evaluates the similarity between the gradient fields of the source images and the fused image. Compared with existing similarity-based measures, the proposed one has omnidirectional recognition ability; compared with the mutual-information method, it is more consistent with human perception because it considers only local image variations. Experimental results show that the proposed measure evaluates image fusion performance satisfactorily and is perceptually meaningful.
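To make the idea of comparing gradient fields concrete, the following is a minimal illustrative sketch in Python, not the paper's exact measure: it extracts per-pixel gradients with a Sobel operator and averages a normalized inner product of the gradient vectors of the source and fused images. The function names and the specific similarity form used here are assumptions for illustration only.

```python
# Minimal sketch (assumed formulation, not the paper's definition): compare the
# gradient fields of a source image and a fused image via a per-pixel
# normalized inner product of gradient vectors, averaged over the image.
import numpy as np
from scipy import ndimage


def gradient_field(img):
    """Return per-pixel gradient components (gx, gy) via Sobel filtering."""
    img = img.astype(np.float64)
    gx = ndimage.sobel(img, axis=1)  # horizontal derivative
    gy = ndimage.sobel(img, axis=0)  # vertical derivative
    return gx, gy


def gradient_similarity(source, fused, eps=1e-12):
    """Average similarity between the gradient fields of two images.

    The normalized inner product responds to gradient direction as well as
    magnitude, so structure in any orientation contributes to the score.
    Returns a value in [-1, 1]; higher means closer gradient fields.
    """
    sx, sy = gradient_field(source)
    fx, fy = gradient_field(fused)
    dot = sx * fx + sy * fy
    norm = np.sqrt(sx**2 + sy**2) * np.sqrt(fx**2 + fy**2)
    return float(np.mean(dot / (norm + eps)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.random((64, 64))
    fused = src + 0.05 * rng.random((64, 64))  # fused image close to the source
    print(gradient_similarity(src, fused))
```

In a fusion-assessment setting, such a per-source score would typically be computed against each source image and then combined (for example, by a weighted average), but the combination rule is not specified in this abstract.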