Strip-mining is an important technique for improving SRF (Stream Register File) bandwidth utilization on stream processors. Because strip size strongly influences program execution time, quantifying this relationship is critical for selecting the optimal strip size. To approach the theoretical optimum, this paper proposes an optimal strip-mining technique based on a parameter model that minimizes execution time. Firstly, the paper builds a parameter model, guided by prefetching and reuse optimizations, that characterizes the effect of strip size on program behavior. Secondly, based on the model analysis, the paper explores optimal strip-size selection approaches for computation-intensive and memory-intensive programs respectively. Finally, an optimal strip-mining technique applicable to any program is proposed. The experimental results show that our strip-mining technique can effectively hide and avoid memory access latency, thereby exploiting the powerful computational ability of the stream processor.