ZHAI Juan, TANG Zhen-Hao, LI Bin, ZHAO Jian-Hua, LI Xuan-Dong
2017, 28(5):1051-1069. DOI: 10.13328/j.cnki.jos.005211
Abstract: Formal verification is an effective method to guarantee software reliability by proving the correctness of a program. Analyzing and verifying loops, which are important and frequently used statements, is not only vital for formal verification but also a hot topic in software development research. This paper proposes describing the execution effect of a loop by the memory locations modified by the loop and the new values stored in those locations after the loop executes; such an execution effect is defined as a loop summary. In addition, the paper proposes an approach to automatically synthesize loop summaries for loops, including nested loops, that manipulate commonly used data structures. Based on loop summaries, specifications can be generated automatically, including loop invariants, preconditions, and post-conditions of loops. The proposed approach is implemented and integrated into the code-verification tool Accumulator, and evaluated on a variety of programs. The results show that it is able to generate loop summaries and different kinds of specifications, thereby easing the verification task by reducing the burden on programmers and improving the level of automation and efficiency.
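As a hypothetical illustration (not taken from the paper), the following C fragment shows what a loop summary might capture for a simple array-initialization loop: the memory modified by the loop (the cells a[0..n-1] and the counter i) and the values stored there after execution, from which an invariant and a post-condition can be read off.

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical example: the loop below modifies i and a[0..n-1].
 * A loop summary in the spirit of the paper would record:
 *   modified memory : i, a[0], ..., a[n-1]
 *   new values      : i == n, a[k] == k for every 0 <= k < n
 * from which a loop invariant (0 <= i <= n and a[k] == k for k < i)
 * and the post-condition follow directly. */
int main(void) {
    int a[8];
    int n = 8;
    int i;

    for (i = 0; i < n; i++) {
        a[i] = i;               /* store the index into each cell */
    }

    /* the summary also records i == n after the loop */
    assert(i == n);

    /* post-condition on the array derived from the summary */
    for (int k = 0; k < n; k++) {
        assert(a[k] == k);
    }
    printf("loop summary post-conditions hold\n");
    return 0;
}
```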
WANG Shan-Xia, MA Ming-Hui, CHEN Wu, DENG Hui-Wen
2017, 28(5):1070-1079. DOI: 10.13328/j.cnki.jos.005208
Abstract: Regular models are models for non-normal modal logics. By defining some model operations, including disjoint union, C2t-bisimulation, generated submodel, and C2t-ultrafilter extension, this study proves that a class of regular models is definable in the temporal language if and only if it is closed under disjoint unions, surjective C2t-bisimulations, and C2t-ultrafilter extensions, while its complement is closed under ultrafilter extensions. This characterization theorem explains the expressive power of the temporal language over regular models.
2017, 28(5):1080-1090. DOI: 10.13328/j.cnki.jos.005209
Abstract: Existing models for real-time systems are not able to create new processes dynamically at runtime. This paper proposes a new model, named asynchronous multi-process timed automata, based on timed automata. Each process is abstracted as a process timed automaton, and new processes are triggered by certain states. Since queuing would make the model Turing-complete, processes are buffered in a set instead of a queue; the model is nevertheless still powerful enough to model many real-time systems. The paper proves that the coverability problem of the model is decidable by encoding the model into read-arc timed Petri nets.
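A minimal sketch of how a configuration of such a model might be laid out in code; the data layout below is purely hypothetical and is not the paper's formalization. It only illustrates the point that pending process instances are buffered in an unordered set rather than a queue.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical representation of one instance of a process timed
 * automaton: a control location plus its clock valuation. */
typedef struct {
    int    location;      /* current state of the process automaton */
    double clocks[2];     /* clock valuation (two clocks, for example) */
    bool   active;        /* slot in use? */
} ProcInstance;

/* Pending processes are buffered in an (unordered) set, not a queue:
 * only membership is tracked, no arrival order. */
#define MAX_PROCS 64
typedef struct {
    ProcInstance slot[MAX_PROCS];
} ProcSet;

/* Spawn a new process instance, triggered by reaching a spawning state. */
static bool spawn(ProcSet *s, int init_location) {
    for (size_t i = 0; i < MAX_PROCS; i++) {
        if (!s->slot[i].active) {
            s->slot[i] = (ProcInstance){ init_location, {0.0, 0.0}, true };
            return true;
        }
    }
    return false;  /* set is full */
}

int main(void) {
    ProcSet procs = { 0 };
    (void)spawn(&procs, 0);   /* initial process */
    (void)spawn(&procs, 3);   /* a state that triggers a new process */
    return 0;
}
```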
WEN Xi-Ming, YU Quan, CHANG Liang, WANG Ju
2017, 28(5):1091-1106. DOI: 10.13328/j.cnki.jos.005212
Abstract: Diagnosability is an important property of a discrete-event system (DES) from the perspective of diagnosis. It requires that every fault can be detected and isolated within a finite number of observations after its occurrence. In much of the literature, diagnosability is studied under the assumption that an observation is certain, i.e., the observation corresponds exactly to the sequence of observable events taking place in the DES. In practical applications, however, this assumption may become inappropriate: due to various reasons such as the precision of sensors and noise in transmission channels, the available observation may be uncertain. This paper focuses on the diagnosability of DESs with uncertain observations. It extends the definition of diagnosability to cope with uncertain observations, and gives methods to check diagnosability under three types of uncertain observations accordingly. For the more general scenario where multiple uncertainties exist in the observation, a method is also provided to check diagnosability with all the uncertainties of the observation considered together.
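For reference, the classical certain-observation diagnosability condition that the paper generalizes can be stated as follows, in the style of Sampath et al.; here $\Psi(f)$ denotes the set of traces ending with fault $f$, $L/s$ the continuations of $s$ in the DES language $L$, and $P$ the observation projection. This is the textbook form, not the paper's extended definition for uncertain observations.

```latex
% Classical diagnosability: every sufficiently long continuation t of a
% faulty trace s is such that any trace with the same observation also
% contains the fault.
\[
(\forall s \in \Psi(f))\,(\exists n \in \mathbb{N})\,
(\forall t \in L/s)\ \bigl[\,|t| \ge n \;\Rightarrow\;
(\forall \omega \in L)\ \bigl(P(\omega) = P(st) \Rightarrow f \in \omega\bigr)\bigr]
\]
```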
CHANG Xi, XUE Jian-Xin, ZHANG Zhuo, MAO Xiao-Guang
2017, 28(5):1107-1117. DOI: 10.13328/j.cnki.jos.005210
Abstract: Static trace simplification is a technique to improve the efficiency of debugging by reducing the number of thread contexts in a buggy execution trace without changing the dependency information. However, applying static trace simplification to simplify a buggy trace still faces a strong challenge: the distribution of the reduced thread contexts is usually non-uniform, which makes it difficult for random trace simplification techniques to be effective. In this paper, a convergence-oriented static trace simplification approach is proposed to address this challenge. The essential idea of the approach is to build a static trace simplification model and then repeatedly apply a convergence-oriented merging algorithm to search for mergeable preceding and succeeding intervals in the same thread. Experimental results show that the new approach can simplify traces with high quality and can help programmers find the thread interleavings that cause the faults.
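As an illustrative sketch only (the event and dependency representations below are hypothetical, not the paper's model), merging a thread's mergeable preceding and succeeding intervals can be pictured as collapsing consecutive trace events of the same thread whenever no cross-thread dependency separates them:

```c
#include <stdio.h>

/* Hypothetical trace event: which thread executed it, and whether a
 * cross-thread dependency forbids merging it with the previous event. */
typedef struct {
    int thread_id;
    int has_cross_dep;   /* 1 if this event depends on another thread */
} Event;

/* Collapse consecutive events of the same thread into one interval
 * whenever no cross-thread dependency separates them.  Returns the
 * number of thread intervals (context switches + 1) after merging. */
static int count_merged_intervals(const Event *trace, int n) {
    if (n == 0) return 0;
    int intervals = 1;
    for (int i = 1; i < n; i++) {
        int same_thread = trace[i].thread_id == trace[i - 1].thread_id;
        if (!(same_thread && !trace[i].has_cross_dep)) {
            intervals++;   /* cannot merge: a new interval starts here */
        }
    }
    return intervals;
}

int main(void) {
    Event trace[] = {
        {0, 0}, {0, 0}, {1, 1}, {1, 0}, {0, 1}, {0, 0}
    };
    printf("intervals after merging: %d\n",
           count_merged_intervals(trace, 6));   /* prints 3 */
    return 0;
}
```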
LIU Tao, WANG Shu-Ling, ZHAN Nai-Jun
2017, 28(5):1118-1127. DOI: 10.13328/j.cnki.jos.005207
Abstract: In recent years, with the rapid development of artificial intelligence, people encounter more and more robots in daily life, such as soccer robots, unmanned aerial vehicles, and unmanned automobiles. How to guarantee safety with such autonomous robots, especially teams of robots on the move, has drawn wide attention. Hybrid CSP (HCSP) is a modelling language for hybrid systems. As an extension of CSP, HCSP introduces differential equations into CSP to describe the continuous behavior and control logic of hybrid systems. It can be used to model large-scale control systems, especially those involving communication events. In this paper, a trajectory planning algorithm for multiple robots is first modeled using HCSP, and some properties are then formally proved with the theorem prover HProver. The paper finally proves that, under certain initial conditions, no collision will happen during the whole process.
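The following is not the paper's HCSP model; it is only a hypothetical, discrete-time illustration of the collision-freedom property being verified, checking that two robots following simple velocity plans never come closer than an assumed safety distance. All constants and the control rule are invented for this sketch.

```c
#include <stdio.h>

/* Hypothetical illustration of the verified property: two robots moving
 * along the x-axis must always keep at least SAFE_DIST apart.  (The
 * paper itself models continuous dynamics in HCSP and proves the
 * property deductively, not by simulation.) */
#define SAFE_DIST 1.0

int main(void) {
    double x1 = 0.0,  v1 = 1.0;     /* leading robot: position, speed   */
    double x2 = -5.0, v2 = 1.5;     /* following robot, initially faster */
    double dt = 0.01;

    for (double t = 0.0; t < 20.0; t += dt) {
        /* simple control logic: the follower slows down when too close */
        if (x1 - x2 < 2.0 * SAFE_DIST) v2 = 0.5;

        x1 += v1 * dt;
        x2 += v2 * dt;

        if (x1 - x2 < SAFE_DIST) {
            printf("collision risk at t=%.2f\n", t);
            return 1;
        }
    }
    printf("no collision within the simulated horizon\n");
    return 0;
}
```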
DU De-Hui, ZAN Hui, JIANG Kai-Qiang, CHENG Bei
2017, 28(5):1128-1143. DOI: 10.13328/j.cnki.jos.005216
Abstract: Cyber-physical systems (CPSs) are advanced embedded systems featuring rich interaction between computation and the physical environment. CPSs are widely used in the fields of healthcare equipment, avionics, and smart buildings. Meanwhile, the correctness and reliability analysis of CPSs has attracted more and more attention. Statistical model checking (SMC) is an effective technology for verifying CPSs, as it facilitates the quantitative evaluation of system performance. However, improving the performance of SMC as systems grow in scale remains a challenge. To address this issue, this study explores several SMC algorithms and concludes that Bayesian interval estimation is the most practical and efficient one. However, a large number of traces is needed when the actual probability being evaluated is around 0.5. To overcome this difficulty, an algorithm named AL-SMC is proposed based on abstraction and learning techniques to reduce the size of the sampling space. AL-SMC adopts techniques such as property-based projection and the extraction and construction of a prefix frequency tree. In addition, to further improve the efficiency of SMC, a self-adaptive SMC framework is presented, which adaptively selects the proper algorithm based on probability prediction. Finally, the self-adaptive SMC approach is implemented and applied to three benchmarks. The experimental results show that the proposed approach can improve performance within an acceptable error range.
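One of the ingredients mentioned above, the prefix frequency tree, can be pictured as a counted trie over abstract trace prefixes. The sketch below is a hypothetical minimal version; the symbol alphabet, node layout, and API are assumptions for illustration, not the AL-SMC implementation.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical prefix frequency tree: a trie over trace symbols where
 * each node counts how many sampled traces pass through that prefix. */
#define ALPHABET 4   /* assumed number of abstract trace symbols */

typedef struct Node {
    long count;                    /* traces sharing this prefix */
    struct Node *child[ALPHABET];
} Node;

static Node *new_node(void) {
    Node *n = calloc(1, sizeof *n);
    if (!n) { perror("calloc"); exit(1); }
    return n;
}

/* Insert one sampled trace (sequence of symbols) into the tree. */
static void insert(Node *root, const int *trace, int len) {
    Node *cur = root;
    cur->count++;
    for (int i = 0; i < len; i++) {
        if (!cur->child[trace[i]]) cur->child[trace[i]] = new_node();
        cur = cur->child[trace[i]];
        cur->count++;
    }
}

/* Estimated probability of a prefix = its count / total sampled traces. */
static double prefix_prob(const Node *root, const int *prefix, int len) {
    const Node *cur = root;
    for (int i = 0; i < len; i++) {
        if (!cur->child[prefix[i]]) return 0.0;
        cur = cur->child[prefix[i]];
    }
    return (double)cur->count / (double)root->count;
}

int main(void) {
    Node *root = new_node();
    int t1[] = {0, 1, 2}, t2[] = {0, 1, 3}, t3[] = {1, 2, 2};
    insert(root, t1, 3); insert(root, t2, 3); insert(root, t3, 3);

    int p[] = {0, 1};
    printf("P(prefix 0,1) ~ %.2f\n", prefix_prob(root, p, 2)); /* 0.67 */
    return 0;
}
```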
ZHANG Yu, DONG Yun-Wei, FENG Wen-Long, HUANG Meng-Xing
2017, 28(5):1144-1166. DOI: 10.13328/j.cnki.jos.005214
Abstract: A cyber-physical system (CPS) tightly integrates control software and embedded components, incorporating software into control domains. CPSs are pervasive and often mission-critical; therefore, they must provide a high level of security. With the extensive use of information technology, embedded control software plays an ever greater role in such systems, and the close interactions between control software and embedded components demand co-verification. Co-verification, which verifies control software and embedded components together, is essential to establishing the correctness of a complete system. In this paper, an automata-theoretic approach to co-verification is presented. The foundation of the approach is a unified model for co-verification and reachability analysis of that model. The LTL formula is converted into a Büchi automaton, which is interleaved with the execution of the unified model under analysis. An online-capture, offline-replay approach is further proposed to improve the practicality of the formal verification. Case studies on a suite of realistic examples show that the presented approach has great potential in verifying system-level properties and thus in improving the assurance of systems.
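The interleaving of a Büchi automaton with the unified model can be sketched as a synchronous product step: from a pair (model state, automaton state), the automaton moves on the labeling of the model's next state. The toy model, the property "GF p", and the encoding below are hypothetical illustrations of that general construction, not the tool's implementation.

```c
#include <stdio.h>
#include <stdbool.h>

/* Hypothetical product step: the model emits one atomic proposition per
 * state, and a Büchi automaton (here for "GF p") reads it. */
typedef struct { int model_state; int buchi_state; } ProductState;

/* Model transition: a toy two-state system toggling a proposition p. */
static int  model_next(int s) { return (s + 1) % 2; }
static bool prop_p(int s)     { return s == 1; }

/* Deterministic Büchi automaton for "GF p" (p holds infinitely often):
 * state 1 is accepting and is entered whenever p holds. */
static int buchi_next(int q, bool p) {
    (void)q;                     /* transition is independent of q here */
    return p ? 1 : 0;
}
static bool buchi_accepting(int q) { return q == 1; }

int main(void) {
    ProductState ps = {0, 0};
    for (int step = 0; step < 6; step++) {
        ps.model_state = model_next(ps.model_state);
        ps.buchi_state = buchi_next(ps.buchi_state, prop_p(ps.model_state));
        printf("step %d: model=%d buchi=%d accepting=%d\n",
               step, ps.model_state, ps.buchi_state,
               buchi_accepting(ps.buchi_state));
    }
    return 0;
}
```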
LI Xuan-Song, TAO Xian-Ping, LÜ Jian, SONG Wei
2017, 28(5):1167-1182. DOI: 10.13328/j.cnki.jos.005215
Abstract: Activity-oriented context-aware (AOCA) applications organize environment resources to support the smooth performance of user activities. These applications are developed with a light-weight and incremental method in order to deal with the openness of the environment and of the requirements related to user activities. In contrast to methods that attempt to handle information at an overall level, AOCA applications allow different developers to take part in development at different times. However, this method may lead to problems such as inconsistency, which are difficult to detect in the development phase. This study focuses on using runtime verification to enhance the reliability of AOCA applications. In this paper, a method is first proposed for specifying the runtime status of AOCA applications and for describing system-level and application-level properties. Next, an AOCA application monitor is designed and implemented. Moreover, a case study and a performance evaluation are described to demonstrate the usability of the method.
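As a purely hypothetical illustration of runtime monitoring at the level described above (the property, event format, and resource names are invented for this sketch), a monitor can consume context events and flag a violation such as two activities claiming the same exclusive resource:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical context event: an activity acquires or releases a resource. */
typedef struct {
    const char *activity;
    const char *resource;
    int acquire;   /* 1 = acquire, 0 = release */
} CtxEvent;

#define MAX_RES 16

/* Monitor state: which activity currently holds each exclusive resource. */
static const char *holder_res[MAX_RES];
static const char *holder_act[MAX_RES];
static int held = 0;

/* Process one event; report a violation if an exclusive resource is
 * acquired by a second activity before being released. */
static void monitor(const CtxEvent *e) {
    for (int i = 0; i < held; i++) {
        if (strcmp(holder_res[i], e->resource) == 0) {
            if (e->acquire)
                printf("VIOLATION: %s and %s both use %s\n",
                       holder_act[i], e->activity, e->resource);
            else {          /* release: drop the record */
                holder_res[i] = holder_res[--held];
                holder_act[i] = holder_act[held];
            }
            return;
        }
    }
    if (e->acquire && held < MAX_RES) {
        holder_res[held] = e->resource;
        holder_act[held] = e->activity;
        held++;
    }
}

int main(void) {
    CtxEvent log[] = {
        {"meeting", "projector", 1},
        {"lecture", "projector", 1},   /* triggers a violation report */
        {"meeting", "projector", 0},
    };
    for (int i = 0; i < 3; i++) monitor(&log[i]);
    return 0;
}
```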
CHEN Ming-Song, BAO Yong-Xiang, SUN Hai-Ying, MIAO Wei-Kai, CHEN Xiao-Hong, ZHOU Ting-Liang
2017, 28(5):1183-1203. DOI: 10.13328/j.cnki.jos.005217
Abstract: Communication-based train control (CBTC) systems have become the mainstream infrastructure for railway signal systems around the world. Unlike traditional track-circuit-based railway control systems, CBTC adopts a more flexible and accurate control mechanism to provide uninterrupted services that guarantee safe separation between adjacent trains and protection against over-speeding. Therefore, CBTC significantly improves the efficiency and safety of rail transportation. Although CBTC can accurately conduct real-time control, its design and implementation are extremely complex due to the integration of heterogeneous computation, communication, and control components. Consequently, breakdowns caused by CBTC design flaws are inevitable. How to guarantee the trustworthiness of CBTC, as for any typical safety-critical system, therefore becomes a big challenge for researchers and practitioners. Due to their huge success in both the hardware and software domains, formal methods are now considered a promising means for the trustworthy construction of CBTC systems. This article surveys the three most important stages of the trustworthy construction of CBTC systems, i.e., requirements analysis, design modeling, and low-level implementation. It not only comprehensively presents the important roles of state-of-the-art formal methods and tools in trustworthy CBTC construction, but also introduces development trends as well as technical challenges for future CBTC.
QIAO Lei, YANG Meng-Fei, TAN Yan-Liang, PU Ge-Guang, YANG Hua
2017, 28(5):1204-1220. DOI: 10.13328/j.cnki.jos.005218
Abstract: In the safety-critical systems of spacecraft, the memory management system is an essential part of the operating system kernel, providing allocation and reclamation mechanisms at the lowest level, and it is clearly critical to the reliability and safety of spacecraft computer systems. The memory management system must satisfy strict constraints such as real-time response, space usage limits, allocation efficiency, and many boundary conditions, and it has to use very complex data structures and algorithms to manage the whole memory space. To make the complex memory management system of spacecraft highly reliable, its formal verification also becomes much more complicated, as it involves the formal verification of complex data structures such as a two-level segregated doubly linked list with two-level bitmaps, the specification of operations, behavioral modeling, the definition of assertions for internal functions, the definition of loop invariants, and real-time verification. This paper provides an in-depth analysis of these problems and of the characteristics of the spacecraft memory management system, and establishes a general method and theory based on hierarchical iteration for verifying a concrete operating system used on spacecraft. The results of this study offer potential benefits for improving the reliability of the spacecraft of China.
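The core indexing trick behind a two-level segregated free-list allocator with two-level bitmaps (in the style of TLSF-like allocators; the constants and names below are illustrative assumptions, not the verified flight software) maps a request size to a first-level index via the position of its most significant bit and a second-level index via the next few bits, and uses the bitmaps to find a non-empty free list in constant time:

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative TLSF-style mapping (not the verified spacecraft code):
 * FL index = position of the highest set bit of the size,
 * SL index = the next SL_LOG2 bits below it.
 * The sketch assumes size >= (1 << SL_LOG2); real allocators handle
 * smaller sizes separately. */
#define SL_LOG2 4
#define SL_COUNT (1u << SL_LOG2)

static unsigned fls32(uint32_t x) {          /* index of highest set bit */
    unsigned r = 0;
    while (x >>= 1) r++;
    return r;
}

static void mapping(uint32_t size, unsigned *fl, unsigned *sl) {
    *fl = fls32(size);
    *sl = (size >> (*fl - SL_LOG2)) & (SL_COUNT - 1);
}

/* Two-level bitmaps: fl_bitmap marks non-empty first-level classes,
 * sl_bitmap[fl] marks non-empty second-level lists within a class. */
static uint32_t fl_bitmap;
static uint32_t sl_bitmap[32];

static void mark_free(uint32_t size) {
    unsigned fl, sl;
    mapping(size, &fl, &sl);
    fl_bitmap     |= 1u << fl;
    sl_bitmap[fl] |= 1u << sl;
}

int main(void) {
    mark_free(72);     /* 72 = 0b1001000 -> fl = 6, sl = 0b0010 = 2 */
    unsigned fl, sl;
    mapping(72, &fl, &sl);
    printf("size 72 -> fl=%u sl=%u, fl_bitmap=0x%x, sl_bitmap[6]=0x%x\n",
           fl, sl, fl_bitmap, sl_bitmap[6]);
    return 0;
}
```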
GUO De-Gui, WANG Guan-Cheng, LÜ Shuai, LIU Lei
2017, 28(5):1221-1232. DOI: 10.13328/j.cnki.jos.005206
Abstract: Partial evaluation plays an important role in many areas such as program optimization and automatic software generation. This paper applies partial evaluation to validating compilers. The design of peC, a subset of C, and a formal description of the partial evaluation strategy for peC are presented. Furthermore, the implementation of a partial evaluator for peC and a compiler testing framework based on partial evaluation are provided. Experiments show that the new approach not only detects errors in GCC and LLVM that can be detected by other methods, but also finds some errors that the other methods do not detect. In summary, this work demonstrates that applying partial evaluation to compiler testing is effective.
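The idea of using partial evaluation for compiler testing can be illustrated in plain C (a hypothetical example, not output of the peC tool): specializing a general function with respect to a known static input yields a residual program that must behave identically on all dynamic inputs, so a compiler that makes the two versions disagree is suspect.

```c
#include <assert.h>
#include <stdio.h>

/* General program: integer exponentiation. */
static int power(int x, int n) {
    int r = 1;
    for (int i = 0; i < n; i++) r *= x;
    return r;
}

/* Residual program obtained by (hypothetically) partially evaluating
 * power with the static input n = 3: the loop has been unrolled away. */
static int power_3(int x) {
    return x * x * x;
}

/* Differential check: for every dynamic input, the original program and
 * its specialization must agree; a disagreement after compilation points
 * to a compiler (or partial evaluator) defect. */
int main(void) {
    for (int x = -10; x <= 10; x++) {
        assert(power(x, 3) == power_3(x));
    }
    printf("original and residual programs agree\n");
    return 0;
}
```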
SHANG Shu, GAN Yuan-Ke, SHI Gang, WANG Sheng-Yuan, DONG Yuan
2017, 28(5):1233-1246. DOI: 10.13328/j.cnki.jos.005213
Abstract: Synchronous data-flow languages, such as Lustre, have been widely used in safety-critical industrial areas such as airplanes, high-speed railways, and nuclear power plants. For these types of applications, the safety of the development tools themselves is also highly demanded. To better solve the "miscompilation" problem, very successful progress has recently been made in constructing and verifying compilers for conventional imperative languages, such as the CompCert C compiler, using proof assistants that are reliable by construction. L2C is an industry-level trustworthy synchronous data-flow language compiler developed with the same approach, with an extended Lustre language as its source and Clight, the C subset used in CompCert, as its target. The paper focuses on the key translations of L2C and on the main issues and experience in its design and implementation.
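To make the source-to-target gap concrete, here is a hypothetical sketch (not actual L2C output) of how a trivial synchronous counter node, written in Lustre-like syntax in the comment, is typically compiled into a C step function over an explicit state struct, which is the general shape of Clight-style generated code:

```c
#include <stdio.h>

/* Hypothetical Lustre-like source (shown as a comment):
 *   node counter(reset : bool) returns (c : int);
 *   let
 *     c = if reset then 0 else (0 -> pre c) + 1;
 *   tel
 * A data-flow compiler typically produces a state struct plus reset/step
 * functions; the code below imitates that shape, it is not L2C's output. */
typedef struct { int pre_c; int init; } counter_state;

static void counter_reset(counter_state *s) {
    s->pre_c = 0;
    s->init  = 1;     /* next step is the first instant */
}

static int counter_step(counter_state *s, int reset) {
    int c;
    if (reset)        c = 0;
    else if (s->init) c = 0 + 1;        /* "0 -> pre c" is 0 at the first instant */
    else              c = s->pre_c + 1;
    s->pre_c = c;
    s->init  = 0;
    return c;
}

int main(void) {
    counter_state s;
    counter_reset(&s);
    for (int i = 0; i < 5; i++)
        printf("%d ", counter_step(&s, i == 3));   /* prints: 1 2 3 0 1 */
    printf("\n");
    return 0;
}
```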
ZHANG Xuan, WANG Xu, LI Tong, BAI Chuan, KANG Yan-Ni
2017, 28(5):1247-1270. DOI: 10.13328/j.cnki.jos.005106
Abstract: Non-functional requirements are determinants of software quality: the degree to which they are satisfied directly impacts software quality. Considering the importance of software quality and of trading off non-functional requirements, an approach to analyzing the trade-off costs of non-functional requirements is proposed based on production theory, elasticity of substitution, and linear programming, and a support tool is developed. First, the authors' previous work on the satisfiability analysis of non-functional requirements is improved, and a new coordination method for obtaining non-functional requirements from stakeholders is presented. Then, an ontology for non-functional requirements and a corresponding knowledge base are constructed, and reasoning over trade-off strategies is provided to support decision-making in software development and evolution. Because it can address conflict relationships among non-functional requirements, the method is better suited to the software industry. Finally, the feasibility of the proposed approach is demonstrated by applying the non-functional requirements trade-off cost analysis to the maintenance and evolution of a trustworthy third-party certificate authority software case.
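For readers unfamiliar with the economic notions borrowed here, the standard textbook definitions being reused are stated below in general form (the paper's concrete instantiation over non-functional requirements may differ): a production function $q=f(x_1,x_2)$ over two "inputs" and the elasticity of substitution $\sigma$, which measures how easily one input trades off against the other.

```latex
% Standard elasticity of substitution between inputs x_1 and x_2 of a
% production function q = f(x_1, x_2) (general textbook form):
\[
\sigma \;=\; \frac{d\ln(x_2/x_1)}{d\ln(\mathrm{MRTS}_{12})},
\qquad
\mathrm{MRTS}_{12} \;=\; \frac{\partial f/\partial x_1}{\partial f/\partial x_2}.
\]
% For the CES family f(x_1, x_2) = (a x_1^{\rho} + b x_2^{\rho})^{1/\rho},
% the elasticity is constant: \sigma = 1/(1-\rho).
```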
JIANG Jun-Yan, PENG Zhi-Yong, WU Xiao-Ying, PENG Cheng-Chen, WANG Min
2017, 28(5):1271-1295. DOI: 10.13328/j.cnki.jos.005105
Abstract: Many Web applications, such as multimedia data integration and online business data aggregation, require deep Web querying to integrate information from many data sources on the Web. The success of such applications is largely determined by the efficiency and effectiveness of the querying methods over the relevant sources. Existing studies on multiple data source integration have focused on ranking the relevance of queries w.r.t. data sources without considering the impact of overlap among the sources on data source selection, resulting not only in query processing overhead but also in increased workloads on the data sources. To improve query efficiency over overlapping data sources, this work proposes a tuple-level stratified sampling approach to overlapping data source selection. The approach has two stages: an offline stage and an online stage. In the offline stage, tuple-level stratified sampling is applied to obtain sample tuples. In the online stage, the samples are used to estimate query coverage and overlap among multiple data sources, and a heuristic method is designed to discover data sources with low overlap. Experimental results show that the proposed approach is more efficient and effective than state-of-the-art methods for selecting overlapping data sources.
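The online use of the samples can be pictured as follows; this is a deliberately simplified, hypothetical sketch in which samples are reduced to tuple identifiers and overlap is estimated as a simple fraction of the sample, not the paper's estimator.

```c
#include <stdio.h>

/* Hypothetical sketch: each data source is summarized by a sample of
 * tuple identifiers drawn (offline) by tuple-level stratified sampling.
 * Online, pairwise overlap is estimated from the fraction of one
 * sample's tuples that also appear in the other sample. */

static int contains(const int *s, int n, int id) {
    for (int i = 0; i < n; i++) if (s[i] == id) return 1;
    return 0;
}

/* Estimated overlap of source B with source A = fraction of A's sample
 * that also appears in B's sample. */
static double overlap(const int *a, int na, const int *b, int nb) {
    int shared = 0;
    for (int i = 0; i < na; i++) shared += contains(b, nb, a[i]);
    return na ? (double)shared / na : 0.0;
}

int main(void) {
    int sampleA[] = {1, 4, 7, 9, 12};
    int sampleB[] = {4, 7, 20, 21, 22};
    /* A greedy heuristic would prefer adding sources with low estimated
     * overlap w.r.t. the sources already selected. */
    printf("estimated overlap(A,B) = %.2f\n",
           overlap(sampleA, 5, sampleB, 5));   /* prints 0.40 */
    return 0;
}
```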
CHEN Yu, ZHOU Wei, DUAN Zhe-Min, QIAN Ye-Kui, ZHAO Xin
2017, 28(5):1296-1314. DOI: 10.13328/j.cnki.jos.005148
Abstract: To address the shortcomings of existing algorithms for inferring link congestion loss rate ranges in large-scale IP networks, a new link congestion loss rate range inference algorithm based on a greedy heuristic is proposed. The strong dependency of single-slot end-to-end (E2E) path measurements on clock synchronization is avoided by using multi-slot E2E path measurements. Each congested link is located by Bayesian maximum a posteriori (BMAP) estimation of link congestion after the prior probabilities of link congestion have been learned. A set of paths with related congested links and similar performance is then constructed, and by dynamically solving for the performance similarity coefficient, the loss rate range of each congested link is recursively inferred. The accuracy and robustness of the proposed algorithm are verified by experiments.
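The Bayesian maximum a posteriori step mentioned above follows the usual form, stated generically here; the paper's likelihood model is specific to multi-slot E2E measurements, and the symbols $x$ (candidate link congestion state) and $y$ (observation vector) are notation introduced for this sketch.

```latex
% Generic Bayesian maximum a posteriori (MAP) selection of the link
% congestion state x given the end-to-end observations y:
\[
\hat{x}_{\mathrm{MAP}}
  \;=\; \arg\max_{x} \; P(x \mid y)
  \;=\; \arg\max_{x} \; P(y \mid x)\, P(x),
\]
% where the prior P(x) is learned from historical measurements and the
% likelihood P(y | x) comes from the path-level congestion model.
```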
CHEN Cun-Tong, ZHAO Jun-Qiao, YE Chen, DENG Rong, GUAN Lin-Ting, LI De-Yi
2017, 28(5):1315-1325. DOI: 10.13328/j.cnki.jos.005144
Abstract: Timely transmission of sensory data and control instructions between the functional modules of an autonomous car system is crucial. Socket-based message transmission mechanisms, e.g., LCM and IPC, are the de facto standard for this purpose because of their easy deployment and adaptability to distributed environments. However, they can no longer meet the increasing bandwidth demands of sharing large data packets, such as images and point clouds, among the perception processing modules and the decision-making module. The main reason is that socket-based mechanisms have to divide a large data packet into smaller packets, which introduces extra packing and unpacking costs and therefore results in high latency and low bandwidth. This paper proposes a new shared-memory-based inter-process message transmission mechanism. The shared memory segment is composed of a super block and several data blocks; circular queues are deployed for fast scheduling of data receiving and delivery, and atomic operations are applied to improve overall performance. Experimental results show that the average transmission delay of 3MB data packets is 2.5ms, which is drastically lower than LCM's 12ms and also better than Sharedmem_transport's 3.9ms. The maximum throughput of this method is 1.1GB/s, which is much greater than LCM's 180MB/s and Sharedmem_transport's 600MB/s.
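A minimal sketch of the general mechanism, assuming a single-producer/single-consumer circular queue of data-block indices with C11 atomic head/tail counters; the field names, sizes, and layout of the "super block" are assumptions for this illustration, not the paper's actual design.

```c
#include <stdatomic.h>
#include <stdio.h>

/* Illustrative single-producer/single-consumer ring of data-block indices,
 * as it might live inside a shared-memory "super block". */
#define RING_SIZE 8   /* power of two */

typedef struct {
    _Atomic unsigned head;            /* next slot the producer writes */
    _Atomic unsigned tail;            /* next slot the consumer reads  */
    unsigned         block_index[RING_SIZE];
} SuperBlock;

static int ring_push(SuperBlock *sb, unsigned blk) {
    unsigned h = atomic_load_explicit(&sb->head, memory_order_relaxed);
    unsigned t = atomic_load_explicit(&sb->tail, memory_order_acquire);
    if (h - t == RING_SIZE) return 0;              /* full */
    sb->block_index[h % RING_SIZE] = blk;
    atomic_store_explicit(&sb->head, h + 1, memory_order_release);
    return 1;
}

static int ring_pop(SuperBlock *sb, unsigned *blk) {
    unsigned t = atomic_load_explicit(&sb->tail, memory_order_relaxed);
    unsigned h = atomic_load_explicit(&sb->head, memory_order_acquire);
    if (t == h) return 0;                          /* empty */
    *blk = sb->block_index[t % RING_SIZE];
    atomic_store_explicit(&sb->tail, t + 1, memory_order_release);
    return 1;
}

int main(void) {
    SuperBlock sb = { 0 };
    ring_push(&sb, 3);                /* producer announces data block 3 */
    unsigned blk;
    if (ring_pop(&sb, &blk))          /* consumer picks it up */
        printf("received data block %u\n", blk);
    return 0;
}
```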