Volume 18, Issue 4, 2007 Table of Contents

  • 1  Software Cost Estimation Method and Application
    LI Ming-Shu HE Mei YANG Da SHU Feng-Di WANG Qing
    2007, 18(4):775-795.
    [Abstract](10973) [HTML](0) [PDF 1.36 M](14807)
    Abstract:
    Software cost estimation has played an important role in software development since its emergence in the 1960s. Based on a classification into algorithmic-model-based methods, non-algorithmic-model-based methods and composite methods, the typical software cost estimation methods in history are reviewed. The issue of software sizing, which is closely related to software cost estimation, is also discussed. A three-phase evaluation criterion for software cost estimation methods is then proposed, and a case study on cost estimation of government-sponsored projects in China is analyzed. Finally, looking toward the future development of software cost estimation, six possible trends are presented, covering estimation models, estimation evolution, estimation applications, estimation contents, supporting tools and human factors.
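Algorithmic-model-based methods of the kind surveyed above typically take the form effort = a · size^b. As a minimal sketch of that family, the snippet below uses Boehm's basic COCOMO, one classic member the survey literature covers; the function name and the mode table are our own packaging, and the coefficients are the published 1981 values (effort in person-months, size in KLOC).

```python
# Basic COCOMO: effort = a * size^b, with the published Boehm (1981)
# coefficients per project mode. Effort is in person-months, size in
# KLOC (thousands of delivered source lines of code).

COCOMO_MODES = {
    "organic":      (2.4, 1.05),
    "semidetached": (3.0, 1.12),
    "embedded":     (3.6, 1.20),
}

def cocomo_effort(kloc: float, mode: str = "organic") -> float:
    """Estimate development effort (person-months) as a * size^b."""
    a, b = COCOMO_MODES[mode]
    return a * kloc ** b

print(round(cocomo_effort(32, "organic"), 1))
```

The superlinear exponent b captures the diseconomy of scale: doubling the size more than doubles the estimated effort, and stricter modes (embedded) grow faster than relaxed ones (organic).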
    2  Detecting Feature Interactions by Pattern Analysis
    ZUO Ji-Hong WANG Qian-Xiang MEI Hong
    2007, 18(4):796-807.
    [Abstract](4066) [HTML](0) [PDF 701.40 K](5254)
    Abstract:
    This paper proposes an approach that can detect feature interactions through pattern analysis. The basic idea is to extract common conflict patterns from known feature interactions and reuse them to detect unknown ones. With this approach, the conditions for conflicts are described by a set of predicate formulae, and the system model is specified in the Java language. With the aid of an external tool, all execution scenarios can be systematically explored. During the execution of the system model, all feature behaviors are collected and analyzed; once a conflict condition is satisfied, a conflict report is produced. The approach has been applied to an e-mail system. The experimental results show that the approach can effectively detect both known and unknown feature interactions after handling more than one million execution scenarios.
    3  Error Flow Model: Modeling and Analysis of Software Propagating Hardware Faults
    YANG Xue-Jun GAO Long
    2007, 18(4):808-820.
    [Abstract](4405) [HTML](0) [PDF 795.63 K](5125)
    Abstract:
    Reliability models from neither reliability engineering nor software reliability can be directly applied to describe the propagation of hardware errors in programs. This paper first sets up a computational data flow model and then shows, using the instruction set of the URM (unlimited register machine) as an example, that a computational data flow graph can be built for a program. Upon the computational data flow model, the error flow model is established. Errors are categorized into two kinds: original errors and propagated errors. By analyzing the propagation rules of these two kinds of errors, six assumptions about error propagation are given, upon which the probabilities of errors at any time and at any place in a program can be calculated. Finally, a sample URM program is given to demonstrate the capability of the error flow model.
    4  An Approach for Optimizing Test Suite Based on Testing Requirement Reduction
    ZHANG Xiao-Fang XU Bao-Wen NIE Chang-Hai SHI Liang
    2007, 18(4):821-831.
    [Abstract](5642) [HTML](0) [PDF 956.33 K](6845)
    Abstract:
    Test suite optimization aims at satisfying all testing objectives with the least number of test cases. For given testing objectives, a reduced testing requirement set can improve the effectiveness and efficiency of test suite optimization. This paper proposes a testing requirement reduction model that describes the interrelations among testing requirements in detail. Based on the model, it presents a reduction method that generates the reduced testing requirement set, which is the basis of test suite generation, reduction and optimization. The experimental results show that the method helps generate smaller test suites and contributes to systematic, reasonable and effective testing.
    5  An Approach to Analyzing Dependence Based on Exception Propagation Analysis
    JIANG Shu-Juan XU Bao-Wen SHI Liang ZHOU Xiao-Yu
    2007, 18(4):832-841.
    [Abstract](4149) [HTML](0) [PDF 631.54 K](5426)
    Abstract:
    Based on an analysis of the effects of exception handling constructs on dependence analysis, this paper proposes a precise and efficient representation for C++ programs with exception handling constructs, the improved control flow graph. It proposes a new approach to analyzing intra-function and inter-function data dependences and control dependences in C++ programs with exception handling constructs, and an efficient algorithm is also presented. This method overcomes the limitations of previous analyses, which were incorrect because they failed to account for the effects of exception handling constructs, and it also provides a basis for automatic dependence analysis that covers exception propagation. Finally, the application of the dependence analysis method to program slicing is discussed.
    6  Levelset Method with Multi-Speed-Function and Its Application in Segmentation of Medical Images
    CHEN Jian TIAN Jie XUE Jian DAI Ya-Kang
    2007, 18(4):842-849.
    [Abstract](4350) [HTML](0) [PDF 358.58 K](5556)
    Abstract:
    Previous level set algorithms use only one level set function and one speed function, and minimizing the energy function during the evolution of the zero level set is a complex procedure. Moreover, a single speed function raises many problems. In this paper, a new multi-level-set algorithm with multiple speed functions is proposed according to the different properties of different objects: different level set functions are constructed in different regions, with corresponding speed functions, and many zero level sets are evolved at the same time, interacting with one another to accomplish the segmentation. This method not only enhances the accuracy of segmentation but also handles the boundary gap problem well, which is quite a puzzle for single-level-set algorithms. Good results are achieved when this method is applied to segment MR and CT images.
    7  Block Motion Estimation Based on Immune Clonal Selection
    LIU Fang PAN Xiao-Ying
    2007, 18(4):850-860.
    [Abstract](3993) [HTML](0) [PDF 749.62 K](5304)
    Abstract:
    Motion estimation is a key technique in video compression and coding. Based on an analysis of the characteristics of motion vectors, a novel block motion estimation algorithm based on immune clonal selection (BMEICS) is proposed in this paper, incorporating strategies such as prediction of the initial search point, Gray encoding and effective stop criteria. BMEICS combines the characteristics of block motion estimation with the global search ability, diversity and resistance to premature convergence of immune clonal selection. It speeds up motion estimation while sacrificing little average quality. Experimental results show that BMEICS achieves almost the same quality as the full search algorithm with far fewer search points, and outperforms existing fast block-matching algorithms for most sequences in terms of speed and quality. Furthermore, BMEICS is applicable to all types of video sequences regardless of the degree of motion.
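For contrast, the full search baseline that fast algorithms such as BMEICS are measured against can be sketched as follows: exhaustively score every candidate displacement in a search window with the sum of absolute differences (SAD). The 8×8 test frames, block size and window radius below are illustrative, not from the paper.

```python
# Exhaustive (full search) block matching with SAD cost.
# cur/ref are 2-D lists of luma values; (bx, by) is the block's
# top-left corner; the window spans displacements -w..+w in each axis.

def sad(cur, ref, bx, by, dx, dy, n):
    """SAD between the n*n block at (bx,by) in cur and its displaced
    counterpart at (bx+dx, by+dy) in ref."""
    return sum(abs(cur[by + j][bx + i] - ref[by + dy + j][bx + dx + i])
               for j in range(n) for i in range(n))

def full_search(cur, ref, bx, by, n=4, w=2):
    """Test every displacement in a (2w+1)^2 window; return the
    (min SAD, dx, dy) triple with the lowest cost."""
    best = None
    for dy in range(-w, w + 1):
        for dx in range(-w, w + 1):
            if (0 <= bx + dx and bx + dx + n <= len(ref[0])
                    and 0 <= by + dy and by + dy + n <= len(ref)):
                cost = sad(cur, ref, bx, by, dx, dy, n)
                if best is None or cost < best[0]:
                    best = (cost, dx, dy)
    return best

ref = [[(5 * x + 3 * y * y) % 17 for x in range(8)] for y in range(8)]
# cur is ref shifted right by one pixel, so the true motion is (-1, 0)
cur = [[ref[y][x - 1] if x >= 1 else 0 for x in range(8)] for y in range(8)]
print(full_search(cur, ref, 2, 2))
```

Full search visits all (2w+1)² candidates per block, which is exactly the cost that immune-clonal and other fast search strategies try to cut while keeping the SAD minimum nearly intact.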
    8  A Simpler and More Effective Particle Swarm Optimization Algorithm
    HU Wang LI Zhi-Shu
    2007, 18(4):861-868.
    [Abstract](10564) [HTML](0) [PDF 692.44 K](13077)
    Abstract:
    The basic particle swarm optimization (bPSO) has some demerits, such as becoming trapped in local extrema, slow convergence velocity and low convergence precision in the late evolutionary stage. Three algorithms, based on simple evolutionary equations and extremum-disturbed arithmetic operators, are proposed to overcome these demerits. The simple PSO (sPSO) discards the particle velocity and reduces the bPSO from a second-order to a first-order difference equation, so that the evolutionary process is controlled only by the particle positions. The extremum disturbed PSO (tPSO) helps the particles overstep local extrema. Experimental results on some classic benchmark functions show that the sPSO greatly improves convergence velocity and precision in evolutionary optimization, and the tPSO can effectively break away from local extrema. tsPSO, which combines the sPSO and tPSO, can obtain excellent optimization results with a smaller population size and fewer generations. The algorithms improve the practicality of particle swarm optimization.
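One plausible reading of a velocity-free, first-order PSO update is sketched below; the abstract does not give the exact sPSO equations, so the update form, coefficients, population size and seed here are all assumptions for illustration, not the authors' algorithm.

```python
import random

def sphere(x):
    """Classic benchmark: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return sum(v * v for v in x)

def spso(f, dim=5, n=30, iters=200, c1=1.0, c2=1.0, seed=7):
    """Velocity-free PSO sketch: each position is updated directly from
    its personal best and the global best (a first-order difference
    equation), with no velocity term."""
    rnd = random.Random(seed)
    xs = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    pbest = [list(x) for x in xs]          # per-particle best positions
    gbest = min(pbest, key=f)              # swarm-wide best position
    start = f(gbest)
    for _ in range(iters):
        for k, x in enumerate(xs):
            for d in range(dim):
                x[d] += (c1 * rnd.random() * (pbest[k][d] - x[d])
                         + c2 * rnd.random() * (gbest[d] - x[d]))
            if f(x) < f(pbest[k]):
                pbest[k] = list(x)
        gbest = min(pbest, key=f)
    return start, f(gbest)

start, final = spso(sphere)
print(start, final)
```

Because personal bests are only ever replaced by better points, the best value found is monotonically non-increasing over iterations; the tPSO idea would add a disturbance to the bests when the swarm stagnates.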
    9  A More Topologically Stable ISOMAP Algorithm
    SHAO Chao HUANG Hou-Kuan ZHAO Lian-Wei
    2007, 18(4):869-877.
    [Abstract](5203) [HTML](0) [PDF 741.24 K](6778)
    Abstract:
    The success of ISOMAP depends greatly on choosing a suitable neighborhood size; however, how to do this effectively remains an open problem. Based on the observation that "short circuit" edges pass through areas with relatively low local density, this paper presents a new variant of ISOMAP, P-ISOMAP (pruned ISOMAP), which can effectively prune the "short circuit" edges that may exist in the neighborhood graph, and is thus much less sensitive to the neighborhood size and more topologically stable than ISOMAP. Consequently, P-ISOMAP can be applied to data visualization more easily than ISOMAP, since the open problem described above is largely avoided. The effectiveness of P-ISOMAP is well verified by the experimental results.
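The neighborhood-graph stage that ISOMAP and P-ISOMAP share can be sketched as follows: connect each point to its k nearest neighbors and approximate geodesic distances by shortest paths in that graph. The density-based pruning of "short circuit" edges proposed in the paper is not reproduced here; the half-circle test data is our own.

```python
import heapq, math

def knn_graph(points, k):
    """Undirected k-nearest-neighbor graph with Euclidean edge weights."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    adj = {i: {} for i in range(n)}
    for i in range(n):
        # index 0 of the sort is the point itself (distance 0); skip it
        nearest = sorted(range(n), key=lambda j: dist(i, j))[1:k + 1]
        for j in nearest:
            d = dist(i, j)
            adj[i][j] = d
            adj[j][i] = d
    return adj

def geodesic(adj, src):
    """Dijkstra shortest-path distances from src over the kNN graph."""
    d = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        du, u = heapq.heappop(pq)
        if du > d.get(u, float("inf")):
            continue
        for v, w in adj[u].items():
            if du + w < d.get(v, float("inf")):
                d[v] = du + w
                heapq.heappush(pq, (d[v], v))
    return d

# 11 points on a half circle: the graph geodesic from end to end should
# be close to the arc length, well above the straight-line distance
pts = [(math.cos(i * math.pi / 10), math.sin(i * math.pi / 10))
       for i in range(11)]
g = geodesic(knn_graph(pts, 2), 0)
print(round(g[10], 3), round(math.dist(pts[0], pts[10]), 3))
```

A "short circuit" edge would connect the two ends of the curve directly and collapse this geodesic toward the chord length, which is exactly the distortion P-ISOMAP's pruning aims to prevent.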
    10  Robust Speech Recognition Based on Neighborhood Space
    YAN Bin-Feng ZHU Xiao-Yan ZHANG Zhi-Jiang ZHANG Fan
    2007, 18(4):878-883.
    [Abstract](4993) [HTML](0) [PDF 405.05 K](5264)
    Abstract:
    This paper presents an approach to robust speech recognition based on neighborhood space, which can achieve robust performance under mismatch between training and testing conditions. This approach uses the neighborhood space of each underlying model to produce a Bayesian predictive density as the observation probability density. Experimental results show that the proposed method improves performance robustness.
    11  An Efficient Algorithm for Mining Approximate Frequent Item over Data Streams
    WANG Wei-Ping LI Jian-Zhong ZHANG Dong-Dong GUO Long-Jiang
    2007, 18(4):884-892.
    [Abstract](5173) [HTML](0) [PDF 545.31 K](6952)
    Abstract:
    A frequent item of a data stream is a data item whose occurrence frequency is above a given threshold. Finding frequent items of a data stream has wide applications in various fields, such as network traffic monitoring, data stream OLAP and data stream mining. In the data stream model, an algorithm can scan the data only once, and the available memory is very limited relative to the volume of the stream, so it is usually impossible to find all the exact frequent items. This paper proposes a novel algorithm to find ε-approximate frequent items of a data stream; its space complexity is O(ε-1) and the processing time for each item is O(1) on average. Moreover, the frequency error bound of the results returned by the proposed algorithm is ε(1-s+ε)N, the best among all existing approaches.
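The abstract does not spell out the algorithm, but the one-pass, O(ε⁻¹)-space, εN-error profile it cites is characteristic of deterministic counter-based summaries. As an illustration of that technique class only (the classic Misra-Gries summary, not the paper's own algorithm):

```python
import math

def misra_gries(stream, epsilon):
    """One-pass epsilon-approximate frequency summary. With
    k - 1 = ceil(1/epsilon) counters, each reported count undercounts
    the true frequency by at most epsilon * N over a stream of N items."""
    k = math.ceil(1 / epsilon) + 1
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # stream item plus every counter lose one occurrence each;
            # drop counters that reach zero
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

# true counts over 1000 items: a=400, b=300, c=200, d=100
stream = list("aababcabcd" * 100)
summary = misra_gries(stream, epsilon=0.34)
print(summary)
```

Every decrement removes one occurrence of k distinct items at once, so at most N/k decrements can happen, which is where the εN undercount bound comes from.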
    12  Complex Twig Pattern Query Processing over XML Streams
    YANG Wei-Dong WANG Qing-Ming SHI Bai-Le
    2007, 18(4):893-904.
    [Abstract](4802) [HTML](0) [PDF 758.91 K](5420)
    Abstract:
    The problem of processing streaming XML data is gaining widespread attention from the research community. In this paper, a novel approach to processing complex twig patterns with OR-predicates and AND-predicates over XML document streams is presented. To improve processing performance, all the twig patterns are combined into a single prefix query tree that represents the queries by sharing their common prefixes. The OR-predicates and AND-predicates of a node are represented as a separate abstract syntax tree associated with that node. Consequently, all the twig patterns are evaluated in a single document-order pass over the input document stream, avoiding the interim results produced by the post-processing of nested paths in YFilter. Experimental results show that, compared with the existing approach, it can significantly improve the performance of matching complex twig patterns over XML document streams, especially for large XML documents. Based on this prior work, the static optimization of twig patterns under a DTD (document type definition), using the structural and constraint information of the DTD before stream processing begins, is also addressed.
    13  Clustering Evolving Data Streams over Sliding Windows
    CHANG Jian-Long CAO Feng ZHOU Ao-Ying
    2007, 18(4):905-918.
    [Abstract](5459) [HTML](0) [PDF 860.78 K](7530)
    Abstract:
    To address sliding-window-based clustering, two types of exponential histograms of cluster features, false positive and false negative, are introduced in this paper. With these structures, a clustering algorithm based on sliding windows is proposed. The algorithm can precisely capture the distribution of recent records with limited memory, and thus can produce clustering results over sliding windows. Furthermore, it can be extended to deal with the clustering problem over the N-n window (an extended model of the sliding window). The evolving data streams in the experiments include the KDD-CUP'99 and KDD-CUP'98 real data sets and synthetic data sets with changing Gaussian distributions. Theoretical analysis and comprehensive experimental results demonstrate that the proposed method achieves high quality with little memory and a fast processing rate.
    14  Layered Solution for SLCA Problem in XML Information Retrieval
    KONG Ling-Bo TANG Shi-Wei YANG Dong-Qing WANG Teng-Jiao GAO Jun
    2007, 18(4):919-932.
    [Abstract](4917) [HTML](0) [PDF 843.04 K](6010)
    Abstract:
    The SLCA (smallest lowest common ancestor) problem is a basic task of keyword search in XML information retrieval. It requires finding all the nodes corresponding to the tightest subtrees in the XML data that involve all the given keywords. Xu et al. describe three algorithms, indexed lookup eager (ILE), the stack algorithm and scan eager (SE), and show that ILE has the best performance. Different from the complicated B+-tree-based ILE algorithm, this paper proposes a layered solution for the SLCA problem, named LISA (layered intersection scan algorithm). It benefits from the distribution rule of SLCA nodes in the XML tree and calculates the SLCA nodes level by level, deepest level first. That is, based on the retrieved Dewey codes corresponding to the given keywords, the Dewey codes of the SLCA nodes can be obtained by intersecting the prefix Dewey codes of each level. Compared with the ILE algorithm, LISA solutions need no sophisticated data structures and have comparable runtime performance. Two instances follow the LISA idea, called LISA I and LISA II, distinguished by whether Dewey codes are kept during the computation or transformed into integer sequences. Extensive experiments evaluate the performance of the algorithms and demonstrate the efficiency of LISA II.
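The Dewey-code operations SLCA computation rests on, taking longest common prefixes and keeping only the smallest resulting subtrees, can be sketched for two keywords as follows. This is a quadratic toy version for clarity, not the level-by-level LISA algorithm, and the Dewey codes below are invented.

```python
# A Dewey code is the root-to-node path, e.g. (0, 1, 2); the LCA of
# two nodes is their longest common Dewey prefix.

def common_prefix(a, b):
    i = 0
    while i < min(len(a), len(b)) and a[i] == b[i]:
        i += 1
    return a[:i]

def slca(list1, list2):
    """SLCA nodes for two keyword match lists (lists of Dewey codes)."""
    # candidate LCAs: for each node matching keyword 1, its deepest
    # common ancestor with any node matching keyword 2
    cands = set()
    for a in list1:
        cands.add(max((common_prefix(a, b) for b in list2), key=len))
    # keep only the smallest subtrees: drop any candidate that is a
    # proper ancestor of another candidate
    return sorted(c for c in cands
                  if not any(c != d and d[:len(c)] == c for d in cands))

k1 = [(0, 0, 1), (0, 2, 0)]   # nodes containing keyword 1
k2 = [(0, 0, 3), (0, 1)]      # nodes containing keyword 2
print(slca(k1, k2))
```

Here the subtree rooted at (0, 0) contains matches for both keywords and no smaller subtree does, so it is the only SLCA; the root (0,) also contains both but is discarded as a proper ancestor of a tighter answer.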
    15  A Fast Outlier Detection Algorithm for High Dimensional Categorical Data Streams
    ZHOU Xiao-Yun SUN Zhi-Hui ZHANG Bai-Li YANG Yi-Dong
    2007, 18(4):933-942.
    [Abstract](5178) [HTML](0) [PDF 571.89 K](6568)
    Abstract:
    This paper considers the problem of outlier detection in data streams, proposes a new metric called the weighted frequent pattern outlier factor for categorical data streams, and presents a novel fast outlier detection algorithm named FODFP-Stream (fast outlier detection for high dimensional categorical data streams based on frequent patterns). FODFP-Stream computes the outlier measure by discovering and maintaining frequent patterns dynamically, and can deal with high dimensional categorical data streams effectively. It can also be extended to handle data streams with continuous or mixed attributes. The experimental results on synthetic and real data sets show the promising effectiveness of the approach.
    16  Topology Control for Wireless Sensor Networks
    ZHANG Xue LU Sang-Lu CHEN Gui-Hai CHEN Dao-Xu XIE Li
    2007, 18(4):943-954.
    [Abstract](9565) [HTML](0) [PDF 699.81 K](11692)
    Abstract:
    Topology control is one of the most fundamental problems in wireless sensor networks. It is of great importance for prolonging network lifetime, reducing radio interference, and increasing the efficiency of MAC (media access control) protocols and routing protocols, among other things. This paper gives a comprehensive introduction to the state of research on topology control. First, the topology control problem and its design objectives are clearly presented. Second, representative research efforts are introduced, analyzed and compared in two areas, power control and sleep scheduling, and their shortcomings are clearly pointed out. Finally, existing problems, open issues and research trends are analyzed and summarized.
    17  Access Control in New Network Environment
    LIN Chuang FENG Fu-Jun LI Jun-Shan
    2007, 18(4):955-966.
    [Abstract](5558) [HTML](0) [PDF 695.74 K](7804)
    Abstract:
    Access control is an important technology for system security, and its mechanisms differ across networks. This paper first introduces the characteristics and applications of three traditional access control policies, DAC (discretionary access control), MAC (mandatory access control) and RBAC (role-based access control), introduces the UCON (usage control) model, and then analyzes access control technology and current research in Grid, P2P and wireless environments respectively. In addition, this paper argues that trustworthy networks, as the development goal of the next-generation Internet, require trust-based access control models to assure security. It investigates trust and reputation models in detail, and finally gives the prospects of access control.
    18  Anomaly Detection Based on Web Users' Browsing Behaviors
    XIE Yi YU Shun-Zheng
    2007, 18(4):967-977.
    [Abstract](5735) [HTML](0) [PDF 672.08 K](7846)
    Abstract:
    This paper proposes an anomaly detection method based on Web user access behavior for defending against application-layer distributed denial-of-service (DDoS) attacks. Based on the hyperlink characteristics of Web pages and the HTTP response behavior of different proxies in the Internet, this paper uses a hidden semi-Markov model (HsMM) to describe the Web user browsing behavior observed at the Web server, and employs the likelihood that an observation sequence of user browsing behavior fits the model as a measure of the user's normality. A parameterized model and its recursive formulae are derived, and an on-line anomaly detection approach is introduced. Some issues involved in practical implementations of the model and the anomaly detection approach are discussed. Finally, an experiment is conducted to validate the model and the algorithm, based on a set of data collected from a heavily loaded Web server and an emulated DDoS attack launching HTTP flooding against the Web site. The experimental results show that the model is effective in measuring user behaviors and in detecting application-layer DDoS attacks.
    19  Towards the Optimization of Access Control List
    ZENG Kuang-Yi YANG Jia-Hai
    2007, 18(4):978-986.
    [Abstract](4211) [HTML](0) [PDF 564.98 K](5781)
    Abstract:
    An access control list (ACL) is used to solve or mitigate network security problems. It is widely deployed in network devices such as routers, switches and firewall appliances to filter packets. However, the performance of a network device degrades when access control lists are applied on its data forwarding interfaces, so optimizing the ACL can greatly improve packet forwarding performance. This paper studies the ACL optimization problem, outlines the overlapping and containment relationships between a single clause and multiple clauses and among multiple clauses, proposes a formal representation of the problem based on these studies, and draws three important conclusions. Based on these conclusions, an approximate optimization algorithm is designed and implemented. Simulation experiments show better performance than similar commercial products, implying that the research not only provides theoretical references but also has important practical applications.
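One of the clause relationships such optimization exploits, an earlier clause fully containing a later one, can be illustrated on source-prefix-only clauses. This is a deliberate simplification of our own (real ACL clauses also match destination, protocol and ports), not the paper's formal model: under first-match semantics, a clause covered by any earlier clause can never fire and can be dropped.

```python
import ipaddress

def covers(outer, inner):
    """True if prefix `outer` contains every address of prefix `inner`."""
    return ipaddress.ip_network(inner).subnet_of(ipaddress.ip_network(outer))

def prune_shadowed(acl):
    """Drop any clause whose match set is fully covered by an earlier
    clause: under first-match evaluation it is unreachable."""
    kept = []
    for action, prefix in acl:
        if not any(covers(p, prefix) for _, p in kept):
            kept.append((action, prefix))
    return kept

acl = [
    ("deny",   "10.0.0.0/8"),
    ("permit", "192.168.1.0/24"),
    ("deny",   "10.1.0.0/16"),       # shadowed by the first clause
    ("permit", "192.168.1.128/25"),  # shadowed by the second clause
]
print(prune_shadowed(acl))
```

Fewer clauses means fewer sequential comparisons per packet on the forwarding path, which is the performance effect the paper's optimization targets; partial overlaps (neither clause containing the other) need the fuller analysis the paper develops.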
    20  A Cluster-Based Multipath Routing Algorithm in Mobile Ad Hoc Networks
    AN Hui-Yao LU Xi-Cheng PENG Wei
    2007, 18(4):987-995.
    [Abstract](5134) [HTML](0) [PDF 524.29 K](5622)
    Abstract:
    This paper proposes a cluster-based multipath routing algorithm for MANETs (CBMRP). It distributes traffic among multiple diverse paths to avoid congestion, optimize bandwidth usage and improve the channel sharing rate. It uses the hierarchical structure of clustering to decrease routing control overhead and improve network scalability. The algorithm is implemented in the OPNET environment, and the results show that it can balance the network load and deal with changes of the network topology effectively, while also improving the reliability, throughput and stability of the network.
    21  Percolation-Based Probabilistic Reliable Dissemination for Mobile Ad Hoc Networks
    ZHENG Jing LU Xi-Cheng CAO Jian-Nong
    2007, 18(4):996-1006.
    [Abstract](4061) [HTML](0) [PDF 663.52 K](4688)
    Abstract:
    This paper proposes PLPD, a novel percolation-based probabilistic reliable dissemination protocol for information dissemination among a group of nodes in a MANET. Unlike other probabilistic reliable dissemination protocols, PLPD is aware of the network topology and disseminates directionally, so it does not require flooding the network with control messages to manage group members. In PLPD, each group member maintains only a partial view of the other members in its neighborhood and disseminates data messages with probability p to a subset of the view. The dissemination process of the PLPD protocol is modeled with percolation theory, and it is proved that PLPD achieves a probability close to 1 that all group members receive every message if p is greater than a certain critical threshold. The simulation results show that the PLPD protocol effectively reduces the network load while providing high reliability, and scales well to large system sizes.
    22  Forking Lemma and the Security Proofs for a Class of ID-Based Signatures
    GU Chun-Xiang ZHU Yue-Fei PAN Xiao-Yu
    2007, 18(4):1007-1014.
    [Abstract](4852) [HTML](0) [PDF 570.08 K](7284)
    Abstract:
    This paper offers arguments for the provable security of a class of ID-based signature schemes called ID-based generic signature schemes in the random oracle model. The theoretical result can be viewed as an extension of the Forking Lemma due to Pointcheval and Stern for ID-based signature schemes, and can help to understand and simplify the security proofs of previous work such as Cha-Cheon's scheme, Hess's scheme-1, Cheon-Kim-Yoon's scheme, and so on.
    23  Independent Oblivious Transfer
    HUANG Qiong ZHAO Yi-Ming
    2007, 18(4):1015-1025.
    [Abstract](4556) [HTML](0) [PDF 681.46 K](4957)
    Abstract:
    This paper presents a flavor of OT named independent oblivious transfer in the public-key public-randomness model (PKPR IOT for short), addressing the open problem posed by De Santis. First it gives a non-interactive implementation of IOT, based on the quadratic residuosity assumption, which can independently and obliviously transfer polynomially many messages. Since this implementation is limited to a pre-fixed number of transfers, another non-interactive implementation of IOT is presented, which can independently transfer messages any number of times. The second implementation, however, requires the sender to be honest and does not let the receiver make his choice independently. So a third, interactive implementation is given. It is based on the BBCS oblivious transfer scheme of Rivest, and is considerably more efficient than the two non-interactive counterparts. All three implementations are secure against receivers with unlimited computational power.
    24  Macor: A Maintainable XQuery Model for Representing Nested Schema Mappings
    QIAN Gang DONG Yi-Sheng
    2007, 18(4):1026-1038.
    [Abstract](3923) [HTML](0) [PDF 524.03 K](5471)
    Abstract:
    This paper proposes a model called mapping and correlation (Macor) to represent nested schema mappings. With Macor, a full mapping is modeled as a number of simple atomic mappings linked by correlations. The expressive power of Macor is studied through a fragment of XQuery called CoXQ, and the issues in implementing Macor are addressed. Preliminary experimental results show that with Macor a full mapping can be modeled incrementally in a piecemeal fashion, and that in refining or maintaining mappings, Macor makes it possible to localize modifications to a few atomic mappings and correlations and to reuse the other parts of the full mapping.
    25  A High Performance Computer System Evaluation Model Based on HPCC and Analytic Hierarchy Process
    LIU Chuan-Yi WANG Dong-Sheng
    2007, 18(4):1039-1046.
    [Abstract](4662) [HTML](0) [PDF 551.10 K](6671)
    Abstract:
    The HPCC (high performance computing challenge) benchmark is issued by the DARPA HPCS (high productivity computing systems) project, which is meant to define program performance boundaries, and it has received broad attention from both industry and academia. In practice, however, HPCC is not very popular, which can mostly be attributed to its multiple performance parameters: lacking a unique and clear evaluation mark, HPCC requires analyzers and decision makers to judge manually from the several test results. This paper proposes AHPCC, a high performance computer system evaluation model based on HPCC and the analytic hierarchy process. After the candidate systems run the HPCC benchmark set and the evaluation results are obtained, AHPCC builds the hierarchy structure graph from the benchmark results and the application objectives, then calculates the related weight vector for each application objective, and finally gives a unique weight for the ultimate application objective per candidate system. This paper calculates and analyzes the AHPCC results for 12 high performance systems that have been measured with the HPCC benchmark set. The experimental results indicate that the AHPCC model can efficiently and correctly evaluate the application characteristics of high performance systems.
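The analytic-hierarchy-process step such a model builds on can be illustrated with the standard geometric-mean approximation of the principal eigenvector: a reciprocal pairwise comparison matrix over the criteria is reduced to a single priority weight vector. The 3×3 comparison matrix below is invented for illustration, not taken from the paper.

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix via the
    geometric-mean approximation: normalize the row geometric means."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# criterion A judged 3x as important as B and 5x as important as C;
# B judged 2x as important as C (entries below the diagonal are the
# reciprocals, as AHP requires)
m = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(m)
print([round(x, 3) for x in w])
```

The resulting weights sum to 1 and preserve the judged ordering (A > B > C); in a full AHP evaluation, per-criterion scores of each candidate system are combined with such weights into the single overall mark that the multi-parameter HPCC results otherwise lack.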
    26  SimOS-Goodson: A Goodson-Processor Based Multi-Core Full-System Simulator
    GAO Xiang ZHANG Fu-Xin TANG Yan ZHANG Long-Bing HU Wei-Wu TANG Zhi-Min
    2007, 18(4):1047-1055.
    [Abstract](5213) [HTML](0) [PDF 570.91 K](6969)
    Abstract:
    As chip multiprocessors (CMPs) have become the trend in high performance microprocessors, the target workloads have become more and more diversified. Traditional user-level simulators cannot handle them, so new simulators are needed for future architecture research. Based on the SimOS full-system environment, a new multi-core full-system simulator of Goodson processors, SimOS-Goodson, has been designed and implemented. SimOS-Goodson decouples simulation functionality and timing, and adopts a new value-prediction approach to implement memory consistency in the simulation environment. The credibility and accuracy of SimOS-Goodson are established by cross-validating the simulator against the actual hardware. The simulator inherits benefits such as high speed and high flexibility from traditional user-level simulators, and adds accuracy, full-system support and ease of use. By porting the entire Linux OS, analysis and evaluation of the microarchitecture and workloads can be conducted easily in the SimOS-Goodson full-system environment. On a 3.0 GHz Pentium 4 machine, the speed of SimOS-Goodson exceeds 300K instructions per second. SimOS-Goodson will play a key role in research on future Goodson multi-core architectures.
    27  Automatedly Checking Function Execution Context of Kernel Programs in Operating Systems
    WANG Li YANG Xue-Jun WANG Ji LUO Yu
    2007, 18(4):1056-1067.
    [Abstract](4768) [HTML](0) [PDF 690.53 K](5754)
    Abstract:
    Function execution context correctness is one of the critical properties most easily violated by OS (operating system) kernel programs, and it is non-trivial to check. The existing solutions suffer from difficulties and limitations. This paper presents a framework, PRPF, to check the correctness of function execution context, and describes the modeling process and algorithms in detail. PRPF has advantages over existing techniques, such as checking source code directly, requiring no formal specifications, low time and space costs, and good scalability. The technique has been applied to checking the Linux 2.4.20 kernel sources. The experimental results show that PRPF can check the correctness of function execution context as expected by automatically exploring all paths in the sources; 23 errors and 5 false positives were found in the 'drivers/net' source directory. The technique is very helpful in improving the quality of OS kernel code.
    28  Mission Reliability Analysis of Generalized Phased Mission Systems
    MO Yu-Chang YANG Xiao-Zong CUI Gang LIU Hong-Wei
    2007, 18(4):1068-1076.
    [Abstract](5029) [HTML](0) [PDF 559.63 K](5989)
    Abstract:
    To address the weaknesses of the state of the art, this paper shows how to analyze the mission reliability of generalized PMS (phased mission systems) with random phase durations and Markov regenerative intraphase processes. These generalized PMS are modeled with a simple and practical 5-tuple analysis model. The paper proves that the underlying intraphase processes are Markov regenerative processes, shows how to compute the conditional transient occupation probability matrix for each phase, and then shows how to use these matrices to compute the mission reliability of the generalized PMS. Laplace transformation is used to simplify the whole analysis process. Equipped with the analysis model, the reliability of generalized PMS can be analyzed efficiently.

    Contact Information
    • Journal of Software
    • Sponsor: Institute of Software, CAS, China
    • Postal code: 100190
    • Phone: 010-62562563
    • Email: jos@iscas.ac.cn
    • Website: https://www.jos.org.cn
    • ISSN 1000-9825 / CN 11-2560/TP
    • Domestic price: 70 RMB
    Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-4
    Address: 4# South Fourth Street, Zhong Guan Cun, Beijing 100190
    Phone: 010-62562563  Fax: 010-62562533  Email: jos@iscas.ac.cn