• Volume 20, Issue 6, 2009 Table of Contents
    • Review Articles
    • Combinatorial Testing: Principles and Methods

      2009, 20(6):1393-1405.

      Abstract: Combinatorial testing can test systems with a small number of test cases while preserving fault-detection ability. However, the test case generation problem for combinatorial testing is NP-complete. The efficiency and complexity of this testing method have attracted many researchers from combinatorics and software engineering. This paper surveys recent research on the topic, including various combinatorial test criteria, the relations between the test generation problem and other NP-complete problems, mathematical methods for constructing test cases, computer search techniques for test generation, and fault localization techniques based on combinatorial testing.
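
      To make the idea concrete, the sketch below (not from the paper; the parameter names and values are invented) checks pairwise (2-way) coverage of a small test suite: four tests cover all value pairs of three two-valued parameters, whereas exhaustive testing would need eight.

      ```python
      # Hypothetical illustration of pairwise (2-way) combinatorial coverage.
      from itertools import combinations, product

      def uncovered_pairs(parameters, tests):
          """Return the set of parameter-value pairs not exercised by any test.

          parameters: dict mapping parameter name -> list of possible values
          tests:      list of dicts mapping parameter name -> chosen value
          """
          names = sorted(parameters)
          missing = set()
          for a, b in combinations(names, 2):
              for va, vb in product(parameters[a], parameters[b]):
                  if not any(t[a] == va and t[b] == vb for t in tests):
                      missing.add(((a, va), (b, vb)))
          return missing

      params = {"os": ["linux", "win"], "browser": ["ff", "chrome"], "ipv6": [True, False]}
      suite = [
          {"os": "linux", "browser": "ff", "ipv6": True},
          {"os": "linux", "browser": "chrome", "ipv6": False},
          {"os": "win", "browser": "ff", "ipv6": False},
          {"os": "win", "browser": "chrome", "ipv6": True},
      ]
      print(uncovered_pairs(params, suite))   # empty set: 4 tests cover all 12 value pairs
      ```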

    • Parallel Model of DDS and Its Formalization

      2009, 20(6):1406-1413.

      Abstract: Although the DDS (deadline-driven scheduler) model is a classical model of real-time systems, space constraints are not included in its original framework. Based on an extension of that framework, multi-process task scheduling with space constraints is investigated. By studying the parallel model of DDS, the concept of the maximal separated task set, a primary scheduling algorithm, and a general scheduling algorithm are presented. To formalize the parallel model of DDS, the paper extends duration calculus to DC* using ideas from separation logic, which can express the space constraint, and gives the corresponding formalization.
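
      For reference, here is a minimal simulation of the classical deadline-driven (earliest-deadline-first) scheduler that the parallel model extends; the task set and the unit-time discretization are assumptions of this sketch, and the space constraints studied in the paper are not modeled.

      ```python
      # Minimal sketch of deadline-driven (EDF) scheduling of periodic tasks.
      def edf_schedule(tasks, horizon):
          """tasks: list of (period, wcet) with implicit deadlines; returns the task run at each time unit."""
          remaining = [0] * len(tasks)              # outstanding execution time of each task's current job
          deadline = [0] * len(tasks)               # absolute deadline of each task's current job
          timeline = []
          for t in range(horizon):
              for i, (period, wcet) in enumerate(tasks):
                  if t % period == 0:               # a new job is released
                      remaining[i] = wcet
                      deadline[i] = t + period      # implicit deadline = next release
              ready = [i for i in range(len(tasks)) if remaining[i] > 0]
              if ready:
                  i = min(ready, key=lambda k: deadline[k])   # earliest deadline first
                  remaining[i] -= 1
                  timeline.append(i)
              else:
                  timeline.append(None)
          return timeline

      # Utilisation 1/4 + 2/5 + 3/10 = 0.95 <= 1, so EDF meets every deadline here.
      print(edf_schedule([(4, 1), (5, 2), (10, 3)], 20))
      ```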

    • SAT-Based Compositional Verification Strategy for Concurrent Software with States, Events

      2009, 20(6):1414-1424.

      Abstract: For the state/event linear temporal logic SE-LTL, a SAT-based bounded model checking procedure that avoids the space blow-up of BDDs is presented. For SE-LTL-X, it is shown how to integrate this procedure with stuttering equivalence, which speeds up verification. Furthermore, a framework for model checking concurrent software systems is presented that integrates three powerful verification techniques: SAT-based bounded model checking, counterexample-guided abstraction refinement, and compositional reasoning. In this framework the abstraction and refinement steps are performed over each component separately, and the model checking step is symbolic. An example shows that the framework can reduce verification time and space.
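
      As a toy illustration of SAT-based bounded model checking (not the paper's procedure or tool), the sketch below unrolls the transition relation of a 2-bit counter k steps into CNF and asks whether the state counter == 3 is reached at step k; brute-force enumeration stands in for the SAT solver so the example stays self-contained.

      ```python
      # Bounded model checking sketch: unroll a 2-bit counter and check a "bad" state.
      from itertools import product

      def var(bit, step):                            # one Boolean variable per counter bit per time step
          return ("b%d" % bit, step)

      def counter_cnf(k):
          """CNF for: start at 00, apply the increment relation k times, end in state 11."""
          clauses = [[(var(0, 0), False)], [(var(1, 0), False)]]                # initial state 00
          for t in range(k):
              b0, b1, n0, n1 = var(0, t), var(1, t), var(0, t + 1), var(1, t + 1)
              clauses += [[(n0, True), (b0, True)], [(n0, False), (b0, False)]]   # b0' <-> not b0
              clauses += [[(n1, False), (b1, True), (b0, True)],                  # b1' <-> b1 xor b0
                          [(n1, False), (b1, False), (b0, False)],
                          [(n1, True), (b1, False), (b0, True)],
                          [(n1, True), (b1, True), (b0, False)]]
          clauses += [[(var(0, k), True)], [(var(1, k), True)]]                   # bad state 11 at step k
          return clauses

      def satisfiable(clauses):                      # stand-in for a real SAT solver
          vs = sorted({v for c in clauses for v, _ in c})
          return any(all(any(assign[v] == pol for v, pol in c) for c in clauses)
                     for bits in product([False, True], repeat=len(vs))
                     for assign in [dict(zip(vs, bits))])

      for k in range(5):
          print(k, satisfiable(counter_cnf(k)))      # first satisfiable at k = 3: 00 -> 01 -> 10 -> 11
      ```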

    • Review Articles
    • Research on Software Fault Injection Testing

      2009, 20(6):1425-1443.

      Abstract: Software fault injection testing (SFIT) has been developed for thirty years and is one of the most active areas of software testing research. As a non-traditional testing technique, it plays an important role in enhancing software quality, eliminating software failures, and improving the software development process. A detailed review of SFIT research is presented, based on a survey and classification of current SFIT techniques. Important testing frameworks and tools that are effective at present are then discussed, along with a brief description of the Component Security Testing System (CSTS). Based on this investigation, the open issues and challenges of SFIT are pointed out and future development trends are outlined.

    • Heuristic Symbolic Verification of Safety Properties for Parameterized Systems

      2009, 20(6):1444-1456.

      Abstract: A parameterized system is a system that involves numerous instantiations of the same finite-state process and depends on a parameter that defines its size. Backward reachability analysis has been widely used for verifying parameterized systems against safety properties modeled as upward-closed sets. As in the finite-state case, the verification of parameterized systems faces the state explosion problem, and the success of model checking depends on the data structure used to represent sets of states. Several constraint-based approaches have been proposed to symbolically represent upward-closed sets with infinitely many states, but those approaches still face the symbolic state explosion problem or the containment problem, i.e., deciding whether the set of concrete states represented by one set of constraints is a subset of that represented by another, which is co-NP-complete. As a result, the examples investigated in the literature would be considered of negligible size in finite-state model checking. This paper presents several heuristic rules specific to parameterized systems that help mitigate the problem. Experimental results show that efficiency is significantly improved and that the heuristic algorithm is several orders of magnitude faster than the original one in certain cases.
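
      For intuition about the containment test mentioned above, the sketch below handles the simplest representation, in which an upward-closed set of counter vectors is given by its minimal elements: one set contains another iff every generator of the second lies componentwise above some generator of the first. The generators here are invented; the richer constraint representations discussed in the paper make this test much harder (co-NP-complete).

      ```python
      # Containment of upward-closed sets represented by their minimal elements.
      def dominates(small, big):
          return all(s <= b for s, b in zip(small, big))

      def contains(gens_a, gens_b):
          """Does the upward closure of gens_a contain the upward closure of gens_b?"""
          return all(any(dominates(a, b) for a in gens_a) for b in gens_b)

      A = [(1, 0, 2), (0, 3, 0)]           # minimal elements of an upward-closed set of counter vectors
      B = [(2, 1, 2), (1, 3, 1)]
      print(contains(A, B))                # True: each generator of B is above some generator of A
      print(contains(B, A))                # False
      ```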

    • Non-Functional Attributes Oriented Description and Verification in UML Class Diagrams

      2009, 20(6):1457-1469.

      Abstract: Modeling a system is a very important activity in software development. A high-quality model should describe not only the functional attributes of the system, i.e., what the system can do, but also its non-functional attributes, i.e., the quality of the system. Although the de facto modeling approaches and tools adequately support modeling of functional attributes, they neglect non-functional attributes, especially how to integrate the description of functional and non-functional attributes in one model and how to verify properties of the non-functional attributes. In this paper, the UML class diagram is extended to describe non-functional attributes by adding two model elements: a non-functional attributes notation and a constraints table. An approach is given to verify the consistency and satisfiability of the non-functional attributes in the extended UML class diagram. An example demonstrates the proposal, and a tool that supports the description and verification of non-functional attributes in UML is introduced.

    • Service Collaboration Approach Based on Decentralized Mediation Model

      2009, 20(6):1470-1486.

      Abstract: Because services are developed by independent providers, it is impossible to predict all potential interactions at development time. To ensure correct service collaboration, it is necessary to check the interaction compatibility among participating services. This paper presents a top-down approach, based on a decentralized mediation model, to improve the reusability of available services and the flexibility of service collaboration. Through failure-equivalent projection, the collaboration process is transformed into decentralized processes, which enables a more efficient pair-wise compatibility check. A mediation-based method is proposed to check whether compatibility can be achieved through the mediation mechanism, and algorithms for compatibility checking and automatic mediator generation are provided. This approach loosens the coupling between available services and the collaborative environment, thus improving the flexibility of collaboration while preserving the autonomy of services.

    • Automatic Hierarchical Layout Algorithm for UML Class Diagram

      2009, 20(6):1487-1498.

      Abstract: UML class diagrams are helpful for understanding complicated object-oriented software systems, and reasonable placement of diagram elements makes them more readable and understandable. Since inheritance is a hierarchical relation, hierarchical layout methods are usually adopted to draw UML class diagrams. Because domain-specific knowledge and drawing criteria related to class diagrams must be considered in the layout, general hierarchical layout algorithms for nested digraphs have to be extended according to these criteria; however, existing hierarchical layout algorithms for class diagrams cannot handle the nested relations among packages, classes, and interfaces, and existing compound layout methods for digraphs cannot be used directly to draw class diagrams. In this paper, layout criteria are selected based on graph drawing aesthetics, UML class diagram semantics, and software visualization knowledge. The nested constraints in the rank assignment, crossing reduction, and coordinate assignment phases of hierarchical layout are analyzed, and an existing hierarchical layout algorithm is extended to cope with nested graphs according to these criteria. Experimental results show that the drawings of reverse-engineered class diagrams are more readable and understandable, with proper hierarchies, nested relations, fewer crossings, and optimized drawing area.

    • Benchmarking Software Task Performance Based on Data Envelopment Analysis

      2009, 20(6):1499-1510.

      Abstract: In this paper, TaskBeD, a software task performance benchmarking method based on Data Envelopment Analysis (DEA), is proposed. TaskBeD's fundamental benchmarking model and algorithms (transforming undesirable outputs, identifying high-performance tasks, establishing reference sets, and performing sensitivity analysis) are introduced. Experimental results show that TaskBeD handles multiple input/output variables and variable returns to scale (VRS) effectively.
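
      As background, the sketch below sets up the standard input-oriented, variable-returns-to-scale (BCC) DEA efficiency model with SciPy's linear-programming routine; the three "tasks" and their input/output figures are invented, and TaskBeD's own models and algorithms are not reproduced here.

      ```python
      # Input-oriented DEA efficiency under variable returns to scale (the BCC model).
      import numpy as np
      from scipy.optimize import linprog

      def vrs_efficiency(X, Y, o):
          """Efficiency of unit o.  X: inputs (m x n), Y: outputs (s x n); columns are units."""
          m, n = X.shape
          s, _ = Y.shape
          c = np.r_[1.0, np.zeros(n)]                       # minimise theta over (theta, lambda_1..lambda_n)
          A_ub = np.vstack([np.c_[-X[:, [o]], X],           # sum(lambda * x) <= theta * x_o
                            np.c_[np.zeros((s, 1)), -Y]])   # sum(lambda * y) >= y_o
          b_ub = np.r_[np.zeros(m), -Y[:, o]]
          A_eq = np.r_[[0.0], np.ones(n)].reshape(1, -1)    # VRS: sum(lambda) = 1
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * (n + 1))
          return res.fun                                    # theta in (0, 1]; 1 means the unit is efficient

      # Invented data: three tasks, inputs = (effort, defects injected), output = delivered size.
      X = np.array([[10.0, 20.0, 22.0], [3.0, 6.0, 7.0]])
      Y = np.array([[100.0, 180.0, 150.0]])
      print([round(vrs_efficiency(X, Y, o), 3) for o in range(3)])
      ```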

    • Discrete No-Fit Polygon, A Simple Structure for the 2-D Irregular Packing Problem

      2009, 20(6):1511-1520.

      Abstract: This paper presents a model based on the discrete no-fit polygon for the two-dimensional irregular packing problem. Burke et al. presented an effective BLF (bottom-left-fill) algorithm for the irregular packing problem; however, their algorithm may generate invalid results in some special cases. To solve this problem, a model based on the discrete no-fit polygon is proposed and its correctness is strictly proved. The model considers only points and intervals, which greatly decreases the geometric complexity of the original problem and makes it easy to apply many heuristic strategies. Computational results show that the algorithm based on the discrete no-fit polygon model is very efficient.
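
      As a toy example of the no-fit polygon (NFP) idea, the sketch below computes the NFP for the axis-aligned rectangular case, where it is itself a rectangle: placing the moving piece's reference point strictly inside the NFP makes the two pieces overlap. The sizes and positions are invented, and the paper's discrete NFP for irregular polygons is considerably more involved.

      ```python
      # No-fit polygon of two axis-aligned rectangles (the simplest special case).
      def nfp_rectangle(fixed_pos, fixed_size, moving_size):
          """NFP of a fixed rectangle w.r.t. a moving one referenced at its bottom-left corner."""
          (xb, yb), (wb, hb), (wa, ha) = fixed_pos, fixed_size, moving_size
          return (xb - wa, yb - ha), (wa + wb, ha + hb)     # bottom-left corner and size of the NFP

      def overlaps(ref_point, nfp):
          (x, y), ((nx, ny), (nw, nh)) = ref_point, nfp
          return nx < x < nx + nw and ny < y < ny + nh      # strictly inside the NFP => pieces overlap

      nfp = nfp_rectangle(fixed_pos=(4, 4), fixed_size=(3, 2), moving_size=(2, 2))
      print(nfp)                          # ((2, 2), (5, 4))
      print(overlaps((3, 3), nfp))        # True: a 2x2 piece at (3, 3) would overlap the fixed piece
      print(overlaps((2, 2), nfp))        # False: merely touching at a corner is allowed
      ```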

    • Extension Rule Algorithms Based on IMOM and IBOHM Heuristics Strategies

      2009, 20(6):1521-1527.

      Abstract: Theorem proving based on the extension rule is a novel reasoning method. Building on the extension rule algorithm IER (improved extension rule), this paper proposes the IMOM (improved maximum occurrences on clauses of maximum size) and IBOHM (improved BOHM) heuristic strategies to guide the choice of the clause that limits the search space. Applying these two strategies to IER, the algorithms IMOMH_IER and IBOHMH_IER are designed and implemented. Since a more appropriate search space is obtained with these heuristics, the algorithms can decide whether the original problem is satisfiable in a fraction of a second. Experimental results show that the algorithms are 10 to 200 times faster than the existing algorithms DR (directional resolution) and IER.
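
      For reference, the sketch below implements a classic MOM-style branching heuristic from SAT solving, which scores variables by their occurrences in the smallest clauses; the paper's IMOM and IBOHM strategies adapt this kind of occurrence counting to guide clause selection for the extension rule. The CNF instance and the weighting constant k are invented.

      ```python
      # A classic MOM-style branching heuristic over a CNF formula in DIMACS-style literals.
      from collections import Counter

      def mom_variable(clauses, k=2):
          """Score v by (f(v) + f(-v)) * 2**k + f(v) * f(-v), where f counts occurrences
          in minimum-size clauses, and return the best-scoring variable."""
          min_len = min(len(c) for c in clauses)
          f = Counter(l for c in clauses if len(c) == min_len for l in c)
          variables = {abs(l) for l in f}
          return max(variables, key=lambda v: (f[v] + f[-v]) * 2 ** k + f[v] * f[-v])

      cnf = [[1, -2], [1, 3], [-1, 2, 3], [2, -3, 4]]
      print(mom_variable(cnf))             # variable 1: it occurs most often in the binary clauses
      ```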

    • Lowest-Level Left Align Best-Fit Algorithm for the 2D Rectangular Strip Packing Problem

      2009, 20(6):1528-1538.

      Abstract: In this paper, a heuristic placement algorithm for the two-dimensional rectangular strip packing problem (2DR-SPP), the lowest-level left-align best-fit (LLABF) algorithm, is presented. The LLABF algorithm is based on a best-fit priority principle that combines several heuristic rules, such as the full-fit-first, width-fit-first, height-fit-first, joint-width-fit-first, and placeable-first rules. Unlike the bottom-left (BL), improved bottom-left (IBL), and bottom-left-fill (BLF) heuristic placement algorithms, LLABF dynamically selects the best-fitting rectangle at each packing step. Computational results show that 2DR-SPP can be solved more effectively by combining LLABF with a genetic algorithm (GA).
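
      The sketch below shows a rough lowest-level, left-aligned, best-fit placement step on a discretised strip (one height per unit-width column). Only a single width-based fit rule is applied, whereas LLABF combines the several rules listed above; the rectangles and strip width are invented, and every rectangle is assumed to be no wider than the strip.

      ```python
      # Lowest-level, left-aligned, best-fit placement on a column-height skyline.
      def place_all(rects, strip_width):
          """rects: list of (w, h) integer rectangles (consumed); returns placements and the used height."""
          heights = [0] * strip_width                   # current skyline: one height per unit-width column
          placements = []
          while rects:
              x = min(range(strip_width), key=lambda i: (heights[i], i))   # lowest level, leftmost
              level = heights[x]
              gap = 0                                   # width of the flat region starting at x
              while x + gap < strip_width and heights[x + gap] == level:
                  gap += 1
              fits = [r for r in rects if r[0] <= gap]
              if fits:                                  # best fit by width: the widest rectangle that fits
                  w, h = max(fits, key=lambda r: r[0])
                  rects.remove((w, h))
                  placements.append((x, level, w, h))
                  for i in range(x, x + w):
                      heights[i] = level + h
              else:                                     # nothing fits: raise the gap to its lowest neighbour
                  neighbours = [heights[i] for i in (x - 1, x + gap) if 0 <= i < strip_width]
                  for i in range(x, x + gap):
                      heights[i] = min(neighbours)
          return placements, max(heights)

      print(place_all([(4, 2), (3, 3), (3, 2), (6, 1)], 10))
      ```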

    • Review Articles
    • Research of QoS on Wireless Mesh Networks

      2009, 20(6):1539-1552.

      Abstract: As a key technology for next-generation wireless networking, wireless mesh networks (WMNs) should support quality of service (QoS) because of their capability to integrate with other networks and provide various services. In this paper, the state of the art of QoS research on WMNs is presented. By analyzing current QoS architectures, a QoS architecture for WMNs is discussed. To address QoS issues at and below the network layer, recent research on power control, wireless environment awareness, QoS-aware MAC (medium access control) protocols, QoS routing, cross-layer design for QoS, and other aspects is comprehensively summarized and analyzed. Finally, future directions for QoS research on WMNs are proposed.

    • Biometric Template Protection

      2009, 20(6):1553-1573.

      Abstract: This paper reviews the state of the art of biometric template protection at home and abroad, and systematizes the related research directions. First, the underlying weaknesses of traditional biometric systems and the attacks they are vulnerable to are clarified, from which the necessity and difficulty of protecting biometric templates follow naturally. The paper then classifies the methods and algorithms into categories according to how they operate, and elaborates on some representative ones, such as Biohashing and Fuzzy Vault. In the experiments, Biohashing and Fuzzy Vault are evaluated on different fingerprint databases; the results show the advantages and disadvantages of Biohashing, as well as the better performance and security of the authors' improved Fuzzy Fingerprint Vault on FVC2002 DB2.
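
      As a simplified illustration of one reviewed technique, the sketch below performs Biohashing-style template protection: the biometric feature vector is projected onto random directions derived from a user token and thresholded into a binary code. The feature dimension, token seed and noise level are invented; real systems operate on genuine fingerprint features.

      ```python
      # Simplified Biohashing: token-keyed random projection followed by binarisation.
      import numpy as np

      def biohash(features, token_seed, n_bits=32):
          rng = np.random.default_rng(token_seed)           # the user token seeds the projection
          directions = rng.standard_normal((features.size, n_bits))
          q, _ = np.linalg.qr(directions)                   # orthonormalise the projection directions
          return (q.T @ features > 0).astype(np.uint8)      # threshold the projections into bits

      rng = np.random.default_rng(0)
      feat = rng.standard_normal(64)                        # stand-in for a fingerprint feature vector
      noisy = feat + 0.1 * rng.standard_normal(64)          # a second, noisy capture of the same finger
      print((biohash(feat, 1234) != biohash(noisy, 1234)).sum(), "of 32 bits differ")
      ```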

    • Network Distance Prediction Technology Research

      2009, 20(6):1574-1590.

      Abstract: Distance information between nodes in a P2P network is the basis for topology awareness, which aims at optimizing overlay applications and solving problems such as network monitoring. However, accurately and completely measuring the distances between nodes is infeasible because P2P systems are large-scale, self-organized, and highly dynamic. Consequently, researchers have put forward various prediction methods, and network distance prediction is emerging as a new research hotspot in the P2P area. This paper first proposes a research framework, based on which the main aspects and the related technical issues are analyzed. The research history and a classification of existing work are then investigated, and many typical methods are introduced and compared. Finally, precision metrics and future research trends of network distance prediction are reviewed.

    • Run Length Coding-Based Dynamic Trust Model for P2P Network

      2009, 20(6):1602-1616.

      Abstract: This paper presents RunTrust, a trust model based on run-length coding. RunTrust performs trust evaluation by compressing records of peers' behaviors that retain more information, including the time dimension, so it performs well in trust evaluation and in detecting malicious behaviors, especially oscillating behaviors. The capability of filtering false feedback is also improved. Simulation results indicate that RunTrust significantly raises the efficiency of the trust management system at very low cost. The work on RunTrust paves the way for research on trust data compaction.
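
      A toy illustration of the underlying representation: a peer's interaction history is compressed with run-length coding, which preserves the burst structure over time that helps expose oscillating behaviour. The outcome string and the good/bad encoding are invented, and RunTrust's actual evaluation over such runs is not reproduced.

      ```python
      # Run-length coding of a peer's transaction history ('G' = good, 'B' = bad, oldest first).
      def run_length_encode(history):
          runs = []
          for outcome in history:
              if runs and runs[-1][0] == outcome:
                  runs[-1][1] += 1
              else:
                  runs.append([outcome, 1])
          return [(o, n) for o, n in runs]

      history = "GGGGGBBGGGGGGBBBBGG"      # an oscillating peer: bursts of good, then bad, behaviour
      runs = run_length_encode(history)
      print(runs)                          # [('G', 5), ('B', 2), ('G', 6), ('B', 4), ('G', 2)]
      print(sum(n for o, n in runs if o == "B") / len(history))   # overall fraction of bad transactions
      ```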

    • Set of Necessary and Sufficient Conditions in Collision Attacks on MD5

      2009, 20(6):1617-1624.

      Abstract: By analyzing the properties of the nonlinear functions used in MD5 and the differences in terms of XOR and subtraction modulo 2^32, this paper proves that some sufficient conditions presented by Liang Jie and Lai Xuejia are also necessary to guarantee the differential path from step 23 to step 62, and gives a set of necessary and sufficient conditions that guarantee the output differences of the last two steps. Based on this set of conditions, an improved collision attack algorithm on MD5 is presented. Finally, the average computational complexity of the attack is analyzed, which is 0.7187 times that of previous collision attacks, and the efficiency of the improved algorithm is confirmed by computer simulations.
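
      A small worked example of the two difference notions combined in the analysis, the XOR difference and the subtraction difference modulo 2^32 of two 32-bit words; the word values are chosen only to show that a tiny modular difference can correspond to a large XOR difference.

      ```python
      # XOR difference versus subtraction difference modulo 2^32.
      MASK = 0xFFFFFFFF

      def xor_diff(a, b):
          return (a ^ b) & MASK

      def mod_diff(a, b):
          return (a - b) & MASK            # subtraction modulo 2^32

      a, b = 0x80000000, 0x7FFFFFFF
      print(hex(xor_diff(a, b)))           # 0xffffffff: every bit differs
      print(hex(mod_diff(a, b)))           # 0x1: yet the modular difference is only +1
      ```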

    • Component Property Based Remote Attestation

      2009, 20(6):1625-1641.

      Abstract: A fine-grained, component-based property attestation is proposed to prove that a user platform satisfies a security property predefined by the remote relying party. Compared with other remote attestation schemes, CPBA (component property based attestation) offers better semantic and property expressiveness. It is not only more fine-grained and extensible, but also makes it easier to issue, verify, and revoke property certificates. CPBA guarantees the authenticity of attestation through component commitments and protects the privacy of platform components with zero-knowledge proofs. It is proved secure in the random oracle model under the strong RSA assumption. Experimental results from a prototype system indicate that CPBA is a flexible, usable, and highly efficient attestation scheme that has no influence on system performance.

    • Topology-Transparent Hybrid MAC Protocol for Ad Hoc Networks

      2009, 20(6):1642-1650.

      Abstract: A topology-transparent hybrid MAC (TTHM) protocol is proposed for ad hoc networks. The protocol is based on the protocol threading technique underlying the threaded time spread multiple access (T-TSMA) protocol, combined with a hybrid channel access strategy. According to the current network topology and traffic load, TTHM lets each node use both its assigned slots and its non-assigned slots effectively. TTHM is suitable for distributed applications because it keeps the advantage of topology transparency while eliminating the maximum nodal degree constraint. After the protocol is presented, its performance is analyzed. Simulation results show that TTHM outperforms the T-TSMA protocol.

    • Robust Appearance-Based Method for Head Pose Estimation

      2009, 20(6):1651-1663.

      Abstract: This paper proposes a new pose estimation method based on the appearance of 2D head images. First, 1D Gabor filters are used to extract features from the raw images. Compared with traditional 2D Gabor representations, the 1D Gabor representations are more closely related to head pose, and their computational and storage advantages are obvious. Second, a new method named kernel local Fisher discriminant analysis is applied to the extracted features to handle the multimodal problem while enhancing discrimination ability. Experimental results show that the proposed method is effective for pose estimation. Notably, the generalizability of the method is illustrated by its strong performance when the training and testing datasets are heterogeneous.
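
      A hedged sketch of extracting 1D Gabor responses row by row, in the spirit of the 1D representation argued for above; the filter parameters, image size and the use of magnitude responses are assumptions of this sketch, and the kernel local Fisher discriminant analysis step is not shown.

      ```python
      # Row-wise 1D Gabor features: a Gaussian-windowed complex sinusoid convolved with each image row.
      import numpy as np

      def gabor_1d(length=15, sigma=3.0, freq=0.25):
          x = np.arange(length) - length // 2
          return np.exp(-x**2 / (2 * sigma**2)) * np.exp(2j * np.pi * freq * x)

      def row_gabor_features(image, kernel):
          """Convolve every row with the kernel and keep the response magnitudes."""
          return np.stack([np.abs(np.convolve(row, kernel, mode="same")) for row in image])

      image = np.random.default_rng(0).random((32, 32))     # stand-in for a cropped head image
      features = row_gabor_features(image, gabor_1d())
      print(features.shape)                                 # (32, 32) magnitude responses
      ```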

    • Motion Synthesis for Virtual Human Using Functional Data Analysis

      2009, 20(6):1664-1672.

      Abstract: In many virtual reality applications the virtual human, as the digital representation of a person, is one of the most important elements for improving interactivity and immersion. However, synthesizing natural and controllable motions for virtual humans remains a challenge. This paper presents a novel method for motion synthesis based on functional data analysis. A low-dimensional functional space is constructed from a set of example motions using functional principal component analysis. This functional space not only reveals the intrinsic dimensionality of the examples, but also supports purposeful synthesis of natural, smooth motions by controlling the coefficients of each functional basis. The synthesis process involves no time-consuming computation and is efficient enough for real-time applications. Experiments demonstrate the robustness and effectiveness of the method.
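
      A minimal sketch of the core idea: run principal component analysis over example motion curves and synthesise new curves by varying the coefficients of the leading components. The example "motions" are synthetic sine curves rather than captured data, and the basis-function smoothing of full functional PCA is omitted.

      ```python
      # PCA over example motion curves, then synthesis by choosing component coefficients.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 100)
      examples = np.stack([rng.uniform(0.8, 1.2) * np.sin(2 * np.pi * (t + rng.normal(0, 0.05)))
                           for _ in range(20)])              # phase-shifted, rescaled example curves

      mean = examples.mean(axis=0)
      _, s, vt = np.linalg.svd(examples - mean, full_matrices=False)
      components = vt[:2]                                    # keep the two leading components

      def synthesise(coeffs):
          """New motion curve = mean curve + weighted sum of the principal components."""
          return mean + np.asarray(coeffs) @ components

      spread = s[:2] / np.sqrt(len(examples))                # rough per-component spread of the examples
      new_motion = synthesise([1.5 * spread[0], -0.5 * spread[1]])
      print(new_motion.shape)                                # (100,) frames of the synthesised curve
      ```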

    • Constructing Surface on Irregular Meshes

      2009, 20(6):1673-1684.

      Abstract: A method is presented for constructing surfaces on irregular meshes. The basic idea is to extend the B-spline method to irregular meshes through the decomposition and classification of the uniform bi-cubic B-spline basis functions. Given a quad mesh of control points, a basis function is constructed for each control point, and the surface is defined as the weighted combination of all control points using their associated basis functions. The result is a piecewise bi-cubic rational parametric polynomial surface. It extends uniform B-spline surfaces in the sense that its definition is analogous to that of a B-spline surface, and it reproduces a uniform bi-cubic B-spline surface when the control mesh is a regular quad mesh. Examples show that the new method constructs surfaces on irregular meshes effectively.
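
      For the regular case that the construction reproduces, the sketch below evaluates one 4 x 4 patch of a uniform bi-cubic B-spline surface from its control points; the control grid is invented, and the paper's handling of irregular (extraordinary) mesh vertices is not shown.

      ```python
      # Evaluate a point on one regular patch of a uniform bi-cubic B-spline surface.
      import numpy as np

      M = np.array([[-1,  3, -3, 1],
                    [ 3, -6,  3, 0],
                    [-3,  0,  3, 0],
                    [ 1,  4,  1, 0]]) / 6.0                  # uniform cubic B-spline basis matrix

      def patch_point(G, u, v):
          """G: 4 x 4 x 3 array of control points; (u, v) in [0, 1]^2."""
          U = np.array([u**3, u**2, u, 1.0])
          V = np.array([v**3, v**2, v, 1.0])
          return np.einsum("i,ij,jkc,lk,l->c", U, M, G, M, V)

      grid = np.dstack(np.meshgrid(np.arange(4.0), np.arange(4.0), indexing="ij"))
      G = np.concatenate([grid, np.random.default_rng(2).random((4, 4, 1))], axis=2)
      print(patch_point(G, 0.5, 0.5))                        # a point on the surface patch
      ```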

    • Improved Coherent Hierarchical Culling Algorithm Based on Probability Computing Model

      2009, 20(6):1685-1693.

      Abstract: Efficient visibility culling is an important research topic in the real-time rendering of complex dynamic scenes. This paper studies the visibility culling problem and improves the coherent hierarchical culling (CHC) algorithm. To handle the redundant and unnecessary occlusion queries in CHC, a probability computing model is presented. The model calculates the expected occlusion query time and the expected rendering time, and compares them to improve the query strategy of the CHC algorithm. Experimental results show that with this model, high culling efficiency and real-time rendering can be achieved in dynamic scenes with highly complex depth and large numbers of objects.
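
      A hedged sketch of the kind of expected-cost comparison such a probability model makes: an occlusion query is only worthwhile when the rendering time it is expected to save exceeds the query's own cost. The probabilities and timings are invented, and the paper's model is more detailed.

      ```python
      # Issue an occlusion query only when its expected saving exceeds its cost.
      def issue_query(p_occluded, query_time, render_time):
          """Expected saving = p_occluded * render_time; the query pays off when this beats query_time."""
          return p_occluded * render_time > query_time

      print(issue_query(p_occluded=0.8, query_time=0.10, render_time=0.50))   # True: query is worthwhile
      print(issue_query(p_occluded=0.1, query_time=0.10, render_time=0.50))   # False: just render the object
      ```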

    • Modified Algorithm with Bidirectional Diffusion for TV Image Interpolation

      2009, 20(6):1694-1702.

      Abstract: In this paper, a method of image interpolation based on bidirectional diffusion is proposed. With this method, the edge width of interpolated images is effectively reduced and crisp yet smooth edges are obtained. In this bidirectional diffusion, forward diffusion acts on the brighter side of an edge ramp while backward diffusion acts on the darker side, and the diffusion strength is adapted to local image features, which avoids artifacts and false textures in the interpolated image. Numerical experiments on real images show that images interpolated with the proposed method have smaller edge widths and are almost artifact-free.
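
      Not the paper's scheme, but a related classic for comparison: a one-dimensional shock-filter step, which likewise sharpens a blurred edge ramp by pushing intensities toward the plateaus on either side; the minmod-limited discretisation and the synthetic tanh edge are assumptions of this sketch.

      ```python
      # One-dimensional shock filter: u_t = -sign(u_xx) * |u_x| with a minmod-limited gradient.
      import numpy as np

      def shock_step(u, dt=0.25):
          fwd = np.roll(u, -1) - u                           # forward difference
          bwd = u - np.roll(u, 1)                            # backward difference
          minmod = np.where(fwd * bwd > 0, np.sign(fwd) * np.minimum(np.abs(fwd), np.abs(bwd)), 0.0)
          lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)       # discrete second derivative
          return u - dt * np.sign(lap) * np.abs(minmod)

      edge = np.tanh(np.linspace(-3, 3, 21))                 # a blurred step edge (a smooth ramp)
      u = edge.copy()
      for _ in range(40):
          u = shock_step(u)
      # The maximum local jump grows as the ramp sharpens toward a step.
      print(round(float(np.abs(np.diff(edge)).max()), 3), "->", round(float(np.abs(np.diff(u)).max()), 3))
      ```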

Contact Information
  • Journal of Software
  • Sponsor: Institute of Software, CAS, China
  • Postal code: 100190
  • Phone: 010-62562563
  • Email: jos@iscas.ac.cn
  • Website: https://www.jos.org.cn
  • ISSN 1000-9825
  • CN 11-2560/TP
  • Domestic price: RMB 70