ZOU Yang , Lü Jian , CAO Chun , HU Hao , SONG Wei , YANG Qi-Liang
2012, 23(7):1635-1655. DOI: 10.3724/SP.J.1001.2012.04085
Abstract:Context-sensitive graph grammars are formal tools for specifying visual languages. To describe and parse visual languages intuitively, current research has stressed the formalisms and algorithms of graph grammars but has neglected to compare their expressiveness. Based on an analysis and induction of the key characteristics of context-sensitive graph grammars, this paper uncovers and proves the relationships among their expressiveness by constructing formalism-transforming algorithms. Moreover, the proposed algorithms correlate these formalisms, facilitating the use of context-sensitive graph grammars: alternative formalisms, rather than merely one, can be chosen to separately specify and parse visual objects in applications.
ZHOU Cong-Hua , LIU Zhi-Feng , WANG Chang-Da
2012, 23(7):1656-1668. DOI: 10.3724/SP.J.1001.2012.04089
Abstract:To overcome the state explosion problem in model checking the probabilistic computation tree logic (PCTL), a bounded model checking technique is proposed. First, the bounded semantics of PCTL is presented and its correctness proved. Second, a simple example shows that the traditional termination criterion, based on path length, fails; based on the termination criterion used in Newton iteration in numerical computing, a new criterion is given. Third, the bounded model checking procedure for PCTL is transformed into solving linear equations. Finally, three benchmarks are used to demonstrate the advantages of the bounded model checking technique.
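The reduction of bounded probabilistic model checking to linear equations, with a difference-based rather than path-length-based stopping rule, can be illustrated on a toy discrete-time Markov chain. This is a sketch under assumptions: the chain, state names, and tolerance below are invented for illustration and are not the paper's construction.

```python
def bounded_reach(P, target, eps=1e-9, max_steps=10000):
    """Approximate the probability of eventually reaching `target` from
    each state of a discrete-time Markov chain by iterating the linear
    equations x = P*x, stopping when successive iterates differ by less
    than `eps` (a Newton-style criterion) instead of at a fixed depth.
    P: {state: {successor: probability}}."""
    states = list(P)
    x = {s: (1.0 if s in target else 0.0) for s in states}
    for _ in range(max_steps):
        nxt = {s: (1.0 if s in target
                   else sum(p * x[t] for t, p in P[s].items()))
               for s in states}
        # difference-based termination criterion
        if max(abs(nxt[s] - x[s]) for s in states) < eps:
            return nxt
        x = nxt
    return x
```

The iteration stops when successive solution vectors differ by less than `eps`, mirroring the termination criterion of Newton iteration in numerical computing rather than a fixed bound on path length.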
CHEN Wei , WEI Jun , HUANG Tao
2012, 23(7):1669-1687. DOI: 10.3724/SP.J.1001.2012.04105
Abstract:Deployment, as a post-production activity, is an important phase of the software lifecycle, in which software execution is supported through configuration, installation, activation, and other activities. To systematically understand the state of the art and technical progress of software deployment, this paper builds a multi-dimensional, fine-grained framework, W4H, to characterize the relevant technologies and software systems. The framework consists of 5 aspects and 12 dimensions, covering the subject, object, scope, style, and process of software deployment. Based on the W4H analytical framework, representative software deployment methods and techniques are analyzed and summarized. The study shows that the framework can provide comprehensive analysis and significant guidance for the selection and development of software deployment methods and techniques.
2012, 23(7):1688-1701. DOI: 10.3724/SP.J.1001.2012.04126
Abstract:Random testing is a widely practiced black-box testing technique. Recently, adaptive random testing has been proposed to improve random testing, and simulation results show that the improvement depends on the characteristics of the failure-causing regions of the program under test. This paper presents the concept of test constraints and employs them to specify the distribution of failure-causing regions within the input domain of the program under test. Characteristic analysis of failure-causing regions can be conducted on the basis of their test constraints, which are derived using available program analysis techniques. To evaluate the proposed technique, a case study on a real-life application was conducted; the results show that the proposed test constraints provide insight into how a failure is triggered and propagated, and that the constraint-based analysis helps improve the quality of test case design and assess the applicability of adaptive random testing.
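Adaptive random testing itself, which the constraint-based analysis above evaluates, is commonly implemented with a fixed-size candidate set: each new test is the random candidate farthest from all executed tests. The sketch below assumes a numeric unit-square input domain purely for illustration; it is not the paper's technique.

```python
import random

def art_next(executed, rng, n_candidates=10, dim=2):
    """Fixed-size-candidate-set ART: among n_candidates random inputs,
    pick the one whose nearest executed test case is farthest away
    (Euclidean distance), spreading tests evenly over the domain."""
    if not executed:
        return tuple(rng.random() for _ in range(dim))
    best, best_d = None, -1.0
    for _ in range(n_candidates):
        c = tuple(rng.random() for _ in range(dim))
        # distance to the closest previously executed test
        d = min(sum((a - b) ** 2 for a, b in zip(c, e)) ** 0.5
                for e in executed)
        if d > best_d:
            best, best_d = c, d
    return best
```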
DENG Tie-Qing , REN Gen-Quan , LIU Ying-Bo
2012, 23(7):1702-1716. DOI: 10.3724/SP.J.1001.2012.04222
Abstract:In workflow management systems (WFMSs), appropriately applying scheduling techniques to manage actors' personal worklists is essential for the successful implementation of workflow technology. Existing workflow scheduling has focused mainly on the process perspective; as a result, issues associated with the personal-worklist perspective, i.e., worklists containing actors' to-do activity instances, have been largely neglected. Motivated by this, this paper investigates, for the first time, personal worklist scheduling in dynamic workflow environments. To address these issues, the paper proposes a novel genetic algorithm to optimize personal worklist management. The algorithm recommends to each actor a feasible worklist that ensures the successful execution of the worklist's activity instances while minimizing the total overtime cost over all personal worklists. By comparing with other well-known workflow scheduling algorithms on a specific example and in a simulation experiment, the paper evaluates the effectiveness of the proposed genetic algorithm.
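As a rough illustration of how such a genetic algorithm can work, the sketch below evolves a permutation of activity instances to minimize total tardiness, a stand-in for the paper's overtime-cost objective; the encoding, crossover, and parameters are illustrative assumptions, not the paper's operators.

```python
import random

def tardiness(order, dur, due):
    """Total tardiness of executing activities in `order` back-to-back."""
    t, total = 0, 0
    for j in order:
        t += dur[j]
        total += max(0, t - due[j])
    return total

def ga_schedule(dur, due, pop=30, gens=200, seed=0):
    """Toy permutation GA: elitism, one-cut order crossover, swap mutation."""
    rng = random.Random(seed)
    n = len(dur)
    def rand_perm():
        p = list(range(n)); rng.shuffle(p); return p
    popu = [rand_perm() for _ in range(pop)]
    for _ in range(gens):
        popu.sort(key=lambda o: tardiness(o, dur, due))
        nxt = popu[:2]                       # keep the two best (elitism)
        while len(nxt) < pop:
            a, b = rng.sample(popu[:10], 2)  # select among the top ten
            cut = rng.randrange(1, n)
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            i, j = rng.randrange(n), rng.randrange(n)
            child[i], child[j] = child[j], child[i]  # swap mutation
            nxt.append(child)
        popu = nxt
    return min(popu, key=lambda o: tardiness(o, dur, due))
```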
WEI Shuai , ZHAO Rong-Cai , YAO Yuan
2012, 23(7):1717-1728. DOI: 10.3724/SP.J.1001.2012.04106
Abstract:Nowadays, more and more processors integrate SIMD (single instruction multiple data) extensions, and most compilers support automatic vectorization, but vectorization usually targets the innermost loop; there has been no easy vectorization approach that deals with loop nests. This paper presents an automatic vectorization approach that vectorizes nested loops from outer to inner. It first analyzes, through dependency analysis, whether a loop can undergo direct unroll-and-jam. Next, it collects the values about the loop that influence vectorization performance, including whether it can do direct unroll-and-jam, the number of array references that are contiguous in the loop index, and the loop region. Moreover, the study presents an aggressive algorithm that decides which loops need unroll-and-jam and finally generates SIMD code using the SLP (superword level parallelism) algorithm. Test results on an Intel platform show that the average speedup of numerical/video/communication kernels achieved by this approach is 2.13/1.41, better than innermost-loop vectorization and simple outer-loop vectorization; the speedup of some common kernels reaches 5.3.
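Unroll-and-jam itself is easy to demonstrate on a small nest. In the sketch below (illustrative only, not the paper's algorithm), the outer loop of a matrix-vector product is unrolled by 2 and the copies are jammed into one inner loop, exposing the independent per-iteration operations that an SLP pass could pack into SIMD instructions.

```python
def matvec_plain(A, x):
    """Reference nested loop: y = A * x."""
    n, m = len(A), len(x)
    y = [0.0] * n
    for i in range(n):
        for j in range(m):
            y[i] += A[i][j] * x[j]
    return y

def matvec_unroll_jam(A, x):
    """Same computation after unroll-and-jam of the outer loop by 2."""
    n, m = len(A), len(x)
    y = [0.0] * n
    i = 0
    while i + 1 < n:          # outer loop unrolled by 2...
        for j in range(m):    # ...and the copies jammed into one inner loop
            y[i] += A[i][j] * x[j]
            y[i + 1] += A[i + 1][j] * x[j]   # independent: SIMD-packable
        i += 2
    if i < n:                 # remainder iteration when n is odd
        for j in range(m):
            y[i] += A[i][j] * x[j]
    return y
```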
HU Wei , BAI Wen-Yang , QU Yu-Zhong
2012, 23(7):1729-1744. DOI: 10.3724/SP.J.1001.2012.04215
Abstract:Semantic Web data proliferates with the rapid growth of the semantic Web. An object on the semantic Web is likely to be denoted with many identifiers (e.g., URIs) by different parties. Resolving object coreference on the semantic Web means identifying the different identifiers for the same object and eliminating the inconsistency among their associated RDF (resource description framework) data, which is important for semantic Web data fusion, search, browsing, etc. In this paper, the problem of resolving object coreference on the semantic Web is first formalized. Next, existing works are surveyed and categorized along five aspects: the characteristics used for coreference identification, the mechanisms for data conflict resolution, the applicable scopes of current approaches, prototypes, and benchmarks. Finally, open research issues are discussed and possible future research directions are pointed out.
ZHANG Qing-Hua , WANG Guo-Yin , XIAO Yu
2012, 23(7):1745-1759. DOI: 10.3724/SP.J.1001.2012.04226
Abstract:Rough set theory, proposed by Pawlak in 1982, is an important tool for processing the uncertainty of a set's boundary; it describes the uncertainty of a set X (or concept) with two crisp sets, the upper-approximation set and the lower-approximation set of X. However, rough set theory does not give a method for precisely or approximately describing the uncertain set X (or concept) with an existing knowledge base. In this paper, the similarity between two sets is first proposed, and the disadvantages of using the upper-approximation or lower-approximation set of X as an approximation of the uncertain set X (or concept) are analyzed. A method for building an approximation set of the uncertain set X is then presented, and the set R0.5(X) is proved to be the optimal approximation set. Finally, how the similarity between R0.5(X) and X changes with knowledge granularity in a knowledge space is discussed in detail. From this new viewpoint, the paper presents a new method for building an approximation set of an uncertain set X, which will promote the development of the rough set model.
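On a toy knowledge base, the lower approximation, upper approximation, and the majority-based set R0.5(X) can be computed directly. This is a sketch under assumptions: the similarity measure is taken here as |A∩B|/|A∪B|, and whether the 0.5 threshold is strict is an assumption of this sketch, not a fact from the paper.

```python
def approximations(classes, X):
    """Given equivalence classes (a partition of the universe) and a set X,
    return (lower, upper, R0.5): classes fully inside X, classes touching X,
    and classes whose majority of elements lies in X."""
    lower, upper, half = set(), set(), set()
    for c in classes:
        inter = c & X
        if inter == c:
            lower |= c
        if inter:
            upper |= c
        if len(inter) / len(c) > 0.5:   # assumed strict majority rule
            half |= c
    return lower, upper, half

def similarity(A, B):
    """Set similarity |A∩B| / |A∪B| (1.0 for two empty sets)."""
    return len(A & B) / len(A | B) if A | B else 1.0
```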
WANG Rong-Fang , JIAO Li-Cheng , LIU Fang , YANG Shu-Yuan
2012, 23(7):1760-1772. DOI: 10.3724/SP.J.1001.2012.04151
Abstract:A new self-adaptive dynamic control strategy for population size is proposed. The strategy can easily be combined with various natural computation methods because its implementation is independent of the details of the evolutionary operations. The framework of the strategy is first given. Based on the framework, the study proposes a method that varies the amount of increase or decrease on the basis of the logistic model. The study also designs an increase operator that adaptively balances effectiveness and diversity, as well as a diversity-preserving decrease operator. The strategy is applied to two different natural computation methods. Experimental evaluation is conducted on both a set of standard test functions and the CEC05 benchmark set. The results show that the algorithms with the proposed strategy outperform the original algorithms in both precision and convergence rate.
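One plausible reading of the logistic-model control (an assumption on our part, not the paper's formula) is to derive the step size of the population change from logistic growth r·N·(1−N/K), with the direction chosen by the state of the search:

```python
def next_pop_size(n, k_max, r=0.3, stagnating=True):
    """Hypothetical logistic-model population-size update: the step
    follows r*n*(1 - n/k_max), so changes are large mid-range and
    vanish near the capacity k_max; the search grows the population
    when it stagnates (for diversity) and shrinks it otherwise."""
    step = max(1, round(r * n * (1 - n / k_max)))
    n2 = n + step if stagnating else n - step
    return max(2, min(k_max, n2))   # clamp to a sane range
```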
SHANG Rong-Hua , JIAO Li-Cheng , HU Chao-Xu , MA Jing-Jing
2012, 23(7):1773-1786. DOI: 10.3724/SP.J.1001.2012.04108
Abstract:This paper proposes a modified immune clonal algorithm for constrained multi-objective optimization problems. By introducing a new constraint-handling strategy that modifies the objective values of individuals, the proposed algorithm optimizes individuals with the modified objective values and stores the non-dominated feasible individuals in an elitist population. In the optimization process, the algorithm not only preserves the non-dominated feasible individuals but also utilizes the infeasible solutions with smaller constraint violation values. Meanwhile, the new algorithm introduces an overall cloning strategy to improve the distribution diversity of the solutions. The proposed algorithm has been tested on several popular constrained test problems and compared with two other constrained multi-objective optimization algorithms. The results show that the solutions obtained by the proposed algorithm are more diverse than those of the other two algorithms and better in terms of convergence and uniformity.
WANG Yan-Jie , LIU Xia-Bi , JIA Yun-De
2012, 23(7):1787-1795. DOI: 10.3724/SP.J.1001.2012.04082
Abstract:This paper proposes a visual word soft histogram for image representation based on statistical modeling and discriminative learning of visual words. The method uses Gaussian mixture models (GMMs) to capture the appearance variation of each visual word and employs the max-min posterior pseudo-probabilities discriminative learning method to estimate the GMMs. The similarities between each visual word and the corresponding local features are computed, summed, and normalized to construct a soft histogram. The paper also discusses two implementations. The first, called the classification-based soft histogram, assigns each local feature only to the visual word with maximum similarity. The second, called the completely soft histogram, assigns each local feature to all visual words. Experimental results on Caltech-4 and PASCAL VOC 2006 confirm the effectiveness of the method.
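A minimal sketch of the two histogram variants, with one-dimensional features and a single Gaussian per visual word standing in for the full GMMs (an illustrative simplification, not the paper's model):

```python
import math

def gauss(x, mu, sigma):
    """Unnormalized Gaussian similarity of feature x to word center mu."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def soft_histogram(features, words, sigma=1.0, hard=False):
    """hard=True: classification-based variant (each feature credits
    only its most similar word); hard=False: completely soft variant
    (each feature credits every word). The result is normalized."""
    h = [0.0] * len(words)
    for f in features:
        sims = [gauss(f, w, sigma) for w in words]
        if hard:  # classification-based: winner takes all
            i = max(range(len(words)), key=lambda k: sims[k])
            h[i] += sims[i]
        else:     # completely soft: every word gets its share
            for i, s in enumerate(sims):
                h[i] += s
    total = sum(h)
    return [v / total for v in h] if total else h
```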
WANG Chong-Jun , WU Jun , ZHANG Lei , XIE Jun-Yuan
2012, 23(7):1796-1804. DOI: 10.3724/SP.J.1001.2012.04135
Abstract:A Coalitional Normative System (CNS) extends a Normative System (NS) by enabling selective constraining of a coalition's joint behavior. The study extends the semantics of ATL and proposes Coordinated ATL (Co-ATL) to support the formalization of CNSs. The paper classifies all CNSs that control the same coalition into the same class and characterizes the power limitation of each such class by identifying two fragments of the Co-ATL language corresponding to two types of system properties that are unchangeable. The relation between NS and CNS is discussed, and the result turns out to better characterize the power limitation of NSs. Moreover, the study further extends the CNS with a finite state machine encoding the execution history, and shows that the power limitation characterization for CNSs is invariant under this extension.
TAO Xin-Min , LIU Fu-Rong , LIU Yu , TONG Zhi-Jing
2012, 23(7):1805-1815. DOI: 10.3724/SP.J.1001.2012.04128
Abstract:To deal with the premature convergence and low precision of the traditional particle swarm optimization algorithm, a particle swarm optimization (PSO) algorithm based on multi-scale cooperative mutation is proposed, which is guaranteed to converge to the global optimal solution with probability one. Special multi-scale Gaussian mutation operators are introduced to make the particles explore the search space more efficiently. The large-scale mutation operators quickly locate the globally optimal region during early evolution. The small-scale mutation operators, whose scales are gradually reduced according to the change of the fitness value, improve the accuracy of the solution in late evolution. The proposed method is applied to six typical complex function optimization problems, and its performance is compared experimentally with other PSO algorithms. The results show that the proposed method effectively speeds up convergence and improves stability.
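The mutation component can be sketched as follows; the scale set, the scale-choice rule, and the linear shrink schedule are illustrative assumptions, not the paper's exact operators.

```python
import random

def multiscale_mutate(pos, gen, max_gen, scales=(1.0, 0.1, 0.01), rng=random):
    """Mutate a particle position with Gaussian noise drawn at one of
    several scales; all scales shrink linearly as the run progresses
    (assumed schedule), so early mutations explore broadly and late
    mutations refine the solution."""
    shrink = 1.0 - gen / max_gen          # 1.0 early, 0.0 at the end
    s = rng.choice(scales) * shrink       # pick a mutation scale
    return [x + rng.gauss(0.0, s) for x in pos]
```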
LI Hong-Bo , LI Zhan-Shan , WANG Tao
2012, 23(7):1816-1823. DOI: 10.3724/SP.J.1001.2012.04129
Abstract:Constraint satisfaction problems (CSPs) have been widely investigated in artificial intelligence. This paper investigates coarse-grained maintaining arc consistency, which is used to solve CSPs. The study finds that, during the search for a solution, there are ineffective revisions that revise the arcs pointing to already assigned variables. It proves that such revisions are redundant and proposes a method to avoid them. The paper gives an improved basic framework for coarse-grained arc consistency algorithms, named AC3_frame_ARR, which can be applied to improve all coarse-grained AC algorithms. Experimental results show that the improved algorithms save up to 80% of revisions and 40% of CPU time.
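The idea of avoiding redundant revisions can be sketched on top of a plain AC-3 loop: arcs whose revised variable is already assigned are simply never enqueued. This is an assumed reconstruction of the AC3_frame_ARR idea, not the paper's code.

```python
from collections import deque

def ac3(domains, constraints, assigned=frozenset()):
    """domains: {var: set of values}; constraints: {(x, y): predicate}.
    Arcs whose revised (first) variable is in `assigned` are skipped,
    since revising an assigned variable's domain is redundant."""
    arcs = set(constraints) | {(y, x) for (x, y) in constraints}
    def supported(x, y, vx):
        # does vx have a support vy in domains[y] under the constraint?
        if (x, y) in constraints:
            ok = constraints[(x, y)]
            return any(ok(vx, vy) for vy in domains[y])
        ok = constraints[(y, x)]
        return any(ok(vy, vx) for vy in domains[y])
    queue = deque(a for a in arcs if a[0] not in assigned)  # skip redundant arcs
    while queue:
        x, y = queue.popleft()
        removed = {vx for vx in domains[x] if not supported(x, y, vx)}
        if removed:
            domains[x] -= removed
            if not domains[x]:
                return False  # domain wipe-out: inconsistent
            queue.extend((z, x) for (z, w) in arcs
                         if w == x and z != y and z not in assigned)
    return True
```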
KAN Bao-Qiang , FAN Jian-Hua , WANG Jian-Ye
2012, 23(7):1824-1837. DOI: 10.3724/SP.J.1001.2012.04225
Abstract:The cognitive radio network (CRN) has attracted increasing attention as it enables devices to dynamically access unused spectrum without interfering with primary users (PUs). The medium access control (MAC) protocol of CRN has been intensively researched because of its key role. This paper reviews the requirements on MAC protocols for CRN and the state-of-the-art spectrum sensing technology, and gives a detailed classification of the MAC protocols. In this overview, the characteristic features, advantages, and limiting factors of existing CRN MAC protocols are thoroughly investigated for both underlay-based and overlay-based networks. Several challenges and future research directions are also presented.
ZHANG Yu-Jun , ZHANG Han-Wen , XU Zhi-Jun
2012, 23(7):1838-1848. DOI: 10.3724/SP.J.1001.2012.04093
Abstract:This paper proposes an active-routing-based intra-domain mobility management (ARMM) scheme for wireless mesh networks. ARMM combines the location update procedure of mobile stations with the active routing procedure of mesh routers, making the mesh routers perform location management on behalf of the mobile stations attached to them; thus, the mesh routers can locate those mobile stations. The paper presents a two-layer routing model, a routing algorithm, and a location update mechanism. Performance analysis shows that, compared with existing solutions, ARMM achieves shorter handover delay and smaller location update and delivery costs.
ZHENG Zhong , WANG Yi-Jie , MA Xing-Kong , YANG Yong-Tao
2012, 23(7):1849-1868. DOI: 10.3724/SP.J.1001.2012.04097
Abstract:In many P2P applications, nodes can be classified into different types according to their interests or resources, and routing to nodes of a specified type over overlay networks is the key to data distribution and query in these applications. Unstructured overlays have low maintenance cost and high robustness but fail to ensure routing efficiency. This paper proposes a gossip-based type sampling approach, TypeSampler, which samples the nodes of different types with the same probability (called type sampling). Type sampling ensures a lower bound on the probability of finding a neighbor of a specified type at any node, and thus ensures routing efficiency over the unstructured overlay. To achieve type sampling, TypeSampler first estimates the proportion of each type through peer sampling and gossip-based anti-entropy aggregation. Next, TypeSampler periodically maintains a type sampling table at each node based on the estimated proportion values. Theoretical analysis and experimental results reveal that TypeSampler not only achieves precise proportion estimation and approximately uniform random type sampling, but also works well even in dynamic network environments. Moreover, TypeSampler supports more efficient routing and has better scalability than existing approaches.
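The anti-entropy aggregation step that TypeSampler's proportion estimation relies on can be sketched with push-pull averaging: each node holds 1 if it is of the type of interest and 0 otherwise, random pairs repeatedly average their values, and every value converges to the global type proportion. The centralized pairing below is a simplification of real gossip exchanges, not TypeSampler itself.

```python
import random

def gossip_proportion(is_type, rounds=2000, seed=0):
    """Push-pull averaging: each exchange replaces two nodes' values by
    their mean, conserving the total mass; all values converge to the
    global proportion of nodes whose is_type flag is True."""
    rng = random.Random(seed)
    vals = [1.0 if t else 0.0 for t in is_type]
    n = len(vals)
    for _ in range(rounds):          # one exchange per "round" here
        i, j = rng.randrange(n), rng.randrange(n)
        avg = (vals[i] + vals[j]) / 2
        vals[i] = vals[j] = avg
    return vals
```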
HUANG Wei , ZHAO Xian-Feng , FENG Deng-Guo , SHENG Ren-Nong
2012, 23(7):1869-1879. DOI: 10.3724/SP.J.1001.2012.04107
Abstract:To solve problems in existing JPEG steganalysis schemes, such as high redundancy among features and failure to exploit the complementarity among them, this study proposes a JPEG steganalysis approach based on feature fusion by principal component analysis (PCA) and on analysis of the complementarity among features. The study fuses complementary features to comprehensively reflect the statistical differences between cover and stego signals, isolates redundant components by PCA, and thereby improves accuracy. Experimental results show that, across various datasets and embedding rates, this scheme achieves higher accuracy than mainstream JPEG steganalysis schemes against steganographic methods of high concealment (e.g., F5, MME, and PQ), and greatly reduces the time cost of existing feature-level fusion methods.
WU Di , FENG Deng-Guo , LIAN Yi-Feng , CHEN Kai
2012, 23(7):1880-1898. DOI: 10.3724/SP.J.1001.2012.04112
Abstract:The efficiency evaluation of an information system's security measures is important for improving information system security. Conventional evaluation methods do not consider the interaction and mutual influence of business dataflow, attack flow, and security measures when evaluating a system's security measures; thus, they cannot ensure the effectiveness of the evaluation process and results. This paper presents an approach for evaluating the efficiency of an information system's security measures under a given vulnerability set. It employs colored Petri net tools to uniformly model and simulate the interaction among the system's workflow, attack flow, and security measures. Based on this modeling method, the paper proposes an algorithm for generating the inter-node vulnerability exploiting graph and improves the Dijkstra algorithm to identify the shortest attack paths that can damage the information system's security attributes. Next, it constructs a hierarchical model to evaluate the effectiveness of the security measures and employs a grey multiple-attribute decision-making algorithm to choose the alternatives that best improve effectiveness. This approach alleviates the dependency on evaluators' subjectivity in evaluating an information system's security measures and helps ensure the consistency and traceability of the evaluation results. Finally, a practical Web business system is taken as a case study to validate the correctness and effectiveness of the evaluation model.
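The shortest-attack-path step rests on Dijkstra's algorithm. A plain version over a small hypothetical attack graph (node names and edge costs invented here) looks as follows; the paper's improvements and its damage-oriented weighting are not reproduced.

```python
import heapq

def shortest_attack_path(graph, src, dst):
    """Dijkstra over graph = {node: [(neighbor, cost), ...]}; the cost
    could encode exploit difficulty. Returns (total cost, node path),
    or (inf, []) if dst is unreachable from src."""
    pq = [(0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float('inf'), []
```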
YANG Xiao , FAN Xiu-Bin , WU Chuan-Kun , YU Yu-Yin , FENG Xiu-Tao
2012, 23(7):1899-1907. DOI: 10.3724/SP.J.1001.2012.04123
Abstract:BOMM is a byte-oriented mixed-type algorithm with memory, used to disorder a given byte sequence. Owing to its many good cryptographic properties, it has been used as a main component in a new stream cipher called Loiss. This paper builds an algebraic equation system of degree 5 for BOMM and, based on this system, discusses the complexity of an algebraic attack on Loiss. In addition, the paper discusses the statistical weakness of BOMM and analyzes the security of Loiss under a specific class of weak keys.
LIU Zhi-Hui , SUN Bin , GU Li-Ze , YANG Yi-Xian
2012, 23(7):1908-1923. DOI: 10.3724/SP.J.1001.2012.04125
Abstract:A new origin authentication scheme, based on a threaded balanced binary stored hash tree for authenticated delegation/assignment dictionaries, is proposed to solve the problem of BGP (border gateway protocol) address prefix hijacking. A BGP address prefix announcement is made up of an AS number and an IP address prefix; this paper uses the number value range to uniformly define the two kinds of BGP address prefix announcement resources, so the origin-trustworthiness problems of both kinds are addressed by one efficient origin authentication scheme. The scheme inherits the merit of the threaded binary stored hash tree to correct a shortcoming of William Aiello and Patrick McDaniel's origin authentication scheme, in which the amount of evidence for an invalid delegation/assignment is double that for a valid one. Meanwhile, in contrast with the original OA scheme, this scheme reduces the number of tree nodes to half the size of the delegation/assignment attestation set, and is therefore more efficient.
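The basic hash-tree construction underlying such schemes can be sketched as follows; the threading and balancing that distinguish the paper's tree are not reproduced, and the duplicate-last-node rule for odd levels is an assumption of this sketch.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def hash_tree_root(records):
    """Build a binary hash tree over a list of string records: leaves
    hash the records, inner nodes hash the concatenation of their two
    children, and the root authenticates the whole set."""
    level = [h(r.encode()) for r in records]
    if not level:
        return h(b'')
    while len(level) > 1:
        if len(level) % 2:            # odd level: duplicate the last node
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

Any change to any record changes the root, which is what lets a signed root serve as compact evidence for the delegation/assignment set.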
2012, 23(7):1924-1934. DOI: 10.3724/SP.J.1001.2012.04130
Abstract:With the aid of multipath transport protocols, a multi-homed host can transfer data through multiple paths in parallel to improve goodput and robustness. However, the receiver has to deal with a large quantity of out-of-order packets due to the discrepancy of paths in terms of bandwidth, delay, and packet loss. Theoretical analysis suggests two approaches to reducing the memory overhead of caching out-of-order packets: one is to minimize the quantity of packets backlogged in the senders' outgoing queues, and the other is to decrease the packet sending rate. Following the former, the study proposes a packet scheduling algorithm named SOD (Scheduling On Demand), which assigns packets to each path according to its free window size. Following the latter, a simple flow control method is proposed, which leverages the window feedback advertisement mechanism to limit the packet sending rate. Experimental results show that, compared with existing algorithms, SOD incurs the lowest memory overhead and obtains the highest goodput when receivers enable flow control. Additionally, SOD works steadily across diverse simulation scenarios.
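One plausible reading of SOD's free-window rule (an assumption of this sketch, not the paper's specification) is that each packet goes to the path with the most remaining congestion-window space, so no path builds up a long backlog in its send queue:

```python
def schedule_on_demand(n_packets, cwnd, in_flight):
    """cwnd and in_flight are per-path lists; returns how many of the
    n_packets are assigned to each path. Each packet is placed on the
    path with the largest free window; sending stops when every
    window is full."""
    assigned = [0] * len(cwnd)
    for _ in range(n_packets):
        free = [cwnd[i] - in_flight[i] - assigned[i]
                for i in range(len(cwnd))]
        i = max(range(len(free)), key=lambda k: free[k])
        if free[i] <= 0:
            break                      # all windows full: stop sending
        assigned[i] += 1
    return assigned
```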