WANG Shou-Xin , ZHANG Li , WANG Shuai , SHEN Ju-Fang , LIU Yu
2011, 22(4):593-608. DOI: 10.3724/SP.J.1001.2011.03736 CSTR:
Abstract:Satisfiability representation and reasoning are important issues in goal-oriented requirements engineering. To address the uncertainty of subjective cognition that arises when abstracting qualitative concepts from a quantitative universal set, this paper proposes a cloud-model-based representation model for goal satisfiability. The proposed model integrates the randomness and fuzziness of the subjective perception of goal satisfiability, offering both qualitative semantic clarity and quantitative accuracy. On the basis of this model, the paper presents a reasoning approach based on the core idea of ordered weighted aggregation (OWA) operators. This approach deduces the satisfiability of a parent goal as a value between the minimal and maximal satisfiability of its sub-goals, which reflects the peculiarity of human thinking and avoids the absolute reasoning results of pure logical “and” and “or”. The main characteristics are analyzed through theorem proving and comparison experiments. Finally, conclusions are drawn and research directions are pointed out.
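As an illustrative aside, the ordered-weighted-aggregation idea this abstract refers to can be sketched as follows. This is a minimal sketch, not the paper's algorithm; the weight vectors and satisfiability values are hypothetical. The key property is that the aggregate always falls between the minimum and maximum of the sub-goal values, softening pure logical "and"/"or".

```python
def owa(values, weights):
    """Ordered weighted average: sort inputs descending, then take
    the weighted sum with position-based (not input-based) weights."""
    assert abs(sum(weights) - 1.0) < 1e-9 and len(values) == len(weights)
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

subgoals = [0.9, 0.6, 0.3]                   # hypothetical sub-goal satisfiability
and_like = owa(subgoals, [0.1, 0.3, 0.6])    # weights lean toward the minimum
or_like  = owa(subgoals, [0.6, 0.3, 0.1])    # weights lean toward the maximum

# The parent-goal value stays strictly inside [min, max] of the sub-goals.
assert min(subgoals) <= and_like <= or_like <= max(subgoals)
```

Choosing weights concentrated on the low end approximates "and"; concentrating them on the high end approximates "or"; anything in between yields a compensatory aggregation.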
DONG Meng-Gao , MAO Xin-Jun , CHANG Zhi-Ming , WANG Ji , QI Zhi-Chang
2011, 22(4):609-624. DOI: 10.3724/SP.J.1001.2011.03762 CSTR:
Abstract:Self-Adaptive systems have many complex properties, such as openness in a situated environment, sensitivity to changes, and system dynamics. How to develop such complex systems remains an open problem in the software engineering literature. In this paper, the autonomous entities in self-adaptive systems are abstracted as software agents, and a dynamic binding mechanism for self-adaptation is proposed based on organization metaphors. A self-adaptive strategy description language called SADL (self-adaptive strategy description language) is presented to describe adaptive strategies that express how agents adapt to changes. A compiler and an operating environment for SADL have been developed. This approach enables developers to separate the adaptation logic from the functional logic of self-adaptive systems and to describe self-adaptation explicitly, in order to simplify the development and maintenance of complex self-adaptive systems. A case study is presented in detail to illustrate the effectiveness of the proposed approach.
DING Bo , WANG Huai-Min , SHI Dian-Xi , TANG Yang-Bin
2011, 22(4):625-639. DOI: 10.3724/SP.J.1001.2011.03765 CSTR:
Abstract:Many challenges in multi-agent coordination can be modeled as Distributed Constraint Optimization Problems (DCOPs). Aiming at DCOPs with low constraint density, this paper proposes a distributed algorithm based on the ideas of greed and backjumping. In this algorithm, each agent makes decisions according to the greedy principle that most assignment combinations in problems with low constraint density incur zero cost, and a backjumping mechanism among the agents ensures the success of the algorithm even when this greedy principle leads to a local optimum. In contrast with existing mainstream DCOP algorithms, this algorithm can solve problems with low constraint density with fewer messages while keeping polynomial message length and space complexity. The correctness of the key mechanisms in this algorithm has been proved, and its performance advantages have been verified by experiments.
BU Lei , LI You , WANG Lin-Zhang , LI Xuan-Dong
2011, 22(4):640-658. DOI: 10.3724/SP.J.1001.2011.03766 CSTR:
Abstract:The model-checking problem for hybrid systems is very difficult to solve. Even for a relatively simple class of hybrid systems, linear hybrid automata, the fundamental problem of reachability is undecidable. Existing techniques for the reachability analysis of linear hybrid automata do not scale well to problem sizes of practical interest. Instead of developing a tool to perform reachability checking of the complete state space of linear hybrid automata, a prototype toolset BACH (bounded reachability checker) is presented to perform path-oriented reachability checking and bounded reachability checking of linear hybrid automata and compositional linear hybrid systems, where the length of the path being checked can be made very large, and the size of the system can be made large enough to handle problems of practical interest. The experimental data show that BACH has good performance and scalability, supporting the claim that BACH can become a powerful assistant for design engineers in the reachability analysis of linear hybrid automata.
LI Wen-Rui , WANG Zhi-Jian , ZHANG Peng-Cheng
2011, 22(4):659-675. DOI: 10.3724/SP.J.1001.2011.03776 CSTR:
Abstract:The UML 2.0 Sequence Diagram has been extensively applied in industry. However, the vague semantics of the UML 2.0 Sequence Diagram prevent it from being applied effectively. The Modal Sequence Diagram is a modal extension of the UML 2.0 Sequence Diagram, which distinguishes mandatory scenarios (described by universal MSDs, denoted as uMSD) from possible scenarios (described by existential MSDs, denoted as eMSD). uMSD is more expressive than eMSD and can represent the temporal properties of concurrent systems; therefore, the main work of this paper concerns uMSD. In order to make uMSD widely usable for formal analysis, verification, and monitoring, the formal semantics of uMSD are defined on the basis of weak alternating Büchi automata, and transformation algorithms for the various operators are given in detail. Next, the expressiveness of uMSD is measured against the well-known property specification patterns. Finally, an example is studied, and future applications are discussed.
WANG Guo-Yin , ZHANG Qing-Hua , MA Xi-Ao , YANG Qing-Shan
2011, 22(4):676-694. DOI: 10.3724/SP.J.1001.2011.03954 CSTR:
Abstract:Knowledge is not only an important cornerstone of human cognitive ability, but also one of the basic issues in intelligence science. With the development of intelligence science and technology, the study of knowledge uncertainty is attracting more and more attention. Knowledge uncertainty includes both the inherent uncertainty of knowledge itself and the effect of external influences on knowledge. From the viewpoint of granular computing (GrC), this paper studies the knowledge uncertainties of the fuzzy set theory model, the rough set theory model, the quotient space theory model, and some other related granular computing models. The state of the art and key issues of knowledge uncertainty are discussed.
SHEN Yuan-Xia , WANG Guo-Yin , ZENG Chuan-Hua
2011, 22(4):695-708. DOI: 10.3724/SP.J.1001.2011.03728 CSTR:
Abstract:In the study of particle swarm optimization, properly using the individual experience and social sharing information of particles has always been a problem. To solve this problem, this paper analyzes the random factors in the velocity update equation from a cognitive viewpoint and establishes the intrinsic cognitive relation between individual experience and social sharing information. First, a correlative particle swarm optimization model is developed, which uses Copula functions to measure the dependence among the random factors. In the new model, different correlation structures and degrees of correlation between the random factors denote different strategies for processing individual experience and social sharing experience. Meanwhile, this paper provides a flowchart of the correlative particle swarm optimization model based on the Gaussian Copula. Second, the relationship between the degree of correlation and population diversity is presented, which shows that random factors with positive linear correlation help maintain population diversity. Finally, the relationship between the degree of correlation and convergence is analyzed, and the convergence conditions of the correlative particle swarm optimization model are provided. Experimental simulations show that the correlation of the random factors has a great influence on the performance of the new model, which can greatly improve convergence velocity and precision when the random factors are completely positively linearly correlated.
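To illustrate the mechanism the abstract describes, the sketch below draws the two random factors of the standard velocity update from a Gaussian copula instead of independently. This is a generic sketch under stated assumptions, not the paper's model: the coefficients `w`, `c1`, `c2` and the correlation `rho` are illustrative defaults.

```python
import math
import random

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def correlated_uniforms(rho, rng=random):
    """Draw (u1, u2), each uniform on [0,1], whose dependence is
    induced by a Gaussian copula with correlation rho."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
    return phi(z1), phi(z2)

def update_velocity(v, x, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rho=0.9):
    """One-dimensional PSO velocity update where the cognitive and
    social random factors r1, r2 are correlated rather than independent."""
    r1, r2 = correlated_uniforms(rho)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
```

With `rho=1.0` the two factors coincide (completely positive linear correlation, the case the abstract highlights); with `rho=0.0` the update degenerates to the classical independent scheme.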
ZHAO Qiang-Li , JIANG Yan-Huang , XU Ming
2011, 22(4):709-721. DOI: 10.3724/SP.J.1001.2011.03752 CSTR:
Abstract:By selecting part of the base classifiers to combine, ensemble pruning aims to achieve better generalization and less prediction time than the ensemble of all base classifiers. However, most ensemble pruning algorithms in the literature consume much time on classifier selection. This paper presents a fast ensemble pruning approach: CPM-EP (coverage-based pattern mining for ensemble pruning). The algorithm converts an ensemble pruning task into a transaction database mining process, where the prediction results of all base classifiers on the validation set are organized as a transaction database. For each possible size k, CPM-EP obtains a refined transaction database and builds an FP-Tree to compact it; CPM-EP then selects an ensemble of size k. Among the obtained ensembles of all different sizes, the one with the best predictive accuracy on the validation set is output. Experimental results show that CPM-EP reduces computational overhead considerably: its selection time is about 1/19 that of GASEN and 1/8 that of Forward Selection. Additionally, the approach achieves the best generalization, and the size of the pruned result is small.
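The conversion step the abstract mentions can be sketched as follows: each validation instance becomes one transaction whose items are the classifiers that predict it correctly, so that mining the database finds classifier subsets with high joint coverage. This is an illustrative sketch of the recasting only, not CPM-EP's refinement or FP-Tree construction; the toy predictions are made up.

```python
def to_transactions(predictions, labels):
    """predictions[i][j] is the prediction of base classifier j on
    validation instance i; a transaction holds the indices of the
    classifiers that got instance i right."""
    return [
        frozenset(j for j, p in enumerate(row) if p == y)
        for row, y in zip(predictions, labels)
    ]

def coverage(transactions, subset):
    """Fraction of instances on which at least one classifier in
    the subset is correct."""
    subset = frozenset(subset)
    return sum(1 for t in transactions if t & subset) / len(transactions)

preds = [[1, 0, 1], [0, 0, 1], [1, 1, 0]]   # 3 instances x 3 classifiers (toy)
labels = [1, 0, 1]
tx = to_transactions(preds, labels)
```

A pruning search can then compare candidate subsets by `coverage` (or accuracy) instead of re-running the classifiers, which is where the speed-up comes from.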
ZHENG Zhong , WANG Yi-Jie , MA Xing-Kong
2011, 22(4):722-735. DOI: 10.3724/SP.J.1001.2011.03770 CSTR:
Abstract:As an infrastructure for data distribution, overlay networks must incorporate efficient routing and adequate robustness in order to achieve fast and accurate data distribution in an environment with high node churn. Considering that existing overlay networks mostly focus on a single optimization objective and fail to ensure routing efficiency and robustness simultaneously, a hybrid overlay network for data distribution, Laurel, is proposed in this paper. Laurel achieves a better trade-off between routing efficiency and robustness by combining inter-cluster multiple structured topologies with intra-cluster unstructured topologies. Laurel also provides mechanisms for dynamic, concurrent cluster creation, cluster departure, and load balancing to make data distribution more adaptive to dynamic network environments. Experimental results show that, compared with existing overlay networks, Laurel supports faster and more accurate data distribution, even when a large number of nodes fail in the system, and balances the load within clusters.
ZHANG Hui , FANG Xu-Ming , YUAN Qin
2011, 22(4):736-744. DOI: 10.3724/SP.J.1001.2011.03779 CSTR:
Abstract:Since the radio spectrum is a very scarce resource, call admission control (CAC) is one of the most important parts of radio resource management in mobile communication systems. Aiming to resolve the problem of streaming media over-using resources, a cooperative game theory-based CAC strategy is proposed in this paper. The players are the services already in connection and the new service requesting access to the system; the base station acts as the external power that ensures the operation of the protocol. The results show that the resource capture effect caused by streaming media is eased by adopting this CAC strategy, and thus access fairness is guaranteed. This is significant for improving the performance of practical system applications.
LI Zhi-Jun , JIANG Shou-Xu , LI Xiao-Yi
2011, 22(4):745-760. DOI: 10.3724/SP.J.1001.2011.03754 CSTR:
Abstract:Some peers may receive low-quality service and information from other peers in peer-to-peer (P2P) networks. Reputation evaluation is the usual method for reducing this phenomenon. P2P reputation based on score feedback is defective: it cannot distinguish malicious feedback from erroneous feedback returned by honest peers; it takes a long time for the reputation to converge and for feedback to be evaluated; and it is inflexible and unnatural to depict the reputation of a peer through a large set of numbers. In fact, reputation is ultimately used to determine the ranking of peers. A reputation model called RbRf (reputation based ranking feedback), which relies on rank feedback, is presented in this paper. Mathematical models developed in this paper show that in RbRf the influence of erroneous feedback attenuates exponentially, the influence of unintended malicious feedback attenuates polynomially, and intended collusive feedback is counteracted by the correct information that such feedback introduces. In summary, the defects of score feedback, such as the need for a second evaluation of feedback trust, no longer exist in RbRf, because RbRf uses rank feedback instead of score feedback, and RbRf achieves a better effect in resisting malicious attacks. All results are verified by experimental data.
WANG Shang-Guang , SUN Qi-Bo , YANG Fang-Chun
2011, 22(4):761-772. DOI: 10.3724/SP.J.1001.2011.03818 CSTR:
Abstract:To detect SIP (session initiation protocol) flooding attacks against the IP Multimedia Subsystem of the 3G core network, this study proposes a detection approach with double sampling and variable sampling intervals. Based on a Counting Bloom Filter for the statistical detection of characteristic information, this approach divides the detection space into five ranges, namely the normal range, interesting range, detection range, precise detection range, and attack range, and handles the statistics falling in each of the different ranges. Simulation results show that this approach has good detection performance.
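The Counting Bloom Filter underlying the statistics can be sketched as below: unlike a plain Bloom filter, it keeps small counters instead of bits, so per-key counts (e.g. SIP messages per flow) can be both incremented and decremented. This is a generic textbook sketch, not the paper's implementation; the SIP-flavored keys are illustrative.

```python
import hashlib

class CountingBloomFilter:
    """Minimal counting Bloom filter: an array of counters indexed
    by several hash functions; estimate() is an upper bound on the
    number of times an item was added."""
    def __init__(self, size=1024, hashes=3):
        self.counts = [0] * size
        self.hashes = hashes

    def _indexes(self, item):
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % len(self.counts)

    def add(self, item):
        for idx in self._indexes(item):
            self.counts[idx] += 1

    def remove(self, item):
        for idx in self._indexes(item):
            self.counts[idx] -= 1

    def estimate(self, item):
        """Minimum over the item's counters; never underestimates."""
        return min(self.counts[idx] for idx in self._indexes(item))
```

A detector can then compare `estimate()` for per-source SIP request counts against the thresholds that separate the five ranges the abstract describes.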
ZHU Gui-Ming , GUO De-Ke , JIN Shi-Yao
2011, 22(4):773-781. DOI: 10.3724/SP.J.1001.2011.03757 CSTR:
Abstract:It is hard to optimize query latency, query hit rate, and query cost at the same time for resource location in unstructured peer-to-peer networks. For this problem, this paper presents a probabilistic routing algorithm called DCBF (data copying and Bloom Filter), which is based on data copying and the Bloom Filter technique. DCBF makes a few copies of each shared resource and places each copy on a randomly selected node of a directed random network. Each node forwards membership information to neighboring nodes with distributed declining Bloom Filters. Analysis and experimental results show that, by making only a few copies and forwarding the membership information of resource objects with distributed declining Bloom Filters, DCBF enables most nodes to use this membership information and achieves high query hit rates with low cost and low latency.
2011, 22(4):782-788. DOI: 10.3724/SP.J.1001.2011.03730 CSTR:
Abstract:The main operations of elliptic curve cryptosystems (ECCs) are scalar multiplications and multi-scalar multiplications, which largely determine the overall efficiency of an ECC implementation. This paper extends the fixed-base window method by using signed integer factorial expansions of the scalar. The main characteristic of this method is that only point additions are required, which greatly improves the computational performance of multi-scalar multiplication. Furthermore, a correctness proof and complexity analysis of the new algorithm are presented. Finally, experimental results show that the computational efficiency increases by about 47.8% to 56.5% compared with other existing methods in the case m=2.
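For readers unfamiliar with the baseline, the classic fixed-base window method that the paper extends can be sketched with modular exponentiation standing in for elliptic curve arithmetic (exponentiation plays the role of scalar multiplication, and modular multiplication the role of point addition). This is the textbook method only, not the paper's signed factorial-expansion variant; modulus, base, and digit count are illustrative.

```python
def precompute(g, p, base, digits):
    """Precompute g^(base^i) mod p for each digit position i.
    Done once per fixed base g, then reused for every scalar."""
    table, acc = [], g
    for _ in range(digits):
        table.append(acc)
        acc = pow(acc, base, p)
    return table

def fixed_base_pow(k, table, p, base):
    """Compute g^k mod p from the table using only multiplications
    (the analogue of requiring only point additions)."""
    digits = []
    for _ in table:                       # write k in the chosen base
        digits.append(k % base)
        k //= base
    a = b = 1
    for j in range(base - 1, 0, -1):      # classic two-accumulator evaluation
        for i, d in enumerate(digits):
            if d == j:
                b = (b * table[i]) % p
        a = (a * b) % p
    return a

p, g = 1000003, 5
table = precompute(g, p, base=4, digits=10)   # covers scalars below 4**10
assert fixed_base_pow(123456, table, p, base=4) == pow(g, 123456, p)
```

The evaluation loop multiplies each `table[i]` into the result exactly `digits[i]` times without any squarings, which is why precomputation trades memory for a pure-addition main loop.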
LIANG Yun , SU Zhuo , LUO Xiao-Nan , WANG Dong
2011, 22(4):789-800. DOI: 10.3724/SP.J.1001.2011.03824 CSTR:
Abstract:Image shrinkage is the process of reducing image resolution to adapt to display screens with different aspect ratios and sizes. Its key is to highlight important areas, keep continuity, and avoid twists. This paper presents a novel image shrinking method. First, the method iteratively shrinks a quad mesh covering the original image to the target size under the constraint of an energy distortion function. Then, it obtains the target image by interpolating and mapping the target mesh. The energy distortion function reflects the effects of highlighting important regions, preserving structure, and avoiding twists; less distortion yields a better result. Under the constraint of energy distortion, every sub-quad of the mesh shrinks non-uniformly: quads with more importance shrink less. In order to accurately calculate a quad's importance, this paper proposes a new method, called the Hot-Target map, to calculate image importance according to image saliency and edges. Finally, the method avoids distortion by preserving linear image edges, called featured straight edges. To increase efficiency and reduce complexity, the method is carried out by solving linear equations. Experimental results verify its effectiveness.
LI Zhi-Xin , SHI Zhi-Ping , LI Zhi-Qing , SHI Zhong-Zhi
2011, 22(4):801-812. DOI: 10.3724/SP.J.1001.2011.03742 CSTR:
Abstract:Automatic image annotation has become an important issue due to the existence of the semantic gap. Based on probabilistic latent semantic analysis (PLSA), this paper presents an approach to annotating and retrieving images by fusing semantic topics. First, in order to model the training data precisely, each image is represented as a bag of visual words. Then, a probabilistic model is designed to capture latent semantic topics from the visual and textual modalities, respectively. Furthermore, an adaptive asymmetric learning approach is proposed to fuse these semantic topics: for each image document, the topic distributions of the modalities are fused by multiplying them by different weights, which are determined by the entropy of the distribution of visual words. Consequently, the probabilistic model can predict semantic annotations for an unseen image because it properly associates the visual and textual modalities. The approach is compared with several state-of-the-art approaches on a standard Corel dataset. The experimental results show that it performs more effectively and accurately.
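The entropy-weighted fusion step can be sketched as follows. This is an illustrative sketch of the idea, not the paper's exact formula: the weighting rule (a flatter, higher-entropy visual distribution gets less weight) and the renormalization are assumptions made for the example.

```python
import math

def entropy(dist):
    """Shannon entropy of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in dist if p > 0)

def fuse_topics(visual, textual):
    """Fuse per-image topic distributions from two modalities.
    The visual weight decreases as the visual distribution becomes
    flatter (less informative), so the textual modality dominates
    for ambiguous images."""
    h = entropy(visual)
    h_max = math.log(len(visual))            # entropy of the uniform distribution
    w_vis = 1.0 - h / h_max if h_max > 0 else 0.5
    fused = [w_vis * v + (1.0 - w_vis) * t for v, t in zip(visual, textual)]
    z = sum(fused)
    return [f / z for f in fused]            # renormalize to a distribution
```

At the extremes: a perfectly peaked visual distribution (entropy 0) is kept as-is, while a uniform one (maximum entropy) is replaced entirely by the textual topics.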
ZHANG Jun , DAI Xia , SUN De-Quan , WANG Bang-Ping
2011, 22(4):813-825. DOI: 10.3724/SP.J.1001.2011.03795 CSTR:
Abstract:In this paper, an efficient and robust image fusion method is proposed to extract, directly from a set of low dynamic range images taken with varying exposure values, the pixel intensity under the best exposure value. Using this method, an HDR (high dynamic range) image can be computed and shown on a traditional display device without any prior information about the camera or scene. The proposed image fusion method is based on a mathematical model of the pixel intensity curve, established by a special robust curve-fitting algorithm, with criteria to evaluate the image quality at each pixel. Extensive experiments on various scenes indicate that the proposed method is more efficient than traditional HDRI (high dynamic range imaging) technologies and more robust when imaging noise, object movement, and small camera displacement occur simultaneously.
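As background, the general exposure-fusion principle can be sketched per pixel: weight each exposure's intensity by how well exposed it is (near mid-range) and take the weighted average. This is a generic sketch, not the paper's intensity-curve model; the Gaussian weight and its parameters are assumptions.

```python
import math

def well_exposedness(v, mid=0.5, sigma=0.2):
    """Gaussian weight peaking at mid-range intensity; v is a
    normalized intensity in [0, 1]. Under- and over-exposed values
    get small weights."""
    return math.exp(-((v - mid) ** 2) / (2 * sigma ** 2))

def fuse_pixel(values):
    """Fuse one pixel's intensities across the exposure stack into
    a single well-exposed value."""
    weights = [well_exposedness(v) for v in values]
    z = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / z
```

Applying `fuse_pixel` to every pixel of an aligned exposure stack yields a displayable fused image directly, without estimating the camera response curve.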
CHEN Bo , WANG Hong-Xia , CHENG Li-Zhi
2011, 22(4):826-832. DOI: 10.3724/SP.J.1001.2011.03805 CSTR:
Abstract:The traditional discrete cosine transform (DCT) can only sparsely represent the horizontal and vertical edges in images. The computational complexity of the directional prediction DCT (DPDCT), which can represent more directions, is much higher. To overcome these shortcomings, the fast directional discrete cosine transform (FDDCT) is proposed in this paper, in which the transform is performed along predefined direction modes. Compared with DPDCT, no interpolation is needed in FDDCT, so FDDCT can sparsely represent anisotropic edges in images much faster. A special lifting scheme is designed between adjacent blocks to ensure perfect reconstruction; it compacts the energy of edges lying across block boundaries. The experimental results show that the computation of FDDCT is no more than 1.4 times that of DCT. Coding with the same set-partitioning method, the PSNR of images compressed with FDDCT is 0.4~1.6dB higher than that of images compressed with DCT or DPDCT; moreover, the edges and details in the images are much clearer, with less distortion.
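To make the opening claim concrete, the sketch below computes a naive 2-D DCT-II of a small block containing a vertical step edge: all the energy lands in the first row of coefficients, whereas a diagonal edge would spread across many coefficients. This is a standard textbook DCT, not FDDCT; the 4x4 block is illustrative.

```python
import math

def dct2(block):
    """Naive 2-D DCT-II of an N x N block with orthonormal scaling."""
    n = len(block)
    def c(k):
        return math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = sum(
                block[x][y]
                * math.cos((2 * x + 1) * u * math.pi / (2 * n))
                * math.cos((2 * y + 1) * v * math.pi / (2 * n))
                for x in range(n) for y in range(n)
            )
            out[u][v] = c(u) * c(v) * s
    return out

# A vertical step edge (constant down each column): only the u = 0
# row of coefficients is nonzero, i.e. the representation is sparse.
block = [[0, 0, 1, 1] for _ in range(4)]
```

Rotating the edge by 45 degrees breaks this separability, which is exactly the anisotropy that directional transforms such as FDDCT address.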