LIU Bi-Xin, WANG Yu-Feng, JIA Yan, WU Quan-Yuan
2005, 16(11):1859-1867.
Abstract:Web service composition is an important technology for agile inter-enterprise application integration. Centralized composition engines are widely adopted to enact composite services in many Web service composition research projects. Such a centralized architecture, however, leads to problems of scalability, message-exchange efficiency, and autonomy. This paper proposes a role-based decentralized approach for service composition, which partitions the global process model of a composite service into local process models according to the participant roles, so as to distribute the control logic of the composite service and the corresponding execution load over multiple nodes. An algorithm for generating the local process models is presented in detail, and the deployment and execution mechanisms are introduced. Simulation results indicate that this approach supports highly concurrent requests and large volumes of data more effectively than the centralized architecture, and is therefore helpful in improving the scalability of composite services.
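The core partitioning step can be pictured with a small sketch. The following Python fragment is only an illustration of the idea under simplifying assumptions (each activity is labelled with a single role, and control flow is a list of plain edges); it is not the paper's algorithm, which also handles structured constructs such as branches and loops.

```python
# Toy sketch of role-based partitioning: activities of a global process are
# grouped by the role that executes them, and every control-flow edge that
# crosses roles is replaced by a send/receive message pair, so the control
# logic is distributed over the participants.
def partition_by_role(activities, edges):
    """activities: {activity: role}; edges: list of (from_activity, to_activity)."""
    local = {role: {"activities": [], "messages": []}
             for role in set(activities.values())}
    for act, role in activities.items():
        local[role]["activities"].append(act)
    for a, b in edges:
        ra, rb = activities[a], activities[b]
        if ra != rb:                                   # cross-role dependency
            local[ra]["messages"].append(("send", a, "->", rb))
            local[rb]["messages"].append(("recv", b, "<-", ra))
    return local

# Hypothetical composite service with three participant roles.
global_process = {"order": "buyer", "check_stock": "seller",
                  "ship": "shipper", "invoice": "seller"}
flow = [("order", "check_stock"), ("check_stock", "ship"), ("ship", "invoice")]
for role, model in partition_by_role(global_process, flow).items():
    print(role, model)
```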
DING Jian-Wan, CHEN Li-Ping, ZHOU Fan-Li, HUANG Hua
2005, 16(11):1868-1875.
Abstract:The use of mathematical modeling and simulation in engineering is increasing rapidly as modern products become more complex and heterogeneous. Consistency analysis of simulation models is a crucial subject in the multi-domain modeling of complex physical systems. In this paper, a structural analysis method based on graph-theoretical approaches is presented. The method not only determines whether a simulation model is consistent, but also decomposes the overall system of equations into three distinct parts: an over-constrained, an under-constrained, and a well-constrained part. A methodology is also proposed for detecting and repairing over-constrained and under-constrained situations. The equations and variables that cause the inconsistencies can be automatically detected and isolated without solving the system of equations, and meaningful repair messages are given to users. The methodology considerably enhances the error-finding and correcting process by covering a broad range of errors, and has been implemented in a modeling and simulation tool named MWorks.
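The over-/under-/well-constrained decomposition described above is typically derived from a maximum matching on the bipartite equation-variable graph. The Python sketch below illustrates that idea under the assumption that a model is given simply as a map from equation names to the variables they mention; it is not the MWorks implementation and omits the full block structure of the decomposition.

```python
# Minimal structural consistency check for an equation system (a sketch, not
# the MWorks implementation).  Equations and variables form a bipartite graph;
# a perfect matching means the system is structurally well-posed.

def max_matching(eqs):
    """eqs: dict mapping equation name -> set of variable names it mentions.
    Returns dict variable -> matched equation (Kuhn's augmenting-path method)."""
    match = {}                      # variable -> equation

    def augment(eq, seen):
        for var in eqs[eq]:
            if var in seen:
                continue
            seen.add(var)
            # Free variable, or the equation currently holding it can be re-matched.
            if var not in match or augment(match[var], seen):
                match[var] = eq
                return True
        return False

    for eq in eqs:
        augment(eq, set())
    return match

def structural_report(eqs):
    match = max_matching(eqs)
    matched_eqs = set(match.values())
    all_vars = set().union(*eqs.values()) if eqs else set()
    over = [e for e in eqs if e not in matched_eqs]     # redundant/conflicting equations
    under = [v for v in all_vars if v not in match]     # variables lacking an equation
    return {"consistent": not over and not under,
            "over_constrained_eqs": over,
            "under_constrained_vars": under}

# Example: x + y = 0, x - y = 1, x = 2  ->  one equation too many for {x, y}.
print(structural_report({"e1": {"x", "y"}, "e2": {"x", "y"}, "e3": {"x"}}))
```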
DONG Guang-Zhi, LIU Jun-Fei, QI Xuan
2005, 16(11):1876-1885.
Abstract:A software process supporting environment (PSE) is a kind of computer system that supports the meta-process of a software process. A PSE controls and guides the real-world software development process by enacting a pre-defined software process model (SPM). The ways an SPM can control the real-world process fall into two groups: proactive and reactive. The proactive way cannot support software process evolution well, so more and more attention is being paid to the reactive way. A kind of reactive SPM and the graphical software process modeling language used to define it are presented. For each model defined in this language, a method is also proposed to express the dynamic semantics of its behavior view with the temporal logic language XYZ/E. This provides rigorous dynamic semantics for the model and a formal basis for its enactment and analysis.
2005, 16(11):1886-1893.
Abstract:Efficient and scalable data management is increasingly important in large-scale distributed storage systems. A key enabling technique is a flexible, balanced, and scalable data object placement and location scheme that automatically adapts to the addition or departure of storage nodes. In this paper, a data object placement algorithm based on dynamic interval mapping is proposed, which is probabilistically optimal both in distributing data evenly and in minimizing data movement when the set of storage nodes changes. Moreover, the algorithm supports weighted allocation among storage nodes and variable levels of object replication.
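As a rough illustration of interval-based placement only (the paper's dynamic interval mapping additionally handles weighted nodes, node removal, and replication in full), the sketch below hashes objects into [0,1) and lets a joining node take an equal share carved proportionally from every existing interval, so only that fraction of objects has to move.

```python
# Sketch of placement by interval mapping (illustrative; not the paper's
# dynamic interval mapping).  The unit interval is partitioned among nodes,
# and objects are hashed to a point in [0, 1).
import hashlib

class IntervalMap:
    def __init__(self, first_node):
        # Each entry is (start, end, owner); the intervals partition [0, 1).
        self.intervals = [(0.0, 1.0, first_node)]

    def add_node(self, node):
        # The new node takes an equal share, carved proportionally from every
        # existing interval, so only that fraction of objects has to move.
        n = len({o for _, _, o in self.intervals}) + 1
        take = 1.0 / n
        new = []
        for start, end, owner in self.intervals:
            cut = end - (end - start) * take
            new.append((start, cut, owner))
            new.append((cut, end, node))
        self.intervals = new

    def lookup(self, key):
        x = int(hashlib.sha1(key.encode()).hexdigest(), 16) / float(16 ** 40)
        for start, end, owner in self.intervals:
            if start <= x < end:
                return owner
        return self.intervals[-1][2]

m = IntervalMap("node0")
m.add_node("node1")
m.add_node("node2")
print(m.lookup("object-42"))   # every object maps to exactly one node
```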
CHEN Jie, CHEN Xi-Lin, GAO Wen
2005, 16(11):1894-1901.
Abstract:Collecting data for both training and testing a classifier is a tedious but essential step in face detection and recognition, and all statistical methods suffer from this problem. In this paper, a genetic algorithm (GA) based method is presented to enlarge a face database by re-sampling from existing faces. The basic idea is that a face is composed of a limited set of components, and the GA can simulate the procedure of heredity. This simulation can also cover the variations of faces under different lighting conditions, poses, accessories, and quality conditions. To verify the generalization capability of the proposed method, the expanded database is used to train an AdaBoost-based face detector, which is then tested on the MIT+CMU frontal face test set. The experimental results show that data collection can be speeded up efficiently by the proposed method.
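A toy version of the re-sampling idea might look like the sketch below, where a candidate face is a tuple recording which existing face each component is taken from, and ordinary crossover and mutation recombine them. The component list and the fitness stand-in are hypothetical; the paper's actual operators and fitness function differ.

```python
# Toy GA that recombines facial components from an existing face set
# (a sketch of the re-sampling idea; the paper's operators and fitness differ).
import random

COMPONENTS = ["eyes", "nose", "mouth", "cheeks", "chin"]

def random_face(n_faces):
    # A candidate face = which source face each component is taken from.
    return [random.randrange(n_faces) for _ in COMPONENTS]

def crossover(a, b):
    cut = random.randrange(1, len(COMPONENTS))
    return a[:cut] + b[cut:]

def mutate(face, n_faces, rate=0.1):
    return [random.randrange(n_faces) if random.random() < rate else g
            for g in face]

def evolve(n_faces, pop_size=50, generations=20,
           fitness=lambda f: random.random()):
    # 'fitness' is a stand-in for a face-likeness score (e.g. a detector response).
    pop = [random_face(n_faces) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)), n_faces)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return pop

new_faces = evolve(n_faces=100)
print(new_faces[0])   # e.g. [eyes from face 17, nose from face 3, ...]
```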
WANG Zheng-Qun, CHEN Shi-Fu, CHEN Zhao-Qian
2005, 16(11):1902-1908.
Abstract:Neural network ensembles have been used increasingly in recent years to improve the generalization ability of classifiers. There are several methods of designing neural network ensembles, such as the weighted average linear ensemble, whose output is a weighted average of the outputs of the component neural networks, with the weights determined by a criterion function. Based on the characteristics of the classification problem, such a function is defined as the ratio of the pattern separability within a class to the pattern scatter between classes. The minimum of this function corresponds to the optimal weights, yielding an optimal linear ensemble; the optimal weights are searched by a genetic algorithm. The rationale behind the function is also analyzed, showing that it accords with the Bayesian decision rule. Finally, to evaluate the performance of this linear ensemble, two handwritten Chinese character feature sets and four data sets from the UCI machine learning repository are used. The empirical study shows that the optimal linear ensemble method can produce neural network ensembles with stronger generalization ability.
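The weight-selection criterion can be written down directly: compute the weighted ensemble output and take the ratio of within-class scatter to between-class scatter, which any optimizer (the paper uses a genetic algorithm) can then minimize. The sketch below assumes the component networks' outputs are given as an array; it is an illustration, not the paper's code.

```python
# Sketch of the weight-selection criterion: the ratio of within-class scatter
# to between-class scatter of the weighted ensemble output (smaller is better).
import numpy as np

def ensemble_output(outputs, weights):
    # outputs: (n_networks, n_samples, n_classes); weights are normalized here.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return np.tensordot(w, outputs, axes=1)          # (n_samples, n_classes)

def scatter_ratio(outputs, labels, weights):
    y = ensemble_output(outputs, weights)
    overall_mean = y.mean(axis=0)
    within, between = 0.0, 0.0
    for c in np.unique(labels):
        yc = y[labels == c]
        mc = yc.mean(axis=0)
        within += np.sum((yc - mc) ** 2)
        between += len(yc) * np.sum((mc - overall_mean) ** 2)
    return within / between                          # objective to minimize

# Example with 3 networks, 6 samples, 2 classes (random stand-in outputs).
rng = np.random.default_rng(0)
outs = rng.random((3, 6, 2))
labels = np.array([0, 0, 0, 1, 1, 1])
print(scatter_ratio(outs, labels, [0.5, 0.3, 0.2]))
```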
GAO Xiao-Yu, GAO Qing-Shi, HU Yue, LI Li
2005, 16(11):1909-1919.
Abstract:In this paper, a high-speed multi-language machine translation approach based on pruning over tree representations of semantic elements is proposed. The approach has the following characteristics: Chinese word segmentation before translation into other languages is not necessary, and the translation time is O(L) rather than the usual O(L·N), where L is the length of the text and N is the number of semantic elements (i.e., the number of language patterns) in the SER-base, even when N is in the hundreds of thousands or millions.
2005, 16(11):1920-1928.
Abstract:Abduction-based speculative computation is a process that uses default hypotheses as tentative answers and continues computation when required information cannot be obtained in time. During computation, if a response is inconsistent with its belief, the master agent revises its belief. To achieve its goals, obtain more accurate results, and reduce decision risks under time constraints, the master agent in speculative computation should try to obtain more information through negotiation, so negotiation is a key means of reducing decision risk. In this paper, the basic theories of abduction and speculative computation are introduced. Based on these theories, a framework extending speculative computation with belief revision under time constraints is presented, together with a further negotiation framework based on time constraints and a negotiation algorithm with belief revision. The algorithm is embedded in speculative computation so that the master agent can revise its belief during negotiation. Finally, three experiments on goods transportation are given. The experimental results indicate that the algorithm can improve the accuracy of the results of speculative computation and reduce their risk.
CHEN Ning-Jiang, WEI Jun, YANG Bo, HUANG Tao
2005, 16(11):1929-1938.
Abstract:Failure detection provides the ability to detect the liveness of runtime systems in a timely manner and is a basic reliability technology in distributed systems. Good failure detection is important to the Web application server (WAS), the leading middleware in the Web computing environment. Adaptive failure detection requires that failure detectors can dynamically adjust their detection quality according to the requirements of applications and runtime environments. In this paper, the concepts and quality metrics of failure detectors are first discussed, and a multi-level model of failure detection in WAS is presented. Based on a QoS (quality of service) specification for failure detectors, an algorithm for adaptive failure detection is given, and an adaptive failure detection framework for WAS is designed, which satisfies the requirements of dynamically adjustable quality and flexible integration of failure detectors. The work has been implemented in the OnceAS application server, and experimental results are given at the end.
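A minimal sketch of an adaptive heartbeat-style detector is shown below: the timeout is re-estimated from recent inter-arrival times plus a safety margin that an application can tune from its QoS requirements. This is illustrative only and is not the OnceAS implementation.

```python
# Sketch of a QoS-driven adaptive heartbeat failure detector: the deadline for
# the next heartbeat is re-estimated from recent inter-arrival times plus a
# safety margin, trading detection time against false suspicions.
from collections import deque

class AdaptiveDetector:
    def __init__(self, window=100, safety_margin=0.2):
        self.arrivals = deque(maxlen=window)
        self.safety_margin = safety_margin   # seconds, set from the QoS spec

    def heartbeat(self, now):
        self.arrivals.append(now)

    def next_deadline(self):
        if len(self.arrivals) < 2:
            return None                       # not enough history yet
        gaps = [b - a for a, b in zip(self.arrivals, list(self.arrivals)[1:])]
        expected_gap = sum(gaps) / len(gaps)
        return self.arrivals[-1] + expected_gap + self.safety_margin

    def suspect(self, now):
        deadline = self.next_deadline()
        return deadline is not None and now > deadline

d = AdaptiveDetector()
for t in (0.0, 1.0, 2.1, 3.0):
    d.heartbeat(t)
print(d.suspect(3.9), d.suspect(4.5))   # False, then True once the deadline passes
```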
HU Han-Ping, CHEN Xiang, ZHANG Bao-Liang, GUO Wen-Xuan
2005, 16(11):1939-1945.
Abstract:Based on an active defense model of network data transmission, a method for measuring the security of network data transmission is proposed. Deceptive packets are used in the active defense model to trap attacks. In addition, statistical quantification is used to measure and evaluate the security of network data transmission according to network status parameters. This method not only helps formulate the network data transmission policy accurately and efficiently, but also helps guarantee the security of network data transmission.
ZHANG Qing, XIE Zhi-Peng, LING Bo, SUN Wei-Wei, SHI Bai-Le
2005, 16(11):1946-1957.
Abstract:This paper investigates the maximum lifetime data gathering problem theoretically. Specifically, (1) the simplified static routing scheme, in which only one routing tree is used to gather data during the lifetime of the network, is analyzed; (2) the practical dynamic routing scheme, in which a series of routing trees is used to gather data, is analyzed; (3) a near-optimal maximum lifetime data gathering and aggregation algorithm, MLDGA, is proposed, which tries to minimize the total energy consumption in each round and to maximize the lifetime of the routing tree used in that round; (4) the MLDGA algorithm is simulated in the Java programming language. In contrast to existing algorithms that are efficient only under certain conditions, the simulation results show that MLDGA performs well regardless of the base station location and the initial battery energy levels of the sensors.
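The per-round tree construction can be sketched as follows, with each node attaching to the parent that keeps link energy low while avoiding parents with little residual battery. The cost function and its weights here are hypothetical; MLDGA's actual criterion is given in the paper.

```python
# Sketch of per-round routing-tree construction for data gathering: each node
# picks a parent minimizing a cost that combines link energy and the parent's
# depleted battery (illustrative only; not MLDGA's exact criterion).
import math

def build_tree(positions, energy, base=0, alpha=1.0, beta=1.0):
    """positions: {node: (x, y)}, energy: {node: residual J}, base: sink id.
    Returns {node: parent}."""
    def link_cost(u, v):
        (x1, y1), (x2, y2) = positions[u], positions[v]
        d2 = (x1 - x2) ** 2 + (y1 - y2) ** 2          # radio energy ~ distance^2
        return alpha * d2 + beta / max(energy.get(v, math.inf), 1e-9)

    parent, tree_nodes = {}, {base}
    remaining = set(positions) - {base}
    while remaining:
        # Greedily attach the cheapest (outside node, in-tree parent) pair.
        u, p = min(((u, p) for u in remaining for p in tree_nodes),
                   key=lambda up: link_cost(*up))
        parent[u] = p
        tree_nodes.add(u)
        remaining.remove(u)
    return parent

pos = {0: (0, 0), 1: (1, 0), 2: (2, 0), 3: (1, 1)}
eng = {1: 5.0, 2: 5.0, 3: 0.5}        # node 3 is nearly depleted
print(build_tree(pos, eng))            # depleted nodes attract fewer children
```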
LIU Yi-Qun, ZHANG Min, MA Shao-Ping
2005, 16(11):1958-1966.
Abstract:Key resource pages are among the most important target pages for Web search users. Decision tree learning is one of the most widely used and practical methods for inductive inference in machine learning. Because of the difficulty of uniformly sampling Web pages, there are not enough negative instances for training a key resource decision tree. To solve this problem, the original algorithm is partly modified to learn from global instead of individual instance information. With the same evaluation method as TREC (Text Retrieval Conference) 2003, large-scale retrieval experiments based on the improved decision tree algorithm achieve more than a 40% improvement over those based on the original algorithm. This not only offers an effective way to select key resource pages on the Web, but also shows a possible way to improve decision tree learning performance.
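The baseline setup the paper starts from is an ordinary decision tree over page-level features, as in the short scikit-learn sketch below; the features shown are hypothetical stand-ins, and the paper's modification to learn from global rather than per-instance information is not reproduced.

```python
# Baseline decision tree for key resource page classification (illustrative
# features only; the paper's global-information modification is not shown).
from sklearn.tree import DecisionTreeClassifier

# Each row: [in-degree, URL depth, anchor-text length, is-homepage]
X = [[120, 1, 40, 1],
     [  3, 4,  5, 0],
     [ 80, 2, 25, 1],
     [  1, 5,  2, 0]]
y = [1, 0, 1, 0]           # 1 = key resource page, 0 = ordinary page

clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.predict([[60, 2, 30, 1]]))   # likely classified as a key resource
```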
CHEN Wei-Dong, FENG Deng-Guo, TAN Zuo-Wen
2005, 16(11):1967-1974.
Abstract:The problem of constructing signature schemes for specified verifiers was proposed by Laih, who also gave such a scheme. This paper shows that Laih's scheme is not secure and puts forward a scheme called SV-EDL. Furthermore, provable security theory is used to analyze such schemes, i.e., the security of the SV-EDL scheme is proved in the RO (random oracle) model. Its security against forgery is tightly related to the CDH (computational Diffie-Hellman) problem, i.e., forgery is almost as difficult as solving the CDH problem. In particular, for anyone except the specified verifiers, the ability to verify a signature is tightly related to the DDH (decisional Diffie-Hellman) problem. Since the hardness of the CDH and DDH problems is widely believed to be closely related to the hardness of the DL (discrete logarithm) problem, the scheme offers better security guarantees than existing schemes. In addition, it offers non-repudiation in a very straightforward manner. Finally, the concept of threshold verification is proposed, a (t,m)-threshold verification protocol is constructed, and its security is proved in the standard model. In particular, the scheme does not require a trusted center.
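For reference, the hardness assumptions the proofs are reduced to can be stated as follows; these are the standard definitions over a cyclic group, not text taken from the paper.

```latex
% Standard statements of the assumptions referenced above, for a cyclic group
% $G=\langle g\rangle$ of prime order $q$ (general definitions, not quoted
% from the paper).
\textbf{DL:}  given $(g,\,g^{a})$ with $a \in_R \mathbb{Z}_q$, compute $a$.

\textbf{CDH:} given $(g,\,g^{a},\,g^{b})$ with $a,b \in_R \mathbb{Z}_q$,
compute $g^{ab}$.

\textbf{DDH:} given $(g,\,g^{a},\,g^{b},\,g^{c})$, decide whether
$c \equiv ab \pmod{q}$ or $c$ is uniformly random in $\mathbb{Z}_q$.

% A DL solver yields a CDH solver, and a CDH solver decides DDH, so the
% problems are ordered $\mathrm{DDH} \preceq \mathrm{CDH} \preceq \mathrm{DL}$
% in hardness.
```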
CHEN Gang, ZHAO Xiao-Yu, LI Jun-Li
2005, 16(11):1975-1982.
Abstract:In this paper, a new self-adaptive image encryption algorithm is presented, which provides thorough integrity protection and can be used for data validation. First, ergodic matrices are used to realize the position-permutation algorithms; in particular, several novel scrambling methods are proposed. Then, by analyzing the weakness of pure position-permutation algorithms, a novel improved self-adaptive algorithm is proposed that is strong against known-plaintext attacks on image encryption. Finally, the speed and security of the new algorithm are analyzed and some simulation results are given.
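The position-permutation component can be illustrated by the sketch below, which shuffles pixel positions with a key-seeded permutation and restores them with its inverse. The paper derives its permutations from ergodic matrices and adds the self-adaptive stage; the PRNG-seeded shuffle here is used purely to show the scramble/unscramble structure.

```python
# Illustration of position permutation (scrambling): pixel positions are
# shuffled by a key-seeded permutation and restored by its inverse.
import numpy as np

def scramble(img, key):
    flat = img.reshape(-1)
    perm = np.random.default_rng(key).permutation(flat.size)
    return flat[perm].reshape(img.shape), perm

def unscramble(scrambled, perm):
    flat = scrambled.reshape(-1)
    out = np.empty_like(flat)
    out[perm] = flat                  # invert the permutation
    return out.reshape(scrambled.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc, perm = scramble(img, key=2005)
assert np.array_equal(unscramble(enc, perm), img)
print(enc)
```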
WU Yong, HE Yuan-Jun, CAI Hong-Ming
2005, 16(11):1983-1991.
Abstract:A heuristic strategy is presented to solve the point location problem in a spherical triangulation mesh. First, a spherical mesh with regular subdivision connectivity is constructed to partition the spherical domain into small regions. Then, the region containing the query point p is found according to the position of p and selected as the search area for locating p. During point location, barycentric coordinates are used to extract local heuristic information about the location of p, so as to find the shortest path from the start triangle to the target triangle containing p. Comparison with traditional algorithms shows that the heuristic strategy has better time and space performance.
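The barycentric walking heuristic itself is easy to sketch in the planar case: from a start triangle, step across the edge opposite the most negative barycentric coordinate until all coordinates are non-negative. The fragment below assumes a simple triangle/neighbor table; the paper applies the same idea on a spherical mesh and uses the regular-subdivision coarse mesh to pick the start triangle.

```python
# Planar sketch of barycentric "walking" point location (illustrative only).
def barycentric(p, a, b, c):
    det = (b[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(b[1]-a[1])
    l1 = ((b[0]-p[0])*(c[1]-p[1]) - (c[0]-p[0])*(b[1]-p[1])) / det
    l2 = ((c[0]-p[0])*(a[1]-p[1]) - (a[0]-p[0])*(c[1]-p[1])) / det
    return l1, l2, 1.0 - l1 - l2          # coordinates w.r.t. a, b, c

def locate(p, tris, verts, neighbors, start=0):
    """tris[t] = (ia, ib, ic); neighbors[t][k] = triangle across the edge
    opposite vertex k of triangle t (or None on the boundary)."""
    t = start
    while True:
        a, b, c = (verts[i] for i in tris[t])
        coords = barycentric(p, a, b, c)
        k = min(range(3), key=lambda i: coords[i])
        if coords[k] >= 0.0:
            return t                       # p is inside (or on) triangle t
        t = neighbors[t][k]                # walk toward p
        if t is None:
            return None                    # p lies outside the mesh

# Two triangles covering the unit square, split along its diagonal.
verts = [(0, 0), (1, 0), (1, 1), (0, 1)]
tris = [(0, 1, 2), (0, 2, 3)]
neighbors = [[None, 1, None], [None, None, 0]]
print(locate((0.2, 0.8), tris, verts, neighbors, start=0))   # -> 1
```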
SHEN Wan-Qiang, WANG Guo-Zhao
2005, 16(11):1992-1999.
Abstract:A new family of generalized Ball bases of degree n with parameter k (2≤k≤⌊n/2⌋+1) is constructed as a transition from the Wang-Ball basis (k=2) to the Said-Ball basis (k=⌊n/2⌋+1), and some of their properties are given. Then, new generalized Ball curves are defined on these bases, and algorithms for their recursive evaluation, degree elevation, and degree reduction are introduced. Finally, the corresponding triangular Ball bases are presented, and algorithms for the recursive evaluation and degree elevation of triangular Ball surfaces are given.
WAN Hua-Gen, JIN Xiao-Gang, LIU Gang, FENG Jie-Qing, PENG Qun-Sheng
2005, 16(11):2000-2007.
Abstract:3D object fusion provides an easy way to generate novel models from two or more existing geometric models using 3D cutting and pasting operations. As a new kind of geometric modeling tool, it is attracting increasing attention. In this paper, a new montage mesh fusion method based on variational interpolating implicit surfaces is proposed. The approach first computes the mesh regions to be fused by cutting the original meshes with unbounded planes, then constructs a variational implicit surface interpolating those regions and polygonizes it to obtain a blending surface between or among the original meshes. Smooth mesh fusion is finally obtained by cutting away the unwanted portions of the blending surface and performing topology-merging operations. Compared with current methods that fuse mesh objects by directly connecting the regions to be fused, this approach places no restriction on the topologies of the mesh regions, thus enabling fusion among multiple objects, and it runs fast, robustly, and is easy to use. The approach shows promise for applications in geometric modeling as well as computer animation.
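The variational interpolating surface at the heart of the method can be sketched as a radial basis function interpolation: constraint points carry target values (0 on the surface, nonzero off it), and a kernel-plus-polynomial linear system is solved for the weights. The sketch below uses the r^3 kernel and synthetic constraints; the paper's construction of the constraints from the mesh regions to be fused is not reproduced.

```python
# Sketch of a variational (RBF) interpolating implicit surface with the
# kernel phi(r) = r^3 and a degree-1 polynomial term (illustrative only).
import numpy as np

def fit_variational_implicit(points, values):
    """points: (n,3) constraint points, values: (n,) target function values."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    A = np.zeros((n + 4, n + 4))
    A[:n, :n] = d ** 3                              # phi(|x_i - x_j|)
    P = np.hstack([np.ones((n, 1)), points])        # 1, x, y, z
    A[:n, n:] = P
    A[n:, :n] = P.T
    rhs = np.concatenate([values, np.zeros(4)])
    sol = np.linalg.solve(A, rhs)
    w, c = sol[:n], sol[n:]
    def f(x):
        x = np.atleast_2d(x)
        r = np.linalg.norm(x[:, None, :] - points[None, :, :], axis=-1)
        return r ** 3 @ w + c[0] + x @ c[1:]
    return f                                        # zero level set = surface

# Surface points on a unit sphere plus one interior point with value -1.
pts = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0], [0, -1, 0],
                [0, 0, 1], [0, 0, -1], [0, 0, 0]], dtype=float)
vals = np.array([0, 0, 0, 0, 0, 0, -1], dtype=float)
f = fit_variational_implicit(pts, vals)
print(f([[0.0, 0.0, 1.0], [0.0, 0.0, 2.0]]))   # ~0 at the surface constraint
```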
2005, 16(11):2008-2013.
Abstract:In this paper, some key techniques for FEM surface mesh generation from multiple trimmed free-form surfaces are presented. The advancing front method is adopted and its kernel algorithm is given. A new method combining the parametric-space method and the direct 3D method is used in the surface calculation. For the parametric-space calculation of an optimal 3D point, the Tangent Vector Inversion technique is presented to reduce the number of iterations. Test results show that the algorithm is faster and more robust. The FEM meshes generated from large sets of trimmed free-form surfaces can be applied directly in the finite element method.
LIU Gang, PENG Qun-Sheng, BAO Hu-Jun
2005, 16(11):2014-2020.
Abstract:Mapping photographic images onto 3D geometries created from real-world objects is attracting wide attention, and the registration of the images to the 3D geometries is the key technique. Since previous work used either 3D-2D point matching or silhouette matching for registration, it places special demands on the surface features or silhouette shapes of the geometries. In this paper, a new method is presented to overcome this problem. By matching, in space, the 3D points reconstructed from the images against the known geometric model, the topology and curvature information of the model surface can be fully utilized, and all the images can be registered to the 3D model at the same time. Experimental results show that this method can handle some cases that are intractable for traditional methods, and the results are quite satisfactory when the distribution of the reconstructed points is relatively symmetrical around the model surface.
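The rigid 3D-3D alignment step underlying such registration can be sketched with the classical SVD (Kabsch/Procrustes) solution for known correspondences, as below. Establishing the correspondences from surface topology and curvature, as the paper does, is not shown; corresponding point pairs are assumed to be given.

```python
# Sketch of rigid 3D-3D alignment (Kabsch/Procrustes via SVD) for registering
# reconstructed points to a known model, assuming correspondences are given.
import numpy as np

def rigid_align(src, dst):
    """Return R (3x3) and t (3,) minimizing ||R @ src_i + t - dst_i||^2."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy check: recover a known rotation and translation.
rng = np.random.default_rng(1)
src = rng.random((20, 3))
theta = 0.5
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([0.3, -0.2, 1.0])
R, t = rigid_align(src, dst)
print(np.allclose(R, R_true), np.allclose(t, [0.3, -0.2, 1.0]))   # True True
```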
XU Lin, LIU Zhi-Yong, LIU Ke
2005, 16(11):2021-2028.
Abstract:In this paper, the applications for, and the funding of, the 2005 General Program by the Computer Science Division of the Information Science Department of the National Natural Science Foundation of China (NSFC) are summarized.