ZHOU Xiao-Cong, SHU Zhong-Mei
2003, 14(10):1661-1671.
Abstract:Unlike the widespread applications of algebraic methods in computer science, coalgebraic methods, the dual concept of algebras, were not noticed by computer scientists until the mid-1990s. Algebraic methods study the constructive elements of data types, while coalgebraic methods investigate the observable behaviors of systems. Coalgebraic methods have distinct advantages in the mathematical study of state-based systems, since they enable in-depth research on such systems' properties as behavioral equivalence, nondeterminism and so on. Coalgebraic methods have also been applied in many research fields, for example, automata theory, semantics of concurrency, and specifications of object-oriented software. The recent progress of coalgebraic methods, including their basic concepts, categorical foundations, logical foundations and applications, is summarized to draw the attention of relevant researchers.
2003, 14(10):1672-1680.
Abstract:Modal graphs are an effective graphical form of the predicate μ-calculus. The consistency between the predicate μ-calculus and modal graphs is strictly established. Moreover, the relationship among the predicate μ-calculus, nested predicate equations and modal graphs is discussed in detail. An optimized algorithm for transforming predicate μ-calculus formulae into nested predicate equations is presented.
2003, 14(10):1681-1691.
Abstract:The efficiency of an algorithm depends greatly on the data structure adopted in practice, and the processing of useless data wastes both memory and time. Hence, eliminating redundant information is one of the main focuses in the research of algorithms, and the issue has received extensive attention in the real-time field (especially for systems based on dense/continuous time semantics) for many years. After investigating the existing problems in finite representation and operations, and by analyzing the dependence relation in the information, a method is presented for eliminating redundant information in the representation of dense time, developed as a modification of the 'Normalization' procedure. The correctness of the method is proved and its efficiency is verified by experiments.
2003, 14(10):1692-1696.
Abstract:An efficient method is introduced for discovering minimal functional dependencies from large databases. It is based on the concept of agree sets. From agree sets, maximal sets and their complements are derived, and all minimal non-trivial functional dependencies can then be generated. The computation of agree sets can be reduced by using a stripped partition database. A levelwise algorithm is used for computing the left-hand sides of minimal non-trivial functional dependencies. This method can be applied to attribute reduction, clustering and association rule mining in knowledge discovery, as well as to the reorganization and design of databases.
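The following minimal Python sketch illustrates the agree-set idea described above by direct pairwise comparison of tuples; it does not include the stripped-partition optimization or the levelwise left-hand-side search, and the names in it are illustrative only.

```python
from itertools import combinations

def agree_sets(rows, attributes):
    """For every pair of tuples, collect the set of attributes on which
    the two tuples take equal values (their agree set)."""
    result = set()
    for t, u in combinations(rows, 2):
        result.add(frozenset(a for i, a in enumerate(attributes) if t[i] == u[i]))
    return result

# Toy relation over attributes A, B, C
attributes = ("A", "B", "C")
rows = [(1, "x", 10), (1, "y", 10), (2, "x", 20)]
for s in sorted(agree_sets(rows, attributes), key=sorted):
    print(sorted(s))
```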
XIA Ying-Ju, HUANG Xuan-Jing, HU Tian, WU Li-De
2003, 14(10):1697-1705.
Abstract:One special challenge in adaptive information filtering is the problem of extremely sparse data, so it is very important to learn an optimal threshold while filtering the input textual stream. In this paper, an algorithm is presented for threshold optimization. The algorithm learns quickly from few positive samples. Moreover, most of the quantities the algorithm requires can be updated incrementally, so its memory and computational requirements are low. It is also effective, robust, and practically useful. Fudan University's adaptive text filtering system used this algorithm for the first time and ranked third among all runs at TREC-10, with a T10U of 0.215 and a T10F of 0.414.
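As a rough, hypothetical illustration of incremental score-based threshold adaptation (this is not the paper's algorithm, and all names below are assumptions), one could keep running statistics of the scores of documents judged relevant using Welford's online update and place the dissemination threshold a fixed number of standard deviations below their mean:

```python
import math

class IncrementalThreshold:
    """Maintain a running mean/variance of relevant-document scores and set
    the dissemination threshold k standard deviations below that mean."""
    def __init__(self, init_threshold=0.5, k=1.0):
        self.threshold = init_threshold
        self.k = k
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0

    def update(self, score, relevant):
        if relevant:                      # only positive feedback is used
            self.n += 1
            delta = score - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (score - self.mean)
            std = math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0
            self.threshold = self.mean - self.k * std

    def deliver(self, score):
        """Deliver a document only if its score reaches the current threshold."""
        return score >= self.threshold
```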
FENG Yu-Cai, LIU Yu-Bao, FENG Jian-Lin
2003, 14(10):1706-1716.
Abstract:Constrained cube gradient mining is an important mining task whose goal is to extract from a data cube the gradient-probe cell pairs that satisfy a gradient constraint. However, previous work addressed only general data cubes. In this paper, the problem of mining constrained cube gradients from a condensed cube is studied. An algorithm named eLiveSet is developed for this problem by extending the existing efficient mining algorithm LiveSet-driven. The experimental results show that the algorithm outperforms the existing algorithm in mining constrained cube gradients.
LI Jian-Zhong, LI Jin-Bao, SHI Sheng-Fei
2003, 14(10):1717-1727.
Abstract:Sensor networks integrate sensor techniques, embedded computation techniques, distributed computation techniques and wireless communication techniques. They can be used for testing, sensing, collecting and processing information about monitored objects and transferring the processed information to users. Sensor networks are a new research area of computer science and technology with broad application prospects, and both academia and industry are very interested in them. The concepts and characteristics of sensor networks and the data in such networks are introduced, and the issues of sensor networks and their data management are discussed. Advances in research on sensor networks and sensor network data management are also presented.
FAN Guo-Chuang, ZHONG Hua, HUANG Tao, FENG Yu-Lin
2003, 14(10):1728-1739.
Abstract:The Web application server (WAS) is considered to be one of the most exciting milestones of enterprise software technology since the relational database, and it has become very popular in the last few years. It solves the problems of applying traditional middleware to the Web computing environment and provides Web middleware platforms that support the deployment, integration, and execution of transactional Web applications. This new category of middleware has drawn attention from both academia and the software industry. The current state of the art of Web application servers is outlined, including the definition and functionality of a WAS, its architectural model, component container, transaction processing, load balancing, mid-tier caching, Web services and analysis criteria. Using the ECperf benchmark, a comparative study among some leading Web application servers is presented by evaluating their functions and performance. Lastly, some open issues about WASs are discussed, and the future direction of WASs is pointed out.
2003, 14(10):1740-1752.
Abstract:This paper surveys twenty years of research on security protocols. The state of the art in the application of formal methods to the design and analysis of security protocols is presented, and some major threads and emerging trends of research in this area are outlined.
BAO Jun-Peng, SHEN Jun-Yi, LIU Xiao-Dong, SONG Qin-Bao
2003, 14(10):1753-1760.
Abstract:Copy detection has very important applications in both intellectual property protection and information retrieval. Current copy detection research focuses mainly on document copy detection. Early work concentrated mainly on program plagiarism detection, while most current studies address text copy detection. In this paper, a comprehensive survey of natural language text copy detection is given and its development is traced. The approaches and features of a variety of existing text copy detection systems and prototypes are reviewed in detail, and then some key detection techniques are listed and compared with each other. Finally, the future trend of text copy detection is discussed.
YANG Bo, YANG Kun, LIU Da-You
2003, 14(10):1761-1767.
Abstract:Mobile agent technology provides a new means for network management, but it also introduces security risks. Based on an analysis of the security threats that may arise in policy- and mobile-agent-based network management applications and the corresponding countermeasures, the MASF (mobile agent security facility) for network management is presented. MASF supports a wide span of security mechanisms, such as storage protection, confidentiality, authentication, integrity, authorization and security logging, all of which are seamlessly integrated to secure network management. Based on MASF, a practical network management application, inter-domain virtual private network configuration, is developed. As verified by this application, MASF satisfies most security requirements of network management.
2003, 14(10):1768-1780.
Abstract:Up to now, the World Wide Web (WWW) has grown into a large hyperlinked corpus with more than 800 million pages and 5,600 million hyperlinks. Moreover, it is obviously impossible to impose any global 'planning' on the creation of such a corpus. This brings challenges to many research fields concerning the World Wide Web. On the other hand, the hyperlinked Web pages in the networking environment can be a very rich information source for daily or business use, provided people have effective means of understanding the Web. Linkage analysis is playing an increasingly significant role in many fields on the World Wide Web. Recent advances in the research and application of linkage analysis on the World Wide Web are presented in this paper. In particular, results and achievements concerning linkage analysis and its applications to Web searching, Web community discovery and Web modeling are surveyed.
2003, 14(10):1781-1786.
Abstract:The relevant Linux source code is analyzed with respect to the performance of overloaded servers. Then, the kernel's packet-receiving phase is analyzed using queueing theory, and several conclusions on the performance of overloaded servers are drawn. Based on these analyses, some methods to improve the performance of overloaded servers are presented and implemented on Linux. The test results prove that these methods can effectively avoid livelock and greatly improve the performance of overloaded servers.
XU Zhi-Yuan, TANG Ze-Sheng, TANG Long
2003, 14(10):1787-1795.
Abstract:This paper proposes a new rejection test for accelerating ray/triangle mesh intersection. In this approach, a ray is defined as the intersection of two nonparallel planes. For a given ray and a complex scene containing dense triangle meshes, the approach can cull most nonintersecting triangles by a simple rejection test that involves only triangle/plane intersection tests. With this approach, exploiting image-space coherence for primary rays in ray tracing is straightforward. In order to exploit object-space coherence, the approach can also be combined with popular spatial partition schemes, e.g., bounding box hierarchies and octrees. Furthermore, this approach can be easily extended to more general polygonal meshes.
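A small Python sketch of the two-plane rejection idea described above (the plane representation and function names are assumptions): since the ray's supporting line lies in both defining planes, a triangle whose three vertices all lie strictly on one side of either plane cannot intersect the ray.

```python
import numpy as np

def on_one_side(plane, triangle, eps=1e-9):
    """True if all three vertices lie strictly on the same side of the plane.
    A plane (n, d) denotes the point set {x : n . x + d = 0}."""
    n, d = plane
    signs = [np.dot(n, v) + d for v in triangle]
    return all(s > eps for s in signs) or all(s < -eps for s in signs)

def reject_triangle(triangle, plane1, plane2):
    """Rejection test: the ray is the intersection of plane1 and plane2, so a
    triangle lying entirely on one side of either plane cannot be hit."""
    return on_one_side(plane1, triangle) or on_one_side(plane2, triangle)

# Example: the x-axis as the intersection of the planes y = 0 and z = 0
plane_y = (np.array([0.0, 1.0, 0.0]), 0.0)
plane_z = (np.array([0.0, 0.0, 1.0]), 0.0)
tri = [np.array([1.0, 2.0, 0.0]), np.array([2.0, 3.0, 0.0]), np.array([3.0, 2.5, 0.0])]
print(reject_triangle(tri, plane_y, plane_z))  # True: the triangle lies above y = 0
```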
2003, 14(10):1796-1805.
Abstract:A method is proposed to recover the reflectance properties of all objects in an environment. The input of the algorithm is a 3D geometric model of the scene, a panorama of it, and the light source information. The result is a full model of the reflectance properties of the scene. Recovery is done in a progressive refinement manner. At the very beginning, the algorithm assumes all surfaces are diffuse and generates a new panorama. Then the newly rendered panorama and the original one are compared iteratively; if the differences for some objects are greater than a threshold, more complex reflectance models are chosen for them. Finally, each object in the scene has a proper reflectance model, the scene can be rendered under novel lighting or viewing conditions, old objects can be removed from the scene, and new objects can be augmented into it. Special effort has been devoted to recovering the textures of textured diffuse, isotropic, and anisotropic surfaces; shadows and highlights are almost completely eliminated.
2003, 14(10):1806-1812.
Abstract:A new method is presented to scale a surface while keeping the shape of specified features (trimming curves) unchanged. The key to this method is a new objective function that minimizes the difference between the two surfaces before and after constrained scaling. The new objective function is defined as the integral of the squared norm of the cross product of the two normal vectors at corresponding points of the two surfaces. Minimizing this objective function guarantees that the difference between the two normal vectors at every pair of corresponding points is as small as possible, which keeps the shapes and curvature distributions of the two surfaces as close as possible. Compared with the Fix-and-Stretch method, the new method gives better results for several car parts with trimming curves. The highlight line models of the resulting surfaces produced by the two methods are also included.
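Written out symbolically (the notation below is an assumption, not taken from the paper), the objective function described above can be stated as

```latex
E(\mathbf{S}, \mathbf{S}') \;=\; \iint_{\Omega}
  \bigl\| \, \mathbf{n}(u,v) \times \mathbf{n}'(u,v) \, \bigr\|^{2}
  \, \mathrm{d}u \, \mathrm{d}v ,
```

where \(\mathbf{n}\) and \(\mathbf{n}'\) are the normal vectors of the original and scaled surfaces at corresponding parameter values \((u,v)\) over a common parameter domain \(\Omega\). The cross product vanishes exactly when the two normals are parallel, so minimizing E keeps corresponding normals, and hence the local shape and curvature distribution, as close as possible.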
ZHANG Ren-Jiang, WANG Guo-Jin
2003, 14(10):1813-1818.
Abstract:By using some new ideas of form conversion and expression simplification for rational Bézier curves, together with Cauchy's inequality, new and tighter estimates are derived for the heights of degree n (n=2,3,4) rational Bézier curves, which are in common use in geometric shape design. Thus the former termination criteria for the subdivision of rational Bézier curves are improved. This work is valuable for reducing computing time and increasing efficiency.