• Volume 11, Issue 7, 2000 Table of Contents
    • Supporting Weak Synchronization over World Wide Web

      2000, 11(7):853-862.

      Abstract:The World Wide Web has become a new and powerful medium. However, applications on the Web do not yet fully support multimedia, mainly because synchronization in the Web environment can be extremely complex: the network infrastructure on which the Web is based provides no guarantee of real-time data transfer, especially for large media such as audio and video. In this article, weak synchronization is introduced as an efficient approach to specifying temporal relationships among objects in a multimedia document so as to obtain the best presentation performance. The approach provides a new modeling mechanism that can specify both precise and imprecise temporal relationships among the objects involved. Specifications for inter-media synchronization, such as a strict synchronization method and a loose synchronization method, are also included in the approach.
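
      As a purely hypothetical illustration of the idea sketched in the abstract above, the following shows how a temporal relation between two media objects might be represented with a tolerance, where zero tolerance corresponds to strict synchronization and a positive tolerance to loose (weak) synchronization. The class and field names are assumptions for illustration, not the paper's notation.

```python
from dataclasses import dataclass

# Hypothetical illustration only: the paper's actual specification language is not
# reproduced here. A temporal relation between two media objects carries an optional
# tolerance, so that zero tolerance models strict (precise) synchronization and a
# positive tolerance models loose (imprecise, "weak") synchronization.

@dataclass
class SyncRelation:
    source: str              # name of the reference object, e.g. "audio1"
    target: str              # name of the dependent object, e.g. "video1"
    offset: float            # intended start-time offset of target relative to source (seconds)
    tolerance: float = 0.0   # allowed deviation; 0.0 = strict, > 0 = loose

    def is_satisfied(self, source_start: float, target_start: float) -> bool:
        """Check whether an observed presentation schedule respects this relation."""
        return abs((target_start - source_start) - self.offset) <= self.tolerance

# Example: the video may start up to 0.5 s away from its nominal offset of 2 s after the audio.
loose = SyncRelation("audio1", "video1", offset=2.0, tolerance=0.5)
print(loose.is_satisfied(source_start=0.0, target_start=2.3))   # True
```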

    • A New Methodology for User-Driven Domain-Specific Application Software Development

      2000, 11(7):863-870.

      Abstract:This paper presents a new methodology for application software development, named "user engineering". It is a user-driven, domain-specific methodology based on a component-based software architecture; it strengthens the driving effect of users so that software development becomes a detailed definition process rather than merely a coding process. It indicates an effective way to meet the growing requirements for application software.

    • Grid Automata and Grid Grammars for Picture Languages

      2000, 11(7):871-880.

      Abstract:A two-dimensional grid grammar was designed to fill the missing link in the Equivalence Theorem for recognizable picture languages (REC), summarized in a survey paper by Giammarresi and Restivo. Instead of two-dimensional on-line tessellation automata, grid automata were introduced, which are closer to traditional binary tree automata, to bridge the grid grammar and the other approaches for describing the class REC. Meanwhile, the standard (existential) monadic second-order logic was replaced by a weaker logical framework: positive monadic partition logic. A new and complete version of the Equivalence Theorem for REC is presented.

    • Complexity Results for Restricted Credulous Default Reasoning

      2000, 11(7):881-888.

      Abstract:This paper concentrates on the complexity of the decision problem of whether a literal belongs to at least one extension of a default theory 〈D,W〉, where D is a set of Horn defaults and W is a definite Horn formula or a Bi-Horn formula.

    • An Efficient Parallel Minimum Spanning Tree Algorithm on Message Passing Parallel Machine

      2000, 11(7):889-898.

      Abstract:An efficient parallel minimum spanning tree algorithm based on the classical Borůvka algorithm is proposed for message-passing parallel machines. Three methods are used to improve its efficiency: two-phase union and packaged contraction, which reduce communication costs, and balanced data distribution, which balances the computation across processors. The computation and communication costs of the algorithm are O(n²/p) and O((t_s·p + t_w·n)·n/p), respectively. On the Dawning-1000 parallel machine, the algorithm achieves a speedup of 12 on 16 processors for a sparse graph with 10 000 vertices.
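
      As a point of reference for the abstract above, the following is a minimal sequential sketch of the classical Borůvka step on which the parallel algorithm is built; the paper's two-phase union, packaged contraction, and balanced data distribution across processors are not reproduced here.

```python
# A minimal sequential sketch of the classical Boruvka step; the parallel
# optimizations described in the abstract are not shown.

def boruvka_mst(n, edges):
    """n: number of vertices; edges: list of (weight, u, v). Returns the MST weight."""
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst_weight, components = 0, n
    while components > 1:
        cheapest = [None] * n         # cheapest outgoing edge per component root
        for w, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue
            for r in (ru, rv):
                if cheapest[r] is None or w < cheapest[r][0]:
                    cheapest[r] = (w, u, v)
        progressed = False
        for e in cheapest:
            if e is None:
                continue
            w, u, v = e
            ru, rv = find(u), find(v)
            if ru != rv:              # contract: merge the two components
                parent[ru] = rv
                mst_weight += w
                components -= 1
                progressed = True
        if not progressed:            # graph is disconnected
            break
    return mst_weight

# Example: a 4-vertex graph whose MST has weight 1 + 2 + 3 = 6
print(boruvka_mst(4, [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3)]))   # 6
```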

    • Review Articles
    • Overview of Workflow Technology

      2000, 11(7):899-907.

      Abstract:Workflow technology has been a new hotspot in the area of computer applications since the 1990s. It is important to study the technology in depth, since it can significantly improve the level of informatization and the operational efficiency of enterprises and hence make them more competitive. In this paper, the evolution of workflow technology is introduced. The current state of research in the workflow area is then surveyed, including workflow definition, workflow models, workflow implementation technology, and workflow transaction management. The limitations of current workflow technology and their causes are also pointed out. Finally, conclusions and future trends in workflow research are given.

    • Multidimensional Data Modeling for Data Warehouses

      2000, 11(7):908-917.

      Abstract:The data model is a basic issue in data warehouse research. It has been argued that traditional data models, such as the ER model and the relational model, are in principle not powerful enough to model the data structure and semantics of a data warehouse and to support OLAP (on-line analytical processing). As a result, several multidimensional models based on a multidimensional view of data have emerged. However, these multidimensional data models still fall short in their ability to model complex data in some real-world application domains and to support complete OLAP operations. In this paper, the authors propose a new multidimensional data model based on the concepts of partial order and mapping. The model addresses support for the complex data structures and semantics of data warehouses, especially complex hierarchies in dimensions. Along with the model, an associated algebra is presented that includes a complete set of OLAP operations and supports complex aggregation as well as roll-up and drill-down along dimension hierarchies. A new concept, the aggregation function constraint, is also presented, and a mechanism for expressing and checking such constraints is included in the model.
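
      For readers unfamiliar with the operations named above, the following is a minimal, hypothetical sketch of a roll-up along a dimension hierarchy, realized as a mapping from a finer level (city) to a coarser one (country); it does not reproduce the paper's partial-order-based model or its algebra.

```python
from collections import defaultdict

# Hypothetical illustration of roll-up along one dimension hierarchy; the fact
# table, hierarchy and aggregation function are illustrative assumptions.

# Fact rows: (city, product, sales)
facts = [
    ("Beijing", "pen", 10), ("Shanghai", "pen", 20),
    ("Beijing", "ink", 5),  ("Munich", "pen", 7),
]

# Dimension hierarchy: a mapping from the city level to the country level
city_to_country = {"Beijing": "China", "Shanghai": "China", "Munich": "Germany"}

def roll_up(rows, mapping, agg=sum):
    """Aggregate the measure after mapping the first dimension to a coarser level."""
    groups = defaultdict(list)
    for city, product, sales in rows:
        groups[(mapping[city], product)].append(sales)
    return {key: agg(values) for key, values in groups.items()}

print(roll_up(facts, city_to_country))
# {('China', 'pen'): 30, ('China', 'ink'): 5, ('Germany', 'pen'): 7}
```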

    • Design and Implementation of a Mobile Agent Structured Migration Mechanism

      2000, 11(7):918-923.

      Abstract:Mobile agent computing is expected to become a major paradigm of future network computing, and mobility is the essence of mobile agent technology. After surveying some representative migration models used in mobile agent systems, the authors propose a new structured migration mechanism. In this mechanism, the itinerary and the function body of an agent are separated completely, and the itinerary scheme itself has a strictly defined structure. In addition, three migration modes are provided. As a result, the complexity of mobile agents can be controlled and reusability is supported. This mechanism has been used in the mobile agent system Mogent 1.0.
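
      A rough, hypothetical sketch of the separation described above, with the itinerary kept apart from the agent's function body; the class and method names are illustrative assumptions and do not reflect the Mogent 1.0 API.

```python
# Hypothetical sketch: the itinerary (travel plan) is a separate, structured
# object, while the agent's function body contains no migration logic.

class Itinerary:
    """A structured travel plan: an ordered list of hosts, kept apart from the task code."""
    def __init__(self, hosts):
        self.hosts = list(hosts)
        self.position = 0

    def next_host(self):
        if self.position < len(self.hosts):
            host = self.hosts[self.position]
            self.position += 1
            return host
        return None


class Agent:
    """The function body: what to do at each host."""
    def __init__(self, itinerary, task):
        self.itinerary = itinerary
        self.task = task          # callable executed at every host
        self.results = []

    def run(self):
        while (host := self.itinerary.next_host()) is not None:
            # In a real system this step would serialize the agent and ship it to `host`.
            self.results.append(self.task(host))
        return self.results


agent = Agent(Itinerary(["hostA", "hostB", "hostC"]),
              task=lambda host: f"collected data at {host}")
print(agent.run())
```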

    • An Approach Based on Deformation Mapping for Constraint Graph Direction and Solution

      2000, 11(7):924-930.

      Abstract:The concepts of deformation mapping and deformation origin are described. A new approach for directing and solving constraint graphs is presented, which is based on deformation mapping and oriented to two-dimensional engineering drawings. A method for directing the constraint graph, based on deformation mapping, is also proposed. Finally, a new approach for solving the deformation constraint graph is given, based on compound nodes and virtual constraints. With this method, the reliability of directing the constraint graph is increased, the computational complexity is reduced, and cyclic constraint graphs can be solved conveniently.

    • A Role-Based Access Control Model for CSCW Systems

      2000, 11(7):931-937.

      Abstract:CSCW systems introduce new requirements for access control that cannot be met by existing models. In this paper, a new role-based access control model, RBCSAC (role-based collaborative systems access control), is introduced to meet these requirements. The model formally describes the relationships among the key elements of access control, such as data, operation, privilege, role, and user, and provides a method for recording access control information. It grants and revokes the access privileges of cooperative users by assigning roles to them and canceling those roles. Two role-assignment rules and two operation-legality checking rules are also provided. RBCSAC is designed to accommodate the characteristics of collaborative systems, such as multiple users, interaction, collaboration, real-time response, and dynamism, and can adequately meet the access control requirements of CSCW systems.
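
      As a rough illustration of the user-role-privilege indirection that role-based models rely on, here is a minimal sketch with illustrative sets and rule names; it does not reproduce the RBCSAC formalism.

```python
# Minimal sketch of user -> role -> privilege checking; the roles, privileges
# and rule names below are illustrative assumptions.

role_privileges = {
    "editor":   {("document", "read"), ("document", "write")},
    "reviewer": {("document", "read"), ("document", "comment")},
}
user_roles = {"alice": {"editor"}, "bob": {"reviewer"}}

def assign_role(user, role):
    user_roles.setdefault(user, set()).add(role)

def revoke_role(user, role):
    user_roles.get(user, set()).discard(role)

def can(user, obj, operation):
    """Operation-legality check: the user holds some role granting the privilege."""
    return any((obj, operation) in role_privileges.get(r, set())
               for r in user_roles.get(user, set()))

print(can("alice", "document", "write"))   # True
print(can("bob", "document", "write"))     # False
revoke_role("alice", "editor")
print(can("alice", "document", "write"))   # False
```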

    • Differentiated Services for the Internet Based on Dynamic Buffer Thresholds

      2000, 11(7):938-944.

      Abstract:A new integrated scheme of differentiated services for the Internet is proposed in this paper. The scheme provides applications with a service that combines both delay and drop priority, giving clear service differentiation. It is based on the dynamic buffer threshold (DBT) scheme, in which the threshold is varied dynamically at run time according to observed packet loss behavior. In the static buffer threshold (SBT) scheme, by contrast, the variation of packet loss probabilities can be very large and there is no simple mathematical expression relating the threshold to the desired packet loss rate; these problems are overcome by the DBT scheme. A performance model of the scheme is given using stochastic Petri nets and analyzed by simulation.
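
      The following is a hypothetical sketch of a DBT-style adaptation loop in which a buffer threshold for low-priority packets is nudged at run time so that the measured high-priority loss rate tracks a target. The control rule and parameters are assumptions for illustration; they are not the paper's scheme or its stochastic Petri net model.

```python
# Hypothetical dynamic-buffer-threshold style adaptation; rule and constants are
# illustrative assumptions, not the paper's algorithm.

def adapt_threshold(threshold, measured_loss, target_loss,
                    step=1, min_thr=1, max_thr=100):
    """Lower the low-priority admission threshold when high-priority loss exceeds the
    target (reserving more buffer); raise it when loss is comfortably below target."""
    if measured_loss > target_loss:
        threshold = max(min_thr, threshold - step)
    elif measured_loss < 0.5 * target_loss:
        threshold = min(max_thr, threshold + step)
    return threshold

def admit(queue_length, priority, threshold, buffer_size=100):
    """Drop-priority test: low-priority packets may only occupy the buffer below the
    threshold; high-priority packets may use the whole buffer."""
    limit = buffer_size if priority == "high" else threshold
    return queue_length < limit

thr = 60
for loss in [0.02, 0.02, 0.001, 0.0]:          # simulated loss measurements
    thr = adapt_threshold(thr, loss, target_loss=0.01)
print(thr)   # 60: two decreases followed by two increases
```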

    • Research on the Mechanism of Genetic Algorithms

      2000, 11(7):945-952.

      Abstract:It is well known that the schema theorem and implicit parallelism are two basic theoretical foundations of genetic algorithms (GAs). In this paper, the authors analyze these two principles and show that they are not rigorous and have some shortcomings; that is, as foundations of GAs, the theorems are not perfect. To deepen the understanding of GAs, a new ideal density model of the GA is presented. Based on this model, the GA is shown to be a guided stochastic search whose search direction is steered toward the chromosome families whose ancestors belong to schemata with high fitness. Simulation results on typical function optimization problems show that the new GA converges faster and obtains more precise results, indicating that the new GA model has potential practical use.
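
      For context, the following is a minimal sketch of the classical GA loop (selection, crossover, mutation) applied to a toy function optimization problem; the paper's ideal density model itself is not reproduced here, and all parameters are illustrative.

```python
import random

# Minimal genetic algorithm for maximizing a toy function; selection, crossover
# and mutation choices are illustrative assumptions.

def fitness(bits):
    x = int("".join(map(str, bits)), 2)        # decode a 10-bit chromosome to [0, 1023]
    return x * (1023 - x)                      # maximized near x = 511/512

def genetic_algorithm(pop_size=30, length=10, generations=50,
                      crossover_rate=0.8, mutation_rate=0.01):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                  # elitism: keep the two best
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(scored[:pop_size // 2], 2)   # truncation selection
            if random.random() < crossover_rate:
                cut = random.randint(1, length - 1)
                child = p1[:cut] + p2[cut:]    # one-point crossover
            else:
                child = p1[:]
            child = [b ^ 1 if random.random() < mutation_rate else b for b in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = genetic_algorithm()
print(best, fitness(best))
```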

    • Using Fuzzy Competitive Hopfield Neural Network for Image Segmentation

      2000, 11(7):953-956.

      Abstract:In this paper, to address the defects of the self-organizing learning method, a fuzzy competitive learning method is proposed, and a fuzzy competitive Hopfield neural network for color image segmentation is designed based on the competitive Hopfield neural network. By mapping the image space into a gray-feature space, fuzzy clustering can be performed on the gray feature set and the color image can then be segmented. Experimental results indicate that the algorithm achieves better segmentation and better adaptability to noise than the Otsu method for binary segmentation, and higher processing speed than FCM (fuzzy C-means) algorithms for multi-class segmentation.
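
      As background for the fuzzy clustering step mentioned above, here is a minimal sketch of standard fuzzy C-means on one-dimensional gray values; it is not the paper's fuzzy competitive Hopfield network, and the data and parameters are illustrative.

```python
# Standard fuzzy C-means (FCM) on 1-D gray values, for background only; the
# paper's network formulation is not reproduced.

def fcm(values, c=2, m=2.0, iters=50, eps=1e-9):
    lo, hi = min(values), max(values)
    centers = [lo + (hi - lo) * i / (c - 1) for i in range(c)]   # evenly spaced start
    for _ in range(iters):
        # membership update: u[k][i] depends on relative distances to all centers
        u = []
        for x in values:
            d = [abs(x - ck) + eps for ck in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(c))
                      for i in range(c)])
        # center update: fuzzy weighted mean of the data
        centers = [sum((u[k][i] ** m) * values[k] for k in range(len(values))) /
                   sum(u[k][i] ** m for k in range(len(values)))
                   for i in range(c)]
    return centers, u

gray = [12, 15, 14, 200, 205, 198, 13, 202]          # two obvious intensity clusters
centers, memberships = fcm([float(g) for g in gray])
print(sorted(round(ck, 1) for ck in centers))         # roughly [13.5, 201.2]
```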

    • Automatic Extraction of Main Roads from Aerial Remote Sensing Images

      2000, 11(7):957-964.

      Abstract:The classification and measurement of the geometrical objects of a city would be simplified if its main road network could be automatically extracted from aerial remote sensing images. In this paper, an automatic approach for extracting main city roads based on straight lines is developed, in which main city roads are modeled according to their characteristics in the image. The method puts emphasis on dealing with the various kinds of noise arising from the imaging process and uses a Gaussian distribution to blur the parameters of straight lines in order to ensure the robustness of the extraction. All parameters of the method are obtained by theoretical analysis, so the extraction process is automatic. Experimental results show that the main road network can be extracted from real aerial remote sensing images.

    • An Intention Model for Agent

      2000, 11(7):965-970.

      Abstract:Intentions, an integral part of the mental state of an agent, play an important role in determining the behavior of rational agents. Several models of intention are based on normal modal logic, but these theories suffer seriously from the logical omniscience problem. In this paper, the authors argue that intention is not a normal modal operator and present another intention model that avoids the logical omniscience problem and related problems such as the side-effect problem. Compared with Konolige and Pollack's model of intention, this model is not only simpler and more natural but also satisfies the K-axiom and joint consistency. It in fact gives a new method for the semantic representation of non-normal modal operators based on normal possible worlds.

    • Domain Dependent Language Model Based on Fuzzy Training Subset

      2000, 11(7):971-978.

      Abstract:Statistical language models are very important to speech recognition. For a system on a specific topic, a domain-dependent language model performs much better than a general model. There are two problems with the traditional method: (1) a topic-specific corpus is not as large as a general corpus; (2) an article is usually related to more than one topic, but this is not taken into account during model training. In this paper, the authors address these two problems. They present a new method of organizing the corpus based on fuzzy training subsets, and the domain-dependent models are trained on these fuzzy subsets. In addition, self-organized learning is introduced into the training process to improve the models' prediction ability. The method evidently improves the performance of the models.
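
      A hypothetical sketch of the fuzzy-training-subset idea: each article contributes to several topic-specific models with a membership weight, and the topic models are interpolated with fuzzy weights at prediction time. The corpus, memberships, and smoothing below are illustrative assumptions, not the paper's training procedure.

```python
from collections import defaultdict

# Illustrative fuzzy-subset training of unigram models; data and weights are
# assumptions, and the paper's self-organized learning step is not shown.

# (article text, fuzzy memberships over topics)
corpus = [
    ("the stock market rose today",            {"finance": 0.9, "sports": 0.1}),
    ("the team won the market share battle",   {"finance": 0.5, "sports": 0.5}),
    ("the team won the final game",            {"finance": 0.1, "sports": 0.9}),
]

counts = {"finance": defaultdict(float), "sports": defaultdict(float)}
totals = {"finance": 0.0, "sports": 0.0}
for text, membership in corpus:
    for word in text.split():
        for topic, mu in membership.items():
            counts[topic][word] += mu          # fractional, membership-weighted count
            totals[topic] += mu

def prob(word, topic, vocab_size=1000):
    """Add-one-smoothed unigram probability under one topic model."""
    return (counts[topic][word] + 1) / (totals[topic] + vocab_size)

def mixed_prob(word, topic_weights):
    """Interpolate topic models with fuzzy weights for a multi-topic document."""
    return sum(w * prob(word, t) for t, w in topic_weights.items())

print(prob("market", "finance") > prob("market", "sports"))      # True
print(mixed_prob("game", {"finance": 0.3, "sports": 0.7}))
```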

    • A Correction and Extension to the Testing Theory for Model-Based Diagnosis

      2000, 11(7):979-983.

      Abstract:Although McIlraith and Reiter characterized tests for model-based diagnosis, the limiting conditions of their work are too strict. In this paper, the limiting conditions of the relevant test theorem are relaxed, and the revised relevant test theorem and its proof are presented. As an extension of McIlraith's theory, the initial test condition of the differential diagnosis principles is generalized, which improves the applicability of differential diagnosis tests.

    • Cubeheap and Branch-and-Bound Algorithms

      2000, 11(7):984-989.

      Abstract:The Branch-and-Bound (B&B) algorithm is one of the fundamental methods for combinatorial optimization problems. Its running time is dominated by the data structure used to implement the B&B algorithm for the given problem and by the related branching strategy. In this paper, the data structure called the Cubeheap and its related algorithms (INSERT and DELETE) are discussed. A lower bound of Ω(m + h·log h) on the running time of the general B&B algorithm is derived by constructing a mapping between the B&B procedure and a sorting procedure, where m is the number of evaluated nodes and h is the number of expanded nodes. According to this lower bound, the Cubeheap is a near-optimal data structure for implementing the general B&B algorithm. Experimental results for a concrete combinatorial optimization problem, job assignment, are obtained by running the B&B algorithm with the Cubeheap. A method to improve the balance of the Cubeheap is also proposed.
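
      As a concrete reference point, the following is a sketch of best-first Branch-and-Bound for the job-assignment problem, using Python's standard heapq as a stand-in priority queue; the Cubeheap structure and its INSERT/DELETE algorithms are not reproduced here, and the lower-bound function is a simple illustrative choice.

```python
import heapq
from itertools import count

# Best-first B&B for job assignment with heapq standing in for the priority
# structure; the bound used here is an illustrative assumption.

def assignment_bb(cost):
    """cost[i][j] = cost of giving job j to worker i; returns the minimum total cost."""
    n = len(cost)
    tie = count()                             # tie-breaker keeps heap tuples comparable
    # node = (lower_bound, tie, next_worker, cost_so_far, used_jobs)
    heap = [(0, next(tie), 0, 0, frozenset())]
    best = float("inf")
    while heap:
        bound, _, i, spent, used = heapq.heappop(heap)     # DELETE: most promising node
        if bound >= best:
            continue                                        # pruned by the incumbent
        if i == n:
            best = min(best, spent)
            continue
        for j in range(n):
            if j in used:
                continue
            new_spent = spent + cost[i][j]
            # bound: cost so far + cheapest still-available job for each later worker
            lb = new_spent + sum(min(cost[k][j2] for j2 in range(n)
                                     if j2 not in used and j2 != j)
                                 for k in range(i + 1, n))
            if lb < best:
                heapq.heappush(heap, (lb, next(tie), i + 1, new_spent, used | {j}))  # INSERT
    return best

print(assignment_bb([[9, 2, 7], [6, 4, 3], [5, 8, 1]]))
# 9 (worker 0 -> job 1, worker 1 -> job 0, worker 2 -> job 2)
```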

    • Adaptive Multi-Scale Edge Detection

      2000, 11(7):990-994.

      Abstract:An adaptive multi-scale edge detection algorithm and its fast implementation are presented in this paper. The algorithm detects edges by adaptively finding the optimal filter scale for each pixel with reduced computation. A method is presented to adaptively determine the range of edge scales in an image, and the edge scale is treated as a parameter describing the edge characteristics. The method has been applied to detecting edges in real images with satisfactory results.
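
      The following is an illustrative sketch of one common multi-scale idea: compute the gradient magnitude at several Gaussian smoothing scales and keep, for each pixel, the scale with the strongest response. It is not the paper's algorithm or its fast implementation; the scales, normalization, and threshold are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

# Illustrative per-pixel scale selection over Gaussian-smoothed gradients; the
# scales and threshold below are arbitrary assumptions.

def multiscale_edges(image, scales=(1.0, 2.0, 4.0), threshold=0.2):
    image = image.astype(float)
    responses = []
    for sigma in scales:
        smoothed = gaussian_filter(image, sigma)
        gx, gy = sobel(smoothed, axis=1), sobel(smoothed, axis=0)
        responses.append(np.hypot(gx, gy) * sigma)     # simple scale normalization
    stack = np.stack(responses)                        # shape: (num_scales, H, W)
    best_scale = stack.argmax(axis=0)                  # per-pixel "best" scale index
    best_response = stack.max(axis=0)
    edges = best_response > threshold * best_response.max()
    return edges, best_scale

# Tiny synthetic example: a bright square on a dark background
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
edges, scale_map = multiscale_edges(img)
print(edges.sum(), scale_map.max())                    # edge pixel count, scale indices used
```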
