• Volume 18, Issue 9, 2007 Table of Contents
    • Quantum Clonal Algorithm for Multicast Routing Problem

      2007, 18(9):2063-2069.

      Abstract: The bounded shortest multicast algorithm (BSMA) is widely regarded as the best constrained multicast algorithm, but its large computation time restricts its application. As a global optimization method, the genetic algorithm (GA) is increasingly applied to multicast routing. A GA has a more powerful search ability than traditional algorithms, but its tendency toward premature convergence makes it difficult to obtain a good multicast tree. This paper presents a quantum clonal algorithm (QCA) for the multicast routing problem, which effectively overcomes the premature-convergence problem of GA-based multicast algorithms; furthermore, the algorithm is accelerated by a quantum crossover operator. The algorithm is simple to implement and flexible to control. The simulation results show that QCA outperforms both BSMA and the traditional GA.
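
      As an illustration of the loop the abstract describes, here is a minimal quantum-inspired clonal sketch in Python. The OneMax objective, the fixed rotation step, and the amplitude-swap "quantum crossover" are placeholder assumptions, not the paper's operators; a real QCA for multicast routing would decode each bitstring into a multicast tree and score it against delay and cost constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BITS = 32           # length of the binary encoding (toy stand-in for a tree encoding)
POP, CLONES = 10, 5   # quantum individuals, clones observed per individual
DELTA = 0.05 * np.pi  # rotation step toward the best observed solution

def fitness(bits):
    # Placeholder objective (OneMax); a real QCA would decode `bits`
    # into a multicast tree and score its delay/cost.
    return int(bits.sum())

def observe(theta):
    # Collapse each qubit: P(bit = 1) = sin^2(theta)
    return (rng.random(theta.shape) < np.sin(theta) ** 2).astype(int)

theta = np.full((POP, N_BITS), np.pi / 4)   # equal superposition
best_bits, best_fit = None, -1

for gen in range(100):
    for i in range(POP):
        clones = [observe(theta[i]) for _ in range(CLONES)]   # clonal expansion
        fits = [fitness(c) for c in clones]
        k = int(np.argmax(fits))
        if fits[k] > best_fit:
            best_fit, best_bits = fits[k], clones[k]
        # Quantum rotation: nudge amplitudes toward the best-known solution.
        theta[i] += DELTA * (2 * best_bits - 1) * np.sign(np.pi / 2 - theta[i])
        theta[i] = np.clip(theta[i], 0.01, np.pi / 2 - 0.01)
    # "Quantum crossover", assumed here as an amplitude-vector interchange.
    i, j = rng.choice(POP, 2, replace=False)
    cut = int(rng.integers(1, N_BITS))
    theta[[i, j], cut:] = theta[[j, i], cut:]

print("best fitness:", best_fit)
```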

    • Parameterized Algorithm of the Individual Haplotyping MSR Problem with Mate-Pairs

      2007, 18(9):2070-2082.

      Abstract: The individual haplotyping MSR (minimum SNP removal) problem is the computational problem of inferring an individual's haplotypes from one's DNA fragment sequencing data by removing a minimum number of SNPs (single-nucleotide polymorphisms). For this problem, Bafna et al. provided an algorithm of time complexity O(2^k n^2 m), where m is the number of fragments, n the number of SNP sites, and k the maximum number of holes in a fragment. When Mate-Pairs are present, the number of holes in a Mate-Pair can reach 100, so Bafna's algorithm becomes impracticable. Based on the characteristics of DNA fragments, this paper presents a new algorithm of time complexity O((n-1)(k1-1)k2*2^(2h) + (k1+1)*2^h + nk2 + mk1), where k1 is the maximum number of SNP sites a fragment covers (no more than n), k2 the maximum number of fragments covering a SNP site (usually no more than 19), and h the maximum number of fragments covering a SNP site whose value is unknown at that site (no more than k2). Since the time complexity does not depend directly on k, the algorithm can deal with the MSR problem with Mate-Pairs efficiently, and is more scalable and applicable in practice.
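
      To see why the new bound matters in the Mate-Pair case, the quick computation below plugs illustrative (assumed) parameter values into the two quoted complexity expressions.

```python
# A numeric comparison of the two time bounds quoted in the abstract, using
# illustrative parameter values (the values below are assumptions, chosen to
# mimic a Mate-Pair data set where a fragment can contain many holes).
m, n = 1000, 100        # fragments, SNP sites
k = 100                 # max holes in a fragment (huge for Mate-Pairs)
k1, k2, h = 50, 19, 10  # fragment span, site coverage, unknown-value coverage

bafna = 2**k * n**2 * m                                            # O(2^k n^2 m)
new = (n-1)*(k1-1)*k2*2**(2*h) + (k1+1)*2**h + n*k2 + m*k1

print(f"Bafna et al.: {bafna:.3e} operations")   # astronomically large
print(f"new bound:    {new:.3e} operations")     # easily feasible
```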

    • A Combinational Heuristic Algorithm for the Three-Dimensional Packing Problem

      2007, 18(9):2083-2089.

      Abstract: By combining personification heuristics with simulated annealing, a combinational heuristic algorithm for the three-dimensional packing problem is presented. The personification heuristic is inspired by the wall-building strategy of daily life. A point-finding procedure and rules based on horizontal and vertical reference lines are developed to control the packing process. Simulated annealing is further used to improve the personification heuristic. Computational results on benchmark instances show that this algorithm can compete with excellent heuristics from the literature.
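
      A minimal sketch of the outer simulated-annealing loop, assuming a stand-in packing heuristic: a real implementation would replace pack_utilization with the wall-building personification heuristic and its reference-line rules.

```python
import math, random

random.seed(1)

def pack_utilization(order, boxes, container):
    # Placeholder for the personification (wall-building) heuristic: a trivial
    # 1D shelf fill along the container length, purely illustrative.
    used, x = 0.0, 0.0
    L, W, H = container
    for i in order:
        l, w, h = boxes[i]
        if x + l <= L and w <= W and h <= H:
            x += l
            used += l * w * h
    return used / (L * W * H)

def simulated_annealing(boxes, container, t0=1.0, t_end=1e-3, alpha=0.95):
    order = list(range(len(boxes)))
    best = cur = pack_utilization(order, boxes, container)
    t = t0
    while t > t_end:
        i, j = random.sample(range(len(boxes)), 2)
        order[i], order[j] = order[j], order[i]          # neighbor: swap two boxes
        cand = pack_utilization(order, boxes, container)
        if cand >= cur or random.random() < math.exp((cand - cur) / t):
            cur = cand
            best = max(best, cur)
        else:
            order[i], order[j] = order[j], order[i]      # reject: undo the swap
        t *= alpha                                       # cool down
    return best

boxes = [(random.randint(1, 5),) * 3 for _ in range(30)]   # random cubes
print("best utilization:", simulated_annealing(boxes, (20, 5, 5)))
```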

    • Haplotyping and Haplotype Frequency Estimates on Trio Genotype Data

      2007, 18(9):2090-2099.

      Abstract: The problems of haplotyping and haplotype frequency estimation on trio genotype data, under the Mendelian law of inheritance and the assumption of Hardy-Weinberg equilibrium, are studied in this paper. Most past efforts have focused on haplotyping genotype data of unrelated individuals or data with general pedigrees, giving insufficient attention to the special case of trio genotype data; yet there is an increasing demand for analyzing trios in particular, especially considering that part of the HapMap database consists of exactly such data. This paper presents a two-stage method to estimate haplotype frequencies in trios: i) the haplotyping stage finds haplotype configurations without recombination for each trio; ii) the frequency estimation stage uses the expectation-maximization (EM) algorithm to estimate haplotype frequencies based on the inferred configurations. Both the haplotyping algorithm and the EM algorithm are implemented in the software package TRIOHAP in the C language, and its effectiveness and efficiency are tested on simulated and real data sets. The experimental results show that TRIOHAP runs much faster than a popular frequency estimation program that discards trio information; moreover, because TRIOHAP utilizes that information, its estimates are more reliable.
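
      A generic EM update for haplotype frequencies, sketched under the assumption that stage one has already produced the list of compatible haplotype pairs per sample; this is not TRIOHAP's code, and the toy data are invented.

```python
from collections import defaultdict

def em_haplotype_freqs(explanations, n_iter=50):
    """Generic EM for haplotype frequencies.

    `explanations[i]` lists the haplotype pairs (h1, h2) compatible with
    sample i (for trios, the pairs surviving the Mendelian/no-recombination
    check of stage one). Returns a dict haplotype -> frequency.
    """
    haps = {h for pairs in explanations for pair in pairs for h in pair}
    freq = {h: 1.0 / len(haps) for h in haps}           # uniform start
    for _ in range(n_iter):
        counts = defaultdict(float)
        for pairs in explanations:
            # E-step: posterior weight of each compatible pair
            w = [freq[a] * freq[b] * (1 if a == b else 2) for a, b in pairs]
            z = sum(w) or 1.0
            for (a, b), wi in zip(pairs, w):
                counts[a] += wi / z
                counts[b] += wi / z
        # M-step: renormalize the expected haplotype counts
        total = sum(counts.values())
        freq = {h: c / total for h, c in counts.items()}
    return freq

# Toy example: ambiguous individuals over haplotypes '00', '01', '10', '11'
data = [[('00', '11'), ('01', '10')], [('00', '11'), ('01', '10')], [('00', '01')]]
print(em_haplotype_freqs(data))
```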

    • A Probabilistic Study on the Multiple Solutions of the P3P Problem

      2007, 18(9):2100-2104.

      Abstract: Under special configurations, the P3P problem can have 1, 2, 3, or 4 solutions, and if the 3 control points and the optical center lie on a circle, the problem is indeterminate. In this paper, a Monte Carlo study with up to one million samples shows that the probabilities of the P3P problem having one, two, three, and four solutions are respectively 0.9993, 0.0007, 0.0000, and 0.0000. The result confirms the well-known fact that in most cases the P3P problem has a unique solution.
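
      A reduced-scale version of such a Monte Carlo experiment can be sketched as follows, counting solutions through Grunert's classical reduction of P3P to two conics in the distance ratios; the sample size, tolerances, and sampling ranges are assumptions, not the paper's setup.

```python
import numpy as np
import sympy as sp

rng = np.random.default_rng(0)
u, v = sp.symbols('u v')

def count_p3p_solutions(P, O):
    # Side lengths (a opposite vertex A, ...) and cosines of the viewing angles.
    A, B, C = P
    a, b, c = np.linalg.norm(B - C), np.linalg.norm(C - A), np.linalg.norm(A - B)
    ra, rb, rc = [(X - O) / np.linalg.norm(X - O) for X in (A, B, C)]
    ca, cb, cg = float(rb @ rc), float(ra @ rc), float(ra @ rb)
    # Grunert's system with s2 = u*s1, s3 = v*s1; eliminating s1 leaves two conics.
    e1 = sp.expand(b**2 * (u**2 + v**2 - 2*u*v*ca) - a**2 * (1 + v**2 - 2*v*cb))
    e2 = sp.expand(c**2 * (1 + v**2 - 2*v*cb) - b**2 * (1 + u**2 - 2*u*cg))
    quartic = sp.Poly(sp.resultant(e1, e2, u), v)   # degree <= 4 in v
    sols = set()
    for rv in quartic.nroots():
        if abs(sp.im(rv)) > 1e-6 or sp.re(rv) <= 0:
            continue
        v0 = float(sp.re(rv))
        for ru in sp.Poly(e1.subs(v, v0), u).nroots():
            if abs(sp.im(ru)) > 1e-6 or sp.re(ru) <= 0:
                continue
            u0 = float(sp.re(ru))
            if abs(e2.subs({u: u0, v: v0})) < 1e-4 * float(max(a, b, c)) ** 2:
                sols.add((round(u0, 5), round(v0, 5)))   # positive (u, v) = one pose
    return len(sols)

tally = {}
for _ in range(300):   # far fewer than 1e6 samples, for illustration only
    pts = rng.uniform(-1, 1, (3, 3))
    cam = rng.uniform(-1, 1, 3) + np.array([0.0, 0.0, 5.0])  # center kept away
    n = count_p3p_solutions(pts, cam)
    tally[n] = tally.get(n, 0) + 1
print(tally)   # dominated by one-solution cases, as the paper quantifies
```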

    • Efficient Partial Redundancy Fault Tolerance Compilation: Replicating Critical Subgraph of Error Flow

      2007, 18(9):2105-2116.

      Abstract: Traditional fault tolerance compilation replicates all computations and registers to guarantee fault tolerance, but this brings great overhead in both storage utilization and performance. Based on error-flow analysis, this paper proposes the new concept of the critical subgraph of an error flow graph. Methods are given to generate critical subgraphs from critical nodes or from critical paths, and a partial redundancy algorithm is proposed that replicates only the critical subgraph. The partial redundancy algorithm guarantees effective fault tolerance while greatly improving performance and reducing power dissipation and storage usage. Experimental results show that, compared with full redundancy, which replicates the full error flow graph, partial redundancy can reduce register usage by 6.25%, reduce power dissipation by over 17%, reduce total execution cycles by nearly 26%, and improve performance by over 22%, at the cost of 6.25% lower node coverage.

    • A Performance Model-Based Self-Configuration Framework for Middleware

      2007, 18(9):2117-2129.

      Abstract: Highly dynamic computing environments make QoS (quality of service) guarantees more important for component-based distributed systems: a software system should possess self-tuning capability to react to external environment variation. This paper proposes an adaptive self-configuration framework that automatically tunes configuration parameters to preserve QoS as the workload changes. The core of this framework is a performance model based on layered queuing networks, which guides the search for the best combination of configuration parameters satisfying the QoS requirement. The self-configuration framework is prototyped on the OnceAS application server and validated with the StockOnline benchmark by comparing performance-requirement satisfaction with and without the framework. The results show that, through the framework's regulation, the system meets its QoS goal well.
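
      A toy version of model-guided self-configuration: an assumed two-layer M/M/c-style approximation stands in for the paper's layered queuing network, and the small configuration space is searched exhaustively for the cheapest parameter combination that meets the QoS target.

```python
from itertools import product

def predicted_response_time(threads, db_conns, arrival_rate):
    # Crude two-layer queuing approximation (illustrative, not the paper's LQN):
    # the app layer calls the DB layer; each layer behaves like an M/M/c pool.
    def pool_time(c, lam, mu):
        if lam >= c * mu:
            return float('inf')                   # unstable configuration
        return 1 / (c * mu - lam) + 1 / mu        # rough waiting + service time
    return (pool_time(threads, arrival_rate, mu=50.0) +
            pool_time(db_conns, arrival_rate, mu=80.0))

def self_configure(arrival_rate, qos_target=0.1):
    # Exhaustive search over the configuration space, guided by the model.
    best = None
    for threads, db_conns in product(range(1, 33), range(1, 17)):
        r = predicted_response_time(threads, db_conns, arrival_rate)
        cost = threads + db_conns                 # prefer smaller pools
        if r <= qos_target and (best is None or cost < best[0]):
            best = (cost, threads, db_conns, r)
    return best

print(self_configure(arrival_rate=120.0))   # (cost, threads, db_conns, resp_time)
```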

    • Verification of C Programs Using Assume-Guarantee Reuse of Searching

      2007, 18(9):2130-2140.

      Abstract: This paper presents a novel method, namely assume-guarantee reuse of searching, to verify C programs with respect to temporal safety properties. The idea is to introduce a conservative assume condition for each program location, and to assume that every path starting from that location will never violate the property if the evaluation of its variables at that location satisfies the assume condition. All possible execution paths are traversed based on the assume conditions, and the temporal safety property is checked on the fly. If some assume condition is too weak, it is continually strengthened based on spurious counterexamples. The method tries to adopt weak assume conditions so that more execution paths satisfy the conditions and searching efforts can be reused; therefore, a significant reduction of verification cost can be achieved. The method has been used to verify the initial handshake process of the SSL protocol based on the C source code of openssl-0.9.6c. The experimental results demonstrate that the method is both effective and practical.
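
      The flavor of search reuse can be sketched on a toy control-flow graph; here the per-location assume conditions are simplified to memoized concrete values that never need strengthening, which is a drastic simplification of the paper's method.

```python
def verify(cfg, error_locs, init_loc, init_val, max_val=64):
    """Explore a toy control-flow graph and check a safety property.

    cfg[loc] is a list of (guard, update, next_loc) edges over one integer
    variable. safe[loc] memoizes values already shown safe from loc -- a
    drastically simplified stand-in for the paper's per-location assume
    conditions, which here are never 'too weak' and need no strengthening.
    """
    safe = {loc: set() for loc in cfg}
    def explore(loc, val, path):
        if loc in error_locs:
            return [*path, (loc, val)]                 # counterexample path
        if val in safe[loc] or (loc, val) in path:     # reuse of searching
            return None
        for guard, update, nxt in cfg.get(loc, []):
            if guard(val):
                cex = explore(nxt, update(val) % max_val, [*path, (loc, val)])
                if cex:
                    return cex
        safe[loc].add(val)                             # val proven safe at loc
        return None
    return explore(init_loc, init_val, [])

# Toy program: L0: while x < 10: x += 2; L1: if x == 11: ERROR
cfg = {
    'L0': [(lambda x: x < 10, lambda x: x + 2, 'L0'),
           (lambda x: x >= 10, lambda x: x, 'L1')],
    'L1': [(lambda x: x == 11, lambda x: x, 'ERR'),
           (lambda x: x != 11, lambda x: x, 'END')],
    'ERR': [], 'END': [],
}
print(verify(cfg, {'ERR'}, 'L0', 0))   # None -> the property holds (x never 11)
```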

    • Towards Software Process Enactment Based on the SPEM2XPDL Model Transformation

      2007, 18(9):2141-2152.

      Abstract: SPEM (software process engineering metamodel) is the standard metamodel put forward by the OMG (Object Management Group) and widely accepted in industry; however, its model enactment is still at an early stage. In this paper, software process is viewed as a specialization of the more general kind of process known as workflow, and SPEM2XPDL is presented as a software process enactment approach based on model transformation. Well-defined mapping rules from the SPEM metamodel to the XPDL (XML process definition language) metamodel are provided, and the corresponding transformation algorithm and engine are developed. The approach is implemented in the SoftPM project. After being transformed to XPDL format, SPEM models are successfully enacted by Shark, a well-known XPDL engine.
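
      One illustrative mapping rule (SPEM activity to XPDL Activity, precedence to Transition) can be sketched as follows; the element names in the toy SPEM fragment are assumptions, and the real SPEM2XPDL rules cover many more metamodel elements (roles, work products, phases, and so on).

```python
import xml.etree.ElementTree as ET

# Hypothetical in-memory SPEM fragment: activities and their precedence.
spem_process = {
    "name": "RequirementsAnalysis",
    "activities": [{"name": "ElicitRequirements"}, {"name": "WriteSpec"}],
    "transitions": [("ElicitRequirements", "WriteSpec")],
}

def spem_to_xpdl(proc):
    # Map the process to an XPDL Package/WorkflowProcess, each SPEM activity
    # to an XPDL Activity, and each precedence pair to an XPDL Transition.
    pkg = ET.Element("Package", Name=proc["name"])
    wf = ET.SubElement(pkg, "WorkflowProcess", Id=proc["name"])
    acts = ET.SubElement(wf, "Activities")
    for a in proc["activities"]:
        ET.SubElement(acts, "Activity", Id=a["name"], Name=a["name"])
    trans = ET.SubElement(wf, "Transitions")
    for i, (src, dst) in enumerate(proc["transitions"]):
        ET.SubElement(trans, "Transition", Id=f"t{i}", From=src, To=dst)
    return ET.tostring(pkg, encoding="unicode")

print(spem_to_xpdl(spem_process))
```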

    • A Method of Time Constraint Workflow Model Analysis and Verification

      2007, 18(9):2153-2161.

      Abstract: In order to describe the temporal aspects of workflow models and verify their temporal consistency, a method based on temporal logic and model checking for modeling and verifying time-constrained workflows is presented. In this method, first-order logic is used to model the workflow, including its basic temporal information; temporal logic is used to model the time constraints; and model checking is used to analyze and verify temporal consistency. The method can verify any time constraint that can be described by temporal logic, and it can offer a counterexample workflow instance for any time constraint that fails verification. Finally, the method is validated through a case study.

    • Error Correction of Continuous Handwriting Recognition by Multimodal Fusion

      2007, 18(9):2162-2173.

      Abstract: In recognition-based user interfaces, user satisfaction is determined not only by recognition accuracy but also by the effort needed to correct recognition errors. In this paper, an error correction technique based on multimodal fusion is introduced. It allows a user to correct errors of Chinese handwriting recognition by repeating the handwritten content in speech. A multimodal fusion algorithm is the key of the technique: by using the speech input to constrain the search for the best handwriting recognition result, the algorithm can correct errors in both character extraction and character recognition. The experimental results indicate that the algorithm is effective and computationally efficient. Moreover, an evaluation also shows that the correction technique helps users correct handwriting recognition errors more efficiently than two other error correction techniques.
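
      A minimal late-fusion sketch of the idea: rescore the handwriting N-best list with the speech N-best list and take the joint best. The candidate scores and the equal weights are illustrative assumptions, not the paper's trained fusion model.

```python
# Toy N-best lists: the handwriting recognizer slightly prefers the wrong
# character; the repeated speech input supplies the evidence to fix it.
handwriting_nbest = {"木": 0.48, "本": 0.32, "体": 0.20}   # recognizer posteriors
speech_nbest      = {"本": 0.55, "木": 0.35, "笨": 0.10}   # from the spoken repeat

def fuse(hw, sp, w_hw=0.5, w_sp=0.5):
    # Weighted late fusion over the union of candidates.
    candidates = set(hw) | set(sp)
    joint = {c: w_hw * hw.get(c, 0.0) + w_sp * sp.get(c, 0.0) for c in candidates}
    return max(joint, key=joint.get), joint

best, joint = fuse(handwriting_nbest, speech_nbest)
print(best, joint)   # speech evidence flips the top result to '本'
```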

    • Representation and Extension of Rough Set Theory Based on Formal Concept Analysis

      2007, 18(9):2174-2182.

      Abstract: This paper aims to establish the relationship between formal concept analysis and rough set theory. The following results are obtained: (1) a derivative formal context of an information system can be induced by the notion of nominal scale and the technique of plain scaling in formal concept analysis; (2) some core notions in rough set theory, such as partition, upper and lower approximations, independence, dependence, and reduct, can be reinterpreted in derivative formal contexts. In addition, the limitations of rough set theory for data processing are analyzed. The results presented in this paper provide a basis for the synthesis of formal concept analysis and rough set theory.
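
      For reference, the lower and upper approximations that the paper reinterprets can be computed directly from a toy information system; the tiny data set below is an illustrative assumption.

```python
# Rough-set lower/upper approximations from an information system -- the core
# notions the paper reinterprets inside derivative formal contexts.
objects = {
    "o1": {"color": "red",  "size": "big"},
    "o2": {"color": "red",  "size": "big"},
    "o3": {"color": "blue", "size": "big"},
    "o4": {"color": "blue", "size": "small"},
}
target = {"o1", "o3"}            # the concept X to approximate
attrs = ("color", "size")

# Indiscernibility classes: objects identical on all chosen attributes.
classes = {}
for o, row in objects.items():
    classes.setdefault(tuple(row[a] for a in attrs), set()).add(o)

lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))

print("lower:", lower)   # {'o3'}: o1 is indiscernible from o2, which is not in X
print("upper:", upper)   # {'o1', 'o2', 'o3'}
```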

    • Interactive Evolutionary Computation with Fitness Noise and Its Convergence Robustness

      2007, 18(9):2183-2193.

      Abstract: Noise is an important factor that influences the performance of evolutionary computation (EC). Much research on noise has been reported for traditional EC, but less for IEC (interactive evolutionary computation). The definition, sources, and types of noise, and the methods for dealing with noise in EC, are reviewed first. Secondly, in connection with the rational user in IEC, the convergence robustness against fitness noise in IEC is studied. Mappings among spaces, dominance relationships, and convergence in IEC are discussed to establish the basis for two theorems: a strong-condition theorem and a weak-condition theorem. These two theorems imply that the noise caused by a rational user will not prevent the algorithm from converging to the global optima. Thirdly, as successive issues, it is shown that an effective fitness scaling method is part of the weak condition and that user preference is the true fitness in IEC; a narrow definition of fitness noise based on the weak condition is also given. The experimental results validate the theorems and establish a necessary foundation for future research.

    • Xmesh: A Mesh-Like Topology for Network on Chip

      2007, 18(9):2194-2204.

      Abstract: A network on chip (NoC) has characteristics such as a small number of nodes, short distances between cores, and the need for low physical implementation difficulty. To satisfy these special needs, this paper presents a new topology named Xmesh and its routing algorithm XM. Xmesh is formed by adding some diagonal edges to the Mesh topology, thereby reducing the average distance of the topology. Given the same network size, Xmesh has the same number of edges as the Torus topology, so this paper compares the performance of the Mesh, Xmesh and Torus topologies. A detailed theoretical analysis of the three topologies is given, together with a simulation analysis based on the Popnet simulator using uniform and hotspot traffic patterns as benchmarks. The simulation results show that the average latencies of Xmesh and Torus are less than 70% of the average latency of Mesh. For the uniform traffic pattern, when the network size is small, Xmesh performs better than Torus. For the hotspot traffic pattern, when the hot node is near the network center or the two diagonals, the latency of Xmesh is about 70%~90% of the latency of Torus; otherwise, the latency of Torus is about 70%~90% of the latency of Xmesh. In conclusion, Xmesh performs as well as Torus while being simpler to implement physically, and both outperform the Mesh topology.
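
      The effect of adding diagonal shortcuts can be checked numerically. The sketch below assumes Xmesh places its extra links along the two main diagonals of the grid (an assumption about the exact wiring, not the paper's definition) and compares average hop distances by BFS.

```python
from collections import deque
from itertools import product

def avg_distance(nodes, adj):
    # Mean shortest-path hop count over all ordered node pairs, via BFS.
    total = cnt = 0
    for s in nodes:
        dist = {s: 0}
        q = deque([s])
        while q:
            x = q.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    q.append(y)
        total += sum(dist.values())
        cnt += len(dist) - 1
    return total / cnt

def build(n, diagonals=False, torus=False):
    nodes = list(product(range(n), range(n)))
    adj = {v: set() for v in nodes}
    def link(a, b):
        adj[a].add(b); adj[b].add(a)
    for i, j in nodes:
        if i + 1 < n: link((i, j), (i + 1, j))
        if j + 1 < n: link((i, j), (i, j + 1))
        if torus and i == n - 1: link((i, j), (0, j))
        if torus and j == n - 1: link((i, j), (i, 0))
        # Assumed Xmesh wiring: extra links along the two main diagonals.
        if diagonals and i + 1 < n and i == j: link((i, j), (i + 1, j + 1))
        if diagonals and i + 1 < n and i + j == n - 1: link((i, j), (i + 1, j - 1))
    return nodes, adj

for name, g in [("Mesh", build(8)), ("Xmesh*", build(8, diagonals=True)),
                ("Torus", build(8, torus=True))]:
    print(name, round(avg_distance(*g), 3))
```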

    • High Performance Control-Plane Communication Model for Scalable Routers

      2007, 18(9):2205-2215.

      Abstract: The inter-node communication bottleneck has been a key factor restricting the large-scale expansion of a scalable router's software architecture. To solve the problem, this paper introduces a transmission adaptation sub-layer into the supporting model of the traditional software architecture. By extracting features of the upstream data stream and pattern-matching them against registered tasks, the data stream can be classified and divided based on the information content, increasing the effective communication rate. The paper further analyzes the model's performance in terms of three task characteristics (distribution rate, spread number, and traffic rate) and provides an optimized task dispatching reference. It shows that the introduced transmission adaptation sub-layer can reduce inter-layer redundant flow and the extensibility bottleneck of communication. Finally, the presented experiment verifies the theoretical analysis.

    • A User Behavior Based Semantic Search Approach under P2P Environment

      2007, 18(9):2216-2225.

      Abstract: Restricted by the diversity of resources and the complexity of search algorithms, current search mechanisms in peer-to-peer file-sharing systems are based on file names and simple keyword matching. These mechanisms cannot recognize the deeper relationships between keywords and resources, and hence cannot provide high search quality. This paper proposes a new search scheme, built on top of the current peer-to-peer network, that harnesses users' search and download behaviors to automatically discover the deeper relationships between keywords and resources, which are then used to improve search quality. The scheme has the advantages of low implementation cost, low complexity, self-evolution, and support for semantic search. Simulations based on the Maze system show that this approach achieves a high search hit rate and accuracy.

    • An Overlay Topology Based Proactive Replication in Unstructured P2P Systems

      2007, 18(9):2226-2234.

      Abstract: The overlay network of an unstructured P2P system is neither a regular network nor a pure random network, and the peers are usually not completely equivalent: they play different roles in the overlay network. This paper first investigates the relations among the degree distribution, the access frequency model, and the success rate, and then presents an optimal degree distribution model in terms of the popularity of data items. Finally, a feasible proactive replication scheme is proposed to reach the expected degree distribution. The simulation shows that the proactive replication can improve the performance of unstructured P2P systems.

    • Movement-Assisted Data Gathering Scheme with Load-Balancing for Sensor Networks

      2007, 18(9):2235-2244.

      Abstract: Data gathering is the basic function of sensor networks. However, the existing gathering schemes are almost all based on an architecture with a static base station, which results in the quick death of the nodes around the base station: sensor nodes located near a base station have to relay data for a large part of the network and thus deplete their batteries very quickly. This paper discusses how to use a mobile base station for data gathering with load balancing. A data gathering scheme, MADG (movement-assisted data gathering), which uses a mobile base station for data collection, is presented. In this scheme, the base station moves within a stationary annular area exploited for data buffering: gathered data are first forwarded into the buffering area and then collected by the mobile base station. It is theoretically proved that the ring at distance √2R/2 from the center is the optimal location for minimizing the energy consumed in transmitting data, and that there exists a location that minimizes the maximal node load. The paper then derives the optimum location by jointly considering energy consumption and load balancing based on the above analyses. Compared with the static base station scheme and an existing mobile base station scheme, MADG reduces the load by over 95% and 80%, respectively.
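
      The √2R/2 claim is easy to check numerically: for nodes uniform in a disk, that radius is the median of the radial distribution, so it minimizes the mean node-to-ring distance and splits the disk into two equal-area halves. A quick Monte Carlo confirmation (sample size and grid are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
R = 1.0
# For nodes uniform in a disk, the radial CDF is rho^2/R^2, so sample via sqrt.
rho = R * np.sqrt(rng.random(200_000))

rs = np.linspace(0.01, 0.99, 99)
mean_dist = [np.abs(rho - r).mean() for r in rs]       # mean distance to ring r
best = rs[int(np.argmin(mean_dist))]
print(f"empirical optimum r = {best:.2f}, sqrt(2)/2 = {np.sqrt(2)/2:.4f}")
print(f"area inside r*: {best**2 / R**2:.2f} of the disk")   # about one half
```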

    • DDOS Attack Detecting and Defending Model

      2007, 18(9):2245-2258.

      Abstract: This paper presents the APA-ANTI-DDoS (aggregate-based protocol analysis anti-DDoS) model to detect and defend against DDoS attacks. The APA-ANTI-DDoS model contains an abnormal traffic aggregation module, a protocol analysis module, and a traffic processing module. The abnormal traffic aggregation module classifies network traffic into normal and abnormal traffic; the protocol analysis module analyzes the potential features of DDoS attack traffic within the abnormal traffic; the traffic processing module filters the abnormal traffic according to the current features of the DDoS attack and restores non-attack traffic by testing its congestion-control behavior. The paper then implements the APA-ANTI-DDoS system. The experimental results show that the APA-ANTI-DDoS model can effectively detect and defend against DDoS attacks and restore non-attack traffic in case of false positives, thereby preserving legitimate communication.

    • Performance Analysis of the Contention Resolution Scheme in IEEE 802.16

      2007, 18(9):2259.

      Abstract: At present, the mandatory contention resolution scheme that shall be supported by IEEE 802.16 is based on the truncated binary exponential backoff algorithm. In this paper, the differences in contention mechanisms between IEEE 802.16 and IEEE 802.11 are analyzed. Calculations of performance metrics such as the utilization of transmission opportunities u, the delay of bandwidth requests d, and the drop probability of bandwidth requests p_d are presented. Based on simulation results, the effects of contention parameters such as the initial window W, the number of subscriber stations n, and the number of transmission opportunities N_to are discussed, and some policies for adjusting the parameters are provided. These policies are useful for the base station (BS) in scheduling its uplink bandwidth resource.
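
      A simplified saturated-load simulation of truncated binary exponential backoff; the frame structure and per-frame transmission opportunities of 802.16 are abstracted away, and all parameter values are assumptions.

```python
import random
random.seed(7)

def backoff_sim(n_stations=20, w0=8, w_max=1024, retries=6, rounds=10_000):
    # Each round, every backlogged station picks a slot in its current window;
    # a slot with exactly one request succeeds, collisions double the window
    # (truncated at w_max); after `retries` failures the request is dropped.
    success = drops = attempts = 0
    win = [w0] * n_stations
    fail = [0] * n_stations
    for _ in range(rounds):
        slots = {}
        for s in range(n_stations):
            slots.setdefault(random.randrange(win[s]), []).append(s)
        attempts += n_stations
        for contenders in slots.values():
            if len(contenders) == 1:
                s = contenders[0]
                success += 1
                win[s], fail[s] = w0, 0            # reset after success
            else:
                for s in contenders:
                    fail[s] += 1
                    if fail[s] > retries:
                        drops += 1
                        win[s], fail[s] = w0, 0    # drop, start fresh
                    else:
                        win[s] = min(2 * win[s], w_max)
    return success / attempts, drops / attempts

u, pd = backoff_sim()
print(f"success ratio = {u:.3f}, drop ratio = {pd:.3f}")
```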

    • Techniques of Building a Scalable, Efficient Intrusion Monitoring Architecture

      2007, 18(9):2271-2282.

      Abstract: To perform effective intrusion analysis in high-bandwidth networks, this paper studies data collection techniques and proposes a scalable, efficient intrusion monitoring architecture (SEIMA) for network intrusion detection systems (NIDS). In SEIMA, network intrusion detection is scaled to high network speeds by using multiple sensors operating in parallel, coupled with a suitable load-balancing traffic splitter. High-performance data transfer is achieved through asynchronous DMA without OS intervention, using an efficient address translation technique and a buffer management mechanism. A multi-rule packet filter based on finite state machines is implemented at the user layer to eliminate the overhead of processing redundant packets. Simulation and field experiment results indicate that SEIMA reduces CPU utilization while improving the efficiency of data collection in NIDS, thereby saving system resources for complex data analysis. The SEIMA approach is highly practical for network security.

    • Geometric Distortions Invariant Blind Second Generation Watermarking Technique Based on Tchebichef Moment of Original Image

      2007, 18(9):2283-2294.

      Abstract: In this paper, a rotation, scaling and translation invariant blind second-generation watermarking technique is proposed that uses Tchebichef moments of the original image to estimate the geometric distortion parameters of corrupted watermarked images. The Tchebichef moments of the original image serve as the private key of the watermark extraction process. The characteristics of the human visual system are incorporated into the watermark embedding, and the embedding can be performed in any image domain, spatial or transform; a DWT-domain method is proposed in this paper. Independent component analysis is adopted by the watermark detector so that the watermark can be truly extracted rather than merely detected. The experimental results show that the method is robust against the attacks produced by the popular watermark testing software Stirmark.

    • Multi-Level and Dynamic Security Access Control for CAD Models in Collaborative Environment

      2007, 18(9):2295-2305.

      Abstract: In this paper, a multi-level and dynamic security access control (MLDAC) model is proposed for CAD models in collaborative environments. A multi-level privilege model is developed to simplify permission definition and assignment, enrich expressiveness, and help realize multi-grained access control. Permission dependency relations and permission state migration are introduced into MLDAC for dynamic authorization management, based on basic workflow theory. Practice shows that the MLDAC model is more efficient at controlling collaborative operations and suits the characteristics of design tasks, i.e., being divisible, dependent, and interactive.

    • Algebraic B-Spline Curve Reconstruction Based on Signed Distance Field

      2007, 18(9):2306-2317.

      Abstract: An algebraic B-spline curve fitting algorithm based on the signed distance field is proposed in this paper. Given a planar point set, the moving least squares (MLS) method is adopted to denoise and resample it, so that the resulting point set has low noise and uniform sampling density. The reliable signed distance field of the preprocessed point set is then constructed using the level set method. Finally, an algebraic B-spline function is fitted to the signed distance field by solving a linear system of equations, and the resulting algebraic curve is the zero level set of that function. With the proposed method, not only is a high-quality curve obtained, but also the geometric information around the curve; furthermore, the unwanted branches common in implicit curve fitting can be avoided.
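
      The final fitting step can be illustrated in miniature: fit a tensor-product B-spline function to signed-distance samples by linear least squares, with an analytic circle SDF standing in for the paper's MLS and level-set preprocessing; degrees, grid sizes, and knot placement are arbitrary choices.

```python
import numpy as np

def bspline_basis(i, k, t, x):
    # Cox-de Boor recursion: value of the i-th degree-k B-spline at x.
    if k == 0:
        return np.where((t[i] <= x) & (x < t[i + 1]), 1.0, 0.0)
    left = right = 0.0
    if t[i + k] > t[i]:
        left = (x - t[i]) / (t[i + k] - t[i]) * bspline_basis(i, k - 1, t, x)
    if t[i + k + 1] > t[i + 1]:
        right = ((t[i + k + 1] - x) / (t[i + k + 1] - t[i + 1])
                 * bspline_basis(i + 1, k - 1, t, x))
    return left + right

# Signed distance field of a circle of radius 0.5 sampled on [-1, 1]^2 --
# a stand-in for the field built from the preprocessed point set.
n_grid, deg, n_ctrl = 30, 2, 8
xs = np.linspace(-1, 1, n_grid)
X, Y = np.meshgrid(xs, xs)
sdf = np.sqrt(X**2 + Y**2) - 0.5

# Clamped knot vector; the tiny overshoot keeps x = 1 inside the last span.
t = np.concatenate([[-1] * deg, np.linspace(-1, 1.001, n_ctrl - deg + 1),
                    [1.001] * deg])
Bx = np.stack([bspline_basis(i, deg, t, xs) for i in range(n_ctrl)], axis=1)

# Least squares for coefficients c: f(x, y) = sum_ij c_ij B_i(x) B_j(y) ~ sdf.
A = np.einsum('xi,yj->yxij', Bx, Bx).reshape(n_grid * n_grid, n_ctrl * n_ctrl)
c, *_ = np.linalg.lstsq(A, sdf.reshape(-1), rcond=None)
fit = (A @ c).reshape(n_grid, n_grid)
print("max fitting residual:", np.abs(fit - sdf).max())
# The fitted algebraic curve is the zero level set of f.
```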

    • Illumination Normalization with Morphological Quotient Image

      2007, 18(9):2318-2325.

      Abstract: Face recognition under complex illumination conditions is still an open problem. To cope with it, this paper proposes an effective illumination normalization method that combines morphology and quotient image techniques through an analysis of face illumination, upgraded with a dynamic lighting estimation technique to strengthen illumination compensation and feature enhancement. Compared with traditional approaches, the method needs no training data and no assumptions about the lighting conditions; moreover, enrollment requires only one image per subject. The proposed method is evaluated on the Yale Face Database B and achieves a very competitive recognition rate with low computational cost.
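
      A bare-bones morphological quotient image, assuming a large-kernel grey closing as the lighting estimate (the published method also adapts the estimate dynamically); the synthetic face and kernel size are invented for illustration.

```python
import numpy as np
from scipy.ndimage import grey_closing

# Estimate the slowly varying illumination with a large-kernel morphological
# closing, then divide it out to obtain the quotient image.
rng = np.random.default_rng(0)
face = rng.uniform(0.4, 0.6, (64, 64))                 # stand-in facial texture
ramp = np.linspace(0.2, 1.0, 64)[None, :]              # synthetic side lighting
image = face * ramp

lighting = grey_closing(image, size=(15, 15))          # smooth upper envelope
quotient = image / np.maximum(lighting, 1e-6)

print("intensity spread before:", float(image.std()))
print("intensity spread after: ", float(quotient.std()))  # lighting largely removed
```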

    • Bound Estimations on Lower Derivatives of Rational Triangular Bézier Surfaces

      2007, 18(9):2326-2335.

      Abstract: Based on the de Casteljau algorithm for triangular patches, and using some existing identities and elementary inequalities, this paper presents two kinds of new magnitude upper bounds on the lower-order derivatives of rational triangular Bézier surfaces. The first kind, obtained by exploiting the diameter of the convex hull of the control net, is always stronger than the known bound in the case of the first derivative; for the second derivative, it improves on the existing bound when the ratio of the maximum weight to the minimum weight is greater than 2. The second kind is characterized by being expressed in terms of the maximum distance between adjacent control points.

    • Multiresolution Shape Editing of Point-Sampled Geometry

      2007, 18(9):2336-2345.

      Abstract: This paper proposes a technique for shape editing of point-sampled geometry based on geometric detail mapping. Geometric detail is an important attribute of a surface; it is defined here as the difference between the original point-sampled surface and a base surface constructed as a multilevel B-spline approximation. The geometric detail is represented as a set of local affine coordinates over the base surface and is extracted effectively and efficiently within a user-specified frequency band of the geometric signal. It can be mapped onto any point set surface, including the deformed base surface itself. Versatile editing tools are developed for point-sampled geometry that naturally preserve the geometric detail. Experimental results demonstrate the high potential of the approach for point-sampled geometry modeling.

    • Ternary Subdivision Scheme for Quadrilateral Mesh with Fast Convergence

      2007, 18(9):2346-2355.

      Abstract: This paper proposes a ternary stationary subdivision scheme for quadrilateral meshes. For regular and irregular quadrilateral meshes, different subdivision masks are applied to generate new vertices. After every subdivision step, the number of faces of the refined mesh is about nine times that of the coarse mesh. The limit surface generated by the new method is C2 continuous for a regular mesh and C1 continuous for an irregular mesh. Compared with typical subdivision schemes, the proposed scheme converges faster and can handle quadrilateral meshes of arbitrary topology. Examples are given at the end to illustrate the performance of the new subdivision scheme.

    • A Method for Character Animation Generation and Control

      2007, 18(9):2356-2364.

      Abstract: This paper presents an approach for creating lifelike, controllable motion in interactive virtual environments, achieved by learning a statistical model from a set of motion capture sequences. The method is based on clustering the motion data into motion primitives that capture local dynamic characteristics (dynamic textures), modeling the dynamics in each cluster with a linear dynamic system (LDS), annotating those LDSs that have a clear meaning, and calculating the cross-entropy between frames of the LDSs to construct a directed graph called an annotated dynamic texture graph (ADTG), which has a two-level structure. The lower level retains the detail and nuance of the live motion, while the higher level generalizes the motion and encapsulates the connections among the LDSs. The results show that this framework can generate smooth, natural-looking motion in interactive environments.

Contact
  • Journal of Software
  • Sponsored by: Institute of Software, CAS, China
  • Postal code: 100190
  • Phone: 010-62562563
  • E-mail: jos@iscas.ac.cn
  • Website: https://www.jos.org.cn
  • ISSN 1000-9825
  • CN 11-2560/TP
  • Domestic price: RMB 70