2007, 18(9):2063-2069.
Abstract: The bounded shortest multicast algorithm (BSMA) is believed to be the best constrained multicast algorithm. However, its large computation time restricts its application. As a global optimization algorithm, the genetic algorithm (GA) is increasingly applied to multicast problems. GA has a more powerful search ability than traditional algorithms; however, its tendency toward premature convergence makes it difficult to obtain a good multicast tree. This paper presents a quantum clonal algorithm (QCA) for the multicast routing problem, which effectively solves the premature-convergence problem of GA-based multicast algorithms. Furthermore, the algorithm is accelerated by using quantum crossover. The algorithm is simple to implement and flexible to control. The simulation results show that QCA performs better than BSMA and the traditional GA.
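The abstract does not include the algorithm itself, but the family it belongs to is easy to illustrate. The Python sketch below is a minimal quantum-inspired evolutionary step, not the paper's QCA (it omits the clonal operator and quantum crossover): each individual is a vector of qubit angles, binary solutions are obtained by observation, and a rotation gate pulls the angles toward the best solution found so far. The fitness function and rotation step size are placeholder assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    N_BITS, POP, STEP = 16, 10, 0.05 * np.pi  # problem size, population, rotation step (assumed)

    def fitness(bits):
        # Placeholder objective: count of ones (a real QCA would score a multicast tree).
        return bits.sum()

    # Each individual is a vector of qubit angles; P(bit = 1) = sin(theta)^2.
    theta = np.full((POP, N_BITS), np.pi / 4)
    best_bits, best_fit = None, -np.inf
    for gen in range(50):
        # "Observe" each qubit to collapse the population into binary solutions.
        bits = (rng.random((POP, N_BITS)) < np.sin(theta) ** 2).astype(int)
        fits = np.array([fitness(b) for b in bits])
        if fits.max() > best_fit:
            best_fit, best_bits = fits.max(), bits[fits.argmax()].copy()
        # Rotation gate: rotate each qubit toward the best solution found so far.
        theta += STEP * np.where(best_bits == 1, 1.0, -1.0)
        theta = np.clip(theta, 0.0, np.pi / 2)
    print(best_fit, best_bits)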
XIE Min-Zhu , CHEN Jian-Er , WANG Jian-Xin
2007, 18(9):2070-2082.
Abstract: The individual haplotyping MSR (minimum SNP removal) problem is the computational problem of inferring an individual's haplotypes from his or her DNA fragment sequencing data by dropping a minimum number of SNPs (single-nucleotide polymorphisms). To solve the problem, Bafna et al. provided an algorithm of time complexity O(2^k·n^2·m), where m is the number of fragments, n the number of SNP sites, and k the maximum number of holes in a fragment. When Mate-Pairs are present, the number of holes in a Mate-Pair can reach 100, so Bafna's algorithm is impracticable. Based on the characteristics of DNA fragments, this paper presents a new algorithm of time complexity O((n-1)(k1-1)k2·2^(2h)+(k1+1)·2^h+n·k2+m·k1), where k1 is the maximum number of SNP sites that a fragment covers (no more than n), k2 is the maximum number of fragments covering a SNP site (usually no more than 19), and h is the maximum number of fragments covering a SNP site whose value is unknown at that site (no more than k2). Since the time complexity is not directly related to k, the algorithm can deal with the MSR problem with Mate-Pairs efficiently, and is more scalable and applicable in practice.
ZHANG De-Fu , WEI Li-Jun , CHEN Qing-Shan , CHEN Huo-Wang
2007, 18(9):2083-2089.
Abstract: By combining personification heuristics and simulated annealing, a combinational heuristic algorithm for the three-dimensional packing problem is presented. The personification heuristic is inspired by the wall-building strategy used in daily life. A point-finding approach and rules for horizontal and vertical reference lines are developed to control the packing process. Simulated annealing is further used to improve the personification heuristic. Computational results on benchmark instances show that this algorithm can compete with excellent heuristics from the literature.
ZHANG Qiang-Feng , XU Yun , CHEN Guo-Liang , CHE Hao-Yang
2007, 18(9):2090-2099.
Abstract: The problems of haplotyping and haplotype frequency estimation on trio genotype data under the Mendelian law of inheritance and the assumption of Hardy-Weinberg equilibrium are studied in this paper. Most past efforts focused on haplotyping genotype data of unrelated individuals or data with general pedigrees, but paid insufficient attention to the special case of trio genotype data; there is thus an increasing demand for analyzing trio data in particular, especially considering that part of the HAPMAP database is exactly trio data. This paper presents a two-stage method to estimate haplotype frequencies in trios: i) haplotyping stage, find haplotype configurations without recombination for each trio; ii) frequency estimation stage, use the expectation-maximization (EM) algorithm to estimate haplotype frequencies based on these inferred haplotype configurations. Both the haplotyping algorithm and the EM algorithm are implemented in the software package TRIOHAP using the C language. Its effectiveness and efficiency are tested on simulated and real data sets. The experimental results show that TRIOHAP runs much faster than a popular frequency estimation software package that discards trio information. Moreover, because TRIOHAP utilizes such information, its estimation is more reliable.
2007, 18(9):2100-2104.
Abstract: Under some special conditions, the P3P problem can have 1, 2, 3, or 4 solutions, and if the 3 control points and the optical center lie on a circle, the problem is indeterminate. In this paper, using a Monte Carlo approach with up to 1 million samples, it is shown that the probabilities of the P3P problem having one solution, two solutions, three solutions, and four solutions are respectively 0.9993, 0.0007, 0.0000, and 0.0000. The result confirms the well-known fact that in most cases the P3P problem has a unique solution.
GAO Long , WANG Zhi-Yuan , YANG Xue-Jun
2007, 18(9):2105-2116.
Abstract: Traditional fault-tolerance compilation replicates all computations and registers to guarantee fault tolerance, but this brings great overhead in both storage utilization and performance. This paper proposes the new concept of the critical subgraph of an error flow graph, based on error flow analysis. Methods are given to generate critical subgraphs from critical nodes or from critical paths, and a partial redundancy algorithm is proposed that replicates only the critical subgraph. The partial redundancy algorithm guarantees effective fault tolerance, greatly improves performance, and reduces both power dissipation and storage usage. Experimental results show that, compared with full redundancy, which replicates the full error flow graph, partial redundancy can reduce register usage by 6.25%, reduce power dissipation by over 17%, reduce total execution cycles by nearly 26%, and improve performance by over 22%, at the cost of 6.25% lower node coverage.
HU Jian-Jun , GUAN He-Qing , WEI Jun , HUANG Tao
2007, 18(9):2117-2129.
Abstract: Highly dynamic computing environments make QoS (quality of service) guarantees more important for component-based distributed systems. A software system should possess self-tuning capacity to react to variations in the external environment. This paper proposes an adaptive self-configuration framework, which can automatically tune configuration parameters to preserve QoS as the workload changes. The key of this framework is a performance model based on layered queuing networks, which guides the search for the best combination of configuration parameters to satisfy the QoS requirement. The self-configuration framework is prototyped on the OnceAS application server and is validated using StockOnline by comparing performance requirement satisfaction with and without the framework. The results show that, through the framework's regulation, the system performs well against its QoS goals.
YI Xiao-Dong , WANG Ji , YANG Xue-Jun
2007, 18(9):2130-2140.
Abstract: This paper presents a novel method, namely Assume-Guarantee reuse of searching, to verify C programs with respect to temporal safety properties. Its idea is to introduce a conservative assume condition for each program location, and to assume that every path starting from the program location will never violate the property if the evaluation of its variables at that location satisfies the assume condition. All the possible execution paths are traversed based on the assume conditions, and the temporal safety property is checked on the fly. If some assume condition is too weak, it is continually strengthened based on the spurious counterexamples. The verification method tries to adopt weak assume conditions so as to let more execution paths satisfy the conditions and to reuse the searching effort; therefore, a significant reduction of verification cost can be achieved. The method has been used to verify the initial handshake process of the SSL protocol based on the C source code of openssl-0.9.6c. The experimental results demonstrate that the method is both effective and practical.
2007, 18(9):2141-2152.
Abstract: SPEM (software process engineering metamodel) is the standard metamodel put forward by the OMG (Object Management Group) and widely accepted in industry. However, its model enactment is still at an early stage. In this paper, software process is viewed as a specialization of the more general kind of process known as workflow. SPEM2XPDL is presented as a software process enactment approach based on model transformation. Well-defined mapping rules from the SPEM metamodel to the XPDL (XML process definition language) metamodel are provided, and the corresponding transformation algorithm and engine are developed. This approach is implemented in the SoftPM project. After being transformed into XPDL format, SPEM models are successfully enacted by Shark, a well-known XPDL engine.
2007, 18(9):2153-2161.
Abstract: In order to describe the temporal aspect of workflow models and verify their temporal consistency, a method based on temporal logic and model checking for modeling and verifying time-constrained workflows is presented. In this method, first-order logic is used to model the workflow, including its basic temporal information; temporal logic is used to model the time constraints; and model checking is used to analyze and verify temporal consistency. The method can verify any time constraint that can be described by temporal logic. It can also offer a counterexample workflow instance for any time constraint that fails the verification. Finally, the method is validated through a case study.
AO Xiang , WANG Xu-Gang , DAI Guo-Zhong , WANG Hong-An
2007, 18(9):2162-2173.
Abstract: In recognition-based user interfaces, user satisfaction is determined not only by recognition accuracy but also by the effort needed to correct recognition errors. In this paper, an error correction technique based on multimodal fusion is introduced. It allows a user to correct errors of Chinese handwriting recognition by repeating the handwritten content in speech. A multimodal fusion algorithm is the key of the technique. By constraining the search for the best handwriting recognition result with the speech input, the algorithm can correct errors in both character extraction and recognition of handwriting. The experimental results indicate that the algorithm is effective and computationally efficient. Moreover, the evaluation also shows that the correction technique helps users correct handwriting recognition errors more efficiently than two other error correction techniques.
QU Kai-She , ZHAI Yan-Hui , LIANG Ji-Ye , LI De-Yu
2007, 18(9):2174-2182.
Abstract: This paper aims to establish the relationship between formal concept analysis and rough set theory. The following results are obtained: (1) a derivative formal context of an information system can be induced by the notion of nominal scale and the technique of plain scaling in formal concept analysis; (2) some core notions in rough set theory, such as partition, upper and lower approximations, independence, dependence, and reduct, can be reinterpreted in derivative formal contexts. In addition, the limitations of rough set theory in data processing are analyzed. The results presented in this paper provide a basis for the synthesis of formal concept analysis and rough set theory.
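As a concrete illustration of result (1), the following Python fragment is our own sketch, not the paper's construction: it nominally scales a toy information system into a formal context (each attribute-value pair becomes a context attribute) and then reads off the rough-set partition and the lower/upper approximations of a set from it. The example table is invented.

    # Toy information system: objects -> attribute values (invented example).
    table = {
        'o1': {'color': 'red',  'size': 'big'},
        'o2': {'color': 'red',  'size': 'big'},
        'o3': {'color': 'blue', 'size': 'big'},
        'o4': {'color': 'blue', 'size': 'small'},
    }

    # Nominal scaling: object g has context attribute (a, v) iff a(g) = v.
    context = {g: frozenset(row.items()) for g, row in table.items()}

    # Rough-set partition: objects with identical intents are indiscernible.
    blocks = {}
    for g, intent in context.items():
        blocks.setdefault(intent, set()).add(g)
    partition = list(blocks.values())      # [{'o1','o2'}, {'o3'}, {'o4'}]

    def lower_upper(X):
        lower = set().union(*(b for b in partition if b <= X))
        upper = set().union(*(b for b in partition if b & X))
        return lower, upper

    print(partition)
    print(lower_upper({'o1', 'o3'}))       # lower {'o3'}, upper {'o1','o2','o3'}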
HAO Guo-Sheng , SHI You-Qun , HUANG Yong-Qing , Lü Jun-Huai , GUO Guang-Song
2007, 18(9):2183-2193.
Abstract: Noise is an important factor that influences the performance of evolutionary computation (EC). Much research on noise has been reported for traditional EC, but less for IEC (interactive evolutionary computation). The definition, sources, and types of noise and the methods of dealing with noise in EC are reviewed first. Secondly, with respect to the rational user in IEC, the robustness of convergence against fitness noise in IEC is studied. Mappings among spaces, dominance relationships, and convergence in IEC are discussed to establish the basis for two theorems: the strong condition theorem and the weak condition theorem. These two theorems imply that the noise caused by a rational user will not prevent the algorithm from converging to the global optima. Thirdly, as follow-up issues, it is argued that an effective fitness scaling method is part of the weak condition and that user preference is the true fitness in IEC. A narrow definition of fitness noise based on the weak condition is also given. The experimental results validate the theorems and establish a necessary foundation for future research.
ZHU Xiao-Jing , HU Wei-Wu , MA Ke , ZHANG Long-Bing
2007, 18(9):2194-2204.
Abstract: Network on chip (NoC) has many characteristics, such as a smaller number of nodes, shorter distances between cores, and lower physical implementation difficulty. To satisfy the special needs of NoC, this paper presents a new topology named Xmesh and its routing algorithm, called XM. Xmesh is formed by adding some diagonal edges to the Mesh topology, thereby reducing the average distance of the topology. Given the same network size, Xmesh has the same number of edges as the Torus topology, so this paper compares the performance of the Mesh, Xmesh, and Torus topologies. A detailed theoretical analysis of the Mesh, Xmesh, and Torus topologies is given, and a simulation analysis based on the Popnet simulator, using the uniform traffic pattern and the hotspot traffic pattern as benchmarks, is also presented. As the simulation results show, the average latencies of the Xmesh and Torus topologies are less than 70% of that of the Mesh topology. For the uniform traffic pattern, when the network size is small, the performance of Xmesh is better than that of Torus. For the hotspot traffic pattern, when the hot node is near the network center or the two diagonals, the latency of Xmesh is about 70%~90% of that of Torus; otherwise, the latency of Torus is about 70%~90% of that of Xmesh. In conclusion, Xmesh performs as well as Torus, but its physical implementation is simpler, and both perform better than the Mesh topology.
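The exact diagonal edge set of Xmesh is defined in the paper; the sketch below, for illustration only, augments a 2D mesh with edges along its two main diagonals and measures how the average hop distance drops, using plain breadth-first search. The choice of diagonals here is our assumption, not the paper's construction.

    from collections import deque
    from itertools import product

    def avg_distance(n, edges):
        # All-pairs average hop count via BFS from every node of an n x n grid.
        adj = {v: set() for v in product(range(n), repeat=2)}
        for a, b in edges:
            adj[a].add(b); adj[b].add(a)
        total = cnt = 0
        for s in adj:
            dist = {s: 0}; q = deque([s])
            while q:
                u = q.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1; q.append(v)
            total += sum(dist.values()); cnt += len(dist) - 1
        return total / cnt

    n = 8
    mesh = [((i, j), (i + 1, j)) for i in range(n - 1) for j in range(n)] + \
           [((i, j), (i, j + 1)) for i in range(n) for j in range(n - 1)]
    diag = [((i, i), (i + 1, i + 1)) for i in range(n - 1)] + \
           [((i, n - 1 - i), (i + 1, n - 2 - i)) for i in range(n - 1)]
    print(avg_distance(n, mesh), avg_distance(n, mesh + diag))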
XU Ke , WU Kun , WANG Qing-Qing
2007, 18(9):2205-2215.
Abstract: The inter-node communication bottleneck has been a key factor restricting the large-scale expansion of a scalable router's software architecture. To solve the problem, this paper introduces a transmission adaptation sub-layer into the supporting model of the traditional software architecture. Through feature extraction of the up-going data stream and pattern matching against registered tasks, the data stream can be classified and divided based on the information content, so as to increase the effective communication rate. The paper further analyzes the model's performance in terms of three task characteristics: distribution rate, spread number, and traffic rate, and provides an optimized task dispatching reference. It shows that the introduced transmission adaptation sub-layer can reduce inter-layer redundant flow and the extensibility bottleneck of communication. Finally, the presented experiment verifies the theoretical analysis.
QIU Zhi-Huan , XIAO Ming-Zhong , DAI Ya-Fei
2007, 18(9):2216-2225.
Abstract: Restricted by the diversity of resources and the complexity of search algorithms, current search mechanisms in peer-to-peer file sharing systems are based on file names and simple keyword matching. These mechanisms cannot recognize deeper relationships between keywords and resources; hence they cannot provide high search quality. This paper proposes a new search scheme, which is built on top of the current peer-to-peer network. It harnesses users' search behaviors and download behaviors to automatically discover the deeper relationships between keywords and resources, which are then used to improve the search quality. It has the advantages of low implementation cost, low complexity, and self-evolution, and it supports semantic search. Simulations based on the Maze system show that this approach achieves a high search hit rate and accuracy.
FENG Guo-Fu , ZHANG Jin-Cheng , GU Qing , LU Sang-Lu , CHEN Dao-Xu
2007, 18(9):2226-2234.
Abstract: The overlay network of an unstructured P2P system is neither a regular network nor a purely random network. The peers are usually not completely equivalent and play different roles in the overlay network. This paper first investigates the relation among the degree distribution, the access frequency mode, and the success rate, and then presents an optimal degree distribution model in terms of the popularity of data items. Finally, a feasible proactive replication scheme is proposed to reach the expected degree distribution. The simulation shows that the proactive replication can improve the performance of unstructured P2P systems.
2007, 18(9):2235-2244.
Abstract: Data gathering is the basic function of sensor networks. However, the existing gathering schemes are mostly based on an architecture with a static base station, which results in the quick death of the nodes around the base station: the sensor nodes located near the base station have to relay data for a large part of the network and thus deplete their batteries very quickly. This paper discusses how to use a mobile base station for data gathering with load balancing. A data gathering scheme MADG (movement-assisted data gathering), which makes use of the mobile base station for data collection, is presented. In this scheme, the base station moves in a stationary annular area exploited for data buffering. The gathered data are first forwarded into the buffering area and then collected by the mobile base station. It is theoretically proved that the location √2R/2 away from the center is the optimal location for minimizing the energy consumption for transmitting data, and that there exists a location which minimizes the maximal node load. This paper then derives the optimum location jointly considering energy consumption and load balancing based on the above analyses. Compared with the static base station scheme and the existing mobile base station scheme, MADG reduces the load by over 95% and 80%, respectively.
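One way to read the √2R/2 result (our interpretation, not the paper's proof): a ring of radius r = R/√2 = √2R/2 bisects the area of a disk of radius R, so with uniformly deployed nodes it balances the expected number of nodes inside and outside the buffering ring. A quick numerical check:

    import math, random

    R = 1.0
    r = math.sqrt(2) * R / 2          # the claimed optimal ring radius

    # Exact areas: pi*r^2 inside the ring vs. pi*(R^2 - r^2) outside.
    print(math.pi * r**2, math.pi * (R**2 - r**2))   # equal: pi/2 each

    # Monte Carlo check: a uniform point in the disk has radius R*sqrt(U).
    random.seed(1)
    inside = sum(random.random()**0.5 * R < r for _ in range(100_000))
    print(inside / 100_000)           # ~0.5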
SUN Zhi-Xin , JIANG Ju-Liang , JIAO Lin
2007, 18(9):2245-2258.
Abstract: This paper presents the APA-ANTI-DDoS (aggregate-based protocol analysis anti-DDoS) model to detect and defend against DDoS attacks. The APA-ANTI-DDoS model contains an abnormal traffic aggregation module, a protocol analysis module, and a traffic processing module. The abnormal traffic aggregation module classifies network traffic into normal and abnormal traffic; the protocol analysis module analyzes the potential features of DDoS attack traffic within the abnormal traffic; the traffic processing module filters the abnormal traffic according to the current features of the DDoS attack, and restores non-attack traffic by testing the congestion-control behavior of the traffic. The paper then implements the APA-ANTI-DDoS system. The experimental results show that the APA-ANTI-DDoS model can effectively detect and defend against DDoS attacks and restore non-attack traffic in the case of false positives, thus guaranteeing legitimate communication traffic.
LU Wen-Yan , JIA Wei-Jia , DU Wen-Feng , ZHANG Li-Zhuo
2007, 18(9):2259.
Abstract: At present, the mandatory contention resolution scheme that shall be supported by IEEE 802.16 is based on the truncated binary exponential backoff algorithm. In this paper, the differences in contention mechanism between IEEE 802.16 and IEEE 802.11 are analyzed. Calculations of performance metrics such as the utilization of transmission opportunities u, the delay of bandwidth requests d, and the drop probability of bandwidth requests pd are presented. Based on simulation results, the effects of contention parameters such as the initial window W, the number of subscriber stations n, and the number of transmission opportunities Nto are discussed. Some policies to adjust the parameters are also provided. These policies are useful for the BS to schedule its uplink bandwidth resources.
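For reference, truncated binary exponential backoff itself is easy to state in code. The sketch below is generic, not the paper's simulator: the contention window doubles from the initial window after each collision, is truncated at a maximum window, and the request is dropped after a retry limit. The collision oracle is a caller-supplied placeholder.

    import random

    def backoff_delays(initial_window, max_window, retry_limit, collided):
        """Return the sequence of backoff slots drawn for one bandwidth request,
        or None if the request is dropped after retry_limit attempts.
        collided(attempt) -> True if that attempt collides (assumed oracle)."""
        window = initial_window
        delays = []
        for attempt in range(retry_limit):
            delays.append(random.randrange(window))  # uniform slot in [0, window)
            if not collided(attempt):
                return delays                        # request transmitted
            window = min(window * 2, max_window)     # double, truncated at max
        return None                                  # drop after retry_limit tries

    random.seed(0)
    # Example: first two attempts collide, third succeeds.
    print(backoff_delays(8, 64, 5, lambda a: a < 2))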
YANG Wu , FANG Bin-Xing , YUN Xiao-Chun
2007, 18(9):2271-2282.
Abstract: To perform effective intrusion analysis in higher-bandwidth networks, this paper studies data collection techniques and proposes a scalable efficient intrusion monitoring architecture (SEIMA) for network intrusion detection systems (NIDS). In the SEIMA architecture, scaling network intrusion detection to high network speeds is achieved by using multiple sensors operating in parallel, coupled with a suitable load-balancing traffic splitter. High-performance data transfer is achieved through asynchronous DMA without OS intervention, by using an efficient address translation technique and a buffer management mechanism. A multi-rule packet filter based on finite state machine techniques is implemented at the user layer to eliminate the overhead of processing redundant packets. The simulation and field experiment results indicate that SEIMA is capable of reducing CPU utilization while improving the efficiency of data collection in NIDS, thereby saving system resources for complex data analysis. The SEIMA approach is very practical for network security.
ZHANG Li , QIAN Gong-Bin , XIAO Wei-Wei
2007, 18(9):2283-2294.
Abstract: A rotation-, scaling- and translation-invariant blind second-generation watermarking technique is proposed, which uses the Tchebichef moments of the original image to estimate the geometric distortion parameters of corrupted watermarked images. The Tchebichef moments of the original image can be used as a private key in the watermark extraction process. The characteristics of the human visual system are incorporated into the watermark embedding, and the embedding process can be performed in any image domain, including the spatial and transform domains; a method in the DWT domain is proposed in this paper. Independent component analysis is adopted by the watermark detector so that the watermark can be correctly extracted rather than merely detected. The experimental results show that this method is robust against the attacks produced by the popular watermarking test software Stirmark.
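For readers unfamiliar with Tchebichef (discrete Chebyshev) moments, the sketch below builds the orthonormal Tchebichef basis on N points with the standard three-term recurrence and computes an image's moment matrix by two matrix products. This is a generic moment computation, not the paper's watermarking scheme; the normalization is done numerically, and the test image is random.

    import numpy as np

    def tchebichef_basis(N, order):
        """Orthonormal discrete Chebyshev (Tchebichef) basis, rows 0..order on N points."""
        x = np.arange(N, dtype=float)
        T = np.zeros((order + 1, N))
        T[0] = 1.0
        if order >= 1:
            T[1] = 2 * x - N + 1
        for n in range(1, order):
            # (n+1) t_{n+1} = (2n+1)(2x-N+1) t_n - n (N^2 - n^2) t_{n-1}
            T[n + 1] = ((2 * n + 1) * (2 * x - N + 1) * T[n]
                        - n * (N * N - n * n) * T[n - 1]) / (n + 1)
        return T / np.linalg.norm(T, axis=1, keepdims=True)  # numeric normalization

    N, order = 32, 8
    B = tchebichef_basis(N, order)
    assert np.allclose(B @ B.T, np.eye(order + 1), atol=1e-6)  # orthonormality check

    img = np.random.default_rng(0).random((N, N))
    moments = B @ img @ B.T        # T_pq for p, q = 0..order
    recon = B.T @ moments @ B      # low-order reconstruction of the image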
FANG Cui-Hao , YE Xiu-Zi , PENG Wei , ZHANG Yin
2007, 18(9):2295-2305.
Abstract: In this paper, a multi-level and dynamic security access control (MLDAC) model is proposed for CAD models in collaborative environments. A multi-level privilege model is developed to simplify the process of permission definition and assignment while enriching expressive power and helping to realize multi-grained access control. Permission dependency relations and permission state transitions are introduced into MLDAC for dynamic authorization management, based on basic workflow theory. Practice shows that the MLDAC model is more efficient in controlling collaborative operations; it suits the characteristics of design tasks, i.e., being divisible, dependent, and interactive.
LI Yun-Xi , FENG Jie-Qing , JIN Xiao-Gang
2007, 18(9):2306-2317.
Abstract: An algebraic B-spline curve fitting algorithm based on the signed distance field is proposed in this paper. Given a planar point set, the moving least squares (MLS) method is adopted to denoise and resample it, so that the resulting point set has low noise and uniform sampling density. Then the reliable signed distance field of the preprocessed point set is constructed by using the level set method. Finally, an algebraic B-spline function is adopted to fit the signed distance field by solving a linear equation system. As a result, an algebraic curve is obtained as the zero level set of the algebraic function. Using the proposed method, not only is a high-quality curve obtained, but also geometric information around the curve; furthermore, the unwanted branches common in implicit curve fitting can be avoided.
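The paper's pipeline (MLS resampling, level-set SDF construction, fitting) is more involved; the fragment below sketches only the final linear-system step under simplifying assumptions: an analytic circle signed distance field stands in for the level-set field, and a clamped uniform cubic tensor-product B-spline function is fitted by linear least squares (assuming SciPy >= 1.8 for BSpline.design_matrix). The zero level set of the fitted function approximates the circle.

    import numpy as np
    from scipy.interpolate import BSpline

    # Sample a signed distance field on a grid (here: exact SDF of a circle, r = 0.6).
    n = 41
    xs = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(xs, xs, indexing='ij')
    D = np.hypot(X, Y) - 0.6

    # Clamped uniform cubic knot vector on [-1, 1] with m basis functions per axis.
    k, m = 3, 10
    t = np.r_[[-1.0] * k, np.linspace(-1, 1, m - k + 1), [1.0] * k]

    Bx = BSpline.design_matrix(xs, t, k).toarray()    # n x m collocation matrix
    # Tensor-product least squares: find coefficients C with Bx C Bx^T ~ D.
    A = np.kron(Bx, Bx)                               # (n*n) x (m*m)
    C = np.linalg.lstsq(A, D.ravel(), rcond=None)[0].reshape(m, m)

    # f(x, y) = sum_ij C_ij N_i(x) N_j(y); its zero level set approximates the circle.
    f = Bx @ C @ Bx.T
    print(np.abs(f - D).max())                        # fitting residual on the grid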
HE Xiao-Guang , TIAN Jie , WU Li-Fang , ZHANG Yao-Yao , YANG Xin
2007, 18(9):2318-2325.
Abstract: Face recognition under complex illumination conditions is still an open problem. To cope with it, this paper proposes an effective illumination normalization method. The proposed method employs morphology and quotient image techniques based on an analysis of face illumination, and it is enhanced with a dynamic lighting estimation technique to strengthen illumination compensation and feature enhancement. Compared with traditional approaches, this method needs no training data and no assumptions about the lighting conditions; moreover, enrollment requires only one image per subject. The proposed method is evaluated on the Yale Face Database B and achieves a very competitive recognition rate with low computational cost.
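To make the morphology-plus-quotient idea concrete, here is a generic sketch of our own, not the paper's exact method (which additionally performs dynamic lighting estimation): a large morphological closing produces a smooth illumination estimate, and dividing by it yields a quotient image with the low-frequency lighting largely removed. The window size and epsilon are assumptions.

    import numpy as np
    from scipy.ndimage import grey_closing

    def normalize_illumination(img, win=15, eps=1e-3):
        """Quotient-style normalization: divide the face image by a smooth
        illumination estimate obtained with a large morphological closing."""
        img = img.astype(float)
        illum = grey_closing(img, size=(win, win))   # coarse illumination field
        return img / (illum + eps)                   # quotient image

    # Example on synthetic data: a random "face" under a horizontal lighting ramp.
    rng = np.random.default_rng(0)
    face = rng.random((64, 64))
    ramp = np.linspace(0.2, 1.0, 64)[None, :]
    normalized = normalize_illumination(face * ramp)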
CAO Juan , CHEN Wen-Yu , WANG Guo-Zhao
2007, 18(9):2326-2335.
Abstract: Based on the de Casteljau algorithm for triangular patches, and using some existing identities and elementary inequalities, this paper presents two kinds of new magnitude upper bounds on the lower-order derivatives of rational triangular Bézier surfaces. The first kind, which is obtained by exploiting the diameter of the convex hull of the control net, is always stronger than the known bound in the case of the first derivative. For the second derivative, the first kind is an improvement on the existing bound when the ratio of the maximum weight to the minimum weight is greater than 2; the second kind is characterized by being expressed in terms of the maximum distance between adjacent control points.
XIAO Chun-Xia , FENG Jie-Qing , ZHOU Ting-Fang , ZHENG Wen-Ting , PENG Qun-Sheng
2007, 18(9):2336-2345.
Abstract: This paper proposes a technique for shape editing of point-based geometry based on geometric detail mapping. Geometric detail is an important attribute of a surface. It is defined in this paper as the difference between the original point-sampled surface and its base surface, which is constructed as a multilevel B-spline approximation. The geometric detail is represented as a set of local affine coordinates over the base surface and is extracted effectively and efficiently in a user-specified frequency band of the geometric signal. It can be mapped onto any arbitrary point set surface, including the model's own deformed base surface. Versatile editing tools are developed for point-sampled geometry that naturally preserve the geometric detail. Experimental results demonstrate the high potential of the approach for point-sampled geometry modeling.
LIU Li , ZHANG Cai-Ming , YANG Xing-Qiang , BO Peng-Bo
2007, 18(9):2346-2355.
Abstract: This paper proposes a ternary stationary subdivision scheme for quadrilateral meshes. For regular and irregular quadrilateral meshes, different subdivision masks are applied to generate new vertices. The number of faces in the refined mesh is about nine times that of the coarse mesh after every subdivision step. The limit surface generated by the new method is C2 continuous for a regular mesh and C1 continuous for an irregular mesh. Compared with typical subdivision schemes, the proposed scheme has a faster convergence speed and can handle quadrilateral meshes of arbitrary topology. Some examples are given at the end to illustrate the performance of the new subdivision scheme.
GAO Quan-Sheng , HONG Bing-Rong
2007, 18(9):2356-2364.
Abstract: This paper presents an approach for creating lifelike, controllable motion in interactive virtual environments. This is achieved by learning a statistical model from a set of motion capture sequences. The method is based on clustering the motion data into motion primitives that capture local dynamic characteristics (dynamic textures), modeling the dynamics in each cluster using a linear dynamic system (LDS), annotating those LDSs that have clear meanings, and calculating the cross-entropy between frames of the LDSs to construct a directed graph called an annotated dynamic texture graph (ADTG), which has a two-level structure. The lower level retains the detail and nuance of the live motion, while the higher level generalizes the motion and encapsulates the connections among the LDSs. The results show that this framework can generate smooth, natural-looking motion in interactive environments.