• Volume 23, Issue 5, 2012 Table of Contents
    • Efficient Algorithm Based on Action Space for Solving the 2D Rectangular Packing Problem

      2012, 23(5):1037-1044. DOI: 10.3724/SP.J.1001.2012.03986

      Abstract: This paper addresses the two-dimensional (2D) rectangular packing problem, a typical NP-hard problem. The study improves a quasi-human approach, the caving degree algorithm proposed by Huang Wen-Qi et al., by defining the concept of action space, which simplifies the calculation of the caving degree. The evaluation of candidate placements is thus reduced considerably, and good layouts can be obtained quickly. The experiments tested 21 well-known instances of the 2D rectangular packing problem provided by Hopper and Turton. The improved algorithm achieved an optimal layout with a space utilization of 100% for each instance, and the average computing time on a personal computer was within seven minutes. Computational results show that the improvement strategies for the quasi-human caving degree approach are evident and effective.
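
      A minimal sketch of the placement loop that such methods refine, assuming a plain bottom-left greedy rule rather than the paper's caving-degree/action-space evaluation; the candidate-corner bookkeeping and all names are ours:

```python
# Illustrative bottom-left greedy packer for the 2D rectangular packing
# problem. NOT the paper's caving-degree/action-space algorithm; it only
# shows the candidate-placement evaluation loop such methods speed up.

def overlaps(r, placed):
    x, y, w, h = r
    return any(x < px + pw and px < x + w and y < py + ph and py < y + h
               for px, py, pw, ph in placed)

def pack(rects, W, H):
    """rects: list of (w, h); returns placements (x, y, w, h) or None if stuck."""
    placed = []
    candidates = [(0, 0)]                       # candidate bottom-left corners
    for w, h in sorted(rects, key=lambda r: -r[0] * r[1]):  # big pieces first
        best = None
        for (x, y) in sorted(candidates, key=lambda p: (p[1], p[0])):
            if x + w <= W and y + h <= H and not overlaps((x, y, w, h), placed):
                best = (x, y, w, h)
                break
        if best is None:
            return None                         # greedy failed; real solvers search further
        placed.append(best)
        x, y, _, _ = best
        candidates += [(x + w, y), (x, y + h)]  # new corners exposed by this piece
    return placed

print(pack([(2, 3), (2, 2), (1, 1)], 4, 4))
```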

    • Termination Analysis of Nonlinear Loops

      2012, 23(5):1045-1052. DOI: 10.3724/SP.J.1001.2012.03982

      Abstract: The termination of linear loops has been studied extensively, but few results exist on the termination of nonlinear loops. Using the fixed point theorem, this paper presents an approach to deciding the termination of a class of nonlinear loops. In addition, for a particular class of loops, the paper gives a condition under which the termination of the loops is decidable.
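
      As a toy numeric illustration of the fixed-point idea (our sketch, not the paper's decision procedure): for a loop "while g(x): x = f(x)", if the iterates of f converge to a fixed point at which the guard g fails, the loop terminates from nearby starting values.

```python
# Toy illustration of the fixed-point view of a nonlinear loop
# "while g(x): x = f(x)". A numeric heuristic only.

def fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx - x) < tol:
            return fx
        x = fx
    return None  # no convergence observed

f = lambda x: 0.5 * x * x      # nonlinear loop body
g = lambda x: x > 0.1          # loop guard
xs = fixed_point(f, 0.9)
print("fixed point:", xs, "guard holds there:", g(xs))  # guard fails -> loop halts
```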

    • Parallel Computing Method of Canonical Correlation Analysis for High-Dimensional Data Streams in Irregular Streams

      2012, 23(5):1053-1072. DOI: 10.3724/SP.J.1001.2012.04008

      Abstract: To satisfy real-time requirements under resource constraints, this paper presents a GPU (graphics processing unit)-based processing architecture model and a parallel algorithm for high-dimensional data streams over irregular streams. The six-layer model combines the GPU's high-bandwidth data processing with data stream analysis in a sliding window. Canonical correlation analysis is then carried out between two high-dimensional data streams, via a data cube pattern and a dimensionality-reduction method, in a framework based on CUDA (compute unified device architecture). Theoretical analysis and experimental results show that the parallel processing method can detect correlations between high-dimensional data streams online and accurately in the synchronous sliding-window mode. Compared with a pure CPU method, this technique has a significant speed advantage and meets the real-time requirements of high-dimensional data streams well. It provides a general strategy for the field of data stream mining.
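
      The per-window computation, shown here as a plain CPU/NumPy sketch (the paper's contribution is the six-layer GPU/CUDA parallelization of this step; the regularization and window size are illustrative):

```python
# Sliding-window canonical correlation between two streams, CPU/NumPy only.
import numpy as np

def cca_max_corr(X, Y, eps=1e-8):
    """First canonical correlation between windows X (n x p) and Y (n x q)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx = Xc.T @ Xc + eps * np.eye(X.shape[1])   # regularized covariances
    Syy = Yc.T @ Yc + eps * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc
    Wx = np.linalg.inv(np.linalg.cholesky(Sxx))  # whitening transforms
    Wy = np.linalg.inv(np.linalg.cholesky(Syy))
    # Canonical correlations = singular values of the whitened cross-covariance.
    return np.linalg.svd(Wx @ Sxy @ Wy.T, compute_uv=False)[0]

rng = np.random.default_rng(0)
stream_x = rng.normal(size=(1000, 8))
stream_y = stream_x @ rng.normal(size=(8, 5)) + 0.1 * rng.normal(size=(1000, 5))
window = 200
for start in range(0, 1000 - window + 1, window):  # synchronous sliding windows
    r = cca_max_corr(stream_x[start:start + window], stream_y[start:start + window])
    print(f"window at {start}: max canonical correlation = {r:.3f}")
```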

    • Approximation Algorithms for a Two-Stage Hybrid Flow Shop

      2012, 23(5):1073-1084. DOI: 10.3724/SP.J.1001.2012.00587

      Abstract: This paper investigates a variant of the makespan-minimization scheduling problem in a two-machine flow shop. In this variant, each job consists of two tasks. The first task can be processed on either machine, and the second task can only be processed on the second machine after the first task has been finished. If the second task must start immediately after the first task completes, the case is called no-wait and is denoted by NSHFS; if the second task may be processed at any time after the first task completes, the problem is denoted by SHFS. For SHFS, based on the result of Wei and He, an improved polynomial-time approximation algorithm with a worst-case ratio of 8/5 is presented. For NSHFS, this paper shows that the problem is NP-hard and presents a polynomial-time approximation algorithm with a worst-case ratio of 5/3.
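
      A simple greedy baseline for the SHFS variant, to make the job structure concrete; this is an illustrative heuristic of ours, not the paper's 8/5- or 5/3-approximation algorithm:

```python
# Greedy baseline for SHFS: each job has a first task that may run on either
# machine and a second task that must run on machine 2 after the first ends.

def shfs_greedy(jobs):
    """jobs: list of (a, b) processing times; returns a makespan estimate."""
    t1 = t2 = 0.0                  # next free time on machines 1 and 2
    done_a = []                    # (completion time of first task, b-duration)
    for a, b in sorted(jobs, key=lambda j: -j[0]):   # long first tasks first
        if t1 <= t2:               # first task goes to the less loaded machine
            t1 += a
            done_a.append((t1, b))
        else:
            t2 += a
            done_a.append((t2, b))
    # Second tasks run on machine 2 in order of first-task completion.
    for ca, b in sorted(done_a):
        t2 = max(t2, ca) + b
    return t2

print(shfs_greedy([(3, 2), (2, 2), (4, 1)]))
```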

    • Pronoun Resolution in English and Chinese Languages Based on Tree Kernel

      2012, 23(5):1085-1099. DOI: 10.3724/SP.J.1001.2012.04044

      Abstract: This paper proposes a tree kernel method for anaphora resolution of pronouns in both English and Chinese. First, several basic structured tree spans are proposed according to linguistic intuition, and the similarity between two structured objects is computed directly using SVMLight. Then, a dynamic-expansion scheme is proposed to automatically determine a proper tree span for pronoun resolution, drawing on centering theory, antecedent-competitor information, and semantic role information. Evaluation on both the ACE 2004 English NWIRE corpus and the ACE 2005 Chinese NWIRE corpus justifies the effectiveness of this method.
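
      A tiny convolution-style tree kernel, in the spirit of subtree-counting kernels, to show what "similarity between two structured objects" means here; the production check is simplified to label-and-arity, and the decay factor is illustrative:

```python
# Simplified convolution tree kernel: counts common subtrees of two parse
# trees, with a decay factor lam. Illustrative only -- the paper plugs such
# kernels into SVMLight over dynamically expanded tree spans.

class Node:
    def __init__(self, label, children=()):
        self.label, self.children = label, list(children)

def tree_kernel(n1, n2, lam=0.5):
    """Decayed count of common subtrees rooted at n1 and n2."""
    if n1.label != n2.label or len(n1.children) != len(n2.children):
        return 0.0
    if not n1.children:                        # matching leaves
        return lam
    prod = 1.0
    for c1, c2 in zip(n1.children, n2.children):
        prod *= 1.0 + tree_kernel(c1, c2, lam)
    return lam * prod

t1 = Node("NP", [Node("DT", [Node("the")]), Node("NN", [Node("man")])])
t2 = Node("NP", [Node("DT", [Node("the")]), Node("NN", [Node("dog")])])
print(tree_kernel(t1, t2))
```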

    • Descending Kernel Track of Static and Dynamic Topic Models in Topic Tracking

      2012, 23(5):1100-1119. DOI: 10.3724/SP.J.1001.2012.04045

      Abstract: Topic tracking is the task of identifying, mining, and organizing information relevant to news topics. Its key issue is to establish statistical models adapted to each kind of news topic, which involves two aspects: topical structure and topic evolution. This paper compares and analyzes the features of the three main kinds of topic models: bag of words, hierarchical tree, and chain. The different behaviors of static and dynamic topic models are discussed in depth, and a term-overlap-rate-based evaluation method, called descending kernel track, is proposed to evaluate how well static and dynamic topic models track the trend of topic development. On this basis, the paper proposes two methods, burst-based incremental learning and a temporal event chain, to improve how well dynamic topic models capture topic kernels. Experiments adopt the international standard corpus TDT4 and the minimum detection-error tradeoff evaluation method proposed by NIST (National Institute of Standards and Technology), along with the descending kernel track method, to evaluate the main topic models. The results show that structural dynamic models have the best tracking performance, and that the burst-based incremental learning algorithm and the temporal event chain achieve improvements of 0.4% and 3.3%, respectively.
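
      A sketch of the term-overlap idea behind descending kernel track, assuming a topic kernel is its top-k weighted terms (the paper's exact formulation may differ; the snapshots below are made up):

```python
# How much of the initial topic kernel (top terms) survives in later model
# snapshots? A falling overlap rate indicates a drifting ("descending") kernel.

def kernel_terms(term_weights, k=3):
    return {t for t, _ in sorted(term_weights.items(), key=lambda kv: -kv[1])[:k]}

def overlap_rate(kernel0, kernel_t):
    return len(kernel0 & kernel_t) / len(kernel0)

snapshots = [
    {"quake": 9, "rescue": 8, "aid": 6, "army": 3},      # day 0
    {"rescue": 9, "aid": 8, "donate": 6, "quake": 2},    # day 3
    {"donate": 9, "rebuild": 8, "aid": 5, "school": 3},  # day 7
]
k0 = kernel_terms(snapshots[0])
for day, snap in zip((0, 3, 7), snapshots):
    print(f"day {day}: overlap with initial kernel = "
          f"{overlap_rate(k0, kernel_terms(snap)):.2f}")
```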

    • Head-Modifier Dependency Reordering Model for Phrase-Based SMT

      2012, 23(5):1120-1131. DOI: 10.3724/SP.J.1001.2012.04055

      Abstract: To enhance the reordering capacity of phrase-based SMT (statistical machine translation), this study leverages the head-modifier dependency structure on the source side to model reordering. The model is added to the baseline as a soft constraint. The proposed model makes use of the constituent parse tree by mapping it into sets of head-modifier relationships. Experimental results show that this model significantly improves local reordering.

    • Selective Boosting Algorithm for Maximizing the Soft Margin

      2012, 23(5):1132-1147. DOI: 10.3724/SP.J.1001.2012.04064

      Abstract: Research on traditional boosting algorithms mainly focuses on maximizing the hard or soft margin of the convex combination of weak hypotheses. The weak learners are often all used in the combination even when some of them are strongly correlated, which increases the training and test time of the hypotheses. To reduce this redundancy among base hypotheses, this paper presents a selective boosting algorithm called SelectedBoost, based on LPBoost, for classifying binary-labeled samples. The main idea of the algorithm is to discard as many hypotheses as possible according to their relevance and diversity. Furthermore, the paper introduces an edge constraint for every strong hypothesis to speed up convergence when maximizing the soft margin of the combination of weak hypotheses. The experimental results show that this algorithm achieves both better performance and lower generalization error than several representative boosting algorithms.
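
      A toy version of the relevance/diversity selection step, not the paper's LP formulation: keep a weak hypothesis only if its weighted edge is high and it is not too correlated with those already kept.

```python
# Greedy relevance/diversity filter over weak hypotheses (illustrative only).
import numpy as np

def select_hypotheses(H, y, w, max_corr=0.9):
    """H: (n_hyp, n_samples) predictions in {-1,+1}; y: labels; w: weights."""
    edges = H @ (w * y)                      # weighted edge of each hypothesis
    kept = []
    for i in np.argsort(-edges):             # most accurate first
        if all(abs(np.corrcoef(H[i], H[j])[0, 1]) < max_corr for j in kept):
            kept.append(int(i))               # diverse enough -> keep it
    return kept

rng = np.random.default_rng(1)
y = rng.choice([-1, 1], size=50)
H = np.vstack([y * rng.choice([1, 1, 1, -1], size=50) for _ in range(8)])
w = np.ones(50) / 50
print("kept hypotheses:", select_hypotheses(H, y, w))
```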

    • Review Articles
    • Research on Cloud Databases

      2012, 23(5):1148-1166. DOI: 10.3724/SP.J.1001.2012.04195

      Abstract: With the recent development of cloud computing, the importance of cloud databases has been widely acknowledged. This paper first discusses the features, influence, and related products of cloud databases. Research issues of cloud databases are then presented in detail, including the data model, architecture, consistency, programming model, data security, performance optimization, and benchmarking. Finally, some future trends in this area are discussed.

    • Query Answer over Inconsistent Database with Credible Annotations

      2012, 23(5):1167-1182. DOI: 10.3724/SP.J.1001.2012.04079

      Abstract: Inconsistent data is confusing and conflicting, so computing credible query answers over such data is significant. However, previous related work loses information. The approach of annotation-based query answering (AQA) introduces confidence annotations to distinguish consistent from inconsistent attribute values, so that credible query answers can be computed without information loss. AQA was limited, however, to functional dependencies. This paper extends the approach to applications where multiple kinds of constraints are involved and no attribute is definitely credible. The paper redefines the representation model and query algebra, discusses rules for calculating the valid implied constraints of the above types on the result of any query-algebra query, proposes a cost-based heuristic repair algorithm, and annotates the initial database. The experiments show that the time performance of extended AQA is almost the same as that of SQL for queries without joins, and close to SQL for join queries after optimization, while losing no information.

    • Data Stream Prediction Based on Episode Rule Matching

      2012, 23(5):1183-1194. DOI: 10.3724/SP.J.1001.2012.04094

      Abstract: This paper proposes an algorithm called Predictor, which maintains one automaton per matched episode rule of general form. Aiming to find the latest minimal and non-overlapping occurrences of all antecedents, Predictor simultaneously tracks the state transitions of each automaton in a single scan of the data stream, which not only maps the unbounded streaming data into a finite state space but also avoids over-matching episode rules. In addition, the results of Predictor contain the occurrence intervals and occurrence probabilities of future episodes. Theoretical analysis and experimental evaluation demonstrate that Predictor has high prediction efficiency and prediction precision.
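
      A minimal automaton-style matcher for one episode rule's antecedent, to show the single-scan state tracking; Predictor additionally enforces latest minimal, non-overlapping occurrences and outputs prediction intervals and probabilities:

```python
# One automaton tracking an episode rule's antecedent (an ordered event
# sequence) in a single pass over the stream. Illustrative sketch only.

def match_stream(antecedent, stream):
    """Yield (start_time, end_time) each time `antecedent` completes."""
    state, start = 0, None
    for t, event in enumerate(stream):
        if event == antecedent[state]:
            if state == 0:
                start = t
            state += 1
            if state == len(antecedent):   # antecedent complete -> rule fires
                yield (start, t)
                state, start = 0, None     # reset: non-overlapping matches

stream = list("xaxbyacab")
for occ in match_stream("ab", stream):
    print("antecedent 'ab' occurred at", occ)
```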

    • Rare Category Detection Algorithm Based on Weighted Boundary Degree

      2012, 23(5):1195-1206. DOI: 10.3724/SP.J.1001.2012.04104

      Abstract: This paper proposes an efficient algorithm named CATION (rare category detection algorithm based on weighted boundary degree) for rare category detection. By employing a rare-category criterion known as the weighted boundary degree (WBD), the algorithm uses reverse k-nearest neighbors to help find the boundary points of rare categories and selects the boundary points with maximum WBDs for labeling. Extensive experimental results demonstrate that this algorithm avoids the limitations of existing approaches, discovers new categories in data sets significantly more efficiently, and effectively reduces runtime compared with existing approaches.
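
      A simplified sketch of the reverse k-NN ingredient: points with many reverse neighbors and a tight neighborhood are scored as candidate rare-category boundary points. The scoring below is our simplification, not the paper's WBD:

```python
# Reverse k-NN scoring for rare-category candidates (illustrative only).
import numpy as np

def rare_scores(X, k=5):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    knn = np.argsort(d, axis=1)[:, :k]           # k nearest neighbors per point
    counts = np.zeros(len(X))
    for i in range(len(X)):
        counts[knn[i]] += 1                      # i votes for its neighbors
    knn_dist = np.take_along_axis(d, knn, axis=1).mean(axis=1)
    return counts / knn_dist   # many reverse neighbors + tight neighborhood

rng = np.random.default_rng(0)
majority = rng.normal(0.0, 1.0, size=(200, 2))
rare = rng.normal(3.0, 0.05, size=(10, 2))       # small, tight cluster
X = np.vstack([majority, rare])
scores = rare_scores(X)
print("top-10 candidates:", np.argsort(-scores)[:10])  # indices >= 200 are rare
```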

    • Jamming Attacks in Wireless Networks

      2012, 23(5):1207-1221. DOI: 10.3724/SP.J.1001.2012.04059

      Abstract: The broadcast nature of wireless networks over a shared medium makes them particularly vulnerable to jamming-style denial-of-service attacks, which are a great threat to network performance and security. In this paper, the state-of-the-art jamming metrics and jamming models are first analyzed, and the recent literature on jamming attacks is surveyed in detail in terms of jamming detection, defense, and jammer localization. Finally, possible future trends and key points on this issue are discussed.

    • Energy-Balanced Unequal Clustering Routing Protocol for Wireless Sensor Networks

      2012, 23(5):1222-1232. DOI: 10.3724/SP.J.1001.2012.04061

      Abstract: A distributed energy-balanced unequal clustering routing protocol (DEBUC) is proposed and evaluated in this paper, which adopts an unequal clustering mechanism combined with inter-cluster multihop routing. Through a time-based competitive clustering algorithm, DEBUC partitions all nodes into clusters of unequal size, in which the clusters closer to the base station are smaller. The cluster heads of these clusters can thus preserve more energy for inter-cluster relay traffic, and the "hot-spots" problem is avoided. For inter-cluster communication, DEBUC adopts energy-aware multihop routing to reduce and balance the energy consumption of the cluster heads. Simulation results demonstrate that the protocol can efficiently slow the rate at which nodes die, balance the energy dissipation of all nodes, and prolong the network lifetime.
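
      The unequal-clustering idea in one formula: a candidate cluster head's competition radius shrinks as it gets closer to the base station, so clusters near the sink stay small and their heads keep energy for relaying. The radius form and coefficient below are illustrative, in the style of such protocols, not necessarily DEBUC's exact ones:

```python
# Illustrative competition-radius rule for unequal clustering.

def competition_radius(d_to_bs, d_min, d_max, r_max=80.0, c=0.5):
    """Radius in meters for a node at distance d_to_bs from the base station."""
    frac = (d_max - d_to_bs) / (d_max - d_min)   # 1 near the sink, 0 far away
    return (1.0 - c * frac) * r_max              # near the sink -> smaller cluster

for d in (60, 120, 180, 240):
    print(f"distance {d} m -> competition radius "
          f"{competition_radius(d, 50, 250):.1f} m")
```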

    • Analysis of Scheduling Performance in Wireless Networks with Successive Interference Cancellation

      2012, 23(5):1233-1247. DOI: 10.3724/SP.J.1001.2012.04062

      Abstract:This paper focuses on link scheduling in a wireless network with successive interference cancellation (SIC), and proposes a multi-level protocol model and an order-aware physical model to characterize the impact of SIC. As link scheduling in a wireless network with SIC is NP-hard, the study resorts to an approximate solution: (1) under the order-aware physical model, the study presents a scheduling scheme such that the approximation ratio is O(g), where g is the link diversity factor; (2) under the multi-level protocol model, the study presents an efficient scheduling scheme such that the approximation ratio is a constant. Finally, this study uses extensive simulations to investigate the impact of SIC on the scheduling performance in practice.

    • Image Interpolation Algorithm Based on Implicit Piecewise Autoregressive Model

      2012, 23(5):1248-1259. DOI: 10.3724/SP.J.1001.2012.04049

      Abstract: The piecewise statistical stationarity of natural image signals provides an effective way to model them. However, a statistically stationary region usually has an irregular profile, so large mismatch errors arise when a regular-shaped window is used to estimate the statistics in such regions. This paper proposes an implicit piecewise autoregressive model based on a probabilistic description of the statistically stationary region, and presents an improved image interpolation algorithm built on this model. The experimental results show that the proposed interpolation method greatly alleviates the blurring, ringing, and noise artifacts around edges in the interpolated image.

    • Direct Anonymous Attestation Scheme in Cross Trusted Domain for Wireless Mobile Networks

      2012, 23(5):1260-1271. DOI: 10.3724/SP.J.1001.2012.04052

      Abstract: Based on the delegation of trust relationships, a cross-domain direct anonymous attestation scheme for wireless mobile networks is proposed. A proxy signature is used for delegation among domains, and the direct anonymous attestation (DAA) method is used for mobile terminal authentication when a terminal roams to another domain. The remote attestation system is security-enhanced by a key agreement. The authentication protocol is analyzed in the Canetti-Krawczyk (CK) model, and the results show that the protocol is secure. Further analysis shows that this proposal can resist replay attacks and platform masquerade attacks; the scheme is effective and suitable for mobile trusted computing platforms.

    • Http-Flood DDoS Detection Scheme Based on Large Deviation and Performance Analysis

      2012, 23(5):1272-1280. DOI: 10.3724/SP.J.1001.2012.04068

      Abstract: This paper focuses on the Http-Flood DDoS (distributed denial of service) attack and proposes a detection scheme based on a large deviation statistical model. The scheme characterizes a user's access behavior by the Web pages accessed and adopts the method of types to quantize this behavior. Based on this quantization, the study uses the large deviation statistical model to analyze how far an ongoing user's empirical access behavior deviates from the website's prior one, and detects Http-Flood DDoS by the large deviation probability. The paper also provides a preliminary simulation of the scheme's efficiency: the large deviation probability of most normal Web surfers is larger than 10^-36, while the attacker's is smaller than 10^-40, so the scheme is promising for detecting Http-Flood DDoS. Specifically, the scheme achieves a 0.6% false positive rate and a 97.5% true positive rate with a detection threshold of 10^-60, and it outperforms existing detection methods. In particular, it improves the true positive ratio by 0.6% over the transition-probability-based detection scheme while keeping the false positive rate below 5%.
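
      A method-of-types sketch of the detection principle: the probability of observing an empirical access distribution Q under the site-wide prior P decays like exp(-n * KL(Q||P)), so single-URL flooders get vanishing probabilities while ordinary surfers do not. The page set and access counts below are made up:

```python
# Large-deviation (method of types) estimate of how likely a user's empirical
# page-access distribution is under the site-wide prior.
import math
from collections import Counter

def kl(q, p):
    return sum(qi * math.log(qi / p[page]) for page, qi in q.items() if qi > 0)

def deviation_probability(accesses, prior):
    n = len(accesses)
    q = {page: c / n for page, c in Counter(accesses).items()}
    return math.exp(-n * kl(q, prior))      # ~ exp(-n * KL(Q || P))

prior = {"/": 0.5, "/news": 0.3, "/login": 0.2}
surfer = ["/"] * 10 + ["/news"] * 6 + ["/login"] * 4
bot = ["/login"] * 20                       # hammers a single URL
print("surfer:", deviation_probability(surfer, prior))  # close to 1
print("bot:   ", deviation_probability(bot, prior))     # vanishingly small
```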

    • Autofocusing Method for Microscopy with Low Image Content Density

      2012, 23(5):1281-1294. DOI: 10.3724/SP.J.1001.2012.04099

      Abstract: Autofocusing is one of the key issues in automatic microscopy. Traditional gradient-based autofocusing algorithms may fail to find the optimal focal plane when the image content density is low, because the slope of the focus measure varies little for low-content-density images and the global maximum may be drowned in noise. This paper proposes a content-importance-factor-based focus measure for guiding the automatic search for the optimal focal plane under low image content density. The proposed method classifies pixels into three types (content pixels, debris pixels, and background pixels) according to the relative variation in gradient magnitude between the current image and a reference image captured at a different z-axis position of the same scene, and adaptively assigns pixels different weights in the focus measure computation based on the image content. In this way, the contribution of content pixels is emphasized while that of debris and background pixels is suppressed, and the steepness of the focus curve around the optimal point is improved. The experimental results show that the proposed method far outperforms traditional methods: its autofocusing success rate exceeds 90% under low image content density, while a traditional method achieves a success rate of only 24%.
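
      A sketch of a content-importance-weighted focus measure in the spirit of the paper: pixels are classified by the relative change of gradient magnitude against a reference frame and weighted accordingly. The thresholds and weights are placeholders of ours:

```python
# Content-weighted focus measure: emphasize pixels whose gradient grows
# relative to a reference frame (content), suppress the rest (debris/background).
import numpy as np

def grad_mag(img):
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def weighted_focus_measure(img, ref, t_content=1.5, t_debris=0.5):
    g, g_ref = grad_mag(img), grad_mag(ref)
    ratio = (g + 1e-6) / (g_ref + 1e-6)           # relative gradient variation
    w = np.where(ratio > t_content, 1.0,           # content: full weight
        np.where(ratio < t_debris, 0.0, 0.2))      # debris/background: suppressed
    return float((w * g ** 2).sum())

rng = np.random.default_rng(0)
ref = rng.normal(0, 0.05, size=(64, 64))           # defocused reference frame
img = ref.copy()
img[20:40, 20:40] += np.add.outer(np.arange(20), np.arange(20)) * 0.05  # in-focus patch
print(f"focus score: {weighted_focus_measure(img, ref):.2f}")
```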

    • Image Retrieval Approach Based on Sparse Canonical Correlation Analysis

      2012, 23(5):1295-1304. DOI: 10.3724/SP.J.1001.2012.04032

      Abstract: A key issue in semantic-based image retrieval is how to bridge the semantic gap between low-level image features and high-level semantics, which can be expressed effectively by free text. The cross-modal relationship between text and image is studied by modeling the semantic correlation between them, and based on this model an image retrieval approach is proposed in which images are retrieved according to the meaning of the query text rather than query keywords. First, an algorithm for solving sparse canonical correlation analysis (CCA) is designed. Then a semantic space is learned by latent semantic analysis from a text corpus, and images are represented by bags of visual words. After that, a semantic correlation space can be constructed that makes explicit the mapping between the visual words of an image and high-level semantics. The proposed method solves CCA in a sparse framework to make the result more interpretable and stable. The experimental results demonstrate that sparse CCA outperforms CCA in this context and substantiate the feasibility of the proposed approach to image retrieval.

    • Global Anisotropic Quad-Dominant Remeshing

      2012, 23(5):1305-1314. DOI: 10.3724/SP.J.1001.2012.03988

      Abstract: This paper proposes an anisotropic quad-dominant remeshing algorithm suitable for meshes of arbitrary topology, addressing the challenging problem of obtaining an anisotropic quad-dominant mesh. The method samples the surface geometry with dense principal curvature lines and sorts the curvature lines by the variations of surface normal and volume associated with them. An anisotropic sampling of curvature lines is then obtained by a prioritized elimination scheme over the curvature lines. The strategy is simple and straightforward to implement, and it is flexible enough to produce anisotropic quad-dominant meshes ranging from dense to coarse. The resulting meshes exhibit better anisotropic distribution than comparable methods while maintaining high geometric fidelity.

    • Super-Resolution Based on Sparse Dictionary Coding

      2012, 23(5):1315-1324. DOI: 10.3724/SP.J.1001.2012.03989

      Abstract: Learning-based super-resolution methods usually select several objects with features similar to the low-resolution image from a set of examples and then estimate the super-resolution result using an optimization algorithm. However, the result is limited by the quality of the matching objects, and because only the geometric structure of the images is used as the matching feature, matching accuracy is relatively low. This paper presents a sparse dictionary model for image super-resolution that unifies the feature patches of high-resolution (HR) and low-resolution (LR) images for sparse coding. To break through the aforementioned limitations, the method builds a sparse association between HR and LR images and realizes matching and optimization simultaneously. The study uses an MCA method to improve the accuracy of feature extraction and carries out super-resolution reconstruction and denoising simultaneously. The sparse K-SVD algorithm is adopted as the optimization method to reduce the computation time of sparse coding. Experiments with real images show that this method outperforms other learning-based super-resolution algorithms.

    • Image Editing Algorithms Based on Adaptive Mesh Deformation

      2012, 23(5):1325-1334. DOI: 10.3724/SP.J.1001.2012.03998

      Abstract: This paper presents a generic framework for manipulating images based on adaptive mesh deformation, which allows users to move, scale, rotate, and deform salient objects in an image and to retarget the image into regions with arbitrary boundary shapes. The image is embedded into a triangular mesh, and the manipulation is formulated as a quadratic energy minimization problem. The triangles corresponding to the salient objects are constrained by specific transformations according to the deformation types of the objects, such as translation, scaling, or rotation. The solution to the energy optimization can be obtained by solving one or several sparse linear systems. Experimental results show that the algorithm is effective, robust, and efficient, and can be integrated into other image processing tools.
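
      The "quadratic energy, solved as a sparse linear system" step in a 1D analogue: Laplacian terms preserve local shape while soft positional constraints drag selected vertices, solved in least squares. The paper does the same on a 2D triangle mesh with per-triangle transform constraints; weights and sizes here are illustrative:

```python
# 1D Laplacian editing with soft positional constraints, least-squares solve.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 10
x0 = np.linspace(0.0, 1.0, n)                # rest positions
# Shape-preservation rows: second differences (discrete Laplacian).
L = sp.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
# Constraint rows (weight 10): pin both endpoints, drag the midpoint to 0.8.
C = sp.csr_matrix(([10.0, 10.0, 10.0], ([0, 1, 2], [0, n - 1, n // 2])),
                  shape=(3, n))
A = sp.vstack([L, C]).tocsr()
b = np.concatenate([L @ x0, 10.0 * np.array([0.0, 1.0, 0.8])])
x = spla.lsqr(A, b)[0]                        # deformed positions
print(np.round(x, 3))
```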

Contact Information
  • Journal of Software
  • Sponsor: Institute of Software, CAS, China
  • Postal code: 100190
  • Phone: 010-62562563
  • Email: jos@iscas.ac.cn
  • Website: https://www.jos.org.cn
  • ISSN 1000-9825
  • CN 11-2560/TP
  • Domestic price: 70 RMB