• Volume 25, Issue S2, 2014 Table of Contents
    • A Survey of Edge Layout for Network Visualization

      2014, 25(S2):1-11.

      Abstract (2594) HTML (0) PDF 1.25 M (4677) Comment (0) Favorites

      Abstract:Edge layout is an important aspect of network visualization and an essential tool for representing the overall structure of a network and revealing hidden information patterns. To survey and analyze existing edge layout techniques, this paper first gives an overview of the tasks of edge layout in networks and the aesthetic standards of graph drawing. Then, it classifies edge layout techniques into three categories: edge routing (ER), edge confluence (EC), and edge bundling (EB). Next, it elaborates the principles and characteristics of typical layout methods, with emphasis on the analysis of edge bundling techniques. Finally, this paper reviews existing open problems and offers an outlook on the future work and challenges of edge layout.

    • Device Adaptive Wireless Signal Feature Extraction and Localization Method

      2014, 25(S2):12-20.

      Abstract (2677) HTML (0) PDF 727.60 K (5592) Comment (0) Favorites

      Abstract:In recent years, research on Wi-Fi based indoor localization has drawn increasing attention. However, in practical applications, the localization error caused by device variance is a severe problem. In this paper, a new calibration-free and unsupervised method, SSDR (signal strength difference ratio), is proposed to solve this issue. Considering the signal variance between training devices and testing devices, SSDR first removes the linear effect from the fingerprint to obtain new features. It then puts forward a distance calculation criterion with an AP impact factor according to the effect of each AP. Finally, SSDR eliminates the variance of devices and realizes indoor localization based on the new features and the distance calculation criterion. Experiments deployed in a real indoor wireless environment show that, compared with traditional indoor localization methods, the proposed SSDR can increase indoor localization accuracy by 10%~20%, which greatly improves the practical usability of indoor localization systems.
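      The core device-invariance idea can be illustrated with a toy sketch (an illustrative simplification, not the paper's exact SSDR formulation or its AP impact factor): if device variance acts as an approximately linear, additive offset (in dB) on all RSSI readings, pairwise differences between AP readings cancel it.

```python
def difference_features(rssi):
    """Pairwise RSSI differences cancel a device-specific additive offset."""
    n = len(rssi)
    return [rssi[i] - rssi[j] for i in range(n) for j in range(i + 1, n)]

# Two devices observing the same spot; device B reads 6 dB lower everywhere.
device_a = [-40, -55, -70]
device_b = [r - 6 for r in device_a]
assert difference_features(device_a) == difference_features(device_b)
```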

    • Video Smoke Detection Based on Circularly Aligned Edge Orientation Histogram

      2014, 25(S2):21-27.

      Abstract (2620) HTML (0) PDF 787.44 K (4738) Comment (0) Favorites

      Abstract:Video smoke detection has many advantages such as fast response and non-contact operation. Due to the large variance of smoke shape, color and texture, it is difficult for existing methods to achieve satisfactory results. This paper proposes a robust feature extraction method and uses a support vector machine (SVM) for classification. First, an edge orientation histogram (EOH) is extracted. Then, a circular shift technique is used to move the maximum-value bin of the EOH to a fixed bin, thus eliminating the influence of rotation. To further enhance the robustness of the features, Hu invariant moments, mean, deviation, skewness, and kurtosis are extracted from both the illuminance and saturation component images converted from the original RGB images. Finally, all the features are combined into a 38-dimensional feature vector, and the SVM is used to train on and classify smoke images. Experiments show that the features have good discrimination capability, and the method achieves about 98% and 85% detection rates on selected large training and testing data sets, respectively.
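      The circular-shift step can be sketched as follows (a minimal illustration of the alignment idea only; the bin counts here are made up and the paper's full 38-dimensional feature is not reproduced):

```python
def circular_align(hist):
    """Rotate the histogram so its maximum-value bin lands at index 0,
    removing the effect of a global rotation of edge orientations."""
    k = hist.index(max(hist))
    return hist[k:] + hist[:k]

eoh = [2, 7, 1, 3]           # max bin at index 1
rotated = eoh[3:] + eoh[:3]  # the same histogram under a circular rotation
assert circular_align(eoh) == circular_align(rotated) == [7, 1, 3, 2]
```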

    • Evaluation of Haptic Feedback in Hover Space in a Multi-Target Pointing Task

      2014, 25(S2):28-35.

      Abstract (2477) HTML (0) PDF 932.34 K (4253) Comment (0) Favorites

      Abstract:Adding haptic feedback to touch screens is promising for improving users' productivity. However, most touch screens only track the user's finger position when the user touches the screen. As a result, haptic feedback is not provided until touch is detected. This paper investigates adding haptic feedback, in addition to visual feedback, in hover space—the space above the touch screen. Four feedback conditions are compared in a multi-target pointing task. Experimental results show that there is no significant difference between the four feedback conditions as far as task completion speed and accuracy are concerned. The additional haptic feedback neither improves nor deteriorates performance compared with the visual-only condition, no matter whether it is added in hover space or as button-click feedback. However, the button-click feedback is most favored by users. Based on these results and previous literature, this study creates guidelines on how to add haptic feedback in hover space.

    • Sign Language Recognition by 3D Trajectory Matching in Frequency Domain

      2014, 25(S2):36-43.

      Abstract (2658) HTML (0) PDF 600.70 K (4565) Comment (0) Favorites

      Abstract:For hearing-impaired people, sign language is a communication means as common as spoken language is to ordinary people. Because most ordinary people cannot understand sign language, it is difficult for the hearing-impaired community to participate in social activities. This paper proposes an effective method for sign language recognition using simple 3D trajectory information, to break down the barriers between hearing-impaired and hearing persons. First of all, the 3D trajectory captured by Kinect is preprocessed, and the trajectories from both the probe and the gallery are normalized by the size of the signer. Then the trajectories are resampled evenly by a fast, easy, and usable interpolation algorithm. Before matching, the probe trajectory is aligned to the gallery trajectory, both aligned trajectories are transformed into the frequency domain by DFT, and vectors concatenating the real part, imaginary part and amplitude are obtained. Finally, the Euclidean distance between the two trajectories in the frequency domain is calculated, and the gallery with the minimum distance is taken as the recognition result. Experimental results on a data set of 239 sign words show that the presented approach is effective in recognizing isolated words of Chinese sign language.
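      A minimal 1-D sketch of the frequency-domain matching idea (the paper's method operates on normalized, aligned 3-D Kinect trajectories; the `dft` helper and the toy trajectories here are illustrative assumptions):

```python
import cmath
import math

def dft(seq):
    """Naive discrete Fourier transform of a real-valued sequence."""
    n = len(seq)
    return [sum(seq[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def freq_vector(traj):
    """Concatenate real part, imaginary part and amplitude of the spectrum."""
    spec = dft(traj)
    return ([c.real for c in spec] + [c.imag for c in spec]
            + [abs(c) for c in spec])

def distance(a, b):
    """Euclidean distance between two trajectories in the frequency domain."""
    va, vb = freq_vector(a), freq_vector(b)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(va, vb)))

probe = [0.0, 1.0, 0.0, -1.0]
assert distance(probe, probe) < 1e-9          # identical trajectories match
assert distance(probe, [0.0, 2.0, 0.0, -2.0]) > 1.0  # different ones do not
```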

    • Design of TPM Chip Based on Signal Integrity Analysis

      2014, 25(S2):44-52.

      Abstract (2777) HTML (0) PDF 1.01 M (5252) Comment (0) Favorites

      Abstract:It is important for personalized social services to calculate the relationship strength between users in a social network. Meanwhile, psychological studies have shown that personality traits are one of the key factors affecting a user's relationship strength. Based on the relationship generation principle in social psychology, this paper proposes a personality-embedded social relationship strength hierarchical model and algorithm. By analyzing a user's behavior in the social network, this paper predicts the Big Five personality traits of the user to calculate the propensity of personality. The propensity of personality is combined with the similarity of preference and the familiarity of interaction to formalize the personality-embedded user relationship strength calculation. Finally, the proposed algorithm is demonstrated to be reasonable and effective in a simulation experiment on the RENREN social network.

    • Incentive Mechanism for Opportunistic Social Networks: The Market Model with Intermediaries

      2014, 25(S2):53-62.

      Abstract (2671) HTML (0) PDF 970.98 K (4477) Comment (0) Favorites

      Abstract:The wide use of smart phones and other intelligent devices equipped for short-range wireless communications makes it possible for people to organize social activities via opportunistic social networks. However, message delivery can be easily disturbed by the selfishness of nodes. This paper introduces a model of markets with intermediaries as an incentive scheme. On the basis of this model, an agent selection algorithm called the "Ranger Algorithm" is proposed. Rangers refer to those users who not only have met users in other communities multiple times, but also have a higher probability of meeting those users again. Experiments using the MIT Reality Mining dataset are conducted, and the effects of using the market model with intermediaries as an incentive mechanism are analyzed. Results show that this model can effectively serve as an incentive mechanism to assist message delivery. In addition, this paper also finds that the Ranger Algorithm outperforms other methods at improving communication performance. Based on the above work, a prototype system is built to help organize social activities.

    • Stable Boundary-Based Non-Uniform Unit Selection in Speech Synthesis

      2014, 25(S2):63-69.

      Abstract (2338) HTML (0) PDF 823.70 K (4665) Comment (0) Favorites

      Abstract:Speech synthesis technology plays an important role in human-computer interaction. Based on the traditional cost-function-based unit selection method, this paper proposes an approach that incorporates the diphone's stable boundary model into words and syllables, and utilizes a multi-layer Viterbi algorithm to select the best path from the corpus to generate the final waveforms. With the proposed multi-layer non-uniform unit selection algorithm, the new method can not only choose longer prosodic units with correct acoustic characteristics, reducing the number of concatenation points while keeping potential coarticulation and badly labeled phones inside the longer units, but also fix the traditional unit boundary type to exploit the diphone's stable joint characteristics, improving continuity and naturalness at concatenation boundaries. Evaluation results show that with this approach the synthetic speech achieves great improvements in both naturalness and intelligibility compared with the traditional diphone-based unit selection approach.
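      The unit-selection search itself is a standard Viterbi-style dynamic program over candidate units; a minimal single-layer sketch (the hypothetical `target_cost`/`concat_cost` functions stand in for the paper's cost model, and the multi-layer, stable-boundary aspects are omitted):

```python
def select_units(candidates, target_cost, concat_cost):
    """DP (Viterbi-style) search for the minimum-total-cost unit sequence.

    candidates[t] lists candidate units for position t; the total cost is the
    sum of per-unit target costs and pairwise concatenation costs.
    """
    best = [{u: (target_cost(0, u), [u]) for u in candidates[0]}]
    for t in range(1, len(candidates)):
        layer = {}
        for u in candidates[t]:
            prev, (c, path) = min(best[-1].items(),
                                  key=lambda kv: kv[1][0] + concat_cost(kv[0], u))
            layer[u] = (c + concat_cost(prev, u) + target_cost(t, u), path + [u])
        best.append(layer)
    return min(best[-1].values())[1]

# Toy costs: a2 fits the target badly (cost 5) but joins b1 smoothly (cost 0).
candidates = [['a1', 'a2'], ['b1']]
tc = {'a1': 0, 'a2': 5, 'b1': 0}
assert select_units(candidates,
                    lambda t, u: tc[u],
                    lambda p, u: 0 if p == 'a2' else 3) == ['a1', 'b1']
```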

    • Package of the Vector Math Library Based on the Sunway Processor

      2014, 25(S2):70-79.

      Abstract (2664) HTML (0) PDF 733.66 K (4155) Comment (0) Favorites

      Abstract:This paper first introduces SIMD (single instruction multiple data) extension technology and presents three ways to use SIMD instructions. It is considered that calling a third-party library, optimized for the target platform by using those instructions, is the way that benefits developers most. Next, it introduces the China-developed SW-1600 CPU and a software package called SW-VML, which consists of many mathematical functions implemented with SIMD extension technology. To reduce the additional overhead caused by unaligned address access and transformation between vector and scalar arrays, the paper provides several performance optimization methods, such as aligned address access, simplified vector condition branches and loop unrolling. An upgrade to SW-VML is also offered to support multi-threading with OpenMP. Finally, the functions in the package are tested using arrays of different sizes on the SW-1600, and the test results show that high performance is achieved with SIMD vectorization. Compared with traditional scalar calculation, the average speedup is up to 2.06, and the package using 4 threads is up to 2.26 times faster than using a single thread. SW-VML is a common vector function package for the domestic Sunway processor series, and it can be used as a basic toolkit for high performance computing on the Sunway platform.

    • Fair Scheduling on Dynamic Heterogeneous Chip Multiprocessor

      2014, 25(S2):80-89.

      Abstract (2919) HTML (0) PDF 971.49 K (4835) Comment (0) Favorites

      Abstract:Dynamic heterogeneous CMPs (DHCMP), which provide the capability to configure different numbers and types of processing cores at system runtime, dramatically improve energy- and power-efficiency by scheduling workloads on the most appropriate core type. A significant body of recent work has focused on improving system throughput through scheduling on asymmetric CMPs (ACMP). However, none of the prior work has looked into fairness. In this work, a centralized run queue is introduced and a heterogeneity-aware fair scheduler (HFS) is proposed to address the fair scheduling problem on DHCMP. The HFS algorithm not only exploits the capability of DHCMP to configure the types of processing cores to match the granularities of parallelism in the tasks, but also maintains fairness when tasks run simultaneously. Experimental results demonstrate that HFS on DHCMP outperforms the best performing fair scheduler on SCMP and ACMP by 10.55% in user-oriented performance (ANTT) and 14.24% in system throughput (WSU).

    • Workload Analysis and Modeling of High Performance Computing Trace of Biological Gene Sequencing

      2014, 25(S2):90-100.

      Abstract (2269) HTML (0) PDF 866.87 K (4388) Comment (0) Favorites

      Abstract:Biological gene sequencing is one of the most common high-performance computing tasks in bioinformatics analysis. This paper aims to find the main workload characteristics of the biological gene sequencing trace (BGST) and construct a general model of the biological gene sequencing (BGS) workload, which can be used in high-performance computing scheduling and performance optimization for BGS. The study mainly analyzes the job arrival, runtime and parallelism characteristics in the BGST. Based on the analysis, it constructs several local models with exponential, Gamma, Gaussian and linear regression, then combines all the local models into a final model. Experimental results obtained by applying two general evaluation methods show that the new model matches the distribution trend of the BGST, which demonstrates the good versatility of the model.
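      As a flavor of the local modeling, an exponential inter-arrival model can be fitted by maximum likelihood in one line (a toy illustration with made-up arrival times, not the paper's data or its Gamma/Gaussian/regression components):

```python
# Toy job-arrival timestamps (seconds); gaps between consecutive arrivals.
arrivals = [0, 2, 3, 7, 8, 12]
gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]

# The MLE of the exponential rate parameter is the reciprocal of the mean gap.
rate = len(gaps) / sum(gaps)
assert abs(rate - 5 / 12) < 1e-9
```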

    • A Cost Model for Automatic OpenMP Parallelization

      2014, 25(S2):101-110.

      Abstract (2329) HTML (0) PDF 822.37 K (4221) Comment (0) Favorites

      Abstract:Existing OpenMP cost models do not give enough thought to the implementation details of OpenMP programs, so they cannot be applied widely to different types of parallel loops. To solve this problem, this study extends the cost model in the advanced product-level optimizing compiler Open64. Targeting single candidate parallel loops, it establishes a cost model suite for OpenMP benefit analysis. Besides improving the original DOALL-loop cost model in the Open64 compiler, two additional models are designed for DOACROSS and DSWP (decoupled software pipelining) loops, respectively. Experimental results show that the proposed cost model suite can simulate the execution cost of parallel programs very well, and can therefore better support the cost evaluation of OpenMP parallelization.

    • Visual Analysis Approach for Clustering Multivariate Spatial Data

      2014, 25(S2):111-118.

      Abstract (2383) HTML (0) PDF 4.18 M (4015) Comment (0) Favorites

      Abstract:Clustering is one of the important tools for studying multivariate spatial data. However, automatic clustering algorithms require the user to finely tune parameters, imposing the need for an effective mechanism to manipulate the clustering process by dynamically changing the parameters and evaluating the results. This paper proposes a novel visual analysis approach for clustering multivariate spatial data. First, the underlying dataset is clustered in 3D using an automatic clustering algorithm. Second, the result is examined and refined on its 2D projection by leveraging a suite of visualization and analysis toolkits. The user is allowed to intuitively verify and adjust the clusters by referring to the visual encodings and visual patterns. The entire process is performed progressively in a coarse-to-fine fashion. A case study on a high-dimensional symmetric tensor field verifies the effectiveness and robustness of the proposed approach.

    • Parallel Optimization Strategy on Tianhe-2 Supercomputer for a Method of DNA Sequence de novo Assembly

      2014, 25(S2):119-126.

      Abstract (2572) HTML (0) PDF 964.33 K (5022) Comment (0) Favorites

      Abstract:SGA is a tool based on string graph theory for DNA sequence de novo assembly. In this paper, the sequence de novo assembly problem underlying SGA is proved to be NP-complete, and a detailed analysis of SGA is provided. According to the results, SGA outperforms other similar tools in memory consumption, but costs much more time, 60%~70% of which is spent on index construction. To tackle these issues, this paper introduces a deep parallel optimization strategy and implements a parallel framework oriented to the Tianhe-2 architecture. Experiments are carried out on different data sizes on an ordinary cluster and on Tianhe-2. For data of small size, the optimized solution is 3.06 times as fast as before, and for data of medium size, it is 1.60 times as fast. The results demonstrate an evident overall improvement and the linear scalability of parallel FM-index construction. This study can benefit the optimization of other relevant problems, and it also affirms the powerful computing ability of Tianhe-2 as a useful tool in life sciences research.

    • Identification of Misleading Product Description in E-Commerce Website

      2014, 25(S2):127-135.

      Abstract (2231) HTML (0) PDF 903.78 K (4194) Comment (0) Favorites

      Abstract:Online shopping has been accepted by more and more consumers, and C2C websites, as a mainstream e-commerce platform, provide thousands of offers for consumers. When customers search for products on a C2C website, some returned offers have misleading descriptions. A misleading description does not convey the actual price of the product, but usually claims a much lower price for the purpose of attracting more consumers. Misleading offers affect consumers' judgment and damage the websites' reputation. This paper proposes an approach that combines the statistical model HMM with a statistical outlier detection method to detect misleading offers. The HMM is built to determine the product that an offer description really designates, providing an efficient solution to eliminate the ambiguity of offer descriptions caused by description irregularities. The statistical outlier detection method is effective in dealing with limited product offer information. The paper further conducts experiments on a real data set from e-commerce websites, and the results demonstrate the effectiveness of the proposed approach.

    • Bounded Diversification Methods for Top-k Query Results

      2014, 25(S2):136-146.

      Abstract (2474) HTML (0) PDF 1.18 M (3915) Comment (0) Favorites

      Abstract:The high repetition rate of query results is a problem needing a prompt solution in top-k query processing. Existing solutions require a traversal of all objects in the initial result set, which may lower the efficiency of query processing. To address the issue, this paper first maps the initial result set to Euclidean space and selects the optimal subspace using either a score-based method or a distance-based method adopting the pulling strategy. In the distance-based method, the Euclidean space is partitioned and the number of second-query objects is reduced by incorporating geometric properties of the Voronoi diagram. Further, a bounded diversification algorithm over top-k query results is developed and the soundness of the algorithm is proved. Experimental results demonstrate that the proposed algorithm improves the efficiency of top-k query processing.

    • Uncertain Rule Based Method for Evaluating Data Currency

      2014, 25(S2):147-156.

      Abstract (3294) HTML (0) PDF 713.67 K (5176) Comment (0) Favorites

      Abstract:Data staleness is one of the most important factors leading to low data quality, which highlights the need to determine the currency of data in order to identify whether a database is up-to-date. There is some existing work on determining data currency, but these methods have their limitations. Some require timestamps, which are often unavailable or invalid, while others are based on certain currency rules, which can only decide relative currency and cannot express uncertain semantics. To overcome these limitations, this paper introduces a new approach for determining data currency based on uncertain rules. A new class of uncertain currency rules is first introduced. Based on the uncertain rules, mathematical models of data currency are proposed, and two algorithms for determining data currency are developed. A method for automatically learning the uncertain currency rules is also provided. Using real-life data, the effectiveness and efficiency of the methods are experimentally verified.

    • Efficient Compressed Index for Top-k Spatial Keyword Query

      2014, 25(S2):157-168.

      Abstract (2768) HTML (0) PDF 989.77 K (4426) Comment (0) Favorites

      Abstract:Location-based services guide a user to find objects that provide services in a particular position or region (e.g., looking for a coffee shop near a university). Given a query location and multiple keywords, location-based services return the most relevant objects ranked according to location proximity and text relevancy. Various hybrid indexes that combine the R-tree and the inverted index have been proposed in recent years to improve query efficiency. Unfortunately, the state-of-the-art approaches require more space in order to reduce response time, and cache mechanisms are inefficient due to the huge storage overhead. In this paper, a novel index based on index compression technology (CSTI) is proposed to answer top-k spatial keyword queries (SKQ). CSTI significantly reduces storage overhead (by at least 80%) while maintaining efficient query performance. Extensive experiments on real and synthetic datasets confirm that CSTI is effective and efficient.

    • Approximate Attribute Reduction Algorithm Based on General Binary Relation

      2014, 25(S2):169-177.

      Abstract (2384) HTML (0) PDF 745.57 K (4161) Comment (0) Favorites

      Abstract:One of the most widely studied applications of rough sets is attribute reduction. Addressing the noise in decision information systems, a new importance measure for attribute sets is presented from the point of view that knowledge can enhance the ability to perform classification. In addition, a new approximate attribute reduction algorithm is proposed based on general binary relations, which can deal with noise and is applicable to many extended models of rough sets. Experimental results demonstrate that the proposed approximate attribute reduction algorithm can effectively reduce sensitivity to noise, achieve more compact reductions, and simultaneously improve classification performance.

    • Cut-Vertex Detection Algorithm Based on Compression on Big Graph

      2014, 25(S2):178-188.

      Abstract (2492) HTML (0) PDF 822.01 K (3890) Comment (0) Favorites

      Abstract:The detection of cut vertices is an important operation on graphs. The depth-first search algorithm can solve this problem; however, it has drawbacks that prevent it from being applied in real-world settings, owing to two characteristics of today's data. One is that the scale of the data is huge, which is challenging for many graph operations. The other is that the data is changeable: under massive updates, the traditional algorithm must recompute after every change, wasting a lot of time and space. The time complexity of depth-first search is O(|V|+|E|), where |V| and |E| are the numbers of nodes and edges of the graph, so it copes with the first characteristic very well, but it cannot handle the second. To solve this problem, this paper puts forward a compression-based algorithm for discovering cut vertices. The algorithm compresses the graph based on a naïve similarity of nodes, and discovers cut vertices on the losslessly compressed graph with time complexity O(|E|). At the same time, the algorithm maintains updates of nodes and edges dynamically, updating the compressed graph without decompression and discovering cut vertices on the compressed graph after each update. These methods reduce the consumption of time and space remarkably. The compressed graph obtained by the compression algorithm can also be applied to other graph operations.
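      The classical depth-first search baseline that the compression method builds on is Tarjan's articulation-point algorithm; a compact sketch on an uncompressed adjacency-list graph (the compression and dynamic-update machinery of the paper is not shown):

```python
def cut_vertices(adj):
    """Classic DFS (Tarjan) articulation-point detection in O(|V|+|E|)."""
    disc, low, cuts = {}, {}, set()
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:
                continue
            if v in disc:                       # back edge
                low[u] = min(low[u], disc[v])
            else:                               # tree edge
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= disc[u]:
                    cuts.add(u)                 # u separates v's subtree
        if parent is None and children > 1:
            cuts.add(u)                         # root with >1 DFS children

    for s in adj:
        if s not in disc:
            dfs(s, None)
    return cuts

# Path 0-1-2-3: removing vertex 1 or 2 disconnects the graph.
g = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
assert cut_vertices(g) == {1, 2}
```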

    • Incremental Mining and Evolutional Analysis of Co-Locations

      2014, 25(S2):189-200.

      Abstract (2940) HTML (0) PDF 666.76 K (4190) Comment (0) Favorites

      Abstract:Spatial co-location mining is an important research domain in spatial data mining. Spatial co-locations represent subsets of spatial features which are frequently located together in geographic space. Up to the present, all existing co-location mining algorithms focus only on discovering ordinary co-location patterns or co-location rules. However, in real-world applications, the data in a database do not usually remain stable, making efficient incremental mining of co-locations indispensable and interesting. The evolutionary analysis of co-locations can discover their development rules and predict particular events in the future; however, no results on these topics have yet been reported. This paper studies the incremental mining of co-locations and their evolutionary analysis. Firstly, an efficient basic algorithm and a pruning algorithm for incremental mining are proposed. Secondly, evolutionary co-locations are discovered based on several real datasets. Thirdly, both the basic algorithm and the pruning algorithm are proved correct and complete. Fourthly, extensive experiments are performed to verify the performance and effectiveness of the basic and pruning algorithms. At last, the two incremental mining algorithms, in conjunction with the evolutionary co-location mining algorithm, are applied to the plant database of the Three Parallel Rivers of Yunnan Protected Areas to predict the development rules of co-locations, and to dynamically track and protect the rare plants of this area.

    • Resource Scheduling Optimization Algorithm for Xen Virtual Machines

      2014, 25(S2):201-212.

      Abstract (2503) HTML (0) PDF 1.07 M (4860) Comment (0) Favorites

      Abstract:Aiming at the irrational resource allocation problem in the Xen virtualization platform, this paper proposes two resource scheduling optimization algorithms: a fine-grained algorithm and a coarse-grained algorithm. The fine-grained algorithm mainly targets resource allocation on a single physical node. It dynamically adjusts the amount of resources allocated to each virtual machine according to its resource utilization, appropriately increasing the resources of virtual machines whose utilization is high and reducing those of virtual machines whose utilization is low, thus improving resource utilization efficiency and avoiding unnecessary virtual machine migrations. The coarse-grained algorithm focuses on the load imbalance problem among multiple physical nodes in a cluster. It applies particle swarm optimization to select virtual machines on hot physical machines and migrate them to the most suitable cold physical machines in the cluster, thereby solving the load imbalance problem of the cluster system and avoiding downtime of highly loaded physical machines. Experiments show that the two proposed scheduling optimization algorithms can effectively solve the irrational resource allocation problem of virtual machines and have good adaptability and application prospects.

    • RIAIL: An Index Method for Reachability Query in Large Scale Graphs

      2014, 25(S2):213-224.

      Abstract (2791) HTML (0) PDF 949.69 K (4603) Comment (0) Favorites

      Abstract:Graphs have been widely used to model applications in social networks, the semantic web, computational biology and software analysis. The reachability query is one of the basic queries on graph data. Several index algorithms have been proposed to answer reachability queries; however, they cannot scale to large graphs flexibly. To address this issue, a new index method called RIAIL (reachability index augmented by interval labeling) is developed in this paper. RIAIL labels each node with a four-tuple: the first two elements are interval labels that encode the reachability information of a spanning tree, and the last two elements encode the reachability information of non-tree edges. RIAIL only needs the index when querying, and the cost of index construction is small. Finally, a wide range of experiments on real and synthetic datasets demonstrates that RIAIL can efficiently handle reachability queries and easily scale to large graphs.
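      The interval-labeling half of the four-tuple follows the classic DFS-interval scheme for spanning trees; a minimal sketch (the non-tree-edge elements of RIAIL's label are omitted, and the toy tree is an assumption):

```python
def interval_label(tree, root):
    """Assign each node a [start, end) DFS interval; in a tree,
    u reaches v exactly when u's interval contains v's."""
    label, clock = {}, [0]

    def dfs(u):
        start = clock[0]
        clock[0] += 1
        for c in tree.get(u, []):
            dfs(c)
        label[u] = (start, clock[0])  # interval covers u and all descendants

    dfs(root)
    return label

tree = {'a': ['b', 'c'], 'b': ['d']}
lab = interval_label(tree, 'a')

def reaches(u, v):
    su, eu = lab[u]
    sv, ev = lab[v]
    return su <= sv and ev <= eu      # containment test, O(1) per query

assert reaches('a', 'd') and reaches('b', 'd')
assert not reaches('c', 'd')
```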

    • Spatio-Temporal Abnormal Cluster Discovery in Arrival Data

      2014, 25(S2):225-235.

      Abstract (2216) HTML (0) PDF 977.27 K (4458) Comment (0) Favorites

      Abstract:Arrival data is a type of location-related data which records the spots and times at which users arrive. It can be check-in data in a social network, stay points in a trajectory, or the arrival locations of passengers in public transport. Clusters of arrival data reflect the aggregation behavior of users in a particular area. This paper presents a new spatio-temporal data query—spatio-temporal abnormal cluster discovery. The new scheme periodically partitions the arrival data into segments of equal timespan. Then, using spatio-temporal clustering algorithms, it clusters the data in every segment and finds the k most abnormal clusters by comparing the degree of difference between clusters. Finding abnormal clusters can be useful in areas such as urban safety management, location-based services and transportation scheduling. This article defines the abnormal cluster query model, specifies the difference measurement for clusters with arbitrary shape, and transforms the query into a maximum matching problem on bipartite graphs. Algorithms are designed to improve the efficiency of constructing and matching the bipartite graphs. Experiments on real datasets validate the application value of the query results and demonstrate the effectiveness of the query algorithm under different parameters.
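      The reduction to bipartite maximum matching can be sketched with a standard augmenting-path matcher (the paper's cluster difference measurement and abnormality scoring are not modeled; the cluster names here are hypothetical):

```python
def max_matching(adj, left, right):
    """Hungarian-style augmenting-path maximum matching on a bipartite graph."""
    match = {v: None for v in right}   # right vertex -> matched left vertex

    def augment(u, seen):
        for v in adj.get(u, []):
            if v in seen:
                continue
            seen.add(v)
            # v is free, or its current partner can be re-routed elsewhere.
            if match[v] is None or augment(match[v], seen):
                match[v] = u
                return True
        return False

    return sum(augment(u, set()) for u in left)

# Clusters at time t (left) matched to similar clusters at time t+1 (right);
# clusters left unmatched are candidates for "abnormal".
adj = {'c1': ['d1', 'd2'], 'c2': ['d2']}
assert max_matching(adj, ['c1', 'c2'], ['d1', 'd2']) == 2
```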

    • Face Illumination Transfer Algorithm Based on Intrinsic Images Decomposition

      2014, 25(S2):236-246.

      Abstract (3352) HTML (0) PDF 2.14 M (4708) Comment (0) Favorites

      Abstract:In this paper, a face illumination transfer technique based on intrinsic image decomposition is proposed. Firstly, to tackle the issue that intrinsic image decomposition results are often imperfect, an improved intrinsic image decomposition method is developed. Next, in order to maintain facial features, an edge-preserving light filtering algorithm is created. The algorithm can transfer the illumination from a reference face to a target face without losing the target's details. Finally, the reflectance image and the illumination image are merged to relight the face. Experimental results show that, compared with existing algorithms, the presented algorithm better preserves the facial skin after transfer and produces more reasonable and natural lighting effects.

    • Large-Scale Simulation of Complex Water Scenes Based on Spherical Projected Grid

      2014, 25(S2):247-257.

      Abstract (2513) HTML (0) PDF 2.87 M (4476) Comment (0) Favorites

      Abstract:Current simulation methods for large-scale complex water scenes suffer from various problems such as low efficiency and complicated collision detection. To remedy these problems, this paper presents a novel method for ocean-scale complex water simulation based on a spherical-mapping projected grid. First, a novel spherical projected grid method is proposed for animating large-scale water waves. Compared with the conventional projected grid method, the new method does not need to reconstruct projectors intersecting with the sphere, and thereby achieves higher rendering efficiency; it is also suitable for graphics hardware acceleration. Then, a new method is designed for simulating interactions, including water-rigid body interaction and rigid body-terrain interaction. Additionally, a general bubble and coastline rendering framework is provided. Experiments demonstrate that the proposed method can produce realistic rendering results at high frame rates (FPS>60), and can be applied in real-time scenarios such as computer games and virtual reality.

    • Counting Pedestrians in High-Density Crowd Scenes Using Cross-Sectional Flow Statistics

      2014, 25(S2):258-267.

      Abstract (2527) HTML (0) PDF 1.46 M (4656) Comment (0) Favorites

      Abstract:Surveillance cameras have been widely installed in cities all over the world in recent years, and counting pedestrians from camera footage has become a very important issue in intelligent video surveillance. However, factors such as occlusion, noise, camera perspective, and background clutter may affect the accuracy of pedestrian counting. This paper introduces a pedestrian counting method for high-density crowd scenes using cross-sectional flow statistics. The proposed method consists of a new foreground detection algorithm based on the gradient motion history image, an improved feature-based counting algorithm built on an effective motion image, and a moving-speed extraction algorithm using optical flow. The experimental results show that the proposed method is robust and effective for counting pedestrians.
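      The cross-sectional idea can be reduced to a small sketch (illustrative only, not the paper's exact algorithm, and with made-up calibration numbers): sum the optical-flow magnitude of foreground pixels crossing a virtual counting line each frame, then divide the accumulated flux by the flux one pedestrian is calibrated to produce.

      ```python
      def line_flux(flow_normal, foreground):
          """Flux through the counting line for one frame: sum of the flow
          component normal to the line over detected foreground pixels."""
          return sum(abs(v) for v, fg in zip(flow_normal, foreground) if fg)

      def pedestrian_count(frames, flux_per_person):
          """Accumulated flux divided by the calibrated per-person flux."""
          total = sum(line_flux(v, fg) for v, fg in frames)
          return round(total / flux_per_person)

      # Three frames over a 5-pixel line; calibration: one pedestrian ~ 12 flux.
      frames = [
          ([2.0, 3.0, 0.0, 0.0, 1.0], [1, 1, 0, 0, 1]),   # flux 6
          ([0.0, 4.0, 4.0, 2.0, 0.0], [0, 1, 1, 1, 0]),   # flux 10
          ([5.0, 5.0, 5.0, 5.0, 0.0], [1, 1, 1, 1, 0]),   # flux 20
      ]
      count = pedestrian_count(frames, flux_per_person=12.0)
      ```

      The foreground mask and flow vectors would come from the paper's gradient motion history image and optical-flow stages; here they are hard-coded toy values.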

    • Illumination Estimation of Hazy Outdoor Scenes

      2014, 25(S2):268-277.

      Abstract (2562) HTML (0) PDF 2.44 M (4465) Comment (0) Favorites

      Abstract:In hazy weather conditions, images of outdoor scenes are greatly affected by light scattering, and estimating the sunlight and skylight parameters of an image becomes more complex due to fog. In accordance with the atmospheric scattering model, this paper formulates a basis-image model of the outdoor scene and presents an algorithm for estimating the parameters of a haze image. By iterating over the scattering coefficient, the algorithm produces the proper haze density and depth for the haze image. It then decomposes the haze-free image to obtain the correct illumination coefficient, the best haze-free image, and the combined image. Experimental results demonstrate the effectiveness of the proposed approach in finding the proper haze density and the corrected depth image of the scene.
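      The atmospheric scattering model the abstract refers to is usually written I(x) = J(x)·t(x) + A·(1 − t(x)), with transmission t = exp(−β·d) for scattering coefficient β and scene depth d. A per-pixel sketch of the forward model and its inversion, with illustrative values:

      ```python
      import math

      def transmission(beta, depth):
          """Transmission along a path of the given depth: t = exp(-beta*d)."""
          return math.exp(-beta * depth)

      def hazy(J, A, t):
          """Forward model: attenuated scene radiance plus airlight."""
          return J * t + A * (1.0 - t)

      def dehazed(I, A, t, t_min=0.1):
          """Invert the model; clamp t to avoid amplifying noise in thick haze."""
          return (I - A) / max(t, t_min) + A

      t = transmission(beta=0.8, depth=1.5)
      I = hazy(J=0.6, A=1.0, t=t)          # synthesize a hazy pixel
      J_rec = dehazed(I, A=1.0, t=t)       # invert: recovers J exactly here
      ```

      Iterating over β, as the abstract describes, amounts to searching for the transmission map that yields a plausible haze-free decomposition.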

    • Efficient Multi-Scale Texture Recognition Algorithm

      2014, 25(S2):278-289.

      Abstract (2821) HTML (0) PDF 2.34 M (4418) Comment (0) Favorites

      Abstract:As an effective texture description operator, the local binary pattern (LBP) has the advantages of low computational complexity, low memory consumption, and a clear principle. Dempster-Shafer evidence theory requires conditions weaker than Bayesian probability theory and can directly express the states "uncertain" and "unknown". To exploit the advantages of these two concepts, a new texture recognition method is proposed. Firstly, the approach computes an image pyramid and uses the distributions of multi-scale LBP to measure the similarity between two texture images. Secondly, the method combines the similarity measurements between the test texture and the training samples as pieces of evidence. Finally, the recognition result is determined by the maximum evidence among the texture classes. Experimental results show that the proposed method achieves correct recognition rates of 96.43% and 91.67% on data set 1 and data set 2, respectively, outperforming the original LBP-based texture recognition algorithm.
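      The multi-scale LBP descriptor can be sketched as follows: compute basic 8-neighbour LBP codes at each level of a simple image pyramid, concatenate the normalized histograms, and compare textures with a chi-square distance. This is an illustrative sketch of the standard LBP pipeline, not the paper's evidence-combination stage.

      ```python
      import numpy as np

      def lbp_codes(img):
          """Basic 8-neighbour LBP: threshold neighbours against the centre."""
          h, w = img.shape
          offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                     (1, 1), (1, 0), (1, -1), (0, -1)]
          centre = img[1:h - 1, 1:w - 1]
          codes = np.zeros((h - 2, w - 2), dtype=np.int64)
          for bit, (dy, dx) in enumerate(offsets):
              neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
              codes |= (neigh >= centre).astype(np.int64) << bit
          return codes

      def lbp_histogram(img):
          hist = np.bincount(lbp_codes(img).ravel(), minlength=256)
          return hist / hist.sum()

      def multiscale_histogram(img, levels=3):
          """Concatenate LBP histograms over a crude 2x-downsampling pyramid."""
          hists = []
          for _ in range(levels):
              hists.append(lbp_histogram(img))
              if min(img.shape) < 6:
                  break
              img = img[::2, ::2]
          return np.concatenate(hists)

      def chi_square(h1, h2, eps=1e-10):
          """Chi-square distance between two normalized histograms."""
          return np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
      ```

      In the paper's setting, the per-sample similarities produced this way serve as evidence masses combined under Dempster-Shafer theory rather than being compared directly.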

    • Alignment and Deformation Based Seam Carving Algorithm for Image Resizing

      2014, 25(S2):290-297.

      Abstract (2484) HTML (0) PDF 1.67 M (4494) Comment (0) Favorites

      Abstract:In order to solve the problem that seam carving usually damages the local structure of an image, this paper presents a novel method for content-aware image resizing based on alignment and deformation. First, seam carving is used to remove a seam from the input image and segment it into two parts. Second, pixels are robustly detected and aligned along the seam between the two parts. Third, these sparse deformation cues are propagated robustly and smoothly into the interior of the target image by solving the associated Poisson equations in the image domain. Experimental results show that the presented algorithm preserves structural continuity better than seam carving and can be used to adjust the image size for different display devices.
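      The first step builds on the classic seam-carving operation (the Avidan-Shamir dynamic programme), sketched below: compute a gradient-magnitude energy map, find the minimal-energy vertical seam, and delete it. This illustrates only the base operation, not the paper's alignment and Poisson-deformation stages.

      ```python
      import numpy as np

      def energy(gray):
          """Gradient-magnitude energy map."""
          gy, gx = np.gradient(gray.astype(float))
          return np.abs(gx) + np.abs(gy)

      def min_vertical_seam(e):
          """Dynamic programme over cumulative energy, then backtracking."""
          h, w = e.shape
          cost = e.copy()
          for y in range(1, h):
              left = np.concatenate(([np.inf], cost[y - 1, :-1]))
              right = np.concatenate((cost[y - 1, 1:], [np.inf]))
              cost[y] += np.minimum(np.minimum(left, cost[y - 1]), right)
          seam = np.zeros(h, dtype=int)
          seam[-1] = int(np.argmin(cost[-1]))
          for y in range(h - 2, -1, -1):
              x = seam[y + 1]
              lo, hi = max(0, x - 1), min(w, x + 2)
              seam[y] = lo + int(np.argmin(cost[y, lo:hi]))
          return seam

      def remove_seam(img, seam):
          """Delete one pixel per row, shrinking the width by one."""
          return np.array([np.delete(row, s) for row, s in zip(img, seam)])

      img = np.ones((5, 6))                  # constant image: zero energy everywhere
      narrower = remove_seam(img, min_vertical_seam(energy(img)))
      ```

      The paper's contribution is what happens after this step: aligning pixels along the removed seam and diffusing the resulting deformation through the image via Poisson equations, rather than simply repeating seam removal.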

Contact Information
  • Journal of Software
  • Sponsored by: Institute of Software, CAS, China
  • Postal code: 100190
  • Phone: 010-62562563
  • Email: jos@iscas.ac.cn
  • Website: https://www.jos.org.cn
  • ISSN 1000-9825
  •           CN 11-2560/TP
  • Domestic price: 70 RMB
Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-4
Address: 4# South Fourth Street, Zhong Guan Cun, Beijing 100190
Phone: 010-62562563  Fax: 010-62562533  Email: jos@iscas.ac.cn