YANG Qin, CHEN Jun-Liang, MENG Xiang-Wu
Abstract: Location-based services (LBS) are considered to be the next climax of telecom value-added services after SMS (short message service). Our prior work introduced a service flow description language, XPL (extended-calling process language), which mainly focused on value-added services in the telecom field. Building on that work, this paper presents a geographic information service description language, GDL (geography description language). XPL and GDL together can describe LBS-oriented telecom services in coordination. Both languages offer a high level of abstraction, are easy to use, and enable fast service development. This paper also presents the supporting system for these languages; because it is based on SOA (service-oriented architecture), the supporting system suits service creation under network convergence.
SONG Xiao-Lei, HU Gang, YAO Jun-Jie, BAI Jing
Abstract: Based on finite element analysis, this study applies three iterative regularization algorithms to the 3D reconstruction of fluorescent yield: conjugate gradient least squares (CGLS), least squares QR decomposition (LSQR), and a 2nd-order pre-iteration method. Using a non-contact, multi-angle transmission imaging system, experiments with a single fluorescent target and with double targets in a commonly used cylindrical phantom are conducted. Experimental results show that all three methods can estimate the positions of the fluorescent targets accurately, while the time cost of each method is only about 2% of that of the algebraic reconstruction technique (ART).
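As an illustrative sketch (not the authors' implementation), the CGLS iteration named in the abstract can be written as follows for a generic dense system matrix A and measurement vector b; in fluorescence tomography, A would come from the finite element discretization, which is assumed here rather than constructed.

    import numpy as np

    def cgls(A, b, n_iter=50, tol=1e-8):
        # Conjugate gradient least squares for min ||A x - b||^2.
        x = np.zeros(A.shape[1])
        r = b - A @ x            # residual in measurement space
        s = A.T @ r              # residual of the normal equations
        p = s.copy()
        gamma = s @ s
        for _ in range(n_iter):
            q = A @ p
            alpha = gamma / (q @ q)
            x += alpha * p
            r -= alpha * q
            s = A.T @ r
            gamma_new = s @ s
            if np.sqrt(gamma_new) < tol:   # normal-equation residual small
                break
            p = s + (gamma_new / gamma) * p
            gamma = gamma_new
        return x

Stopping after a modest number of iterations acts as regularization, which is the property the compared iterative solvers rely on.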
Abstract: An adaptive medical image fusion algorithm based on the bidimensional empirical mode decomposition (BEMD) representation is proposed. Source medical images are decomposed into a number of bidimensional intrinsic mode functions (BIMFs) and a residual image. Image features are extracted through the Hilbert-Huang transform on the BIMFs, and the composite BEMD is then formed by region-based fusion rules on the BEMD representations. Finally, the fused image is obtained by inverse BEMD on the composite representation. BEMD is an adaptive, data-driven decomposition and performs better than the Fourier and wavelet transforms; the proposed algorithm needs no predetermined filters or wavelet functions. Experimental results show that the proposed algorithm outperforms conventional fusion algorithms in improving the quality of fused images.
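A minimal sketch of one common region-based fusion rule on BIMF coefficients is given below, assuming the BIMFs have already been computed; the paper's exact rule and region definition are not reproduced.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def fuse_bimfs(bimf_a, bimf_b, win=9):
        # Keep, at each pixel, the coefficient from the source image whose
        # local energy (windowed mean of squared amplitude) is larger.
        ea = uniform_filter(bimf_a ** 2, size=win)
        eb = uniform_filter(bimf_b ** 2, size=win)
        return np.where(ea >= eb, bimf_a, bimf_b)

    # Inverse BEMD is a plain sum of the fused BIMFs plus the residue:
    # fused = sum(fuse_bimfs(a, b) for a, b in zip(bimfs_a, bimfs_b)) + residue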
DUAN Chai-Jie, MA Jing-Feng, ZHANG Yi-Bao, HOU Kai, BAO Shang-Lian
Abstract: This paper proposes an energy conduction model (ECM) based on the level set framework, which takes advantage of the heat conduction equation to construct the image energy. By comparing the image intensity distribution with the spatial distribution of a temperature field, an energy conduction function is defined that closely simulates the heat conduction process. The advantage of the ECM is that it captures the global features of an image while taking local intensity information into account. Thus, the ECM can accurately segment medical images with inhomogeneity and noise, as well as medical images with multiple targets. The ECM is tested on synthetic and real medical images, demonstrating its robustness and efficiency.
WANG Hui-Liang, HUANG Fei-Zeng, BAO Shang-Lian
Abstract: To meet the requirements of optimization processing in conformal intensity-modulated inverse radiotherapy planning, a new objective function targeting the well-defined objective area is constructed. This paper studies a parallel hybrid optimization strategy for inverse radiotherapy planning, taking as an example the combination of the complementary strengths of simulated annealing and genetic algorithms, which yields a parallel general neighborhood-searching hybrid optimization algorithm with a uniform structure; the algorithm is implemented on a computer with multiple multi-core CPUs. The dose distributions obtained with the algorithm for a virtual phantom and 5 clinical cases are described, with satisfying results. The results prove that the algorithm is effective and practical; it provides a good platform for further research on parallel hybrid algorithms and a basis for further development of treatment planning systems using biologically guided radiotherapy technologies.
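The general shape of such a hybrid strategy can be sketched as follows for a generic objective f: genetic-style crossover and mutation propose candidates, and a simulated annealing acceptance test decides replacement. The paper's objective function, neighborhood structure, and multi-core parallelization are not reproduced here.

    import numpy as np

    def hybrid_sa_ga(f, dim, pop=20, gens=200, t0=1.0, cooling=0.97, seed=0):
        # Minimize f by combining GA variation with SA acceptance.
        rng = np.random.default_rng(seed)
        x = rng.uniform(-1.0, 1.0, (pop, dim))
        fx = np.apply_along_axis(f, 1, x)
        t = t0
        for _ in range(gens):
            for i in range(pop):
                j = rng.integers(pop)                             # random mate
                cut = rng.integers(1, dim) if dim > 1 else 1
                child = np.concatenate([x[i, :cut], x[j, cut:]])  # crossover
                child += rng.normal(0.0, 0.1, dim)                # mutation
                fc = f(child)
                # Metropolis rule: accept worse children with prob e^(-dF/T).
                if fc < fx[i] or rng.random() < np.exp(-(fc - fx[i]) / t):
                    x[i], fx[i] = child, fc
            t *= cooling                                          # annealing schedule
        best = int(fx.argmin())
        return x[best], fx[best]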
ZHAO Xiao-Jie, LONG Zhi-Ying, GUO Xiao-Juan, YAO Li
Abstract: First, current Alzheimer’s disease (AD) studies using structural magnetic resonance imaging (MRI), functional MRI, and diffusion tensor imaging (DTI) techniques are reviewed. Then the primary processing approaches for the three sorts of MRI data are introduced and analyzed. Finally, an AD neuroimaging database and the construction of a corresponding diagnosis platform are described. Some research advances of the authors' group in this field are also mentioned.
SHUI Wu-Yang, ZHOU Ming-Quan, GENG Guo-Hua
Abstract: This paper presents an effective automatic brain extraction method based on two successive segmentations. First, image noise is eliminated through anisotropic diffusion filtering based on the Catté model while the details of the image are preserved. The over-segmentation problem of the watershed algorithm is solved by merging regions of similar gray scale, and the brain tissue is initially segmented. Because the edges between different tissues in a brain image are fuzzy and MRI data are susceptible to noise, non-brain regions may be mistaken for brain regions during merging. To solve this problem, the level set method is adopted: the watershed outline is taken as the initial curve of the level set to realize automatic segmentation of brain tissue. The feasibility and practicality of the algorithm are demonstrated by the experimental results.
PAN Qi, LUO Xiao-Nan, ZHU Ji-Wu
Abstract: Image magnification is an important technology in medical image processing. High-detail areas in medical images most often have a definite geometric structure or pattern, as in the case of edges. This paper proposes a learning-based method in which geometric features extracted from the available neighboring pixels in the low-resolution (LR) image form the training set, under the assumption that the training set corresponds locally to the geometric features of the high-resolution (HR) image patch to be reconstructed; this correspondence is described as local geometric similarity. The task of image magnification is formulated as an optimization problem in which the optimization coefficients adaptively tune their values to effectively propagate features from the training set to the target HR image patch. The advantages are the ability to produce a magnified image at any factor without requiring outlier support. A weighted least squares (WLS) method offers a convenient way of finding the regularized optimal solution, with the weight function determined by non-local means. Simulation and comparison results show that the proposed method is adaptive, independent of the magnification factor, and can produce sharp edges with rare ringing or jaggy artifacts.
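A minimal sketch of the regularized WLS solve is shown below; the weight vector w would be produced by non-local-means patch similarities, and the small ridge term lam is an assumption added here for numerical stability, not a detail from the paper.

    import numpy as np

    def wls_solve(A, b, w, lam=1e-6):
        # Solve min_x sum_i w_i * (A x - b)_i^2 + lam * ||x||^2
        # via the weighted normal equations.
        W = np.diag(w)
        n = A.shape[1]
        return np.linalg.solve(A.T @ W @ A + lam * np.eye(n), A.T @ W @ b)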
GONG Shao-Run, GAO Feng, LIU Hong
Abstract: This paper introduces the design, analysis, and experimental verification of a focal-spot X-ray in-line imaging system. First, the object transmission function is determined by the refractive index decrement and the absorption index of tissue at the pre-selected energy. After analyzing the factors that affect imaging quality, such as the object-to-detector distance and the object edge width, the methods presented to improve image resolution are validated. In the experiment, a plastic set square is imaged with the X-ray phase-contrast technique, and the result is compared with that obtained using the traditional X-ray imaging technique. The experimental results exhibit considerable consistency with the earlier numerical simulations, which proves that the presented method can be used to determine the parameters of an imaging system.
ZHENG Han, CHEN Zi-Kuan, KANG Yan, LIU Ji-Ren
Abstract: Based on an analysis of the shadow zone of the Radon domain generated by circular cone-beam CT scanning and of existing padding methods, this paper proposes a shadow-zone padding method based on distance-weighted interpolation. The method uses the known boundary data to interpolate the missing data inside the shadow zone with a distance weighting function. Simulation results show that the proposed method has advantages over the constant padding technique in data-padding accuracy, artifact reduction, and image reconstruction for large cone-angle cone-beam scanning. The method is particularly useful in the Grangeat reconstruction algorithm and enlarges the applicable scope of circular cone-beam tomography.
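A one-dimensional sketch of the weighting idea follows (the actual method fills a two-dimensional shadow zone in the Radon domain, so this is a simplification): each missing sample is a convex combination of the two known boundary values, with weights decaying with distance.

    import numpy as np

    def pad_shadow_1d(boundary_lo, boundary_hi, n_missing, p=1.0):
        # Inverse-distance weighting between two known boundary samples:
        # the nearer boundary contributes more to each missing value.
        vals = np.empty(n_missing)
        for k in range(n_missing):
            d_lo, d_hi = k + 1.0, float(n_missing - k)
            w_lo, w_hi = 1.0 / d_lo ** p, 1.0 / d_hi ** p
            vals[k] = (w_lo * boundary_lo + w_hi * boundary_hi) / (w_lo + w_hi)
        return vals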
Abstract: In order to make thorough use of the anatomical and functional information derived from cardiac magnetic resonance images, the epicardium and endocardium of the left ventricle must be extracted in advance. This paper presents a method for segmenting the endocardium and epicardium of the left ventricle in cardiac magnetic resonance images using Snake models. It first proposes an external force for active contours called the convolutional virtual electric field (CONVEF). This external force possesses an enlarged capture range, noise resistance, and C-shape concavity convergence, and since it is based on convolution, it can be computed in real time using the fast Fourier transform. Considering that the left ventricle is roughly circular, a circle-based shape constraint is adopted for segmenting the endocardium. For locating the epicardium, an internal energy based on shape similarity is proposed, and an edge map is constructed to calculate the new external force by exploiting the resemblance between the endocardium and epicardium in shape and position. With these strategies, the final endocardium contour is taken as the initialization and the Snake contour is reactivated to locate the epicardium automatically and accurately. The proposed approach is demonstrated on an in vivo dataset, and the segmented contours are compared with those of the GGVF (generalized gradient vector flow) Snake and with manual delineations. The results show its effectiveness.
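Because the external force is defined through convolution, it can be evaluated with FFTs. The sketch below convolves an edge map with an inverse-power kernel and takes the gradient of the resulting potential as the force field; the kernel family and exponent are assumptions for illustration, not the exact CONVEF definition.

    import numpy as np

    def convolutional_force(edge_map, h=2.0):
        # Circular convolution of the edge map with a 1/r^h kernel via FFT,
        # followed by a gradient to obtain the external force field.
        ny, nx = edge_map.shape
        y = np.fft.fftfreq(ny)[:, None] * ny   # signed wrap-around coords
        x = np.fft.fftfreq(nx)[None, :] * nx
        r = np.hypot(x, y)
        r[0, 0] = 1.0                          # avoid division by zero
        kernel = 1.0 / r ** h
        pot = np.real(np.fft.ifft2(np.fft.fft2(edge_map) * np.fft.fft2(kernel)))
        fy, fx = np.gradient(pot)              # force = gradient of potential
        return fx, fy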
Abstract: Based on the level set method without re-initialization, a sequential level set method is proposed to realize full image segmentation. The proposed method automatically and alternately creates nested sub-regions and the corresponding initial level set functions in the image to be segmented, and then evolves the level set function to convergence in the corresponding sub-region. This step is repeated sequentially until the sub-region vanishes. Compared with the original method and a representative region-based level set method, the proposed method has the following advantages: 1) it is executed automatically and no longer needs interactive initialization; 2) it segments the image more than once and detects more boundaries than the original method; 3) it performs better on non-homogeneous images than the representative region-based level set method; 4) it is an open image segmentation framework in which the underlying single level set method can be replaced by other single level set methods after some modification. Experimental results indicate that the proposed method can fully segment synthetic and medical images without any interactive step and works more robustly on non-homogeneous images.
JIN Lu-Dong, WU Yan, WANG Wei-Wei, REN Nu-Nu, HUANG He-Yu, CHEN Xue-Li, HAN Run-Qiang
Abstract: In this paper, an algorithm named the non-truncated wavelet finite element method is proposed for bioluminescence tomography (BLT). Using a linear combination of non-truncated wavelet functions across the elements to approximate the unknown function, the algorithm is successfully applied to the BLT forward problem in cylindrical and spherical phantoms. Theoretical analysis and numerical simulations show that its computational accuracy is almost as good as that of the finite element method (FEM), while the number of elements and the computational complexity are greatly reduced compared with FEM.
ZHANG Yi, LIU Peng, TIAN Jie, QIN Wei, LIANG Ji-Min
Abstract: This paper introduces modern medical imaging technologies (fMRI (functional magnetic resonance imaging), PET (positron emission tomography), EEG (electroencephalography), and MEG (magnetoencephalography)) for deeper exploration of the neural mechanisms of acupuncture. Building on experimental design and acupoint selection, acupuncture’s regulation of the brain functional network, the paralimbic system, and the brain-stem network, as well as its control of the endogenous analgesic system and the autonomic nervous system, are presented from three aspects. This paper summarizes the imaging-related academic reports on the neural mechanisms of acupuncture, depicts the problems in current acupuncture studies, and provides a better framework for future experimental work in hope of further understanding the neural mechanisms of acupuncture.
LIU Kai, TIAN Jie, YANG Wei, QIN Cheng-Hu, XU Min, LIU Dan
Abstract: This paper reviews models of the forward problem in optical imaging, namely analytic, numerical, and statistical solutions. In particular, a case of the Monte Carlo (MC) method in which a multi-spectral bioluminescent source is embedded in tissue is introduced. In the field of optical imaging, according to the imaging modality, the demand on imaging quality, and the information to be extracted, there are three major categories: continuous-wave, time-domain, and frequency-domain. Not only are the fundamental principles of each category presented, but some typical applications and software packages using the MC method are also introduced. On the basis of these applications, it is believed that the MC method has played an indispensable role in the development of diffuse optical imaging, especially in vivo, non-invasive imaging, in recent years.
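As a minimal illustration of the MC principle (continuous-wave regime, infinite homogeneous medium, with isotropic scattering standing in for the Henyey-Greenstein phase function of realistic tissue models):

    import numpy as np

    def mc_absorbed_fraction(n=2000, mu_a=0.5, mu_s=10.0, seed=0):
        # Propagate photon packets: step lengths follow Beer-Lambert
        # attenuation, a fraction mu_a/mu_t of the packet weight is
        # absorbed per interaction, and a simple weight cutoff stands
        # in for Russian roulette termination.
        rng = np.random.default_rng(seed)
        mu_t = mu_a + mu_s
        absorbed = 0.0
        for _ in range(n):
            pos = np.zeros(3)
            d = np.array([0.0, 0.0, 1.0])
            w = 1.0
            while w > 1e-4:
                pos = pos - np.log(rng.random()) / mu_t * d  # free path
                absorbed += w * (mu_a / mu_t)                # deposit weight
                w *= mu_s / mu_t
                v = rng.normal(size=3)                       # isotropic
                d = v / np.linalg.norm(v)                    # re-scatter
        return absorbed / n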
Lü Shuai, LIU Lei, SHI Lian, LI Ying
Abstract: This paper introduces the concrete details of combining automated reasoning techniques with planning methods, including planning as satisfiability using propositional logic, conformant planning using modal logic and disjunctive reasoning, planning as nonmonotonic logic, and flexible planning as fuzzy description logic. After considering the experimental results of the International Planning Competition and the relevant literature, it concludes that planning methods based on automated reasoning techniques are helpful and can be adopted. It also discusses the remaining challenges and possible research hotspots.
JIANG Hong-Fei, LI Sheng, FU Guo-Hong, ZHAO Tie-Jun, ZHANG Min
Abstract: A translation model based on synchronous tree-substitution grammar is presented in this paper. It elegantly models global reordering and discontinuous phrases, and it can learn non-isomorphic tree-to-tree mappings. Experimental results on two different data sets show that the proposed model significantly outperforms the phrase-based model and a model based on synchronous context-free grammar.
RAO Dong-Ning, JIANG Zhi-Hua, JIANG Yun-Fei, ZHU Hui-Quan
Abstract: This paper improves the methods of observation reduction in non-deterministic planning (NDP) in three aspects: finding the MOS (minimal observation set); finding the optimal observation set (OOS) when observations have different costs; and finding a fault-tolerant OOS. The MOS problem is similar to the minimal set cover (MSC) problem, from which it can be proved that finding an MOS is NP-hard. Inspired by MSC methods, an O(2^m·m^2) upper-bound and Ω(2^(m-1)) lower-bound algorithm for MOS is presented, where m is the number of observations. By using integer programming (IP) technologies, the OOS or the fault-tolerant OOS can be found. Proofs are provided to show that these algorithms are guaranteed to find optimal solutions.
ZHANG Li-Ning, GONG Mao-Guo, JIAO Li-Cheng, MA Wen-Ping
Abstract: Based on the antibody clonal selection theory of immunology, an artificial immune system algorithm, the clonal selection algorithm based on anti-idiotype (AICSA), is proposed to deal with complex multi-modal optimization problems by introducing the anti-idiotype. This algorithm evolves and improves the antibody population through clonal proliferation, anti-idiotype mutation, anti-idiotype recombination, and clonal selection operations, which perform global and local search in many directions simultaneously rather than in one direction around the identical antibody. Theoretical analysis proves that AICSA converges to the global optimum. By introducing the anti-idiotype, AICSA makes the most of the structural information of antibodies, accelerates convergence, and reaches the global optimum quickly. In experiments, AICSA is tested on four different types of functions and compared with the clonal selection algorithm and other optimization methods. Theoretical analysis and experimental results indicate that AICSA achieves good performance and is an effective and robust optimization technique.
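A baseline clonal selection loop (without the anti-idiotype operators that distinguish AICSA) can be sketched for a generic minimization problem as follows; it shows only the clonal proliferation and selection skeleton that AICSA builds on.

    import numpy as np

    def clonal_selection(f, dim, pop=20, n_clones=5, gens=100, seed=0):
        # Each antibody spawns mutated clones; the mutation radius grows
        # with the parent's rank, so worse antibodies explore more widely.
        rng = np.random.default_rng(seed)
        ab = rng.uniform(-5.0, 5.0, (pop, dim))
        fit = np.apply_along_axis(f, 1, ab)
        for _ in range(gens):
            for rank, i in enumerate(np.argsort(fit)):
                scale = 0.1 + 0.9 * rank / pop
                clones = ab[i] + rng.normal(0.0, scale, (n_clones, dim))
                cf = np.apply_along_axis(f, 1, clones)
                j = int(cf.argmin())
                if cf[j] < fit[i]:          # clonal selection step
                    ab[i], fit[i] = clones[j], cf[j]
        b = int(fit.argmin())
        return ab[b], fit[b]

    # Example on the sphere function:
    # clonal_selection(lambda v: float(np.sum(v * v)), dim=10)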
CAO Dong-Lin, LIAO Xiang-Wen, XU Hong-Bo, BAI Shuo
Abstract: Based on information theory, this paper presents a model based on Web format information quantity for blog information extraction. First, the visual information in a blog Web page is combined with the effective text information to locate the main text, which represents the theme of the page. Second, the format information of the blog Web page is used to calculate the information quantity of each block, and the minimal separating information quantity of a separating position is used to detect the boundaries of posts and comments in the main text. This model is language-independent and can be applied to blogs written in different natural languages. Experimental results show that the method achieves high precision in locating the main text and in separating posts from comments.
Abstract: In order to solve the problems of data sparseness and knowledge acquisition in translation disambiguation and WSD (word sense disambiguation), this paper introduces an unsupervised method based on the n-gram language model and Web mining. It is assumed that a latent relationship exists between the word sense and the n-gram language model. Based on this assumption, a mapping between the English translations of a Chinese word and the DEF of HowNet is established and the word set is acquired. Then the probabilities of the n-grams in the word set are calculated from the query results of a search engine, and disambiguation is performed via these probabilities. This method is evaluated on the gold-standard Multilingual Chinese-English Lexical Sample Task dataset. Experimental results show that the model achieves state-of-the-art results (Pmar = 55.9%) and outperforms the best system in SemEval-2007 by 12.8%.
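The scoring step can be sketched as below, assuming a hypothetical hit_count(query) oracle that wraps a search-engine count API (no real API is implied): each candidate translation is slotted into the context n-grams and ranked by smoothed log relative frequency.

    import math

    def disambiguate(context_ngrams, candidates, hit_count):
        # context_ngrams: templates with '_' marking the target slot,
        # e.g. 'play the _'. hit_count is a hypothetical count oracle.
        best, best_score = None, float('-inf')
        for word in candidates:
            score = 0.0
            for ngram in context_ngrams:
                c = hit_count(ngram.replace('_', word)) + 1  # add-one smoothing
                total = hit_count(word) + 1
                score += math.log(c / total)  # log relative frequency
            if score > best_score:
                best, best_score = word, score
        return best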
Abstract: This paper proposes an algorithm named the Small Sample Incremental Biased Learning Algorithm to solve three difficulties of relevance feedback in sketch retrieval: the small sample issue, the asymmetry of training data, and the real-time requirement. The algorithm combines active learning, biased classification, and incremental learning to model the small-sample biased learning problem in the relevance feedback process. Active learning employs uncertainty sampling to choose the most informative samples to label, so that the generalization ability of the classifier is maximized with the limited training data; biased classification constructs hyperspheres to treat positive and negative data differently, which distinguishes the user’s target class accurately; newly labeled samples in each feedback loop are used to train the classifier incrementally to reduce the training time, and incremental learning also accumulates training data to further alleviate the small sample problem. Experimental results show that the algorithm improves the performance of sketch retrieval and can be extended to other retrieval domains such as CBIR (content-based image retrieval) and 3D retrieval.
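For the uncertainty sampling step, a minimal sketch against a hypersphere classifier follows (center and radius describe the current positive-class hypersphere; feature extraction and the incremental update are omitted):

    import numpy as np

    def uncertainty_sample(candidates, center, radius, k=5):
        # Samples nearest the hypersphere boundary are the most ambiguous,
        # hence the most informative ones to ask the user to label.
        d = np.linalg.norm(candidates - center, axis=1)
        margin = np.abs(d - radius)
        return np.argsort(margin)[:k]   # indices of the k query samples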
DAI Dong-Bo, ZHAO Gang, SUN Sheng-Li
Abstract: An effective clustering algorithm called “P-Stream” for probabilistic data streams is developed in this paper for the first time. For the uncertain tuples in the data stream, the concepts of strong, transitional, and weak clusters are proposed in P-Stream. With these concepts, an effective strategy for choosing the candidate cluster is designed, which can find a sound cluster for every continuously arriving data point. Then, in order to cluster at a higher level and analyze the evolving behaviors of data streams, snapshots of micro-clusters are stored at every checkpoint. Finally, an “aggressive” two-tier clustering model is introduced to judge whether the most recently arrived data point fits the first-level clustering model. Probabilistic data streams in the experiments include the KDD-CUP’98 and KDD-CUP’99 real data sets and synthetic data sets with changing Gaussian distributions. Comprehensive experimental results demonstrate that P-Stream achieves high clustering quality and a fast processing rate, and adapts efficiently to the evolving situations of data streams.
DAI Qun, CHEN Song-Can, WANG Zhe
Abstract: An SOM (self-organizing feature map)-based integrated network, namely ICBP-SOM, is constructed by applying the ICBP network model to the BP-SOM architecture. BP-SOM is a learning algorithm put forward by Ton Weijters, which aims to overcome some serious limitations of BP in generalizing knowledge from certain types of learning material. The motivation for this integration is to employ BP-SOM’s good knowledge interpretation ability and ICBP’s good generalization and adaptability to construct an ICBP-SOM that possesses favorable knowledge representation capability and competitive generalization performance. Experimental results on six benchmark data sets validate the feasibility and effectiveness of the integration.
Abstract: This paper surveys the current technologies adopted in cloud computing as well as the systems used in enterprises. Cloud computing can be viewed from two different aspects: one is the cloud infrastructure, which is the building block for upper-layer cloud applications; the other is the cloud applications themselves. This paper focuses on the cloud infrastructure, including existing systems and current research; some attractive cloud applications are also discussed. Cloud computing infrastructure has three distinct characteristics. First, the infrastructure is built on top of large-scale clusters containing a large number of cheap PC servers. Second, the applications are co-designed with the fundamental infrastructure so that the computing resources can be utilized maximally. Third, the reliability of the whole system is achieved by software built on top of redundant hardware rather than by hardware alone. All these technologies serve the two most important goals of distributed systems: high scalability and high availability. Scalability means that the cloud infrastructure can be expanded to a very large scale, even to thousands of nodes; availability means that the services remain available even when quite a number of nodes fail. From this paper, readers will capture the current status of cloud computing as well as its future trends.
Abstract: This paper briefly reviews the theoretical research on network coding, from which the significance of research on optimization problems is revealed. Based on the network information flow model, it surveys the formulations, characteristics, and algorithms of these optimization problems together with the latest results. According to the optimization goal, the typical optimization problems in network coding are classified into four categories: minimum-cost multicast, throughput maximization in undirected networks, minimization of the number of coding nodes and links, and topology design of network coding-based multicast networks. The general approaches to these problems are categorized: for (linear or convex) programming problems, the solutions are summarized; for NP-complete problems, the latest heuristic algorithms and their difficulties are analyzed. Perspectives on future work are also discussed.
ZHANG Yu-Xiang, YANG Dong, ZHANG Hong-Ke
Abstract: Churn is one of the main problems faced by all P2P networks. This paper surveys the existing theories and methods concerning Churn. Starting from the generation mechanism of Churn, this study summarizes the major steps of solving the Churn problem: precise measurement of Churn, analysis of the impact of Churn on P2P performance, and provision of specific strategies to handle Churn. For each step, existing problems, solutions, and the latest research are explored. This paper also discusses prospects for future studies.
Abstract: This paper studies the critical traffic matrix selection problem and develops an algorithm called MinMat, which uses information entropy to select the initial critical matrices and then takes the merging cost into consideration when agglomerating a pair of clusters. The algorithm is evaluated using a large collection of real traffic matrices collected in the Abilene network. Theoretical analysis and experimental results demonstrate that MinMat is more effective than K-means, hierarchical agglomeration, and CritAC, and simulation on Totem shows that a small number of critical traffic matrices suffices to yield satisfactory performance.
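The entropy computation that could seed such a selection is sketched below; how MinMat actually combines entropy with the merging cost is simplified away here.

    import numpy as np

    def matrix_entropy(tm):
        # Shannon entropy of a traffic matrix normalized to a distribution.
        p = tm.ravel() / tm.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def initial_critical(tms, k):
        # Seed the critical set with the k highest-entropy matrices.
        h = np.array([matrix_entropy(t) for t in tms])
        return np.argsort(h)[-k:]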
ZHAO Li, YUAN Rui-Xi, GUAN Xiao-Hong, JIA Qing-Shan
Abstract: A discrete-time dynamic model is proposed for the bursty propagation of incidental events, based on node popularity and activeness in blog networks. The parameters of this model are clearly associated with the actual propagation and can reflect the characteristics of the dynamic propagation process, so the model can provide a basis for predicting the trend of social event propagation in blog networks. Numerical testing is performed with data from widely discussed events collected over several months from Sina Blog, one of the most popular blogospheres in China. The results show that this model can emulate actual event propagation and reflect the heavy-tail phenomenon of the decreasing propagation rate.