• Volume 20, Issue 5, 2009 Table of Contents
    • LBS-Oriented Creation Method and Implementation for Telecommunication Value-Added Services

      2009, 20(5):965-974.

      Abstract (4500) HTML (0) PDF 591.59 K (4847) Comment (0) Favorites

      Abstract:Location-based services (LBS) are considered to be the next wave of telecom value-added services after the short message service (SMS). This paper first introduces XPL (extended-calling process language), a service-flow description language previously developed for value-added services in the telecom field, and then presents GDL (geography description language), a geographic information service description language built on that prior work. Together, XPL and GDL can describe LBS-oriented telecom services in a coordinated way. Both languages offer a high level of abstraction, are easy to use, and support rapid service development. The paper also presents the supporting system for these languages; because it is based on SOA (service-oriented architecture), the supporting system is well suited to service creation under network convergence.

    • Special Issue's Articles
    • Three-Dimensional Reconstruction for Fluorescence Tomography Using Cylinder Phantoms

      2009, 20(5):1089-1095.

      Abstract (8963) HTML (0) PDF 462.97 K (9351) Comment (0) Favorites

      Abstract:Based on finite element analysis, this study applies three iterative regularization algorithms to the 3D reconstruction of fluorescent yield: conjugate gradient least squares (CGLS), least squares QR decomposition (LSQR), and a second-order pre-iteration method. Using a non-contact, multi-angle transmission imaging system, experiments with a single fluorescent target and with double targets are conducted on a commonly used cylinder phantom. Experimental results show that all three methods estimate the positions of the fluorescent targets accurately, while the time cost of each method is only about 2% of that of the algebraic reconstruction technique (ART).
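
      After FEM discretization and linearization, the reconstruction step these methods address reduces to a large sparse least-squares problem. As an illustration only, the following sketch recovers a synthetic fluorescent-yield vector with SciPy's LSQR; the random forward matrix, noise level, and damping parameter are placeholders, not the paper's setup.

```python
# Minimal sketch: recovering a fluorescent-yield vector from boundary
# measurements with LSQR, assuming a precomputed (here synthetic) forward matrix.
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n_meas, n_nodes = 200, 500                                        # detector readings x FEM nodes
A = sparse_random(n_meas, n_nodes, density=0.05, random_state=0)  # stand-in forward model
x_true = np.zeros(n_nodes)
x_true[rng.choice(n_nodes, size=5, replace=False)] = 1.0          # sparse fluorescent targets
b = A @ x_true + 1e-3 * rng.standard_normal(n_meas)               # noisy boundary data

# damp acts as Tikhonov-style regularization; iter_lim caps the iterations.
x_rec = lsqr(A, b, damp=1e-2, iter_lim=100)[0]
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```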

    • Medical Image Fusion Algorithm Based on Bidimensional Empirical Mode Decomposition

      2009, 20(5):1096-1105.

      Abstract (8921) HTML (0) PDF 1.24 M (10794) Comment (0) Favorites

      Abstract:An adaptive medical image fusion algorithm based on the representation of bidimensional empirical mode decomposition (BEMD) is proposed. Source medical images are decomposed into a number of bidimensional intrinsic mode functions (BIMFs) and a residual image. Image features are extracted by applying the Hilbert-Huang transform to the BIMFs. A composite BEMD representation is then formed by applying region-based fusion rules to the BEMD data, and the fused image is obtained by inverse BEMD on the composite representation. BEMD is an adaptive, data-driven decomposition and has better performance than the Fourier and wavelet transforms; the proposed algorithm requires no predetermined filters or wavelet functions. Experimental results show that the proposed algorithm outperforms conventional fusion algorithms in improving the quality of fused images.
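
      As a rough illustration of the region-based fusion step (not the paper's exact rules), the sketch below assumes the BIMFs and residues of both source images have already been computed by some BEMD implementation and merges them by comparing local energy.

```python
# Minimal sketch of a region-energy fusion rule on precomputed BIMFs.
# bimfs_a / bimfs_b are lists of 2-D arrays from some BEMD implementation
# (not provided here); the residues are the final low-frequency components.
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_bemd(bimfs_a, bimfs_b, residue_a, residue_b, win=7):
    fused_levels = []
    for ia, ib in zip(bimfs_a, bimfs_b):
        # Local energy in a win x win neighbourhood decides which source wins.
        ea = uniform_filter(ia ** 2, size=win)
        eb = uniform_filter(ib ** 2, size=win)
        fused_levels.append(np.where(ea >= eb, ia, ib))
    residue = 0.5 * (residue_a + residue_b)      # average the low-frequency parts
    return sum(fused_levels) + residue           # inverse BEMD is just the sum
```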

    • Energy Conduction Model and Its Application in Medical Image Segmentation

      2009, 20(5):1106-1115.

      Abstract (8723) HTML (0) PDF 716.71 K (7643) Comment (0) Favorites

      Abstract:This paper proposes an energy conduction model (ECM) within the level set framework, which uses the heat conduction equation to construct the image energy. By comparing the image intensity distribution with the spatial distribution of a temperature field, an energy conduction function is defined that closely simulates the heat-conduction process. The advantage of the ECM is that it captures the global features of an image while also taking local intensity information into account. As a result, ECM can accurately segment medical images with intensity inhomogeneity and noise, as well as medical images containing multiple targets. Tests on synthetic and real medical images demonstrate its robustness and efficiency.

    • Uniform-Structure-Based Parallel Hybrid Optimization Strategy in Radiotherapy Planning

      2009, 20(5):1116-1122.

      Abstract (7474) HTML (0) PDF 632.96 K (8827) Comment (0) Favorites

      Abstract:To meet the optimization requirements of inverse planning for intensity-modulated conformal radiotherapy, a new objective function is constructed that targets the well-defined objective area. Taking a combination of simulated annealing and genetic algorithms as an example, this paper studies a parallel hybrid optimization strategy for inverse radiotherapy planning, forms a parallel general neighborhood-search hybrid optimization algorithm based on a uniform structure, and implements the algorithm on a computer with multiple multi-core CPUs. The dose distributions obtained with the algorithm for a virtual phantom and five clinical cases are described, with satisfactory results. The results show that this algorithm is effective and practical, provides a good platform for further research on parallel hybrid algorithms, and lays a foundation for the development of treatment planning systems using biologically guided radiotherapy technologies.
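
      A minimal sketch of the kind of hybrid strategy described above: a small genetic loop whose individuals are refined by short simulated-annealing runs executed in parallel. The quadratic dose objective, beam count, and all parameters are illustrative assumptions, not the paper's formulation.

```python
# Toy parallel hybrid of a genetic algorithm and simulated annealing.
import numpy as np
from multiprocessing import Pool

N_BEAMS, POP, GENS = 8, 16, 20
TARGET = np.ones(N_BEAMS)                      # stand-in for a prescribed dose

def objective(w):
    return float(np.sum((w - TARGET) ** 2))    # placeholder dose objective

def anneal(w, steps=50, temp=1.0, cool=0.95):
    """Short simulated-annealing refinement of one candidate beam-weight vector."""
    rng = np.random.default_rng()
    cur = objective(w)
    for _ in range(steps):
        cand = np.clip(w + 0.1 * rng.standard_normal(w.size), 0, None)
        delta = objective(cand) - cur
        if delta < 0 or rng.random() < np.exp(-delta / temp):
            w, cur = cand, cur + delta
        temp *= cool
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pop = [rng.random(N_BEAMS) for _ in range(POP)]
    with Pool() as pool:
        for _ in range(GENS):
            pop = pool.map(anneal, pop)                    # SA refinement in parallel
            pop.sort(key=objective)
            parents = pop[:POP // 2]                       # truncation selection
            children = [0.5 * (a + b) for a, b in zip(parents, reversed(parents))]
            pop = parents + children                       # next generation
    print("best objective:", objective(min(pop, key=objective)))
```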

    • Analysis of Magnetic Resonance Imaging Data on the Study of Alzheimer’s Disease

      2009, 20(5):1123-1138.

      Abstract (8641) HTML (0) PDF 1.02 M (15999) Comment (0) Favorites

      Abstract:First, current Alzheimer’s disease (AD) studies using structural magnetic resonance imaging (MRI), functional MRI, and diffusion tensor imaging (DTI) are reviewed. Then the primary processing approaches for these three kinds of MRI data are introduced and analyzed. Finally, an AD neuroimaging database and the construction of a corresponding diagnosis platform are described, along with some related research advances of the authors' group in this field.

    • Automatic Brain Tissue Extraction Approach of Magnetic Resonance Head Images

      2009, 20(5):1139-1145.

      Abstract (8013) HTML (0) PDF 708.15 K (10415) Comment (0) Favorites

      Abstract:This paper presents an effective automatic brain extraction method based on two successive segmentation stages. Image noise is removed by anisotropic diffusion filtering based on the Catté model while image details are preserved. The over-segmentation problem of the watershed algorithm is addressed by merging regions of similar gray scale, which yields an initial segmentation of the brain tissue. Because the edges between different tissues in a brain image are fuzzy and MRI data are susceptible to noise, non-brain regions may be mistaken for brain regions during merging. To solve these problems, a level set method is adopted: the watershed outline is taken as the initial curve of the level set to achieve automatic segmentation of the brain tissue. Experimental results demonstrate the feasibility and practicality of the algorithm.
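
      A simplified stand-in for the first two stages of such a pipeline (diffusion filtering followed by marker-based watershed) is sketched below, assuming scikit-image is available and the input image is scaled to [0, 1]; the Perona-Malik-style loop and percentile markers are illustrative, not the paper's exact Catté-model filter or region-merging rule.

```python
# Minimal pipeline sketch: edge-preserving diffusion, then marker-based watershed.
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

def diffuse(img, n_iter=20, kappa=0.1, lam=0.2):
    u = img.astype(float)
    for _ in range(n_iter):
        # Differences to the four neighbours, damped by an edge-stopping function.
        grads = [np.roll(u, s, ax) - u for ax in (0, 1) for s in (1, -1)]
        u = u + lam * sum(g * np.exp(-(g / kappa) ** 2) for g in grads)
    return u

def extract_brain(img):
    smooth = diffuse(img)
    gradient = sobel(smooth)                              # ridges guide the watershed
    markers = np.zeros(img.shape, dtype=int)
    markers[smooth < np.percentile(smooth, 20)] = 1       # background seed
    markers[smooth > np.percentile(smooth, 80)] = 2       # candidate brain-tissue seed
    return watershed(gradient, markers) == 2
```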

    • Learning-Based Medical Image Magnification Algorithm by Local Geometric Similarity

      2009, 20(5):1146-1155.

      Abstract (8837) HTML (0) PDF 567.90 K (8827) Comment (0) Favorites

      Abstract:Image magnification is an important technology in medical image processing. High-detail areas in medical images usually have a definite geometric structure or pattern, as in the case of edges. This paper proposes a learning-based method in which geometric features extracted from the available neighboring pixels in the low-resolution (LR) image form the training set. The training set is assumed to correspond locally to the geometric features of the high-resolution (HR) patch to be reconstructed, and this correspondence is described as local geometric similarity. Image magnification is then formulated as an optimization problem whose coefficients adaptively tune their values to propagate features from the training set to the target HR patch effectively. The advantages are the ability to magnify an image by any factor without requiring any outlier support. A weighted least squares (WLS) method offers a convenient way to find the regularized optimal solution, where the weight function is determined by non-local means. Simulation and comparison results show that the proposed method is independent and adaptive, and produces sharp edges with few ringing or jaggy artifacts.
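
      The core numerical step is a regularized weighted least-squares solve. The sketch below shows that step in isolation with placeholder data; the feature matrix, observations, and weight vector are assumptions standing in for the paper's locally extracted geometric features and non-local-means weights.

```python
# Minimal sketch of the weighted-least-squares step for one patch.
import numpy as np

def wls_solve(A, b, w, lam=1e-2):
    """Solve min_x ||W^(1/2)(A x - b)||^2 + lam ||x||^2 in closed form."""
    W = np.diag(w)
    lhs = A.T @ W @ A + lam * np.eye(A.shape[1])
    rhs = A.T @ W @ b
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(2)
A = rng.standard_normal((12, 4))         # rows: local geometric features (placeholder)
b = rng.standard_normal(12)              # observed LR pixel values (placeholder)
w = np.exp(-np.arange(12) / 4.0)         # e.g. non-local-means-style weights
print(wls_solve(A, b, w))
```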

    • Numerical Simulations and Preliminary Experiments of the Principle of X-Ray In-Line Phase-Contrast Imaging

      2009, 20(5):1156-1165.

      Abstract (8275) HTML (0) PDF 748.96 K (9676) Comment (0) Favorites

      Abstract:This paper introduces the design, analysis, and experimental verification of a focal-spot X-ray in-line phase-contrast imaging system. First, the object transmission function is determined by the refractive index decrement and the absorption index of tissue at the previously selected energy. After analyzing the factors that affect imaging quality, such as the object-to-detector distance and the object edge width, methods proposed to improve image resolution are validated. In the experiment, a plastic set square is imaged with the X-ray phase-contrast technique, and the result is compared with that of the traditional X-ray imaging technique. The experimental results are highly consistent with the earlier numerical simulations, which shows that this method can be used to determine the parameters of the imaging system.

    • Shadow Zone Padding Method for Grangeat Reconstruction of Circular Cone-Beam CT

      2009, 20(5):1166-1175.

      Abstract (8814) HTML (0) PDF 861.43 K (10298) Comment (0) Favorites

      Abstract:Based on an analysis of the shadow zone in the Radon domain generated by circular cone-beam CT scanning and of existing padding methods, this paper proposes a shadow zone padding method based on distance-weighted interpolation. The method uses known boundary data to interpolate the missing data inside the shadow zone with a distance weighting function. Simulation results show that the proposed method outperforms constant padding in data-padding accuracy, artifact reduction, and image reconstruction for large-cone-angle cone-beam scanning. The padding method is particularly useful in the Grangeat reconstruction algorithm and enlarges the applicable scope of circular cone-beam tomography.
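
      A minimal sketch of distance-weighted interpolation in this spirit: missing samples are filled from known boundary samples with inverse-distance weights. Coordinates, values, and the weighting exponent are illustrative placeholders, not the paper's Radon-domain geometry.

```python
# Inverse-distance-weighted padding of missing samples from known boundary data.
import numpy as np

def idw_pad(boundary_pts, boundary_vals, missing_pts, power=2, eps=1e-9):
    filled = np.empty(len(missing_pts))
    for i, p in enumerate(missing_pts):
        d = np.linalg.norm(boundary_pts - p, axis=1)
        w = 1.0 / (d ** power + eps)              # closer boundary data weigh more
        filled[i] = np.dot(w, boundary_vals) / w.sum()
    return filled

boundary_pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
boundary_vals = np.array([1.0, 2.0, 2.0, 3.0])
missing_pts = np.array([[0.5, 0.5], [0.2, 0.8]])
print(idw_pad(boundary_pts, boundary_vals, missing_pts))
```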

    • Method for Segmentation of the Endocardium and Epicardium of the Left Ventricle in Cardiac Magnetic Resonance Images

      2009, 20(5):1176-1184.

      Abstract (7721) HTML (0) PDF 1003.87 K (9085) Comment (0) Favorites

      Abstract:To make full use of the anatomical and functional information contained in cardiac magnetic resonance images, the epicardium and endocardium of the left ventricle must first be extracted. This paper presents a method for segmenting the endocardium and epicardium of the left ventricle in cardiac magnetic resonance images using Snake models. It first proposes an external force for active contours called the convolutional virtual electric field (CONVEF). This force offers an enlarged capture range, noise resistance, and convergence into C-shaped concavities, and it can be computed in real time with the fast Fourier transform because it is convolution-based. Since the left ventricle is roughly circular, a circle-based shape constraint is adopted for segmenting the endocardium. To locate the epicardium, an internal energy based on shape similarity is proposed, and an edge map is constructed to compute a new external force by exploiting the resemblance between the endocardium and epicardium in shape and position. With these strategies, and taking the final endocardial contour as initialization, the Snake contour is reactivated to locate the epicardium automatically and accurately. The approach is demonstrated on an in vivo dataset, and the segmented contours are compared with those of the GGVF (generalized gradient vector flow) Snake and with manual delineations. The results show its effectiveness.
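
      A minimal sketch of a convolution-based external force computed with FFT convolution follows; the decaying kernel and the normalization are generic assumptions in the spirit of such forces, not the paper's exact CONVEF definition.

```python
# External force field from an edge map via FFT convolution with a decaying kernel.
import numpy as np
from scipy.signal import fftconvolve

def convolution_force(edge_map, n=1.5):
    h, w = edge_map.shape
    y, x = np.mgrid[-h // 2:h // 2, -w // 2:w // 2].astype(float)
    r = np.hypot(x, y) + 1.0
    kernel = 1.0 / r ** n                      # long-range decaying kernel (assumed form)
    pot = fftconvolve(edge_map, kernel, mode="same")
    fy, fx = np.gradient(pot)                  # force = gradient of the potential
    mag = np.hypot(fx, fy) + 1e-12
    return fx / mag, fy / mag                  # normalized force components
```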

    • Automatic Image Segmentation Method Using Sequential Level Set

      2009, 20(5):1185-1193.

      Abstract (8851) HTML (0) PDF 1.24 M (10733) Comment (0) Favorites

      Abstract:Based on the level set method without re-initialization, a sequential level set method is proposed to achieve full image segmentation. The proposed method automatically and alternately creates nested sub-regions and the corresponding initial level set functions in the image to be segmented, and then evolves the level set function to convergence within the corresponding sub-region. This step is repeated until the sub-region vanishes. Compared with the original method and a representative region-based level set method, the proposed method has the following advantages: 1) it runs automatically and no longer needs interactive initialization; 2) it segments the image more than once and detects more boundaries than the original method; 3) it performs better on inhomogeneous images than the representative region-based level set method; 4) it is an open segmentation framework in which the single level set method used here can be replaced by other single level set methods with some modification. Experimental results indicate that the proposed method can fully segment synthetic and medical images without interaction and is more robust on inhomogeneous images.

    • Research on Bioluminescent Tomography Forward Problem Based on Non-Truncated Wavelet Finite Element

      2009, 20(5):1194-1206.

      Abstract (7538) HTML (0) PDF 985.97 K (7724) Comment (0) Favorites

      Abstract:This paper proposes a non-truncated wavelet finite element algorithm for bioluminescence tomography (BLT). Using a linear combination of non-truncated wavelet functions across elements to approximate the unknown function, the algorithm is successfully applied to the BLT forward problem in cylindrical and spherical phantoms. Theoretical analysis and numerical simulations show that its accuracy is almost as good as that of the finite element method (FEM), while the number of elements and the computational complexity are greatly reduced compared with FEM.

    • Neuroimaging Study of Acupuncture Effects

      2009, 20(5):1207-1215.

      Abstract (7494) HTML (0) PDF 599.33 K (10238) Comment (0) Favorites

      Abstract:This paper introduces modern medical imaging technologies (fMRI (functional magnetic resonance imaging), PET (positron emission tomography), EEG (electroencephalography) and MEG (magnetoencephalography)) for deeper exploration of acupuncture. Starting from experimental design and acupoint selection, acupuncture's regulation of the brain functional network, the paralimbic system, and the brain stem network, and its control of the endogenous analgesic system as well as the autonomic nervous system, are presented from three aspects. The paper summarizes imaging-based studies of the neural mechanisms of acupuncture, discusses problems in current acupuncture research, and provides a framework for future experimental work aimed at a better understanding of the neural mechanisms of acupuncture.

    • Applications of Monte Carlo Method in Simulating Diffuse Optical Imaging

      2009, 20(5):1216-1225.

      Abstract (7787) HTML (0) PDF 640.92 K (10339) Comment (0) Favorites

      Abstract:This paper reviews models of the forward problem in diffuse optical imaging, namely analytic, numerical, and statistical solutions. In particular, a case of the Monte Carlo (MC) method in which a multi-spectral bioluminescence source is embedded in tissue is introduced. According to the imaging modality, the required imaging quality, and the information to be extracted, optical imaging falls into three major categories: continuous-wave, time-domain, and frequency-domain. The fundamental principles of each category are presented, and typical applications and software packages using the MC method are introduced. On the basis of these applications, the MC method has played an indispensable role in the development of diffuse optical imaging, especially for in vivo, non-invasive imaging, in recent years.
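
      As a toy illustration of the statistical (MC) approach, the sketch below follows photons through a homogeneous scattering slab with exponential step lengths, partial weight absorption, and isotropic scattering; the optical properties and geometry are assumed values, far simpler than the multi-spectral bioluminescence case discussed.

```python
# Minimal Monte Carlo photon random walk in a homogeneous slab.
import numpy as np

rng = np.random.default_rng(3)
MU_A, MU_S = 0.1, 10.0                 # absorption / scattering coefficients (1/mm)
MU_T = MU_A + MU_S
THICKNESS = 2.0                        # slab thickness in mm

def run_photon():
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])
    weight = 1.0
    while weight > 1e-2:
        step = -np.log(rng.random()) / MU_T          # sampled free path length
        pos = pos + step * direction
        if pos[2] < 0 or pos[2] > THICKNESS:         # photon left the slab
            return weight
        weight *= 1.0 - MU_A / MU_T                  # partial absorption
        # Isotropic scattering: draw a new direction uniformly on the sphere.
        cos_t = 2 * rng.random() - 1
        phi = 2 * np.pi * rng.random()
        sin_t = np.sqrt(1 - cos_t ** 2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return 0.0

escaped = sum(run_photon() for _ in range(1000)) / 1000
print("mean escaped weight per photon:", escaped)
```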

    • Review Articles
    • Artificial Intelligence Planning Methods Based on Automated Reasoning Techniques

      2009, 20(5):1226-1240.

      Abstract (16509) HTML (0) PDF 926.82 K (18984) Comment (0) Favorites

      Abstract:This paper describes in detail how automated reasoning techniques are combined with planning methods, including planning as satisfiability using propositional logic, conformant planning using modal logic and disjunctive reasoning, planning based on nonmonotonic logic, and flexible planning based on fuzzy description logic. After considering the experimental results of the International Planning Competition and the relevant literature, it concludes that planning methods based on automated reasoning techniques are useful and worth adopting. It also outlines the challenges and likely research hotspots.

    • Statistical Machine Translation Model Based on a Synchronous Tree-Substitution Grammar

      2009, 20(5):1241-1253.

      Abstract (4958) HTML (0) PDF 744.60 K (8431) Comment (0) Favorites

      Abstract:A translation model based on a synchronous tree-substitution grammar is presented in this paper. It elegantly models global reordering and discontinuous phrases, and it can learn non-isomorphic tree-to-tree mappings. Experimental results on two different data sets show that the proposed model significantly outperforms the phrase-based model and a model based on a synchronous context-free grammar.

    • Further Research on Observation Reduction in Non-Deterministic Planning

      2009, 20(5):1254-1268.

      Abstract (5709) HTML (0) PDF 867.07 K (6279) Comment (0) Favorites

      Abstract:This paper improves the methods of observation reduction in non-deterministic planning (NDP) in three respects: finding a minimal observation set (MOS); finding an optimal observation set (OOS) when observations have different costs; and finding a fault-tolerant OOS. The MOS problem is similar to the minimal set cover (MSC) problem, so finding an MOS can be proved NP-hard. Inspired by MSC methods, an O(2^m·m^2) (and Ω(2^(m-1))) algorithm for MOS is presented, where m is the number of observations. Using integer programming (IP) techniques, an OOS or a fault-tolerant OOS can be found. Proofs show that these algorithms are guaranteed to find optimal solutions.
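
      A toy version of the exhaustive search behind the exponential bound is sketched below: observation subsets are enumerated in increasing size until one covers every state pair that must be distinguished. The universe and the per-observation coverage sets are hypothetical examples, not the paper's construction.

```python
# Brute-force minimal set cover as a stand-in for minimal-observation-set search.
from itertools import combinations

def minimal_observation_set(universe, covers):
    """covers maps each observation to the set of state pairs it separates."""
    obs = list(covers)
    for k in range(1, len(obs) + 1):
        for subset in combinations(obs, k):
            if set().union(*(covers[o] for o in subset)) >= universe:
                return set(subset)           # first hit is a smallest covering subset
    return None

universe = {1, 2, 3, 4, 5}
covers = {"o1": {1, 2, 3}, "o2": {3, 4}, "o3": {4, 5}, "o4": {1, 5}}
print(minimal_observation_set(universe, covers))   # e.g. {'o1', 'o3'}
```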

    • Clonal Selection Algorithm Based on Anti-Idiotype

      2009, 20(5):1269-1281.

      Abstract (5593) HTML (0) PDF 854.50 K (7487) Comment (0) Favorites

      Abstract:Based on the antibody clonal selection theory of immunology, an artificial immune system algorithm, the clonal selection algorithm based on anti-idiotype (AICSA), is proposed to deal with complex multi-modal optimization problems by introducing the anti-idiotype. The algorithm evolves and improves the antibody population through clonal proliferation, anti-idiotype mutation, anti-idiotype recombination, and clonal selection, which allows it to perform global and local search in many directions simultaneously rather than in a single direction around one antibody. Theoretical analysis proves that AICSA converges to the global optimum. By introducing the anti-idiotype, AICSA makes full use of the structural information of antibodies, accelerates convergence, and reaches the global optimum quickly. In experiments, AICSA is tested on four different types of functions and compared with the clonal selection algorithm and other optimization methods. Theoretical analysis and experimental results indicate that AICSA achieves good performance and is an effective and robust optimization technique.
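
      For orientation only, the sketch below runs a generic clonal-selection loop on a toy 2-D objective; the extra mutation drawn around the current best antibody is a loose stand-in for the anti-idiotype operators, not AICSA itself.

```python
# Generic clonal-selection loop on a toy maximization problem.
import numpy as np

rng = np.random.default_rng(4)

def affinity(x):                       # toy objective: maximize -||x||^2
    return -np.sum(x ** 2, axis=-1)

pop = rng.uniform(-5, 5, size=(20, 2))
for _ in range(100):
    pop = pop[np.argsort(affinity(pop))[::-1]]          # best antibodies first
    best = pop[0]
    clones = []
    for rank, ab in enumerate(pop[:10]):                 # clone the better half
        n_clones = 10 - rank                             # better rank => more clones
        sigma = 0.5 * (rank + 1) / 10                    # worse rank => larger mutation
        clones.append(ab + sigma * rng.standard_normal((n_clones, 2)))
    clones.append(best + 0.1 * rng.standard_normal((5, 2)))   # anti-idiotype-like step
    candidates = np.vstack([pop] + clones)
    pop = candidates[np.argsort(affinity(candidates))[::-1][:20]]
print("best solution:", pop[0], "affinity:", affinity(pop[0]))
```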

    • Extraction Model Based on Web Format Information Quantity in Blog Post and Comment Extraction

      2009, 20(5):1282-1291.

      Abstract (4776) HTML (0) PDF 612.89 K (7431) Comment (0) Favorites

      Abstract:Based on information theory, this paper presents a model for blog information extraction that uses the quantity of Web format information. First, the visual information of the blog page and the effective text information are combined to locate the main text, which represents the theme of the page. Second, the format information of the page is used to compute the information quantity of each block, and the position with the minimal separating information quantity is used to detect the boundary between posts and comments in the main text. The model is language-independent and can be applied to blogs written in many different natural languages. Experimental results show that the method achieves high precision both in locating the main text and in separating posts from comments.
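
      One simple way to realize a "format information quantity" is the Shannon entropy of the tag distribution inside a block, as sketched below; the regex-based tag extraction and the example blocks are simplifying assumptions, not the paper's model.

```python
# Shannon entropy of the HTML-tag distribution inside a page block.
import math
import re
from collections import Counter

def format_entropy(html_block):
    tags = re.findall(r"</?([a-zA-Z][a-zA-Z0-9]*)", html_block)
    if not tags:
        return 0.0
    counts = Counter(tags)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

post = "<div><p>Long article text ...</p><p>more text</p></div>"
comment = "<li><a href='u'>user</a><span>nice!</span><em>+1</em></li>"
print(format_entropy(post), format_entropy(comment))
```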

    • Unsupervised Translation Disambiguation by Using Semantic Dictionary and Mining Language Model from Web

      2009, 20(5):1292-1300.

      Abstract (4121) HTML (0) PDF 593.18 K (6440) Comment (0) Favorites

      Abstract:To address data sparseness and knowledge acquisition in translation disambiguation and word sense disambiguation (WSD), this paper introduces an unsupervised method based on an n-gram language model and Web mining. It is assumed that there is a latent relationship between a word sense and the n-gram language model. Based on this assumption, a mapping is established between the English translations of a Chinese word and the DEF entries of HowNet, and a word set is acquired. The probabilities of the n-grams in this word set are then estimated from the query results of a search engine, and disambiguation is performed using these probabilities. The method is evaluated on the gold-standard Multilingual Chinese-English Lexical Sample Task dataset. Experimental results show that the model achieves state-of-the-art results (Pmar = 55.9%), outperforming the best system in SemEval-2007 by 12.8%.

    • Small Sample Incremental Biased Learning Algorithm for Sketch Retrieval

      2009, 20(5):1301-1312.

      Abstract (5461) HTML (0) PDF 680.82 K (6972) Comment (0) Favorites

      Abstract:This paper proposes a small sample incremental biased learning algorithm to address three difficulties of relevance feedback in sketch retrieval: the small sample problem, the asymmetry of the training data, and the real-time requirement. The algorithm combines active learning, biased classification, and incremental learning to model the small-sample biased learning problem in the relevance feedback process. Active learning employs uncertainty sampling to choose the most informative samples to label, so that the generalization ability of the classifier is maximized with limited training data. Biased classification constructs hyperspheres to treat positive and negative data differently, which distinguishes the user's target class accurately. Newly labeled samples in each feedback loop are used to train the classifier incrementally, reducing training time; incremental learning also accumulates training data, further alleviating the small sample problem. Experimental results show that the algorithm improves the performance of sketch retrieval and can be extended to other retrieval domains such as CBIR (content-based image retrieval) and 3D retrieval.

    • Effective Clustering Algorithm for Probabilistic Data Stream

      2009, 20(5):1313-1328.

      Abstract (5483) HTML (0) PDF 1.01 M (7562) Comment (0) Favorites

      Abstract:An effective clustering algorithm for probabilistic data streams, called P-Stream, is developed in this paper for the first time. For the uncertain tuples in the stream, the concepts of strong, transitional, and weak clusters are proposed, and with these concepts an effective strategy for choosing candidate clusters is designed, which finds a sound cluster for every continuously arriving data point. Then, to support higher-level clustering and analysis of the evolving behavior of the stream, snapshots of the micro-clusters are stored at every checkpoint. Finally, an "aggressive" two-tier clustering model is introduced to judge whether the most recently arrived data point fits the first-level clustering model. The probabilistic data streams used in the experiments include the KDD-CUP'98 and KDD-CUP'99 real data sets and synthetic data sets with changing Gaussian distributions. Comprehensive experimental results demonstrate that P-Stream achieves high clustering quality and a fast processing rate and adapts efficiently to the evolving behavior of data streams.
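
      A minimal sketch of how an uncertain tuple might be assigned to a strong, transitional, or weak cluster follows; the distance and probability thresholds and the flat centroid list are illustrative assumptions, not P-Stream's actual micro-cluster structure.

```python
# Candidate-cluster assignment for a tuple with an existence probability.
import numpy as np

def assign(point, prob, centroids, d_strong=1.0, p_strong=0.8, p_weak=0.3):
    dists = np.linalg.norm(centroids - point, axis=1)
    idx = int(np.argmin(dists))                  # nearest micro-cluster
    if prob >= p_strong and dists[idx] <= d_strong:
        label = "strong"
    elif prob <= p_weak:
        label = "weak"
    else:
        label = "transitional"
    return idx, label

centroids = np.array([[0.0, 0.0], [5.0, 5.0]])
print(assign(np.array([0.4, 0.2]), 0.9, centroids))   # close and likely -> strong
print(assign(np.array([2.5, 2.4]), 0.5, centroids))   # ambiguous -> transitional
```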

    • Hybrid Neural Network Architecture Based on Self-Organizing Feature Maps

      2009, 20(5):1329-1336.

      Abstract (4408) HTML (0) PDF 422.54 K (6106) Comment (0) Favorites

      Abstract:An SOM (self-organizing feature map)-based integrated network, ICBP-SOM, is constructed by applying the ICBP network model to the BP-SOM architecture. BP-SOM is a learning algorithm proposed by Ton Weijters that aims to overcome some serious limitations of BP in generalizing knowledge from certain types of learning material. The motivation for the integration is to combine BP-SOM's good knowledge interpretation ability with ICBP's good generalization and adaptability, yielding an ICBP-SOM that possesses favorable knowledge representation capability and competitive generalization performance. Experimental results on six benchmark data sets validate the feasibility and effectiveness of the integration.

    • Review Articles
    • Cloud Computing: System Instances and Current Research

      2009, 20(5):1337-1348.

      Abstract (28418) HTML (0) PDF 1.06 M (46647) Comment (0) Favorites

      Abstract:This paper surveys the technologies currently adopted in cloud computing and the systems deployed in enterprises. Cloud computing can be viewed from two aspects: the cloud infrastructure, which is the building block for the upper-layer cloud applications, and the cloud applications themselves. This paper focuses on the cloud infrastructure, including existing systems and current research, and also discusses some attractive cloud applications. Cloud computing infrastructure has three distinct characteristics. First, the infrastructure is built on top of large-scale clusters containing a large number of inexpensive PC servers. Second, the applications are co-designed with the underlying infrastructure so that computing resources can be utilized to the maximum. Third, the reliability of the whole system is achieved by software built on top of redundant hardware rather than by the hardware alone. All these technologies serve the two key goals of distributed systems: high scalability and high availability. Scalability means the cloud infrastructure can be expanded to a very large scale, even to thousands of nodes; availability means the services remain available even when a considerable number of nodes fail. From this paper, readers can grasp the current status of cloud computing as well as its future trends.

    • Research on the Optimization Problems in Network Coding

      2009, 20(5):1349-1361.

      Abstract (7260) HTML (0) PDF 663.75 K (12823) Comment (0) Favorites

      Abstract:This paper briefly reviews theoretical research on network coding, from which the significance of studying its optimization problems is revealed. Based on the network information flow model, it surveys the formulation, characteristics, and algorithms of these optimization problems together with the latest results. According to the optimization goal, the typical optimization problems in network coding are classified into four categories: minimum-cost multicast, throughput maximization in undirected networks, minimizing the number of coding nodes and links, and topology design of network coding-based multicast networks. The general approaches to these problems are categorized: for (linear or convex) programming problems, the solutions are summarized; for NP-complete problems, the latest heuristic algorithms and their difficulties are analyzed. Perspectives on future work are also discussed.

    • Research on Churn Problem in P2P Networks

      2009, 20(5):1362-1376.

      Abstract (9273) HTML (0) PDF 864.95 K (11763) Comment (0) Favorites

      Abstract:Churn is one of the main problems faced by all P2P networks. This paper surveys existing theories and methods concerning Churn. Starting from the mechanism by which Churn arises, it summarizes the major steps in addressing the Churn problem: precise measurement of Churn, analysis of its impact on P2P performance, and provision of specific strategies to handle it. For each step, existing problems, solutions, and the most recent research are examined. The paper also discusses prospects for future studies.

    • Algorithm Based on Entropy for Finding Critical Traffic Matrices

      2009, 20(5):1377-1383.

      Abstract (4980) HTML (0) PDF 562.42 K (7515) Comment (0) Favorites

      Abstract:This paper studies the critical traffic matrix selection problem and develops an algorithm called MinMat, which first uses information entropy to select the initial critical matrices and then takes the merging cost into consideration when agglomerating pairs of clusters. The algorithm is evaluated on a large collection of real traffic matrices collected from the Abilene network. Theoretical analysis and experimental results demonstrate that MinMat is more effective than K-means, hierarchical agglomeration, and CritAC, and simulation on TOTEM shows that a small number of critical traffic matrices suffices to yield satisfactory performance.
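
      The two ingredients named above can be sketched as follows: rank traffic matrices by the entropy of their normalized demand distribution, keep the highest-entropy ones as initial critical matrices, and attach each remaining matrix to the closest critical one. The synthetic matrices and the L2 merging cost are assumptions for illustration, not MinMat's exact procedure.

```python
# Entropy-based seeding plus cost-based attachment of traffic matrices.
import numpy as np

rng = np.random.default_rng(5)
tms = rng.gamma(2.0, 1.0, size=(50, 12 * 12))     # 50 synthetic matrices, 12-node network

def entropy(tm):
    p = tm / tm.sum()                              # normalized demand distribution
    return -np.sum(p * np.log2(p + 1e-12))

k = 4
seeds = np.argsort([entropy(t) for t in tms])[-k:]           # initial critical matrices
assignment = [int(np.argmin([np.linalg.norm(t - tms[s]) for s in seeds]))
              for t in tms]                                   # cheapest merge target
print("critical matrices:", seeds,
      "cluster sizes:", np.bincount(assignment, minlength=k))
```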

    • Bursty Propagation Model for Incidental Events in Blog Networks

      2009, 20(5):1384-1392.

      Abstract (7430) HTML (0) PDF 548.27 K (8215) Comment (0) Favorites

      Abstract:A discrete-time dynamic model is proposed for the bursty propagation of incidental events in blog networks, based on node popularity and activeness. The parameters of the model are clearly associated with the actual propagation process and reflect the characteristics of its dynamics, so the model can serve as a basis for predicting the trend of social event propagation in blog networks. Numerical tests are performed with data collected over several months on widely discussed events in Sina Blog, one of the most popular blogospheres in China. The results show that the model emulates the actual event propagation and reproduces the heavy-tailed decline of the propagation rate.
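
      A toy discrete-time simulation in this spirit is sketched below: the adoption probability grows with node popularity and the number of blogs already discussing the event, and decays with an activeness term; the network size, seed count, and rate constants are arbitrary assumptions, not the paper's fitted model.

```python
# Discrete-time spread driven by node popularity and decaying activeness.
import numpy as np

rng = np.random.default_rng(6)
N, T = 2000, 40
popularity = rng.pareto(2.0, N) + 1.0            # heavy-tailed node popularity
popularity /= popularity.max()
infected = np.zeros(N, dtype=bool)
infected[rng.choice(N, 5, replace=False)] = True # seed blogs that mention the event

new_per_step = []
for t in range(T):
    activeness = np.exp(-0.1 * t)                # interest in the event decays over time
    p_adopt = 1 - (1 - 0.02 * popularity * activeness) ** infected.sum()
    newly = (~infected) & (rng.random(N) < p_adopt)
    infected |= newly
    new_per_step.append(int(newly.sum()))
print(new_per_step)                              # bursty rise, then heavy-tailed decline
```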

Contact Information
  • Journal of Software
  • Sponsored by: Institute of Software, CAS, China
  • Postal Code: 100190
  • Phone: 010-62562563
  • Email: jos@iscas.ac.cn
  • Website: https://www.jos.org.cn
  • ISSN 1000-9825
  •           CN 11-2560/TP
  • Domestic price: 70 RMB
Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-4
Address: 4# South Fourth Street, Zhong Guan Cun, Beijing 100190, Postal Code: 100190
Phone: 010-62562563  Fax: 010-62562533  Email: jos@iscas.ac.cn
Technical Support: Beijing Qinyun Technology Development Co., Ltd.

Beijing Public Network Security No. 11040202500063