
Table of Contents

    20 October 2013, Volume 42 Issue 5
    Study on the CHZ Gravimeter Applied in Airborne Gravimetry Involving Error Spectrum Characteristics
    2013, 42(5):  633-639. 
    The method of frequency-domain error analysis for airborne gravimetry is introduced first. The ability of airborne scalar gravimetry to recover the gravity field is assessed using the power spectral densities of the gravity signal and the system noise, and the error sources of airborne gravimetry are discussed for different frequency bands. The characteristics of the CHZ gravimeter are then introduced, and its dynamic properties with different damping factors, chosen according to the frequency bands of airborne gravimetry, are analyzed using observed gravity anomalies and vertical acceleration data derived from airborne GPS. The results indicate that the CHZ gravimeter with a proper damping factor can meet the requirements of airborne gravimetry on fixed-wing aircraft.
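As a rough illustration of the frequency-domain assessment described above, the sketch below compares the power spectral densities of a synthetic low-frequency "gravity" signal and broadband noise using scipy.signal.welch; the signal amplitude, noise level and sampling rate are invented for illustration and are not the paper's data.

```python
# Sketch only: compare signal and noise power spectral densities per band,
# as one might do when assessing airborne gravimetry error in the frequency
# domain. All data here are synthetic.
import numpy as np
from scipy.signal import welch

np.random.seed(0)
fs = 1.0                                          # sampling rate [Hz], assumed
t = np.arange(0, 4096) / fs
gravity = 10.0 * np.sin(2 * np.pi * 0.005 * t)    # low-frequency "gravity" signal [mGal]
noise = 2.0 * np.random.randn(t.size)             # broadband sensor/GPS noise [mGal]

f, psd_sig = welch(gravity, fs=fs, nperseg=1024)
_, psd_noi = welch(noise, fs=fs, nperseg=1024)

# Frequencies where the signal PSD exceeds the noise PSD mark the band in
# which the gravity field can, in principle, be recovered.
recoverable = f[psd_sig > psd_noi]
print("recoverable band: %.4f - %.4f Hz" % (recoverable.min(), recoverable.max()))
```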
    Application of CRUST 2.0 Global Crustal Model in Isostatic Potential Model Construction
    2013, 42(5):  640-647. 
    The determination of the disturbing mass is one of the key techniques in constructing an Earth gravity field model (EGM) based on isostasy, and different crustal density distributions at different depths yield different crustal masses. Using a spherical-layer mass distribution and the Airy compensation model, the application of CRUST 2.0 to the construction of an isostatic potential model is studied. Computational models of the isostatic EGM are derived, and the contributions of CRUST 2.0 and of the compensation depth to the models are discussed. Numerical experiments show that 22.97 km is not the best compensation depth, that 40 km performs better than the other depths tested, and that the compensation depth contributes little to the ultra-high-degree part of the original model. The crustal model can improve the ideal model between degrees 361 and 1080. Finally, when the spherical-layer distribution and the Airy compensation model use the same compensation depth, it is hard to say which one is better.
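For readers unfamiliar with Airy compensation, the standard relation between topographic height and the thickness of the compensating crustal root (a textbook formula, not one specific to this paper) can be evaluated in a few lines; the density values below are typical assumed figures.

```python
# Minimal Airy-compensation example: root thickness beneath topography of
# height h, assuming typical crust/mantle densities (assumed values).
rho_crust = 2670.0    # kg/m^3
rho_mantle = 3270.0   # kg/m^3

def airy_root(h_m):
    """Thickness of the compensating root for topographic height h (metres)."""
    return h_m * rho_crust / (rho_mantle - rho_crust)

print(airy_root(1000.0))   # ~4450 m of root per 1000 m of topography
```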
    Analytical Method for VCE Using Equivalent Condition Closure
    2013, 42(5):  648-653. 
    The defects of existing methods for variance-covariance component estimation (VCE), including their computational efficiency and the statistical properties of the chi-square statistic, are analyzed. An equivalent condition adjustment model is established from the generalized adjustment model by means of a null-space operator, from which the equivalent condition closure (ECC) and the formula of the chi-square statistic are derived. An invertible variance-covariance component model is then constructed, and an analytical VCE method based on the ECC and this invertible model (the VCE-ECC method) is proposed. Simplified formulas of the VCE-ECC method are given for the four basic adjustment models. The results show no statistically significant difference between the variance-covariance components estimated by the proposed method and by existing VCE methods, while the proposed analytical method overcomes the inherent defects of the existing methods.
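The null-space step that converts an observation model into an equivalent condition model can be illustrated with a toy example; the design matrix and observations below are invented, and scipy.linalg.null_space is used to obtain an operator B with BᵀA = 0 so that w = Bᵀl gives the condition closures.

```python
# Sketch of the null-space step: from a design matrix A, build B with
# B^T A = 0 so that observation equations l = A x + e become condition
# equations B^T l = B^T e (the "closures"). Numbers are illustrative only.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, -1.0]])              # 4 observations, 2 parameters
B = null_space(A.T)                      # columns span the left null space of A

l = np.array([1.02, 2.01, 2.98, -0.97])  # simulated observations
w = B.T @ l                              # condition closures, ideally small
print(np.allclose(B.T @ A, 0), w)
```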
    A New Water Vapor Tomographic Numerical Quadrature Approach with a Ground-based GPS Network
    2013, 42(5):  654-660. 
    This paper presents a new approach to ground-based GPS tomography of wet refractivity that uses numerical quadrature and separate interpolation in the horizontal and vertical directions with new interpolation formulas, and that takes the Earth's curvature into account. Real GPS and meteorological data from the CORS network in Dallas, USA were used to validate the new approach. Vertical profiles of wet refractivity from the new and the classic approaches were compared with radiosonde results from the FWD station. The results show that the new method retrieves the wet refractivity profile with higher precision and reliability, as both the RMS and the error distribution are better than those of the classic method, and that the precision of the new, simpler vertical interpolation function is slightly higher than that of the more complex spline function.
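The quadrature at the core of such tomography evaluates the slant wet delay as SWD = 10⁻⁶ ∫ N_wet ds along the ray. The sketch below assumes a simple exponential refractivity profile and a straight-line ray; it does not reproduce the paper's interpolation formulas.

```python
# Sketch of the quadrature step in GPS water-vapour tomography: the slant
# wet delay is SWD = 1e-6 * integral of wet refractivity N_wet along the ray.
# Profile and ray geometry below are assumptions for illustration.
import numpy as np

def n_wet(height_m):
    """Assumed wet refractivity profile [N-units], exponential with height."""
    return 60.0 * np.exp(-height_m / 2000.0)

elevation = np.radians(30.0)                 # satellite elevation angle
s = np.linspace(0.0, 20000.0, 400)           # distance along the ray [m]
heights = s * np.sin(elevation)              # height of each ray point [m]

swd_m = 1e-6 * np.trapz(n_wet(heights), s)   # slant wet delay [m]
print("slant wet delay: %.3f m" % swd_m)
```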
    Research on Parallel Processing of GNSS Data in a Multi-core Environment
    2013, 42(5):  661-667. 
    Multi-core processors have become the mainstream of current computer architecture, and multi-core parallel computing and its applications have attracted increasing attention. Traditional GNSS data processing programs, however, are written for single-processor architectures. This paper studies parallel algorithms for the computationally intensive tasks of GNSS data processing involving multiple time periods or many stations in a multi-core environment. The computational hot spots of GNSS data processing are analyzed, and parallel methods for matrix multiplication and matrix decomposition based on block matrix theory are proposed. The computational efficiency in single-core and multi-core environments is then compared. The Parallel Extensions of the .NET 4.0 framework are adopted, and the validity of the multi-core concurrent design is demonstrated by examples. The experimental results show that multi-core parallel computing can give full play to the performance advantage of multi-core systems and greatly improve the efficiency of GNSS data processing and the resource utilization.
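The paper's implementation uses the Parallel Extensions of .NET 4.0; as a rough illustration of the same block-partitioning idea, here is a minimal Python sketch that multiplies row blocks of a matrix in separate processes. It is an analogue only, not the authors' code.

```python
# Python analogue (not the paper's .NET implementation) of block-parallel
# matrix multiplication: row blocks of A are multiplied by B in separate
# processes and the partial products are stacked.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def multiply_block(args):
    a_block, b = args
    return a_block @ b

def parallel_matmul(a, b, n_blocks=4):
    blocks = np.array_split(a, n_blocks, axis=0)
    with ProcessPoolExecutor() as pool:
        parts = list(pool.map(multiply_block, [(blk, b) for blk in blocks]))
    return np.vstack(parts)

if __name__ == "__main__":
    A = np.random.rand(800, 400)
    B = np.random.rand(400, 200)
    assert np.allclose(parallel_matmul(A, B), A @ B)
```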
    A CPU-GPU Co-processing Orthographic Rectification Approach for Optical Satellite Imagery
    2013, 42(5):  668-675. 
    A new CPU-GPU co-processing orthographic rectification approach for optical satellite imagery is proposed. First, a "hierarchical tiling" strategy is applied to form the basic algorithm flow; the performance is then further improved through configuration optimization and hierarchical memory access. The algorithm is applied to the orthographic rectification of ZY-3 nadir panchromatic imagery. Analysis of the experimental data shows that the processing time of the algorithm on a Tesla M2050 GPU is less than 5 seconds and that the highest speedup over the traditional serial algorithm exceeds 110 times. This demonstrates that the algorithm significantly improves orthographic rectification efficiency and fully meets the requirement of fast orthographic rectification for large volumes of optical satellite imagery.
    Research on the Distribution Characteristics of Polar Sea Ice Using the Altimetry Backscatter Coefficient
    2013, 42(5):  676-681. 
    Sea ice is a sensitive indicator of climate change. Knowledge of sea ice extent and surface humidity is not only of great importance for the polar regions, but also crucial for predicting future temperature trends and for building global climate models. The tracks of the ENVISAT RA-2 altimeter reach latitudes of 81.4° north and south, providing an active microwave means of monitoring sea ice. In this paper we develop a method based on the backscatter coefficient (sigma0) of the ENVISAT RA-2 radar altimeter to detect monthly changes of polar sea ice extent and surface properties. Considering the different scattering characteristics of sea water and ice surfaces, we show that a 13 dB threshold on sigma0 can separate the sea ice cover from the open sea. Except in summer, the sea ice boundaries from the altimeter correspond closely to those from the radiometer products released by NSIDC. Because satellite tracks do not cover the central Arctic, we estimated only the Antarctic sea ice extent; there the ENVISAT altimeter gives larger values in summer months than the NSIDC radiometer result, which is related to the altimeter's ability to sense dispersed thin ice. In the other seasons, with high sea ice concentration, the difference in sea ice extent is small, the mean winter difference being only 0.17 Mkm². We also examine the differences in sea ice properties between the two polar zones; the results show that the Arctic sea ice surface is rougher and drier in winter and wetter in summer than that of the Antarctic. This research shows that the ENVISAT radar altimeter can detect the seasonal evolution of sea ice cover and surface properties and is a useful tool for sea ice monitoring and polar exploration.
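The 13 dB decision rule amounts to a one-line threshold test on along-track sigma0 samples. In the sketch below the sample values are synthetic, and the side of the threshold taken as ice is an assumption, since the abstract states only the threshold value.

```python
# Sketch of the 13 dB decision rule: along-track backscatter samples
# (sigma0, in dB) are labelled sea ice or open water by threshold.
# Sample values are invented; the sense of the comparison is assumed.
import numpy as np

sigma0_db = np.array([10.5, 11.8, 13.2, 15.0, 17.4, 12.1, 14.6])
is_sea_ice = sigma0_db > 13.0   # assumed: ice returns are brighter than open water
print(is_sea_ice)
```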
    Omnidirectional Edge Detection Based on Two-dimensional Log Butterworth Filters in Frequency Domain
    2013, 42(5):  682-690. 
    An improved omnidirectional edge detection algorithm based on a two-dimensional Log Butterworth filter is proposed to satisfy the need for a nonlinear recognition mechanism in image processing. Edge detection with the proposed algorithm involves the Fast Fourier Transform (FFT) and the Inverse Fast Fourier Transform (IFFT) in the frequency domain. The two-dimensional Log Butterworth filter is obtained by introducing the Log function into the Butterworth filter. When the length and width of the image differ, the centre frequency lies on an ellipse whose ratio of major to minor axis equals the length-width ratio of the original image, so the filter can be expressed as a function of angle. To obtain the optimal parameter ranges, the parameters of the two-dimensional Log Butterworth filter are normalized, and the F-measure and PSNR (peak signal-to-noise ratio) are used to determine those ranges. The numbers of multiplications and additions and the computation time for edge detection on images of different sizes are used to compare the efficiency of the proposed algorithm with that of a traditional edge detector (the Canny detector). Finally, edge detection with the proposed algorithm on BSDS (The Berkeley Segmentation Dataset and Benchmark) images and on high spatial resolution remotely sensed images is evaluated and analyzed. The results show that the proposed algorithm detects edges from images efficiently.
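The FFT-filter-IFFT pipeline can be sketched as follows. A standard high-pass Butterworth transfer function is used here as a stand-in; the exact two-dimensional Log Butterworth form proposed in the paper is not reproduced.

```python
# Frequency-domain filtering pipeline for edge detection (FFT, multiply by a
# transfer function, IFFT). A standard high-pass Butterworth transfer function
# stands in for the paper's 2-D Log Butterworth filter.
import numpy as np

def highpass_butterworth_edges(image, d0=0.1, order=2):
    rows, cols = image.shape
    u = np.fft.fftfreq(rows)[:, None]
    v = np.fft.fftfreq(cols)[None, :]
    d = np.sqrt(u**2 + v**2)                                       # radial frequency
    h = 1.0 / (1.0 + (d0 / np.maximum(d, 1e-12))**(2 * order))     # high-pass Butterworth
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * h))

edges = highpass_butterworth_edges(np.random.rand(128, 128))
print(edges.shape)
```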
    A Dense Matching Algorithm for Multi-View Images Based on Integrated Multiple Matching Primitives
    2013, 42(5):  691-698. 
    Focusing on the occlusion problem in multi-view image matching, this paper presents a new dense matching algorithm for multi-view images that integrates feature points in image space and plane elements in object space, based on a space plane divided into a regular grid. First, feature points and plane elements are matched simultaneously under the constraints of the projection ranges of the feature points in the multi-view images and the positions of the plane elements, which provides a reliable initial DSM for the subsequent dense matching. Then, dense matching of the regularly distributed plane elements is carried out by combining the vertical line locus method with a height-based occlusion detection method, densifying the initial matching results. Finally, the validity of the proposed algorithm is verified by experiments using four UltraCamX (UCX) digital aerial images.
    Study of an Interferogram Denoising Method Based on EMD and an Adaptive Filter
    2013, 42(5):  707-714. 
    A new adaptive filter based on empirical mode decomposition (EMD), which exploits the different characteristics of signal and noise in different IMFs, is proposed for suppressing speckle in SAR interferograms. EMD is first used to decompose the signal, and the high-frequency IMFs are then processed separately by the adaptive filter; subtracting the speckle-related part from the original interferogram reduces the speckle noise. The denoising performance of the proposed method is investigated experimentally and compared with four other methods: Goldstein filtering, periodic pivoting median filtering, EMD decomposition and adaptive filtering. The results show that the EMD-adaptive filter is effective at reducing interferogram speckle noise, while preserving the fine details of the interferogram that are directly related to the ground topography and maintaining the phase value distribution.
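A minimal one-dimensional sketch of the EMD scheme follows, assuming the PyEMD package and substituting a median filter for the paper's adaptive filter; the interferogram case applies the same idea in two dimensions.

```python
# Sketch of the EMD-based scheme: decompose the signal into IMFs, smooth only
# the high-frequency IMFs, then reconstruct. Assumes the PyEMD package; a
# median filter stands in for the paper's adaptive filter.
import numpy as np
from scipy.signal import medfilt
from PyEMD import EMD

t = np.linspace(0, 1, 1024)
signal = np.cos(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)

imfs = EMD()(signal)                     # IMFs ordered from high to low frequency
n_high = 2                               # number of high-frequency IMFs to filter (assumed)
filtered = [medfilt(imf, 5) if i < n_high else imf for i, imf in enumerate(imfs)]
denoised = np.sum(filtered, axis=0)
print(denoised.shape)
```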
    Point Cloud Classification Using JointBoost Combined with Contextual Information for Feature Reduction
    2013, 42(5):  715-721. 
    The requirements for 3D scene classification and understanding have increased dramatically with the widespread use of airborne LiDAR. This paper focuses on complex power-line corridor scenes and presents an approach that automatically classifies point clouds into building, ground, vegetation, power-line and tower classes. A number of key point cloud features are introduced for classification with the JointBoost classifier. Because point cloud data are massive and the classification rate is slow, we propose a serialized point cloud classification method that uses the spatial contextual information between objects for feature reduction. The experiments show that the proposed method can be used effectively for point cloud classification in power-line corridor scenes.
    Soil Spatial Sampling Design Based on a Multi-objective Micro-neighborhood Particle Swarm Optimization Algorithm
    2013, 42(5):  722-728. 
    The design of a soil spatial sampling network is a complex optimization problem that must reconcile the conflicts between survey budget, sampling efficiency, sample size and the spatial pattern of soil variables. This study presents a soil spatial sampling model based on a multi-objective micro-neighborhood particle swarm optimization algorithm (MM-PSO). The model combines minimum mean kriging variance (MKV) and maximum entropy (ME) as the fitness function of the MM-PSO, and integrates the constraints of sampling barriers, maximum sample size, survey budget and sampling interval as the neighborhood operating rules of the particles, in order to improve sampling accuracy and efficiency and to determine the sample size and the spatial sampling pattern simultaneously. The method was applied to optimize the sampling network for soil organic matter in Hengshan County, north-west China. The results indicate that the MM-PSO has good convergence and stability, and obtains better sampling networks, with higher fitness values, than a single-objective approach and spatial simulated annealing.
    An Adaptive Speckle Reduction Method Based on Kernel Regression for SAR Images
    2013, 42(5):  729-737. 
    To reduce speckle noise in SAR image processing while preserving scattering targets and edges as much as possible, an adaptive speckle reduction method based on kernel regression is presented. By analyzing the magnitude distribution of SAR images, the image magnitude is chosen as the classification condition when building the model: the kernel function smooths heavily to reduce speckle in background regions with small magnitude, and protects targets in regions with large magnitude. To preserve edges, the steering kernel is then modified based on the scatter matrix, yielding the proposed kernel regression speckle reduction method for SAR images. The experimental results show that, by introducing magnitude information and the scatter matrix into the kernel function, the proposed method reduces speckle noise while preserving targets and edges.
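A simplified, magnitude-adaptive Gaussian kernel smoother in the spirit of the idea above (not the paper's steering-kernel formulation) is sketched below: the bandwidth shrinks where the local magnitude exceeds a threshold, so bright scatterers are smoothed less than the dark background.

```python
# Magnitude-adaptive kernel smoothing sketch: a Gaussian (Nadaraya-Watson)
# kernel whose bandwidth depends on the local magnitude. All parameters are
# assumptions for illustration.
import numpy as np

def adaptive_kernel_smooth(img, radius=3, h_bg=2.0, h_target=0.5, thresh=None):
    thresh = np.mean(img) if thresh is None else thresh
    out = np.empty_like(img)
    rows, cols = img.shape
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    dist2 = x**2 + y**2
    padded = np.pad(img, radius, mode="reflect")
    for i in range(rows):
        for j in range(cols):
            h = h_target if img[i, j] > thresh else h_bg   # bandwidth by magnitude
            w = np.exp(-dist2 / (2 * h**2))
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out

print(adaptive_kernel_smooth(np.abs(np.random.randn(64, 64))).shape)
```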
    Accuracy Analysis and Verification of ZY-3 Products
    2013, 42(5):  738-744. 
    The surveying satellite ZY-3 provides users with a series of products to meet different needs. This paper introduces the ZY-3 products and their processing methods, including Sensor Corrected (SC) products, Geocoded Ellipsoid Corrected (GEC) products and Enhanced Geocoded Ellipsoid Corrected (eGEC) products, all of which are provided with a Rational Function Model (RFM), and further illustrates the adjustment methodology for 4D products. In the experiments, two sets of ZY-3 data, covering the Dengfeng and Anping areas, are used to verify the orientation accuracy of the SC, GEC and eGEC products and the model accuracy of the Digital Surface Model (DSM) and Geocoded Terrain Corrected (GTC) products. The results show that the GEC products have the same accuracy as the SC products. The orientation accuracy of the eGEC products is relatively high and their adjustment results are the same as those of the SC products. The root mean square errors (RMSEs) in planimetry and height are both 2.5 m for the Dengfeng area, and 1.6 m and 1.5 m, respectively, for the Anping area, which is sufficient for 1:50 000 mapping.
    Color Matching of Polychromatic Contour Lines Considering the Land-use Thematic Information Based on the Munsell-HCV Color Harmony Rule
    2013, 42(5):  752-759. 
    This paper proposes a color matching method for polychromatic contour lines based on the Munsell color harmony theory to improve color harmony in thematic maps. The framework covers three aspects: category harmonization, direction harmonization and chromatism harmonization. First, a basic color for the contour lines is chosen according to conventional usage, and its corresponding harmonious color subdomain is constructed. Second, several color harmony algorithms between contour lines and land-use parcels are applied based on Munsell's color harmony directions, and a set of candidate colors for the contour lines is derived from their background parcels. Finally, the priority matching levels for the contour lines are determined by analyzing the distribution variation of parcels and contour lines, and the optimal polychromatic contour lines are balanced taking human perception of color difference into account. Both the comparison experiments and a questionnaire survey indicate that the polychromatic contour lines obtained by the harmonious color matching method provide a more flexible and delicate visual effect for land-use thematic maps.
    Algorithm for Constructing Network Voronoi Diagram Based on Flow Extension Ideas
    2013, 42(5):  760-766. 
    As an important geometrical construction capturing the distribution characteristics of facilities in geographical space, the Voronoi diagram can be constructed from different distance concepts. Since the service function and the interrelation of urban facilities operate along network path distance rather than conventional Euclidean distance, this study presents a raster extension algorithm for building Voronoi diagrams in network space, aiming at establishing a network Voronoi diagram model. First, the edges of the graph structure are subdivided into small linear elements, a procedure we call network rasterization. The algorithm then adopts the idea of water flow extension: streams spread along the network paths until they meet other streams or reach the end of an edge, with the event sources treated as the headwaters and the raster unit length as the expansion step. The algorithm can incorporate constraints of the network graph structure, such as one-way traffic and restricted connections at nodes. A large-scale experiment with real data, generating the service areas of POIs in a "Digital City", shows that the proposed algorithm is efficient.
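The flow-extension idea amounts to a multi-source shortest-path expansion over the subdivided network: each network unit is claimed by the facility whose front reaches it first. A minimal sketch on a toy graph, using a plain Dijkstra-style heap, is given below; constraints such as one-way edges would simply restrict the adjacency lists.

```python
# Sketch of the flow-extension idea: multi-source Dijkstra expansion over the
# (already subdivided) network graph; each node is assigned to the facility
# whose front reaches it first. The toy graph below is illustrative only.
import heapq

def network_voronoi(adjacency, sources):
    """adjacency: {node: [(neighbour, edge_length), ...]}; sources: facility nodes."""
    dist, owner = {}, {}
    heap = [(0.0, s, s) for s in sources]
    heapq.heapify(heap)
    while heap:
        d, node, src = heapq.heappop(heap)
        if node in dist:
            continue                      # already claimed by a closer facility
        dist[node], owner[node] = d, src
        for nbr, length in adjacency.get(node, []):
            if nbr not in dist:
                heapq.heappush(heap, (d + length, nbr, src))
    return owner

graph = {"a": [("b", 1)], "b": [("a", 1), ("c", 2)],
         "c": [("b", 2), ("d", 1)], "d": [("c", 1)]}
print(network_voronoi(graph, ["a", "d"]))   # {'a': 'a', 'd': 'd', 'b': 'a', 'c': 'd'}
```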
    Performance Evaluation of Line Simplification Algorithms Based on Hierarchical Information Content
    2013, 42(5):  767-773. 
    Simplification is a commonly used generalization operator, and a number of simplification algorithms are available, so evaluating their performance deserves attention. Existing representative evaluation indicators consider differences in the location and/or shape of spatial features; each such indicator captures only one aspect of the distortion caused by generalization, so it is difficult for them to reflect the performance of simplification algorithms comprehensively. To overcome this problem, this paper takes line features as an example and develops a new evaluation indicator, information content, at three levels (element, neighborhood and holistic) from the viewpoint of information transmission. Performance evaluation consists of computing and comparing the information content before and after line simplification: quantitative methods for computing the information content at the three levels are given, and the difference in information content before and after simplification is used to measure the performance of the simplification algorithms. The difference at the element level reflects an algorithm's ability to select key points; the difference at the neighborhood level reflects its ability to maintain bends; and the difference at the holistic level reflects its ability to maintain the overall trend of the line features. Finally, a river network dataset is used to evaluate four commonly used line simplification algorithms with the proposed information content indicator. The results show that the new indicator evaluates the performance of the four algorithms rationally, and a comparative test illustrates its advantages over existing indicators.
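As a generic illustration of comparing information content before and after simplification (not the paper's own measures), one can compute the Shannon entropy of a polyline's turning-angle distribution for the original and the simplified line and take the difference. The coordinates below are invented.

```python
# Illustrative sketch: information content of a polyline measured as the
# Shannon entropy of its turning-angle distribution, compared before and
# after simplification. This is a generic stand-in measure only.
import numpy as np

def turning_angle_entropy(points, bins=8):
    pts = np.asarray(points, dtype=float)
    vecs = np.diff(pts, axis=0)
    angles = np.diff(np.arctan2(vecs[:, 1], vecs[:, 0]))
    angles = (angles + np.pi) % (2 * np.pi) - np.pi      # wrap to [-pi, pi)
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

original = [(0, 0), (1, 0.4), (2, -0.2), (3, 0.5), (4, 0.0), (5, 0.3)]
simplified = [(0, 0), (2, -0.2), (5, 0.3)]
print(turning_angle_entropy(original) - turning_angle_entropy(simplified))
```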
    An Extraction Method of Catchment Basin Based on Cooperation of Contour Cluster and River Network in Map River System Generalization
    2013, 42(5):  774-781. 
    The extraction of river catchment basins is one of the fundamental problems in the automatic cartographic generalization of river networks, as the catchment basin area is the most commonly used index for quantifying the importance of a river in the network. Two methods are commonly used to extract catchment areas. One splits the regions between rivers medially without considering topographic information, so the extracted catchment basins often deviate from the actual situation. The other extracts rivers and watersheds from DEM data and then computes the catchment areas; however, the extracted rivers differ from the real river objects, so the catchment areas do not correspond completely to the real rivers. This paper proposes a catchment basin extraction algorithm based on the cooperation of contour clusters and river networks, exploiting their natural spatial relationship. A constrained Delaunay TIN (CD-TIN) is constructed to organize the topographic information of the contour clusters and river networks; the triangles are then classified and two watershed extraction methods are discussed separately. The watershed segments extracted from typical triangles with specific rules are subsequently connected to form the watershed network, from which the catchment basins are built. The effectiveness of the algorithm is confirmed by experiment, and the results accord with the terrain characteristics.
    Semantic Similarity Measurement Model between Fundamental Geographic Information Concepts Based on Ontological Property
    2013, 42(5):  782-789. 
    Semantic similarity plays an important role in knowledge sharing and data integration. In some geographical information applications, the similarity between geographical concepts is currently measured by the semantic distance between concepts within a domain ontology, usually taken directly from specified concept taxonomies. Although this makes the similarity between concepts easy and fast to define, changes in the concept taxonomy can produce different similarities for the same concept pair and thus lead to mistakes in the similarity measurements. This article proposes a semantic similarity measurement model based on the intrinsic features of the concepts, represented by enumerated attributes, for the needs of fundamental geographic applications. In this model, each fundamental geographic concept is modelled by a set of ontological properties representing its natural features. The similarity of each pair of concepts is then measured by integrating the similarity of each ontological property with its corresponding weight, the weights being defined by experts. One hundred sample concepts from fundamental geographic information were chosen to compute their similarities. The results show that the proposed approach performs well in measuring the similarity of fundamental geographic information concepts.
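The aggregation step can be illustrated with a toy example in which per-property similarities between two concepts are combined with expert-assigned weights; the property names, similarity values and weights below are invented for illustration.

```python
# Toy illustration of the aggregation step: per-property similarities between
# two concepts are combined with expert-assigned weights (all values invented).
property_similarity = {"geometry": 1.0, "function": 0.6, "material": 0.3}
weight = {"geometry": 0.5, "function": 0.3, "material": 0.2}

similarity = sum(weight[p] * property_similarity[p] for p in weight)
print(similarity)   # 0.5*1.0 + 0.3*0.6 + 0.2*0.3 = 0.74
```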