
Table of Contents

    25 February 2011, Volume 40 Issue 1
    Surveying and Mapping Express
    The Conception and Architecture of Live-Service for Geospatial Information
    2011, 40(1):  1-4. 
    Real-time collection of live imagery and location data, together with high-performance processing and rapid release of the data, has become a basic requirement of geospatial information services for fast emergency response. This paper focuses on the conception and architecture of Live-service for Geospatial Information (LGI). Its main technical route is based on real-time collection with integrated sensors on a VTUAV platform, high-speed processing with a vehicle-borne cluster system, and information release over wide-band wireless networks. The links and elements of LGI and the key technologies to be applied in the future are outlined with some typical application instances.
    Academic Papers
    The Perturbation Analysis of Nonlinear Ill-conditioned Solution
    2011, 40(1):  5-9. 
    In addition to the ill-conditioning of the nonlinear model itself, the perturbations introduced by truncation and by the orthogonalization process need to be considered. In this paper, a perturbation estimation inequality for the nonlinear ill-conditioned problem is derived from the definition of the generalized condition number. Based on this inequality, the impacts on the nonlinear ill-conditioned problem are analyzed; they stem primarily from the disturbance of the coefficient matrix of the linear approximation and from the additional truncation error, as well as from the orthogonal approximation. The analysis is then verified with two examples. The results show that when the model nonlinearity is strong, the disturbance of the coefficient matrix and the additional truncation error become very significant owing to the choice of different approximation values and to the orthogonal approximation. In this case, an iterative regularization method can be used to solve nonlinear ill-conditioned problems.
    Equivalent Residual Product Based Outlier Detection for Variance and Covariance Component Estimation
    2011, 40(1):  10-14. 
    The existing outlier detection methods in variance-covariance component estimation (VCE) are based on the residuals. However, the essential inputs of VCE are second-order values of the residuals, so it is more reasonable to carry out outlier detection on these second-order values directly. In this paper, starting from the fundamental VCE equations based on equivalent residuals with a standard normal distribution, we propose a new method to detect outliers in the VCE inputs, in which the chi-square (χ²) and normal product (Np) statistics are used to test the residual squares and the products of residuals, respectively, at a given confidence level. The results show that if a confidence probability α is used to detect outliers in the residual domain with a normal-distribution statistic, this is equivalent to testing the residual squares with the same confidence probability using the χ² statistic, but to testing the products of residuals with a confidence probability smaller than α. Therefore, the variance estimates obtained with residual-based or χ²-based outlier detection are equivalent, whereas better covariance estimates can be achieved with Np-based outlier detection than with residual-based detection.
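    As a minimal illustration of the χ² test on squared standardized residuals (a sketch only, not the authors' implementation; the normal product test for residual cross-products is omitted), one might write:

    import numpy as np
    from scipy.stats import chi2

    def detect_square_outliers(residuals, alpha=0.01):
        # Assumes the equivalent residuals are already standardized to N(0, 1),
        # so their squares follow a chi-square distribution with one degree of freedom.
        r = np.asarray(residuals, dtype=float)
        threshold = chi2.ppf(1.0 - alpha, df=1)
        return r ** 2 > threshold            # True marks a suspected outlier

    # Example: one gross error among standard normal residuals (index 10 should be flagged)
    rng = np.random.default_rng(0)
    r = rng.standard_normal(100)
    r[10] += 6.0
    print(np.flatnonzero(detect_square_outliers(r)))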
    Estimation and Prediction of Satellite Clock Error Using Adaptively Robust Sequential Adjustment with Classified Adaptive Factors Based on Windowing
    Yuanxi Yang
    2011, 40(1):  15-21. 
    The classical least-squares clock model has been applied extensively in navigation and positioning. However, it cannot achieve good results when blunders and clock jumps are present, and blunders and clock jumps occur so frequently in clock error series that they cannot be ignored. General pre-processing methods, such as conversion between phase and frequency, visual inspection of plots and blunder detection, cannot satisfy real-time clock error prediction. A new clock error processing method is therefore proposed, based on adaptively robust sequential adjustment with classified adaptive factors and windowing. The main ideas are as follows: first, the clock error series is divided into windows of a suitable size; second, blunders in every window are handled by robust estimation; third, clock jumps between windows are eliminated by adaptive least squares. In this way, both observation anomalies and state anomalies can be effectively controlled. In addition, because different clock parameters describe different clock characteristics, classified adaptive factors are proposed to reject outliers in the clock data. The analysis results indicate that, compared with adaptive sequential adjustment, the precision of the new method improves by 78.9% and 60.4% for estimating and predicting satellite clock error, respectively. Owing to the classified adaptive factors, the estimation and prediction precision of the new method also improves by about 4.3% and 29.2%, respectively, compared with adaptively robust sequential adjustment. Furthermore, the new method is applicable to other clock models such as the AR model and the grey model.
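    The window-wise robust handling of blunders could, for illustration only, be sketched as an iteratively re-weighted fit of a quadratic clock polynomial with an IGG-style weight function; the window size, thresholds and model order below are assumptions rather than the paper's settings:

    import numpy as np

    def robust_window_fit(t, x, k0=1.5, k1=3.0, iters=5):
        # Fit x(t) = a0 + a1*t + a2*t^2 by iteratively re-weighted least squares.
        A = np.column_stack([np.ones_like(t), t, t * t])
        w = np.ones_like(x)
        coef = np.zeros(3)
        for _ in range(iters):
            sw = np.sqrt(w)
            coef, *_ = np.linalg.lstsq(A * sw[:, None], x * sw, rcond=None)
            v = x - A @ coef
            s = 1.4826 * np.median(np.abs(v - np.median(v))) + 1e-12   # robust scale
            u = np.maximum(np.abs(v) / s, 1e-12)
            # IGG-III style equivalent weights: keep, down-weight, reject
            w = np.where(u <= k0, 1.0,
                np.where(u <= k1, (k0 / u) * ((k1 - u) / (k1 - k0)) ** 2, 0.0))
        return coef

    def fit_by_windows(t, x, win=120):
        # Process the clock series window by window, as in the first two steps above.
        return [robust_window_fit(t[i:i + win], x[i:i + win])
                for i in range(0, len(t) - win + 1, win)]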
    The Influence of Optimized Training Samples on the Elimination of Sounding Outliers in the LS-SVM Algorithm
    2011, 40(1):  22-27. 
    After verifying that the trend filter is a special case of the LS-SVM algorithm, sounding outliers are eliminated by means of the seafloor surface constructed with LS-SVM. In order to overcome the sparseness of the LS-SVM results while restraining the influence of sample outliers, a new method for optimizing the training samples based on the distance to the local sample centre is presented. Practical multi-beam data are chosen to verify the correctness and rationality of the new method. The example shows that, with the optimized training samples, a reasonable seafloor surface can be constructed by the LS-SVM algorithm, and the outliers in the multi-beam data can then be eliminated effectively.
    Equivalent Weight Robust Estimation Method Based On Median Parameter Estimates
    2011, 40(1):  28-32. 
    Robust estimation based on equivalent weights keeps the outstanding properties of least-squares (LS) adjustment when processing normal observations. Nevertheless, its robustness depends strongly on the initial values. If the LS estimates, which are sensitive to outliers, are used as initial values, the robustness of the equivalent-weight estimates is inevitably impaired. By comparison, least median (LM) estimates are much more robust than LS estimates, but they are computed from only part of the observations, so many usable observations are not exploited. In this paper, we put forward median parameter robust estimation and present an approximate method to estimate its breakdown point for finite observations. By combining the advantages of median parameter estimation and equivalent-weight robust estimation, we take the median parameter estimates as initial values and then carry out equivalent-weight robust estimation to compute the final results. Numerical experiments show that the combined median-parameter and equivalent-weight robust estimation is considerably better than either robust estimation alone.
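    A rough sketch of the idea, under simplifying assumptions (random minimal subsets stand in for the paper's median parameter estimation, and a Huber weight function stands in for its equivalent weights), might look like this:

    import numpy as np

    def median_parameter_init(A, y, n_subsets=200, seed=0):
        # Solve y = A x on random minimal subsets and take the component-wise median.
        rng = np.random.default_rng(seed)
        n, m = A.shape
        solutions = []
        for _ in range(n_subsets):
            idx = rng.choice(n, size=m, replace=False)
            try:
                solutions.append(np.linalg.solve(A[idx], y[idx]))
            except np.linalg.LinAlgError:
                continue                      # skip singular subsets
        return np.median(np.array(solutions), axis=0)

    def equivalent_weight_estimate(A, y, x0, k=1.345, iters=10):
        # Huber-type equivalent-weight iteration started from the median estimates x0.
        x = np.array(x0, dtype=float)
        for _ in range(iters):
            v = y - A @ x
            s = 1.4826 * np.median(np.abs(v)) + 1e-12
            u = np.maximum(np.abs(v) / s, 1e-12)
            w = np.where(u <= k, 1.0, k / u)          # equivalent weights
            sw = np.sqrt(w)
            x, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        return x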
    Maximum Likelihood Adjustment of the Univariate Unsymmetrical P-norm Distribution
    2011, 40(1):  33-36. 
    According to the fundamental properties of random errors, this paper derives a more general error distribution, the unsymmetrical p-norm distribution, and then discusses its maximum likelihood adjustment in detail. The degenerate, Laplace, normal and rectangular distributions are all special cases of the unsymmetrical p-norm distribution. A maximum likelihood adjustment method for the unsymmetrical p-norm distribution is proposed, and the fundamental equations of the adjustment are given. For each concrete set of measurement data, a suitable value of p can be selected so that the assumed error distribution is closer to the real one than the normal distribution is. The method is a further generalization of the maximum likelihood adjustment of the symmetric p-norm distribution.
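    For the symmetric special case, maximum likelihood under a p-norm error law reduces to minimizing the sum of |v|^p, which the following sketch solves by iteratively re-weighted least squares (p = 1 recovers the Laplace case and p = 2 the normal/LS case); the asymmetric distribution of the paper would use different shape parameters on each side of zero:

    import numpy as np

    def pnorm_adjustment(A, y, p=1.5, iters=30, eps=1e-8):
        # Estimate x minimizing sum |y - A x|^p by iteratively re-weighted least squares.
        x, *_ = np.linalg.lstsq(A, y, rcond=None)          # ordinary LS start (p = 2)
        for _ in range(iters):
            v = y - A @ x
            w = np.clip(np.abs(v), eps, None) ** (p - 2)   # IRLS weights for the p-norm
            sw = np.sqrt(w)
            x, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
        return x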
    Spatial Distribution of Antarctic Ionosphere TEC based on GPS
    2011, 40(1):  37-40. 
    Using GPS observations from the permanent GPS station at the Chinese Zhongshan Station and from IGS (International GNSS Service) stations in the Antarctic region, high-accuracy ionospheric TEC values in the polar area are calculated for 2000-2006. TEC values above GPS stations located in the polar cap, at the polar cap edge, and inside and outside the auroral zone are comparatively analyzed. The results show that the daily maximum TEC in the auroral zone is smaller than that outside it but larger than that in the polar cap, whereas this difference is not obvious in the daily minimum TEC. This spatial distribution is related to the different solar incidence angles at different locations and to the special configuration of the geomagnetic field in high-latitude regions.
    Present-day Crustal Thickening Beneath the Qinghai-Tibetan Plateau Using Geodetic Data at Lhasa Station
    2011, 40(1):  41-44. 
    Geological and tectonic results reveal that the crust of the Tibetan Plateau is still uplifting and thickening today owing to the collision between the Indian and Eurasian plates; this remains a complex and much-studied problem for geoscientists worldwide. However, there is almost no quantitative evidence on whether the plateau is still uplifting or thickening. In this paper we present geodetic evidence, from high-accuracy absolute gravity data and authoritative GPS measurement results, for mass loss beneath the Tibetan Plateau and for crustal thickening at a rate of 3.9±0.8 cm/a, and we finally give a simple geodynamic model illustrating the crustal thickening of the Tibetan Plateau.
    Terrain Estimation Algorithm Based on Kalman Filter and its Simulation Research
    2011, 40(1):  45-51. 
    In this paper, a terrain estimation algorithm based on the Kalman filter is proposed. Using the gravity gradient anomalies measured by a gravity gradiometer, the algorithm improves the accuracy and resolution of a low-resolution prior terrain model. By introducing slope theory and the rectangular prism method, the system error model and the measurement equation of the Kalman filter are first constructed. A simulation study is then carried out to investigate the performance of the algorithm, analyzing the influence of the terrain mean noise, random noise and gravity gradiometer precision on the estimation accuracy. Finally, an area estimation simulation is also investigated. The results show that both the precision and the resolution of the estimated terrain are improved, which confirms the correctness of the algorithm.
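    The estimation core can be illustrated with a generic Kalman predict/update step; the terrain state model F, Q and the gravity-gradient measurement matrix H below are placeholders, not the paper's formulation:

    import numpy as np

    def kalman_step(x, P, z, F, Q, H, R):
        # One predict/update cycle: state x, covariance P, measurement z.
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new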
    Ground Subsidence Monitoring in a Mining Area Using the DInSAR SBAS Algorithm
    2011, 40(1):  52-58. 
    It is difficult to obtain robust deformation results with the conventional differential InSAR method when not enough SAR images are used jointly in the interferometry. In this paper, we use the Small Baseline Subset (SBAS) DInSAR algorithm to monitor the temporal evolution of surface deformation in a mining area. A temporal deformation sequence of the Lengshuijiang mining area, Hunan Province, is retrieved from the DInSAR phase using a virtual observation method and the SBAS algorithm. The retrieved deformation sequence reveals the development and evolution of the subsidence bowl of the mining area. The result demonstrates that combining the SBAS algorithm with the virtual observation method can improve the accuracy of deformation monitoring with InSAR.
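    A much-simplified sketch of an SBAS-style pixel-wise inversion (ignoring orbital and atmospheric corrections, and not the authors' virtual-observation formulation): unwrapped interferometric phases, converted to displacements, are related to mean velocities between acquisition dates through a design matrix B, and the minimum-norm solution of this possibly rank-deficient system is obtained via the pseudo-inverse (SVD):

    import numpy as np

    def sbas_velocities(B, dphi):
        # B: (n_interferograms, n_intervals) matrix of time-interval durations;
        # dphi: unwrapped differential phases (as displacements) for one pixel.
        return np.linalg.pinv(B) @ dphi          # SVD-based minimum-norm velocities

    def cumulative_deformation(v, dt):
        # Integrate the interval velocities v over the interval lengths dt.
        return np.concatenate([[0.0], np.cumsum(v * dt)])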
    A Distance Metric for Discrete Triangular Grid
    2011, 40(1):  59-65. 
    A 12-connectivity distance metric for the triangular grid is provided in this paper. Based on this triangular distance metric, a corresponding Triangular Three-Dimensional Coordinate System (TTDCS) is constructed. Coordinate transformations among several coordinate systems are given, and the distance equations in both the TTDCS and the Scanning Coordinate System (SCS) are presented. Finally, three distance properties are deduced.
    High-accuracy Extraction of Line Features from Object Contours
    2011, 40(1):  66-70. 
    A high-accuracy algorithm for extracting line features of object contours is proposed in this paper; it combines exact positioning of single points with line-segment approximation under a minimum-distance error criterion. First, the object contour is approximated by linear features. Then adjacent line segments are searched and their intersection points are computed. Finally, the contour cut points are positioned exactly, providing accurate line-segment data for contour-based 3D measurement and reconstruction. In contrast to other approaches, the proposed algorithm does not need a good initial contour and achieves better extraction results. The accuracy of the algorithm is analyzed on real image data. The algorithm has been applied to 3D inspection of diamonds and to line feature extraction of nuts and industrial sheet-metal parts. The experimental results demonstrate that the algorithm is feasible, correct, stable and accurate, and that it can also be extended to other application areas.
    Balloon Snake with Adjustable Expansion Coefficient in Road Contour Extraction
    2011, 40(1):  71-77. 
    Ziplock snakes, ribbon snakes and quadratic snakes are three classical snake methods applied to road extraction. The quadratic snake is the most complicated and the least accurate of the three. The ziplock and ribbon snakes give better results, but they need initial road information provided by other algorithms, and once part of the initial road network is missing it cannot be recovered by either of them. In this paper, a balloon snake with an adjustable expansion coefficient is adopted. Its advantages are: (1) with the expansion force introduced by the balloon snake, missing road parts can be recovered; (2) tuning the expansion coefficient according to the image gradient and contour curvature solves the problem of the output's sensitivity to this coefficient. The experimental results demonstrate its effectiveness and advantages.
    The Geometric Calibration of Airborne Three-Line-Scanner ADS40
    2011, 40(1):  78-83. 
    Geometric calibration of the sensor is a key part of photogrammetry. The ADS40 is a POS-supported airborne three-line scanner. This paper introduces the POS-supported sensor geometry peculiar to the ADS40 and its self-calibration with additional parameters based on the Brown model. A set of calibration procedures is designed for the ADS40, covering how to determine whether geometric calibration is necessary, how to carry out the calibration, and how to verify the validity of the calibration result. Experimental results show that self-calibration of the ADS40 effectively improves data accuracy and reliability.
    Intelligent Seamline Detection in Large-scale Urban Orthoimage Mosaicking
    2011, 40(1):  84-89. 
    Traditional algorithms avoid areas with colour differences by detecting seamlines in the overlapping region of the images. However, such methods do not perform well in areas with a high density of urban buildings. In this paper, ground and non-ground areas are distinguished accurately by processing a high-precision DSM. Most traditional seamline detection algorithms rely on iterative calculations and have high complexity. We propose a new algorithm, the greedy snake algorithm, which involves only three parameters: the search step, the direction rotation interval and the region width. Experiments on seamline detection for urban orthophotos show that the resulting seamlines avoid buildings well and that the greedy snake algorithm can be applied effectively to optimal path detection.
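    A toy illustration of a greedy, non-iterative path search over a cost image, loosely following the three parameters named in the abstract (search step, direction rotation interval, region width); it is an assumption-laden stand-in, not the paper's greedy snake:

    import numpy as np

    def greedy_seamline(cost, start, end, step=5, rot_interval=np.pi / 8, width=3):
        # Walk from start towards end over a per-pixel cost image; at each move try a
        # fan of directions around the bearing to the end point and keep the direction
        # whose local strip (of the given half-width) has the lowest mean cost.
        h, w = cost.shape
        p = np.array(start, dtype=float)        # (x, y) pixel coordinates
        target = np.array(end, dtype=float)
        path = [p.copy()]
        max_steps = int(np.linalg.norm(target - p) / step) * 4 + 1
        for _ in range(max_steps):
            if np.linalg.norm(target - p) <= step:
                break
            bearing = np.arctan2(target[1] - p[1], target[0] - p[0])
            best, best_cost = None, np.inf
            for ang in np.arange(bearing - np.pi / 4, bearing + np.pi / 4 + 1e-9, rot_interval):
                q = p + step * np.array([np.cos(ang), np.sin(ang)])
                c, r = int(round(q[0])), int(round(q[1]))
                if 0 <= r < h and 0 <= c < w:
                    strip = cost[max(r - width, 0):r + width + 1,
                                 max(c - width, 0):c + width + 1]
                    if strip.mean() < best_cost:
                        best, best_cost = q, strip.mean()
            if best is None:
                break
            p = best
            path.append(p.copy())
        path.append(target.copy())
        return np.array(path)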
    Vector Based Sparse Depth Map Algorithm
    2011, 40(1):  90-95. 
    In 3D spatial data capture based on image sequences, one would like to extract 3D data for every pixel, but data extracted this way must be greatly simplified before they can be used for 3D modelling in a 3D Geographical Information System (3DGIS). On the other hand, if the images contain areas of sparse texture, it is difficult to avoid mismatches. This paper presents a vector-based matching algorithm. The algorithm treats an image as a continuous 3D surface and matches feature areas by computing the normal vectors on this surface and matching the corresponding vectors. The sum of the cosines of corresponding vectors and the feature description value of the corresponding areas in two match windows are adopted as the cost factors. The variation of the vectors reflects the image texture, so it can be used to evaluate image feature areas and to filter out sparse-texture areas, which avoids mismatches and saves computation time. The experiments in the last part of the paper indicate that the proposed algorithm is effective for producing sparse depth maps and can satisfy the needs of 3D modelling in 3DGIS.
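    The matching cost hinted at in the abstract can be illustrated as follows (a sketch under simplifying assumptions; window handling and feature filtering are omitted): treat the image as a surface z = I(x, y), take per-pixel surface normals from the image gradients, and use the summed cosine between corresponding normals of two windows as one cost term:

    import numpy as np

    def surface_normals(img):
        # Unit normals of the surface z = I(x, y), computed from image gradients.
        gy, gx = np.gradient(img.astype(float))
        n = np.stack([-gx, -gy, np.ones_like(gx)], axis=-1)
        return n / np.linalg.norm(n, axis=-1, keepdims=True)

    def cosine_window_cost(normals_a, normals_b):
        # Sum of cosines of corresponding normal vectors in two match windows
        # (higher means more similar local surface shape).
        return np.sum(np.sum(normals_a * normals_b, axis=-1))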
    Spatial Simulation of Urban Heat Island Intensity Based on Support Vector Machine Technique: A Case Study in Beijing
    2011, 40(1):  96-103. 
    Surface fitting of urban heat island (UHI) intensity offers a promising way to analyze the spatial pattern, morphology and evolution of the UHI in depth. A surface-fitting algorithm based on the support vector machine (SVM) is introduced. Using MODIS land surface temperature (LST) products, and after eliminating data covered by clouds, 757 images of the Beijing metropolitan area from 2006 to 2008 (274 daytime and 483 nighttime) were fitted one by one with the SVM. Sensitivity analysis and accuracy assessment both indicate that the algorithm is highly accurate and suitable for depicting the spatial pattern of the UHI; it also makes mapping of urban surface temperature possible. The application results show that during daytime the UHI intensity in Beijing is weakly and approximately linearly positively correlated with the rural background temperature, while at night the opposite holds. In addition, on an annual basis the UHI intensities during daytime and nighttime both follow an approximately periodic sinusoidal variation, but the annual amplitude at night is far smaller than during the day, because different driving factors dominate the UHI patterns in different seasons and illumination conditions. The SVM fitting of UHI intensity presented in this paper concentrates on the overall spatial features of the UHI while ignoring the noise caused by random factors and the details of weak surface temperature changes; it is therefore a powerful tool for investigating the spatial pattern of temperature distribution in urban thermal environment analysis.
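    A sketch of SVM-based surface fitting with scikit-learn's SVR, assuming the cloud-free LST samples are fitted as temperature = f(x, y); the kernel and parameters are illustrative, not the values used in the paper:

    import numpy as np
    from sklearn.svm import SVR

    def fit_lst_surface(xy, lst, C=10.0, gamma=0.5, epsilon=0.1):
        # xy: (n, 2) pixel coordinates; lst: (n,) cloud-free LST samples.
        model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=epsilon)
        model.fit(xy, lst)
        return model            # model.predict(grid_xy) yields the fitted surface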
    Boolean Operations on Polygons by Using Trapezoidal Decomposition
    2011, 40(1):  104-110. 
    Boolean operations on planar polygons, including Union, Erase, Identity and Intersect, are fundamental GIS operations for extracting new information that helps solve geospatial application problems. In this paper, a new algorithm for Boolean operations based on trapezoidal decomposition is presented. The decomposition of polygons into trapezoids has been studied in computer graphics, since a simple shape such as a trapezoid is easier to process than a general polygon, but so far it has seldom been used in GIS applications. In the proposed method, the polygons involved are decomposed into two sets of trapezoids by a sweep line, so that Boolean operations on polygons become Boolean operations on the decomposed trapezoids. Because the trapezoids are organized and stored by row, the Boolean operations between them are confined within one row, which improves computational efficiency considerably. Once the resulting set of trapezoids has been obtained, the resulting polygons are reconstructed by boundary tracing. The proposed method avoids the complex computation of spatial relationships between polygon edges found in most vector algorithms for Boolean operations, which makes the whole procedure more efficient and easier to understand. In addition, by adopting multi-attribute condition extraction, the method realizes various Boolean operations in a unified way, without the need to design a separate method for each operation.
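    As a toy illustration of the per-row idea, under the strong simplifying assumption that two trapezoids occupy the same sweep band and their side edges do not cross inside it (as holds after splitting at edge intersections), their intersection reduces to taking max/min of the x-extents at the band's bottom and top:

    def intersect_band_trapezoids(a, b):
        # a, b: (y0, y1, xl_bottom, xr_bottom, xl_top, xr_top) in the same sweep band.
        y0, y1 = a[0], a[1]
        xl0, xr0 = max(a[2], b[2]), min(a[3], b[3])
        xl1, xr1 = max(a[4], b[4]), min(a[5], b[5])
        if xr0 <= xl0 or xr1 <= xl1:
            # Empty or degenerate at one end; a full algorithm would split the band
            # at the crossing point instead of discarding the pair.
            return None
        return (y0, y1, xl0, xr0, xl1, xr1)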
    Research on GPS based Track Map Data Generation Method for Train Control
    2011, 40(1):  111-117. 
    GNSS techniques, represented by GPS, can exploit their high positioning precision, safety and integrity to improve the performance of train positioning, which is crucial for meeting the safety, efficiency and economy requirements of train control systems. A high-precision track map is the foundation of GNSS-based train positioning, and GPS-based track map generation is one of the most important parts of intelligent train control. To meet the requirements of train positioning and control at different levels, the track map data for train control consist of private data, track data and terrain environment data, and the track data are divided into a feature layer and an interpolation layer at different scales, which completes the structure of the track map. For the huge volume of GPS measurements, multi-point weighted distance discrimination and Kalman estimation are applied to reject incorrect data. With an improved Douglas-Peucker method, in which the original data are segmented according to the track curve features and the error limit is adjusted dynamically, feature points are selected to describe the real track at a low scale. Based on the feature point set, interpolation points are generated by inverse calculation of a cubic B-spline curve and equidistant interpolation at a given distance resolution, which refines the track description at a high scale. With the addition of mileage information, the generation of the track map data is complete. Calculation results with measurement data from the Qinghai-Tibet railway show that the proposed method can describe and approximate the real track efficiently at different scales from the original GPS measurements, meets the various requirements of train positioning and control, is computationally simple, is easy to implement in engineering practice, and has high practical value.
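    The feature-point selection step builds on the classical Douglas-Peucker simplification, sketched below; the paper's improvements (segmentation by track curve features and a dynamically adjusted error limit) are not reproduced here:

    import numpy as np

    def douglas_peucker(points, tol):
        # points: (n, 2) projected track coordinates; tol: maximum allowed offset.
        points = np.asarray(points, dtype=float)
        if len(points) < 3:
            return points
        p0, p1 = points[0], points[-1]
        dx, dy = p1 - p0
        # Perpendicular distance of every point from the chord p0-p1
        d = np.abs(dx * (points[:, 1] - p0[1]) - dy * (points[:, 0] - p0[0]))
        d = d / (np.hypot(dx, dy) + 1e-12)
        i = int(np.argmax(d))
        if d[i] > tol:
            left = douglas_peucker(points[:i + 1], tol)
            right = douglas_peucker(points[i:], tol)
            return np.vstack([left[:-1], right])
        return np.vstack([p0, p1])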
    Study on Construction and Matching Algorithm of User Model in Adaptive Cartographic Visualization System
    2011, 40(1):  118-124. 
    To improve the accuracy of user model construction and matching in an adaptive cartographic visualization system (ACViS), this paper puts forward a quaternary interaction user model and a probability-based similarity matching algorithm. The quaternary interaction user model, which fully considers the human-machine interaction characteristics of the ACViS, reflects the characteristics of ACViS users more comprehensively. Building on commonly used similarity calculation methods, a probability-based similarity matching algorithm is proposed. The paper introduces the basic idea and procedure of the algorithm in detail and gives an example for validation. Tests show that the matching algorithm is easy to implement and matches more effectively, because it takes the actual circumstances of the ACViS into account and improves the accuracy of model matching, thus providing a new approach to user model matching.
    Recognition of Structures of Typical Road Junctions Based on Directed Attributed Relational Graph
    2011, 40(1):  125-131. 
    Automatically deriving multiple representations of a road network at different levels of detail is desirable for various geospatial applications. This paper focuses on deriving a simplified representation of a complex road junction from its detailed representation. It is based on the observation that a road junction is a designed functional structure consisting of functional elements: each type of element usually has a shape pattern, while the composition of the elements has a structural pattern. A road junction can therefore be represented by a structural description and recognized by structural pattern recognition. The structural patterns of road junctions are represented as directed attributed relational graphs (DARG) in this study, and the collection of common road junction patterns constitutes a set of graph templates to be matched against. To simplify a road junction representation, the road network is first converted into an attributed graph; junction patterns are then searched for in the resulting attributed relational graph of the road network, which is a subgraph matching process. Ullman's algorithm for subgraph matching is adopted in this study. Once a junction is recognized, it can be simplified according to a predefined method. Experiments carried out to evaluate the proposed technique show that it is quite effective in describing and recognizing road junctions.
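    A toy illustration of the subgraph matching step, using a plain backtracking search rather than Ullman's refinement-based algorithm and undirected adjacency for brevity; the dictionary-based graph format is an assumption:

    def match_subgraph(template, target, t_nodes=None, mapping=None):
        # template/target: {node: {"label": str, "adj": set_of_nodes}};
        # returns one node mapping template -> target, or None if no match exists.
        t_nodes = list(template) if t_nodes is None else t_nodes
        mapping = {} if mapping is None else mapping
        if not t_nodes:
            return mapping
        u, rest = t_nodes[0], t_nodes[1:]
        for v in target:
            if v in mapping.values() or template[u]["label"] != target[v]["label"]:
                continue
            # every already-mapped template neighbour of u must be adjacent to v
            if all(mapping[w] in target[v]["adj"]
                   for w in template[u]["adj"] if w in mapping):
                result = match_subgraph(template, target, rest, {**mapping, u: v})
                if result is not None:
                    return result
        return None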
    Doctoral Dissertation Abstracts
    High Quality True Orthoimage Generation with LiDAR and RS Image
    2011, 40(1):  132-132. 
    The workflow and key issues of high-quality true orthophoto generation with LiDAR data and remote sensing images are derived from a review of the domestic and international literature. The dissertation then develops methodologies for these key issues, and their feasibility and validity are verified by experiments on real data.
    Classification and Feature Extraction of Airborne LIDAR Data Fused with Aerial Image
    2011, 40(1):  134-134. 