
Table of Contents

    20 April 2018, Volume 47 Issue 4
    The Determination of an Ultra-high Gravity Field Model SGG-UGM-1 by Combining EGM2008 Gravity Anomaly and GOCE Observation Data
    LIANG Wei, XU Xinyu, LI Jiancheng, ZHU Guangbin
    2018, 47(4):  425-434.  doi:10.11947/j.AGCS.2018.20170269
    The theory and methods for determining an ultra-high-degree gravity field model by combining satellite observation data and gravity anomaly data are studied, and an ultra-high-degree model named SGG-UGM-1 is computed using EGM2008-derived gravity anomalies and GOCE observation data. The block-diagonal least squares (BDLS) method for quickly estimating an ultra-high-degree gravity field model is investigated, and the corresponding software module is validated by numerical experiments. The OpenMP technique is introduced into the BDLS program, which improves computing efficiency dramatically. An ultra-high-degree gravity model, SGG-UGM-1, complete to degree and order 2159, is derived using the proposed calculation strategies. The fully occupied normal equation system up to degree and order 220 formed from GOCE satellite data and the block-diagonal normal equation system up to degree and order 2159 formed from EGM2008 gravity anomaly data are used for the combination. A comparison of SGG-UGM-1 with the models EGM2008, EIGEN-6C2, EIGEN-6C4, and GOSG-EGM in the frequency domain shows that SGG-UGM-1 is close to the reference models and that its coefficients below degree 220 are more accurate than those of EGM2008. The models are also validated with GPS-leveling data in China and America and with airborne gravity data in the Maowusu survey area. The results show that in China the accuracy of the SGG-UGM-1-derived geoid lies between EIGEN-6C2 and EIGEN-6C4 and is better than GOSG-EGM and EGM2008, while in America all models perform almost the same. In the Maowusu area, the SGG-UGM-1-derived gravity disturbances are at almost the same accuracy level as EGM2008 and EIGEN-6C4 and better than GOSG-EGM and EIGEN-6C2.
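The block-diagonal least squares idea behind the abstract above can be illustrated with a minimal sketch (toy numbers invented here; the paper's normal equations are vastly larger): a block-diagonal normal matrix decomposes into small independent solves, and that independence is exactly what an OpenMP parallelization of the BDLS program exploits.

```python
import numpy as np

def solve_block_diagonal(blocks, rhs_parts):
    """Solve N x = b where N is block-diagonal.

    blocks    : list of (k_i, k_i) normal-matrix blocks
    rhs_parts : list of matching right-hand-side vectors
    Each block couples only its own coefficients, so the solves are
    independent and could run in parallel.
    """
    return np.concatenate(
        [np.linalg.solve(N, b) for N, b in zip(blocks, rhs_parts)]
    )

# Tiny illustration with a 2x2 block and a 1x1 block
N1 = np.array([[4.0, 1.0], [1.0, 3.0]])
N2 = np.array([[2.0]])
b1 = np.array([1.0, 2.0])
b2 = np.array([4.0])
x = solve_block_diagonal([N1, N2], [b1, b2])
```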
    Astronomical Orientation Method Based on Lunar Observations Utilizing Super Wide Field of View
    PU Junyu, ZHENG Yong, LI Chonghui, ZHAN Yinhu, CHEN Shaojie
    2018, 47(4):  435-445.  doi:10.11947/j.AGCS.2018.20170306
    In this paper, astronomical orientation is achieved by observing the moon with a camera with a super wide field of view, and the formulae are deduced in detail. An experiment based on real observations verified the stability of the method. In this experiment, after 15 minutes of tracking shots, the internal precision was better than ±7.5" and the external precision reached approximately ±20". This camera-based method for astronomical orientation can replace the traditional mode (aiming by human eye through a theodolite), thus lowering the requirements on the operator's skill to some extent. Furthermore, a camera with a super wide field of view can continuously track and image the moon without complicated servo control devices. Considering that gravity similarly exists on the moon and that the earth shows phase changes when observed from the moon, once self-leveling technology is developed, this method can be extended to orientation for a lunar rover by imaging the earth.
    One-dimensional Maximum Entropy Image Segmentation Algorithm Based on the Small Field of View of Measuring Robot Star Map
    SHI Chunlin, ZHANG Chao, CHEN Changyuan, DU Lan, YE Kai, HAN Zhong
    2018, 47(4):  446-454.  doi:10.11947/j.AGCS.2018.20170202
    As one of the fundamental problems in star-map processing, image segmentation plays a significant part in ensuring precise field astronomical surveys. Image binarization is the key procedure in image segmentation, but it is extremely difficult to extract star targets from a complex sky background using conventional threshold segmentation algorithms. Considering that star maps from the Leica video measuring robot TS50i show features such as a small field of view, a single star point, weak targets, and a single histogram peak, the one-dimensional maximum entropy method is proposed to segment the star maps. The proposed algorithm is verified by comparison with conventional threshold segmentation algorithms. The results indicate that the one-dimensional maximum entropy algorithm achieves satisfactory binarization while adequately preserving the image information. Simulation experiments using real star maps show that the extraction method based on this algorithm is accurate and reliable, with an accuracy an order of magnitude better than the requirements of first-class field astronomical surveys; hence it can satisfy the needs of precise field astronomical surveys.
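The one-dimensional maximum entropy criterion named above is a standard histogram method (Kapur's formulation); a minimal sketch follows, with an invented toy histogram standing in for a real star map. The threshold is the grey level that maximizes the summed entropies of the background and foreground distributions.

```python
import numpy as np

def max_entropy_threshold(hist):
    """Return the grey level t that maximizes the sum of the
    background and foreground Shannon entropies (Kapur's method)."""
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1          # class-conditional pmfs
        h = (-np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
             - np.sum(p1[p1 > 0] * np.log(p1[p1 > 0])))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# synthetic bimodal histogram: dark sky background plus a faint star cluster
hist = np.zeros(256)
hist[10:20] = 100.0     # sky background grey levels
hist[200:210] = 5.0     # faint star pixels
t = max_entropy_threshold(hist)   # lands between the two clusters
```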
    Derivation and Analysis of Singular-free Lagrangian/Gaussian Equations of Planetary Motion
    JIANG Chunhua, XU Tianhe, QIAO Jing, DU Yujun, WANG Qing, XU Guochang
    2018, 47(4):  455-464.  doi:10.11947/j.AGCS.2018.20170082
    Aiming at the singularity problem in satellite orbit theory, singularity-free Lagrangian/Gaussian equations of motion are analyzed. Considering the original physical meaning of the Lagrangian and Gaussian equations of motion, new singularity-free Lagrangian/Gaussian disturbed equations of motion are proposed and then discussed in three cases: the circular orbit, the equatorial orbit, and the circular equatorial orbit. In addition, the continuity of these equations is explored. The proposed equations eliminate the zero factor, and in this way the singularity problem in orbital mechanics is solved.
    Characteristics of GLONASS Inter-frequency Code Bias and Its Application on Wide-lane Ambiguity Resolution
    XU Longwei, LIU Hui, SHU Bao, ZHENG Fu, WEN Jingren
    2018, 47(4):  465-472.  doi:10.11947/j.AGCS.2018.20170439
    GLONASS inter-frequency code biases (IFCBs) vary with receiver manufacturer, firmware version, and antenna type. IFCBs can hardly be corrected or modeled precisely, so the Hatch-Melbourne-Wübbena (HMW) combination observation contains a systematic bias and cannot be applied directly to GLONASS wide-lane ambiguity resolution. Utilizing the residuals of GLONASS HMW combination observations, we propose an algorithm to estimate the IFCB of different sites (DS-IFCB). The experimental results show that the DS-IFCB is stable over the long term and that the DS-IFCBs of some homogeneous baselines (composed of the same type of devices, i.e. receiver type, firmware version, and antenna) are larger than 0.5 m. To resolve wide-lane ambiguities in real time, DS-IFCBs estimated from previous observations are used as priors to cancel the IFCBs in current observations. After the DS-IFCB correction, both the success rate and the correct rate of GLONASS wide-lane ambiguity resolution are improved, regardless of whether the baselines are equipped with homogeneous devices. The correct rates of all baselines are higher than 98%.
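The HMW combination referred to above is textbook-standard; a minimal sketch (synthetic numbers, not the paper's data) shows how it yields the wide-lane ambiguity from dual-frequency phase and code ranges, and why an unmodeled code bias in P1/P2 shifts the estimate.

```python
C = 299_792_458.0  # speed of light, m/s

def hmw_widelane(L1, L2, P1, P2, f1, f2):
    """Wide-lane ambiguity (cycles) from the Hatch-Melbourne-Wübbena
    combination of phase ranges L1, L2 and code ranges P1, P2 (metres)."""
    lam_wl = C / (f1 - f2)                    # wide-lane wavelength
    phase = (f1 * L1 - f2 * L2) / (f1 - f2)   # wide-lane phase combination, m
    code = (f1 * P1 + f2 * P2) / (f1 + f2)    # narrow-lane code combination, m
    return (phase - code) / lam_wl

# synthetic example: range rho with ambiguities N1 = 5, N2 = 3
f1, f2 = 1602.0e6, 1246.0e6                   # GLONASS base frequencies, Hz
rho = 2.0e7
L1 = rho + (C / f1) * 5
L2 = rho + (C / f2) * 3
n_wl = hmw_widelane(L1, L2, rho, rho, f1, f2)  # recovers N1 - N2 = 2
```

Any site-dependent code bias added to P1/P2 propagates into `n_wl` scaled by 1/lam_wl, which is the systematic offset the DS-IFCB estimate is designed to remove.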
    Two-antenna GNSS Aided-INS Alignment Using Adaptive Control of Filter Noise Covariance
    HAO Yushi, XU Aigong, SUI Xin, WANG Changqiang
    2018, 47(4):  473-479.  doi:10.11947/j.AGCS.2018.20170316
    This paper develops a theory of INS fine alignment to restrain the divergence of the yaw angle; a two-antenna GNSS aided-INS integrated alignment algorithm is utilized. An attitude error measurement equation is constructed based on the relationship between the baseline vectors calculated by the two sensors and the attitude error. The algorithm is executed by an EKF using adaptive control of the filter noise covariance. The experimental results show that the stability of the integrated system is improved under the system noise covariance adaptive control mechanism; the measurement noise covariance adaptive control mechanism can reduce the influence of measurement noise and improve the absolute alignment accuracy; and further improvement is achieved under the condition of minimal bias in the baseline length. The accuracy of roll and pitch is 0.02°, and the accuracy of yaw is 0.04°.
    Total Kalman Filter Method of Dynamic EIV Model
    YU Hang, WANG Jian, WANG Leyang, NING Yipeng, LIU Zhiping
    2018, 47(4):  480-489.  doi:10.11947/j.AGCS.2018.20170098
    Because existing adjustment methods for the dynamic errors-in-variables (EIV) model ignore the random errors in the state propagation matrix of the system equations, this paper establishes a dynamic EIV model that considers the errors of every element in both the observation equations and the system equations. A total Kalman filter (TKF) method and its approximate precision estimator are proposed based on this dynamic EIV model. The similarities and differences between the proposed method, the existing total Kalman filter methods, and total least squares (TLS) methods are also analyzed. The results show that the proposed method is statistically superior to the standard Kalman filter and the existing total Kalman filter methods.
    Separating Static Offset and Seismic Wave from High-rate GPS Data by the Smoothness Priors Method
    YAO Yixin, WANG Yong, ZHAN Jingang, GUO Aizhi
    2018, 47(4):  490-497.  doi:10.11947/j.AGCS.2018.20160648
    High-rate GPS data record detailed information on the static offset and the seismic wave when an earthquake occurs, and accurate separation of the two is very important for rapid inversion and disaster assessment after an earthquake. In this paper, the smoothness priors method (SPM) is introduced to process high-rate GPS data. Through the processing and analysis of high-rate GPS data from three different earthquakes in recent years, it is shown that the SPM can separate the static offset and the seismic wave from high-rate GPS data simply, quickly, and effectively. The processing results clearly record the static offsets induced by the main shock and a strong aftershock, the post-seismic deformation, and the seismic wave, which can constrain the accurate inversion of the seismic source parameters and the fault rupture process.
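The smoothness priors method is commonly written as a regularized detrending operator; a minimal sketch follows (the regularization parameter and the toy series are assumptions, not values from the paper). The slowly varying component, which would carry the static offset, is (I + λ²D₂ᵀD₂)⁻¹z, and the residual carries the fast seismic-wave signal.

```python
import numpy as np

def smoothness_priors(z, lam=10.0):
    """Split a series z into a smooth trend and a detrended residual
    using a second-difference smoothness prior with weight lam."""
    n = len(z)
    # second-difference operator D2, shape (n-2, n)
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(n) + lam**2 * D2.T @ D2, z)
    return trend, z - trend

# sanity check: a purely linear drift has zero second differences,
# so it is absorbed entirely into the trend and the residual vanishes
z = np.linspace(0.0, 1.0, 50)
trend, resid = smoothness_priors(z)
```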
    Waveform Analysis and Retracking of Lake Level Monitoring by Satellite Altimeter Considering the Difference Between Land and Lake Reflection
    TIAN Shanchuan, HAO Weifeng, LI Fei, LUO Tianwen
    2018, 47(4):  498-507.  doi:10.11947/j.AGCS.2018.20170348
    The phenomenon that the distribution of water level along the satellite track is shaped like a "V" is analyzed. To illustrate the problem, the water level of Hongze Lake is calculated from Jason-2 satellite altimeter data. The analysis shows that, due to the small size of the lake and the influence of land, the signal reflected from the land dominates the shape of the waveform. In this case the leading edge produced by the water is not obvious, so the retracker easily locates the wrong leading-edge midpoint. The mechanism of waveform formation and the causes of the water level calculation errors are explained and verified with remote sensing images, and a corresponding waveform retracking algorithm is proposed. The results indicate that the proposed algorithm performs better than common waveform retracking algorithms. After removing outliers, the standard deviation of the water level reaches 0.09 m, which means the analysis method can serve as a reference for altimeter data processing in lake regions.
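For context, a common baseline that the paper's algorithm is compared against is the threshold retracker (this sketch is the generic textbook version, not the paper's method, and the toy waveform is invented): the retracking gate is where the waveform first crosses a fixed fraction of its peak power, interpolated between samples.

```python
import numpy as np

def threshold_retrack(w, frac=0.5):
    """Return the (fractional) gate index where waveform w first
    crosses frac * max(w), using linear interpolation between gates."""
    level = frac * w.max()
    i = int(np.argmax(w >= level))          # first gate at/above the level
    if i == 0:
        return 0.0
    # linear interpolation between gates i-1 and i
    return (i - 1) + (level - w[i - 1]) / (w[i] - w[i - 1])

# toy waveform: flat noise floor, leading edge, plateau, trailing edge
w = np.array([0.0, 0.0, 1.0, 3.0, 5.0, 5.0, 4.0])
gate = threshold_retrack(w)                 # -> 2.75
```

When a bright land return distorts the leading edge, the crossing point (and hence the range) shifts, which is the error mechanism the abstract describes.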
    Random Forest Method for Dimension Reduction and Point Cloud Classification Based on Airborne LiDAR
    XIONG Yan, GAO Renqiang, XU Zhanya
    2018, 47(4):  508-518.  doi:10.11947/j.AGCS.2018.20170417
    Exploring automatic point cloud classification methods is of great importance for 3D modeling, urban land classification, DEM mapping, and so on. Extracting geometric features for point cloud classification from neighborhood structures faces the challenges of choosing the optimal neighborhood scale parameter, high data dimensionality, and complex computation, and has lacked an efficient feature importance analysis and feature selection strategy. To overcome these problems, this paper proposes a point cloud classification and dimension reduction method based on random forests. After analyzing the elevation, intensity, and echo characteristics of laser points, six feature types are extracted, namely normalized height, height statistics, surface metrics, spatial distribution, echo, and intensity features, and a multi-scale feature parameter set is built from them. Finally, supervised classification is conducted using a random forest algorithm to rank the features and choose the best feature subset to classify the point cloud. The results indicate that the overall accuracy of the proposed method is 94.3% (Kappa coefficient 0.922). The proposed method improves the overall accuracy compared with both a strategy without feature selection and an SVM classification strategy; the feature importance analysis indicates that the normalized height is the most important feature for the classification.
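The rank-then-retrain pipeline described above can be sketched with scikit-learn (the data, feature layout, and subset size here are all invented; the paper's features are the six multi-scale types it names): train a random forest, rank features by impurity importance, and retrain on the top-ranked subset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 400)          # toy labels, e.g. ground vs. building
X = rng.normal(size=(400, 6))        # 6 per-point features (invented stand-ins)
X[:, 0] += 3.0 * y                   # make feature 0 ("normalized height") informative

# 1) train on all features and rank them by impurity-based importance
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]

# 2) keep the top-k features and retrain on the reduced set
k = 2
rf_sel = RandomForestClassifier(n_estimators=100, random_state=0)
rf_sel.fit(X[:, ranked[:k]], y)
```

On this toy data the informative "height" feature comes out ranked first, mirroring the paper's finding that normalized height dominates.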
    A Building Extraction Method via Graph Cuts Algorithm by Fusion of LiDAR Point Cloud and Orthoimage
    DU Shouji, ZOU Zhengrong, ZHANG Yunsheng, HE Xue, WANG Jingxue
    2018, 47(4):  519-527.  doi:10.11947/j.AGCS.2018.20160534
    An automatic building extraction method based on the graph cuts algorithm, fusing LiDAR point cloud and orthoimage data, is proposed. Firstly, three geometric features are computed from the LiDAR points: flatness, the distribution of normal vectors, and the GLCM (grey level co-occurrence matrix) homogeneity of the normalized height. The NDVI is simultaneously calculated from the orthoimage. Both kinds of features are then combined to construct the data term of the energy function, and the DSM and NDVI are combined to construct the smoothness term. Thereafter, the graph cuts algorithm is applied to obtain the initial building extraction results. Finally, a foreground/background segmentation method is employed to optimize the building boundary based on the orthoimage color information within a certain range of the initially detected boundary. The ISPRS Vaihingen dataset is used to evaluate the proposed method. The results reveal that the proposed method achieves high accuracy in building detection.
    Registration of TLS and MLS Point Cloud Combining Genetic Algorithm with ICP
    YAN Li, TAN Junxiang, LIU Hua, CHEN Changjun
    2018, 47(4):  528-536.  doi:10.11947/j.AGCS.2018.20170235
    Large-scene point clouds can be quickly acquired by mobile laser scanning (MLS) technology, which needs to be supplemented by terrestrial laser scanning (TLS) point clouds because of its limited field of view and occlusions. MLS and TLS point clouds are located in a geodetic coordinate system and a local coordinate system, respectively. This paper proposes an automatic registration method combining a genetic algorithm (GA) and iterative closest point (ICP) to achieve a uniform coordinate reference frame. ICP uses a local optimizer: it is more efficient than GA registration but depends on an initial solution, whereas the GA is a global optimizer but inefficient. The combining strategy is that ICP takes over to complete the registration once the GA tends toward local search. The rough position measured by the built-in GPS of the terrestrial laser scanner is used in the GA registration to limit its search space. To improve the GA registration accuracy, a registration model called the normalized sum of matching scores (NSMS) is presented. Results on measured data show that the NSMS model is effective: the root mean square error (RMSE) of GA registration is 1-5 cm, and the registration efficiency is improved by about 50% by combining the GA with ICP.
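The ICP refinement stage can be sketched in a few lines (a generic point-to-point ICP with an SVD/Kabsch pose update; the GA coarse stage is assumed to have supplied an initial guess, and the toy grid below is invented):

```python
import numpy as np

def icp_step(src, dst):
    """One ICP iteration: nearest-neighbour matching, then the
    best-fit rigid transform (Kabsch) applied to src."""
    # brute-force nearest neighbour in dst for every src point
    d = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
    matched = dst[d.argmin(axis=1)]
    # Kabsch: optimal rotation/translation between paired sets
    mu_s, mu_m = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_m)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:     # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_m - R @ mu_s
    return src @ R.T + t

def icp(src, dst, iters=20):
    for _ in range(iters):
        src = icp_step(src, dst)
    return src

# toy test: a 3x3x3 grid, slightly rotated and shifted, snaps back
g = np.arange(3.0)
dst = np.array([[x, y, z] for x in g for y in g for z in g])
c = dst.mean(0)
th = 0.05
Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0,         0.0,        1.0]])
src = (dst - c) @ Rz.T + c + np.array([0.1, 0.05, 0.0])
aligned = icp(src, dst)
```

The dependence on the initial guess is visible here: with a large initial rotation the nearest-neighbour matches go wrong, which is exactly the gap the GA's global search fills.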
    Method of Tree-like River Networks Hierarchical Relation Establishing and Generalization Considering Stroke Properties
    LI Chengming, YIN Yong, WU Wei, WU Pengda
    2018, 47(4):  537-546.  doi:10.11947/j.AGCS.2018.20170141

    Tree-like river networks are one of the key features of a map, and the quality of their simplification directly determines the quality of cartographic generalization. Simplifying a tree-like river network needs to consider many characteristics, such as semantics, geometry, topology, and structure, but traditional methods focus only on quantitative indexes such as length and angle, so the spatial distribution characteristics of the simplified results are easily destroyed. A new method based on stroke properties is presented. Firstly, an intelligent identification method for these characteristics is studied on the basis of the directed topological tree (DTT). Secondly, according to the "180° hypothesis" and the "acute angle hypothesis" proposed by Paiva, together with the stroke properties, the hierarchical relationships of the tree-like river network are established. Finally, algorithms for setting the total retained number and for hierarchical elimination and selection are proposed to realize the automatic simplification of tree-like rivers. Results on sample data verify the reliability, and results on actual data verify the rationality and effectiveness, of the proposed method.
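The stroke-building step implied by the "180° hypothesis" can be sketched as follows (a toy formulation with invented direction vectors; real implementations work on the DTT's linked segments): at a junction, the upstream branch whose direction deviates least from the downstream segment, i.e. is closest to straight-on, continues the stroke.

```python
import math

def deflection(d_down, d_up):
    """Angle in degrees between the incoming direction and a
    candidate outgoing direction (0 deg = perfectly straight-on)."""
    dot = d_down[0] * d_up[0] + d_down[1] * d_up[1]
    n = math.hypot(*d_down) * math.hypot(*d_up)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / n))))

def continue_stroke(d_down, candidates):
    """Index of the candidate branch minimizing the deflection angle."""
    return min(range(len(candidates)),
               key=lambda i: deflection(d_down, candidates[i]))

# a branch nearly collinear with the main stem wins over side tributaries
idx = continue_stroke((1.0, 0.0), [(0.0, 1.0), (1.0, 0.05), (-1.0, 0.2)])
```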

    A Progressive Simplification Method for the Estuary Coastline
    DU Jiawei, WU Fang, LI Jinghan, XING Ruixing, GONG Xianyong
    2018, 47(4):  547-556.  doi:10.11947/j.AGCS.2018.20170440
    Estuary coastline simplification is one of the most difficult problems in the simplification of linear features. Having analyzed current research, a new progressive simplification method for estuary coastlines is proposed, considering the representation characteristics influenced by geographical features and the rules of chart generalization for coastlines. Firstly, a binary tree of estuary skeletons is structured by a Delaunay triangulation network to represent the pattern of the estuary coastline. Secondly, leaf streams of the structured model are gradually removed, wholly or partially, to sufficiently simplify small bends and parts of the coastline. Thirdly, narrow details of the bend-simplified coastline are exaggerated to avoid visual conflicts in the estuary. Experimental results show that small details of the coastline that would be invisible at the target scale are sufficiently simplified by the proposed method, while the pattern features of the original estuary coastline are preserved. The proposed method has advantages in both geometry and geography, and it is suitable for simplifying various estuary coastlines in applications.
    Research on the Uncertainty of Deformation Monitoring Analysis and Prediction
    WEI Guanjun
    2018, 47(4):  557-557.  doi:10.11947/j.AGCS.2018.20170499
    Research on the Methods of Emergency Resources Layout and Scheduling in Time-varying Semantic
    BAN Ya
    2018, 47(4):  558-558.  doi:10.11947/j.AGCS.2018.20170429