Paid full text: 29 articles
Free: 1 article
Domestic free: 3 articles

By discipline:
Surveying and mapping: 12
Atmospheric science: 1
Geophysics: 3
Geology: 7
Oceanography: 1
General: 1
Physical geography: 8

By year:
2022: 1 | 2021: 1 | 2020: 1 | 2018: 1 | 2017: 3 | 2016: 7 | 2015: 1 | 2013: 8 | 2012: 1 | 2008: 2 | 2005: 3 | 2004: 1 | 2001: 1 | 1999: 1 | 1997: 1

33 results in total, search time 15 ms
1.
Geophysical data sets are growing at an ever-increasing rate, requiring computationally efficient data selection (thinning) methods to preserve essential information. Satellites such as WindSat provide large data sets for assessing the accuracy and computational efficiency of data selection techniques. A new data thinning technique based on support vector regression (SVR) is developed and tested. To manage large online satellite data streams, observations from WindSat are formed into subsets by Voronoi tessellation, and each subset is then thinned by SVR (TSVR). Three experiments are performed. The first confirms the viability of TSVR for a relatively small sample, comparing it to several commonly used data thinning methods (random selection, averaging, and Barnes filtering); it achieves a 10% thinning rate (90% data reduction), low mean absolute errors (MAE), and high correlations with the original data. A second experiment, using a larger data set, shows TSVR retrievals with MAE < 1 m s⁻¹ and correlations of 0.98; TSVR was an order of magnitude faster than the commonly used thinning methods. A third experiment applies a two-stage pipeline to TSVR to accommodate online data. The pipeline subsets reconstruct the wind field with the same accuracy as in the second experiment and are an order of magnitude faster than the non-pipeline TSVR. Pipeline TSVR is therefore two orders of magnitude faster than commonly used thinning methods that ingest the entire data set. This study demonstrates that pipeline TSVR thinning is an accurate and computationally efficient alternative to commonly used data selection techniques.
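The core idea of SVR thinning can be sketched in a few lines: after fitting an epsilon-SVR to a subset of observations, only the support vectors are retained, and the fitted model reconstructs the field from them. The sketch below uses scikit-learn on a synthetic stand-in for one Voronoi subset; the field, subset size, and SVR hyperparameters are all illustrative assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in for one Voronoi subset of observations
# (coordinates -> wind speed); data and sizes are assumptions.
X = rng.uniform(0.0, 1.0, size=(500, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=500)

# Fit an epsilon-SVR; observations that end up as support vectors form the
# retained ("thinned") subset, the rest are discarded.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.15).fit(X, y)
kept = svr.support_                      # indices of retained observations
thinning_rate = len(kept) / len(y)

# The fitted model reconstructs the field from the retained subset.
mae = np.abs(svr.predict(X) - y).mean()
```

Observations falling inside the epsilon tube carry no information the model needs, which is what makes support vectors a natural thinned subset.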
2.
This paper presents the calibration of an experiment, based on filtration tests, that provides the cumulative constriction size distribution (CSD) of granular materials. Simulations of these tests are performed using a discrete element method. Filters of the same density but different thicknesses are created from a poly-sized spherical material. Lateral periodic boundaries are used for the samples, and their size is calibrated so that a representative elementary volume is obtained. Fine particles are released onto the created samples, and the particle size distribution of the collected material that successfully crossed each filter is computed. These particle size distributions are related to the underlying cumulative CSD of the granular material in the samples, which is derived using a probabilistic approach for the path length of individual particles through a granular material. We establish all the requirements, on both the technique and the released fine particles, needed to recover a correct CSD for the filter. The reference CSD used for calibrating the experiment is obtained from a radical partition of the void space into Delaunay tetrahedra and a geometrical characterisation of the constrictions on each tetrahedron face. Copyright © 2012 John Wiley & Sons, Ltd.
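The probabilistic link between pass fractions and the CSD can be illustrated with a toy Monte Carlo: a particle crosses the filter only if every constriction met along its path exceeds its diameter, so thicker filters (more constrictions per path) pass fewer particles. The lognormal CSD, layer counts, and sizes below are purely hypothetical, not the paper's DEM-derived values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical constriction size distribution (CSD) of the filter (lognormal).
def csd_sample(shape):
    return rng.lognormal(mean=0.0, sigma=0.3, size=shape)

def pass_fraction(diams, n_layers, trials=2000):
    """Monte Carlo: a particle crosses if every one of the n_layers
    constrictions met along its path is larger than its diameter."""
    out = []
    for d in diams:
        crossed = (csd_sample((trials, n_layers)).min(axis=1) > d).mean()
        out.append(crossed)
    return np.array(out)

diams = np.array([0.5, 1.0, 1.5])
frac_thin = pass_fraction(diams, n_layers=4)    # thin filter
frac_thick = pass_fraction(diams, n_layers=12)  # thick filter
```

Comparing pass fractions across thicknesses is what lets the underlying single-constriction CSD be inferred from filtration tests.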
3.
Street patterns reflect the distribution characteristics of a street network and affect urban structure and human behavior. The recognition of street patterns has been a topic of interest for decades. In this study, a linear tessellation model is proposed to identify spatial patterns in street networks. Street segments are broken into consecutive linear units of equal length. We define five focal operations using neighborhood analysis to extract the geometric and topological characteristics of each linear unit for grid-pattern recognition. The units are then classified by a support vector machine, and the result is optimized based on Gestalt principles. The experimental results demonstrate that our method is effective for mining grid patterns in a street network.
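The first step, breaking a street segment into consecutive equal-length linear units, can be sketched with arc-length interpolation. The polyline coordinates and unit length below are illustrative; the paper's focal operations and SVM classification are not reproduced.

```python
import numpy as np

def linear_units(polyline, unit_len):
    """Break a polyline into breakpoints spaced (approximately) unit_len
    apart along its arc length, defining consecutive linear units."""
    pts = np.asarray(polyline, float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)   # segment lengths
    s = np.concatenate([[0.0], np.cumsum(seg)])          # cumulative arc length
    total = s[-1]
    n = max(1, int(round(total / unit_len)))             # number of units
    si = np.linspace(0.0, total, n + 1)                  # target arc lengths
    x = np.interp(si, s, pts[:, 0])
    y = np.interp(si, s, pts[:, 1])
    return np.stack([x, y], axis=1)

# An L-shaped street of total length 15 split into unit-length pieces.
units = linear_units([[0, 0], [10, 0], [10, 5]], unit_len=1.0)
```

Each linear unit then becomes the elementary cell on which neighborhood (focal) statistics are computed.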
4.
We attempt to describe the role of tessellated models of space within the discipline of Geographic Information Systems (GIS) – a speciality coming largely out of geography and land surveying, where there was a strong need to represent information about the land's surface within a computer system rather than on the original paper maps. We look at some of the basic operations in GIS, including dynamic and kinetic applications, examine issues of topology and data structures, and produce a tessellation model that may be applied widely to both traditional "object" and "field" data types. Part I of this study examined object and field spatial models, the Voronoi extension of objects, and the graphs that express the resulting adjacencies. The required data structures were also briefly described, along with 2D and 3D structures and hierarchical indexing, and the importance of graph duality was emphasized. This second paper builds on the structures described in the first and examines how they may be modified: change may often be associated with either viewpoint or time. Incremental algorithms permit additional point insertion, with applications involving the addition of skeleton points for map scanning, contour enrichment, or watershed delineation and simulation. Dynamic algorithms permit skeleton smoothing and higher-order Voronoi diagram applications, including Sibson interpolation. Kinetic algorithms allow collision detection applications, free-Lagrange flow modeling, and pen-movement simulation for map drawing. If desired, these methods may be extended to 3D. Based on this framework, it can be argued that tessellation models are fundamental to our understanding and processing of geographical space, and provide a coherent framework for understanding the "space" in which we exist.
5.
This work deals with the geostatistical simulation of a family of stationary random field models with bivariate isofactorial distributions. Such models are defined as the sum of independent random fields with mosaic-type bivariate distributions and infinitely divisible univariate distributions. For practical applications, dead leaves tessellations are used, since they provide a wide range of models and allow the realizations to be conditioned to a set of data via an iterative procedure (simulated annealing). The model parameters can be determined by comparing the data variogram and madogram, and make it possible to control the spatial connectivity of the extreme values in the realizations. An application to a forest dataset is presented, in which a negative binomial model is used to characterize the distribution of coniferous trees over a wooded area.
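A dead leaves tessellation itself is simple to simulate: discs ("leaves") fall sequentially onto the domain until every pixel is covered, and each pixel keeps the value of the first disc that covers it, producing a mosaic random field. The disc sizes, the exponential marginal (chosen only because it is infinitely divisible), and the grid size below are illustrative assumptions.

```python
import numpy as np

def dead_leaves(n=64, r_min=3.0, r_max=16.0, seed=0):
    """Simulate a dead leaves mosaic on an n x n grid: discs with random
    centres, radii, and values fall until the grid is fully covered; each
    pixel keeps the value of the first disc that covers it."""
    rng = np.random.default_rng(seed)
    field = np.full((n, n), np.nan)          # NaN marks still-uncovered pixels
    yy, xx = np.mgrid[0:n, 0:n]
    while np.isnan(field).any():
        cx, cy = rng.uniform(0, n, size=2)
        r = rng.uniform(r_min, r_max)
        value = rng.exponential(1.0)          # illustrative marginal
        mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
        field = np.where(np.isnan(field) & mask, value, field)
    return field

mosaic = dead_leaves(64)
```

The resulting field is piecewise constant over leaf fragments, which is what produces the mosaic-type bivariate distributions the model relies on; conditioning to data (simulated annealing) is a separate step not sketched here.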
6.
This paper proposes methods for detecting apparent differences between spatial tessellations at two different points in time, with the objective of conflating spatial tessellations at multiple time points. The methods comprise three steps. First, we eliminate systematic differences between the tessellations using an affine transformation. Second, we match subregions between the tessellations at the two points in time, and match boundaries based on the matching relationships between the subregions. Third, we propose a distance metric for measuring differences between the matched boundaries and a method for determining whether the measured differences are apparent or not. We apply the proposed methods to part of the US Census data for 1990 and 2000 and empirically demonstrate their effectiveness.
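The first step, removing systematic differences with an affine transformation, amounts to a least-squares fit of a 2-D affine map between matched point pairs. The point coordinates below are hypothetical; with exact affine data the fit recovers the transformation exactly.

```python
import numpy as np

# Matched points in the two tessellations (hypothetical data): the second
# set is the first under a known affine map, so the fit can be checked.
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.2]], float)
A_true = np.array([[1.01, 0.02], [-0.01, 0.99]])
t_true = np.array([0.10, -0.05])
dst = src @ A_true.T + t_true

# Solve dst ≈ [x y 1] @ P in the least-squares sense; P stacks A^T over t.
X = np.hstack([src, np.ones((len(src), 1))])
P, *_ = np.linalg.lstsq(X, dst, rcond=None)

est = X @ P                                  # src mapped by the fitted affine
residual = np.abs(est - dst).max()
```

Applying the fitted map to one tessellation before matching ensures that the remaining boundary differences reflect genuine change rather than registration error.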
7.
The spatial resolution of imaging sensors has increased dramatically in recent years, and so too have the challenges associated with extracting meaningful information from their data products. Object-based image analysis (OBIA) is rapidly gaining popularity in remote sensing science as a means of bridging very high spatial resolution (VHSR) imagery and GIS. Multiscale image segmentation is a fundamental step in OBIA, yet there is currently no tool available to objectively guide the selection of appropriate segmentation scales. We present a technique for estimating the scale parameter in image segmentation of remotely sensed data with Definiens Developer®, in which the degree of heterogeneity within an image-object is controlled by a subjective measure called the 'scale parameter'. We propose a tool, estimation of scale parameter (ESP), that builds on the idea of the local variance (LV) of object heterogeneity within a scene. The ESP tool iteratively generates image-objects at multiple scale levels in a bottom-up approach and calculates the LV for each scale. Variation in heterogeneity is explored by plotting LV against the corresponding scale. Thresholds in the rate of change of LV (ROC-LV) indicate the scale levels at which the image can be segmented most appropriately, relative to the data properties at the scene level. Our tests on different types of imagery indicated fast processing times and accurate results. The simple yet robust ESP tool enables fast and objective parametrization of image segmentation and holds great potential for OBIA applications.
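The LV/ROC-LV idea can be illustrated without Definiens: here a toy "segmentation" uses square blocks of increasing size in place of true image-objects, LV is the mean within-object standard deviation, and ROC-LV peaks when the object size overshoots the true structure scale. The synthetic image, its object size of 16 pixels, and the block-based stand-in for segmentation are all assumptions for illustration only.

```python
import numpy as np

def local_variance(image, block):
    """Mean within-block standard deviation, a toy stand-in for the LV of
    image-objects produced at a given segmentation scale."""
    n = (image.shape[0] // block) * block
    img = image[:n, :n]
    blocks = img.reshape(n // block, block, n // block, block).swapaxes(1, 2)
    return blocks.std(axis=(2, 3)).mean()

rng = np.random.default_rng(1)
# Synthetic scene: homogeneous 16x16 patches plus a little noise.
base = np.kron(rng.normal(size=(8, 8)), np.ones((16, 16)))
image = base + 0.05 * rng.normal(size=base.shape)

scales = [2, 4, 8, 16, 32, 64]
lv = [local_variance(image, b) for b in scales]
# Percentage rate of change of LV between consecutive scales.
roc = [100 * (b - a) / a for a, b in zip(lv, lv[1:])]
```

LV stays near the noise level while objects fit inside the true patches, then jumps once blocks span several patches; the ROC-LV peak flags that transition as the candidate segmentation scale.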
8.
With the increased use of locational information, spatial location referencing and coding methods have become much more important to the mining of both geographical and non-geographical data in digital earth systems. Unfortunately, current geocoding methods, based on reverse lookup of coordinates for a given address, have proven too lossy with respect to administrative and socioeconomic data. This paper proposes a spatial subdivision and geocoding model based on spatial address regional tessellation (SART). Given a hierarchical address-object definition, and based on the 'region of influence' characteristics of an address, SART creates multiresolution spatial subdivisions from irregular, contiguous address regions. This model reflects most of the geographical features, and many of the social and economic implications, of a given address, and better reflects the way people understand addresses and spatial locations. We also propose a corresponding geocoding method for standard addresses (SART-GC). The codes generated by this method record address footprints, hierarchical relationships, and spatial scales in a single data structure. Finally, by applying our methods to the Shibei District of Qingdao, we demonstrate the suitability of SART-GC for multi-scale spatial information representation in digital earth systems.
9.
In synthetic aperture radar (SAR) interferometry, precise co-registration is a key problem in obtaining a fine interferogram; theoretically, an accuracy of 1/10 pixel is essential. Once the control points are determined, relative rectification (pixel re-sampling) is also an important procedure. In this paper, a piecewise transformation algorithm based on Delaunay tessellation is developed. The strategy is to 'anchor' all control points geographically and transform the imagery triangle by triangle. The piecewise algorithm, the accuracy evaluation, and preliminary results are described. The experimental results show that the piecewise transformation outperforms the traditional polynomial transformation.
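A triangle-by-triangle transformation can be sketched with SciPy: triangulate the control points in one image, locate each query pixel's triangle, and map it with that triangle's affine transform expressed in barycentric coordinates. The control-point coordinates below are hypothetical (a pure shift, so the result is easy to check); real SAR co-registration would use matched control points from the two images.

```python
import numpy as np
from scipy.spatial import Delaunay

# Control points in the slave image and their matched master-image
# coordinates (hypothetical: a pure shift of (1, 2)).
slave = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], float)
master = slave + np.array([1.0, 2.0])

tri = Delaunay(slave)  # triangulate control points in the slave image

def piecewise_affine(pts):
    """Map points through the per-triangle affine transform defined by the
    matched control points, via barycentric coordinates."""
    simplices = tri.find_simplex(pts)
    out = np.empty_like(pts)
    for i, (p, s) in enumerate(zip(pts, simplices)):
        T = tri.transform[s]                 # barycentric transform of triangle s
        b = T[:2].dot(p - T[2])              # coords for first two vertices
        bary = np.append(b, 1.0 - b.sum())   # third coordinate
        out[i] = bary @ master[tri.simplices[s]]
    return out

q = np.array([[2.0, 3.0], [7.5, 6.5]])
warped = piecewise_affine(q)
```

Because each triangle gets its own exact affine map through its three control points, the transformation interpolates the control points exactly, unlike a single global polynomial fit.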
10.
In a previous work, discrete modeling and a statistical approach were used to quantify the minimal representative volume element (RVE) size for aggregate composites, such as bituminous materials, in the linear elastic regime (Comput. Mater. Sci. 2005; 33:467–490). In this paper, the discrete model is extended to strain-softening behavior under cyclic loading. The existence of an RVE for fatigue life prediction is numerically demonstrated and its minimal size is determined; it is found to be much larger than the minimal RVE size in the elastic regime. Copyright © 2007 John Wiley & Sons, Ltd.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号