3.
Abstract

This paper introduces a new concept, distributed geospatial information processing (DGIP), which refers to the processing of geospatial information that resides on geographically dispersed computers connected through computer networks, and examines the contribution of DGIP to Digital Earth (DE). DGIP plays a critical role in integrating widely distributed geospatial resources to support a DE envisioned to utilise a wide variety of information. This paper addresses this role from three aspects: 1) sharing Earth data, information, and services through geospatial interoperability, supported by standardisation of contents and interfaces; 2) sharing computing and software resources through a GeoCyberinfrastructure, supported by DGIP middleware; and 3) sharing knowledge within and across domains through ontology and semantic searches. Observing the long-term process of researching and developing an operational DE, we discuss the practical contributions the DGIP can be expected to make to the DE.
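As an illustration of sharing services through standardised interfaces, the sketch below builds an OGC WMS 1.3.0 GetCapabilities request URL. The endpoint passed in is a placeholder for any standards-compliant server, not one named in the paper.

```python
from urllib.parse import urlencode

def wms_get_capabilities_url(base_url: str) -> str:
    """Build a standard OGC WMS GetCapabilities request URL.

    Parameter names and values follow the OGC WMS 1.3.0 specification;
    the base URL is a placeholder for any compliant server.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetCapabilities",
    }
    return f"{base_url}?{urlencode(params)}"
```

Because the parameters are standardised, the same request works against any conforming server, which is the essence of interface-level interoperability.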
4.
Abstract

Big data have shifted spatial optimization from a purely computation-intensive problem to a data-intensive challenge. This is especially the case for spatiotemporal (ST) land use/land cover change (LUCC) research. In addition to greater variety, for example from sensing platforms, big data offer datasets at higher spatial and temporal resolutions; these new offerings require new methods to optimize data handling and analysis. We propose a LUCC-based geospatial cyberinfrastructure (GCI) that optimizes big data handling and analysis, in this case with raster data. The GCI provides three levels of optimization. First, we employ spatial optimization with graph-based image segmentation. Second, we propose an ST Atom Model to temporally optimize the image segments for LUCC. Finally, the first two domain-specific ST optimizations are supported by computational optimization for big data analysis. The evaluation is conducted using DMTI (DMTI Spatial Inc.) Satellite StreetView imagery datasets acquired for the Greater Montreal area, Canada in 2006, 2009, and 2012 (534 GB, 60 cm spatial resolution, RGB imagery). Our LUCC-based GCI builds an optimization bridge among LUCC, ST modelling, and big data analysis.
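The abstract does not spell out its graph-based segmentation, so the following is only a minimal union-find sketch that merges 4-connected pixels whose intensity difference falls within a tolerance, a drastic simplification of Felzenszwalb-style segmentation rather than the authors' exact algorithm.

```python
def segment_image(pixels, tolerance):
    """Greedy graph-based segmentation sketch: merge 4-connected pixels
    whose intensity difference is within `tolerance`, using union-find.
    Returns a grid of segment labels (union-find root indices)."""
    rows, cols = len(pixels), len(pixels[0])
    parent = list(range(rows * cols))

    def find(i):
        # Path-halving find.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    for r in range(rows):
        for c in range(cols):
            idx = r * cols + c
            # Compare each pixel with its right and bottom neighbors.
            if c + 1 < cols and abs(pixels[r][c] - pixels[r][c + 1]) <= tolerance:
                union(idx, idx + 1)
            if r + 1 < rows and abs(pixels[r][c] - pixels[r + 1][c]) <= tolerance:
                union(idx, idx + cols)

    return [[find(r * cols + c) for c in range(cols)] for r in range(rows)]
```

On raster time series, segment labels like these (rather than raw pixels) become the units whose changes are tracked over time.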
5.
For geospatial cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response, and decision-making. This is especially true for vector datasets, which serve as irreplaceable, concrete material in data-driven geospatial applications: their rich geometry and property information facilitates the development of interactive, efficient, and intelligent data analysis and visualization applications. However, big-data issues have hindered the wide adoption of vector datasets in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: (1) pre-generated and on-the-fly generalization, which automatically determines the proper simplification level by introducing an appropriate distance tolerance to speed up simplification; (2) a progressive attribute transmission method to reduce data size and, therefore, service response time; and (3) compressed data transmission with dynamic selection of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed to implement the proposed technologies. After applying our optimization strategies, substantial performance enhancement is achieved. We expect this work to facilitate real-time spatial feature sharing, visual analytics, and decision-making.
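The abstract does not name its simplification algorithm; Douglas-Peucker is a common choice for distance-tolerance-driven line generalization, sketched below. The tolerance parameter plays the same role as the distance tolerance described above: larger values yield fewer vertices and smaller payloads.

```python
def perpendicular_distance(pt, start, end):
    """Distance from `pt` to the infinite line through `start` and `end`."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return ((x - x1) ** 2 + (y - y1) ** 2) ** 0.5
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, tolerance):
    """Simplify a polyline, keeping only points farther than `tolerance`
    from the chord between the current segment's endpoints."""
    if len(points) < 3:
        return points
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > tolerance:
        # Recurse on both halves around the farthest point.
        left = douglas_peucker(points[: index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right
    return [points[0], points[-1]]
```

Nearly collinear vertices collapse to the chord endpoints, while genuine corners survive any tolerance smaller than their offset.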
6.
A variety of Earth observation systems monitor the Earth and provide petabytes of geospatial data to decision-makers and scientists on a daily basis. However, few studies utilize spatiotemporal patterns to optimize the management of these Big Data. This article reports a new indexing mechanism that integrates spatiotemporal patterns to support Big Earth Observation (EO) metadata indexing for global user access. Specifically, the predefined multiple indices mechanism (PMIM) categorizes heterogeneous user queries based on spatiotemporal patterns, and multiple indices are predefined for the various user categories. A new indexing structure, the Access Possibility R-tree (APR-tree), is proposed to build an R-tree-based index from spatiotemporal query patterns. The proposed indexing mechanism was compared with the classic R*-tree index in a number of scenarios. The experimental results show that the proposed mechanism generally outperforms a regular R*-tree and better supports the operation of the Global Earth Observation System of Systems (GEOSS) Clearinghouse.
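The APR-tree itself is not reproduced here. As a minimal stand-in, the sketch below buckets metadata bounding boxes into a uniform grid and answers bounding-box queries with the same filter-and-refine pattern that tree indexes such as the R*-tree follow; record IDs and the cell size are illustrative.

```python
from collections import defaultdict

def build_grid_index(records, cell_size=10.0):
    """Bucket metadata records by every grid cell their bounding box
    (minx, miny, maxx, maxy) overlaps. A toy stand-in for a tree index."""
    index = defaultdict(list)
    for rec_id, (minx, miny, maxx, maxy) in records.items():
        for gx in range(int(minx // cell_size), int(maxx // cell_size) + 1):
            for gy in range(int(miny // cell_size), int(maxy // cell_size) + 1):
                index[(gx, gy)].append(rec_id)
    return index

def query(index, records, bbox, cell_size=10.0):
    """Filter: gather candidates from overlapping cells.
    Refine: keep only true bounding-box intersections."""
    minx, miny, maxx, maxy = bbox
    candidates = set()
    for gx in range(int(minx // cell_size), int(maxx // cell_size) + 1):
        for gy in range(int(miny // cell_size), int(maxy // cell_size) + 1):
            candidates.update(index.get((gx, gy), []))
    hits = []
    for rid in candidates:
        rminx, rminy, rmaxx, rmaxy = records[rid]
        if rminx <= maxx and rmaxx >= minx and rminy <= maxy and rmaxy >= miny:
            hits.append(rid)
    return sorted(hits)
```

The paper's contribution is essentially which structure the filter step uses: the APR-tree shapes it around observed spatiotemporal query patterns instead of geometry alone.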
7.
Abstract

The geospatial sciences face grand information technology (IT) challenges in the twenty-first century: data intensity, computing intensity, concurrent-access intensity, and spatiotemporal intensity. These challenges require a computing infrastructure that can: (1) better support discovery, access, and utilization of data and data processing, relieving scientists and engineers of IT tasks so they can focus on scientific discoveries; (2) provide real-time IT resources to enable real-time applications, such as emergency response; (3) deal with access spikes; and (4) provide more reliable and scalable service for massive numbers of concurrent users to advance public knowledge. The emergence of cloud computing offers a potential solution: an elastic, on-demand computing platform that integrates the essential elements of the geospatial sciences (observation systems, parameter-extracting algorithms, phenomena simulations, analytical visualization, and decision support) and provides social impact and user feedback. We discuss the use of cloud computing to support these intensities by reporting our investigations of how cloud computing can enable the geospatial sciences, and how spatiotemporal principles, the kernel of the geospatial sciences, can be utilized to ensure the benefits of cloud computing. Four research examples analyze how to: (1) search, access, and utilize geospatial data; (2) configure computing infrastructure to enable the computability of intensive simulation models; (3) disseminate and utilize research results for massive numbers of concurrent users; and (4) adopt spatiotemporal principles to support spatiotemporally intensive applications. The paper concludes with a discussion of opportunities and challenges for spatial cloud computing (SCC).
8.
Agent-based models (ABM) allow for the bottom-up simulation of dynamics in complex adaptive spatial systems through the explicit representation of pattern-process interactions. This bottom-up simulation, however, is both data- and computing-intensive. While cyberinfrastructure (CI) provides support for intensive computation, the appropriate management and use of CI-enabled computing resources for ABM raises a challenging and intriguing issue. To gain insight into this issue, we present a service-oriented simulation framework that supports spatially explicit agent-based modeling within a CI environment. The framework is designed at three levels: intermodel, intrasimulation, and individual. Functionalities at these levels are encapsulated into services, each of which is an assembly of new or existing services. Services at the intermodel and intrasimulation levels are suitable for generic ABM; individual-level services are designed specifically for modeling intelligent agents. The service-oriented framework enables the integration of domain-specific functionalities for ABM and provides access to high-performance and distributed computing resources for simulation tasks that are often computationally intensive. We used a case study to investigate the utility of the framework in enabling agent-based modeling within a CI environment, conducting experiments using supercomputing resources on the TeraGrid, a key element of the US CI. The results indicate that the service-oriented framework facilitates leveraging CI-enabled resources for computationally intensive agent-based modeling.
9.
Abstract

Geospatial simulation models can help us understand the dynamic aspects of Digital Earth. To implement high-performance simulation models for complex geospatial problems, grid computing and cloud computing are two promising computational frameworks. This research compares the benefits and drawbacks of both in Web-based frameworks by testing a parallel Geographic Information System (GIS) simulation model (Schelling's residential segregation model). The parallel GIS simulation model was tested on XSEDE (a representative grid computing platform) and Amazon EC2 (a representative cloud computing platform). The test results demonstrate that cloud computing platforms can provide almost the same parallel computing capability as high-end grid computing platforms. However, cloud computing resources are more accessible to individual scientists, easier to request and set up, and offer a more scalable software architecture for on-demand and dedicated Web services. These advantages may attract more geospatial scientists to utilize cloud computing for the development of Digital Earth simulation models in the future.
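Schelling's model is simple enough to sketch. Below is a minimal, serial single-step implementation; the threshold, grid encoding, and movement rule are illustrative choices, not the paper's parallel configuration.

```python
import random

def schelling_step(grid, threshold=0.5, rng=None):
    """One sweep of a minimal Schelling segregation model.
    0 marks an empty cell; 1 and 2 are the two agent groups. An agent
    with fewer than `threshold` like (Moore-neighborhood) neighbors
    moves to a random empty cell. Mutates and returns `grid`."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    rows, cols = len(grid), len(grid[0])
    empty = [(r, c) for r in range(rows) for c in range(cols) if grid[r][c] == 0]
    for r in range(rows):
        for c in range(cols):
            agent = grid[r][c]
            if agent == 0:
                continue
            neighbors = [grid[rr][cc]
                         for rr in range(max(0, r - 1), min(rows, r + 2))
                         for cc in range(max(0, c - 1), min(cols, c + 2))
                         if (rr, cc) != (r, c) and grid[rr][cc] != 0]
            unhappy = (neighbors and
                       sum(n == agent for n in neighbors) / len(neighbors) < threshold)
            if unhappy and empty:
                er, ec = empty.pop(rng.randrange(len(empty)))
                grid[er][ec], grid[r][c] = agent, 0
                empty.append((r, c))
    return grid
```

The paper's point is that such a step is cheap, but large grids iterated many times demand the parallel capability that both XSEDE and EC2 were tested on.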
10.
Abstract

One of the major scientific challenges and societal concerns is making informed decisions that ensure sustainable groundwater availability in the face of deep uncertainty. A major computational requirement associated with this is on-demand computing for risk analysis to support timely decisions. This paper presents a scientific modeling service called 'ModflowOnAzure' which enables large-scale ensemble runs of groundwater flow models to be executed in parallel in the Windows Azure cloud. Several technical issues were addressed, including the conjunctive use of desktop tools in MATLAB to avoid license issues in the cloud, integration of Dropbox with Azure for improved usability ('Drop-and-Compute'), and automated file exchange between the desktop and the cloud. Two scientific use cases with significant computational speedup are presented. One case, from Arizona, uses six plausible alternative conceptual models and a stochastic streamflow model to evaluate the impacts of different groundwater pumping scenarios. The other, from Texas, performs a global sensitivity analysis on a regional groundwater availability model. Both cases yield uncertainty analysis results that can assist groundwater planning and sustainability studies.
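The cloud fan-out of ensemble members can be mimicked locally. In the sketch below, hypothetical scenario dictionaries are mapped over a worker pool; `run_model`, the field names, and the drawdown formula are invented stand-ins for an actual MODFLOW run, and Azure workers are replaced by local threads.

```python
from concurrent.futures import ThreadPoolExecutor

def run_model(scenario):
    """Stand-in for one ensemble member. A real member would launch a
    MODFLOW run; here we derive a mock drawdown from a hypothetical
    pumping rate so the fan-out pattern can be demonstrated."""
    return {"name": scenario["name"],
            "drawdown": scenario["pumping_rate"] * 0.1}

def run_ensemble(scenarios, max_workers=4):
    """Execute ensemble members concurrently; `map` preserves input
    order, so results line up with their scenarios."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_model, scenarios))
```

Swapping the executor for cloud workers (as ModflowOnAzure does) changes where the members run, not the fan-out logic itself.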