Search results: 60 records in total.
42.
The evolution of lava flows emplaced on Mount Etna (Italy) in September 2004 is examined in detail through the analysis of morphometric measurements of flow units. The growth of the main channelized flow is consistent with a layering of lava blankets, which maintains the initial geometry of the channel (although levees are widened and raised), and is here explicitly related to the repeated overflow of lava pulses. A simple analytical model, named FLOWPULSE, is introduced to describe the evolution of the flow level in a channelized flow unit fed by a fluctuating supply. The model shows that a fluctuation in the velocity of lava extrusion at the vent triggers the formation of pulses, which become increasingly high the farther they travel from the vent and are invariably destined to overflow within a given distance. The FLOWPULSE simulations are in accordance with the observed morphology, characterized by a very flat initial profile followed by a massive increase in flow-unit cross-section area between 600 and 700 m downflow. The modeled emplacement dynamics also provides an explanation for the observed substantial "loss" of the original flowing mass with increasing distance from the vent.
43.
Since the 1970s, multiple reconstruction techniques have been proposed, and are currently used, to extrapolate and quantify eruptive parameters from sampled tephra fall deposit datasets. Atmospheric transport and deposition processes strongly control the spatial distribution of tephra deposits; therefore, a large uncertainty affects mass-derived estimations, especially for fall layers that are not well exposed. This paper has two main aims: the first is to analyse the sensitivity of reconstruction techniques to the deposit sampling strategy; the second is to assess whether there are differences between the modelled values for emitted mass and grain size and the values estimated from the deposits. We find significant differences and propose a new correction strategy. A numerical approach is demonstrated by simulating with a dispersal code a mild explosive event that occurred at Mt. Etna on 24 November 2006. Eruptive parameters are reconstructed by inverting information collected after the eruption. A full synthetic deposit is created by integrating the deposited mass computed by the model over the computational domain (i.e., an area of 7.5 × 10⁴ km²). A statistical analysis based on 2000 sampling tests, each using 50 sampling points, shows a large variability, up to 50%, for all the reconstruction techniques. Moreover, for some test examples the Power Law errors are larger than the estimated uncertainty. A similar analysis of the simulated grain-size classes shows how spatial sampling limitations strongly reduce the utility of the available information on the total grain-size distribution. For example, information on particles coarser than Φ = -4 is completely lost when sampling at 1.5 km from the vent for all columns with heights less than 2000 m above the vent. To correct for this effect, an optimal sampling strategy and a new reconstruction method are presented. A sensitivity study shows that our method can be extended to a wide range of eruptive scenarios, including those in which aggregation processes are important. The new correction method provides an estimate of the mass-deposition deficiency for each simulated class, yielding a reliable estimation of the uncertainties in the reconstructed total (whole-deposit) grain-size distribution.
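As a rough illustration of the kind of sampling sensitivity described above, the sketch below repeatedly draws 50 random sampling points from an idealised, wind-elongated synthetic deposit and re-estimates the total mass with a simple radially symmetric exponential fit. It is a stand-in for neither the paper's dispersal code nor its reconstruction techniques; the decay lengths, domain size, and total mass are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of a sampling-sensitivity test:
# a synthetic deposit is sampled repeatedly at 50 random points and the total
# mass is re-estimated each time with a simple exponential-thinning fit.
import numpy as np

rng = np.random.default_rng(0)

def synthetic_load(x_km, y_km, total_mass_kg=1e9, decay_x_km=8.0, decay_y_km=3.0):
    """Idealised, wind-elongated deposit load (kg/m^2) that integrates to total_mass_kg."""
    rho = np.hypot(x_km / decay_x_km, y_km / decay_y_km)
    k = total_mass_kg / (2.0 * np.pi * decay_x_km * decay_y_km * 1e6)  # kg/m^2 at the vent
    return k * np.exp(-rho)

def estimate_mass(x_km, y_km, load):
    """Radially symmetric exponential fit: ln(load) vs distance, integrated analytically."""
    r = np.hypot(x_km, y_km)
    slope, intercept = np.polyfit(r, np.log(load), 1)     # ln(load) = intercept + slope * r
    k_fit, decay_km = np.exp(intercept), -1.0 / slope
    return 2.0 * np.pi * k_fit * (decay_km * 1e3) ** 2    # integral of k*exp(-r/L) over the plane

estimates = []
for _ in range(2000):                                     # 2000 sampling tests, as in the abstract
    x = rng.uniform(-50.0, 50.0, 50)                      # 50 random sampling points (km)
    y = rng.uniform(-50.0, 50.0, 50)
    estimates.append(estimate_mass(x, y, synthetic_load(x, y)))

estimates = np.array(estimates)
print(f"relative spread of mass estimates: {estimates.std() / estimates.mean():.2%}")
```

Because the estimator assumes a radially symmetric deposit while the synthetic one is elongated downwind, each random set of 50 points yields a different mass, which is the sort of sampling-driven variability the abstract quantifies.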
44.
We compare the performance of four stochastic optimisation methods using four analytic objective functions and two highly non-linear geophysical optimisation problems: one-dimensional elastic full-waveform inversion and residual static computation. The four methods we consider, namely adaptive simulated annealing, genetic algorithm, neighbourhood algorithm, and particle swarm optimisation, are frequently employed for solving geophysical inverse problems. Because geophysical optimisations typically involve many unknown model parameters, we are particularly interested in comparing the performance of these stochastic methods as the number of unknown parameters increases. The four analytic functions we choose simulate common types of objective functions encountered in geophysical optimisations: a convex function, two multi-minima functions that differ in the distribution of minima, and a nearly flat function. As in the analytic tests, the two seismic optimisation problems we analyse are characterised by very different objective functions. The first problem is a one-dimensional elastic full-waveform inversion, which is strongly ill-conditioned and exhibits a nearly flat objective function, with a valley of minima extending along the density direction. The second problem is the residual static computation, which is characterised by a multi-minima objective function produced by the so-called cycle-skipping phenomenon. According to the tests on the analytic functions and on the seismic data, the genetic algorithm generally displays the best scaling with the number of parameters. It encounters problems only in the case of an irregular distribution of minima, that is, when the global minimum is at the border of the search space and a number of important local minima are distant from the global minimum. The adaptive simulated annealing method is often the best-performing method for low-dimensional model spaces, but its performance worsens as the number of unknowns increases. Particle swarm optimisation is effective in finding the global minimum in the case of low-dimensional model spaces with few local minima or in the case of a narrow flat valley. Finally, the neighbourhood algorithm is competitive with the other methods only for low-dimensional model spaces; its performance worsens considerably for multi-minima objective functions.
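For context, a minimal sketch of this kind of dimensionality scaling test is given below. It uses two off-the-shelf SciPy optimisers (dual annealing as a simulated-annealing variant and differential evolution as an evolutionary stand-in, not the four algorithms benchmarked in the paper) on the multi-minima Rastrigin function; the dimensions and iteration limits are arbitrary choices.

```python
# Sketch of a scaling test: run two stochastic optimisers on a multi-minima
# benchmark while the number of unknowns grows, and compare the misfit reached.
import numpy as np
from scipy.optimize import dual_annealing, differential_evolution

def rastrigin(x):
    """Multi-minima benchmark; global minimum f = 0 at the origin."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

for ndim in (2, 8, 32):                                  # increasing model-space dimension
    bounds = [(-5.12, 5.12)] * ndim
    sa = dual_annealing(rastrigin, bounds, seed=0, maxiter=500)
    de = differential_evolution(rastrigin, bounds, seed=0, maxiter=500)
    print(f"n={ndim:3d}  dual_annealing: f={sa.fun:8.3f} ({sa.nfev} evals)   "
          f"differential_evolution: f={de.fun:8.3f} ({de.nfev} evals)")
```

Repeating such runs over many random seeds and several objective-function shapes is what allows the scaling behaviour with the number of unknowns to be compared, which is the kind of comparison the paper carries out for its four algorithms.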
45.
Continuous GPS (CGPS) data collected at Mt. Etna between April 2012 and October 2013 clearly define inflation/deflation processes typically observed before and after an eruption onset. During the inflation phase from May to October 2013, a particular deformation pattern localised in the upper north-eastern sector of the volcano suggests that a magma intrusion occurred a few kilometres away from the axis of the summit craters, beneath the NE Rift system. This is the first time that this pattern has been recorded by CGPS data at Mt. Etna. We believe that this inflation process might have taken place periodically at Mt. Etna and might be associated with the intrusion of batches of magma that are separate from the main feeding system. We provide a model to explain this unusual behaviour and the eruptive regime of this rift zone, which is characterised by long periods of quiescence followed by often dangerous eruptions in which vents can open at low elevation and thus threaten the villages in this sector of the volcano.
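To illustrate how a localised CGPS deformation pattern can be tied to an off-axis intrusion, the sketch below evaluates the standard Mogi point-source approximation at a few hypothetical stations. The source depth, volume change, and station positions are assumptions for illustration only and do not reproduce the model proposed in the paper.

```python
# Surface displacements from a point pressure source in an elastic half-space
# (the classical Mogi approximation), evaluated at hypothetical CGPS stations.
import numpy as np

def mogi_displacement(x_m, y_m, source_x, source_y, depth_m, dvol_m3, nu=0.25):
    """Return (east, north, up) displacements in metres for a Mogi point source."""
    dx = np.asarray(x_m) - source_x
    dy = np.asarray(y_m) - source_y
    R3 = (dx**2 + dy**2 + depth_m**2) ** 1.5
    c = (1.0 - nu) * dvol_m3 / np.pi
    return c * dx / R3, c * dy / R3, c * depth_m / R3

# Hypothetical station layout (local east/north, metres) around an off-axis source
east = np.array([0.0, 2000.0, -1500.0, 4000.0])
north = np.array([0.0, 1000.0, 3000.0, -2500.0])
ue, un, uz = mogi_displacement(east, north,
                               source_x=1000.0, source_y=1500.0,  # offset from the summit axis
                               depth_m=3000.0, dvol_m3=5e6)       # assumed depth and volume change
for i, (e, n, z) in enumerate(zip(ue, un, uz)):
    print(f"station {i}: dE={e*1000:6.1f} mm  dN={n*1000:6.1f} mm  dU={z*1000:6.1f} mm")
```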
46.
Permanent downhole sensors act as the eyes and ears of the reservoir and enable monitoring of reservoir conditions in real time. In particular, the use of sensors and remotely controlled valves in wells and on the surface, in combination with reservoir flow models, provides enormous benefits to reservoir management and oil production. We suggest borehole radar measurements as a promising technique capable of monitoring the arrival of undesired fluids in the proximity of production wells. We use 1D modelling to investigate the expected signal magnitude and depth of investigation of a borehole radar sensor operating in an oilfield environment. We restrict the applicability of radar to environments where its investigation depth can match the size of the reservoir volume to be monitored. Potential applications are steam-chamber monitoring in steam-assisted gravity drainage processes and water-front monitoring in thin oil-rim environments. A more sophisticated analysis of the limits of a radar system is carried out through 2D finite-difference time-domain simulations. The metal components of the wellbore casing can cause destructive interference with the emitted signal, whereas a medium with high dielectric permittivity surrounding the production well increases the signal amplitude and thus the radar performance. Other constraints are imposed by the complexity of the reservoir and the dynamics of the fluids: time-lapse changes in the heterogeneity of the background formation strongly affect the retrieval of the target reflections, and gradual fluid-saturation changes reduce the reflection amplitudes.
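A rough first-order feeling for the investigation-depth constraint can be obtained from plane-wave attenuation in a homogeneous lossy medium, as sketched below. The frequency, permittivities, and conductivities are illustrative assumptions, not values from the paper's 1D or FDTD modelling.

```python
# First-order check of borehole-radar penetration: plane-wave attenuation
# constant and skin depth in a homogeneous lossy dielectric.
import numpy as np

MU0 = 4e-7 * np.pi          # vacuum permeability (H/m)
EPS0 = 8.854e-12            # vacuum permittivity (F/m)

def attenuation_np_per_m(freq_hz, rel_permittivity, conductivity_s_per_m):
    """Plane-wave attenuation constant alpha (Np/m) in a lossy dielectric."""
    w = 2.0 * np.pi * freq_hz
    eps = rel_permittivity * EPS0
    loss_tangent = conductivity_s_per_m / (w * eps)
    return w * np.sqrt(MU0 * eps / 2.0) * np.sqrt(np.sqrt(1.0 + loss_tangent**2) - 1.0)

freq = 50e6                                    # assumed 50 MHz borehole radar
cases = {"oil-saturated sand":   (6.0, 1e-3),  # (eps_r, sigma in S/m), illustrative values
         "brine-saturated sand": (25.0, 1e-1)}
for name, (eps_r, sigma) in cases.items():
    alpha = attenuation_np_per_m(freq, eps_r, sigma)
    skin_depth = 1.0 / alpha                   # distance over which amplitude drops by 1/e
    print(f"{name:22s}  alpha={alpha:7.3f} Np/m   skin depth ~ {skin_depth:6.1f} m")
```

The two cases show why the abstract restricts radar monitoring to resistive environments: a conductive, brine-saturated formation attenuates the signal within well under a metre, while a resistive oil-saturated formation allows penetration of the order of tens of metres.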
47.
Interest in high-resolution satellite imagery (HRSI) is spreading in several application fields, at both scientific and commercial levels. Fundamental and critical goals for the geometric use of this kind of imagery are its orientation and orthorectification, processes that georeference the imagery and correct the geometric deformations it undergoes during acquisition. In order to exploit the actual potential of orthorectified imagery in Geomatics applications, defining a methodology to assess the spatial accuracy achievable from oriented imagery is a crucial topic. In this paper we propose a new method for accuracy assessment based on Leave-One-Out Cross-Validation (LOOCV), a model-validation method already applied in fields such as machine learning and bioinformatics, and generally in any field requiring an evaluation of the performance of a learning algorithm (e.g. geostatistics), but never before applied to HRSI orientation accuracy assessment. The proposed method overcomes the main drawbacks of the commonly used method, Hold-Out Validation (HOV), which is based on partitioning the known ground points into two sets: the first is used in the orientation-orthorectification model (GCPs, Ground Control Points) and the second is used to validate the model itself (CPs, Check Points). In fact, HOV is generally not reliable and is not applicable when only a small number of ground points is available. To test the proposed method we implemented a new routine that performs the LOOCV in the software SISAR, developed by the Geodesy and Geomatics Team at the Sapienza University of Rome to perform the rigorous orientation of HRSI; this routine was tested on some EROS-A and QuickBird images. Moreover, these images were also oriented using the widely recognised commercial software OrthoEngine v. 10 (included in the Geomatica suite by PCI), performing the LOOCV manually, since only the HOV is implemented there. The software comparison confirmed the overall correctness and good performance of the SISAR model, while the results demonstrated the advantages of the LOOCV method.
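The sketch below shows the LOOCV idea on synthetic ground points, using a simple 2D affine orientation model in place of the rigorous SISAR model; all coordinates, the distortion, and the noise level are made up for illustration.

```python
# Leave-One-Out Cross-Validation of an image-orientation model: each ground point
# is left out once, the model is fitted on the rest, and the prediction error at
# the left-out point is accumulated into an overall accuracy estimate.
import numpy as np

rng = np.random.default_rng(1)

def fit_affine(img_xy, ground_xy):
    """Least-squares 2D affine transform mapping image to ground coordinates."""
    A = np.column_stack([img_xy, np.ones(len(img_xy))])   # design matrix [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, ground_xy, rcond=None)
    return coeffs                                          # 3x2 coefficient matrix

def apply_affine(coeffs, img_xy):
    return np.column_stack([img_xy, np.ones(len(img_xy))]) @ coeffs

# synthetic ground points: a "true" affine distortion plus measurement noise
n = 12
img = rng.uniform(0, 10000, size=(n, 2))
true = np.array([[0.52, 0.01], [-0.02, 0.49], [3000.0, 1500.0]])
ground = apply_affine(true, img) + rng.normal(0, 0.4, size=(n, 2))

residuals = []
for i in range(n):
    keep = np.arange(n) != i
    coeffs = fit_affine(img[keep], ground[keep])           # fit without point i
    pred = apply_affine(coeffs, img[i:i + 1])              # predict the left-out point
    residuals.append(np.linalg.norm(pred - ground[i]))
rmse = np.sqrt(np.mean(np.square(residuals)))
print(f"LOOCV RMSE: {rmse:.2f} (ground units)")
```

Because every point serves once as an independent check point, the accuracy estimate uses all available ground points, which is what makes LOOCV attractive when only a few points are available.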
48.
Data discoverability, accessibility, and integration are frequent barriers for scientists and a major obstacle to successful environmental research. To tackle this issue, the Group on Earth Observations (GEO) is leading the development of the Global Earth Observation System of Systems (GEOSS), a voluntary effort that connects Earth Observation resources worldwide, acting as a gateway between producers and users of environmental data. GEO recognizes the importance of capacity building and education in reaching broad adoption of, acceptance of, and commitment to data-sharing principles, and in increasing the capacity to access and use Earth Observation data. This article presents "Bringing GEOSS services into practice" (BGSIP), an integrated set of teaching materials and software that facilitates the publication and use of environmental data through standardized discovery, view, download, and processing services, further facilitating the registration of data in GEOSS. So far, 520 participants in 10 countries have been trained using this material, leading to numerous Spatial Data Infrastructure implementations and 1,000 tutorial downloads. The workshop lowers the entry barriers for both data providers and users, facilitates the development of technical skills, and empowers people.