1.
The proper assessment of design hydrographs and their main properties (peak, volume and duration) in small and ungauged basins is a key point of many hydrological applications. In general, two types of methods can be used to evaluate the design hydrograph: one approach is based on the statistics of storm events, while the other relies on continuously simulated rainfall‐runoff time series. In the first class of methods, the design hydrograph is obtained by applying a rainfall‐runoff model to a design hyetograph that synthesises the storm event. In the second approach, the design hydrograph is quantified by analysing long synthetic runoff time series obtained by transforming synthetic rainfall sequences through a rainfall‐runoff model. These simulation‐based procedures overcome some of the unrealistic hypotheses that characterize the event‐based approaches. In this paper, a simulation experiment is carried out to examine the differences between the two types of methods in terms of the design hydrograph's peak, volume and duration. The results indicate that the continuous simulation methods are preferable, because the event‐based approaches tend to underestimate the hydrograph's volume and duration. Copyright © 2011 John Wiley & Sons, Ltd.
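As an illustration of the continuous-simulation route described above, the following minimal sketch generates a long synthetic rainfall series, routes it through a simple linear-reservoir rainfall‐runoff model and extracts a design peak from the annual maxima. The Bernoulli/exponential rainfall generator, the linear reservoir and all parameter values are placeholder assumptions for illustration only, not the models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- synthetic hourly rainfall: wet hours from a Bernoulli process,
#     wet-hour depths exponential (placeholder generator, not the paper's) ---
n_years, hours_per_year = 100, 365 * 24
wet = rng.random((n_years, hours_per_year)) < 0.05            # rain occurrence
depth = rng.exponential(scale=2.0, size=wet.shape) * wet      # mm per wet hour

# --- simple linear-reservoir rainfall-runoff model: Q = S / k ---
k = 12.0                                                      # reservoir constant [h]
def route(p, k):
    q = np.zeros_like(p)
    s = 0.0
    for t, pt in enumerate(p):
        s += pt
        q[t] = s / k
        s -= q[t]
    return q

# --- continuous-simulation estimate of the T-year design peak ---
annual_peaks = np.array([route(depth[y], k).max() for y in range(n_years)])
T = 50                                                        # return period [years]
design_peak = np.quantile(annual_peaks, 1.0 - 1.0 / T)
print(f"{T}-year design peak (continuous simulation): {design_peak:.2f} mm/h")
```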
2.
Stochastic Environmental Research and Risk Assessment - Investigating the nature of trends in time series is one of the most common analyses performed in hydro-climate research. However, trend...
3.
4.
The joint occurrence of extreme hydroclimatic events, such as simultaneous precipitation deficit and high temperature, results in so-called compound events and has a serious impact on risk assessment and mitigation strategies. Multivariate frequency analysis (MFA) allows a probabilistic quantitative assessment of this risk under uncertainty. Analyzing precipitation and temperature records in the contiguous United States (CONUS), and focusing on the assessment of the degree of rarity of the 2014 California drought, we highlight some critical aspects of MFA that are often overlooked and should be carefully taken into account for a correct interpretation of the results. In particular, we show that an informative exploratory data analysis (EDA) devised to check the basic hypotheses of MFA, a suitable assessment of the sampling uncertainty, and a better understanding of probabilistic concepts can help to avoid misinterpretation of univariate and multivariate return periods and incoherent conclusions concerning the risk of compound extreme hydroclimatic events. Empirical results show that the dependence between precipitation deficit and temperature across the CONUS can be positive, negative or not significant, and does not exhibit significant changes over the last three decades. Focusing on the 2014 California drought as a compound event, and based on the data used, its probability of occurrence strongly depends on the selected variables and how they are combined, and is affected by large uncertainty, thus preventing definite conclusions about the actual degree of rarity of this event.
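To illustrate how the choice of variables and of their combination changes the estimated degree of rarity, the following sketch compares univariate, "AND" and "OR" empirical exceedance probabilities (and the corresponding return periods) for a hypothetical drought year. The synthetic deficit/temperature sample, the thresholds d_star and t_star, and the assumed correlation are illustrative placeholders, not the CONUS data analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: 65 years of annual precipitation deficit (mm) and
# warm-season temperature anomaly (deg C), with mild positive dependence.
n = 65
z = rng.multivariate_normal([0, 0], [[1, 0.4], [0.4, 1]], size=n)
deficit = 200 + 60 * z[:, 0]
temp    = 0.0 + 0.8 * z[:, 1]

# Hypothetical event of interest (e.g. an observed drought year).
d_star, t_star = 300.0, 1.2

# Empirical marginal exceedance probabilities.
p_d = np.mean(deficit > d_star)
p_t = np.mean(temp > t_star)

# Empirical joint "AND" exceedance (both thresholds exceeded) and
# "OR" exceedance (at least one exceeded).
p_and = np.mean((deficit > d_star) & (temp > t_star))
p_or  = np.mean((deficit > d_star) | (temp > t_star))

for name, p in [("deficit only", p_d), ("temperature only", p_t),
                ("AND", p_and), ("OR", p_or)]:
    rp = np.inf if p == 0 else 1.0 / p
    print(f"{name:>17s}: exceedance prob = {p:.3f}, return period ~ {rp:.1f} yr")
```

The point of the comparison is that the same event can look ordinary or extremely rare depending on whether one marginal, the "AND" combination or the "OR" combination is used, which is exactly the sensitivity discussed in the abstract.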
5.
When waves impact a seawall, a vertical breakwater, an exposed jetty, a pier or a coastal bridge, they abruptly transfer their momentum to the structure. This energy transfer can be very violent and its duration exceptionally short. In the case of coastal bridges, whose spans are designed to have very short vibration periods, wave impacts may have a duration comparable to the natural period of oscillation of the structure, which therefore becomes prone to damage and failure. Previous forensic studies have documented the relative importance of impulsive loads on deck-suspended structures, demonstrating the need to assess the effect of wave impacts on both the stability and the integrity of structural members from the early stages of design. This requires estimating the dynamic characteristics of the loading pattern, in particular the wave impulse and the corresponding impact maxima and rise times. Based on the conservation of momentum, functional relationships between these parameters have been identified since pioneering work dating back to the late 1930s. The complexity of the loading process, however, results in a significantly large variability of wave impact maxima and rise times even under similar conditions, suggesting the need for a probabilistic definition of the relationship between these two variables, to be applied when estimating the dynamic properties of wave loading for use in the structural analysis of coastal structures. In the recent past, some effort has been made to identify functional relationships between such quantities; these require the assessment of the conditional quantiles (or, similarly, the conditional distribution) of wave impact maxima given the rise times. In this paper, we compare three different statistical methods proposed in the literature to accomplish this task, in order to assess the reliability of the approach and suggest guidelines for practical applications. A copula-based method, Generalized Additive Models for Location, Scale and Shape (GAMLSS), and quantile regression are applied to measurements from large-scale three-dimensional physical model tests. The investigation suggests that quantile regression gives the simplest results to use in practice; the copula approach and GAMLSS are possible alternatives when semi-parametric or fully parametric modelling is needed.
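A minimal sketch of the quantile-regression option mentioned above is given below, assuming synthetic impact data and a log-linear relation between rise time and impact maximum. The data generator, variable names and all coefficients are hypothetical, and statsmodels' QuantReg is used only as one possible implementation, not as the method of the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic stand-in for measured wave impacts: shorter rise times tend to
# produce larger impact maxima, with strong scatter (placeholder data).
n = 500
rise_time = rng.lognormal(mean=-3.0, sigma=0.6, size=n)             # [s]
impact_max = 50.0 * rise_time**(-0.4) * rng.lognormal(0, 0.35, n)   # [kPa]

# Quantile regression of log(impact maximum) on log(rise time), i.e. the
# conditional quantiles of the impact maximum given the rise time.
X = sm.add_constant(np.log(rise_time))
y = np.log(impact_max)

for q in (0.5, 0.9, 0.99):
    res = sm.QuantReg(y, X).fit(q=q)
    b0, b1 = res.params
    print(f"q={q:.2f}:  impact_max ~ {np.exp(b0):.1f} * rise_time^{b1:.2f}")
```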
6.
A comprehensive parametric approach to studying the probability distribution of rainfall data at scales of hydrologic interest (e.g. from a few minutes up to daily) requires the use of mixed distributions, with a discrete part accounting for the occurrence of rain and a continuous one for the rainfall amount. In particular, when a bivariate vector (X, Y) is considered (e.g. simultaneous observations from two rainfall stations, or from two instruments such as radar and rain gauge), it is necessary to resort to a bivariate mixed model. A quite flexible mixed distribution can be defined by using a 2-copula and four marginals, obtaining a bivariate copula-based mixed model. Such a distribution is able to correctly describe the intermittent nature of rainfall and the dependence structure of the variables. Furthermore, without loss of generality and with a gain in parsimony, this model can be simplified by some transformations of the marginals. The main goals of this work are: (1) to empirically explore the behaviour of the parameters of the marginal transformations as a function of time scale and inter-gauge distance, by analysing data from a network of rain gauges; (2) to compare the properties of the regression curves associated with the copula-based mixed model with those derived from the model simplified by transformations of the marginals. The results from the investigation of the transformations' parameters agree with the expected theoretical dependence on inter-gauge distance, and show a dependence on time scale. The analysis of the regression curves points out that: (1) a copula-based mixed model yields regression curves quite close to some non-parametric models; (2) the performance of the parametric regression decreases in the same cases in which the non-parametric regression shows some instability; (3) the copula-based mixed model and its simplified version show similar behaviour in terms of regression for mid-to-low values of rainfall. An erratum to this article can be found at
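To make the structure of a copula-based mixed model concrete, the sketch below evaluates a bivariate mixed CDF H(x, y) = C(F_X(x), F_Y(y)), where each marginal has an atom at zero (dry probability) plus a gamma law for wet amounts, and a Gaussian 2-copula links the two gauges. The Gaussian copula, the gamma marginals and all parameter values are illustrative assumptions; the copula family and marginals used in the paper may differ.

```python
import numpy as np
from scipy.stats import norm, gamma, multivariate_normal

# Marginal mixed CDF for intermittent rainfall at one gauge:
# an atom at zero (dry probability p0) plus a gamma law for wet amounts.
def mixed_cdf(x, p0, shape, scale):
    x = np.asarray(x, dtype=float)
    return np.where(x < 0, 0.0, p0 + (1.0 - p0) * gamma.cdf(x, shape, scale=scale))

# Gaussian 2-copula: C(u, v) = Phi_2(Phi^{-1}(u), Phi^{-1}(v); rho)
def gaussian_copula(u, v, rho):
    z = [norm.ppf(float(u)), norm.ppf(float(v))]
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(z)

# Bivariate mixed model: H(x, y) = C(F_X(x), F_Y(y))
p0_x, p0_y = 0.60, 0.65                            # dry probabilities (placeholders)
u = mixed_cdf(5.0, p0_x, shape=0.7, scale=8.0)     # P(X <= 5 mm)
v = mixed_cdf(3.0, p0_y, shape=0.7, scale=8.0)     # P(Y <= 3 mm)
print("joint P(X <= 5, Y <= 3):", gaussian_copula(u, v, rho=0.5))
```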
7.
8.
There are large uncertainties associated with radar estimates of rainfall, including systematic errors as well as random effects from several sources. This study focuses on modelling the systematic error component, which can be described mathematically in terms of a conditional expectation function. The authors present two different approaches: non-parametric (kernel-based) and parametric (copula-based). A large sample (more than six years) of rain gauge measurements from a dense network located in south-west England is used as an approximation of the true ground rainfall. These data are complemented by rainfall estimates from a C-band weather radar located at Wardon Hill, about 40 km from the catchment. The authors compare the results obtained with the parametric and non-parametric schemes at four temporal scales of hydrologic interest (5 and 15 min, hourly and three-hourly) by means of several performance indices, and discuss the strengths and weaknesses of each approach.
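The systematic error component is a conditional expectation E[gauge | radar]; the following sketch estimates it non-parametrically with a Nadaraya-Watson (Gaussian-kernel) smoother on a synthetic radar/gauge sample. The data generator, the bandwidth and the evaluation grid are placeholder assumptions, and this simple smoother only stands in for whatever kernel scheme the study actually uses.

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder paired sample: radar estimate (x) vs gauge rainfall (y), mm/h.
# A multiplicative bias growing with intensity mimics a systematic error.
n = 2000
radar = rng.gamma(shape=0.8, scale=4.0, size=n)
gauge = 1.3 * radar**0.9 * rng.lognormal(0.0, 0.3, size=n)

# Non-parametric (Nadaraya-Watson) estimate of E[gauge | radar = x]:
# a Gaussian-kernel weighted average over the paired sample.
def nw_regression(x0, x, y, bandwidth):
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

grid = np.linspace(0.5, 20.0, 40)
cond_mean = nw_regression(grid, radar, gauge, bandwidth=1.0)
for x0, m in zip(grid[::8], cond_mean[::8]):
    print(f"radar = {x0:5.1f} mm/h  ->  E[gauge | radar] ~ {m:5.1f} mm/h")
```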
9.
Several statistical tests are available for testing the Poisson hypothesis and/or the equidispersion of a point process. The capability to discriminate between Poissonian behaviour and more complex processes is fundamental in many areas of research, including earthquake analysis, hydrology, ecology, biology, signal analysis and sociology. This study investigates the relationship between two indices often used for detecting departures from equidispersion, namely the index of dispersion (ID) and the Allan factor (AF). Since an approximation of the sampling distribution of AF for Poisson data has recently been proposed in the literature, we perform a detailed analysis of its properties and its relationship with the asymptotic sampling distribution of ID. Moreover, the statistical power of AF for testing the Poisson hypothesis is assessed by means of an extensive Monte Carlo simulation, and the performances of AF and ID are compared. We propose a simplified version of the AF sampling distribution that does not depend on the rate of occurrence, keeping the maximum errors of extreme percentiles always smaller than 2–3%. The power study highlights that ID systematically outperforms AF in discriminating between equidispersed and under- or over-dispersed data. Both indices show the same lack of power in distinguishing between data drawn from equidispersed non-Poissonian distribution functions. Therefore, even though AF is a useful statistical tool for detecting possible fractal behaviour of point processes, ID should be preferred when the analysis aims at assessing equidispersion. The lack of power for small sample sizes confirms the difficulty of identifying the true nature of the occurrence process of rare events.
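The two indices compared in the study have simple count-based definitions: ID = Var[N] / E[N] for counts N in disjoint windows of length T, and AF = E[(N_{k+1} - N_k)^2] / (2 E[N_k]) for counts in consecutive windows; both equal 1 for a homogeneous Poisson process. The sketch below computes them for a simulated Poisson process; the window length, rate and total duration are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(3)

def window_counts(event_times, t_total, window):
    """Counts of events in consecutive, non-overlapping windows of length `window`."""
    n_win = int(t_total // window)
    edges = np.arange(n_win + 1) * window
    return np.histogram(event_times, bins=edges)[0]

def index_of_dispersion(counts):
    # ID = Var(N) / E(N); equals 1 for a homogeneous Poisson process.
    return counts.var(ddof=1) / counts.mean()

def allan_factor(counts):
    # AF = E[(N_{k+1} - N_k)^2] / (2 E[N_k]); also 1 for a Poisson process.
    d = np.diff(counts)
    return np.mean(d**2) / (2.0 * counts.mean())

# Homogeneous Poisson process: both indices should fluctuate around 1.
t_total, rate, window = 10_000.0, 0.5, 20.0
n_events = rng.poisson(rate * t_total)
times = np.sort(rng.uniform(0.0, t_total, size=n_events))
counts = window_counts(times, t_total, window)
print("ID =", round(index_of_dispersion(counts), 3),
      " AF =", round(allan_factor(counts), 3))
```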
10.