Similar references
A total of 20 similar references were found.
1.
Planning and design of coastal protection rely on information about the probabilities of very severe storm tides and the possible changes that may occur in the course of climate change. So far, this information is mostly provided in the form of high percentiles obtained from frequency distributions or return values. More detailed information and assessments of events that may cause extreme damage or have extreme consequences at the coast are so far still unavailable. We describe and compare two different approaches that may be used to identify highly unlikely but still physically possible and plausible events from model simulations. Firstly, in cases where consistent wind and tide–surge data are available, different metrics such as the height of the storm surge can be derived directly from the simulated water levels. Secondly, in cases where only atmospheric data are available, the so-called effective wind may be used. The latter is the projection of the horizontal wind vector onto the direction that is most effective in producing surges at the coast. Comparison of events identified by both methods shows that both can identify extreme events, but that knowledge of the effective wind alone does not provide sufficient information to identify the highest storm surges. Tracks of the low-pressure systems over the North Sea need to be investigated to find those cases where the duration of the high wind is too short to induce extreme storm tides. On the other hand, factors such as external surges or variability in mean sea level may enhance surge heights and are not accounted for in estimates based on effective winds only. Results from the analysis of an extended data set suggest that unprecedented storm surges at the German North Sea coast are possible even without taking effects from rising mean sea level into account. The work presented is part of the ongoing project “Extreme North Sea Storm Surges and Their Consequences” (EXTREMENESS) and represents the first step towards an impact assessment for very severe storm surges, which will serve as a basis for the development of adaptation options and evaluation criteria.
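The effective wind described above is essentially a projection of the wind vector onto a fixed, surge-relevant direction. The following minimal sketch (Python/NumPy, all numbers hypothetical and not taken from the EXTREMENESS project) illustrates how such a metric could be computed from wind components.

```python
import numpy as np

def effective_wind(u, v, theta_eff_deg):
    """Project the horizontal wind vector (u eastward, v northward, in m/s)
    onto an assumed surge-effective direction given in degrees
    (math convention: counterclockwise from east)."""
    theta = np.deg2rad(theta_eff_deg)
    return u * np.cos(theta) + v * np.sin(theta)

# Hypothetical example: a single wind vector and an illustrative
# surge-effective direction (not a value from the paper).
u, v = 18.0, 8.0
print(f"effective wind: {effective_wind(u, v, 300.0):.1f} m/s")
```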

2.
The research presented in this paper involves the application of the joint probability method to the estimation of extreme water levels resulting from astronomical tides and surge residuals, and the investigation of the effects of tide–surge interactions on extreme water levels. The distribution of tide peaks was analysed from field records (<20 years) and a 46-year dataset of monthly maximum tidal amplitudes. Large surges were extracted from both field records and a numerical model hindcast covering the 48 largest storm events in the Irish Sea over the period 1959–2005. Extreme storm surges and tides were independently modelled using the generalised extreme value statistical model, and the derived probability distributions were used to compute extreme water levels. An important, and novel, aspect of this research is the analysis of tide–surge interactions and their effects on total water level; where interactions exist, they lead to lower total water levels than in the case of independence. The degree of decrease varies with the interaction strength, the magnitude of the surge peak at a particular phase of the tide, and the distribution of peaks over a tidal cycle. Therefore, including interactions in the computation of extreme levels may provide very useful information at the design stage of coastal protection systems.
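As a rough illustration of the independent case described above (extreme value distributions fitted separately to tide peaks and surge residuals, then combined), here is a minimal Python sketch using scipy.stats.genextreme on synthetic data; it ignores tide–surge interaction and is not the authors' implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic annual maxima (m) standing in for tide peaks and surge residuals.
tide = rng.gumbel(2.0, 0.15, 46)
surge = rng.gumbel(0.8, 0.30, 46)

# Fit GEV distributions independently (scipy's first parameter c is the shape).
tide_fit = stats.genextreme.fit(tide)
surge_fit = stats.genextreme.fit(surge)

# Joint probability under full independence: P(tide + surge > z) by Monte Carlo.
n = 200_000
total = (stats.genextreme.rvs(*tide_fit, size=n)
         + stats.genextreme.rvs(*surge_fit, size=n))
z = 3.5
print(f"P(total level > {z} m) assuming independence: {np.mean(total > z):.4f}")
```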

3.
Managing environmental and social systems in the face of uncertainty requires the best possible forecasts of future conditions. We use space–time variability in historical data and projections of future population density to improve forecasting of residential water demand in the City of Phoenix, Arizona. Our future water estimates are derived using the first- and second-order statistical moments between a dependent variable, water use, and an independent variable, population density. The independent variable is projected at future points and remains uncertain. We use adjusted statistical moments that cover projection errors in the independent variable, and propose a methodology to generate information-rich future estimates. These updated estimates are processed in Bayesian Maximum Entropy (BME), which produces maps of estimated water use to the year 2030. Integrating the uncertain estimates into the space–time forecasting process improves forecasting accuracy by up to 43.9% over other space–time mapping methods that do not assimilate the uncertain estimates. Further validation studies reveal that BME is more accurate than co-kriging that integrates the error-free independent variable, but shows similar accuracy to kriging with measurement error that processes the uncertain estimates. Our proposed forecasting method benefits from the uncertain estimates of the future, provides up-to-date forecasts of water use, and can be adapted to other socio-economic and environmental applications.
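The core idea of carrying the projection error of the covariate into the forecast can be illustrated with a simple moment-based sketch (Python, hypothetical numbers). This shows only the propagation of uncertainty through first- and second-order moments, not the BME mapping itself.

```python
import numpy as np

# Hypothetical historical data: population density (x) and water use (y).
x = np.array([10, 15, 22, 30, 41, 55], dtype=float)
y = np.array([120, 160, 210, 270, 340, 430], dtype=float)

# First- and second-order moments between the two variables.
slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
intercept = y.mean() - slope * x.mean()
resid_var = np.var(y - (intercept + slope * x), ddof=2)

# A projected (uncertain) population density for 2030 with its error variance.
x_future, x_future_var = 70.0, 8.0**2

# Moment-based estimate that carries the projection error of the covariate.
y_future = intercept + slope * x_future
y_future_var = resid_var + slope**2 * x_future_var
print(f"water-use estimate: {y_future:.0f} +/- {np.sqrt(y_future_var):.0f}")
```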

4.
Radar rainfall estimation for flash flood forecasting in small, urban catchments is examined through analyses of radar, rain gage and discharge observations from the 14.3 km² Dead Run drainage basin in Baltimore County, Maryland. The flash flood forecasting problem pushes the envelope of rainfall estimation to time and space scales that are commensurate with the scales at which the fundamental governing laws of land surface processes are derived. Analyses of radar rainfall estimates are based on volume-scan WSR-88D reflectivity observations for 36 storms during the period 2003–2005. Gage–radar analyses show large spatial variability of storm-total rainfall over the 14.3 km² basin for flash-flood-producing storms. The ability of WSR-88D rainfall estimates to capture the detailed spatial variation of rainfall for flash-flood-producing storms varies markedly from event to event. As spatial scale decreases from the 14.3 km² scale of the Dead Run watershed to 1 km² (and the characteristic time scale of flash-flood-producing rainfall decreases from 1 h to 15 min), the predictability of flash flood response from WSR-88D rainfall estimates decreases sharply. Storm-to-storm variability of the multiplicative bias in storm-total rainfall estimates is a dominant element of the error structure of radar rainfall estimates, and it varies systematically over the warm season and with flood magnitude. Analyses of the 7 July 2004 and 28 June 2005 storms illustrate microphysical and dynamical controls on radar estimation error for extreme flash-flood-producing storms.
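The multiplicative bias mentioned above is commonly computed as the ratio of gage to radar storm-total accumulations; a minimal sketch with hypothetical values:

```python
import numpy as np

# Hypothetical storm-total accumulations (mm) at co-located gage/radar pixels.
gage  = np.array([42.0, 55.3, 61.2, 38.9, 70.1])
radar = np.array([35.5, 49.0, 50.8, 33.0, 58.7])

# Storm-total multiplicative bias: summed gage rainfall over summed radar rainfall.
bias = gage.sum() / radar.sum()
radar_corrected = radar * bias
print(f"multiplicative bias: {bias:.2f}")
```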

5.
This paper analyses a number of aspects related to the estimation of the design flood for a dam. A new approach to the estimation of the probable maximum precipitation (PMP) is described, which takes advantage of the spatial variability of precipitation by using radar-derived distributed rainfall measurements. Procedures which utilize storm transposition and storm maximization are introduced to estimate the probable maximum flood (PMF) and are compared with regionalized statistical methods based upon the Wakeby and generalized extreme value distributions.

6.
In Smith (1986, J. Hydrol. 86, 27–43), a family of statistical distributions and estimators for extreme values based on a fixed number r ≥ 1 of the largest annual events is presented. The method of estimation was numerical maximum likelihood. In this paper, we consider the robust estimation of parameters in such families of distributions. The estimation technique, which is based on optimal B-robust estimates, assigns weights to each observation and gives estimates of the parameters based on the data which are well modeled by the distribution. Thus, observations which are not consistent with the proposed distribution can be identified and the validity of the model can be assessed. The method is illustrated on Venice sea level data.

7.
Sea-level rise, as a result of global warming, may lead to more natural disasters in coastal regions where there are substantial concentrations of population and property. This paper therefore focuses on the impact of sea-level rise on the recurrence periods of extreme water levels fitted using the Pearson type III (P-III) model. Current extreme water levels are calculated using observational data, including astronomical high tides and storm surges, while future extreme water levels are determined by superposing scenario data of sea-level rise onto current extreme water levels. On the basis of a case study using data from Shandong Province, China, the results indicate that sea-level rise would significantly shorten the recurrence periods of extreme water levels, especially under the higher representative concentration pathway (RCP) scenarios. By the middle of the century, the current 100-year extreme water levels at all stations would recur once in 15–30 years under RCP 2.6, and once in 10–25 years under RCP 8.5. Most seriously, the currently low-probability 1000-year event would become common, occurring nearly every 10 years by 2100 under projections for RCP 8.5. Therefore, according to this study, the corresponding risk to coastlines could well increase in the future, as the recurrence periods of extreme water levels are shortened with climate change.
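A minimal sketch of the recurrence-period calculation described above, using scipy's Pearson type III distribution on synthetic annual extremes and an assumed sea-level-rise increment (values are illustrative, not the Shandong data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic annual extreme water levels (cm) standing in for observations.
levels = rng.gamma(shape=8.0, scale=12.0, size=60) + 150.0

# Fit a Pearson type III distribution (scipy's pearson3 uses skew, loc, scale).
skew, loc, scale = stats.pearson3.fit(levels)

# Current 100-year level: the quantile with 1% annual exceedance probability.
z100 = stats.pearson3.ppf(0.99, skew, loc, scale)

# Superpose an assumed sea-level rise (e.g. 30 cm) and ask how often the
# *current* 100-year level would then be exceeded.
slr = 30.0
p_exceed = stats.pearson3.sf(z100 - slr, skew, loc, scale)
print(f"current 100-yr level: {z100:.0f} cm -> "
      f"~1 in {1.0 / p_exceed:.0f} yr after {slr:.0f} cm of sea-level rise")
```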

8.
Spatial interpolation methods used for the estimation of missing precipitation data generally underestimate the high extremes and overestimate the low extremes. This is a major limitation that plagues all spatial interpolation methods, as observations from different sites are used in local or global variants of these methods for estimation of missing data. This study proposes bias-correction methods, similar to those used in climate change studies, for correcting missing precipitation estimates provided by an optimal spatial interpolation method. The methods are applied to post-interpolation estimates using quantile mapping, a variant of equidistant quantile matching, and a new optimal single best estimator (SBE) scheme. The SBE is developed using a mixed-integer nonlinear programming formulation. K-fold cross-validation of the estimation and correction methods is carried out using 15 rain gauges in a temperate climatic region of the U.S. Exhaustive evaluation of the bias-corrected estimates is carried out using several statistical, error, performance and skill score measures. The differences among the bias-correction methods, the effectiveness of the methods and their limitations are examined. The bias-correction method based on a variant of equidistant quantile matching is recommended. Post-interpolation bias corrections preserved the site-specific summary statistics with minor changes in the magnitudes of the error and performance measures. The changes were found to be statistically insignificant based on parametric and nonparametric hypothesis tests. The correction methods provided improved skill scores with minimal changes in the magnitudes of several extreme precipitation indices. The bias corrections of estimated data also brought site-specific serial autocorrelations at different lags and transition states (dry-to-dry, dry-to-wet, wet-to-wet and wet-to-dry) close to those from the observed series. Bias corrections of missing data estimates provide serially complete precipitation time series that are more useful for climate change and variability studies than uncorrected filled data series.
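One of the correction approaches named above, quantile mapping, can be sketched in a few lines of Python by matching empirical quantiles of the interpolated estimates to those of the observations (synthetic data; a simplified stand-in for the study's implementation):

```python
import numpy as np

def quantile_mapping(estimates, obs_ref, est_ref):
    """Map interpolated estimates onto the observed distribution by matching
    empirical quantiles (a minimal, parameter-free variant)."""
    probs = np.linspace(0.01, 0.99, 99)
    est_q = np.quantile(est_ref, probs)   # quantiles of the estimates
    obs_q = np.quantile(obs_ref, probs)   # quantiles of the observations
    # Replace each estimate by the observed value at the same quantile.
    return np.interp(estimates, est_q, obs_q)

rng = np.random.default_rng(0)
obs_ref = rng.gamma(2.0, 6.0, 1000)                 # observed amounts (mm)
est_ref = 0.8 * obs_ref + rng.normal(0, 2, 1000)    # biased interpolated values
corrected = quantile_mapping(est_ref[:10], obs_ref, est_ref)
print(corrected)
```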

9.
Determining extreme parameter correlation in ground water models
Hill MC, Osterby O. Ground Water, 2003, 41(4): 420-430
In ground water flow system models with hydraulic-head observations but without significant imposed or observed flows, extreme parameter correlation generally exists. As a result, hydraulic conductivity and recharge parameters cannot be uniquely estimated. In complicated problems, such correlation can go undetected even by experienced modelers. Extreme parameter correlation can be detected using parameter correlation coefficients, but their utility depends on the presence of sufficient, but not excessive, numerical imprecision of the sensitivities, such as round-off error. This work investigates the information that can be obtained from parameter correlation coefficients in the presence of different levels of numerical imprecision, and compares it to the information provided by an alternative method, the singular value decomposition (SVD). Results suggest that (1) calculated correlation coefficients with absolute values that round to 1.00 were good indicators of extreme parameter correlation, but smaller values were not necessarily good indicators of a lack of correlation and resulting unique parameter estimates; (2) the SVD may be more difficult to interpret than parameter correlation coefficients, but it can use sensitivities that are one to two significant digits less accurate than those required for parameter correlation coefficients; and (3) both the SVD and parameter correlation coefficients identified extremely correlated parameters better when the parameters were more equally sensitive. When the statistical measures fail, parameter correlation can be identified only by the tedious process of executing the regression using different sets of starting values or, in some circumstances, through graphs of the objective function.
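Both diagnostics discussed above can be illustrated on a small, nearly collinear sensitivity matrix: the parameter correlation coefficient derived from the least-squares covariance structure, and the singular values of the same matrix (illustrative numbers only, not from the paper):

```python
import numpy as np

# Hypothetical sensitivity (Jacobian) matrix: rows = head observations,
# columns = parameters (e.g. hydraulic conductivity, recharge). The second
# column is nearly proportional to the first, mimicking extreme correlation.
X = np.array([[1.0, 2.0],
              [2.0, 4.1],
              [3.0, 5.9],
              [4.0, 8.0]])

# Parameter correlation coefficient from the (X^T X)^-1 covariance structure.
cov = np.linalg.inv(X.T @ X)
corr = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])

# Singular value decomposition of the same sensitivity matrix: a very small
# ratio of smallest to largest singular value also flags near-non-uniqueness.
s = np.linalg.svd(X, compute_uv=False)
print(f"correlation: {corr:.3f}, singular-value ratio: {s[-1] / s[0]:.3e}")
```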

10.
The use of historical data can significantly reduce the uncertainty around estimates of the magnitude of rare events obtained with extreme value statistical models. For historical data to be included in the statistical analysis, a number of their properties, e.g. their number and magnitude, need to be known with a reasonable level of confidence. Another key aspect of the historical data which needs to be known is the coverage period of the historical information, i.e. the period of time over which it is assumed that all large events above a certain threshold are known. It might be the case, though, that information on the coverage period cannot easily be retrieved with sufficient confidence, in which case the coverage period needs to be estimated. In this paper, methods to perform such estimation are introduced and evaluated. The statistical definition of the problem corresponds to estimating the size of a population for which only a few data points are available. This problem is generally referred to as the German tanks problem, which arose during the Second World War, when statistical estimates of the number of tanks available to the German army were obtained. Different estimators can be derived using different statistical estimation approaches, with the maximum spacing estimator being the minimum-variance unbiased estimator. The properties of three estimators are investigated by means of a simulation study, both for the simple estimation of the historical coverage period and for the estimation of the extreme value statistical model. The maximum spacing estimator is confirmed to be a good approach to the estimation of the historical coverage period for practical use, and its application to a case study in Britain is presented.
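For intuition, the classic German tanks estimators take only a few lines; the sketch below shows the discrete minimum-variance unbiased estimator and a continuous analogue of the maximum spacing form, applied to hypothetical event dates (this is not the paper's code, and the simulation-study comparison is omitted):

```python
import numpy as np

def umvu_serial(sample):
    """Classic minimum-variance unbiased estimator for the size of a
    discrete-uniform population (the 'German tanks' estimator)."""
    m, k = np.max(sample), len(sample)
    return m + m / k - 1

def max_spacing_continuous(sample):
    """Analogue when the k known events are treated as points in a continuous
    coverage period of unknown length (a sketch, not the paper's estimator)."""
    m, k = np.max(sample), len(sample)
    return m * (k + 1) / k

# Hypothetical years (counted from the start of the record) of the k known
# large historical floods; the true coverage period is unknown.
events = np.array([12, 47, 95, 133, 171])
print(umvu_serial(events), max_spacing_continuous(events))
```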

11.
The effect of wind waves on water level and currents during two storms in the North Sea is investigated using a high-resolution Nucleus for European Modelling of the Ocean (NEMO) model forced with fluxes and fields from a high-resolution wave model. The additional terms accounting for wave–current interaction that are considered in this study are the Stokes–Coriolis force and the sea-state-dependent energy and momentum fluxes. The individual and collective roles of these processes are quantified, and the results are compared with a control run without wave effects as well as against current and water-level measurements from coastal stations. We find better agreement with observations when the circulation model is forced by sea-state-dependent fluxes, especially in extreme events. The two extreme events, the storm Christian (25–27 October 2013) and, about a month later, the storm Xaver (5–7 December 2013), induced different wave and surge conditions over the North Sea. Including the wave effects in the circulation model for the storm Xaver raises the modelled surge by more than 40 cm compared with the control run in the German Bight area. For the storm Christian, a difference of 20–30 cm in the surge level between the wave-forced and the stand-alone ocean model is found over the whole southern part of the North Sea. Moreover, the modelled vertical velocity profile fits the observations very well when the wave forcing is accounted for. The contribution of wave-induced forcing has been quantified, indicating that it represents an important mechanism for improving water-level and current predictions.

12.
Sudong Xu, Wenrui Huang. Hydrological Processes, 2008, 22(23): 4507-4518
In the Coastal Flood Insurance Study by the Federal Emergency Management Agency (FEMA, 2005), 1% annual maximum coastal water levels are used in coastal flood hazard mitigation and engineering design in coastal areas of the USA. In this study, a frequency analysis method has been developed to provide more accurate predictions of 1% annual maximum water levels for Florida coastal waters. Using 82 and 94 years of annual maximum water level data at Pensacola and Fernandina, the performance of traditional frequency analysis methods, including the Generalized Extreme Value (GEV) distribution method, has been evaluated. Comparison with observed annual maximum water levels with return periods of 83 and 95 years indicates that traditional methods are unable to provide satisfactory predictions of 1% annual maximum water levels that account for hurricane-induced extreme water levels. Based on the characteristics of the annual maximum water level distributions at the Pensacola and Fernandina stations, a new probability distribution method has been developed in this study. Comparison with observations indicates that the method presented in this study significantly improves the accuracy of predictions of 1% annual maximum water levels. For Fernandina station, predictions of extreme water level match the general trend of observations well. With a correlation coefficient of 0.98, the error for the maximum observed extreme water level of 3.11 m (National Geodetic Vertical Datum) with a return period of 95 years is 0.92%. For Pensacola station, the prediction error for the maximum observed extreme water level with a return period of 83 years is 5.5%, with a correlation value of 0.98. The frequency analysis also compares reasonably with the more costly Monte Carlo simulation method.
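For comparison, the conventional GEV-based frequency analysis evaluated above can be reproduced in a few lines with scipy: fit the annual maxima, read off the 1% annual-chance level, and compare it against the empirical return period of the largest observation (synthetic record, not the Pensacola or Fernandina data):

```python
import numpy as np
from scipy import stats

# Synthetic annual maximum water levels (m), standing in for a 94-year record.
amax = stats.genextreme.rvs(-0.2, loc=1.2, scale=0.35, size=94, random_state=42)

# Fit a GEV distribution and compute the 1% annual-chance (100-year) level.
c, loc, scale = stats.genextreme.fit(amax)
level_1pct = stats.genextreme.ppf(0.99, c, loc, scale)

# Empirical return periods from Weibull plotting positions, for comparison
# against the largest observed values (as done graphically in such studies).
ranked = np.sort(amax)[::-1]
T_emp = (len(amax) + 1) / np.arange(1, len(amax) + 1)
print(f"1% annual-chance level (GEV fit): {level_1pct:.2f} m")
print(f"largest observation: {ranked[0]:.2f} m at ~{T_emp[0]:.0f}-yr empirical return period")
```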

13.
The goal of quantile regression is to estimate conditional quantiles for specified values of quantile probability using linear or nonlinear regression equations. These estimates are prone to “quantile crossing”, where regression predictions for different quantile probabilities do not increase as probability increases. In the context of the environmental sciences, this could, for example, lead to estimates of the magnitude of a 10-year return period rainstorm that exceed the 20-year storm, or similar nonphysical results. This problem, as well as the potential for overfitting, is exacerbated for small to moderate sample sizes and for nonlinear quantile regression models. As a remedy, this study introduces a novel nonlinear quantile regression model, the monotone composite quantile regression neural network (MCQRNN), that (1) simultaneously estimates multiple non-crossing, nonlinear conditional quantile functions; (2) allows for optional monotonicity, positivity/non-negativity, and generalized additive model constraints; and (3) can be adapted to estimate standard least-squares regression and non-crossing expectile regression functions. First, the MCQRNN model is evaluated on synthetic data from multiple functions and error distributions using Monte Carlo simulations. MCQRNN outperforms the benchmark models, especially for non-normal error distributions. Next, the MCQRNN model is applied to real-world climate data by estimating rainfall Intensity–Duration–Frequency (IDF) curves at locations in Canada. IDF curves summarize the relationship between the intensity and occurrence frequency of extreme rainfall over storm durations ranging from minutes to a day. Because annual maximum rainfall intensity is a non-negative quantity that should increase monotonically as the occurrence frequency and storm duration decrease, monotonicity and non-negativity are key constraints in IDF curve estimation. In comparison to standard QRNN models, the ability of the MCQRNN model to incorporate these constraints, in addition to non-crossing, leads to more robust and realistic estimates of extreme rainfall.
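The building block behind composite quantile regression models such as MCQRNN is the pinball (check) loss summed over several quantile levels; a minimal NumPy sketch of that loss (not the monotone network itself, and the numbers are hypothetical):

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Check (pinball) loss for quantile level tau."""
    e = y - q_pred
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

def composite_loss(y, q_preds, taus):
    """Composite quantile loss: the sum of pinball losses over all quantile
    levels, the objective minimized simultaneously by composite QR models."""
    return sum(pinball_loss(y, q, t) for q, t in zip(q_preds, taus))

# Hypothetical predictions for three quantile levels at the same points.
y = np.array([3.1, 5.4, 2.2, 7.9])
taus = [0.1, 0.5, 0.9]
q_preds = [np.array([2.0, 3.5, 1.5, 5.0]),
           np.array([3.0, 5.0, 2.0, 7.5]),
           np.array([4.5, 7.0, 3.2, 10.0])]
print(composite_loss(y, q_preds, taus))
```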

14.
In the present paper, an ensemble approach is proposed to estimate possible modifications caused by climate change in the extreme precipitation regime, with the rain gauge Napoli Servizio Idrografico (Naples, Italy) chosen as the test case. The proposed research, focused on the analysis of extremes on the basis of climate model simulations and rainfall observations, is structured in several consecutive steps. In the first step, all the dynamically downscaled EURO-CORDEX simulations at about 12 km horizontal resolution are collected for the current period 1971–2000 and the future period 2071–2100, for the RCP4.5 and RCP8.5 concentration scenarios. In the second step, the significance of climate change effects on extreme precipitation is statistically tested by comparing current and future simulated data, and bias correction is performed by means of a novel approach based on a combination of simple delta change and quantile delta mapping, in compliance with the storm index method. In the third step, two different ensemble models are proposed, accounting for the variability introduced by the use of different climate models and for their hindcast performance. Finally, the ensemble models are used to build novel intensity–duration–frequency curves, and their effects on the early warning system thresholds for the area of interest are evaluated.
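Quantile delta mapping, one of the two bias-correction ingredients mentioned above, can be sketched as follows: each future modelled value keeps its own quantile, and the observed value at that quantile is scaled by the model-projected relative change (a simplified multiplicative variant on synthetic data, not the paper's implementation):

```python
import numpy as np

def quantile_delta_mapping(fut_mod, hist_mod, hist_obs):
    """Multiplicative quantile delta mapping (sketch)."""
    probs = np.linspace(0.01, 0.99, 99)
    q_fut = np.quantile(fut_mod, probs)
    q_hist = np.quantile(hist_mod, probs)
    q_obs = np.quantile(hist_obs, probs)
    tau = np.interp(fut_mod, q_fut, probs)          # quantile of each future value
    delta = np.interp(tau, probs, q_fut / q_hist)   # projected relative change
    return np.interp(tau, probs, q_obs) * delta

rng = np.random.default_rng(3)
hist_obs = rng.gamma(2.0, 10.0, 500)   # observed extreme rainfall (mm)
hist_mod = rng.gamma(2.0, 8.0, 500)    # biased model output, current climate
fut_mod = rng.gamma(2.0, 9.5, 500)     # same model, future scenario
corrected_future = quantile_delta_mapping(fut_mod, hist_mod, hist_obs)
```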

15.
In Part 1 of this work (Akmaev, 1999), an overview is presented of the theory of optimal interpolation (OI) (Gandin, 1963) and related techniques of data assimilation based on linear optimal estimation (Liebelt, 1967; Catlin, 1989; Mendel, 1995). The approach implies the use in data analysis of additional statistical information in the form of statistical moments, e.g. the mean and covariance (correlation). The a priori statistical characteristics, if available, make it possible to constrain expected errors and obtain estimates of the true state, optimal in some sense, from a set of observations in a given domain in space and/or time. The primary objective of OI is to provide estimates away from the observations, i.e. to fill in data voids in the domain under consideration. Additionally, OI performs smoothing, suppressing the noise, i.e. the spectral components that are presumably not present in the true signal. Usually, the criterion of optimality is minimum variance of the expected errors, and the whole approach may be considered constrained least squares, or least squares with a priori information. Obviously, data assimilation techniques capable of incorporating any additional information are potentially superior to techniques that have no access to such information, such as conventional least squares (e.g., Liebelt, 1967; Weisberg, 1985; Press et al., 1992; Mendel, 1995).
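The OI/minimum-variance analysis step summarized above reduces to the familiar gain-and-innovation update; a self-contained sketch with small illustrative matrices (not taken from Akmaev, 1999):

```python
import numpy as np

# Minimal optimal-interpolation (OI) analysis: analysis = background plus
# weighted innovations, with weights from error covariances (all matrices
# below are illustrative only).
xb = np.array([1.0, 2.0, 3.0])             # background (first-guess) state
B = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])             # background error covariance
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])             # observation operator
R = 0.25 * np.eye(2)                        # observation error covariance
y = np.array([1.4, 2.6])                    # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # minimum-variance gain
xa = xb + K @ (y - H @ xb)                     # analysis (filled/smoothed state)
Pa = (np.eye(3) - K @ H) @ B                   # analysis error covariance
print(xa)
```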

16.
Methods for estimating the magnitude of extreme floods are reviewed. A method which combines a probabilistic storm transposition technique with a physically-based distributed rainfall-runoff model is described. Synthetic storms with detailed spatial and temporal distributions are generated and applied to the calibrated model of the Brue river basin, U.K. (area 135 km²). The variability of catchment response due to storm characteristics (storm area, storm duration, storm movement, storm shape and within-storm variation) and initial catchment wetness conditions is investigated. A probabilistic approach to estimating the return periods of extreme catchment responses is suggested.

17.
The performances of the spectral ratio (SR), frequency centroid shift (FCS), and frequency peak shift (FPS) methods for estimating the effective quality factor Q are compared. These methods do not demand true-amplitude data, and their implementations follow an “as simple as possible” approach to highlight their intrinsic potentials and limitations. We use synthetic zero-offset seismic data generated with a simple layer-cake isotropic model. The methods can be ranked from simple to complex in terms of automation as FPS, FCS and SR. This is a consequence of the facts that: (i) peak identification consists basically of a sorting procedure; (ii) centroid estimation involves basically the evaluation of two well-behaved integrals; and (iii) implementation of the SR method involves at least choosing a usable frequency bandwidth and fitting a gradient. The methods can be ranked from robust to sensitive to noise in the sequence SR, FCS and FPS. This is a consequence of the facts that: (i) the gradient estimate associated with the SR method averages out the noise content over the entire usable frequency bandwidth; (ii) in the presence of a moderate-to-high noise level, the centroid estimation is biased towards overestimating Q due to the noise contribution in the tail of the amplitude spectrum; and (iii) peak identification is unstable due to local noise fluctuations in the amplitude spectrum around the peak frequency. Regarding the stability of the estimates relative to the amount of attenuation, the SR and FCS methods show similar behaviours, whereas the FPS method presents an inferior performance. This is an indirect consequence of the sensitivity of the FPS method to noise, because the higher the attenuation, the lower the signal-to-noise ratio. Finally, regarding the robustness of the methods to the presence of dipping layers, only the SR and FCS methods provide good estimates, at least for typical dips in non-faulted sedimentary layers, with the estimates obtained with the SR method being more accurate than those obtained with the FCS method. Except with respect to automation complexity, which is less important than the performance of the methods, the SR method was superior or showed similar performance to the FCS method in all scenarios we tried.
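The spectral ratio method referred to above fits a gradient to the log amplitude-spectral ratio over a usable band, whose slope is -pi*dt/Q; a minimal synthetic-data sketch (illustrative values only):

```python
import numpy as np

def spectral_ratio_q(freqs, amp1, amp2, dt, fmin, fmax):
    """Spectral-ratio Q estimate (sketch): within a usable band, the log of the
    ratio of amplitude spectra at two travel times decays linearly with
    frequency, with slope -pi * dt / Q."""
    band = (freqs >= fmin) & (freqs <= fmax)
    slope, _ = np.polyfit(freqs[band], np.log(amp2[band] / amp1[band]), 1)
    return -np.pi * dt / slope

# Synthetic amplitude spectra for two arrivals separated by dt = 0.4 s,
# attenuated with a true Q of 80 (plus a frequency-independent factor of 0.7).
freqs = np.linspace(5, 80, 200)
dt, Q_true = 0.4, 80.0
amp1 = np.exp(-np.pi * freqs * 0.1 / Q_true)
amp2 = 0.7 * amp1 * np.exp(-np.pi * freqs * dt / Q_true)
print(f"estimated Q: {spectral_ratio_q(freqs, amp1, amp2, dt, 10, 70):.1f}")
```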

18.
Xu Yunbo. Inland Earthquake, 1992, 6(4): 359-363
This paper introduces a method that uses extreme order statistics to infer the tail behaviour of a probability distribution, and presents the associated parameter estimation procedures. A worked example illustrates how the theory can be applied to the prediction of damaging earthquakes. The approach is better suited than the traditional extreme value method to data without seasonal variation.

19.
Based on tide gauge observations spanning almost 200 years, homogeneous time series of the mean relative sea level were derived for nine sites on the southern coast of the Baltic Sea. Our regionally concentrated data were complemented by long-term relative sea-level records retrieved from the database of the Permanent Service for Mean Sea Level (PSMSL). From these records, relative sea-level change rates were derived for 51 tide gauge stations for the period between 1908 and 2007. A minimum observation time of 60 years is required for the determination of reliable sea-level rates. At present, no anthropogenic acceleration in sea-level rise is detected in the tide gauge observations in the southern Baltic. The spatial variation of the relative sea-level rates reflects the fingerprint of GIA-induced crustal uplift. Time series of extreme sea levels were also inferred from the tide gauge records. They were complemented by water-level information from historic storm surge marks preserved along the German Baltic coast. Based on this combined dataset, the incidence and spatial variation of extreme sea levels induced by storm surges were analysed, yielding important information for hazard assessments. Permanent GPS observations were used to determine recent crustal deformation rates for 44 stations in the Baltic Sea region. The GPS-derived height change rates were applied to reduce the relative sea-level changes observed by tide gauges, yielding an estimate of the eustatic sea-level change. For 13 tide gauge–GPS co-location sites, a mean eustatic sea-level trend of 1.3 mm/a was derived for the last 100 years.
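The reduction of tide-gauge rates by GPS-derived land motion described above amounts to adding the uplift rate to the relative rate; a small sketch with synthetic data (the 0.7 mm/a uplift value is an assumption for illustration, not a figure from the paper):

```python
import numpy as np

rng = np.random.default_rng(9)
years = np.arange(1908, 2008)
# Synthetic annual mean relative sea level (mm) with a trend of ~0.6 mm/a.
rsl = 0.6 * (years - years[0]) + rng.normal(0, 15, years.size)

relative_rate = np.polyfit(years, rsl, 1)[0]   # mm/a seen by the tide gauge
gps_uplift_rate = 0.7                          # mm/a crustal uplift (assumed)

# Crustal uplift masks part of the rise, so the eustatic estimate is the
# relative rate plus the land-uplift rate.
eustatic_rate = relative_rate + gps_uplift_rate
print(f"relative: {relative_rate:.2f} mm/a, eustatic: {eustatic_rate:.2f} mm/a")
```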

20.
Beach ridge stratigraphy can provide an important record of both sustained coastal progradation and responses to events such as extreme storms, as well as evidence of earthquake-induced sediment pulses. This study is a stratigraphic investigation of the late Holocene mixed sand-gravel (MSG) beach ridge plain on the Canterbury coast, New Zealand. The subsurface was imaged along a 370 m shore-normal transect using 100 and 200 MHz ground-penetrating radar (GPR) antennae, and cored to sample sediment textures. Results show that, seaward of a back-barrier lagoon, the Pegasus Bay beach ridge plain prograded almost uniformly under conditions of relatively stable sea level. Nearshore sediment supply appears to have created a sustained sediment surplus, perhaps as a result of post-seismic sediment pulses, resulting in a flat, morphologically featureless beach ridge plain. Evidence of a high-magnitude storm provides an exception, with an estimated event return period in excess of 100 years. Evidence from the GPR sequence, combined with modern process observations from MSG beaches, indicates that a palaeo-storm initially created a washover fan into the back-barrier lagoon, with a large amount of sediment simultaneously moved off the beach face into the nearshore. This erosion event resulted in a topographic depression still evident today. In the subsequent recovery period, sediment was reworked by swash onto the beach as a sequence of berm deposit laminations, creating an elevated beach ridge that also has a modern-day topographic signature. As sediment supply returned to normal, and under conditions of falling sea level, a beach ridge progradation sequence accumulated seaward of the storm feature out to the modern-day beach as a large, flat, uniform progradation plain. This study highlights the importance of extreme storm events and earthquake pulses on MSG coastlines in triggering high-volume beach ridge formation during the subsequent recovery period.
