Similar Literature
20 similar documents found (search time: 640 ms)
1.
Provision of reliable scientific support to socio‐economic development and eco‐environmental conservation is challenged by the irregular nonlinearities, data uncertainties, and multivariate dependencies of hydrological systems in the Three Gorges Reservoir (TGR) region, China. The irregular nonlinearities mainly reflect the unreliability of regular functions for robust simulation of highly complicated relationships between variables. Streamflow generation in the Xingshan Watershed, a representative watershed of this region, is examined with the proposed discrete principal‐monotonicity inference (DPMI) approach. After system characterization, predictor identification, and streamflow distribution transformation, DPMI parameters are calibrated through a two‐stage strategy. Results indicate that DPMI achieves satisfactory modelling efficiency for streamflow simulation under these complexities. The distribution transformation method and the two‐stage calibration strategy address the non‐normality of streamflow and the temporally unstable accuracy of hydrological models, respectively. The DPMI process and results reveal that both streamflow uncertainty and its rising tendency increase with flow level. Considering performance at global and local scales, the dominant driving forces of streamflow generation are daily minimum temperature and daily cumulative precipitation. The temporal heterogeneity of the local significance of meteorological conditions to streamflow is insignificant. There is significant nonlinearity between meteorological conditions and streamflow, and there are significant dependencies among meteorological conditions. The generation mechanism of low flows is more complicated than that of medium and high flows. The DPMI approach can help improve the robustness of hydro‐system analyses in the Xingshan Watershed and the TGR region. Copyright © 2016 John Wiley & Sons, Ltd.
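The "streamflow distribution transformation" step can be illustrated with a generic rank-based normal-score transform, which maps a skewed flow sample to approximately standard-normal scores (a sketch only; the abstract does not specify DPMI's actual transformation, and the flow values below are invented):

```python
from statistics import NormalDist

def normal_score_transform(flows):
    """Map a skewed streamflow sample to approximate standard-normal
    scores via the Blom plotting position (generic illustration of a
    distribution transformation, not the paper's DPMI procedure)."""
    n = len(flows)
    order = sorted(range(n), key=lambda i: flows[i])
    z = [0.0] * n
    nd = NormalDist()
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.375) / (n + 0.25)   # Blom plotting position
        z[i] = nd.inv_cdf(p)              # normal quantile of the rank
    return z

flows = [12.0, 3.5, 98.0, 7.2, 15.1, 4.8, 260.0, 9.9]  # right-skewed sample
scores = normal_score_transform(flows)
# the scores are symmetric around zero even though the flows are skewed
```

Because the transform is rank-based, it handles non-normality without assuming a parametric flow distribution.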

2.
A hybrid model that blends two non‐linear data‐driven models, i.e. an artificial neural network (ANN) and a moving block bootstrap (MBB), is proposed for modelling annual streamflows of rivers that exhibit complex dependence. In the proposed model, the annual streamflows are modelled initially using a radial basis function ANN model. The residuals extracted from the neural network model are resampled using the non‐parametric resampling technique MBB to obtain innovations, which are then added back to the ANN‐modelled flows to generate synthetic replicates. The model has been applied to three annual streamflow records with variable record length, selected from different geographic regions, namely Africa, the USA and the former USSR. The performance of the proposed ANN‐based non‐linear hybrid model has been compared with that of a linear parametric hybrid model. The results from the case studies indicate that the proposed ANN‐based hybrid model (ANNHM) is able to reproduce the skewness present in the streamflows better than the linear parametric hybrid model (LPHM), owing to its effective capturing of the non‐linearities. Moreover, the ANNHM, being a completely data‐driven model, reproduces the features of the marginal distribution more closely than the LPHM, but offers less smoothing and no extrapolation value. It is observed that even though the preservation of the linear dependence structure by the ANNHM is inferior to the LPHM, the effective blending of the two non‐linear models helps the ANNHM to predict the drought and storage characteristics efficiently. Copyright © 2007 John Wiley & Sons, Ltd.
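The bootstrap stage of such a hybrid can be sketched in a few lines: overlapping blocks of residuals are resampled to retain short-range dependence, then added back to fitted flows (the ANN fitting itself is omitted; the fitted flows and residuals below are hypothetical):

```python
import random

def moving_block_bootstrap(residuals, block_len, rng):
    """Resample a residual series in overlapping blocks so that
    short-range dependence within blocks is preserved (sketch of
    the MBB step described in the abstract)."""
    n = len(residuals)
    starts = list(range(n - block_len + 1))
    out = []
    while len(out) < n:
        s = rng.choice(starts)                 # random block start
        out.extend(residuals[s:s + block_len]) # copy a whole block
    return out[:n]

rng = random.Random(42)
fitted = [50.0, 52.0, 47.0, 55.0, 60.0, 58.0]  # hypothetical ANN-modelled flows
resid = [1.2, -0.8, 0.5, -1.5, 2.0, -0.4]      # observed minus fitted
innov = moving_block_bootstrap(resid, block_len=2, rng=rng)
synthetic = [f + e for f, e in zip(fitted, innov)]  # one synthetic replicate
```

Repeating the last two lines yields as many synthetic replicates as needed.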

3.
This paper deals with the transient response of a non‐linear dynamical system with random uncertainties. The non‐parametric probabilistic model of random uncertainties, recently published and extended to non‐linear dynamical system analysis, is used to model random uncertainties related to the linear part of the finite element model. The non‐linearities are due to restoring forces whose parameters are uncertain and are modeled by the parametric approach. Jaynes' maximum entropy principle, with constraints defined by the available information, allows the probabilistic model of such random variables to be constructed. A non‐parametric–parametric formulation is therefore developed in order to model all the sources of uncertainties in such a non‐linear dynamical system. Finally, a numerical application in earthquake engineering is presented, concerning a reactor cooling system under seismic loads. Copyright © 2003 John Wiley & Sons, Ltd.

4.
Soil and water conservation measures including terracing, afforestation, construction of sediment‐trapping dams, and the ‘Grain for Green Program’ have been extensively implemented in the Yanhe River watershed of the Loess Plateau, China, over the last six decades, and have resulted in large‐scale land use and land cover changes. This study examined the trends and shifts in streamflow regime over the period 1953–2010 and related them to changes in land use and soil and water conservation and to the climatic factors of precipitation and air temperature. The non‐parametric Mann–Kendall test and the Pettitt test were used to identify trends and shifts in streamflow and base flow. A method based on precipitation and potential evaporation was used to evaluate the impacts of climate variability and of changes in non‐climate factors on annual streamflow. A significant decrease (p = 0.01) in annual streamflow was observed, with a significant change point in 1996, mostly because of significant decreases in streamflow (p = 0.01) in the July to September periods in subsequent years. The annual base flow showed no significant trend from 1953 to 2010 and no change point year, mostly because there were no significant seasonal trends, except for significant decreases (p = 0.05) in the July to September periods. There was no significant trend for precipitation over the studied time period, and no change point was detected. The air temperature showed a significant increasing trend (p < 0.01), with 1986 (p < 0.01) as the change point year. Climate variability, as measured by precipitation and temperature, and non‐climate factors, including land use changes and soil and water conservation, were estimated to have contributed almost equally to the reduction in annual streamflow. Soil and water conservation practices, including biological measures (e.g. revegetation, planting trees and grass) and engineering measures (such as fish‐scale pits, horizontal trenches, and sediment‐trapping dams), play an important role in reducing the conversion of rainfall to run‐off. Copyright © 2014 John Wiley & Sons, Ltd.
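The Mann–Kendall trend test used in several of these studies is compact to sketch. Below is a minimal version without the tie correction, applied to a synthetic declining series (the data are invented for illustration):

```python
from math import sqrt
from statistics import NormalDist

def mann_kendall(x):
    """Non-parametric Mann-Kendall trend test (minimal sketch,
    no tie correction). Returns the S statistic, the normal
    approximation Z, and a two-sided p-value."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18   # variance under no trend
    if s > 0:
        z = (s - 1) / sqrt(var_s)            # continuity correction
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return s, z, p

series = [10.2, 9.8, 9.5, 9.9, 8.7, 8.1, 7.9, 7.4, 7.0, 6.5]  # synthetic annual flows
s, z, p = mann_kendall(series)
# s < 0 indicates a downward trend; a small p flags significance
```

For real records with ties or serial correlation, a tie-corrected variance and pre-whitening would be needed.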

5.
Despite many recent improvements, ensemble forecast systems for streamflow often produce under‐dispersed predictive distributions. This situation is problematic for their operational use in water resources management. Many options exist for post‐processing of raw forecasts. However, most of these have been developed using meteorological variables such as temperature, which displays characteristics very different from streamflow. In addition, streamflow data series are often very short or contain numerous gaps, thus compromising the estimation of post‐processing statistical parameters. For operational use, a post‐processing method has to be effective while remaining as simple as possible. We compared existing post‐processing methods using normally distributed and gamma‐distributed synthetic datasets. To reflect situations encountered with ensemble forecasts of daily streamflow, four normal distribution parameterizations and six gamma distribution parameterizations were used. Three kernel‐based approaches were tested, namely, the ‘best member’ method and two improvements thereof, and one regression‐based approach. Additional tests were performed to assess the ability of post‐processing methods to cope with short calibration series, missing values or small numbers of ensemble members. We found that over‐dispersion is best corrected by the regression method, while under‐dispersion is best corrected by kernel‐based methods. This work also shows key limitations associated with short data series, missing values, asymmetry and bias. One of the improved best member methods required longer series for the estimation of post‐processing parameters, but if provided with adequate information, yielded the best improvement of the continuous ranked probability score. These results suggest guidelines for future studies involving real operational datasets. Copyright © 2014 John Wiley & Sons, Ltd.

6.
Planar wave events recorded by a seismic array can be represented as lines in the Fourier domain. In the real world, however, seismic events usually have curvature or amplitude variability, which means that their Fourier transforms are no longer strictly linear but rather occupy conic regions of the Fourier domain that are narrow at low frequencies and broaden at high frequencies, where the effect of curvature becomes more pronounced. One can consider these regions as localised “signal cones”. In this work, we consider a space–time variable signal cone to model the seismic data. The variability of the signal cone is obtained through scaling, slanting, and translation of the kernel for cone‐limited (C‐limited) functions (functions whose Fourier transform lives within a cone) or the C‐Gaussian function (a multivariate function whose Fourier transform decays exponentially with respect to slowness and frequency), which constitutes our dictionary. We find a discrete number of scaling, slanting, and translation parameters from a continuum by optimally matching the data. This is a non‐linear optimisation problem, which we address by a fixed‐point method that utilises a variable projection method with ℓ1 constraints on the linear parameters and bound constraints on the non‐linear parameters. We observe that the slow decay and oscillatory behaviour of the kernel for C‐limited functions constitute bottlenecks for the optimisation problem, which we partially overcome with the C‐Gaussian function. We demonstrate our method through an interpolation example. We present the interpolation result using the estimated parameters obtained from the proposed method and compare it with those obtained using sparsity‐promoting curvelet decomposition, matching pursuit Fourier interpolation, and sparsity‐promoting plane‐wave decomposition methods.

7.
This paper describes statistical procedures for developing earthquake damage fragility functions. Although fragility curves abound in the earthquake engineering and risk assessment literature, the focus has generally been on the methods for obtaining the damage data (i.e., the analysis of structures), and little emphasis is placed on the process of fitting fragility curves to these data. This paper provides a synthesis of the most commonly used methods for fitting fragility curves and highlights some of their significant limitations. More novel methods are described for parametric fragility curve development (generalized linear models and cumulative link models) and non‐parametric curves (generalized additive models and Gaussian kernel smoothing). An extensive discussion of the advantages and disadvantages of each method is provided, as well as examples using both empirical and analytical data. The paper further proposes methods for treating the uncertainty in the intensity measure, an issue common with empirical data. Finally, the paper describes approaches for choosing among various fragility models, based on an evaluation of prediction error for a user‐defined loss function. Copyright © 2015 John Wiley & Sons, Ltd.
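A minimal example of the generalized-linear-model route is a logit-link fit of damage probability against log intensity by gradient ascent (a toy stand-in for proper maximum-likelihood fitting; the intensity-measure/damage observations below are invented):

```python
from math import exp, log

def fit_logit_fragility(im, damaged, lr=0.1, iters=5000):
    """Fit P(damage | IM) = 1 / (1 + exp(-(a + b*ln IM))) by plain
    gradient ascent on the Bernoulli log-likelihood -- a toy version
    of the GLM fitting the paper surveys."""
    a, b = 0.0, 0.0
    x = [log(v) for v in im]
    for _ in range(iters):
        ga = gb = 0.0
        for xi, yi in zip(x, damaged):
            p = 1.0 / (1.0 + exp(-(a + b * xi)))
            ga += yi - p            # gradient w.r.t. intercept
            gb += (yi - p) * xi     # gradient w.r.t. slope
        a += lr * ga / len(x)
        b += lr * gb / len(x)
    return a, b

# hypothetical observations: IM in g, 1 = damaged, 0 = undamaged
im      = [0.1, 0.15, 0.2, 0.3, 0.4, 0.5, 0.7, 0.9, 1.1, 1.4]
damaged = [0,   0,    0,   0,   1,   0,   1,   1,   1,   1]
a, b = fit_logit_fragility(im, damaged)
p_at_05g = 1.0 / (1.0 + exp(-(a + b * log(0.5))))  # fragility at 0.5 g
```

The fitted slope b should be positive, so the fragility curve rises with intensity as expected.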

8.
There are two basic approaches for estimating flood quantiles: parametric and nonparametric methods. In this study, parametric and nonparametric models for the annual maximum flood data of the Goan gauging station in Korea were compared based on Monte Carlo simulation. In order to consider uncertainties that can arise from model and data errors, kernel density estimation for fitting the sampling distributions was chosen to determine safety factors (SFs) that depend on the probability model used to fit the real data. The relative biases of the Sheather–Jones plug-in (SJ) selector are the smallest in most cases among the seven bandwidth selectors applied. The relative root mean square errors (RRMSEs) of the Gumbel (GUM) are smaller than those of any other models regardless of the parent models considered. When the Weibull-2 is assumed as a parent model, the RRMSEs of kernel density estimation are relatively small, while for the other parent models they are much larger than those of the parametric methods. Within the interpolation range, however, the RRMSEs of kernel density estimation compare much more favourably with those of the parametric methods than within the extrapolation range. Among the applied distributions, the GUM model has the smallest SFs for all parent models, and the general extreme value model has the largest values for all parent models considered.
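Kernel density estimation itself is compact to sketch. Here Silverman's rule-of-thumb bandwidth stands in for the Sheather–Jones plug-in selector favoured in the study, and the annual maxima are invented:

```python
from math import exp, pi, sqrt
from statistics import stdev, mean

def gaussian_kde(sample, x, h=None):
    """Gaussian kernel density estimate at point x. Silverman's
    rule-of-thumb bandwidth is used when h is not given (a simple
    stand-in for the Sheather-Jones plug-in selector)."""
    n = len(sample)
    if h is None:
        h = 1.06 * stdev(sample) * n ** (-1 / 5)   # Silverman's rule
    return sum(exp(-0.5 * ((x - xi) / h) ** 2)
               for xi in sample) / (n * h * sqrt(2 * pi))

floods = [120.0, 95.0, 210.0, 160.0, 310.0, 140.0, 180.0, 250.0]  # annual maxima
density_at_mean = gaussian_kde(floods, mean(floods))
```

Quantiles can then be read off by numerically inverting the cumulative integral of this density, which is where the interpolation/extrapolation contrast noted above arises.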

9.
The spatial and temporal variations of precipitation and runoff for 139 basins in South Korea were investigated over 34 years (1968–2001). The Precipitation‐Runoff Modelling System (PRMS) was selected for the assessment of basin hydrologic response to varying climates and physiography. The non‐parametric Mann–Kendall test and regression analysis are used to detect trends in annual, seasonal, and monthly precipitation and runoff, while Moran's I is applied to determine the degree of spatial dependence in runoff trends among the basins. The results indicated that long‐term annual precipitation and runoff increased in northern regions and decreased in south‐western regions of the study area during the study period. The Mann–Kendall test showed that spring streamflow was decreasing, while summer streamflow was increasing. April precipitation decreased between 15% and 74% for basins located in the south‐western part of the Korean peninsula. June precipitation increased between 18% and 180% for the majority of the basins. Trends in seasonal and monthly streamflow show patterns similar to the trends in precipitation. Decreases in spring runoff are associated with decreases in spring precipitation which, accompanied by rising temperatures, are responsible for reducing soil moisture. The regional patterns of precipitation and runoff changes show a strong to moderate positive spatial autocorrelation, suggesting that there is a high potential for severe spring drought and summer flooding in some parts of Korea if these trends continue in the future. Copyright © 2008 John Wiley & Sons, Ltd.
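Global Moran's I, used above to measure the spatial dependence of runoff trends, can be computed directly from trend values and a spatial weight matrix (the basin trends and adjacency weights below are hypothetical):

```python
def morans_i(z, w):
    """Global Moran's I for spatial autocorrelation of values z
    given a symmetric weight matrix w (illustrative sketch)."""
    n = len(z)
    zbar = sum(z) / n
    d = [v - zbar for v in z]                      # deviations from mean
    num = sum(w[i][j] * d[i] * d[j]
              for i in range(n) for j in range(n)) # cross-products of neighbours
    den = sum(v * v for v in d)
    w_sum = sum(sum(row) for row in w)
    return (n / w_sum) * (num / den)

# hypothetical runoff-trend slopes for 4 basins and a binary adjacency matrix
trends = [0.8, 0.7, -0.5, -0.6]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
I = morans_i(trends, w)   # positive I: similar trends cluster spatially
```

A value near zero indicates spatial randomness; significance testing would additionally require the expectation and variance of I under permutation.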

10.
In recent years, the Xitiaoxi river basin in China has experienced intensified human activity, including city expansion and increased water demand. Climate change has also influenced streamflow. Assessing the impact of climate variability and human activity on hydrological processes is important for water resources planning and management and for the sustainable development of eco‐environmental systems. The non‐parametric Mann–Kendall test was employed to detect trends in climatic and hydrological variables. The Mann–Kendall–Sneyers test and the moving t‐test were used to locate any abrupt change in annual streamflow. A runoff model, driven by precipitation and potential evapotranspiration, was employed to assess the impact of climate change on streamflow. A significant downward trend was detected for annual streamflow from 1975 to 2009, and an abrupt change occurred in 1999, which was consistent with the change detected by the double mass curve test between streamflow and precipitation. The annual precipitation decreased slightly, but upward trends of annual mean temperature and potential evapotranspiration were significant. The annual streamflow during the period 1999–2009 decreased by 26.19% compared with the reference stage, 1975–1998. Climate change was estimated to be responsible for 42.8% of the total reduction in annual streamflow, and human activity accounted for 57.2%. Copyright © 2012 John Wiley & Sons, Ltd.

11.
Stream‐flow recessions are commonly characterized by the exponential equation, or the alternative power‐form equation, of a single linear reservoir. The most common measure of recession is the recession constant K, which relates to the power‐function form of the recession equation for a linear reservoir. In reality, however, the groundwater dynamics of even the simplest aquifers may behave in a non‐linear fashion. In this study three different storage–outflow algorithms, namely single linear, non‐linear, and multiple linear reservoir, were considered to model the stream‐flow recession of the upper Blue Nile. The recession parameters for the linear and non‐linear models were derived by least‐squares regression. For the multiple linear reservoir model, a second‐order autoregressive model, AR(2), was applied first, and its parameters were then determined by the least‐squares method. Modelling of upper Blue Nile recession flows shortly after the wet season, when interflow and bank storage may be contributing considerably to the river flow, showed that the non‐linear reservoir model reproduces the observed recessions well. Stratification of the recession curves showed that the variation of the non‐linear reservoir's recession parameter with preceding flow remains significant, whereas a similar stratification revealed no systematic variation in the recession parameters of the linear and multiple linear reservoir models. Copyright © 2004 John Wiley & Sons, Ltd.
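For the single linear reservoir, the recession constant K in Q_t = Q_0·K^t can be recovered by least squares on the log-transformed flows, which is the kind of regression step the abstract describes (the recession limb below is synthetic):

```python
from math import log, exp

def recession_constant(q):
    """Estimate the linear-reservoir recession constant K in
    Q_t = Q_0 * K**t by ordinary least squares on ln Q versus t
    (sketch of the regression procedure; q must be positive)."""
    n = len(q)
    t = list(range(n))
    y = [log(v) for v in q]
    tbar = sum(t) / n
    ybar = sum(y) / n
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) \
            / sum((ti - tbar) ** 2 for ti in t)
    return exp(slope)   # K = e^slope; K close to 1 means slow recession

q = [100.0 * 0.95 ** t for t in range(10)]   # synthetic recession limb
K = recession_constant(q)                     # recovers ~0.95
```

Non-linear reservoirs would instead require fitting a power-law storage–outflow relation, which this sketch does not cover.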

12.
We propose a novel technique for improving a long‐term multi‐step‐ahead streamflow forecast. A model based on wavelet decomposition and a multivariate Bayesian machine learning approach is developed for forecasting the streamflow 3, 6, 9, and 12 months ahead simultaneously. The inputs of the model utilize only the past monthly streamflow records. They are decomposed into components formulated in terms of wavelet multiresolution analysis. It is shown that the model accuracy can be increased by using the wavelet boundary rule introduced in this study. A simulation study is performed to evaluate the effects of different wavelet boundary rules using synthetic and real streamflow data from the Yellowstone River in the Uinta Basin in Utah. The model based on the combination of wavelet and Bayesian machine learning regression techniques is compared with that of the wavelet and artificial neural networks‐based model. The robustness of the models is evaluated. Copyright © 2015 John Wiley & Sons, Ltd.
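One level of a wavelet multiresolution split can be illustrated with the Haar transform, which separates a series into low-frequency approximation and high-frequency detail coefficients (a minimal stand-in: the study's wavelet and boundary rule are not specified here, and an even-length series is assumed):

```python
from math import sqrt

def haar_step(x):
    """One level of the orthonormal Haar wavelet transform: pairwise
    sums give the approximation (low-frequency) coefficients and
    pairwise differences give the detail (high-frequency) coefficients.
    Assumes len(x) is even; real applications need a boundary rule."""
    s = sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

monthly = [3.1, 2.9, 4.0, 4.4, 6.5, 6.1, 3.2, 3.0]  # hypothetical monthly flows
a, d = haar_step(monthly)
# orthonormality preserves energy: sum(x^2) == sum(a^2) + sum(d^2)
```

Forecast models like the one above are then trained on the coefficient series rather than on the raw flows.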

13.
Previous work has shown that streamflow response during baseflow conditions is a function of storage, but also that this functional relationship varies among seasons and catchments. Traditionally, hydrological models incorporate conceptual groundwater models consisting of linear or non‐linear storage–outflow functions. Identification of the right model structure and model parameterization, however, is challenging. The aim of this paper is to systematically test different model structures in a set of catchments where different aquifer types govern baseflow generation processes. Nine different two‐parameter conceptual groundwater models are applied with multi‐objective calibration to transform two different groundwater recharge series, derived from a soil‐atmosphere‐vegetation transfer model, into baseflow separated from streamflow data. The relative performance differences of the model structures allow us to systematically improve the understanding of baseflow generation processes and to identify the most appropriate model structures for different aquifer types. We found both versatile and aquifer‐specific optimal model structures, and we elucidate the role of interflow, flow paths, recharge regimes and partially contributing storages. Aquifer‐specific recommendations of storage models were found for fractured and karstic aquifers, whereas large storage capacities blur the identification of superior model structures for complex and porous aquifers. A model performance matrix is presented, which highlights the joint effects of different recharge inputs, calibration criteria, model structures and aquifer types. The matrix is a guide for improving groundwater model structures towards representation of the dominant baseflow generation processes of specific aquifer types. Copyright © 2014 John Wiley & Sons, Ltd.

14.
Fragility functions that define the probabilistic relationship between structural damage and ground motion intensity are an integral part of performance‐based earthquake engineering and seismic risk analysis. This paper introduces three approaches based on kernel smoothing methods for developing analytical and empirical fragility functions. A kernel assigns a weight to each data point that is inversely related to the distance between that data value and the input of the fragility function of interest. The kernel smoothing methods are, therefore, non‐parametric forms of data interpolation. These methods enable the implicit treatment of uncertainty in either or both of ground motion intensity and structural damage without making any assumption about the shape of the resulting fragility functions. They are particularly beneficial for sparse, noisy, or non‐homogeneous data sets. For illustration purposes, two types of data are considered. The first is a set of numerically simulated responses for a four‐story steel moment‐resisting frame, and the second is a set of field observations collected after the 2010 Haiti earthquake. The results demonstrate that these methods can develop continuous representations of fragility functions without specifying their functional forms and can treat sparse data sets more efficiently than conventional data binning and parametric curve fitting methods. Moreover, various uncertainty analyses are conducted to address the issues of over‐fitting, bias, and confidence intervals. Copyright © 2014 John Wiley & Sons, Ltd.
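In its simplest form, the kernel-smoothing idea reduces to a Nadaraya–Watson weighted average of 0/1 damage observations, with weights decaying with distance in intensity (the bandwidth and data below are assumed for illustration, not taken from the paper):

```python
from math import exp

def kernel_fragility(im_obs, damaged, im_query, h=0.2):
    """Nadaraya-Watson Gaussian-kernel estimate of P(damage | IM):
    a locally weighted average of binary damage observations.
    The bandwidth h controls the smoothness and is an assumed value."""
    w = [exp(-0.5 * ((im_query - x) / h) ** 2) for x in im_obs]
    return sum(wi * yi for wi, yi in zip(w, damaged)) / sum(w)

# hypothetical post-event observations: intensity measure and 0/1 damage flag
im_obs  = [0.1, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0, 1.2]
damaged = [0,   0,   0,   1,   0,   1,   1,   1]
curve = [kernel_fragility(im_obs, damaged, q) for q in (0.1, 0.5, 1.1)]
# probabilities rise smoothly with intensity without assuming a curve shape
```

No lognormal (or any other) functional form is imposed, which is exactly the non-parametric advantage the abstract highlights.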

15.
Hypothesis testing about catchment functioning with conceptual hydrological models is affected by uncertainties in the model representation of reality as well as in the observed data used to drive and evaluate the model. We formulated a learning framework to investigate the role of observational uncertainties in hypothesis testing using conceptual models and applied it to the relatively data‐scarce tropical Sarapiqui catchment in Costa Rica. Observational uncertainties were accounted for throughout the framework, which incorporated different choices of model structures to test process hypotheses, analyses of parametric uncertainties and effects of likelihood choice, a posterior performance analysis and (iteratively) the formulation of new hypotheses. Estimated uncertainties in precipitation and discharge were linked to likely non‐linear near‐surface runoff generation and the potentially important role of soils in mediating the hydrological response. Some model‐structural inadequacies could be identified in the posterior analyses (supporting the need for an explicit soil‐moisture routine to match streamflow dynamics), but the available information about the observational uncertainties prevented conclusions about other process representations. The importance of epistemic data errors, the difficulty in quantifying them and their effect on model simulations were illustrated by an inconsistent event with long‐term effects. Finally, we discuss the need for new data and new process hypotheses related to deep groundwater losses, and conclude that observational uncertainties need to be accounted for in hypothesis testing to reduce the risk of drawing incorrect conclusions. Copyright © 2015 John Wiley & Sons, Ltd.

16.
The Vincent Thomas Bridge in the Los Angeles metropolitan area is a critical artery for commercial traffic flow in and out of the Los Angeles Harbor, and it is at risk in the seismically active Southern California region, particularly because it straddles the Palos Verdes fault zone. A combination of linear and non‐linear system identification techniques is employed to obtain a complete reduced‐order, multi‐input–multi‐output (MIMO) dynamic model of the Vincent Thomas Bridge based on the dynamic response of the structure to the 1987 Whittier and 1994 Northridge earthquakes. Starting with the available acceleration measurements (which consist of 15 accelerometers on the bridge structure and 10 accelerometers at various locations on its base), an efficient least‐squares‐based time‐domain identification procedure is applied to the data set to develop a reduced‐order, equivalent linear, multi‐degree‐of‐freedom model. Although not the main focus of this study, the linear system identification method is also combined with a non‐parametric identification technique to generate a reduced‐order non‐linear mathematical model suitable for use in subsequent studies to predict, with good fidelity, the total response of the bridge under arbitrary dynamic environments. Results of this study yield measurements of the equivalent linear modal properties (frequencies, mode shapes and non‐proportional damping) as well as quantitative measures of the extent and nature of non‐linear interaction forces arising from strong ground shaking. It is shown that, for the particular subset of observations used in the identification procedure, the apparent non‐linearities in the system restoring forces are quite significant, and they contribute substantially to the improved fidelity of the model. Also shown is the potential of the identification technique under discussion to detect slight changes in the structure's influence coefficients, which may be indicators of damage and degradation in the structure being monitored. Difficulties associated with accurately estimating damping for lightly damped long‐span structures from their earthquake response are discussed. The technical issues raised in this paper indicate the need for added spatial resolution in sensor instrumentation to obtain identified mathematical models of structural systems with the broadest range of validity. Copyright © 2003 John Wiley & Sons, Ltd.

17.
Low‐flow characteristics can be estimated by multiple linear regressions or the index‐streamgage approach. The latter transfers streamflow information from a hydrologically similar, continuously gaged basin (‘index streamgage’) to one with a very limited streamflow record, but often results in biased estimates. The application of the index‐streamgage approach can be generalized into three steps: (1) selection of streamflow information of interest, (2) definition of hydrologic similarity and selection of index streamgage, and (3) application of an information‐transfer approach. Here, we explore the effects of (1) the range of streamflow values, (2) the areal density of streamgages, and (3) index‐streamgage selection criteria on the bias of three information‐transfer approaches on estimates of the 7‐day, 10‐year minimum streamflow (Q7,10). The three information‐transfer approaches considered are maintenance of variance extension, base‐flow correlation, and ratio of measured to concurrent gaged streamflow (Q‐ratio invariance). Our results for 1120 streamgages throughout the United States suggest that only a small portion of the total bias in estimated streamflow values is explained by the areal density of the streamgages and the hydrologic similarity between the two basins. However, restricting the range of streamflow values used in the index‐streamgage approach reduces the bias of estimated Q7,10 values substantially. Importantly, estimated Q7,10 values are heavily biased when the observed Q7,10 values are near zero. Results of the analysis also showed that Q7,10 estimates from two of the three index‐streamgage approaches have lower root‐mean‐square error values than estimates derived from multiple regressions for the large regions considered in this study. Published in 2011 by John Wiley & Sons, Ltd.
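The maintenance of variance extension (MOVE.1) transfer can be sketched in a few lines: estimates at the partial-record site are scaled so that the mean and variance of its short record are preserved (the flow values below are hypothetical, and a positive inter-site correlation is assumed):

```python
from statistics import mean, stdev

def move1(short_y, concurrent_x, full_x):
    """MOVE.1 record extension: map index-streamgage flows to the
    partial-record site so that the mean and standard deviation of
    the short record are preserved (sketch of the 'maintenance of
    variance extension' approach; assumes positive correlation)."""
    my, sy = mean(short_y), stdev(short_y)
    mx, sx = mean(concurrent_x), stdev(concurrent_x)
    return [my + (sy / sx) * (x - mx) for x in full_x]

# hypothetical concurrent low flows at the two sites
short_y      = [2.0, 3.5, 1.5, 4.0]                 # partial-record site
concurrent_x = [10.0, 16.0, 8.0, 18.0]              # index streamgage, same years
full_x       = [10.0, 16.0, 8.0, 18.0, 12.0, 20.0]  # index streamgage, all years
est = move1(short_y, concurrent_x, full_x)          # extended record estimate
```

Unlike ordinary regression, the transferred series is not shrunk toward the mean, which is why the variance of the short record is "maintained".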

18.
Abstract

The method of fragments is applied to the generation of synthetic monthly streamflow series using streamflow data from 34 gauging stations in mainland Portugal. A generation model based on random sampling of the log-Pearson Type III distribution was applied to each sample to generate 1200 synthetic series of annual streamflow, each with length equal to that of the sample. The synthetic annual streamflow series were then disaggregated into monthly streamflows using the method of fragments, by three approaches that differed in terms of the establishment of classes and the selection of fragments. The results of these approaches were compared in terms of the capacity of the method to preserve the main monthly statistical parameters of the historical samples.

Editor D. Koutsoyiannis; Associate editor C. Onof

Citation Silva, A.T. and Portela, M.M., 2012. Disaggregation modelling of monthly streamflows using a new approach of the method of fragments. Hydrological Sciences Journal, 57 (5), 942–955.
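The core of the method of fragments is a two-step disaggregation: sample a historical fragment of monthly shares, then scale it by the synthetic annual total (a basic sketch; the paper's contribution concerns how fragments are classed and selected, which is not reproduced here, and the fragments below are invented):

```python
import random

def disaggregate(annual, fragments, rng):
    """Disaggregate a synthetic annual streamflow into 12 monthly
    values with the method of fragments: pick a historical fragment
    (12 monthly shares summing to 1) and scale it by the annual total."""
    frag = rng.choice(fragments)       # fragment selection (simplest variant)
    return [annual * f for f in frag]  # scale shares by the annual flow

# two hypothetical historical fragments (monthly shares of the annual flow)
fragments = [
    [.14, .13, .12, .10, .07, .04, .02, .02, .03, .08, .11, .14],
    [.16, .14, .11, .09, .06, .03, .02, .02, .04, .09, .10, .14],
]
rng = random.Random(7)
monthly = disaggregate(850.0, fragments, rng)
# the 12 monthly values sum back to the annual total by construction
```

Class-based variants restrict the candidate fragments to years with annual totals similar to the synthetic value, which is the design choice the three compared approaches vary.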
