Similar Articles

20 similar articles found (search time: 78 ms)
1.
Abstract

Time series of soil moisture-related parameters provide important insights into the functioning of soil water systems. Analysis of patterns within such time series has been used in several studies. The objective of this work was to compare patterns in observed and simulated soil moisture contents to understand whether modelling leads to a substantial loss of information or complexity. The time series were observed at four plots in sandy soils within the USDA-ARS OPE3 experimental watershed for a year; precipitation and evapotranspiration (ET) were measured and estimated, respectively, and used for soil water flow simulation with the HYDRUS-1D software. The information content measures are the metric entropy and the mean information gain, and the complexity measures are the fluctuation complexity and the effective measure complexity. These measures were computed from a binary encoding of the soil moisture time series and used probabilities of patterns, i.e. probabilities of the joint or sequential appearance of symbol sequences. Daily soil moisture time series had much lower information content and higher complexity than rainfall data, indicating that soil works essentially as an information filter. With depth, information content decreased and complexity increased, demonstrating the increase in the information-filtering action of soil. The information measures of simulated soil moisture content were close to those of the measurements, indicating successful simulation of the patterns in the data. The spatial variability of the information measures for simulated soil moisture content at all depths was less pronounced than that of the measured time series. Compared with precipitation and estimated ET, the soil moisture time series in this work had more structure and less randomness. The information measures can provide useful complementary knowledge about model performance and about patterns in observations and modelling results.

Citation Pan, F., Pachepsky, Y. A., Guber, A. K. & Hill, R. L. (2011) Information and complexity measures applied to observed and simulated soil moisture time series. Hydrol. Sci. J. 56(6), 1027–1039.
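The pattern-based measures described above can be sketched in a few lines. This is a minimal illustration of block entropy, metric entropy and mean information gain computed from a binary-encoded series; the median-threshold encoding and the word lengths are illustrative choices, not necessarily those used in the paper:

```python
import numpy as np

def binary_encode(series):
    """Encode a series as 0/1 relative to its median (one common choice)."""
    return (np.asarray(series) > np.median(series)).astype(int)

def block_entropy(symbols, L):
    """Shannon entropy (bits) of length-L words in a symbol sequence."""
    words = np.array([symbols[i:i + L] for i in range(len(symbols) - L + 1)])
    _, counts = np.unique(words, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def metric_entropy(symbols, L):
    """Block entropy per symbol: H(L)/L."""
    return block_entropy(symbols, L) / L

def mean_information_gain(symbols, L):
    """H(L+1) - H(L): average new information per added symbol."""
    return block_entropy(symbols, L + 1) - block_entropy(symbols, L)
```

A strongly filtered (smooth) series yields low metric entropy and information gain, which is the sense in which soil acts as an "information filter" on the rainfall signal.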

2.
Hydrological model parameter estimation is an important aspect of hydrological modelling. Usually, parameters are estimated by minimizing an objective function that quantifies the mismatch between the model results and the observations. The choice of objective function has a large impact on the outcomes of sensitivity analysis and calibration. In this study, it is assessed whether spectral objective functions can compete with an objective function in the time domain for optimization of the Soil and Water Assessment Tool (SWAT). Three empirical spectral objective functions were applied, based on matching (i) Fourier amplitude spectra, (ii) periodograms and (iii) Fourier series of simulated and observed discharge time series. It is shown that the most sensitive parameters and their optimal values differ between objective functions. The best results were found through calibration with an objective function based on the squared difference between the simulated and observed discharge Fourier series coefficients. The potential strengths and weaknesses of using a spectral objective function compared to a time domain objective function are discussed. Copyright © 2010 John Wiley & Sons, Ltd.
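Variants (i) and (iii) above can be sketched as follows; this is a minimal illustration of the difference between matching amplitude spectra (phase-blind) and matching full Fourier coefficients (phase-sensitive), not the actual SWAT calibration setup:

```python
import numpy as np

def amplitude_objective(sim, obs):
    """(i) Squared difference of Fourier amplitude spectra: ignores phase."""
    return np.sum((np.abs(np.fft.rfft(sim)) - np.abs(np.fft.rfft(obs))) ** 2)

def fourier_series_objective(sim, obs):
    """(iii) Squared difference of complex Fourier coefficients:
    penalizes both amplitude and phase (timing) errors."""
    d = np.fft.rfft(sim) - np.fft.rfft(obs)
    return np.sum(np.abs(d) ** 2)
```

For example, a discharge series that is right in shape but shifted in time scores zero under the amplitude objective yet is penalized by the coefficient-based objective, which is one way to see why the latter performed best.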

3.
Abstract

Flood frequency estimation is crucial in both engineering practice and hydrological research. Regional analysis of flood peak discharges is used for more accurate estimates of flood quantiles in ungauged or poorly gauged catchments. It is based on the identification of homogeneous zones, where the probability distribution of annual maximum peak flows is invariant except for a scale factor represented by an index flood. The numerous applications of this method have highlighted that obtaining accurate estimates of the index flood is a critical step, especially in ungauged or poorly gauged sections, where direct estimation by the sample mean of the annual flood series (AFS) is not possible or is inaccurate. Hence, indirect methods must be used. Most indirect methods are based on empirical relationships linking the index flood to hydrological, climatological and morphological catchment characteristics, developed by means of multi-regression analysis, or on simplified lumped representations of rainfall–runoff processes. The limits of these approaches become increasingly evident as the size and spatial variability of the catchment increase. In these cases, the use of a spatially distributed, physically based hydrological model and time-continuous simulation of discharge can improve estimation of the index flood. This work presents an application of the FEST-WB model for the reconstruction of 29 years of hourly streamflows for an Alpine snow-fed catchment in northern Italy, to be used for index flood estimation. To extend the length of the simulated discharge time series, meteorological forcings given by daily precipitation and temperature at ground automatic weather stations are disaggregated to hourly resolution and then fed to FEST-WB. The accuracy of the method in estimating the index flood as a function of the length of the simulated series is discussed, and suggestions for the use of the methodology are provided.
Editor D. Koutsoyiannis

4.
Abstract

Gridded meteorological data are available for all of Norway as time series dating from 1961. A new way of interpolating precipitation in space from observed values is proposed. Based on the criteria that interpolated precipitation fields in space should be consistent with observed spatial statistics, such as spatial mean, variance and intermittency, spatial fields of precipitation are simulated from a gamma distribution with parameters determined from observed data, adjusted for intermittency. The simulated data are distributed in space, using the spatial pattern derived from kriging. The proposed method is compared to indicator kriging and to the current methodology used for producing gridded precipitation data. Cross-validation gave similar results for the three methods with respect to RMSE, temporal mean and standard deviation, whereas a comparison on estimated spatial variance showed that the new method has a near perfect agreement with observations. Indicator kriging underestimated the spatial variance by 60–80% and the current method produced a significant scatter in its estimates.

Citation Skaugen, T. & Andersen, J. (2010) Simulated precipitation fields with variance-consistent interpolation. Hydrol. Sci. J. 55(5), 676–686.
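The core sampling idea — precipitation drawn from a gamma distribution adjusted for intermittency — can be sketched as follows. The parameter values are illustrative, not the fitted Norwegian values, and the spatial arrangement of the simulated values via the kriging pattern is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_precip_field(n_cells, wet_fraction, shape, scale):
    """Draw a precipitation field: each cell is dry with probability
    (1 - wet_fraction); wet cells are gamma-distributed with the given
    shape and scale. Intermittency is thus represented explicitly."""
    wet = rng.random(n_cells) < wet_fraction
    return np.where(wet, rng.gamma(shape, scale, n_cells), 0.0)
```

Because the gamma parameters and the wet fraction are set from observed data, the simulated field reproduces the observed spatial mean, variance and intermittency by construction, which is the consistency criterion the method targets.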

5.
Abstract

The Pettitt test is a non-parametric test that has been used in a number of hydroclimatological studies to detect abrupt changes in the mean of the distribution of the variable of interest. It is based on the Mann-Whitney two-sample (rank-based) test and allows the detection of a single shift at an unknown point in time. Because it makes no distributional assumptions, it is often used to detect shifts in extremes; the downside, however, is that the Pettitt test may be inefficient in detecting breaks when dealing with extremes. Here we adopt a Monte Carlo approach to examine the sensitivity of the Pettitt test in detecting shifts in the mean under different conditions (location of the break within the series, magnitude of the shift, record length, level of variability in the data, extreme vs non-extreme records, and pre-assigned significance level). The simulations show that the sensitivity of the test in detecting abrupt changes increases with the magnitude of the shift and the record length. The number of detections is higher when the time series represents the central part of the distribution (e.g. changes in time series of medians), while the skill decreases toward either low or high extremes (e.g. changes in time series of maxima). Furthermore, the number of detections decreases as the variability in the data increases. Finally, abrupt changes are more easily detected when they occur toward the center of the time series.
Editor D. Koutsoyiannis Associate editor K. Hamed
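For reference, the test under study can be sketched as a minimal Pettitt implementation with the usual asymptotic p-value approximation; a Monte Carlo sensitivity study would wrap this function in repeated draws from the chosen distributions:

```python
import numpy as np

def pettitt_test(x):
    """Pettitt change-point test.

    Returns (index of last point before the most likely break, K, p),
    where K = max_t |U_t| with U_t = sum_{i<=t, j>t} sign(x_i - x_j),
    and p uses the standard approximation 2*exp(-6K^2/(n^3 + n^2))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    signs = np.sign(x[:, None] - x[None, :])
    U = np.array([signs[:t + 1, t + 1:].sum() for t in range(n - 1)])
    t_hat = int(np.argmax(np.abs(U)))
    K = np.abs(U).max()
    p = min(2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2)), 1.0)
    return t_hat, K, p
```

A clean step change is detected with a very small p-value; the simulation findings above concern how quickly this skill degrades with noise, break location and extreme-value records.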

6.
ABSTRACT

This study relies on the use and analysis of hydro-meteorological variables, long turbidity time series (1988 to 2009, 21 years) and a sedimentary record to provide a better understanding of the hydro-sedimentary variability of the karst system near the town of Radicatel, France. Wavelet analysis of rainfall, piezometric level and turbidity, as well as of the sediment archive, shows common modes of variability. A common spectral composition emphasizes the influence of climate controls. Comparison of the wavelet spectra with the North Atlantic Oscillation (NAO) spectrum clearly highlights the control of the latter on hydro-meteorological variables at the regional level. Climatic fluctuations are recorded in the turbidity signal and in the sedimentary fill, as revealed by the 5- to 8-year frequency band characteristic of the NAO index. A climatic signal is recorded in rainfall events and piezometric levels, and also in sediment transport and deposition at the scale of the local karst system. The overall climate control is also present beyond the local variations and heterogeneities.
EDITOR D. Koutsoyiannis ASSOCIATE EDITOR D. Yang

7.
ABSTRACT

An adaptive multilevel correlation analysis, a data-driven methodology, is proposed. The analysis subdivides the time series into segments such that adjacent segments have significantly different mean values. It is shown that the proposed methodology can provide multilevel information about the correlation between two variables. An integrated coefficient, with an associated significance test, is also proposed to summarize the correlation at each level. Using the adaptive multilevel correlation analysis, the correlation between streamflow and water level is investigated in a case study; the results indicate that the real correlation might be far more complicated than the empirically constructed picture.
EDITOR D. Koutsoyiannis ASSOCIATE EDITOR E. Volpi

8.
Seismic signals are typically nonlinear and non-stationary; time-frequency analysis techniques can reveal their localized characteristics in both the time and frequency domains simultaneously. This paper studies four time-frequency analysis methods: the short-time Fourier transform (STFT), the wavelet transform, the generalized S-transform, and the Wigner-Ville distribution. In theoretical comparisons using synthetic seismic records built from Ricker wavelets, the generalized S-transform shows relatively good time-frequency focusing and good suppression of cross terms. To verify this, field seismic experiments were carried out in two work areas; the strengths and weaknesses of the four time-frequency methods are analysed and discussed, and the advantage of the generalized S-transform is confirmed.

9.
Application of wavelet transform to the analysis of groundwater level data in regions of low and weak seismicity
Using the wavelet transform and the principles of multi-scale analysis, four-year series of daily mean water levels from six typical wells with different variation patterns in Hunan, a region of low and weak seismicity, were analysed. The results show that multi-scale analysis conveniently separates the high-frequency and low-frequency components of the daily mean water-level series, and that the dominant mode of the separated frequency components differs among the aquifer systems of different wells. This reflects differences in the response of different well-aquifer systems to the same frequency; the frequency of the dominant mode represents the characteristic frequency of the well-aquifer system under normal background conditions. Morlet wavelet analysis captures both abrupt changes and periodic structure in the time series well.

10.
Groundwater resources are typically the main fresh water source in arid and semi-arid regions. Natural recharge of aquifers is mainly based on precipitation; however, only heavy precipitation events (HPEs) are expected to produce appreciable aquifer recharge in these environments. In this work, we used daily precipitation and monthly water level time series from different locations over a Mediterranean region of southeastern Spain to identify the critical threshold value defining HPEs that lead to appreciable aquifer recharge in this region. Wavelet and trend analyses were used to study the changes in the temporal distribution of the chosen HPEs (≥20 mm day⁻¹) over the observed period 1953–2012 and their projected evolution using 18 downscaled climate projections over the period 2040–2099. The precipitation time series used were grouped into 10 clusters according to similarities between them assessed using Pearson correlations. Results showed that the critical HPE threshold for the study area is 20 mm day⁻¹. Wavelet analysis showed that seasonal and annual peaks that are significant in the global wavelet spectrum in the first sub-period (1953–1982) are no longer significant in the second sub-period (1983–2012) in most of the ten clusters. This change is due to the reduction in the mean number of HPEs, which showed a negative trend over the observed period in nine clusters, significant in five of them. However, the mean size of HPEs showed a positive trend in six clusters. A similar tendency is expected over the projected period. The expected reduction in the mean number of HPEs is twice as large under the high-emission scenario (RCP8.5) as under the moderate scenario (RCP4.5). The mean size of these events is expected to increase under both scenarios. Groundwater availability will be affected by the reduction in HPE number, which will increase the length of no-aquifer-recharge periods (NARP), accentuating groundwater drought in the region. Copyright © 2016 John Wiley & Sons, Ltd.
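The HPE definition used above — days at or above the 20 mm/day threshold, counted per year — can be sketched as follows; the clustering, wavelet and trend machinery of the study is omitted:

```python
import numpy as np

def annual_hpe_count(daily_precip, years, threshold=20.0):
    """Count heavy precipitation events (days with precipitation >=
    threshold, in mm/day) per calendar year. The 20 mm/day default
    follows this study area; the threshold is site-specific."""
    daily_precip = np.asarray(daily_precip, dtype=float)
    years = np.asarray(years)
    uyears = np.unique(years)
    counts = np.array([(daily_precip[years == y] >= threshold).sum()
                       for y in uyears])
    return uyears, counts
```

A trend test on the resulting annual counts (e.g. Mann-Kendall) is then what reveals the decline in HPE frequency reported for nine of the ten clusters.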

11.
Integrated dynamic water and chloride balance models with a catchment-scale hydrological model (PRMS) are used to investigate the response of a terminal tropical lake, Lake Abiyata, to climate variability and water use practices in its catchment. The hydrological model is used to investigate the response of the catchment to different climate and land-use change scenarios that are incorporated into the lake model. Lake depth–area–volume relationships were established from lake bathymetries. Missing data in the time series were filled using statistical regression techniques. Based on mean monthly data, the lake water balance model produced a good agreement between the simulated and observed levels of Lake Abiyata for the period 1968–83. From 1984 onwards the simulated lake level is overestimated with respect to the observed one, while the chloride concentration is largely underestimated. This discrepancy is attributed to human use of water from the influent rivers or directly from the lake. The simulated lake level and chloride concentration are in better agreement with observed values (r² = 0.96) when human water use for irrigation and salt exploitation are included in the model. A comparison of the simulation with and without human consumption indicates that climate variability controls the interannual fluctuations and that the human water use affects the equilibrium of the system by strongly reducing the lake level. Sensitivity analysis based on a mean climatic year showed that, after prolonged mean climatic conditions, Lake Abiyata reacts more rapidly to an abrupt shift to wetter conditions than to dry conditions. This study shows the significant sensitivity of the level and salinity of the terminal Lake Abiyata to small changes in climate or land use, making it a very good 'recorder' of environmental changes that may occur in the catchment at different time scales. Copyright © 2004 John Wiley & Sons, Ltd.

12.
Abstract

This study aims to assess the potential impact of climate change on flood risk for the city of Dayton, which lies at the outlet of the Upper Great Miami River Watershed, Ohio, USA. First, the probability mapping method was used to downscale annual precipitation output from 14 global climate models (GCMs). We then built a statistical model, based on regression and frequency analysis of random variables, to simulate annual mean and peak streamflow from precipitation input. The model performed well in simulating quantile values of annual mean and peak streamflow for the 20th century: the correlation coefficients between simulated and observed quantile values for these variables exceed 0.99. Applying this model with the downscaled precipitation output from the 14 GCMs, we project that the future 100-year flood for the study area is most likely to increase by 10–20%, with a mean increase of 13% across all 14 models; 79% of the models project an increase in annual peak flow.

Citation Wu, S.-Y. (2010) Potential impact of climate change on flooding in the Upper Great Miami River Watershed, Ohio, USA: a simulation-based approach. Hydrol. Sci. J. 55(8), 1251–1263.

13.
This study attempts to determine trends in the monthly, annual and monsoon total precipitation series over India by applying linear regression, the Mann-Kendall (MK) test and the discrete wavelet transform (DWT). The linear regression test was applied to five consecutive classical 30-year climate periods and to a long-term precipitation series (1851–2006) to detect changes. The sequential Mann-Kendall (SQMK) test was applied to identify temporal variation in the trends. The wavelet transform is a relatively new tool for trend analysis in hydrology; comparisons were carried out between series decomposed by DWT and the original series. Furthermore, extreme and contributing events were visualized using the wavelet spectrum at different threshold values. The results showed significant positive trends in the annual and monsoon precipitation series in North Mountainous India (zone NMI) and North East India (NEI), whereas negative trends were detected when considering India as a whole.

EDITOR A. Castellarin ASSOCIATE EDITOR S. Kanae
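The MK test mentioned above can be sketched as follows; the tie correction in the variance formula is omitted for brevity:

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test: returns (S, Z).

    S = sum over all pairs i < j of sign(x_j - x_i); positive Z
    indicates an upward trend (|Z| > 1.96 is significant at the
    5% level under the no-trend null)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    S = np.sign(x[None, :] - x[:, None])[np.triu_indices(n, 1)].sum()
    var_S = n * (n - 1) * (2 * n + 5) / 18.0
    if S > 0:
        Z = (S - 1) / np.sqrt(var_S)
    elif S < 0:
        Z = (S + 1) / np.sqrt(var_S)
    else:
        Z = 0.0
    return S, Z
```

Applying this over a moving window, rather than once to the whole record, is the essence of the sequential (SQMK) variant used to locate when a trend emerges.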

14.
Abstract

Time series analyses are applied to characterize the transient flow regimes of the Nam La cavern conduit, northwest Vietnam. The conduit transforms the input signal to an output signal, and the degree of transformation provides information on the nature of the flow system. The input for the analysis is net precipitation and the flow hydrograph at the cave entrance, while the output series is the flow hydrograph at the resurgence. Cross-correlation and cross-spectrum analyses are used to investigate the stationarity and linearity of the input–output transformation of the system, yielding hydrodynamic properties such as system memory, response time, and mean delay between input and output. It is shown that during high flow periods, the flow in the conduit is pressurized. Consequently, the linear input–output assumption holds only for low flows. To highlight the hydrodynamics of the cavern conduit during high flow periods, wavelet spectrum and wavelet cross-spectrum analyses are applied.
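The cross-correlation step can be sketched minimally: estimate the response time as the lag of the peak correlation between the input and output series. The data below are synthetic, not the Nam La records:

```python
import numpy as np

def cross_correlation(inp, out, max_lag):
    """Normalized cross-correlation r(k) between an input series
    (e.g. net precipitation) and an output series (e.g. resurgence
    discharge) for lags k = 0..max_lag. The lag of the peak is a
    simple estimate of the system's response time."""
    x = np.asarray(inp, dtype=float) - np.mean(inp)
    y = np.asarray(out, dtype=float) - np.mean(out)
    denom = np.std(inp) * np.std(out) * len(x)
    return np.array([np.sum(x[:len(x) - k] * y[k:]) / denom
                     for k in range(max_lag + 1)])
```

A slowly decaying correlogram indicates long system memory; the breakdown of this linear picture at high flows is what motivates the wavelet cross-spectrum analysis in the paper.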

15.
Abstract

This article paves the way for assessing flood risk using two-parameter distributions for the intervals between threshold exceedences, rather than the traditional exponential distribution. In a case study, the apparent properties of intervals between exceedences of runoff events differ from those anticipated for exponentially distributed series. A procedure is proposed to relate two statistical parameters of the intervals to threshold discharges. It considers partial duration series (PDS) with thresholds equal to all sufficiently high observed discharges. To avoid unnecessary assumptions on the behaviour of those parameters and effects of dependence between parameters for different PDS, a non-parametric trend-free pre-whitened scheme is applied. It leads to power-law relationships between a discharge and the mean and standard deviation of the intervals between its exceedences. Predicted mean inter-exceedence intervals for the highest observed discharges at the stations are closer to the observational periods than those predicted by GEV distributions fitted to AMS, and by GP distributions fitted to PDS. In the present case, the latter predictions are longer than the observational periods, whereas some of the predicted mean inter-exceedence intervals are shorter than the corresponding observational periods and others are longer.

Citation Ben-Zvi, A. & Azmon, B. (2010) Direct relationships of discharges to the mean and standard deviation of the intervals between their exceedences. Hydrol. Sci. J. 55(4), 565–577.

16.
17.
Turbulent magnetofluids appear in various geophysical and astrophysical contexts, in phenomena associated with planets, stars, galaxies and the universe itself. In many cases, large-scale magnetic fields are observed, though a better knowledge of magnetofluid turbulence is needed to more fully understand the dynamo processes that produce them. One approach is to develop the statistical mechanics of ideal (i.e. non-dissipative), incompressible, homogeneous magnetohydrodynamic (MHD) turbulence, known as "absolute equilibrium ensemble" theory, as far as possible by studying model systems, with the goal of finding those aspects that survive the introduction of viscosity and resistivity. Here, we review the progress that has been made in this direction. We examine both three-dimensional (3-D) and two-dimensional (2-D) model systems based on discrete Fourier representations. The basic equations are those of incompressible MHD and may include the effects of rotation and/or a mean magnetic field Bo. Statistical predictions are that Fourier coefficients of the velocity and magnetic field are zero-mean random variables. However, this is not the case in general, for we observe non-ergodic behavior in very long time computer simulations of ideal turbulence: low-wavenumber Fourier modes that have relatively large means and small standard deviations, i.e. coherent structure. In particular, ergodicity appears strongly broken when Bo = 0 and weakly broken when Bo ≠ 0. Broken ergodicity in MHD turbulence is explained by an eigenanalysis of modal covariance matrices. This produces a set of modal eigenvalues inversely proportional to the expected energy of their associated eigenvariables. A large disparity in eigenvalues within the same mode (identified by wavevector k) can occur at low values of wavenumber k = |k|, especially when Bo = 0. This disparity breaks the ergodicity of eigenvariables with the smallest eigenvalues (largest energies). This leads to coherent structure in models of ideal homogeneous MHD turbulence, which can occur at the lowest values of wavenumber k for 3-D cases, and at either the lowest or highest k for ideal 2-D magnetofluids. These ideal results appear relevant for unforced, decaying MHD turbulence, so that broken ergodicity effects in MHD turbulence survive dissipation. In comparison, we also examine ideal hydrodynamic (HD) turbulence, which, in the 3-D case, differs fundamentally from ideal MHD turbulence in that coherent structure due to broken ergodicity can only occur at maximum k in numerical simulations. However, a nonzero viscosity eliminates this ideal 3-D HD structure, so that unforced, decaying 3-D HD turbulence is expected to be ergodic. In summary, broken ergodicity in MHD turbulence leads to energetic, large-scale, quasistationary magnetic fields (coherent structures) in numerical models of bounded, turbulent magnetofluids. Thus, broken ergodicity provides a large-scale dynamo mechanism within computer models of homogeneous MHD turbulence. These results may help us to better understand the origin of global magnetic fields in astrophysical and geophysical objects.

18.
ABSTRACT

Suspended solids are present in every river, but high concentrations can worsen the ecological condition of streams; therefore, effective monitoring and analysis of this hydrological variable are necessary. Frequency, seasonality, inter-correlation, extreme-event, trend and lag analyses were carried out for peaks of suspended sediment concentration (SSC) and discharge (Q) from Slovenian streams, using officially monitored data from 1955 to 2006 made available by the Slovenian Environment Agency. In total, more than 500 station-years of daily Q and SSC data were used. No uniform (positive or negative) trend was found in the SSC series; however, all the statistically significant trends were decreasing. No generalization is possible for the best-fit distribution function. A seasonality analysis showed that most of the SSC peaks occurred in summer (short-term intense convective precipitation produced by thunderstorms) and in autumn (prolonged frontal precipitation). Correlations between Q and SSC values were generally relatively small (Pearson correlation coefficients from 0.05 to 0.59), which means that the often-applied Q–SSC curves should be used with caution when estimating annual suspended sediment loads. On average, the flood peak Q occurred after the corresponding SSC peak (clockwise, positive hysteresis loops), but the average lag time was rather small (less than 1 day).
Editor M.C. Acreman; Associate editor Y. Gyasi-Agyei

19.
Accelerographic time series of the M 6.5 Bam (Iran) earthquake of December 26, 2003, are used to calibrate the source and propagation path characteristics based on a hybrid stochastic approach that combines stochastic finite-fault and analytical modeling. Estimation of source characteristics is based on the calibration of finite-fault modeling against near-source observed time series, while propagation characteristics are estimated using far-field recordings. The distance-dependent kappa factor is obtained from the slope of the smoothed acceleration Fourier amplitude spectrum at higher frequencies; the estimated zero-distance kappa value is 0.06. Calibration based on near-source time series indicates a stress drop of 130 bars for the Bam earthquake. The strong impulsive long-period motion, recorded at the only near-source station, is modeled using the analytical model of Mavroeidis and Papageorgiou (2003).

20.
We investigate a new proxy for ENSO climate variability based on particle-size data from long-term coastal sediment records preserved in a barrier estuary setting. Corresponding ~4–8 year periodicities, identified from wavelet analysis of particle-size data from Pescadero Marsh in Central Coast California and of rainfall data from San Francisco, reflect established ENSO periodicity, as further evidenced in the Multivariate ENSO Index (MEI), and thus confirm an important ENSO control on both precipitation and barrier regime variability. Although barrier estuary mean particle size is influenced by coastal erosion, precipitation and streamflow, balanced against barrier morphology and volume, it is encouraging that considerable correspondence can also be observed among the time series of MEI, regional rainfall and site-based mean particle size over the period 1871–2008. This correspondence is, however, weakened after c.1970 by temporal variation in sedimentation rate and event-based deposition. These confounding effects are more likely when: (i) accommodation space may be a limiting factor; and (ii) particularly strong El Niños, e.g. 1982/1983 and 1997/1998, deposit discrete >cm-thick units during winter storms. The efficacy of the sediment record of climate variability appears not to be compromised by location within the back-barrier setting, but it is limited to those El Niños that lead to barrier breakdown. For wider application of this particle-size index of ENSO variability, it is important to establish a well-resolved chronology and to sample the record at an interval that characterizes deposition at a sub-annual scale. Further, the sample site must be selected to limit the influence of decreasing accommodation space through time (infilling) and of event-based deposition. It is concluded that particle-size data from back-barrier sediment records have proven potential for preserving evidence of sub-decadal climate variability, allowing researchers to explore temporal and spatial patterns in phenomena such as ENSO. Copyright © 2017 John Wiley & Sons, Ltd.
