Similar Documents
20 similar documents were retrieved (search time: 390 ms).
1.
Pelagic copepod species distributions were found to vary in relation to an environmental gradient. As suggested by Gray, systematic changes were noted in the skewness of the log-normal. However, no systematic changes in the standard deviation of the log-normal were noted. The data indicate that such assemblages fit Preston's ‘canonical’ expectation poorly. Further, analysis of changes in the evenness function yielded similar conclusions.
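A minimal sketch (hypothetical copepod counts, not the study's data) of fitting a log-normal to species abundances and summarising the shape statistics discussed above; the helper name lognormal_summary is ours.

```python
import numpy as np
from scipy import stats

def lognormal_summary(abundances):
    """Fit a log-normal to per-species abundances and summarise its shape."""
    x = np.asarray(abundances, dtype=float)
    x = x[x > 0]                       # log-normal is defined for positive counts only
    logx = np.log(x)
    shape, loc, scale = stats.lognorm.fit(x, floc=0)   # MLE with location fixed at 0
    return {
        "mu": np.log(scale),           # mean of log-abundances implied by the fit
        "sigma": shape,                # standard deviation of log-abundances
        "skew_log": stats.skew(logx),  # skewness of the observed log-abundances
    }

# hypothetical counts for one station along the gradient
print(lognormal_summary([120, 80, 45, 30, 22, 9, 7, 4, 2, 1, 1]))
```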

2.
1 INTRODUCTION Many hydro-morphological mathematical models neglect the influence of river bed material heterogeneity and its changes in time and space during transport and the related erosion/deposition processes. In these models a representative diameter of the river bed grain-size distribution (for example d50) is specified as initial data at each computational point of the modeled domain. A different d50 can be assigned to each grid point, but temporal changes in bed material gradation cannot be simulated…

3.
An attempt is made in this study to search for a model which is capable of reproducing the reservoir storage components and runs' characteristics along with the mean, standard deviation, skewness and correlation coefficients of historical monthly streamflows. The model based upon the method of fragments and a scheme devised for annual values is found to yield satisfactory results and is, therefore, recommended for use.
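A minimal sketch of the method of fragments (hypothetical flows and a simple AR(1) annual scheme, not the paper's calibrated model): annual flows are disaggregated to months using the monthly proportions, or "fragments", of a randomly chosen historical year.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical historical record: 20 years x 12 monthly flows
hist = rng.gamma(shape=2.0, scale=50.0, size=(20, 12))
fragments = hist / hist.sum(axis=1, keepdims=True)   # each row sums to 1

def simulate_monthly(n_years, annual_mean, annual_sd, r1=0.3):
    """AR(1) annual flows disaggregated by randomly resampled fragments."""
    monthly = np.empty((n_years, 12))
    z = 0.0
    for y in range(n_years):
        z = r1 * z + np.sqrt(1 - r1**2) * rng.standard_normal()
        annual = max(annual_mean + annual_sd * z, 0.0)
        frag = fragments[rng.integers(len(fragments))]   # pick one historical year
        monthly[y] = annual * frag
    return monthly

sim = simulate_monthly(100, annual_mean=hist.sum(axis=1).mean(),
                       annual_sd=hist.sum(axis=1).std())
print(sim.mean(axis=0).round(1))   # simulated monthly means
```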

4.
The log-normal distribution of individuals among species of shallow, brackish water benthos is compared to other, more commonly used methods. This method, the use of which was introduced in marine monitoring in 1979, has not previously been tested on macrobenthos of the northern Baltic Sea. The relevance of other indices is discussed in relation to the log-normal distribution. Based on samples from three localities (soft littoral benthos, soft bottom benthos and sand bottom benthos), it is shown that the log-normal distribution could well be used as an important tool in interpreting benthic data sets from low-saline waters with few macrozoobenthic species. It illustrates natural and anthropogenic changes in the zoobenthic communities more adequately than the other parameters calculated.

5.
This study draws attention to extreme precipitation changes over the eastern Himalayan region of the Teesta river catchment. To explore precipitation variability and heterogeneity, observed (1979–2005) and statistically downscaled (2006–2100) daily precipitation datasets from a Coupled Model Intercomparison Project Phase Five earth system model are used. Trend analysis is performed to assess long-term changes in precipitation using the non-parametric Mann–Kendall (MK) test, the Kendall tau test, and Sen's slope estimator. A quantile regression (QR) method is applied to assess changes in the lower and upper tails of the precipitation distribution. Extreme precipitation indices are computed to quantify precipitation extremes in the observed and projected periods, and their spatial heterogeneity is characterised by the standard deviation and skewness of these indices. The results show that overall precipitation amounts will increase in the future over the Himalayan region. The monthly trend analysis reflects interannual variability in precipitation. The QR analysis shows significant increases in precipitation in both the upper and lower quantiles. Extreme precipitation events increase from October to June, whereas they decrease from July to September. The representative concentration pathway (RCP) 8.5 experiments show more extreme changes in precipitation than RCP2.6 and RCP4.5. The extreme precipitation indices indicate that the intensity of precipitation events will be enhanced in the future, and their spatial standard deviation and skewness reveal significant variability in precipitation over the selected Himalayan catchment.
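A minimal sketch of the trend-analysis step only (a hand-rolled Mann-Kendall test and Sen's slope on a hypothetical annual precipitation series; the quantile-regression and downscaling steps are not shown):

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall Z, two-sided p-value and Sen's slope (no ties correction)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    sen = np.median([(x[j] - x[i]) / (j - i)
                     for i in range(n - 1) for j in range(i + 1, n)])
    return z, p, sen

rng = np.random.default_rng(1)
precip = 100 + 0.4 * np.arange(50) + rng.normal(0, 15, 50)  # hypothetical annual totals
print(mann_kendall(precip))   # (Z, p-value, Sen's slope per year)
```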

6.
Excluding the most polluted areas, a general problem in an ecological monitoring context is distinguishing pollution-induced changes from the natural variations that always occur in marine communities. In the search for solutions to this problem, a combination of different methods is suggested. The methods tried and discussed are basic ecological community studies, comparison of common patterns on a broad geographical scale, and predictions from short- and long-term trend models. Ecological knowledge and comparison of common patterns are suggested here as the most fruitful tools in an ecological monitoring programme.

7.
The effects of prolonged exposure to low concentrations of herbicides on tropical periphyton (biofilm) communities are largely unknown. Tropical estuarine biofilms established in microcosms were therefore exposed to diuron (a photosystem-II inhibitor) at 2–16 µg/L for 4 weeks. The biofilms, consisting of diatoms, filamentous brown algae and cyanobacteria, developed a tolerance to diuron during this period as measured by Phyto-PAM fluorometry. Microscopy and pigment analysis revealed that this decrease in sensitivity was accompanied by a shift in species composition towards communities dominated by diatoms. The combination of techniques enabled the first identification of pollution-induced community tolerance (PICT) in tropical estuarine periphyton in response to chronic herbicide exposures. Community composition changed compared to controls at environmentally relevant concentrations of 1.6 µg/L, while development of PICT was evident at 6.5 µg/L diuron, with no recovery (over 2 weeks) in uncontaminated water, indicating chronic pollution-induced shifts in community structure.

8.
An algorithm is proposed for simulating a hydrological random process in the form of a vector whose components are runoff values for the Khoper River in three high-water months and for the Volga River in February, in April, and averaged over April, May, and June. The simulation is shown to have high accuracy: the expectations and the coefficients of variation coincide exactly with the specified values; the skewness is reproduced to within 0.1; and the deviation of the coefficients of autocorrelation and cross-correlation of the vector components from the specified values never exceeds 0.05.
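The abstract does not give the simulation algorithm itself; as one common way to build such a simulator, here is a minimal NORTA-style sketch (hypothetical target moments and correlations, not the paper's method): correlated standard normals are mapped to three-parameter gamma marginals matching target means, coefficients of variation and skewness. Unlike the algorithm described, cross-correlations are only approximately preserved.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

mean = np.array([100.0, 80.0, 60.0])      # hypothetical targets for a 3-component vector
cv = np.array([0.35, 0.40, 0.30])
skew = np.array([0.8, 1.0, 0.6])
corr = np.array([[1.0, 0.6, 0.4],
                 [0.6, 1.0, 0.5],
                 [0.4, 0.5, 1.0]])

z = rng.standard_normal((5000, 3)) @ np.linalg.cholesky(corr).T   # correlated normals
a = 4.0 / skew**2                                                 # gamma shape from skewness
scale = cv * mean / np.sqrt(a)
loc = mean - a * scale
x = loc + stats.gamma.ppf(stats.norm.cdf(z), a, scale=scale)      # quantile transform

print(x.mean(axis=0), x.std(axis=0) / x.mean(axis=0), stats.skew(x, axis=0))
print(np.corrcoef(x.T).round(2))   # cross-correlations are only approximately preserved
```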

9.
In unpolluted areas the log-normal distribution of individuals per species fits data from many benthic communities well. Using a simple plotting method, under slight pollution the data show a distinct break in the normally straight log-normal plot, and the plot covers more geometric classes than data from unpolluted areas. Under more severe pollution stress the data return to a log-normal distribution, but with a shallower slope, and span more geometric classes than data from less polluted areas. These patterns are consistent for the spatial effects of pollution in Oslofjord and for temporal effects in data from a Scottish sea loch, and also apply to other communities. An explanation of the reasons for such patterns is given, and consideration is given to the robustness and application of the methods described.
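A minimal sketch of the geometric-class plotting method referred to here (hypothetical counts, not the Oslofjord data): species are binned into x2 geometric abundance classes and the cumulative percentage of species is converted to probits, so an undisturbed log-normal plots as a straight line.

```python
import numpy as np
from scipy import stats

def geometric_class_plot_data(abundances):
    """Geometric (x2) abundance classes, cumulative % of species, and probits."""
    x = np.asarray(abundances, float)
    x = x[x > 0]
    classes = np.floor(np.log2(x)).astype(int) + 1      # class 1: 1, class 2: 2-3, class 3: 4-7, ...
    k = np.arange(1, classes.max() + 1)
    n_per_class = np.array([(classes == c).sum() for c in k])
    cum_pct = 100 * np.cumsum(n_per_class) / len(x)
    probit = stats.norm.ppf(np.clip(cum_pct / 100, 1e-6, 1 - 1e-6)) + 5  # classical probit scale
    return k, cum_pct, probit

counts = [1, 1, 1, 2, 2, 3, 4, 5, 6, 9, 12, 15, 22, 35, 60, 130]
print(geometric_class_plot_data(counts))
```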

10.
This paper studies the statistics of the soil moisture condition and its monthly variation for the purpose of evaluating drought vulnerability. A zero-dimensional soil moisture dynamics model with rainfall forcing by a rectangular pulses Poisson process model is used to simulate the soil moisture time series for three sites in Korea: Seoul, Daegu, and Jeonju. These sites are located in the central, south-eastern, and south-western parts of the Korean Peninsula, respectively. The model parameters are estimated on a monthly basis using hourly rainfall data and monthly potential evaporation rates obtained by the Penman method. The resulting soil moisture simulations are summarized on a monthly basis. In brief, the conclusions of our study are as follows. (1) Strong seasonality is observed in the simulations of soil moisture. The soil moisture mean is less than 0.5 during the dry spring season (March, April, and June), but in other months it exceeds 0.5. (2) The spring season is characterized by a low mean value, a high standard deviation and a positive skewness of the soil moisture content. On the other hand, the wet season is characterized by a high mean value, low standard deviation, and negative skewness of the soil moisture content. Thus, in the spring season, much drier soil moisture conditions are apparent due to the higher variability and positive skewness of the soil moisture probability density function (PDF), which also indicates more vulnerability to severe drought occurrence. (3) Seoul, Daegu, and Jeonju show very similar overall trends of soil moisture variation; however, Daegu shows the lowest soil moisture contents throughout the year, which implies that the south-eastern part of the Korean Peninsula is the most vulnerable to drought. On the other hand, the central and south-western parts of the Korean Peninsula are found to be less vulnerable to the risk of drought. The conclusions of the study are in agreement with the climatology of the Korean Peninsula.
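A minimal sketch (hypothetical parameter values, not the calibrated Korean model) of rainfall generated by a rectangular-pulses Poisson process driving a zero-dimensional soil moisture bucket, with the moments discussed above computed from the simulated series:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def poisson_rect_pulse_rain(hours, arrival_rate, mean_dur, mean_int):
    """Hourly rainfall (mm) from Poisson arrivals with exponential duration and intensity."""
    rain = np.zeros(hours)
    t = 0.0
    while True:
        t += rng.exponential(1.0 / arrival_rate)
        if t >= hours:
            break
        dur = max(1, int(round(rng.exponential(mean_dur))))
        i0 = int(t)
        rain[i0:i0 + dur] += rng.exponential(mean_int)
    return rain

def soil_moisture(rain, s0=0.5, porosity=0.4, depth=300.0, pet=0.15):
    """Relative soil moisture in [0, 1]; pet is hourly potential ET in mm."""
    s = np.empty(len(rain))
    cur = s0
    for k, p in enumerate(rain):
        cur += (p - pet * cur) / (porosity * depth)   # loss scales with wetness
        s[k] = cur = min(max(cur, 0.0), 1.0)
    return s

s = soil_moisture(poisson_rect_pulse_rain(24 * 30, 0.02, 6.0, 2.0))
print(s.mean(), s.std(), stats.skew(s))   # monthly mean, standard deviation, skewness
```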

11.
A study of magnitude determination at the sub-stations of the Shaanxi digital seismic network
Using earthquake records of the Shaanxi digital seismic network from 1998 to 2003, and taking the network-average magnitude as the reference magnitude, the magnitudes determined at each sub-station were analysed. The results show that the magnitude deviations of the individual sub-stations exhibit a regional pattern, with magnitude showing a systematic trend with azimuth. Stations sited at the boundary between plain and mountainous terrain overestimate magnitudes for earthquakes occurring in the mountains and underestimate them for earthquakes occurring in the plain. Finally, possible causes of the magnitude deviations are discussed.
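As a companion to the analysis described above, a minimal sketch (entirely hypothetical station and event values, not the Shaanxi network data) of binning one station's magnitude deviations from the network-average magnitude by event azimuth; the smooth bias term in the synthetic data is ours.

```python
import numpy as np

rng = np.random.default_rng(9)
n_events = 500
azimuth = rng.uniform(0, 360, n_events)            # event azimuths seen from the station
network_mag = rng.uniform(1.5, 4.0, n_events)      # network-average magnitudes
# hypothetical smooth azimuth-dependent bias plus measurement noise
station_mag = network_mag + 0.15 * np.sin(np.radians(azimuth)) + rng.normal(0, 0.1, n_events)

deviation = station_mag - network_mag
bins = np.arange(0, 361, 30)
idx = np.digitize(azimuth, bins) - 1
for b in range(12):
    sel = idx == b
    print(f"{bins[b]:3d}-{bins[b+1]:3d} deg: mean deviation {deviation[sel].mean():+.2f}")
```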

12.
Abstract

Wavelet or Fourier analysis is proposed as an alternative nonparametric method to simulate streamflows. An observed series is decomposed into its components at various resolutions and then recombined randomly to generate synthetic series. The mean and standard deviation are perfectly reproduced, and the coefficient of skewness tends to zero as the number of simulations increases. Normalizing transforms can be used for skewed series. Autocorrelation coefficients and the dependence structure are better preserved when Fourier analysis is used, but the mean and variance remain constant when the simulated and observed series have the same length. Monthly as well as annual flows can be simulated by this technique, as illustrated by some examples. Wavelet analysis should be preferred, as it generates flow series that exhibit a wider range of required reservoir capacities.
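A minimal sketch of wavelet-based resampling in the spirit described (one simple variant using PyWavelets, not the paper's exact recombination scheme): coefficients at each resolution level are independently circular-shifted at random before reconstruction.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)

def wavelet_resample(series, wavelet="db4", level=4):
    """Decompose, randomly circular-shift each level, and reconstruct a synthetic series."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    shuffled = [np.roll(c, rng.integers(len(c))) for c in coeffs]
    synthetic = pywt.waverec(shuffled, wavelet)
    return synthetic[: len(series)]

flows = 50 + 20 * np.sin(np.arange(240) * 2 * np.pi / 12) + rng.normal(0, 5, 240)
synth = wavelet_resample(flows)
print(flows.mean(), synth.mean())   # mean approximately preserved
print(flows.std(), synth.std())     # spread approximately preserved
```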

13.
Hydrological Sciences Journal, 2012, 57(1): 87-101
ABSTRACT

The coefficient of determination R2 and Pearson correlation coefficient ρ = R are standard metrics in hydrology for the evaluation of the goodness of fit between model simulations and observations, and as measures of the degree of dependence of one variable upon another. We show that the standard product moment estimator of ρ, termed r, while well-behaved for bivariate normal data, is upward biased and highly variable for bivariate non-normal data. We introduce three alternative estimators of ρ which are nearly unbiased and exhibit much less variability than r for non-normal data. We also document remarkable upward bias and tremendous increases in variability associated with r using both synthetic data and daily streamflow simulations from 905 calibrated rainfall–runoff models. We show that estimators of ρ = R accounting for skewness are needed for daily streamflow series because they exhibit high variability and skewness compared to, for example, monthly/annual series, where r should perform well.
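A minimal sketch (synthetic data, not the 905 calibrated models) of the kind of Monte Carlo comparison the abstract reports: the product-moment estimator r computed on bivariate normal versus strongly skewed log-normal samples with the same underlying dependence.

```python
import numpy as np

rng = np.random.default_rng(4)
rho, sigma, n, reps = 0.7, 1.5, 100, 2000
# population correlation of the log-normal pair implied by rho and sigma
rho_ln = (np.exp(rho * sigma**2) - 1) / (np.exp(sigma**2) - 1)

r_norm, r_ln = [], []
for _ in range(reps):
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    r_norm.append(np.corrcoef(z[:, 0], z[:, 1])[0, 1])
    r_ln.append(np.corrcoef(np.exp(sigma * z[:, 0]), np.exp(sigma * z[:, 1]))[0, 1])

print("normal:     mean r =", round(float(np.mean(r_norm)), 3),
      "sd =", round(float(np.std(r_norm)), 3), "true rho =", rho)
print("log-normal: mean r =", round(float(np.mean(r_ln)), 3),
      "sd =", round(float(np.std(r_ln)), 3), "true rho =", round(rho_ln, 3))
```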

14.
Accurate simulation of seismic wave propagation in complex geological structures is of particular interest nowadays. However, conventional methods may fail to simulate realistic wavefields in environments with great and rapid structural changes, due for instance to the presence of shadow zones, diffractions and/or edge effects. Different methods, developed to improve seismic modeling, are typically tested on synthetic configurations against analytical solutions for simple canonical problems or reference methods, or via direct comparison with real data acquired in situ. Such approaches have limitations, especially if the propagation occurs in a complex environment with strong-contrast reflectors and surface irregularities, as it can be difficult to determine the method which gives the best approximation of the "real" solution, or to interpret the results obtained without an a priori knowledge of the geologic environment. An alternative approach for seismics consists in comparing the synthetic data with high-quality data collected in laboratory experiments under controlled conditions for a known configuration. In contrast with numerical experiments, laboratory data possess many of the characteristics of field data, as real waves propagate through models with no numerical approximations. We thus present a comparison of laboratory-scaled measurements of 3D zero-offset wave reflection of broadband pulses from a strong topographic environment immersed in a water tank with numerical data simulated by means of a spectral-element method and a discretized Kirchhoff integral method. The results indicate a good quantitative fit in terms of time arrivals and acceptable fit in amplitudes for all datasets.

15.
It is found by experiment that under the thermal convection condition, the temperature fluctuation in the urban canopy layer turbulence has the hard-state character, and the temperature difference between two points has an exponential probability density function. At the same time, the turbulent energy dissipation rate fits the log-normal distribution, in accord with the hypothesis proposed by Kolmogorov in 1962 and with many reported experimental results. In this paper, the scaling law of the hard-state temperature n-th order structure function is derived from self-similar multiplicative cascade models. The theoretical formula is S_n = n/3μ{n(n+6)/72 + [2 ln n! - n ln 2]/(2 ln 6)}, where μ is the intermittency exponent. The formula fits the experimental results up to order 8 and is superior to the predictions of the Kolmogorov theory and of the β and log-normal models.
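For orientation, a minimal sketch of estimating structure-function scaling exponents from a series (a synthetic Brownian-type signal, not the urban canopy measurements; the lag choices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(10)
temp = np.cumsum(rng.standard_normal(20000))   # Brownian-like stand-in for the temperature signal

def structure_exponent(signal, order, lags=(2, 4, 8, 16, 32, 64)):
    """Fit the scaling exponent of S_n(r) = <|T(x+r) - T(x)|^n> over the given lags."""
    s = [np.mean(np.abs(signal[l:] - signal[:-l]) ** order) for l in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return slope

for n in range(1, 7):
    print(n, round(structure_exponent(temp, n), 2))   # roughly n/2 for Brownian increments
```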

16.
A model of long-term river runoff variations is proposed. The model is based on a difference stochastic equation of water balance on a watershed. Precipitation and evaporation on the watershed are simulated by stochastic, dependent, non-Gaussian Markov processes. Long-term river runoff variations are described by a component of a three-dimensional non-Gaussian Markov process. It is shown that the autocorrelation and skewness coefficients for river runoff can be negative. The proposed model can be used to assess the effect of climate-induced variations in precipitation and evaporation regimes in a watershed on long-term river runoff variations.
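A minimal sketch in the spirit of the model described (hypothetical parameters and a simple storage-proportional runoff rule, not the paper's formulation): a water-balance difference equation driven by correlated, non-Gaussian (log-normal) Markov precipitation and evaporation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def simulate_runoff(n_years=2000, r1=0.4, k=0.6):
    """Difference water balance: storage(t+1) = storage(t) + P - E - Q, with Q = k * storage."""
    zp = ze = 0.0
    storage, runoff = 1.0, np.empty(n_years)
    for t in range(n_years):
        shock = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]])  # correlated drivers
        zp = r1 * zp + np.sqrt(1 - r1**2) * shock[0]
        ze = r1 * ze + np.sqrt(1 - r1**2) * shock[1]
        p = np.exp(0.4 * zp)          # log-normal precipitation
        e = 0.6 * np.exp(0.3 * ze)    # log-normal evaporation
        q = k * storage               # runoff proportional to current storage
        storage = max(storage + p - e - q, 0.0)
        runoff[t] = q
    return runoff

q = simulate_runoff()
print(stats.skew(q), np.corrcoef(q[:-1], q[1:])[0, 1])  # skewness and lag-1 autocorrelation
```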

17.
Large-scale flood modelling approaches designed for regional to continental scales usually rely on relatively simple assumptions to represent the potentially highly complex river bathymetry at the watershed scale based on digital elevation models (DEMs) with a resolution in the range of 25–30 m. Here, high-resolution (1 m) LiDAR DEMs are employed to present a novel large-scale methodology using a more realistic estimation of bathymetry based on hydrogeomorphological GIS tools to extract water surface slope. The large-scale 1D/2D flood model LISFLOOD-FP is applied to validate the simulated flood levels using detailed water level data in four different watersheds in Quebec (Canada), including continuous profiles over extensive distances measured with the HydroBall technology. A GIS-automated procedure makes it possible to obtain the average width required to run LISFLOOD-FP. The GIS-automated procedure to estimate bathymetry from LiDAR water surface data uses a hydraulic inverse problem based on discharge at the time of acquisition of the LiDAR data. A tiling approach, allowing several small independent hydraulic simulations to cover an entire watershed, greatly improves processing time to simulate large watersheds with a 10-m resampled LiDAR DEM. Results show significant improvements to large-scale flood modelling at the watershed scale, with standard deviations in the range of 0.30 m and an average fit of around 90%. The main advantage of the proposed approach is to avoid the need to collect expensive bathymetry data while efficiently and accurately simulating flood levels over extensive areas.
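A minimal sketch of the kind of hydraulic inverse problem mentioned (a simple Manning / rectangular-channel assumption with hypothetical numbers, not the paper's GIS workflow): given the discharge at the time of LiDAR acquisition, the LiDAR water-surface slope and the channel width, solve for an effective depth.

```python
import numpy as np
from scipy.optimize import brentq

def effective_depth(discharge, width, slope, manning_n=0.03):
    """Depth h such that Manning's equation reproduces the observed discharge."""
    def residual(h):
        area = width * h
        radius = area / (width + 2 * h)           # hydraulic radius of a rectangular section
        return area * radius ** (2 / 3) * np.sqrt(slope) / manning_n - discharge
    return brentq(residual, 1e-3, 50.0)           # root-find the depth between 1 mm and 50 m

# hypothetical reach: 120 m3/s, 80 m wide, LiDAR water-surface slope 0.0005
print(round(effective_depth(120.0, 80.0, 5e-4), 2), "m")
```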

18.
The first phase (1997–2003) of the Global Geodynamics Project (GGP) has now been completed. Data from superconducting gravimeters (SGs) within GGP have shown great capabilities in a wide spectrum of geophysical applications from tidal studies to long-period seismology. Here, we compare the noise levels of the different contributing stations over the whole spectrum. We use three different processing procedures to evaluate the combined instrument-plus-site noise in the long-period seismic band (200–600 s), in the sub-seismic band (1–6 h) and in the tidal bands (12–24 h). The analysis in the seismic band has demonstrated that SGs are particularly well suited for the studies of the long-period normal modes and thus are complementary to long-period seismometers. In the sub-seismic band, the power spectral densities, computed over a period of 15 continuous days for every GGP station, cross the New Low Noise Model of Peterson from T = 16 min to T = 4.6 h. SG data are therefore appropriate for studying long-period seismic and sub-seismic modes. In the tidal bands, the noise comparison is realised by a least-squares fit to tides, local air pressure and instrumental drift, leading to gravity residuals from which we estimate a standard deviation and average noise levels in different tidal frequency bands. Tidal gravity observations using SGs have also been shown to be an independent validation tool for ocean tidal models, and they are therefore complementary to tide gauge and altimetric data sets. Knowledge of the noise levels at each station is important in a number of studies that combine the data to determine global Earth parameters. We illustrate this with the stacking of data in the search for the gravity variations associated with the sub-seismic translational motions of the inner core, the so-called Slichter triplet.
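A minimal sketch (synthetic residual series, not GGP data) of how a station's combined instrument-plus-site noise level in a chosen period band can be estimated from the power spectral density of its gravity residuals:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(6)
fs = 1.0 / 60.0                               # one gravity residual per minute (Hz)
n = 15 * 24 * 60                              # 15 continuous days of data
t = np.arange(n) * 60.0                       # time in seconds
# white noise plus a weak long-period oscillation as a stand-in residual series
residual = 1e-9 * rng.standard_normal(n) + 5e-10 * np.sin(2 * np.pi * t / 3000.0)

f, pxx = welch(residual, fs=fs, nperseg=4096)
band = (f >= 1 / 600.0) & (f <= 1 / 200.0)    # long-period seismic band (200-600 s)
print("mean PSD in the 200-600 s band:", pxx[band].mean())
```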

19.
Automatic identification and picking of storm sudden commencements (SSC) and automatic production of geomagnetic storm reports are important parts of the data products of the national geomagnetic network. To achieve high-precision picking of storm sudden commencements at a 1-s sampling rate, this paper proposes an onset-detection algorithm based on the Walsh transform and the Akaike information criterion (AIC), the Walsh-AIC algorithm. It is applied to the picking of 98 SSC events, and its performance is evaluated against the storm onset times published by the International Service of Geomagnetic Indices (ISGI). The results show that the algorithm not only delineates the onset time of sudden-commencement storms more clearly, but also effectively avoids the tendency of the conventional AIC algorithm to pick onsets later than the actual onset time. In addition, a comparison of the Walsh-AIC algorithm with other mainstream picking methods shows that both the mean deviation and the standard deviation of its picks are markedly smaller. Moreover, the errors of the Walsh-AIC picks are well balanced around zero, indicating that the algorithm picks SSC onset times with high precision and is well suited to automatic SSC picking.
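For reference, a minimal sketch of the classical two-segment AIC onset picker that the Walsh-AIC algorithm builds on (the Walsh-transform preprocessing step is omitted here; the data are synthetic, not 1-s geomagnetic records):

```python
import numpy as np

def aic_pick(x):
    """Return the index minimising the two-segment AIC of the series."""
    x = np.asarray(x, float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

rng = np.random.default_rng(7)
quiet = rng.normal(0, 0.2, 300)                    # pre-onset background
storm = 5 + np.cumsum(rng.normal(0, 0.5, 200))     # sudden-commencement-like jump
print(aic_pick(np.concatenate([quiet, storm])))    # picked onset, expected near index 300
```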

20.
Using the Kolmogorov-Smirnov distribution test and a catalogue of 1.7 ≤ ML ≤ 2.9 earthquakes recorded since 2000, the distributions of the monthly frequency of seismicity since 2000 and of the weekly frequency since July 2007 in the region around the epicentre of the 14 April 2010 Yushu, Qinghai MS7.1 earthquake were examined. The results show that neither the weekly nor the monthly frequencies follow a Poisson or a normal distribution, but the logarithm of the cumulative number of weeks/months is linearly related to the weekly/monthly frequency, analogous to the G-R relation. The temporal variations of parameters such as the standard deviation σ, the CV value, the skewness Sk, the kurtosis Ku and the bm value of the weekly and monthly frequencies were analysed. All of these parameters showed anomalous changes of varying degree before the 14 April 2010 Yushu earthquake, and the variations of the weekly-frequency distribution parameters appear to exhibit some periodicity.
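A minimal sketch (synthetic weekly counts, not the Yushu catalogue) of Kolmogorov-Smirnov tests against Poisson and normal references, together with the descriptive statistics tracked in the abstract; note that a KS test against a discrete (Poisson) reference is only approximate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
weekly_counts = rng.negative_binomial(5, 0.5, size=200)   # over-dispersed, hence non-Poissonian

mu, sd = weekly_counts.mean(), weekly_counts.std(ddof=1)
ks_pois = stats.kstest(weekly_counts, stats.poisson(mu).cdf)   # approximate for a discrete cdf
ks_norm = stats.kstest(weekly_counts, stats.norm(mu, sd).cdf)
print(ks_pois.pvalue, ks_norm.pvalue)
# sigma, CV, skewness Sk, kurtosis Ku of the weekly frequencies
print(sd, sd / mu, stats.skew(weekly_counts), stats.kurtosis(weekly_counts))
```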
