Similar literature
20 similar records found.
1.
We present a procedure for the segmentation of hydrological and environmental time series. The procedure is based on the minimization of Hubert’s segmentation cost or various generalizations of this cost. This is achieved through a dynamic programming algorithm, which is guaranteed to find the globally optimal segmentations with K = 1, 2, ..., K_max segments. Various enhancements can be used to speed up the basic dynamic programming algorithm, for example recursive computation of segment errors and “block segmentation”. The “true” value of K is selected through the use of the Bayesian information criterion. We evaluate the segmentation procedure with experiments which involve artificial as well as temperature and river discharge time series.
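The cost and recursion described above are generic enough to sketch. The following is a minimal illustrative implementation (not the authors' code) of least-squares K-segmentation by dynamic programming, with segment errors computed in O(1) from prefix sums; all names are ours.

```python
# Illustrative sketch (not the authors' code): globally optimal
# least-squares K-segmentation of a univariate series by dynamic
# programming, with O(1) segment errors via prefix sums.

def optimal_segmentation(x, k_max):
    """Return {k: (cost, right_boundaries)} for k = 1..k_max segments;
    cost is the total within-segment sum of squared deviations."""
    n = len(x)
    prefix = [0.0] * (n + 1)      # running sums of x
    prefix_sq = [0.0] * (n + 1)   # running sums of x**2
    for i, v in enumerate(x):
        prefix[i + 1] = prefix[i] + v
        prefix_sq[i + 1] = prefix_sq[i] + v * v

    def cost(i, j):
        # sum of squared deviations from the mean over x[i:j]
        s = prefix[j] - prefix[i]
        return prefix_sq[j] - prefix_sq[i] - s * s / (j - i)

    INF = float("inf")
    # best[k][j]: minimal cost of splitting x[:j] into k segments
    best = [[INF] * (n + 1) for _ in range(k_max + 1)]
    back = [[0] * (n + 1) for _ in range(k_max + 1)]
    best[0][0] = 0.0
    for k in range(1, k_max + 1):
        for j in range(k, n + 1):
            for i in range(k - 1, j):
                c = best[k - 1][i] + cost(i, j)
                if c < best[k][j]:
                    best[k][j], back[k][j] = c, i

    result = {}
    for k in range(1, k_max + 1):
        bounds, j = [], n
        for kk in range(k, 0, -1):
            bounds.append(j)
            j = back[kk][j]
        result[k] = (best[k][n], sorted(bounds))
    return result
```

Each returned segmentation is globally optimal for its k; in the spirit of the abstract, the “true” K would then be picked with a criterion such as BIC.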

2.
In this paper we present a procedure for the segmentation of hydrological and environmental time series. We consider the segmentation problem from a purely computational point of view, which involves the minimization of Hubert’s segmentation cost; in addition, this least squares segmentation is equivalent to maximum likelihood segmentation. Our segmentation procedure maximizes the likelihood and minimizes Hubert’s least squares criterion using a hidden Markov model (HMM) segmentation algorithm. This algorithm is guaranteed to achieve a local maximum of the likelihood. We evaluate the segmentation procedure with numerical experiments which involve artificial, temperature and river discharge time series. In all experiments the procedure actually achieves the global maximum of the likelihood; furthermore, execution time is only a few seconds, even for time series with over a thousand terms.

3.
In this study, we propose a new segmentation algorithm to partition univariate and multivariate time series, with fuzzy clustering performed on the segments thus formed. The clustering algorithm involves a new objective function, which incorporates an extra variable related to segmentation, while dynamic time warping (DTW) is applied to determine distances between series of unequal length. As optimizing the introduced objective function is a challenging task, we put forward an effective approach based on a dynamic programming (DP) algorithm. When calculating the DTW distance, a DP-based method is developed to reduce the computational complexity. In a series of experiments, both synthetic and real-world time series are used to evaluate the performance of the proposed algorithm. The results demonstrate its effectiveness and its advantages over existing segmentation approaches.
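As a hedged illustration of the DTW component mentioned above, here is the textbook O(nm) dynamic program for the distance between two unequal-length series; the authors' reduced-complexity variant is not described in the abstract, so this is the standard form only.

```python
# Textbook O(n*m) dynamic-programming DTW between two univariate
# series of possibly unequal length (a sketch of the distance the
# abstract refers to, not the authors' reduced-complexity variant).

def dtw_distance(a, b):
    """Minimal cumulative |a_i - b_j| cost over all monotone alignments."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = abs(a[i - 1] - b[j - 1])
            D[i][j] = step + min(D[i - 1][j],      # stretch a
                                 D[i][j - 1],      # stretch b
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

Unlike the Euclidean distance, DTW lets one sample align to several samples of the other series, which is why it handles non-equal-length segments.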

4.
Dynamic programming approach for segmentation of multivariate time series
In this paper, a dynamic programming (DP) algorithm is applied to automatically segment multivariate time series. The definition and recursive formulation of segment errors of univariate time series are extended to multivariate time series, so that the DP algorithm remains computationally viable for multivariate time series. The autoregression order and the segmentation are determined simultaneously by Schwarz’s Bayesian information criterion. The segmentation procedure is evaluated with artificially synthesized and hydrometeorological multivariate time series. The synthetic multivariate time series are generated by a threshold autoregressive model, and for the real-world multivariate time series experiment we propose that, besides regression on a constant, autoregression should be taken into account. The experimental studies show that the proposed algorithm performs well.

5.
In this paper, an improved Gath–Geva clustering algorithm is proposed for automatic fuzzy segmentation of univariate and multivariate hydrometeorological time series. The algorithm treats the time series segmentation problem as Gath–Geva clustering, with the minimum message length criterion used to select the segmentation order. One characteristic of the improved Gath–Geva clustering algorithm is its unsupervised nature, which allows it to determine the optimal segmentation order automatically. Another is the use of the modified component-wise expectation maximization algorithm in Gath–Geva clustering, which avoids the drawbacks of the classical expectation maximization algorithm: sensitivity to initialization and the need to avoid the boundary of the parameter space. A third is improved numerical stability, obtained by integrating segmentation order selection into the model parameter estimation procedure. The proposed algorithm has been tested experimentally on artificial and hydrometeorological time series, and the results show its effectiveness.

6.
Fast segmentation algorithms for long hydrometeorological time series
A time series with natural or artificially created inhomogeneities can be segmented into parts with different statistical characteristics. In this study, three algorithms are presented for time series segmentation; the first is based on dynamic programming, and the second and third (the latter an improved version of the former) are based on the branch-and-bound approach. The algorithms divide the time series into segments using the first-order statistical moment (the average). Tested on real-world time series of several hundred to over a thousand terms, the algorithms perform segmentation satisfactorily and quickly. Copyright © 2008 John Wiley & Sons, Ltd.

7.
Abstract

A new method is presented to generate stationary multi-site hydrological time series. The proposed method can handle flexible time-step length, and it can be applied to both continuous and intermittent input series. The algorithm is a departure from standard decomposition models and the Box-Jenkins approach. It relies instead on the recent advances in statistical science that deal with generation of correlated random variables with arbitrary statistical distribution functions. The proposed method has been tested on 11 historic weekly input series, of which the first seven contain flow data and the last four have precipitation data. The article contains an extensive review of the results.

Editor D. Koutsoyiannis

Citation Ilich, N., 2014. An effective three-step algorithm for multi-site generation of stochastic weekly hydrological time series. Hydrological Sciences Journal, 59 (1), 85–98.
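The abstract above does not spell out its algorithm; one standard construction for "generation of correlated random variables with arbitrary statistical distribution functions" is a Gaussian copula, sketched below under that assumption. The two-series restriction, the function names and the fixed correlation are ours, not the paper's.

```python
import math
import random

# Gaussian-copula sketch (our illustration, not the paper's method):
# draw correlated standard normals, map them to uniforms with the
# normal CDF, then push the uniforms through arbitrary inverse CDFs.

def correlated_pair(n, rho, inv_cdf_a, inv_cdf_b, seed=0):
    """Generate two length-n series whose Gaussian ranks have Pearson
    correlation rho, with marginals given by the two inverse CDFs."""
    rng = random.Random(seed)
    out_a, out_b = [], []
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        g2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
        # normal CDF via erf, clamped away from exactly 0 and 1
        u1 = min(max(0.5 * (1.0 + math.erf(z1 / math.sqrt(2.0))), 1e-12), 1 - 1e-12)
        u2 = min(max(0.5 * (1.0 + math.erf(g2 / math.sqrt(2.0))), 1e-12), 1 - 1e-12)
        out_a.append(inv_cdf_a(u1))
        out_b.append(inv_cdf_b(u2))
    return out_a, out_b
```

For example, passing `lambda u: -math.log(1 - u)` as both inverse CDFs yields two positively correlated exponentially distributed series.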

8.
Abstract

The segmentation of flood seasons has both theoretical and practical importance in hydrological sciences and water resources management. The probability change-point analysis technique is applied to segment a defined flood season into a number of sub-seasons. Two alternative sampling methods, annual maximum and peaks-over-threshold, are used to construct the new flow series. The series is assumed to follow the binomial distribution and is analysed with the probability change-point analysis technique. A Monte Carlo experiment is designed to evaluate the performance of the proposed flood season segmentation models. It is shown that the change-point based models for flood season segmentation can rationally partition a flood season into appropriate sub-seasons. China's new Three Gorges Reservoir, located on the upper Yangtze River, was selected as a case study since a hydrological station with observed flow data from 1882 to 2003 is located 40 km downstream of the dam. The flood season of the reservoir can be reasonably divided into three sub-seasons: the pre-flood season (1 June–2 July); the main flood season (3 July–10 September); and the post-flood season (11–30 September). The results of flood season segmentation and the characteristics of flood events are reasonable for this region.

Citation Liu, P., Guo, S., Xiong, L. & Chen, L. (2010) Flood season segmentation based on the probability change-point analysis technique. Hydrol. Sci. J. 55(4), 540–554.
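A minimal sketch of the change-point idea applied to a binomial (0/1 occurrence) series, assuming a single change point chosen by maximum likelihood; this illustrates the general technique only, and the paper's sampling schemes and multi-change-point models are richer than this.

```python
import math

# Single change point in a 0/1 occurrence series (e.g. "a flood peak
# occurred on this day of the season"), chosen by maximum likelihood.
# Illustrative sketch, not the authors' model.

def bernoulli_loglik(ones, n):
    """Maximized Bernoulli log-likelihood of a segment with `ones`
    successes out of n trials (0.0 when the rate estimate is 0 or 1)."""
    if n == 0 or ones == 0 or ones == n:
        return 0.0
    p = ones / n
    return ones * math.log(p) + (n - ones) * math.log(1.0 - p)

def best_change_point(flags):
    """Index tau splitting flags into [0:tau) and [tau:) with maximal
    two-segment likelihood."""
    n, total = len(flags), sum(flags)
    best_tau, best_ll, left = None, -float("inf"), 0
    for tau in range(1, n):
        left += flags[tau - 1]
        ll = (bernoulli_loglik(left, tau)
              + bernoulli_loglik(total - left, n - tau))
        if ll > best_ll:
            best_ll, best_tau = ll, tau
    return best_tau
```

Applied recursively to each sub-segment, the same idea splits a flood season into several sub-seasons with distinct occurrence rates.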

9.
王芳, 青松, 刘楠, 郝艳玲, 包玉海. 《湖泊科学》 (Journal of Lake Sciences), 2022, 34(4): 1150–1163
Lake eutrophication has become a research focus in the water resources field and a long-term, severe challenge for the aquatic environment. To characterize the trophic state of lakes in arid and semi-arid regions, this study takes Daihai Lake as a typical case. Based on six field campaigns in 2019–2020 and on Sentinel-2A and Landsat-8 OLI remote sensing data, a trophic state assessment model for Daihai was built from the correlation between the trophic state index TSI_SDD and the hue angle, and a long time series of trophic state was derived from 1986–2020 imagery. The results show: (1) the accuracy assessment indicates the model performs well, with a coefficient of determination (R2) of 0.74, a root mean square error (RMSE) of 3.66 and a mean absolute percentage error (MAPE) of 4.84%. (2) Applying the algorithm to time series of MSI, TM, ETM+ and OLI data yields the dynamics of Daihai's trophic state over 1986–2020: the lake area has shrunk year by year, and the water body has been mildly eutrophic most of the time. Eutrophication generally eases from the lake margin toward the centre; it is most severe near the shore, the central area is mostly mesotrophic, and oligotrophic conditions are rare overall. (3) The spatio-temporal variation of the trophic state is not significantly correlated with climatic factors such as air temperature, wind speed and precipitation, which explain only 13% of it; climatic factors do, however, significantly influence the monthly variation of the trophic state, explaining 93% of it.

10.
A hybrid optimization scheme, comprising a genetic algorithm in series with a local least-squares fit operator, is used for the inversion of weak and strong motion downhole array data obtained by the Kik-Net Strong Motion Network during the Mw7.0 Sanriku-Minami Earthquake. Inversion of low-amplitude waveforms is first employed for the estimation of low-strain dynamic soil properties at five stations. Successively, the frequency-dependent equivalent linear algorithm is used to predict the mainshock site response at these stations, by subjecting the best-fit elastic profiles to the downhole-recorded strong motion. Finally, inversion of the mainshock empirical site response is employed to extract the equivalent linear dynamic soil properties at the same locations. The inversion algorithm is shown to provide robust estimates of the linear and equivalent linear impedance profiles, while the attenuation structures are strongly affected by scattering effects in the near-surficial heterogeneous layers. The forward and inversely estimated equivalent linear shear wave velocity structures are found to be in very good agreement, illustrating that inversion of strong motion site response data may be used for the approximate assessment of nonlinear effects experienced by soil formations during strong motion events.

11.
Segmentation algorithm for long time series analysis
Time series analysis is an important issue in earth-science-related engineering applications such as hydrology, meteorology and environmetrics. Inconsistency and nonhomogeneity that might arise in a time series yield segments with different statistical characteristics. In this study, an algorithm based on the first-order statistical moment (the average) of a time series is developed and applied to five time series with lengths ranging from 84 to nearly 1,300 terms. Comparison with existing segmentation algorithms proves the applicability and usefulness of the proposed algorithm in the analysis of long hydrometeorological and geophysical time series.

12.
Abstract

The use of a physically-based hydrological model for streamflow forecasting is limited by the complexity of the model structure and the data requirements for model calibration. The calibration of such models is a difficult task, and running a complex model for a single simulation can take up to several days, depending on the simulation period and model complexity. The information contained in a time series is not uniformly distributed; therefore, if we can find the critical events that are important for identification of model parameters, we can facilitate the calibration process. The aim of this study is to test the applicability of the Identification of Critical Events (ICE) algorithm for physically-based models and to test whether ICE algorithm-based calibration depends on any particular optimization algorithm. The ICE algorithm, which uses the data depth function, was used herein to identify the critical events from a time series. A point of low depth in multivariate data represents an unusual combination, and this concept was used to identify the critical events on which the model was then calibrated. The concept is demonstrated by applying the physically-based hydrological model WaSiM-ETH to the Rems catchment, Germany. The model was calibrated on all available data, and on critical events selected by the ICE algorithm. In both calibration cases, three different optimization algorithms, shuffled complex evolution (SCE-UA), parameter estimation (PEST) and robust parameter estimation (ROPE), were used. It was found that, for all the optimization algorithms, calibration using only critical events gave very similar performance to that using the whole time series. Hence, ICE algorithm-based calibration is suitable for physically-based models and does not depend much on the kind of optimization algorithm. These findings may be useful for calibrating physically-based models on much fewer data.

Editor D. Koutsoyiannis; Associate editor A. Montanari

Citation Singh, S.K., Liang, J.Y., and Bárdossy, A., 2012. Improving calibration strategy of physically-based model WaSiM-ETH using critical events. Hydrological Sciences Journal, 57 (8), 1487–1505.

13.
Time variations in the parameters of seismic activity in two regions in Greece, which are known to have different geodynamical conditions, are analyzed using the FastBEE algorithm suggested in (Papadopoulos and Baskoutas, 2009). The study is based on data on weak earthquakes that occurred in two local regions. One region pertains to a zone dominated by an intensive compressional stress field, while the other is located in a region with a relatively lower-intensity extensional stress field. It is shown that in the zone of compression the seismic parameters exhibit anomalous temporal behavior before strong earthquakes with Ms ≥ 5.7, whereas in the zones of extension similar anomalies precede earthquakes with magnitudes as low as Ms ≥ 4.9. The most informative parameters for predicting strong seismic events are the released seismic energy in the form logE^(2/3) and the slope of the frequency–magnitude relation, the b-value. The seismic activity in the region, expressed in terms of the logarithmic number of earthquakes per unit time, in some cases does not exhibit any particular pattern of behavior before strong earthquakes. In the time series of the studied parameters, four stages of the seismic process are clearly distinguished before strong earthquakes. Typically, a strong earthquake has a low probability of occurring within the first two stages. Instead, this probability rises at stage III and attains its maximum at the end of this stage, coinciding with the occurrence of the strong earthquake. We suggest that these features of the time series be used for the assessment of seismic hazard and for the real-time prediction of strong earthquakes. The time variations in the b-value are found to be correlated with the time variations in logE^(2/3). This correlation is closely approximated by a power-law function.
The parameters of this function depend on the geodynamical features of the region and characterize the intensity and the type of the regional tectonic stresses. The results of our study show that the FastBEE algorithm can be successfully applied for monitoring seismic hazard and predicting strong earthquakes.

14.
A new method was developed and implemented as an Excel Visual Basic for Applications (VBA) algorithm that uses trigonometry in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on this trigonometric approach, the algorithm horizontally translates each succeeding recession segment, placing its vertex, that is, the highest recorded value of the segment, directly onto the connection line defined by the measurement points of the preceding recession segment. The new method and algorithm continue the development of methods for MRC generation, the first published method having been based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case-study examples and compared with the previously published regression-model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R2, while in other cases the regression-model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometric approach is implemented in a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) that also contains the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open and free software.

15.
Using chaos theory, the phase-space reconstruction method is employed to describe the multi-dimensional phase space of the air pollution index (API) time series over the past 10 years in Lanzhou, northwest China. Mutual information and the Cao method were used to determine the reconstruction parameters, and the characteristic quantities, including the Lyapunov exponent and the correlation dimension, were calculated. The correlation dimensions were found to be fractional and the maximum Lyapunov exponent positive (λ1 > 0), indicating clear chaotic characteristics arising from the evolution of a non-linear dynamic system in the API time series over the past 10 years. In addition, three or even four main dynamic variables that could effectively explain the changes in the API time series and their causes are discussed, and some reasonable preventive countermeasures are put forward. These findings may provide a scientific basis for further probing the regional complexity and evolution of API time series.
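The first step of such an analysis, phase-space reconstruction, is a plain time-delay embedding; in practice the dimension and delay would come from the Cao method and mutual information, as in the study. This helper is our own sketch, not the authors' code.

```python
# Time-delay (Takens) embedding: rebuild a multi-dimensional phase
# space from a scalar series. Our own illustrative sketch.

def delay_embed(x, dim, tau):
    """Each row is the delay vector (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return [[x[t + k * tau] for k in range(dim)] for t in range(n)]
```

Quantities such as the correlation dimension and the maximum Lyapunov exponent are then estimated from distances between these delay vectors.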

16.
Abstract

A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity corrected TRMM time series more closely approximate ground-based measurements than non-corrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.

Editor D. Koutsoyiannis

Citation Tarnavsky, E., Mulligan, M. and Husak, G., 2012. Spatial disaggregation and intensity correction of TRMM-based rainfall time series for hydrological applications in dryland catchments. Hydrological Sciences Journal, 57 (2), 248–264.

17.
Oil spill detection with fully polarimetric UAVSAR data
Liu P, Li X, Qu JJ, Wang W, Zhao C, Pichel W. Marine Pollution Bulletin, 2011, 62(12): 2611–2618
In this study, two ocean oil spill detection approaches based on the four scattering matrices measured by fully polarimetric synthetic aperture radar (SAR) are presented and compared. The first algorithm is based on the co-polar correlation coefficient, ρ, and the scattering matrix decomposition parameters: Cloude entropy (H), mean scattering angle (α) and anisotropy (A). While each of these parameters carries an oil spill signature, we find that combining them into a new parameter, F, is more effective for oil slick detection. The second algorithm uses the total power of the four polarimetric channels (SPAN) to find the optimal representation of the oil spill signature. Otsu's image segmentation method can then be applied to the F and SPAN images to extract the oil slick features. Using L-band fully polarimetric Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) data acquired during the 2010 Deepwater Horizon oil spill disaster in the Gulf of Mexico, we successfully extract the oil slick information in the contaminated ocean area. Our results show that both algorithms perform well in identifying oil slicks in this case.
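Otsu's method, used above to segment the F and SPAN images, picks the threshold that maximizes the between-class variance of a histogram. The following histogram-based sketch for a flat list of pixel values is our own illustration, not the paper's implementation.

```python
# Otsu thresholding sketch: choose the histogram bin boundary that
# maximizes between-class variance. Our own illustrative code.

def otsu_threshold(values, bins=64):
    """Return a threshold separating `values` into two classes."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1

    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(bins):
        w0 += hist[t]               # class-0 pixel count
        sum0 += t * hist[t]         # class-0 bin-index sum
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return lo + (best_t + 1) * width       # upper edge of class-0 bins
```

In the bimodal case typical of slick-contaminated SAR imagery, the returned threshold falls between the dark (oil) and bright (clean sea) modes.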

18.
Surface soil moisture (SSM) is a critical variable for understanding water and energy fluxes between the atmosphere and the Earth's surface. An easy-to-apply algorithm for deriving SSM time series that primarily uses temporal parameters derived from simulated and in situ datasets has recently been reported. This algorithm must be assessed under different biophysical and atmospheric conditions using actual geostationary satellite images. In this study, two currently available coarse-scale SSM datasets (a microwave and a reanalysis product) and aggregated in situ SSM measurements were used to calibrate the time-invariable coefficients of the SSM retrieval algorithm for conditions in which conventional observations are rare. These coefficients were subsequently used to obtain SSM time series directly from Meteosat Second Generation (MSG) images over the study area of a well-organized soil moisture network named REMEDHUS in Spain. The results show a high degree of consistency between the estimated and actual SSM time series when using the coefficients calibrated with the three SSM datasets, with coefficients of determination (R2) varying from 0.304 to 0.534 and root mean square errors ranging from 0.020 m3/m3 to 0.029 m3/m3. Further evaluation over different land use types yields acceptable debiased root mean square errors between 0.021 m3/m3 and 0.048 m3/m3 when comparing the estimated MSG pixel-scale SSM with in situ measurements. These results indicate that the investigated method is practical for deriving time-invariable coefficients from publicly accessible coarse-scale SSM datasets, which is beneficial for generating a continuous SSM dataset at the MSG pixel scale.

19.
Abstract

Hydrological models are commonly used to perform real-time runoff forecasting for flood warning. Their application requires catchment characteristics and precipitation series that are not always available. An alternative approach is nonparametric modelling based only on runoff series. However, the following questions arise: can nonparametric models produce reliable forecasts? Can they perform as reliably as hydrological models? We performed probabilistic forecasting one, two and three hours ahead for a runoff series, with the aim of ascribing a probability density function to the predicted discharge using time series analysis based on stochastic dynamics theory. The derived dynamic terms were compared to a hydrological model, LARSIM. Our procedure was able to forecast the 1-, 2- and 3-h-ahead discharge probability functions within the 95% confidence interval, with a range of about 1.40 m3/s and relative errors (%) in the range [-30, 30]. The LARSIM model and the best nonparametric approaches gave similar results, but the range of relative errors was larger for the nonparametric approaches.

Editor D. Koutsoyiannis; Associate editor K. Hamed

Citation Costa, A.C., Bronstert, A. and Kneis, D., 2012. Probabilistic flood forecasting for a mountainous headwater catchment using a nonparametric stochastic dynamic approach. Hydrological Sciences Journal, 57 (1), 10–25.

20.
Abstract

Climate patterns, including rainfall, are among the most complex problems for hydrologists to predict, owing to their inherently natural and stochastic character. In this study, a new approach for rainfall time series forecasting is introduced based on the integration of three stochastic modelling methods (seasonal differencing, seasonal standardization and spectral analysis) with the genetic algorithm (GA). This approach is specially tailored to remove the periodic-pattern effects that dominate the stationarity behaviour of rainfall time series. Two different climates, tropical and semi-arid (Malaysia and Iraq), are selected to evaluate the proposed methodology. The results show that the predictive model achieved acceptable rainfall forecasts for both regions. The attained determination coefficients (R2) for the investigated stations were approximately 0.91, 0.90 and 0.089 for Mosul, Baghdad and Basrah (Iraq), and 0.80, 0.87 and 0.94 for Selangor, Negeri Sembilan and Johor (Malaysia).
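Of the three stochastic pre-processing steps named above, seasonal standardization is easy to sketch: subtract each calendar phase's mean and divide by its standard deviation. The code below is our own illustration under that common definition, not the authors' implementation.

```python
import statistics

# Seasonal standardization sketch: deseasonalize a series of a given
# period (e.g. 12 for monthly data) phase by phase. Our own code.

def seasonal_standardize(x, period):
    """Subtract each phase's mean and divide by its standard deviation
    (a constant phase is only centred; divisor falls back to 1)."""
    phases = [x[p::period] for p in range(period)]
    mu = [statistics.mean(ph) for ph in phases]
    sd = [statistics.pstdev(ph) or 1.0 for ph in phases]
    return [(v - mu[i % period]) / sd[i % period] for i, v in enumerate(x)]
```

The standardized residuals are (approximately) free of the periodic pattern, which is the stationarity-restoring effect the abstract describes; forecasts are then back-transformed with the stored phase means and standard deviations.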
