Similar Documents
20 similar documents were retrieved (search time: 793 ms).
1.
For effective hazard mitigation planning and prompt-but-prudent post-disaster responses, it is essential to evaluate the reliability of infrastructure networks accurately and efficiently. A nonsimulation-based algorithm, termed the recursive decomposition algorithm (RDA), was recently proposed to identify disjoint cut sets and link sets and to compute the network reliability. This paper introduces a 'selective' RDA, which preferentially identifies critical disjoint cut sets and link sets to calculate the probabilities of network disconnection events with a significantly reduced number of identified sets. To this end, the original RDA is improved by replacing the shortest path algorithm with an algorithm that identifies the most reliable path, and by using a graph decomposition scheme based on the probabilities associated with the subgraphs. The critical sets identified by the algorithm are also used to compute conditional probability-based importance measures that quantify the relative importance of network components by their contributions to network disconnection events. This paper also introduces a risk assessment framework for lifeline networks based on the use of the selective RDA, which can consider both interevent and intraevent uncertainties of spatially correlated ground motions. The risk assessment framework and the selective RDA are demonstrated by a hypothetical network example, and the gas and water transmission networks of Shelby County in Tennessee, USA. The examples show that the proposed framework and the selective RDA greatly improve the efficiency of risk assessment of complex lifeline networks, which are characterized by a large number of components, complex network topology, and statistical dependence between component failures. Copyright © 2012 John Wiley & Sons, Ltd.
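As a point of reference for the "most reliable path" ingredient mentioned in this abstract, the following minimal sketch (not the authors' selective RDA itself) finds the most reliable source-terminal path by running Dijkstra's algorithm on -log(reliability) edge weights, so that minimising the summed weight maximises the product of component reliabilities. The graph topology and edge reliabilities are hypothetical.

```python
# Most reliable s-t path via Dijkstra on -log(reliability) weights.
import math
import networkx as nx

G = nx.Graph()
edges = [("s", "a", 0.99), ("a", "t", 0.95), ("s", "b", 0.90), ("b", "t", 0.98)]
for u, v, r in edges:
    G.add_edge(u, v, reliability=r, neglog=-math.log(r))

path = nx.dijkstra_path(G, "s", "t", weight="neglog")
path_rel = math.prod(G[u][v]["reliability"] for u, v in zip(path, path[1:]))
print(path, path_rel)   # most reliable s-t path and its reliability
```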

2.
Recent earthquake events have shown that damage to structural components in a lifeline network may cause prolonged disruption of lifeline services, which eventually results in significant socio-economic losses in the affected area. Despite recent advances in network reliability analysis, the complexity of the problem and various uncertainties still make it a challenging task to evaluate the post-hazard performance and connectivity of lifeline networks efficiently and accurately. In order to overcome such challenges and take advantage of the merits of multi-scale analysis, this paper develops a multi-scale system reliability analysis method by integrating a network decomposition approach with the matrix-based system reliability (MSR) method. In addition to facilitating system reliability analysis of large-size networks, the multi-scale approach enables optimizing the level of computational effort on subsystems; identifying the relative importance of components and subsystems at multiple scales; and providing a collaborative risk management framework. The MSR method is uniformly applied for system reliability analyses at both the lower scale (for link failure) and the higher scale (for system connectivity) to obtain the probability of general system events, various conditional probabilities, component importance measures, statistical correlation between subsystem failures and parameter sensitivities. The proposed multi-scale analysis method is demonstrated by its application to a gas distribution network in Shelby County of Tennessee. A parametric study is performed to determine the number of segments during the lower-scale MSR analysis of each pipeline based on the strength of the spatial correlation of seismic intensity. It is shown that the spatial correlation should be considered at both scales for accurate reliability evaluation. The proposed multi-scale analysis approach provides an effective framework of risk assessment and decision support for lifeline networks under earthquake hazards. Copyright © 2009 John Wiley & Sons, Ltd.
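The core idea of the matrix-based system reliability (MSR) framing can be illustrated with a minimal sketch: enumerate mutually exclusive component-state vectors, build a probability vector p and an event (indicator) vector c for the system event, and evaluate P(system event) = c . p. The sketch below assumes statistically independent components and hypothetical failure probabilities; the MSR method in the paper additionally handles statistical dependence and multiple scales.

```python
# Simplified MSR-style evaluation for a 3-component system (independence assumed).
from itertools import product
import numpy as np

p_fail = np.array([0.05, 0.10, 0.02])          # hypothetical component failure probabilities
states = list(product([0, 1], repeat=3))       # 1 = component survives, 0 = fails

def state_prob(s):
    return np.prod([1 - pf if x else pf for x, pf in zip(s, p_fail)])

def system_survives(s):
    # example system event: components 1 and 2 in series, redundant with component 3
    return (s[0] and s[1]) or s[2]

p = np.array([state_prob(s) for s in states])              # probability vector
c = np.array([0.0 if system_survives(s) else 1.0 for s in states])  # event vector
print("P(system failure) =", c @ p)
```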

3.
The quality of digital elevation model (DEM)-derived river drainage networks (RDNs) is influenced by DEM quality, basin physical characteristics, scale, and the algorithms used; these factors should not be neglected. However, few research studies analyse the different evaluation approaches used in the literature with respect to adequacy, meaning of the results, advantages, and limitations. Focusing on coarse-resolution networks, this paper reviews the use of these techniques and offers new insights on these issues. Additionally, we propose adaptations for selected metrics and discuss distinct interpretations for the evaluation of RDNs derived at different spatial resolutions (1, 5, 10, 20, and 30 km) considering the Uruguay River basin (206,000 km2) as a case study. The results demonstrate that lumped basin/river characteristics and basin delineation analysis should not be used as evaluation criteria for RDN quality; however, some of these metrics offer useful complementary information. The percentage of the DEM-derived RDN within a uniform buffer placed around a reference river network and the mean separation distance between the two networks are more suitable metrics, but the former is insensitive to serious errors. Changing the reference from a fine-scale network to a coarse-resolution manually traced network significantly augments the discrepancy of these largest errors when the mean distance metric is applied, and visual comparison analysis is necessary to interpret the results for other metrics. We recommend the use of the mean distance metric in combination with a detailed visual assessment, the importance of which increases as the resolution coarsens. In both cases, the impact of network quality can be further refined by quantifying the basin shape and river length errors.
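The two metrics singled out in this abstract can be computed in a few lines, as sketched below for hypothetical geometries: (i) the percentage of a DEM-derived network falling inside a uniform buffer around a reference network, and (ii) the mean separation distance from points sampled along the derived network to the reference. The buffer width and sampling density are illustrative assumptions.

```python
# Buffer-percentage and mean-distance metrics for two toy networks.
import numpy as np
from shapely.geometry import MultiLineString

reference = MultiLineString([[(0, 0), (10, 0)], [(5, 0), (5, 8)]])
derived   = MultiLineString([[(0, 0.4), (10, 0.6)], [(5.3, 0), (5.6, 8)]])

buffer_width = 1.0                                     # e.g. one grid-cell size
inside = derived.intersection(reference.buffer(buffer_width))
pct_in_buffer = 100.0 * inside.length / derived.length

samples = []
for line in derived.geoms:                             # sample points along each channel
    samples += [line.interpolate(d) for d in np.linspace(0, line.length, 100)]
mean_distance = np.mean([reference.distance(pt) for pt in samples])
print(pct_in_buffer, mean_distance)
```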

4.
Pebble clusters are common small-scale morphological features in gravel-bed rivers, occupying as much as 10 per cent of the bed surface. Important links exist between the presence of pebble clusters and the development of flow structures. These links are poorly understood at the three-dimensional level. Particularly neglected has been the effect of clusters on the lateral flow characteristics. A laboratory study was conducted using a hydraulic flume, within which simulated pebble clusters were superimposed onto a plane bed of gravel material. High-resolution three-dimensional flow data were collected above the bed at two different flow depths using an acoustic Doppler velocimeter. The results present evidence of the importance of lateral flow in the development of turbulent flow structure. Narrow regions of high lateral and downstream turbulence intensity exist to both sides of clusters and in a three-dimensional separation zone in their lee. This may indicate the presence of horseshoe-type vortical structures analogous to those identified in less hydraulically rough environments. However, it is likely that these structures are more complicated given the mutual interference of the surrounding medium. The lateral flow was also identified as a key component in the upwelling identified by other authors in the lee of pebble clusters. The results of the vertical flow analysis confirm the hypothesis that six regions with distinct vertical flow characteristics exist above clusters: flow acceleration up the stoss-side of the cluster; recirculation behind the cluster in the wake region; vortex shedding from the pebble crest and shear layer; flow reattachment downstream of the cluster; upwelling of flow downstream of the point of reattachment; and recovery of flow. Copyright © 2001 John Wiley & Sons, Ltd.

5.
Growing interest in the use of artificial neural networks (ANNs) in rainfall-runoff modelling has raised certain issues that are still not addressed properly. One such concern is the choice of network type, as theoretical studies on the multi-layer perceptron (MLP) with a sigmoid transfer function highlight certain limitations of its use. Alternatively, there is a strong belief in the general ANN user community that a radial basis function (RBF) network performs better than an MLP, as the former bases its nonlinearities on the training data set. This argument is not yet substantiated by applications in hydrology. This paper presents a comprehensive evaluation of the performance of MLP- and RBF-type neural network models developed for rainfall-runoff modelling of two Indian river basins. The performance of both the MLP and RBF network models was comprehensively evaluated in terms of their generalization properties, predicted hydrograph characteristics, and predictive uncertainty. The results of the study indicate that the choice of network type certainly has an impact on model prediction accuracy. The study suggests that both networks have merits and limitations. For instance, the MLP requires a long trial-and-error procedure to fix the optimal number of hidden nodes, whereas for an RBF network the structure can be fixed using an appropriate training algorithm. However, a judgment on which is superior is not clearly possible from this study. Copyright © 2004 John Wiley & Sons, Ltd.
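A minimal, hypothetical sketch of the two model families compared here: a multi-layer perceptron (MLP) and a simple radial basis function (RBF) network built from k-means centres, Gaussian basis functions and a linear (ridge) read-out. Synthetic inputs stand in for the basin records used in the paper, and the architecture choices (hidden-layer size, number of centres, width heuristic) are assumptions for illustration only.

```python
# MLP vs. simple RBF network on synthetic rainfall-runoff-like data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.random((500, 3))                       # e.g. lagged rainfall/flow inputs
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(500)

# MLP: hidden-layer size fixed by trial and error, as the abstract notes
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

# RBF: centres from the training data, a single width from the centre spacing
centers = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
width = np.mean(np.linalg.norm(centers[:, None] - centers[None, :], axis=-1))
Phi = np.exp(-np.linalg.norm(X[:, None] - centers[None, :], axis=-1) ** 2 / (2 * width ** 2))
rbf = Ridge(alpha=1e-3).fit(Phi, y)

print("MLP R^2:", mlp.score(X, y), " RBF R^2:", rbf.score(Phi, y))
```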

6.
We developed a frequency-domain acoustic-elastic coupled waveform inversion based on the Gauss-Newton conjugate gradient method. Despite the use of a high-performance computer system and a state-of-the-art parallel computation algorithm, it remained computationally prohibitive to calculate the approximate Hessian explicitly for a large-scale inverse problem. Therefore, we adopted the conjugate gradient least-squares algorithm, which is frequently used for geophysical inverse problems, to implement the Gauss-Newton method so that the approximate Hessian is calculated implicitly. Thus, there was no need to store the Hessian matrix. By simultaneously back-propagating multi-components consisting of the pressure and displacements, we could efficiently extract information on the subsurface structures. To verify our algorithm, we applied it to synthetic data sets generated from the Marmousi-2 model and the modified SEG/EAGE salt model. We also extended our algorithm to the ocean-bottom cable environment and verified it using ocean-bottom cable data generated from the Marmousi-2 model. With the assumption of a hard seafloor, we recovered both the P-wave velocity of complicated subsurface structures and the S-wave velocity. Although the inversion of the S-wave velocity is not feasible for the high Poisson's ratios used to simulate a soft seafloor, several strategies exist to treat this problem. Our example using multi-component data showed some promise in mitigating the soft seafloor effect. However, this issue still remains open.
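The conjugate gradient least-squares (CGLS) idea referred to in this abstract can be sketched generically: the Gauss-Newton step is applied without ever forming the approximate Hessian J^T J, using only Jacobian and adjoint matrix-vector products. The small dense matrix below merely stands in for the wave-equation Jacobian operator; it is an illustrative assumption, not the paper's modelling code.

```python
# Generic CGLS solve of the Gauss-Newton normal equations via matvecs only.
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((60, 40))      # stands in for the Jacobian operator
r = rng.standard_normal(60)            # data residual

def Jv(v):  return J @ v               # forward Jacobian-vector product
def JTv(v): return J.T @ v             # adjoint (back-propagation) product

def cgls(Jv, JTv, r, n, n_iter=50):
    x = np.zeros(n)
    s = r - Jv(x)
    p = JTv(s)
    g = p.copy()
    gamma = g @ g
    for _ in range(n_iter):
        q = Jv(p)
        alpha = gamma / (q @ q)
        x += alpha * p
        s -= alpha * q
        g = JTv(s)
        gamma_new = g @ g
        p = g + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

dm = cgls(Jv, JTv, r, n=40)            # Gauss-Newton model update
print(np.linalg.norm(Jv(dm) - r))
```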

7.
A new methodology for the development of bridge-specific fragility curves is proposed with a view to improving the reliability of loss assessment in road networks and prioritising retrofit of the bridge stock. The key features of the proposed methodology are the explicit definition of critical limit state thresholds for individual bridge components, with consideration of the effect of varying geometry, material properties, reinforcement and loading patterns on the component capacity; the methodology also includes the quantification of uncertainty in capacity, demand and damage state definition. Advanced analysis methods and tools (nonlinear static analysis and incremental dynamic response history analysis) are used for bridge component capacity and demand estimation, while reduced sampling techniques are used for uncertainty treatment. Whereas uncertainty in both capacity and demand is estimated from nonlinear analysis of detailed inelastic models, in practical application to bridge stocks, the demand is estimated through a standard response spectrum analysis of a simplified elastic model of the bridge. The simplified methodology can be efficiently applied to a large number of bridges (with different characteristics) within a road network, by means of purpose-developed software involving the use of a generic (elastic) bridge model, which derives bridge-specific fragility curves. Copyright © 2016 John Wiley & Sons, Ltd.
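For orientation, the lognormal form in which such fragility curves are commonly expressed is easy to evaluate: P(DS >= ds | IM) = Phi((ln IM - ln theta) / beta), with theta the median capacity in intensity-measure units and beta a dispersion that lumps capacity, demand and damage-state-definition uncertainty. This is a generic sketch with hypothetical parameter values, not the paper's calibrated curves.

```python
# Evaluate a lognormal fragility curve over a range of intensity levels.
import numpy as np
from scipy.stats import norm

theta = 0.45      # hypothetical median PGA (g) causing the damage state
beta = 0.55       # hypothetical total lognormal dispersion

im = np.linspace(0.05, 2.0, 50)                  # intensity measure levels (g)
p_exceed = norm.cdf((np.log(im) - np.log(theta)) / beta)
print(dict(zip(np.round(im[::10], 2), np.round(p_exceed[::10], 3))))
```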

8.
The integrated optimization of structures subjected to strong earthquake and wind excitations, in which the number of actuators, the configuration of actuators and the control algorithms are optimized simultaneously, is studied. Two control algorithms are considered: optimal control and acceleration feedback control. A multi-level optimization model is proposed for the solution procedure of the optimization problem. The characteristics of the model are analysed, and the formulation of each suboptimization problem at each level is presented. To solve the multi-level optimization problem, a multi-level genetic algorithm (MLGA) is proposed. The proposed model and MLGA are used to solve two multi-level optimization problems in which the optimization of the number of actuators, the positions of actuators and the control algorithm are considered at different levels. In problem 1, an example structure is excited by strong wind, and in problem 2, an example structure is subjected to strong earthquake excitation. Copyright © 2001 John Wiley & Sons, Ltd.

9.
Low-flow events can cause significant impacts to river ecosystems and water-use sectors; as such, it is important to understand their variability and drivers. In this study, we characterise the variability and timing of the annual total frequency of low-streamflow days across a range of headwater streams within the continental United States. To quantify this, we use a metric that counts the annual number of low-flow days below a given threshold, defined as the cumulative dry days occurrence (CDO). First, we identify three large clusters of stream gauge locations using a Partitioning Around Medoids (PAM) clustering algorithm. In terms of timing, results reveal that for most clusters, the majority of low-streamflow days occur from the middle of summer until early fall, although several locations in the Central and Western United States also experience low-flow days in cold seasons. Further, we aim to identify the regional climate and larger-scale drivers of these low-streamflow days. Regionally, we find that precipitation deficits are largely associated with low-streamflow days in the Western United States, whereas within the Central and Eastern U.S. clusters, high temperature indicators are also linked to low-streamflow days. At the larger scale, we examine sea surface temperature (SST) anomalies, finding that extreme dry years exhibit a high degree of co-occurrence with different patterns of warmer SST anomalies across the Pacific and Northern Atlantic Oceans. The linkages identified with regional climate and SSTs offer promise towards regional prediction of changing conditions of low-streamflow events.
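The cumulative dry days occurrence (CDO) metric described here reduces to a per-year count of days below a low-flow threshold. The sketch below uses a synthetic daily record and assumes, purely for illustration, a threshold equal to the 10th percentile of the full record; the paper's threshold choice may differ.

```python
# Annual count of low-flow days (CDO-style metric) on a synthetic daily series.
import numpy as np
import pandas as pd

dates = pd.date_range("1990-01-01", "2019-12-31", freq="D")
rng = np.random.default_rng(2)
flow = pd.Series(rng.gamma(shape=2.0, scale=5.0, size=len(dates)), index=dates)

threshold = flow.quantile(0.10)                            # assumed low-flow threshold
cdo = (flow < threshold).groupby(flow.index.year).sum()    # low-flow days per year
print(cdo.head())
```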

10.
With the availability of spatially distributed data, distributed hydrologic models are increasingly used for simulation of spatially varied hydrologic processes to understand and manage natural and human activities that affect watershed systems. Multi-objective optimization methods have been applied to calibrate distributed hydrologic models using observed data from multiple sites. As the time consumed by running these complex models increases substantially, selecting efficient and effective multi-objective optimization algorithms is becoming a nontrivial issue. In this study, we evaluated a multi-algorithm, genetically adaptive multi-objective method (AMALGAM) for multi-site calibration of a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT), and compared its performance with two widely used evolutionary multi-objective optimization (EMO) algorithms: the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Non-dominated Sorted Genetic Algorithm II (NSGA-II). In order to provide insights into each method's overall performance, the three methods were tested in four watersheds with various characteristics. The test results indicate that AMALGAM can consistently provide competitive or superior results compared with the other two methods. The multi-method search framework of AMALGAM, which can flexibly and adaptively utilize multiple optimization algorithms, makes it a promising tool for multi-site calibration of the distributed SWAT. For practical use of AMALGAM, it is suggested to implement the method in multiple trials with a relatively small number of model runs rather than running it once with long iterations. In addition, incorporating different multi-objective optimization algorithms and multi-mode search operators into AMALGAM deserves further research. Copyright © 2009 John Wiley & Sons, Ltd.
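All of the compared methods (NSGA-II, SPEA2, AMALGAM) rest on the Pareto-dominance test sketched below: given objective vectors for a population (e.g. error metrics at several gauging sites, all to be minimised), return the non-dominated front. The objective values are hypothetical and the sketch omits the methods' selection, crowding and adaptive-search machinery.

```python
# Non-dominated (Pareto) front extraction for a small population.
import numpy as np

def pareto_front(F):
    """F: (n_solutions, n_objectives) array, all objectives minimised."""
    n = F.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        dominated_by_any = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominated_by_any.any():
            keep[i] = False
    return np.where(keep)[0]

F = np.array([[0.30, 0.55], [0.25, 0.60], [0.40, 0.40], [0.45, 0.60], [0.50, 0.35]])
print(pareto_front(F))   # indices of the non-dominated calibrations
```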

11.
The present study aims to develop a hybrid multi-model using a soft computing approach. The model is a combination of fuzzy logic, an artificial neural network (ANN) and a genetic algorithm (GA). While neural networks are low-level computational structures that perform well when dealing with raw data, fuzzy logic deals with reasoning at a higher level, using linguistic information acquired from domain experts. However, fuzzy systems lack the ability to learn and cannot adjust themselves to a new environment. Moreover, experts occasionally make mistakes, and thus some rules used in a system may be false. The network structure of the present hybrid model is a multi-layer feed-forward network whose main part is a fuzzy system based on the first-order Sugeno fuzzy model with fuzzification and defuzzification processes. The consequent parameters are determined by the least squares method. Back-propagation is applied to adjust the weights of the network, and the antecedent parameters of the membership functions are then updated by the gradient descent method. The GA is applied to select the fuzzy rules. The hybrid multi-model was used to forecast the flood level at Chiang Mai (during the large 2005 flood) and the Koriyama flood (2003) in Japan. The forecasting results are evaluated using standard global goodness-of-fit statistics: the efficiency index (EI), the root mean square error (RMSE) and the peak flood error. Moreover, the results are compared to those of a neuro-genetic model (NGO) and an ANFIS model using the same input and output variables. It was found that the hybrid multi-model can be used successfully, with an efficiency index (EI) of more than 0.95 (for the Chiang Mai flood, up to 12 h ahead forecasting) and more than 0.90 (for the Koriyama flood, up to 8 h ahead forecasting). In general, all three models can predict the water level with satisfactory results. However, the hybrid model gave the best flood peak estimation among the three models. Therefore, the use of a fuzzy rule base selected by the GA in the hybrid multi-model helps to improve the accuracy of the flood peak estimate. Copyright © 2009 John Wiley & Sons, Ltd.

12.
Real-time hybrid testing combines experimental testing and numerical simulation, and provides a viable alternative for the dynamic testing of structural systems. An integration algorithm is used in real-time hybrid testing to compute the structural response based on feedback restoring forces from the experimental and analytical substructures. Explicit integration algorithms are usually preferred over implicit algorithms as they do not require iteration and are therefore computationally efficient. However, the time step size for explicit integration algorithms, which are typically conditionally stable, can become extremely small in order to maintain numerical stability when the number of degrees of freedom of the structure becomes large. This paper presents the implementation and application of a newly developed unconditionally stable explicit integration algorithm for real-time hybrid testing. The development of the integration algorithm is briefly reviewed. An extrapolation procedure is introduced in the implementation of the algorithm for real-time testing to ensure the continuous movement of the servo-hydraulic actuator. The stability of the implemented integration algorithm is investigated using control theory. Real-time hybrid test results of single-degree-of-freedom and multi-degree-of-freedom structures with a passive elastomeric damper subjected to earthquake ground motion are presented. The explicit integration algorithm is shown to enable exceptional real-time hybrid test results to be achieved. Copyright © 2008 John Wiley & Sons, Ltd.
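A minimal SDOF sketch of an unconditionally stable explicit scheme of this general family is given below; the coefficients follow the Chen-Ricles (CR) formulation as commonly reported, which may differ in detail from the algorithm implemented in the paper, and all structural parameters and the excitation are hypothetical. The restoring-force callback stands in for the feedback force measured from the experimental substructure in a real-time hybrid test.

```python
# CR-type explicit time stepping for an SDOF system with a restoring-force callback.
import numpy as np

m, c, k = 1000.0, 200.0, 4.0e5              # mass (kg), damping (N s/m), stiffness (N/m)
dt, n = 0.01, 500
alpha = 4 * m / (4 * m + 2 * dt * c + dt ** 2 * k)   # alpha1 = alpha2 for an SDOF system

def restoring_force(x, v):
    # stands in for analytical stiffness force plus measured damper force
    return k * x + c * v

ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * dt * np.arange(n))   # synthetic ground acceleration
x, v = 0.0, 0.0
a = (-m * ag[0] - restoring_force(x, v)) / m
peak = 0.0
for i in range(n - 1):
    v_new = v + dt * alpha * a                      # explicit velocity update
    x_new = x + dt * v + dt ** 2 * alpha * a        # explicit displacement update
    a = (-m * ag[i + 1] - restoring_force(x_new, v_new)) / m
    x, v = x_new, v_new
    peak = max(peak, abs(x))
print("peak displacement (m):", peak)
```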

13.
14.
River networks have been shown to obey power scaling laws and to follow self-organization principles. Their self-similar (fractal) properties open a path to relate small-scale and large-scale hydrological processes, such as erosion, deposition or geological movements. However, the existence of a self-similar dimension has only been checked using either the whole channel network or, on the contrary, a single channel link. No study has explicitly addressed the possible spatial variation of the self-similar properties between these two extreme geomorphological objects. Here, a new method based on self-similarity maps (SSM) is proposed to spatially explore the stream length self-similar dimension Dl within a river network. The mapping principle consists in computing local self-similar dimensions deduced from a fit of stream length estimations using increasing divider sizes. A local uncertainty related to the fit quality is also computed and localized on every stream. To assess the efficiency of the approach, contrasting river networks are simulated using optimal channel networks (OCN), where each network is characterized by an exponent γ conditioning its overall topology. By building SSMs of these networks, it is shown that deviations from uniform self-similarity across space occur. Depending on the type of network (γ parameter), these deviations may or may not be related to Strahler's order structure. Finally, it is found numerically that the structurally averaged stream length self-similar dimension Dl is closely related to the more functional γ parameter. The results form a bridge between studies on river sinuosity (single channel) and the growth of channel networks (watershed). As with every method providing spatial information where it was lacking before, the SSM may soon help to accurately interpret natural networks and to simulate more realistic channel networks. Copyright © 2011 John Wiley & Sons, Ltd.
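The divider (Richardson) idea underlying the stream-length self-similar dimension can be sketched as follows: measure the length of a curve with rulers of decreasing size eps; for a self-similar curve L(eps) ~ eps**(1 - D), so D is recovered from the slope of log L against log eps. The curve below is a synthetic sinuous channel and the divider sizes are arbitrary assumptions; the paper's SSM additionally localises this fit on every stream and attaches an uncertainty to it.

```python
# Estimate a self-similar dimension from divider-based length measurements.
import numpy as np

t = np.linspace(0, 10 * np.pi, 4000)
xy = np.column_stack([t, np.sin(t) + 0.3 * np.sin(7 * t)])   # synthetic channel

def divider_length(xy, eps):
    """Walk along the polyline with a fixed step (divider) of size eps."""
    pts = [xy[0]]
    last = xy[0]
    for p in xy[1:]:
        if np.linalg.norm(p - last) >= eps:
            last = p
            pts.append(p)
    steps = np.diff(np.array(pts), axis=0)
    return np.sum(np.linalg.norm(steps, axis=1))

eps_values = np.array([0.2, 0.4, 0.8, 1.6, 3.2])
lengths = np.array([divider_length(xy, e) for e in eps_values])
slope, _ = np.polyfit(np.log(eps_values), np.log(lengths), 1)
print("estimated self-similar dimension:", round(1.0 - slope, 3))
```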

15.
There is growing pressure from regulators on operators to adhere to increasingly strict regulations related to the environment and safety. Hence, operators are required to predict and contain risks related to hydrocarbon production and their infrastructure in order to maintain their licence to operate. A deeper understanding of production optimisation and production-related risk requires strengthened knowledge of reservoir behaviour and overburden dynamics. To accomplish this, sufficient temporal and spatial resolution is required, as well as an integration of various sources of measurements. At the same time, tremendous developments are taking place in sensors, networks, and data analysis technologies. Sensors and accompanying channels are getting smaller and cheaper, and yet they offer high fidelity. New ecosystems of ubiquitous wireless communications, including the Internet of Things, nowadays allow anyone to affordably connect to the Internet at any time and anywhere. Recent advances in cloud storage and computing combined with data analytics allow fast and efficient solutions to handle considerable amounts of data. This paper is an effort to pave the way for exploiting these three fundamental advances to create Internet of Things-based wireless networks of seismic sensors. To this aim, we propose to employ a recently developed Internet of Things-based wireless technology, so-called low-power wide-area networks, to exploit their long range, low power, and inherent compatibility with cloud storage and computing. We create a remotely operated minimum-maintenance wireless solution for four major seismic applications of interest. By proposing appropriate network architecture and data coordination (aggregation and transmission) designs, we show that neither the low data rate nor the low duty cycle of low-power wide-area networks imposes fundamental issues in handling the considerable amount of data created by complex seismic scenarios, as long as the application is delay tolerant. In order to confirm this claim, we cast our ideas into a practical large-scale networking design for simultaneous seismic monitoring and interferometry and carry out an analysis of the data generation and transmission rates. Finally, we present some results from a small-scale field test in which we have employed our Internet of Things-based wireless nodes for real-time seismic quality control over clouds.

16.
In Italy, as in other countries with high seismic risk, many bridges nowadays deemed 'strategic' for civil protection interventions after an earthquake were built without antiseismic criteria, and therefore their seismic assessment is mandatory. Accordingly, the development of a seismic assessment procedure that gives reliable results and, at the same time, is sufficiently simple to be applied to a large population of bridges in a short time is very useful. In this paper, a displacement-based procedure for the assessment of multi-span RC bridges that satisfies these requirements, called direct displacement-based assessment (DDBA), is proposed. Based on the direct displacement-based design previously developed by Priestley et al., DDBA idealizes the multi-DOF bridge structure as an equivalent SDOF system and hence defines a safety factor in terms of displacement. DDBA was applied to hypothetical bridge configurations. The same structures were also analyzed using a standard force-based approach. The reliability of the two methods was checked by performing incremental dynamic analyses (IDA) with response-spectrum-compatible accelerograms. Copyright © 2012 John Wiley & Sons, Ltd.
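The substitute-structure step shared by direct displacement-based design and assessment can be sketched in a few lines: collapse a multi-DOF bridge (tributary masses m_i and an assumed displaced shape D_i) into an equivalent SDOF system with effective displacement, mass and height. The numbers below are hypothetical; the paper's DDBA adds capacity checks and a displacement-based safety factor on top of this step.

```python
# Equivalent-SDOF (substitute structure) properties from an assumed displaced shape.
import numpy as np

m = np.array([350.0, 420.0, 350.0])        # tributary masses at piers (t)
D = np.array([0.12, 0.20, 0.12])           # assumed displaced shape (m)
H = np.array([7.0, 12.0, 7.0])             # pier heights (m)

D_eff = np.sum(m * D ** 2) / np.sum(m * D)     # effective displacement
m_eff = np.sum(m * D) / D_eff                  # effective mass
H_eff = np.sum(m * D * H) / np.sum(m * D)      # effective height
print(D_eff, m_eff, H_eff)
```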

17.
Rain-gauge networks are often used to provide estimates of area-average rainfall or point rainfalls at ungauged locations. The level of accuracy a network can achieve depends on the total number and locations of gauges in the network. A geostatistical approach for evaluation and augmentation of an existing rain-gauge network is proposed in this study. Through variogram analysis, hourly rainfalls are shown to have higher spatial variability than annual rainfalls, with hourly Mei-Yu rainfalls having the highest spatial variability. A criterion using the ordinary kriging variance is proposed to assess the accuracy of rainfall estimation, using the acceptance probability defined as the probability that the estimation error falls within a desired range. Based on this criterion, the percentage of the total area with acceptable accuracy, Ap, under a given network configuration can be calculated. A sequential algorithm is also proposed to prioritize the rain gauges of the existing network, identify the base network, and relocate non-base gauges. The percentage of the total area with acceptable accuracy is mostly contributed by the base network. In contrast, non-base gauges provide little contribution to Ap and are candidates for removal or relocation. Using a case study in northern Taiwan, the proposed approach demonstrates that the identified base network, which comprises approximately two-thirds of the total rain gauges, can achieve almost the same level of performance (expressed in terms of the percentage of the total area with acceptable accuracy) as the complete network for hourly Mei-Yu rainfall estimation. The percentage of area with acceptable accuracy can be raised from 56% to 88% using an augmented network. A threshold value for the percentage of area with acceptable accuracy is also recommended to help determine the number of non-base gauges that need to be relocated. Copyright © 2007 John Wiley & Sons, Ltd.
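The acceptance-probability criterion can be sketched directly: treating the ordinary-kriging estimation error at a location as Gaussian with zero mean and variance equal to the kriging variance, the probability that the error falls within a tolerance of +/- e is Phi(e/sigma) - Phi(-e/sigma), and a grid point is deemed acceptable when this probability exceeds a chosen level. The kriging standard errors, tolerance and required probability below are hypothetical.

```python
# Acceptance probability and share of "acceptable" area from kriging variances.
import numpy as np
from scipy.stats import norm

kriging_std = np.array([2.0, 3.5, 5.0, 8.0])    # kriging standard errors (mm/h)
tolerance = 5.0                                  # acceptable error band (mm/h)
required = 0.80                                  # required acceptance probability

accept_prob = norm.cdf(tolerance / kriging_std) - norm.cdf(-tolerance / kriging_std)
acceptable = accept_prob >= required
pct_area = 100.0 * acceptable.mean()             # share of grid points with acceptable accuracy
print(accept_prob.round(3), pct_area)
```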

18.
This paper explores the use of planview morphological metrics to quantitatively describe and distinguish mixed bedrock-alluvial multichannel networks from alluvial multichannel networks. The geometries of the channel planforms of two bedrock-constrained networks (the Mekong and Orange rivers) are compared with those of the classic alluvial anastomosed Upper Columbia River and the wandering Ganga River. Widely recognized indices utilized include channel link count and channel sinuosity, with additional emphasis given to the less common metrics of network bifurcation angles and island shape characteristics (i.e. aspect ratio, compactness, roundness and convexity). Link count data, with one notable exception, conform to theoretical expectations. Bifurcation angles for all four multichannel rivers are significantly greater than angles reported for braiding rivers. Island convexity clearly discriminates the two alluvial rivers from the two bedrock-influenced rivers. The width of the macrochannel in which each network develops has a positive influence on the number of channel links and is further related to channel slope variations which, in turn, are influenced by terrain structure revealed using trend-surface analysis. The geometries of multichannel networks are often laterally constrained such that the values of channel bifurcation angles and link sinuosity reduce as the network intensifies and channel links are shortened. These latter observations go some way towards explaining the oft-noted relatively 'straight' links seen within multichannel networks, which are a necessary adjustment to the space-filling constraints placed on a network. Copyright © 2013 John Wiley & Sons, Ltd.
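Island shape indices of the kind used here can be computed from a polygon in a few lines, as sketched below for a hypothetical island: aspect ratio from the minimum rotated rectangle, compactness as 4*pi*A/P**2, and convexity as the ratio of the polygon area to its convex-hull area. The exact definitions adopted in the paper may differ slightly from these common forms.

```python
# Planview island shape metrics for a toy polygon.
import math
from shapely.geometry import Polygon

island = Polygon([(0, 0), (6, 0.5), (9, 2), (6, 3.5), (1, 3), (-0.5, 1.5)])

rect = island.minimum_rotated_rectangle
xs, ys = rect.exterior.coords.xy
sides = sorted(math.dist((xs[i], ys[i]), (xs[i + 1], ys[i + 1])) for i in range(2))
aspect_ratio = sides[1] / sides[0]
compactness = 4 * math.pi * island.area / island.length ** 2   # .length = perimeter
convexity = island.area / island.convex_hull.area
print(aspect_ratio, compactness, convexity)
```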

19.
Lifeline systems, such as water distribution and gas supply networks, usually cover large areas. For these systems, seismic design is always a difficult problem because of the complexity of large-scale networks. In this paper, a topology optimization technique for lifeline networks is established. First, in order to speed up the convergence of the optimization process, an element investment importance analysis is carried out to evaluate the importance of components to the lifeline network. A topology optimization model is then established. The aim of the model is to find the least-cost network topology such that the seismic reliability between the sources and each terminal satisfies prescribed reliability constraints. For this optimization problem, a genetic algorithm, which takes network topologies as the individuals of its population, is used to search for optimal solutions by means of suitable operators, including selection, crossover and mutation. The capacity of the proposed algorithm is illustrated by its application to a simple example network consisting of 10 nodes and to an actual network with 391 nodes located in a large city of China. Copyright © 2008 John Wiley & Sons, Ltd.
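The kind of fitness function such a genetic algorithm evaluates for each candidate topology (a binary vector selecting which candidate pipes to build) can be sketched as total construction cost plus a large penalty whenever the estimated source-terminal reliability falls below the prescribed constraint. The reliability estimator below is a crude Monte Carlo connectivity check, and the network, costs, reliabilities and penalty are all hypothetical; the paper's formulation and reliability evaluation may differ.

```python
# Cost-plus-penalty fitness for a candidate lifeline topology.
import numpy as np
import networkx as nx

# (u, v, construction cost, seismic survival probability) for each candidate pipe
edges = [("src", "a", 10.0, 0.95), ("a", "t1", 8.0, 0.95),
         ("src", "b", 12.0, 0.97), ("b", "t1", 9.0, 0.96), ("a", "b", 5.0, 0.98)]
rng = np.random.default_rng(3)

def reliability(selection, n_sim=2000):
    """Monte Carlo estimate of P(src connected to t1) for the selected edges."""
    ok = 0
    for _ in range(n_sim):
        G = nx.Graph()
        G.add_nodes_from(["src", "t1"])
        for keep, (u, v, _, p) in zip(selection, edges):
            if keep and rng.random() < p:          # pipe built and survives the earthquake
                G.add_edge(u, v)
        ok += nx.has_path(G, "src", "t1")
    return ok / n_sim

def fitness(selection, r_required=0.95, penalty=1e3):
    cost = sum(c for keep, (_, _, c, _) in zip(selection, edges) if keep)
    shortfall = max(0.0, r_required - reliability(selection))
    return cost + penalty * shortfall              # to be minimised by the GA

print(fitness([1, 1, 1, 1, 0]), fitness([1, 1, 0, 0, 0]))
```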

20.
It is well recognized that the time series of hydrologic variables, such as rainfall and streamflow, are significantly influenced by various large-scale atmospheric circulation patterns. The influence of the El Niño-Southern Oscillation (ENSO) on hydrologic variables, through hydroclimatic teleconnection, is recognized throughout the world. Indian summer monsoon rainfall (ISMR) has been shown to be significantly influenced by ENSO. Recently, it was established that the relationship between ISMR and ENSO is modulated by the influence of atmospheric circulation patterns over the Indian Ocean region. The influences of the Indian Ocean dipole (IOD) mode and the equatorial Indian Ocean oscillation (EQUINOO) on ISMR have been established in recent research. Thus, for the Indian subcontinent, hydrologic time series are significantly influenced by ENSO along with EQUINOO. Though the influence of these large-scale atmospheric circulations on large-scale rainfall patterns has been investigated, their influence on basin-scale streamflow is yet to be investigated. In this paper, information on ENSO from the tropical Pacific Ocean and EQUINOO from the tropical Indian Ocean is used, in terms of their corresponding indices, for streamflow forecasting of the Mahanadi River in the state of Orissa, India. To model the complex non-linear relationship between basin-scale streamflow and such large-scale atmospheric circulation information, an artificial neural network (ANN) methodology has been adopted for the present study. Efficient optimization of the ANN architecture is obtained by using an evolutionary optimizer based on a genetic algorithm. This study demonstrates that the use of such large-scale atmospheric circulation information can improve the performance of monthly basin-scale streamflow prediction, which, in turn, helps in better management of water resources. Copyright © 2007 John Wiley & Sons, Ltd.
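A minimal, hypothetical sketch of the modelling idea: feed lagged large-scale climate indices (an ENSO index and an EQUINOO index) together with lagged flow into an ANN to predict next-month streamflow. Synthetic series stand in for the observed Mahanadi data, and the network architecture is fixed here rather than optimised by a genetic algorithm as in the paper.

```python
# ANN streamflow forecast from lagged climate indices on synthetic monthly data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
n = 360                                          # 30 years of monthly data
enso = rng.standard_normal(n)
equinoo = rng.standard_normal(n)
flow = 100 + 20 * np.roll(enso, 2) - 15 * np.roll(equinoo, 1) + 5 * rng.standard_normal(n)

lag = 3
X = np.column_stack([np.roll(enso, k) for k in range(1, lag + 1)] +
                    [np.roll(equinoo, k) for k in range(1, lag + 1)] +
                    [np.roll(flow, 1)])[lag:]
y = flow[lag:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X[:-60], y[:-60])                      # train on all but the last 5 years
print("test R^2:", model.score(X[-60:], y[-60:]))
```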

