21.
Bagher Shirmohammadi, Hamidreza Moradi, Vahid Moosavi, Majid Taie Semiromi, Ali Zeinali 《Natural Hazards》2013,69(1):389-402
Drought is regarded as one of the most significant natural hazards, and studying it is important for the design and management of water resources systems. This research was carried out to evaluate the ability of Wavelet-ANN and adaptive neuro-fuzzy inference system (ANFIS) techniques to forecast meteorological drought in the southeastern part of East Azerbaijan province, Iran. The Wavelet-ANN and ANFIS models were first trained using observed data recorded from 1952 to 1992 and then used to predict meteorological drought over the test period extending from 1992 to 2011. The performances of the different models were evaluated by comparing the corresponding values of root mean squared error, coefficient of determination (R²), and the Nash–Sutcliffe model efficiency coefficient. In this study, more than 1,000 model structures, including artificial neural network (ANN), ANFIS, and Wavelet-ANN models, were tested in order to assess their ability to forecast meteorological drought one, two, and three time steps (6 months) ahead. It was demonstrated that the wavelet transform can improve meteorological drought modeling, and that ANFIS models provided more accurate predictions than ANN models. This study confirmed that the optimum number of neurons in the hidden layer cannot always be determined using specific formulas; hence, it should be found by trial and error. Likewise, the decomposition level in the wavelet transform should be chosen according to the periodicity and seasonality of the data series. In order of decreasing accuracy, the models rank as follows: Wavelet-ANFIS, Wavelet-ANN, ANFIS, and ANN. To the best of our knowledge, no published research has explored coupling wavelet analysis with ANFIS for meteorological drought, or tested the efficiency of these models for forecasting meteorological drought at different time scales.
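As a rough illustration of the decomposition step described above, the sketch below applies a multilevel Haar discrete wavelet transform to a synthetic monthly drought-index series (a stand-in for SPI-type data; the function names and the series itself are illustrative, not taken from the paper). In a Wavelet-ANN setup, the resulting trend and detail bands would be fed to the network instead of the raw series.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient arrays."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                          # pad to even length if needed
        x = np.append(x, x[-1])
    a = (x[0::2] + x[1::2]) / np.sqrt(2)    # low-pass: smoothed trend
    d = (x[0::2] - x[1::2]) / np.sqrt(2)    # high-pass: local fluctuations
    return a, d

def multilevel(x, levels):
    """Decompose a series into `levels` detail bands plus a final trend."""
    details = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    return a, details

# Synthetic 20-year monthly drought-index series with annual seasonality
t = np.arange(240)
spi = np.sin(2 * np.pi * t / 12) + 0.1 * np.random.default_rng(0).normal(size=240)
trend, details = multilevel(spi, levels=3)
```

Because the Haar transform is orthonormal, the decomposition preserves the total energy of the series, which is a quick sanity check on the implementation; the choice of `levels` is exactly the decomposition-level decision the abstract says should track the data's seasonality.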
22.
Sasan Barak, Marziye Yousefi, Hamidreza Maghsoudlou, Sanaz Jahangiri 《Stochastic Environmental Research and Risk Assessment (SERRA)》2016,30(4):1167-1187
In recent decades, energy consumption and its environmental impacts have been among the most important ongoing challenges. Agriculture plays a key role in the world economy, and a great amount of energy from different sources is used in this sector. Since researchers have reported a high degree of inefficiency in developing countries, modern management of cropping systems needs to consider all factors (economics, energy, and environment) in the decision-making process simultaneously. Therefore, the aim of this study is to apply Multi-Objective Particle Swarm Optimization (MOPSO) to analyze the management system of an agricultural production. In addition to MOPSO, two other optimization algorithms were used to compare the results. Finally, the Taguchi method with metrics analysis was used to tune the algorithms' parameters and choose the best algorithm. Watermelon production in Kerman province was considered as a case study. On average, the three multi-objective evolutionary algorithms reduced average greenhouse gas (GHG) emissions in watermelon production by about 30 %, while output energy and the benefit-cost ratio increased by about 20 and 30 %, respectively. The metrics comparison analysis also determined that MOPSO provided better modeling and optimization results.
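The core of any multi-objective comparison like the one above is Pareto dominance. The following minimal sketch (the alternatives and their objective values are hypothetical, not from the study) filters a set of candidate management plans down to its non-dominated front; MOPSO maintains such an archive while the swarm searches.

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b`
    (all objectives are assumed to be minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical watermelon-production plans as objective vectors:
# (GHG emissions, -output energy, -benefit/cost ratio); maximized
# objectives are negated so that every component is minimized.
alts = [(30.0, -45.0, -1.2),
        (25.0, -40.0, -1.5),
        (35.0, -50.0, -1.1),
        (40.0, -38.0, -1.0)]   # dominated by all three others
front = pareto_front(alts)
```

Negating the maximized objectives (output energy, benefit-cost ratio) is a common convention that lets a single minimization-based dominance test handle mixed objective directions.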
23.
This article is devoted to evaluating destructive earthquakes (magnitude >6) in Iran and determining the properties of their source parameters. First, a database of documented earthquakes was prepared from reliable references, and the causative fault of each event was determined. The geometric parameters of each fault were then presented in full. Critical parameters such as the Maximum Credible Rupture (MCR) and the Maximum Credible Earthquake (MCE) were compiled based on the geometric parameters of the earthquake faults. The calculated parameters were compared with the maximum earthquake and the surface rupture recorded for the earthquake faults. The distance between the epicenters of documented earthquake events and their causative faults was also calculated (the distance was less than 20 km for 90% of the data). Then, the distance between destructive earthquakes (with magnitude greater than 6) and the nearest active fault was calculated. If the estimated distance is less than 20 km and the mechanisms of the active fault and the event are reported to be the same, the active fault is introduced as a probable causative fault of that earthquake. In this process, all available geological, tectonic, and seismotectonic maps, aerial geophysical data, and remote sensing images were evaluated. Based on the quality and importance of the earthquake data, the events were classified into three categories: (1) earthquakes whose causative faults are documented, (2) events with magnitude higher than 7, and (3) events with magnitude between 6 and 7. For each category, related maps and tables were compiled and presented. Some important faults and events are also described throughout the paper. 
As mentioned in this paper, these faults are likely to lie in highly seismic regions with potential for large-magnitude events, as they are long, deep, and bound sectors of the margins characterized by different deformation and coupling rates on the plate interface.
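The fault-association step described above hinges on an epicenter-to-fault distance and a 20 km threshold. A minimal sketch, assuming faults are digitized as lists of vertices and using great-circle distance (the coordinates below are illustrative, not actual events or fault traces):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def nearest_fault_km(epicenter, fault_vertices):
    """Distance from an epicenter to the closest digitized fault vertex."""
    return min(haversine_km(*epicenter, *v) for v in fault_vertices)

# Hypothetical event near the study region and a three-vertex fault trace
event = (38.4, 46.3)
fault = [(38.3, 46.1), (38.35, 46.25), (38.45, 46.4)]
d = nearest_fault_km(event, fault)
candidate = d < 20.0   # the paper's association threshold (plus matching mechanism)
```

A production workflow would measure distance to the fault polyline itself rather than to its vertices, but the vertex approximation keeps the sketch short and errs on the conservative side for densely digitized traces.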
24.
This research evaluates the performance of areal interpolation coupled with dasymetric refinement to estimate different demographic attributes, namely population sub-groups based on race, age structure and urban residence, within consistent census tract boundaries from 1990 to 2010 in Massachusetts. The creation of such consistent estimates facilitates the study of the nuanced micro-scale evolution of different aspects of population, which is impossible using temporally incompatible small-area census geographies from different points in time. Various unexplored ancillary variables, including the Global Human Settlement Layer (GHSL), the National Land-Cover Database (NLCD), parcels, building footprints and the proprietary ZTRAX® dataset, are utilized for dasymetric refinement prior to areal interpolation to examine their effectiveness in improving the accuracy of multi-temporal population estimates. Different areal interpolation methods, including Areal Weighting (AW), Target Density Weighting (TDW), Expectation Maximization (EM) and its data-extended approach, are coupled with different dasymetric refinement scenarios based on these ancillary variables. The resulting consistent small-area estimates of white and black subpopulations, people aged 18–65 and urban population show that dasymetrically refined areal interpolation is particularly effective when the analysis spans a longer time period (1990–2010 instead of 2000–2010) and the enumerated population is sufficiently large (e.g., counts of white vs. black). The results also demonstrate that current census-defined urban areas overestimate the spatial distribution of urban population and that dasymetrically refined areal interpolation improves estimates of urban population. Refined TDW using building footprints or the ZTRAX® dataset outperforms all other methods. 
The implementation of areal interpolation enriched by dasymetric refinement represents a promising strategy to create more reliable multi-temporal and consistent estimates of different population subgroups and thus demographic compositions. This methodological foundation has the potential to advance micro-scale modeling of various subpopulations, particularly urban populations, to inform studies of urbanization and population change over time as well as future population projections.
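Plain areal weighting (AW), the baseline method in the comparison above, redistributes each source zone's population in proportion to its area of overlap with each target zone. A minimal sketch with hypothetical zone figures (two 1990 tracts reallocated into two 2010 tracts):

```python
def areal_weighting(source_pops, source_areas, intersect_areas):
    """Areal-weighting interpolation: each target zone receives population
    from each source zone in proportion to the share of the source zone's
    area that falls inside the target zone.

    intersect_areas[t][s] = overlap area between target t and source s.
    """
    targets = []
    for row in intersect_areas:
        est = sum(pop * (a / area)
                  for pop, area, a in zip(source_pops, source_areas, row))
        targets.append(est)
    return targets

# Hypothetical inputs (areas in km^2)
pops  = [1000.0, 500.0]     # source-zone populations
areas = [10.0, 5.0]         # source-zone total areas
overlap = [[8.0, 1.0],      # target 0: 8 km^2 of source 0, 1 km^2 of source 1
           [2.0, 4.0]]      # target 1: the remainder of each source zone
est = areal_weighting(pops, areas, overlap)
```

Note that when the overlap areas exhaust each source zone, the method is volume-preserving: the target estimates sum to the original source population, which is the standard pycnophylactic check for this family of methods.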
28.
Hamidreza Zoraghein, Stefan Leyk 《International journal of geographical information science》2018,32(10):1948-1976
To assess micro-scale population dynamics effectively, demographic variables should be available over temporally consistent small-area units. However, fine-resolution census boundaries often change between survey years. This research advances areal interpolation methods with dasymetric refinement to create accurate, consistent population estimates for 1990 and 2000 (source zones) within tract boundaries of the 2010 census (target zones) for five demographically distinct counties in the US. Three levels of dasymetric refinement of source and target zones are evaluated. First, residential parcels are used as a binary ancillary variable prior to regular areal interpolation methods. Second, Expectation Maximization (EM) and its data-extended version leverage housing types of residential parcels as a related ancillary variable. Finally, a third refinement strategy uses road buffers and developed land-cover classes to mitigate the overestimation effect of large residential parcels in rural areas. Results suggest that all three levels of dasymetric refinement are effective in reducing estimation errors. They provide a first insight into the accuracy improvements achievable not only in varying geographic and demographic settings but also through combining different refinement strategies in parts of a study area. Such improved consistent population estimates are the basis for advanced spatio-temporal demographic research.
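The first refinement level above, binary dasymetric refinement with residential parcels, simply restricts the areal weights to residential land before interpolation. A small sketch with hypothetical numbers shows why this changes the estimate relative to plain areal weighting:

```python
def refine_then_interpolate(source_pops, residential_areas, intersect_res_areas):
    """Binary dasymetric refinement: population is assumed to live only in
    residential parcels, so areal weights use residential area alone.

    intersect_res_areas[t][s] = residential area of source s inside target t.
    """
    targets = []
    for row in intersect_res_areas:
        est = sum(pop * (a / res) if res > 0 else 0.0
                  for pop, res, a in zip(source_pops, residential_areas, row))
        targets.append(est)
    return targets

# Hypothetical source tract: 1000 people, 10 km^2 total area, of which
# only 2 km^2 is residential. A target tract covers 5 km^2 of the tract
# overall but 1.5 km^2 (three quarters) of its residential land.
naive   = 1000 * (5.0 / 10.0)                          # plain areal weighting
refined = refine_then_interpolate([1000.0], [2.0], [[1.5]])[0]
```

Because most of the residential land falls inside the target zone, the refined weight (1.5/2.0) assigns it far more of the population than the crude area share (5/10) would, which is exactly the error the binary refinement is meant to correct.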