Similar Documents
20 similar records found.
1.
Virtual California: Fault Model, Frictional Parameters, Applications (cited by 1: 0 self-citations, 1 by others)
Virtual California is a topologically realistic simulation of the interacting earthquake faults in California. Inputs to the model arise from field data and typically include realistic fault-system topologies, long-term slip rates, and frictional parameters. Outputs from the simulations include synthetic earthquake sequences and space-time patterns, together with associated surface deformation and strain patterns similar to those seen in nature. Here we describe details of the data-assimilation procedure we use to construct the fault model and to assign frictional properties. In addition, by analyzing the statistical physics of the simulations, we show that the frictional failure physics, which includes a simple representation of a dynamic stress intensity factor, leads to self-organization of the statistical dynamics and produces empirical statistical distributions (probability density functions: PDFs) that characterize the activity. One type of distribution that can be constructed from empirical measurements of simulation data is the PDF of recurrence intervals on selected faults. Inputs to the simulation dynamics are based on time-averaged event-frequency data, and outputs include PDFs representing measurements of dynamical variability arising from fault interactions and space-time correlations. As a first step toward productively using model-based methods for earthquake forecasting, we propose that simulations be used to generate the PDFs for recurrence intervals instead of the usual practice of basing the PDFs on standard forms (Gaussian, log-normal, Pareto, Brownian passage time, and so forth). Subsequent development of simulation-based methods should include model enhancement, data assimilation and data mining, and analysis techniques based on statistical physics.

2.
The Aki-Utsu maximum likelihood method is widely used for estimating the Gutenberg-Richter b-value, but not all authors are aware of the method's limitations and implicit requirements. The Aki-Utsu method requires a representative estimate of the population mean magnitude, a requirement seldom satisfied in b-value studies, particularly those that use data from small geographic and/or time windows, such as b-mapping and b-vs-time studies. Monte Carlo simulation is used to determine how large a sample must be to achieve representativity, particularly for rounded magnitudes. The size of a representative sample depends only weakly on the actual b-value. It is shown that, for commonly used precisions, small samples give meaningless estimates of b. Our results quantify the probability of obtaining a correct estimate of b, at a given desired precision, for samples of different sizes. We submit that all published studies reporting b-value estimates should state the size of the samples used.
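The estimator and the sample-size experiment described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the true b-value, magnitude cutoff `m_min`, rounding precision `dm`, and sample sizes are all assumed values.

```python
import numpy as np

# Minimal sketch (not the authors' code) of the Aki-Utsu maximum-likelihood
# b-value estimator with Utsu's dm/2 correction for magnitudes rounded to a
# bin width dm.  b_true, m_min, dm and the sample sizes are assumed values.

def aki_utsu_b(mags, m_min, dm):
    """b = log10(e) / (mean(M) - (m_min - dm/2)) for rounded magnitudes."""
    return np.log10(np.e) / (np.mean(mags) - (m_min - dm / 2.0))

def simulate_b_estimates(b_true, n_events, n_trials, m_min=2.0, dm=0.1, seed=0):
    """Spread of the estimator over many synthetic catalogs of size n_events."""
    rng = np.random.default_rng(seed)
    beta = b_true * np.log(10.0)
    estimates = np.empty(n_trials)
    for i in range(n_trials):
        # continuous magnitudes above the completeness threshold m_min - dm/2,
        # then rounded to the reporting precision dm
        mags = (m_min - dm / 2.0) + rng.exponential(1.0 / beta, size=n_events)
        mags = np.round(mags / dm) * dm
        estimates[i] = aki_utsu_b(mags, m_min, dm)
    return estimates

small = simulate_b_estimates(b_true=1.0, n_events=25, n_trials=2000)
large = simulate_b_estimates(b_true=1.0, n_events=2000, n_trials=2000)
# The small-sample estimates scatter far more widely around b_true = 1.
```

Repeating this for a range of sample sizes reproduces the kind of representativity curve the abstract describes.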

3.
2D Monte Carlo versus 2D Fuzzy Monte Carlo health risk assessment (cited by 15: 4 self-citations, 11 by others)
Risk estimates can be calculated using crisp estimates of the exposure variables (i.e., contaminant concentration, contact rate, exposure frequency and duration, body weight, and averaging time). However, aggregate and cumulative exposure studies require a better understanding of the exposure variables and of the uncertainty and variability associated with them. Probabilistic risk assessment (PRA) studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. Two-dimensional Monte Carlo analysis (2D MCA) is one of the advanced modeling approaches that may be used to conduct PRA studies. In this analysis, the variables of the risk equation, along with the parameters of those variables (for example, the mean and standard deviation of a normal distribution), are described by probability density functions (PDFs). A variable described in this way is called a second-order random variable. Significant data, or considerable insight into the associated uncertainty, are necessary to develop appropriate PDFs for these random parameters. Typically, the available data and their accuracy and reliability are not sufficient for conducting a reliable 2D MCA. Thus, other theories and computational methods that propagate uncertainty and variability in exposure and health risk assessment are needed. One such theory is possibility analysis based on fuzzy set theory, which allows the use of incomplete information (vague and imprecise information that is not sufficient to generate probability distributions for the parameters of the random variables of the risk equation) together with expert judgment. In this paper, as an alternative to 2D MCA, we propose a 2D fuzzy Monte Carlo analysis (2D FMCA) to overcome this difficulty. In this approach, instead of describing the parameters of the PDFs that define the variables of the risk equation as random variables, we describe them as fuzzy numbers. This approach introduces new concepts and risk characterization methods. We compare the two approaches with respect to their computational requirements and their data requirements and availability. For a hypothetical case, we also provide a comparative interpretation of the results generated.
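The two-dimensional (nested-loop) Monte Carlo structure described in this abstract can be sketched as follows. The risk equation is reduced to a product of two factors and all distribution choices are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal 2D Monte Carlo sketch (illustrative, not the paper's model).
# Risk = C * IR for simplicity; C is a second-order random variable:
# lognormal with an uncertain log-mean mu ~ Normal(0, 0.3).

rng = np.random.default_rng(1)
N_OUTER, N_INNER = 200, 1000          # uncertainty loop, variability loop

risk_percentiles = np.empty(N_OUTER)  # 95th percentile of risk per outer draw
for j in range(N_OUTER):
    mu = rng.normal(0.0, 0.3)                  # outer: uncertain parameter of C's PDF
    conc = rng.lognormal(mu, 0.5, N_INNER)     # inner: variability in C (mg/L)
    intake = rng.lognormal(0.0, 0.2, N_INNER)  # inner: contact rate (L/kg-day)
    risk = conc * intake
    risk_percentiles[j] = np.percentile(risk, 95)

# The spread of risk_percentiles across outer draws expresses uncertainty
# about the variability distribution itself.
lo, hi = np.percentile(risk_percentiles, [5, 95])
```

The fuzzy variant in the paper replaces the outer probabilistic loop with fuzzy numbers for the distribution parameters; the inner variability loop is unchanged.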

4.
With the potentially devastating consequences of flooding, it is crucial that uncertainties in the modelling process are quantified in flood simulations. In this paper, the impact of uncertainties in design losses on peak flow estimates is investigated. Simulations were carried out using a conceptual rainfall–runoff model called RORB in four catchments along the east coast of New South Wales, Australia. Monte Carlo simulation was used to evaluate parameter uncertainty in design losses associated with three loss models (initial loss–continuing loss, initial loss–proportional loss, and a soil water balance model). The results show that the uncertainty originating from each loss model differs and can be quite significant in some cases. The uncertainty in the initial loss–proportional loss model was found to be the highest, with estimates up to 2.2 times the peak flow, whilst the uncertainty in the soil water balance model was significantly less, with up to 60% variability in peak flows for an annual exceedance probability of 0.02. Applying Monte Carlo simulation yields a better understanding of the predicted flows, providing further support for planning and managing river systems.
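The Monte Carlo step for loss-parameter uncertainty can be sketched as follows. This does not reproduce RORB: the design hyetograph, the prior ranges for the initial loss (IL) and continuing loss (CL), and the use of peak rainfall excess as a proxy for peak flow are all assumptions.

```python
import numpy as np

# Illustrative sketch of propagating IL-CL loss-parameter uncertainty to a
# flood peak (RORB itself is not reproduced; hyetograph and priors assumed).

rng = np.random.default_rng(7)
rain = np.array([2.0, 8.0, 15.0, 22.0, 12.0, 5.0, 1.0])  # mm per time step

def il_cl_excess(rain, il, cl):
    """Rainfall excess after an initial loss il (mm), then a continuing loss cl (mm/step)."""
    excess, remaining_il = [], il
    for r in rain:
        absorbed = min(r, remaining_il)          # satisfy the initial loss first
        remaining_il -= absorbed
        excess.append(max(r - absorbed - cl, 0.0))
    return np.array(excess)

peaks = np.array([
    il_cl_excess(rain, rng.uniform(5.0, 35.0), rng.uniform(0.5, 4.0)).max()
    for _ in range(2000)
])
spread = np.percentile(peaks, 95) / np.percentile(peaks, 50)
# The spread of peak excess across parameter draws expresses the loss-model
# parameter uncertainty propagated to the flood peak.
```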

5.
As part II of a two-paper sequence, the L-moments developed by Hosking (1990) and the LH-moments of Wang (1997) are further investigated. The LH-moments (L through L4) are used to develop the regional parameters of the generalized extreme value (GEV), generalized Pareto (GPA), and generalized logistic (GLO) distributions. These probability distribution functions (PDFs) are evaluated in terms of their performance. Flood peaks given by the corresponding PDFs are compared with those generated by Monte Carlo simulation of randomized data, considering the respective LH-moments. The influence of the LH-moments on the estimated PDFs is studied by evaluating the relative bias (RBIAS) in quantile estimation due to variability of the k parameter. The Karkhe watershed, located in western Iran, was used as the case-study area; part I of this study divided it into regions A and B. The minimum calculated relative root mean square error (RRMSE) and RBIAS between simulated flood peaks and flood peaks from the corresponding PDFs were used in PDF selection, considering the respective LH-moments. Boxplots of the RRMSE tests identified the L3 level of the GPA distribution as the suitable PDF for sample sizes 20 and 80 in region A; similar results were found for the RBIAS test. For region B, the boxplots of the RRMSE tests indicated similar results for the three PDFs, but the boxplots of the RBIAS tests identified the L4 level of the GLO distribution as most suitable for sample sizes 20 and 80. Relative efficiencies of the LH-moments were investigated, measured as RRMSE ratios of the L-moments over the respective LH-moments. For the most part, the findings of this part of the study were similar to those of part I.

6.
In many fields of study, and certainly in hydrogeology, uncertainty propagation is a recurring subject. Usually, parametrized probability density functions (PDFs) are used to represent data uncertainty, which limits their use to particular distributions. Often the problem is instead solved by Monte Carlo simulation, with the disadvantage that a large number of calculations is needed to achieve reliable results. In this paper, a method is proposed based on a piecewise-linear approximation of PDFs. Uncertainty propagation with these discretized PDFs is distribution independent. The method is applied to the upscaling of transmissivity data and carried out in two steps: the vertical upscaling of conductivity values from borehole data to aquifer scale, and the spatial interpolation of the transmissivities. The first step yields complete PDFs of the transmissivities at borehole locations, reflecting the uncertainties in the conductivities and the layer thicknesses. The second step yields a spatially distributed transmissivity field with a complete PDF at every grid cell. We argue that the proposed method is applicable to a wide range of uncertainty-propagation problems.
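The core idea, propagating discretized, piecewise-linear PDFs without assuming a parametric family, can be sketched for one operation. Transmissivity is the product of conductivity and thickness, so working with logs turns it into a sum, whose PDF is the convolution of the two input PDFs. The triangular shapes and ranges below are assumptions, not the paper's data.

```python
import numpy as np

# Sketch of distribution-independent propagation with discretized,
# piecewise-linear PDFs (an illustration of the idea, not the paper's exact
# scheme).  T = K * d, so ln T = ln K + ln d and the PDF of the sum is the
# convolution of the two discretized PDFs.  Shapes/ranges are assumed.

dx = 0.01
x = np.arange(-3.0, 3.0, dx)             # shared grid for ln K and ln d

def triangular_pdf(x, a, c, b):
    """Piecewise-linear (triangular) PDF on [a, b] with mode c."""
    up = np.clip((x - a) / (c - a), 0.0, None) * (x <= c)
    down = np.clip((b - x) / (b - c), 0.0, None) * (x > c)
    return np.where((x >= a) & (x <= b), (2.0 / (b - a)) * (up + down), 0.0)

p_lnK = triangular_pdf(x, -2.0, -0.5, 1.0)   # ln of conductivity
p_lnd = triangular_pdf(x, -1.0, 0.0, 1.0)    # ln of layer thickness

p_lnT = np.convolve(p_lnK, p_lnd) * dx       # PDF of ln T = ln K + ln d
x_lnT = 2.0 * x[0] + dx * np.arange(len(p_lnT))

area = p_lnT.sum() * dx                      # should be ~1
mean_lnT = (x_lnT * p_lnT).sum() * dx        # ~ mean(ln K) + mean(ln d) = -0.5
```

Unlike Monte Carlo, the full output PDF here is obtained from a single deterministic convolution.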

7.
This paper proposes an approach to estimating the uncertainty related to EPA Storm Water Management Model model parameters, percentage routed (PR) and saturated hydraulic conductivity (Ksat), which are used to calculate stormwater runoff volumes. The methodology proposed in this paper addresses uncertainty through the development of probability distributions for urban hydrologic parameters through extensive calibration to observed flow data in the Philadelphia collection system. The established probability distributions are then applied to the Philadelphia Southeast district model through a Monte Carlo approach to estimate the uncertainty in prediction of combined sewer overflow volumes as related to hydrologic model parameter estimation. Understanding urban hydrology is critical to defining urban water resource problems. A variety of land use types within Philadelphia coupled with a history of cut and fill have resulted in a patchwork of urban fill and native soils. The complexity of urban hydrology can make model parameter estimation and defining model uncertainty a difficult task. The development of probability distributions for hydrologic parameters applied through Monte Carlo simulations provided a significant improvement in estimating model uncertainty over traditional model sensitivity analysis. Copyright © 2013 John Wiley & Sons, Ltd.

8.
The parameter m in the Ishimoto-Iida relation was investigated for acoustic emissions (AEs) occurring in rock samples under uniaxial compression. In the experiment we found that: (1) large AEs are counted without serious error, but the number of small AEs is systematically underestimated at high AE rates; (2) the frequency distribution of maximum AE amplitudes becomes nonlinear on a logarithmic scale with increasing AE rate; and (3) there is a strong negative correlation between m-value and AE rate. The miscount of small AEs was interpreted as due to overlap of the large and small AEs; we call this miscount the masking effect. A statistical analysis based on the masking effect showed that the m-value decreases more strongly as the AE rate increases, and thus the masking effect is a possible origin both of the nonlinear frequency distribution of maximum AE amplitudes and of the negative correlation of m-value with AE rate. We emphasize that one should beware of the masking effect in order to examine changes of m-value correctly. To eliminate the masking effect, AEs should be measured with a low-sensitivity measurement system. Even when the masking effect is eliminated, the m-value decreases before the main fracture of a rock sample. The m-value is thus a key parameter for predicting the main fracture.

9.
In weather forecasting, current and past observational data are routinely assimilated into numerical simulations to produce ensemble forecasts of future events in a process termed “model steering”. Here we describe a similar approach that is motivated by analyses of previous forecasts of the Working Group on California Earthquake Probabilities (WGCEP). Our approach is adapted to the problem of earthquake forecasting using topologically realistic numerical simulations for the strike-slip fault system in California. By systematically comparing simulation data to observed paleoseismic data, a series of spatial probability density functions (PDFs) can be computed that describe the probable locations of future large earthquakes. We develop this approach and show examples of PDFs associated with magnitude M > 6.5 and M > 7.0 earthquakes in California.

10.
The correlation between the b-values of acoustic emissions (AEs) and the phase of the moon was investigated at the Underground Research Laboratory (URL) in Canada. The same data as those used in Iwata (2002) were examined, which showed that the occurrence of AEs is correlated with the phase of the moon. It was expected, therefore, that the b-value of the AEs would also be sensitive to tidal stress/strain fluctuations. We investigated the variation of the b-values as a function of the phase of the moon. Results show that b-values immediately following the times of full/new moon are higher than those at other times. Using AIC (Akaike Information Criterion) and random (Monte Carlo) simulations, it was confirmed that this feature is statistically significant. We also investigated whether or not there was a change in the b-values immediately before the times of full/new moon, but no statistically significant change was observed. The results suggest that the effect of stress/strain fluctuations on AE occurrences at the URL is asymmetric to the times of full/new moon.

11.
Existing channel flood-routing models can simulate only a single variable (discharge or stage). To address this problem, a generalized improvement of the Muskingum model is proposed, based on the flow continuity equation and two different expressions for reach storage (storage equals the mean flow cross-sectional area times the reach length, and storage equals the mean reach discharge times the travel time), yielding a coupled two-variable general routing model. Flood-season data from 16 reaches in four major river systems (including inland rivers and rivers flowing to the sea) were selected for model testing; the validation considered geographic extent, differing reach and hydraulic characteristics, and flood magnitude, comprehensively testing the rationality of the model structure and its effectiveness in simulating actual floods. A comparison with the traditional Muskingum method shows that the coupled two-variable general routing model achieves higher simulation accuracy, is somewhat more stable, and has better generality.

12.
In risk analysis, a complete characterization of the concentration distribution is necessary to determine the probability of exceeding a threshold value. The most popular method for predicting the concentration distribution is Monte Carlo simulation, which samples the cumulative distribution function with a large number of repeated operations. In this paper, we first review the three most commonly used Monte Carlo (MC) techniques: standard Monte Carlo, Latin hypercube sampling, and quasi-Monte Carlo, and investigate their performance. We then apply the stochastic collocation method (SCM) to risk assessment. Unlike the MC simulations, the SCM does not require a large number of simulations of the flow and solute equations. In particular, the sparse grid collocation method and the probabilistic collocation method are employed to represent the concentration in terms of polynomials and unknown coefficients. The sparse grid collocation method takes advantage of Lagrange interpolation polynomials, while the probabilistic collocation method relies on polynomial chaos expansions. In both methods, the stochastic equations are reduced to a system of decoupled equations, which can be solved with existing solvers and whose results are used to obtain the expansion coefficients. The cumulative distribution function is then obtained by sampling the approximate polynomials. Our synthetic examples show that, among the MC methods, quasi-Monte Carlo gives the smallest variance for the predicted threshold probability owing to its superior convergence, and that the stochastic collocation method is an accurate and efficient alternative to MC simulations.
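The variance advantage of stratified sampling for a threshold probability can be sketched by comparing standard Monte Carlo with Latin hypercube sampling on a single input. The lognormal "concentration" and the threshold below are arbitrary stand-ins, not values from the paper.

```python
import numpy as np
from statistics import NormalDist

# Sketch comparing standard Monte Carlo with Latin hypercube sampling (LHS)
# for a threshold-exceedance probability.  The lognormal concentration model
# and the threshold are illustrative assumptions.

rng = np.random.default_rng(3)
THRESHOLD, N, REPS = 4.0, 500, 100

def concentration(u):
    """Map uniform(0,1) draws to lognormal concentrations via the normal inverse CDF."""
    z = np.array([NormalDist().inv_cdf(ui) for ui in u])
    return np.exp(0.5 + 1.0 * z)

def exceedance(u):
    return float(np.mean(concentration(u) > THRESHOLD))

# Standard MC: i.i.d. uniforms.
p_mc = [exceedance(rng.uniform(size=N)) for _ in range(REPS)]

# LHS: one uniform draw per equal-probability stratum, then shuffled.
strata = (np.arange(N) + 0.5) / N
p_lhs = []
for _ in range(REPS):
    u = strata + (rng.uniform(size=N) - 0.5) / N
    rng.shuffle(u)
    p_lhs.append(exceedance(u))

var_mc, var_lhs = np.var(p_mc), np.var(p_lhs)
# Stratifying the single input makes the LHS estimator variance far smaller.
```

Quasi-Monte Carlo would replace the random uniforms with a low-discrepancy sequence; the comparison logic is the same.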

13.
Recreational fishing is a popular activity in many urbanized watersheds. When river water is incidentally ingested during fishing sessions, substantial waterborne fecal contamination can cause adverse health effects. This study aims to spatially map health risks for recreational fishers caused by waterborne Escherichia coli (E. coli) in the highly urbanized Tamsui River watershed. First, indicator kriging was used to probabilistically estimate the distributions of waterborne E. coli and determine the conditional cumulative distribution function (CCDF). Subsequently, to propagate the parameter variability, Monte Carlo simulation was adopted to characterize the ingestion rate and exposure duration for recreational fishers and E. coli realizations were generated using random fields on the basis of the estimated CCDF. Finally, after the three parameters were combined, the approximate beta-Poisson dose–response function was employed to quantitatively determine potential risks to recreational fishers in the Tamsui River and its tributaries. The analysis results revealed that the risks of recreational fishing exceed an acceptable level of 8 infections per 1000 fishers per day at several urban river courses. Therefore, recreational fishing activities in urban riverbanks pose a substantial health threat. Recreational fishing in urban riverbanks should be limited before the construction of complete sanitary sewer systems. The river mouth and certain upstream river sections are suitable for the development of recreational fishing.
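The final dose-response step can be sketched as follows. The approximate beta-Poisson form is standard, but the alpha/beta parameters and all exposure distributions here are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np

# Sketch of the approximate beta-Poisson dose-response step with Monte Carlo
# over exposure parameters.  ALPHA, BETA and the concentration/ingestion
# distributions are assumptions, not the paper's calibrated values.

rng = np.random.default_rng(42)
ALPHA, BETA = 0.145, 7.59          # illustrative beta-Poisson parameters
N = 10_000

conc = rng.lognormal(np.log(50.0), 1.0, N)     # E. coli per 100 mL (assumed)
ingest = rng.uniform(1.0, 5.0, N)              # mL ingested per session (assumed)
dose = conc * ingest / 100.0                   # organisms ingested

# Approximate beta-Poisson: P(infection) = 1 - (1 + dose/beta)^(-alpha)
p_inf = 1.0 - (1.0 + dose / BETA) ** (-ALPHA)

risk_per_1000 = 1000.0 * np.mean(p_inf)        # compare against 8 per 1000
```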

14.
A statistical riverine litter propagation (RLP) model based on importance sampling Monte Carlo (ISMC) simulation was developed in order to predict the frequency distribution of certain litter types in river reaches. The model was preliminarily calibrated for plastic sheeting in a pilot study conducted on the River Taff, Wales (UK). Litter movement was predominantly controlled by reach characteristics, such as vegetation overhang and water-course obstructions. These effects were modeled in the simulations by utilizing geometric distributions of river reaches in the time domain. The proposed model satisfactorily simulated the dosing experiments performed on the River Taff. It was concluded from the preliminary calibrations that the RLP model can be efficiently utilized to portray litter propagation at any arbitrarily selected river site, provided that the stream flows and reach characteristics are calibrated by representative probability distributions of similar sections. The RLP model can therefore be considered a new statistical technique for predicting litter propagation in river sections.

15.
The Korba aquifer, located in the north of Tunisia, suffers heavily from salinization due to seawater intrusion. In 2000, the aquifer was exploited by more than 9000 wells, yet no precise records exist of the current extraction rates, their spatial distribution, or their evolution in time. In this study, a geostatistical model of the exploitation rates was constructed from a multi-linear regression model combining incomplete direct data and exhaustive secondary information. The impacts of uncertainty in the spatial distribution of the pumping rates on seawater intrusion were evaluated using a 3-D density-dependent groundwater model. To circumvent the large amount of computing time required to run transient models, the simulations were run in parallel on the Grid infrastructure provided by the Enabling Grid for E-Science in Europe project. The Monte Carlo simulation results showed that 8.3% of the aquifer area is affected by the input uncertainty.

Citation: Kerrou, J., Renard, P., Lecca, G. & Tarhouni, J. (2010) Grid-enabled Monte Carlo analysis of the impacts of uncertain discharge rates on seawater intrusion in the Korba aquifer (Tunisia). Hydrol. Sci. J. 55(8), 1325–1336.

16.
Three kinds of widely used cloudiness parameterizations are compared with data produced from cloud-resolving model (CRM) simulations of a tropical cloud system. The investigated schemes include those based on relative humidity (RH), the semi-empirical scheme using cloud condensate as a predictor, and the statistical scheme based on probability distribution functions (PDFs). Results show that all three schemes are successful in reproducing the timing of cloud generation, except that the RH-based scheme artificially produces low-level clouds during cloudless days. In contrast, the low-level clouds are well simulated by the semi-empirical and PDF-based statistical schemes, both of which are close to the explicit CRM simulations. In addition to the Gaussian PDF, two alternative PDFs are also explored to investigate the impact of different PDFs on cloud parameterization. All the PDF-based parameterizations are found to be inaccurate for high-cloud simulations, in either magnitude or structure. The primary reason is that the investigated PDFs are assumed symmetric, yet the skewness in deep convective cloud regimes is highly significant, indicating that the symmetry assumption is not satisfied in those regimes. The results imply the need for a skewed PDF in statistical schemes so that they perform better in high-cloud simulations.

17.
Community-scale simulations were performed to investigate the risk to groundwater and indoor-air receptors downgradient of a contaminated site following the remediation of a long-term source. Six suites of Monte Carlo simulations were performed using a numerical model that accounted for groundwater flow, reactive solute transport, soil gas flow, and vapour intrusion into buildings. The model was applied to a three-dimensional, community-scale (250 m × 1000 m × 14 m) domain containing heterogeneous, spatially correlated distributions of the hydraulic conductivity, fraction of organic carbon, and biodegradation rate constant, which were varied between realizations. The analysis considered results from individual realizations as well as from each suite of Monte Carlo simulations, expressed through several novel, integrated parameters such as the probability of exceeding a regulatory standard in either groundwater or indoor air. Results showed that exceedance probabilities varied considerably with the consideration of biodegradation in the saturated zone, and were less sensitive to changes in the variance of hydraulic conductivity or to the incorporation of heterogeneous distributions of organic carbon at this spatial scale. A sharp gradient in exceedance probability existed at the lateral edges of the plumes due to variability in lateral dispersion, which defined a narrow region of exceedance uncertainty. Differences in exceedance probability between realizations (i.e., due to heterogeneity uncertainty) were similar to differences attributed to changes in the variance of hydraulic conductivity or fraction of organic carbon. Simulated clean-up times, defined by reaching an acceptable exceedance probability, were found to be on the order of decades to centuries in these community-scale domains. Results also showed that the choice of the acceptable exceedance probability level (e.g., 1% vs. 5%) would likely affect clean-up times on the order of decades. Moreover, in the scenarios examined here, the risk of exceeding indoor-air standards was greater than that of exceeding groundwater standards at all times and places. Overall, simulations of coupled transport processes, combined with novel spatial and temporal quantification metrics for Monte Carlo analyses, provide practical tools for assessing risk in wider communities when considering site remediation.
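The exceedance-probability metric and its link to clean-up time can be sketched as follows. Synthetic exponentially decaying plumes stand in for the coupled-transport model; the standard and decay parameters are assumptions.

```python
import numpy as np

# Sketch of the exceedance-probability metric: across Monte Carlo
# realizations, the probability of exceeding a regulatory standard at each
# time is the fraction of realizations above it.  Synthetic decaying plumes
# stand in for the transport model; all parameter values are assumed.

rng = np.random.default_rng(11)
STANDARD = 5.0                          # regulatory limit (assumed units)
N_REAL, N_TIMES = 500, 300              # realizations x years
t = np.arange(N_TIMES)

c0 = rng.lognormal(np.log(50.0), 0.6, N_REAL)   # initial concentrations
k = rng.uniform(0.02, 0.08, N_REAL)             # first-order decay rates (1/yr)
conc = c0[:, None] * np.exp(-np.outer(k, t))    # one decaying curve per realization

p_exceed = np.mean(conc > STANDARD, axis=0)     # exceedance probability vs time
cleanup_5pct = int(np.argmax(p_exceed <= 0.05)) # first year at or below 5%
cleanup_1pct = int(np.argmax(p_exceed <= 0.01)) # first year at or below 1%
# The stricter acceptable level implies a longer clean-up time, as the
# abstract notes.
```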

18.
Soils in post‐wildfire environments are often characterized by a low infiltration capacity with a high degree of spatial heterogeneity relative to unburned areas. Debris flows are frequently initiated by run‐off in recently burned steeplands, making it critical to develop and test methods for incorporating spatial variability in infiltration capacity into hydrologic models. We use Monte Carlo simulations of run‐off generation over a soil with a spatially heterogeneous saturated hydraulic conductivity (Ks) to derive an expression for an areally averaged saturated hydraulic conductivity that depends on the rainfall rate, the statistical properties of Ks, and the spatial correlation length scale associated with Ks. The proposed method is tested by simulating run‐off on synthetic topography over a wide range of spatial scales. The results provide a simplified expression for an effective saturated hydraulic conductivity that relates a distribution of small‐scale Ks measurements to infiltration and run‐off generation over larger spatial scales. Finally, we use a hydrologic model based on this effective conductivity to simulate run‐off and debris‐flow initiation at a recently burned catchment in the Santa Ana Mountains, CA, USA, and compare the results to those obtained using an infiltration model based on the Soil Conservation Service Curve Number.
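The dependence of areally averaged infiltration on rainfall rate can be sketched with a simple area-weighted rule. This is only an illustration of why the effective value depends on rain rate; the paper's derived expression additionally involves the Ks correlation length, which is ignored here, and the lognormal Ks parameters are assumed.

```python
import numpy as np

# Illustrative sketch: how a distribution of point-scale Ks measurements
# maps to an areally averaged infiltration capacity at a given rainfall
# rate.  This simple area-weighted rule is NOT the paper's expression
# (which also involves the Ks spatial correlation length).

rng = np.random.default_rng(9)
ks = rng.lognormal(np.log(10.0), 1.0, 10_000)   # point Ks samples, mm/h (assumed)

def effective_infiltration(rain_rate, ks):
    """Infiltration is Ks-limited where Ks < rain rate, rainfall-limited elsewhere."""
    return float(np.mean(np.minimum(ks, rain_rate)))

rates = [5.0, 20.0, 80.0]
f_eff = [effective_infiltration(r, ks) for r in rates]
runoff = [r - f for r, f in zip(rates, f_eff)]
# Higher rain rates engage more of the low-Ks area, so the effective
# infiltration capacity rises with rain rate but saturates near mean(Ks).
```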

19.
Data assimilation is widely used to improve flood forecasting capability, especially through parameter inference, which requires statistical information on the uncertain input parameters (upstream discharge, friction coefficient) as well as on the variability of the water level and its sensitivity with respect to the inputs. For a particle filter or ensemble Kalman filter, stochastically estimating probability density functions and covariance matrices from Monte Carlo random sampling requires a large ensemble of model evaluations, limiting their use in real-time applications. To tackle this issue, fast surrogate models based on polynomial chaos and Gaussian processes can be used to represent the spatially distributed water level in place of solving the shallow water equations. This study investigates the use of these surrogates to estimate probability density functions and covariance matrices at reduced computational cost and without loss of accuracy, in the perspective of ensemble-based data assimilation. The study focuses on 1-D steady-state flow simulated with MASCARET over the Garonne River (south-west France). Results show that both surrogates perform similarly to Monte Carlo random sampling, but for a much smaller computational budget; a few MASCARET simulations (on the order of 10–100) are sufficient to accurately retrieve covariance matrices and probability density functions all along the river, even where the flow dynamics are more complex due to heterogeneous bathymetry. This paves the way for surrogate strategies suitable for representing unsteady open-channel flows in data assimilation.

20.
Numerical simulations of non-ergodic transport of a non-reactive solute plume by steady-state groundwater flow under a uniform mean velocity U were conducted in a three-dimensional, heterogeneous, statistically isotropic aquifer. The hydraulic conductivity, K(x), is modeled as a random field assumed to be log-normally distributed with an exponential covariance. Significant effort was made to reduce the simulation uncertainties. Ensemble averages of the second spatial moments of the plume and of the plume centroid variances were simulated with 1600 Monte Carlo (MC) runs for three variances of log K, σY² = 0.09, 0.23, and 0.46, and a square source normal to U of three dimensionless lengths. It is shown that 1600 MC runs are needed to obtain stabilized results in mildly heterogeneous aquifers with σY² ≤ 0.5, and that large uncertainty may exist in the simulated results if fewer MC runs are used, especially for the transverse second spatial moments and the plume centroid variance in the transverse directions. The simulated longitudinal second spatial moment and the plume centroid variance in the longitudinal direction fit the first-order theoretical results well, while the simulated transverse moments are generally larger than the first-order values. The ergodic condition for the second spatial moments is far from being reached in all cases simulated, and transport in the transverse directions may reach ergodicity much more slowly than in the longitudinal direction.
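The two ensemble statistics in this abstract can be sketched for a toy transport problem. A one-dimensional random walk with a run-to-run velocity fluctuation stands in for flow through the heterogeneous K field; all parameter values are assumptions.

```python
import numpy as np

# Sketch of the ensemble statistics above: for each Monte Carlo run, track a
# particle cloud, then compute the ensemble-averaged second spatial moment
# (spread about each run's own centroid) and the centroid variance across
# runs.  A 1-D random walk stands in for the heterogeneous-flow model.

rng = np.random.default_rng(5)
N_MC, N_PART, N_STEPS = 200, 500, 50
U, D, DT = 1.0, 0.01, 0.1              # mean velocity, dispersion, time step

centroids = np.empty(N_MC)
moments = np.empty(N_MC)
for i in range(N_MC):
    x = np.zeros(N_PART)
    drift = U + rng.normal(0.0, 0.3)   # run-to-run velocity fluctuation
    for _ in range(N_STEPS):
        x += drift * DT + rng.normal(0.0, np.sqrt(2 * D * DT), N_PART)
    centroids[i] = x.mean()
    moments[i] = np.mean((x - x.mean()) ** 2)   # second spatial moment per run

ensemble_moment = moments.mean()       # mean spread about each run's centroid
centroid_var = centroids.var()         # variance of the plume centroid position
```

Non-ergodicity appears here as the gap between the centroid variance (dominated by run-to-run velocity differences) and the within-run spread.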
