Similar documents
20 similar documents found (search time: 31 ms)
1.
We report an analysis of the mechanisms responsible for interannual variability in the Greenland–Iceland–Norwegian (GIN) Seas in a control integration of the HadCM3 coupled climate model. Interannual variability in sea surface temperature (SST) and sea surface salinity (SSS) is dominated by a quasi-periodic ∼7-year signal. Analyses show that the mechanism involves a competition between convection and advection. Advection carries cold, fresh Arctic water over warm, salty Atlantic water, while convection periodically mixes these two water masses vertically, raising SST. Convection is able to raise SST because of the presence of a subsurface temperature maximum. The GIN Seas convection in HadCM3 is forced by wind stress anomalies related to the North Atlantic Oscillation (NAO). The consequent SST anomalies feed back positively to force the atmosphere, resulting in a weak spectral peak (at ∼7 years) in GIN Seas sea level pressure. Although there is no evidence of a similar oscillation in reality, key aspects of the simulated mechanism may be relevant to understanding variability in the real GIN Seas. In particular, the potential for increases in convection to raise SST offers a possible new explanation for increases in SST that occurred between the 1960s and the late 1980s/early 1990s. These SST increases may have contributed to the observed sea-ice retreat. In addition, a positive feedback between GIN Seas SST and the atmosphere could contribute to the persistence of the NAO, potentially helping to explain its red spectrum or recent northeastward shift.
Sonia R. Gamiz-Fortis
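Illustrative sketch (not from the paper): one way a quasi-periodic ∼7-year signal can be identified in an SST index is with a simple periodogram. The series below is synthetic and only stands in for a GIN Seas SST index from a control run.

```python
import numpy as np
from scipy.signal import periodogram

# Synthetic monthly SST-anomaly index with a weak ~7-year oscillation plus noise,
# standing in (hypothetically) for a GIN Seas SST index from the control run.
rng = np.random.default_rng(5)
t = np.arange(300 * 12) / 12.0                       # 300 years, monthly, time in years
sst = 0.3 * np.sin(2 * np.pi * t / 7.0) + rng.normal(0.0, 0.5, t.size)

freq, power = periodogram(sst, fs=12.0)              # fs = 12 samples per year
peak = freq[np.argmax(power[1:]) + 1]                # skip the zero frequency
print(f"dominant period ~ {1.0 / peak:.1f} years")
```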

2.
Extreme atmospheric events are intimately related to the statistics of atmospheric turbulent velocities. These, in turn, exhibit multifractal scaling, which determines the nature of the asymptotic behavior of the velocities and whose parameters are therefore of great current interest. We combine singular value decomposition techniques and wavelet transform analysis to generalize the multifractal formalism to vector-valued random fields. The so-called Tensorial Wavelet Transform Modulus Maxima (TWTMM) method is calibrated on synthetic self-similar 2D vector-valued multifractal measures and monofractal 3D vector-valued fractional Brownian fields. We report the results of applications of the TWTMM method to turbulent velocity and vorticity fields generated by direct numerical simulations of the incompressible Navier–Stokes equations. This study reveals the existence of an intimate relationship between the singularity spectra of these two vector fields, which are found to be significantly more intermittent than previously estimated from longitudinal and transverse velocity increment statistics.
Alain Arneodo

3.
This paper addresses the problem of novelty detection in the case that the observed data are a mixture of a known ‘background’ process and an unknown contaminating process, which generates the outliers, or novel observations. The framework we describe here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, pdf) of the data generated by the ‘background’ process. The relative proportion of this ‘background’ component (the prior ‘background’ probability), the pdf and the prior probabilities of all other components are all assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known ‘background’ distribution. The method exploits the Kolmogorov–Smirnov test to estimate the proportions, and afterwards the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the ‘joker’ data set. We propose this method as a reliable means of novelty detection in the emergency situation which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm.
Davide D’Alimonte
Dan Cornford (Corresponding author)
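Illustrative sketch (not the authors' implementation): the core idea — estimate the largest proportion of data compatible with a known 'background' pdf using a Kolmogorov–Smirnov-style bound, then separate the remainder — can be mimicked as follows, assuming for illustration a standard-normal background and a simple kernel estimate of the total density.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Toy data: 70% known 'background' N(0,1) contaminated by 30% 'novel' N(4,1).
x = np.sort(np.concatenate([rng.normal(0, 1, 700), rng.normal(4, 1, 300)]))
ecdf = np.arange(1, x.size + 1) / x.size
f_bg = stats.norm(0, 1)                       # known background distribution (assumed)
tol = 1.36 / np.sqrt(x.size)                  # ~95% KS band half-width

def compatible(p):
    """True if a background proportion p never over-explains the empirical CDF
    (or survival function) by more than a KS-type sampling tolerance."""
    c = f_bg.cdf(x)
    return np.all(p * c <= ecdf + tol) and np.all(p * (1 - c) <= 1 - ecdf + tol)

p_hat = max(p for p in np.linspace(0, 1, 1001) if compatible(p))
print(f"estimated background proportion ~ {p_hat:.2f}")

# Bayes-optimal separation: flag as novel where the residual (non-background)
# density, estimated crudely by KDE minus the scaled background, dominates.
kde_all = stats.gaussian_kde(x)
novel = p_hat * f_bg.pdf(x) < np.maximum(kde_all(x) - p_hat * f_bg.pdf(x), 0)
print("points flagged as novel:", int(novel.sum()))
```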

4.
Recently, Batabyal and Nijkamp (Environ Econ Policy Stud 7:39–51, 2005) have used a theoretical model of antibiotic use to study the relative merits of interventionist (antibiotics) and non-interventionist (no antibiotics) treatment options. A key assumption in their paper is that the default treatment option is the interventionist option. Because there are several instances in which this assumption is invalid, in this paper we suppose that the default treatment option is the non-interventionist option. Specifically, we first derive the long run average cost of treating a common infection such as acute otitis media (AOM). Next, we show that there is a particular tolerance level such that, when a physician uses this tolerance level to determine when to administer the non-antibiotic medicine, the long run average cost of treating the common infection under study is minimized.
Amitrajeet A. Batabyal

5.
We examine the management of livestock diseases from the producers’ perspective, incorporating information and incentive asymmetries between producers and regulators. Using a stochastic dynamic model, we examine responses to different policy options including indemnity payments, subsidies to report at-risk animals, monitoring, and regulatory approaches to decreasing infection risks when perverse incentives and multiple policies interact. This conceptual analysis illustrates the importance of designing efficient combinations of regulatory and incentive-based policies.
Ram Ranjan

6.
Storm-related sea level variations along the North Sea coast over the period 1958–2002, derived from a high-resolution numerical hindcast, are investigated and compared with the results of earlier studies. Considerable variations were found from year to year and over the entire period. The large-scale pattern of these variations is consistent with that derived from previous studies, while the magnitudes of the long-term trends differ. The latter is attributed to different analysis periods, improvements in the atmospheric forcing, and the enhanced spatial resolution of the numerical simulation. It is shown that the different analysis periods, in particular, represent an issue, as the increase in storm-related sea levels was found to be weaker over the last few years, which had not been included in earlier studies. These changes are consistent with observed changes of the storm climate over the North Sea. It is also shown that observed and hindcast trends may differ significantly. Since the latter are in agreement with observed changes in the storm climate, it may be concluded that observed sea level changes along the North Sea coast comprise a considerable fraction that cannot be attributed to changes in the large-scale atmospheric circulation.
Ralf Weisse
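Illustrative sketch (not from the paper): analyses of this kind typically fit a linear trend to annual high percentiles of water level, a common proxy for storm-related sea level. The data below are synthetic, with an imposed trend, purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(6)
years = np.arange(1958, 2003)

# Synthetic hourly residual water levels per year with a weak imposed trend (m).
annual_p99 = []
for i in range(years.size):
    levels = rng.gumbel(loc=0.001 * i, scale=0.25, size=24 * 365)
    annual_p99.append(np.percentile(levels, 99))

# Linear trend of the annual 99th percentile, reported in mm per decade.
slope, intercept = np.polyfit(years, annual_p99, 1)
print(f"trend in annual 99th percentile: {slope * 1000 * 10:.1f} mm/decade")
```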

7.
Ocean/ice interaction at the base of deep-drafted Antarctic ice shelves modifies the physical properties of inflowing shelf waters to become Ice Shelf Water (ISW). In contrast to the conditions at the atmosphere/ocean interface, the increased hydrostatic pressure at the glacial base causes gases embedded in the ice to dissolve completely after being released by melting. Helium and neon, which have extremely low solubilities, therefore reach saturations of more than 1000% in glacial meltwater. At the continental slope in front of the large Antarctic caverns, ISW mixes with ambient waters to form different precursors of Antarctic Bottom Water. A regional ocean circulation model, which uses an explicit formulation of the ocean/ice shelf interaction to describe for the first time the input of noble gases to the Southern Ocean, is presented. The results reveal a long-term variability of the basal mass loss solely controlled by the interaction between waters of the continental shelf and the ice shelf cavern. Modeled helium and neon supersaturations from the Filchner–Ronne Ice Shelf front show a “low-pass” filtering of the inflowing signal due to cavern processes. On circumpolar scales, the simulated helium and neon distributions allow us to quantify the ISW contribution to bottom water, which spreads with the coastal current connecting the major formation sites in the Ross and Weddell Seas.
Christian B. Rodehacke

8.
A method to initialize an ensemble, introduced by Evensen (Physica, D 77:108–129, 1994a; J Geophys Res 99(C5):10143–10162, 1994b; Ocean Dynamics 53:343–367, 2003), was applied to the Ocean General Circulation Model (OGCM) HYbrid Coordinate Ocean Model (HYCOM) for the Pacific Ocean. Taking advantage of the hybrid coordinates, an initial ensemble is created by first perturbing the layer interfaces and then running the model for a spin-up period of 1 month forced by randomly perturbed atmospheric forcing fields. In addition to the perturbations of layer interfaces, we implemented perturbations of the mixed layer temperatures. In this paper, we investigate the quality of the initial ensemble generated by this scheme and the influence of the horizontal decorrelation scale and vertical correlation on the statistics of the resulting ensemble. We performed six ensemble generation experiments with different combinations of horizontal decorrelation scales and with/without perturbations in the mixed layer. The resulting six sets of initial ensembles are then analyzed in terms of sustainability of the ensemble spread and realism of the correlation patterns. The ensemble spreads are validated against the difference between model and observations after 20 years of free run. The correlation patterns of the six ensemble sets are compared with each other. This study shows that the ensemble generation scheme can effectively generate an initial ensemble whose spread is consistent with the observed errors. The correlation pattern of the ensemble also exhibits realistic features. The addition of mixed layer perturbations improves both the spread and correlation. Some limitations of the ensemble generation scheme are also discussed. We found that the vertical shift of isopycnal coordinates provokes unrealistically large deviations in shallow layers near the islands of the West Pacific. A simple correction circumvents the problem.
Liying Wan
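Illustrative sketch (not the HYCOM code): one ingredient of such a scheme is a random 2-D perturbation field with a prescribed horizontal decorrelation scale, which can be approximated by smoothing white noise with a Gaussian kernel. Grid, scale and amplitude below are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def perturbation_field(shape, decorr_scale_km, dx_km, amplitude, rng):
    """White noise smoothed to roughly the requested horizontal decorrelation
    scale, then rescaled to the requested standard deviation."""
    noise = rng.normal(size=shape)
    field = gaussian_filter(noise, sigma=decorr_scale_km / dx_km)
    return amplitude * field / field.std()

rng = np.random.default_rng(7)
# Hypothetical 5-m-std perturbation of a layer interface on a 10-km grid, ~100-km scale.
dz = perturbation_field((200, 300), decorr_scale_km=100, dx_km=10, amplitude=5.0, rng=rng)
print("perturbation std (m):", round(float(dz.std()), 2))
```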

9.
The paper focuses on the development of reservoir operating rules for dry and rainfall events, and their implementation in the case of the Ghézala dam located in northern Tunisia (characterized by a Mediterranean climate). Rainfall events are defined in terms of depth and duration, which are correlated with each other. A depth analysis per event is performed, conditioned on the event duration. The gamma distribution provides a good fit to depth per event, especially for events lasting at least 6 days. The event duration fits a geometric distribution, whereas the dry events during the rainy season fit a negative binomial distribution. The climatic cycle length is fitted to a gamma distribution. On this basis, many 50-year synthetic event series were generated. Every synthetic streamflow sequence obtained from the synthetic rainfall sequences, as well as the one derived from the historic rainfall events time series, was optimized and optimal decisions were formulated. These decisions were assessed by means of multiple regression analysis to estimate the relation between the optimal decision at every stage (dry or rainfall event) and other system variables. Optimal rules, which have a linear form, were derived for each predetermined useful storage interval and depend on storage, inflows and downstream demand at dry or rainfall event t. The range of t is 1–13 days (rainfall event) and 1–57 days (dry event). The rules were satisfactory for every predetermined useful storage interval. The simulated dam performance generated by the operation rules was compared with the deterministic optimum operation and the historical operation. Also included is the comparison of the implicit stochastic optimization-based operation policy per event during the water years 1985–2002.
Fethi Lebdi
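Illustrative sketch (with made-up parameter values, not those of the Ghézala study): fitting the distributions named above and generating a synthetic sequence of events could look like this.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical observed records (days / mm); real data would come from the gauge.
event_durations = rng.geometric(p=0.35, size=200)            # rainfall event length (days)
event_depths = rng.gamma(shape=1.8, scale=9.0, size=200)     # depth per event (mm)
dry_lengths = rng.negative_binomial(n=2, p=0.15, size=200)   # dry spells in rainy season (days)

# Fit the distributions used in the paper (maximum likelihood / method of moments).
shape, loc, scale = stats.gamma.fit(event_depths, floc=0)    # depth ~ Gamma
p_geom = 1.0 / event_durations.mean()                        # duration ~ Geometric
mean, var = dry_lengths.mean(), dry_lengths.var()
p_nb = mean / var                                            # dry spell ~ Negative binomial
n_nb = mean * p_nb / (1.0 - p_nb)

# Generate one synthetic sequence of alternating wet/dry events.
n_events = 1000
synth_durations = stats.geom.rvs(p_geom, size=n_events, random_state=rng)
synth_depths = stats.gamma.rvs(shape, loc=0, scale=scale, size=n_events, random_state=rng)
synth_dry = stats.nbinom.rvs(n_nb, p_nb, size=n_events, random_state=rng)
print(synth_durations[:5], synth_depths[:5].round(1), synth_dry[:5])
```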

10.
Bayesian modelling of health risks in relation to environmental exposures offers advantages over conventional (non-Bayesian) modelling approaches. We report an example using research into whether, after controlling for different confounders, air pollution (NOx) has a significant effect on coronary heart disease mortality, estimating the relative risk associated with different levels of exposure. We use small area data from Sheffield, England and describe how the data were assembled. We compare the results obtained using a generalized (Poisson) log-linear model with adjustment for overdispersion, with the results obtained using a hierarchical (Poisson) log-linear model with spatial random effects. Both classes of models were fitted using a Bayesian approach. Including spatial random effects models both overdispersion and the spatial autocorrelation that arises when analysing data from small contiguous areas. The first modelling framework has been widely used, while the second provides a more rigorous model for hypothesis testing and risk estimation when data refer to small areas. When the models are fitted controlling only for the age and sex of the populations, the generalized log-linear model shows NOx effects are significant at all levels, whereas the hierarchical log-linear model with spatial random effects shows significant effects only at higher levels. We then adjust for deprivation and smoking prevalence. Uncertainty in the estimates of smoking prevalence, arising because the data are based on samples, was accounted for through errors-in-variables modelling. After these adjustments, NOx effects appear significant at the two highest levels according to both modelling frameworks.
Paul Brindley
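Illustrative sketch of the non-spatial half of such a comparison (not the authors' code): a Poisson log-linear model of small-area mortality counts with an expected-count offset, fitted here with statsmodels; the hierarchical spatial model would normally require a Bayesian package such as WinBUGS or PyMC. All variable names and data are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical small-area data: observed CHD deaths, expected deaths (age/sex
# standardised), an NOx exposure category and a deprivation score per area.
rng = np.random.default_rng(2)
n = 200
df = pd.DataFrame({
    "expected": rng.uniform(5, 50, n),
    "nox_level": rng.integers(1, 6, n),          # exposure categories 1..5
    "deprivation": rng.normal(0, 1, n),
})
true_rr = np.exp(0.05 * df.nox_level + 0.1 * df.deprivation)
df["observed"] = rng.poisson(df.expected * true_rr)

# Poisson log-linear model: log E[O] = log(expected) + b0 + b1*NOx + b2*deprivation.
X = sm.add_constant(df[["nox_level", "deprivation"]])
model = sm.GLM(df.observed, X, family=sm.families.Poisson(),
               offset=np.log(df.expected)).fit()
print(model.summary())

# Quick overdispersion check: Pearson chi-square / residual df >> 1 motivates the
# quasi-Poisson adjustment or random-effects extensions discussed in the abstract.
print("dispersion ~", model.pearson_chi2 / model.df_resid)
```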

11.
Advances in computing technologies in recent decades have provided a means of generating and performing highly sophisticated computational simulations of electromagnetic phenomena. In particular, just after the turn of the twenty-first century, improvements to computing infrastructures provided for the first time the opportunity to conduct advanced, high-resolution three-dimensional full-vector Maxwell’s equations investigations of electromagnetic propagation throughout the global Earth-ionosphere spherical volume. These models, based on the finite-difference time-domain (FDTD) method, are capable of including such details as the Earth’s topography and bathymetry, as well as arbitrary horizontal/vertical geometrical and electrical inhomogeneities and anisotropies of the ionosphere, lithosphere, and oceans. Studies at this level of detail simply are not achievable using analytical methods. The goal of this paper is to provide an historical overview and future prospectus of global FDTD computational research for both natural and man-made electromagnetic phenomena around the world. Current and future applications of global FDTD models relating to lightning sources and radiation, Schumann resonances, hypothesized earthquake precursors, remote sensing, and space weather are discussed.
Jamesina J. Simpson
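Global Earth-ionosphere models of the kind described above are far beyond a snippet, but a minimal one-dimensional FDTD update (free space, normalised units, Gaussian source) illustrates the leapfrog scheme on which they are built. This is a generic textbook sketch, not the models discussed in the paper.

```python
import numpy as np

# 1-D FDTD in free space with normalised (Yee) units: E and H are staggered
# in space and time and updated in a leapfrog fashion.
nz, nt = 400, 800
Ex = np.zeros(nz)
Hy = np.zeros(nz - 1)
c = 0.5                      # Courant number (<= 1 for stability in 1-D)

for n in range(nt):
    # Update H from the spatial derivative of E, then E from that of H.
    Hy += c * np.diff(Ex)
    Ex[1:-1] += c * np.diff(Hy)
    # Soft Gaussian source in the middle of the grid.
    Ex[nz // 2] += np.exp(-((n - 60) / 20.0) ** 2)

print("peak |Ex| at final step:", np.abs(Ex).max())
```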

12.
This paper estimates the expected annual impacts of the Pink Hibiscus Mealybug infestation on the economies of Florida and the rest of the United States. The approach involves a Markov chain analysis wherein both short run and long run expected damages from infestation are calculated. Use is made of the CLIMEX model, which predicts the potential pest-establishment regions in the US. While predictions based upon the CLIMEX model extend the scope of damages beyond Florida, the damages are significantly dependent upon the rate of arrival and detection of species in those regions. Damages are significantly higher when a longer time horizon is considered. When nursery owners bear the full cost of quarantines in the form of loss of sales and treatment costs of infected plants, the cost-effectiveness of quarantines as a regulatory tool is diminished. The extent of damages is determined by the long run propensity of the system, in terms of the fraction of time spent in the possible ‘states’ of infestation and control, rather than by the annual value of crops that could be potential hosts to the pest.
Ram Ranjan. Phone: +1-352-3921881; Fax: +1-352-3929898
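Illustrative numerical sketch of the Markov-chain reasoning (transition probabilities and costs are invented, not the paper's estimates): the long-run fraction of time spent in each infestation/control state, i.e. the stationary distribution, multiplied by per-state annual damages gives the expected annual cost.

```python
import numpy as np

# States: 0 = no infestation, 1 = infested/undetected, 2 = detected & under control.
P = np.array([[0.90, 0.10, 0.00],      # hypothetical annual transition matrix
              [0.00, 0.60, 0.40],
              [0.50, 0.05, 0.45]])
annual_damage = np.array([0.0, 30.0, 8.0])   # $ million per year in each state (made up)

# Stationary distribution: left eigenvector of P with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

print("long-run fraction of time in each state:", pi.round(3))
print("expected annual damages ($m):", float(pi @ annual_damage))

# Short-run expected damages over a T-year horizon from an initial state.
T, start = 10, np.array([1.0, 0.0, 0.0])
short_run = sum(start @ np.linalg.matrix_power(P, t) @ annual_damage for t in range(T))
print(f"expected damages over {T} years ($m):", float(short_run))
```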

13.
Two different models, a Physical Model and a Neural Net (NN), are used for the derivation of the Photosynthetically Available Radiation (PAR) from METEOSAT data in the German Bight; advantages and disadvantages of both models are discussed. The use of an NN for the derivation of PAR is preferable to the Physical Model because, by construction, an NN can take the various processes determining PAR at the surface into account much better than a non-statistical model relying on averaged relations.
Kathrin Schiller
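Illustrative sketch of the statistical alternative (not the operational model): a small feed-forward network regressing PAR on satellite-derived predictors. Predictors, target and data are invented; a real model would be trained on co-located METEOSAT retrievals and PAR measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n = 5000
# Hypothetical predictors: cloud index, solar zenith angle (deg), water vapour proxy.
X = np.column_stack([rng.uniform(0, 1, n), rng.uniform(20, 80, n), rng.uniform(0, 3, n)])
# Hypothetical PAR target with cloud attenuation and zenith-angle dependence plus noise.
y = 400 * (1 - 0.7 * X[:, 0]) * np.cos(np.radians(X[:, 1])) + rng.normal(0, 10, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
nn = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
nn.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(nn.score(X_te, y_te), 3))
```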

14.
The heat of the Earth derives from internal and external sources. A heat balance shows that most of the heat provided by external sources is re-emitted by long-wavelength heat radiation and that the dominant internal sources are original heat and heat generated by decay of unstable radioactive isotopes. Understanding of the thermal regime of the Earth requires appreciation of properties and mechanisms for heat generation, storage, and transport. Both experimental and indirect methods are available for inferring the corresponding rock properties. Heat conduction is the dominant transport process in the Earth’s crust, except for settings where appreciable fluid flow provides a mechanism for heat advection. For most crustal and mantle rocks, heat radiation becomes significant only at temperatures above 1200°C.
Christoph Clauser
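A tiny worked illustration of the conduction regime mentioned above, using Fourier's law q = k·dT/dz with typical crustal values (assumed here, not taken from the paper):

```python
# Fourier's law for steady one-dimensional conduction: q = k * dT/dz.
k = 2.5          # thermal conductivity of crustal rock, W/(m K)  (typical value)
dT_dz = 0.03     # geothermal gradient, K/m (30 K per km, a common continental value)
q = k * dT_dz
print(f"conductive heat flow ~ {q * 1000:.0f} mW/m^2")   # ~75 mW/m^2
```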

15.
We model multivariate hydrological risks in the case that at least one of the variables is extreme. Recently, Heffernan JE, Tawn JA (2004) A conditional approach for multivariate extremes. J R Stat Soc B 66(3):497–546 (hereafter called HT04) proposed a conditional multivariate extreme value model which applies to regions where not all variables are extreme and simultaneously identifies the type of extremal dependence, including negative dependence. In this paper we apply this modeling strategy and provide an application to multivariate observations of five rivers in two clearly distinct regions of Puerto Rico Island, for two different seasons each. This effective dimensionality of ten cannot be handled by the traditional models of multivariate extremes. The resulting fitted model, following the HT04 model and estimation strategies, is able to make long-term estimates of extremes, conditional on whether other rivers are extreme or not. The model shows considerable flexibility to address the natural questions that arise in multivariate extreme value assessments. In the five-river Puerto Rico application, the model clearly groups the rivers into two regions, one of two rivers and another of three, which show strong relationships in the rainy season. This corresponds with the geographical distribution of the rivers.
Beatriz Vaz de Melo Mendes

16.
The concepts of system load and capacity are pivotal in risk analysis. The complexity of risk analysis increases when the input parameters are stochastic (aleatory uncertainty), missing (epistemic uncertainty), or both. The aleatory and epistemic uncertainties related to input parameters are handled through simulation-based parametric and non-parametric probabilistic techniques. The complexities increase further when the empirical relationships are not strong enough to derive physics-based models. In this paper, ordered weighted averaging (OWA) operators are proposed to estimate the system load. The risk of failure is estimated by assuming a normally distributed reliability index. The proposed methodology for risk analysis is illustrated using an example with nine input parameters. Sensitivity analyses identified that the risk of failure is dominated by the decision-maker's attitude used to generate the OWA weights, by missing input parameters, and by the system capacity.
Rehan Sadiq (Corresponding author)
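Illustrative sketch of the two ingredients named above (weights, input values and capacity statistics are assumptions, not the paper's data): an ordered weighted averaging aggregation of input parameters into a system load, and a risk of failure computed from a normally distributed reliability index.

```python
import numpy as np
from scipy.stats import norm

def owa(values, weights):
    """Ordered weighted averaging: weights are applied to the sorted values, so
    they encode the decision-maker's attitude (optimism vs pessimism), not the
    identity of any particular input."""
    v = np.sort(np.asarray(values))[::-1]          # descending order
    w = np.asarray(weights)
    assert np.isclose(w.sum(), 1.0)
    return float(v @ w)

# Nine hypothetical normalised input parameters and a slightly pessimistic
# weight vector that emphasises the larger ordered values.
inputs = [0.42, 0.55, 0.31, 0.60, 0.48, 0.37, 0.52, 0.45, 0.58]
weights = [0.20, 0.16, 0.13, 0.11, 0.10, 0.09, 0.08, 0.07, 0.06]
load_mean = owa(inputs, weights)

# Risk of failure from a normally distributed reliability index:
# beta = (mu_capacity - mu_load) / sqrt(sigma_capacity^2 + sigma_load^2),
# P(failure) = Phi(-beta).
cap_mean, cap_sd, load_sd = 0.75, 0.08, 0.05      # assumed capacity/load statistics
beta = (cap_mean - load_mean) / np.hypot(cap_sd, load_sd)
print("OWA system load:", round(load_mean, 3))
print("reliability index:", round(beta, 2), " risk of failure:", round(norm.cdf(-beta), 4))
```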

17.
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of Defense. Under the sponsorship of SERDP, data fusion techniques are being developed for use in enhancing wide-area assessment UXO remediation efforts, and a data fusion framework is being created to provide a cohesive data management and decision-making utility to allow for more efficient expenditure of time, labor and resources. An important first step in this work is the development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms, making possible data fusion of estimates of data quality, UXO-related features, non-UXO backgrounds, and correlations among independent data streams. Based on data acquired during ESTCP’s Wide-Area Assessment Pilot Program, the results presented here successfully demonstrate the feasibility of automated feature extraction from light detection and ranging, orthophotography, and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data were imported and registered to a common survey map grid, and UXO-related features were extracted and used to construct survey-site-wide probability density maps that are well suited for input to higher level data fusion algorithms. Preliminary combination of feature maps from the various data sources yielded maps for the Pueblo site that offered a more accurate UXO assessment than any one data source alone.
Susan L. Rose-Pehrsson
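Illustrative sketch of the last step described above (synthetic coordinates, not the Pueblo data): turning extracted feature locations, e.g. magnetometry anomaly picks, into a site-wide probability density map with a kernel density estimate.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)
# Synthetic easting/northing (m) of extracted anomaly features, clustered near a target area.
pts = np.vstack([rng.normal(500, 80, (300, 2)), rng.uniform(0, 1000, (200, 2))]).T

kde = gaussian_kde(pts)                              # 2-D kernel density estimate
gx, gy = np.mgrid[0:1000:200j, 0:1000:200j]          # evaluation grid over the site
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print("peak feature-density grid cell:", np.unravel_index(density.argmax(), density.shape))
```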

18.
Some of the major advances in the field of mining in the last three decades have concerned the development of new design and planning techniques for optimizing open-pit mining and the inclusion of a stochastic perspective in economic models that is more revealing than a purely deterministic perspective. These advances include the use of parametric techniques in the design and planning process, the formulation of criteria for establishing an optimum cut-off grade policy when the economic goal is to optimize net present value (NPV), and the introduction of economic risk analysis. This paper examines some of the difficulties involved in applying these techniques—arising largely as a result of a lack of knowledge of the spatial location and distribution of the deposit grades—and analyses how these difficulties can be tackled with the help of geostatistical simulation techniques that take probabilistic criteria into consideration during the optimization process. These techniques enable equally likely representations of the deposit to be obtained that reproduce the main dispersion features of the starting experimental data (the covariance or variogram, as well as the histogram). Consequently, the uncertainty in regard to the deposit, as well as its influence on the economic assessment of the deposit in risk terms, can be evaluated. This paper also describes a simple method for introducing price and cost increases into the risk analysis via the Monte Carlo method and shows how geological, technical and economic uncertainty can be integrated in risk analyses. Although it is true that the relationship between prices and costs is maintained constant in mining planning based on parametric techniques, it is no less true that the risk analysis requires the use of models in which the main parameters with a bearing on deposit economics are considered as stochastic variables. The proposed methodology simplifies the calculations and easily integrates the different sources of uncertainty.
F. G. Bastante
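Illustrative Monte Carlo sketch of the economic-risk idea (all figures invented; the paper couples this with geostatistically simulated grade models): net present value of a production schedule when the metal price and operating cost paths are treated as stochastic variables.

```python
import numpy as np

rng = np.random.default_rng(3)
years = 10
tonnes = np.full(years, 2.0e6)          # hypothetical yearly ore production (t)
grade = 1.2 / 100                       # average grade, 1.2% (illustrative)
recovery, discount = 0.90, 0.08

n_sim = 10_000
npv = np.empty(n_sim)
for i in range(n_sim):
    # Stochastic price and cost paths: lognormal multiplicative random walks.
    price = 7000 * np.exp(np.cumsum(rng.normal(0.02, 0.15, years)))   # $/t metal
    cost = 35 * np.exp(np.cumsum(rng.normal(0.02, 0.05, years)))      # $/t ore
    cashflow = tonnes * (grade * recovery * price - cost)
    npv[i] = np.sum(cashflow / (1 + discount) ** np.arange(1, years + 1))

print("mean NPV ($m):", round(npv.mean() / 1e6, 1))
print("P(NPV < 0):", round(float((npv < 0).mean()), 3))
```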

19.
Morphological changes in coastal areas, especially in river estuaries, are of high interest in many parts of the world. Satellite data from both optical and radar sensors can help to monitor and investigate these changes. Data from both kinds of sensors, available for up to 30 years now, allow examinations over long timescales, while high-resolution sensors developed within the last decade allow increased accuracy. The creation of digital elevation models (DEMs) of, for example, the Wadden Sea from a series of satellite images is therefore already possible. ENVISAT, successfully launched on March 1, 2002, continues the line of higher resolution synthetic aperture radar (SAR) imaging sensors with its ASAR instrument and now also allows several polarization modes for better separation of land and water areas. This article gives an overview of sensors and algorithms for waterline determination as well as several applications. Both optical and SAR images are considered. Applications include morphodynamic monitoring studies and DEM generation.
Andreas Niedermeier
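Illustrative sketch of the simplest waterline-determination step (not the article's algorithms): thresholding an image into land and water and extracting the boundary. A synthetic scene stands in for a SAR or optical image; real data would additionally need speckle filtering and georeferencing.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import find_contours

# Synthetic "scene": dark water on the left, brighter land on the right, plus noise.
rng = np.random.default_rng(4)
img = rng.normal(0.2, 0.05, (200, 200))
img[:, 100:] += 0.5 + 0.1 * rng.normal(size=(200, 100))

# A global Otsu threshold separates the two radiometric classes.
t = threshold_otsu(img)
water = img < t

# The waterline is the 0.5-level contour of the binary land/water mask.
contours = find_contours(water.astype(float), 0.5)
print("threshold:", round(t, 3), " longest waterline segment:",
      max(len(c) for c in contours), "points")
```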

20.
Over the past four or five decades many advances have been made in earthquake ground-motion prediction and a variety of procedures have been proposed. Some of these procedures are based on explicit physical models of the earthquake source, travel-path and recording site, while others lack a strong physical basis and seek only to replicate observations. In addition, there are a number of hybrid methods that seek to combine benefits of different approaches. The various techniques proposed have their adherents and some of them are extensively used to estimate ground motions for engineering design purposes and in seismic hazard research. These methods all have their own advantages and limitations that are not often discussed by their proponents. The purposes of this article are to summarise existing methods and the most important references, to provide a family tree showing the connections between different methods and, most importantly, to discuss the advantages and disadvantages of each method.
John Douglas
