Similar Documents
20 similar documents found.
1.
Sequential P extraction was combined with electron microscopy and X-ray spectroscopy to characterise various P species and to study their transformation in settling seston and in recent sediment. During early diagenesis, most of the particulate P formed in the water column was redissolved; no net transformation into species that would resist dissolution was observed. It was shown that
•  the phosphorus (P) content and the P flux of settling particles varied seasonally over one order of magnitude
•  particles became enriched with reductant-soluble P (BD-P) while settling through the hypolimnion
•  changes in BD-P were highly significantly correlated with changes in reductant-soluble iron (BD-Fe)
•  Fe- and Mn-oxidising bacteria seemed to be mainly responsible for this increase in P concentration
•  other fractions, including organic P, did not change during sedimentation
•  most of the organic P and of the Fe-bound P, and 70% of TP overall, was released from the sediment during early diagenesis
•  the sediment surface did not act as a trap for P migrating upwards from deeper sediment layers
•  CaCO3 sedimentation contributed little to P sedimentation but significantly to the permanent burial of P.

2.
During a cruise on board RV Gauss in May/June 1988, joint investigations into organochlorine compounds, dissolved trace metals, petroleum hydrocarbons and basic hydrography were carried out at representative stations of the Baltic Monitoring Programme (BMP). The aim of the cruise was to study distribution patterns and, using previous data, to establish temporal trends where discernible. Each group of contaminants investigated showed specific characteristics, with differences even between compounds within the same group. The differences are due to:
–  the partition of contaminants between dissolved and adsorbed forms;
–  the response to redox conditions;
–  the influence of microbial decay, organic production, or changes in speciation.

3.
Stochastic model of bedrock earthquake motion and determination of its parameters. Di-Tao NIU (牛荻涛) (Xi'an University of Architectural Science an...

4.
Time series measurements from light vessel and coastal stations in the transition area of the Kattegat and the Baltic Sea are analyzed for the period August 1975 to March 1976. The data consist of daily sampled salinities from different depth levels and daily means of sea level, surface current, and wind. The purpose of this paper is to examine the dynamics of the mass and salt transport during a major salt-water inflow. The principal conclusions of this paper are that
1)  the dynamics of the barotropic water exchange between the Kattegat and the Baltic Sea resembles that of a Helmholtz resonator with a geostrophically controlled flow in the connecting channel (a numerical sketch follows this list);
2)  the water exchange is forced by both the east component of the wind stress over the North Sea and the wind-stress component along 30° true over the Baltic Sea;
3)  the salinity in the upper layer of the Kattegat is governed by a permanent weak salt flux directed from the bottom layer to the surface layer and by the outflow of less saline Baltic water into the upper layer of the Kattegat, whereas the salinity of the Belt Sea is advected by the local currents along the main channel;
4)  the most favorable conditions for a major salt inflow are initially a mean Baltic sea level lowered by about 30 cm, followed by west winds steadily increasing over the following tens of days. Moreover, the hitherto-used definition of a major salt-water inflow is discussed and an improved definition is proposed.
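A minimal numerical sketch of the Helmholtz-resonator analogy in conclusion 1), built from a linear channel momentum balance and volume conservation in the Baltic. All parameter values (basin area, channel geometry, friction, forcing) are illustrative assumptions, not figures from the paper:

```python
import numpy as np

# Helmholtz-resonator sketch: the Baltic is the resonator volume, the
# transition area is the connecting channel. Parameters are assumed.
A_baltic = 3.7e11      # Baltic surface area (m^2), assumed
A_chan   = 1.0e6       # channel cross-section (m^2), assumed
L_chan   = 2.0e5       # channel length (m), assumed
r        = 2.0e-5      # linear friction coefficient (1/s), assumed
g        = 9.81

dt, n = 600.0, 20000   # time step (s), number of steps (~140 days)
zeta, Q = 0.0, 0.0     # Baltic level anomaly (m), channel transport (m^3/s)
for i in range(n):
    t = i * dt
    zeta_k = 0.3 * np.sin(2 * np.pi * t / (10 * 86400))  # forced Kattegat level (m)
    # channel momentum balance: pressure head minus friction
    Q += dt * ((g * A_chan / L_chan) * (zeta_k - zeta) - r * Q)
    zeta += dt * Q / A_baltic   # volume conservation in the Baltic

T = 2 * np.pi * np.sqrt(L_chan * A_baltic / (g * A_chan))
print(f"free oscillation period ~ {T/86400:.1f} days")
```

With these assumed numbers the free period is of order a week, which is why slowly varying wind forcing, as in conclusion 4), can drive a large exchange.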

5.
Based on published literature and the responses to a questionnaire sent to geomagnetic-field, ionospheric and magnetospheric researchers, several methods of choosing quiet periods based on geomagnetic records, as well as other observed parameters, have been identified (a minimal index-threshold sketch follows the caveats below). Caveats with respect to using geomagnetic indices to select quiet periods include the following:
1.  Geomagnetic disturbances are strongly local. Even if the data from all available observatories indicate quiet behavior, there is the distinct possibility that some other location, not sampled, may be disturbed.
2.  Geomagnetic indices are convenient but imperfect indicators of geomagnetic activity. Indices based on a quiet-day reference level have uncertainties comparable to the threshold value for quiet conditions. Indices representing average conditions during a 24-hr UT day may not be appropriate.
3.  Geomagnetic activity does not fully reflect the range of possible factors that influence the ionosphere or magnetosphere.
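As one deliberately simplified illustration of index-based selection, the sketch below filters synthetic daily Kp-like values against the common "quiet" cutoff Kp ≤ 2. The index values, the threshold, and the previous-day condition are all assumptions, and caveats 1-3 above apply to any such filter:

```python
import numpy as np

# Quiet-day filter on synthetic daily-maximum Kp values. Kp <= 2 is a
# common quiet convention; the data here are fake (gamma-distributed).
rng = np.random.default_rng(0)
days = np.arange("2000-01", "2000-04", dtype="datetime64[D]")
kp_daily_max = rng.gamma(shape=2.0, scale=1.2, size=days.size)

quiet = kp_daily_max <= 2.0
# also require the preceding day to be quiet, to avoid recovery phases
quiet[1:] &= kp_daily_max[:-1] <= 2.0
print(f"{quiet.sum()} of {days.size} days pass the quiet criterion")
```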

6.
Using the P- and S-wave arrivals from 150 earthquakes distributed in the Tibetan Plateau and its neighboring areas, recorded by the Tibetan seismic network, the Sichuan seismic network, the WWSSN, and a mobile network in the Tibetan Plateau, we have obtained average P- and S-wave velocity models of the crust and upper mantle for this region:
(1)  The crust, of 70 km average thickness, can be divided into two main layers: a 16-km-thick upper crust with P-wave velocity 5.55 km/s and S-wave velocity 3.25 km/s, and a 54-km-thick lower crust with P-wave velocity 6.52 km/s and S-wave velocity 3.76 km/s.
(2)  The P-wave velocity in the uppermost mantle is 7.97 km/s, and the S-wave velocity 4.55 km/s. The low-velocity layer in the upper mantle occurs at about 140 km depth, with a thickness of about 55–62 km. The prominent velocity gradient beneath the LVZ is comparable to the gradient above it. (A vertical travel-time sketch based on the crustal layers follows.)
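The layered averages in (1) translate directly into vertical travel times; a minimal sketch, where the thicknesses and velocities are the paper's values but the flat-layer, vertical-incidence assumption is ours:

```python
# Vertical one-way travel times through the average crustal model above.
layers = [  # (thickness km, Vp km/s, Vs km/s)
    (16.0, 5.55, 3.25),   # upper crust
    (54.0, 6.52, 3.76),   # lower crust
]
t_p = sum(h / vp for h, vp, _ in layers)
t_s = sum(h / vs for h, _, vs in layers)
print(f"crustal P time ~ {t_p:.1f} s, S time ~ {t_s:.1f} s")  # ~11.2 s, ~19.3 s
```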
The Chinese version of this paper appeared in the Chinese edition of Acta Seismologica Sinica, 14, Supp., 573–579, 1992.

7.
The spontaneous and evoked activity of electroreceptors was studied electrophysiologically. The results are:
1.  The spontaneous discharge rate ranged from 15 to 85 imp/s, with a mode of 50 imp/s, for the 126 organs studied in 18 animals.
2.  By analysis of period histograms and interspike-interval histograms of responses to sinusoidal electric stimulation, the frequency-response characteristic was determined (see the sketch after this list). The frequency-response curve is of band-pass type; the bandwidth ranged from 5 Hz to 30 Hz and the best frequency is around 15 Hz.
3.  The thresholds of responses for 47 organs were measured by injecting sinusoidal current into the organs. The threshold values were less than 0.1 nA (61 μV/cm) for 35 organs (74%), and less than 0.01 nA (6.1 μV/cm) for a further 9 organs (19%).
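A minimal sketch of the interspike-interval (ISI) analysis mentioned in result 2, run on a synthetic spike train standing in for a real recording; the ~50 imp/s rate and the 2 ms jitter are illustrative assumptions:

```python
import numpy as np

# ISI histogram and mean discharge rate from a synthetic spike train.
rng = np.random.default_rng(1)
isi = rng.normal(loc=0.020, scale=0.002, size=1000)   # s; mean rate 50 imp/s
spike_times = np.cumsum(isi)

rate = len(spike_times) / spike_times[-1]
counts, edges = np.histogram(isi, bins=np.arange(0.0, 0.04, 0.001))
print(f"mean discharge rate ~ {rate:.0f} imp/s")
print("modal ISI bin starts at", edges[np.argmax(counts)], "s")
```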
The Chinese version of this paper appeared in the Chinese edition of Acta Seismologica Sinica, 13, 380–386, 1991.

8.
The problem of fitting a probability distribution (here the log-Pearson Type III distribution) to extreme floods is considered from the point of view of two numerical and three non-numerical criteria. The six fitting techniques considered include classical techniques (maximum likelihood, moments of the logarithms of flows) and newer methods such as mixed moments and the generalized method of moments developed by two of the co-authors. The latter consists of fitting the distribution using moments of different orders; in particular, the SAM method (Sundry Averages Method) uses the moments of order 0 (geometric mean), 1 (arithmetic mean), and −1 (harmonic mean), and leads to a smaller variance of the parameters (the moment conditions are sketched after this entry). The criteria used to select the method of parameter estimation are:
–  the two statistical criteria of mean square error and bias;
–  the two computational criteria of program availability and ease of use;
–  the user-related criterion of acceptability.
These criteria are transformed into value functions or fuzzy-set membership functions, and then three Multiple Criteria Decision Modelling (MCDM) techniques, namely composite programming, ELECTRE, and MCQA, are applied to rank the estimation techniques.
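A sketch of the SAM moment conditions, assuming the common parameterization ln X = tau + G with G ~ Gamma(alpha, beta), so that the order-k moment is E[X^k] = exp(k*tau)*(1 − k*beta)^(−alpha) for k*beta < 1. The parameterization and the synthetic flood series are our assumptions, not necessarily the authors' exact setup:

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import gmean, hmean

# SAM-style fit of log-Pearson III: choose (tau, alpha, beta) so the model
# reproduces the sample geometric (order 0), arithmetic (order 1) and
# harmonic (order -1) means. The annual flood series is synthetic.
rng = np.random.default_rng(2)
q = np.exp(4.0 + rng.gamma(shape=4.0, scale=0.2, size=80))

targets = np.array([gmean(q), q.mean(), hmean(q)])

def sam_conditions(p):
    tau, alpha, beta = p
    return [
        np.exp(tau + alpha * beta) - targets[0],          # exp(E[ln X]) = geometric mean
        np.exp(tau) * (1 - beta) ** -alpha - targets[1],  # E[X]; needs beta < 1
        np.exp(tau) * (1 + beta) ** alpha - targets[2],   # 1 / E[1/X] = harmonic mean
    ]

tau, alpha, beta = fsolve(sam_conditions, x0=[4.0, 4.0, 0.2])
print(f"tau={tau:.3f}, alpha={alpha:.3f}, beta={beta:.3f}")
```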

9.
10.
This paper will look at what we have and have not achieved in reducing the risks to human life from earthquakes in the last 50 years. It will review how success has been achieved in a few parts of the world, and consider what needs to be done by the scientific and engineering community globally to assist in the future task of bringing earthquake risks under control. The first part of the talk will re-examine what we know about the casualties from earthquakes in the last 50 years. Almost 80% of about 1 million deaths turn out to have been caused by just ten great earthquakes, together affecting a tiny proportion of the territory at risk from heavy ground shaking. The disparity between richer and poorer countries is also evident, not only in fatality rates but also in their rates of change. The existing casualty database, however, turns out to be a very poor basis for observing such differences, not only because of the small number of lethal events but also because of the very limited data on causes of death and on types and causes of injury; these have been examined in detail in only a few recent events. All that can be said with certainty is that a few wealthier earthquake-prone countries or regions have made impressive progress in reducing the risk of death from earthquakes, while most of the rest of the world has achieved comparatively little, and in some areas the problem has become much worse. The second part of the paper looks in more detail at what has been achieved country by country. Based on a new expert-group survey of key individuals involved in earthquake risk mitigation, it will examine what are perceived to be the successes and failures of risk mitigation in each country or group of countries. This survey will be used to highlight the achievements of those countries which have successfully tackled their earthquake risk; it will examine the processes of earthquake risk mitigation, from campaigning to retrofitting, and it will consider to what extent the achievement is the result of affluence, scientific and technical activity, political advocacy, public awareness, or the experience of destructive events. It will ask to what extent the approaches pioneered by the global leaders can be adopted by the rest. The final section of the talk will argue that it can be useful to view earthquake protection as a public health matter, to be advanced in a manner similar to globally successful disease-control measures: the key components of such programmes (building in protection, harnessing new technology, and creating a safety culture) must be the key components of earthquake protection strategies also. It will consider the contribution which the scientific and engineering community can make to bringing down today's unacceptably high global earthquake risk. It will be suggested that this role is wider than commonly understood and needs to include:
Building-in protection
•  Improving and simplifying information available for designers and self-builders of homes and infrastructure.
•  Devising and running “building for safety” programmes to support local builders.
Harnessing new technologies
•  Developing and testing cost-effective techniques for new construction and retrofit.
Creating a safety culture
•  Involvement in raising public awareness.
•  Political advocacy to support new legislation and other actions.
•  Prioritising action on public buildings, especially schools and hospitals.
Examples of some of these actions will be given. International collaboration is essential to ensure that the resources and expertise available in the richer countries are shared with those most in need of help. And perhaps the most important single task for the engineering community is to work to counter the widespread fatalistic attitude that future earthquakes are bound to be at least as destructive as those of the past.

11.
Introduction: Through 20-odd years' observation and study after the 1976 Tangshan great earthquake, the seismo-electromagnetic radiation prec...

12.
DSDP Hole 504B was drilled into 6 Ma crust about 200 km south of the Costa Rica Rift, Galapagos Spreading Center, penetrating 1.35 km into a section that can be divided into four zones. Zone I: oxic submarine weathering; Zone II: anoxic alteration; Zones III and IV: hydrothermal alteration to greenschist facies. In Zone III there is intense veining of pillow basalts; Zone IV consists of altered sheeted dikes. The isotopic geochemical signatures recorded in Hole 504B, in relation to the alteration zones, are as follows:
Zone   Depth (m)    Average 87Sr/86Sr   Average δ18O (‰)   Average δD (‰)
I      275–550      0.7032              7.3                −63
II     550–890      0.7029              6.5                −45
III    890–1050     0.7035              5.6                −31
IV     1050–1350    0.7032              5.5                −36
Alteration temperatures are as low as 10°C in Zones I and II, based on oxygen isotope fractionation. Strontium isotopic data indicate that circulation of seawater was much more restricted in Zone II than in Zone I. Fluid inclusion measurements of vein quartz indicate that the alteration temperature was mainly 300 ± 20°C in Zones III and IV, which is consistent with the secondary mineral assemblages. The strontium, oxygen, and hydrogen isotopic compositions of the hydrothermal fluids responsible for the greenschist-facies alteration in Zones III and IV are estimated to be 0.7037, 2‰, and 3‰, respectively. Strontium and oxygen isotope data indicate that completely altered portions of greenstones and vein minerals were in equilibrium with modified seawater under low water/rock ratios (by weight) of about 1.6. This value is close to that of the end-member hydrothermal fluids issuing at 21°N EPR. Basement rocks are not completely hydrothermally altered: about 32% of the greenstones in Zones III and IV have escaped alteration. Thus 1 g of fresh basalt, including the 32% unaltered portion, is required to make 1 g of end-member solution from fresh seawater in water-rock reactions.
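For orientation, a Taylor-style closed-system water/rock mass balance of the kind commonly used for such estimates. The formula is standard, but the δ18O values and the rock-water fractionation below are illustrative assumptions and are not claimed to reproduce the paper's ratio of 1.6:

```python
# Closed-system water/rock mass balance for oxygen isotopes:
# W/R = (d_rock_final - d_rock_initial) / (d_water_initial - d_water_final),
# with d_water_final = d_rock_final - Delta(rock-water). Inputs are assumed.
d18o_rock_initial  = 5.7   # fresh MORB, permil
d18o_rock_final    = 3.5   # altered greenstone, assumed
d18o_water_initial = 0.0   # seawater
delta_rock_water   = 2.0   # rock-water fractionation near 300 C, assumed

d18o_water_final = d18o_rock_final - delta_rock_water
w_r = (d18o_rock_final - d18o_rock_initial) / (d18o_water_initial - d18o_water_final)
print(f"closed-system W/R (by oxygen) ~ {w_r:.1f}")
```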

13.
Two different models, a Physical Model and a Neural Net (NN), are used to derive Photosynthetically Available Radiation (PAR) from METEOSAT data over the German Bight; the advantages and disadvantages of both models are discussed. The NN should be preferred to the Physical Model for deriving PAR because, by construction, a NN can take the various processes determining PAR at a surface into account much better than a non-statistical model relying on averaged relations.
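A schematic sketch of the NN approach: a small multilayer perceptron regressing PAR on satellite-derived predictors. The feature set (solar zenith angle, cloud index, albedo) and the synthetic training data are hypothetical stand-ins for the METEOSAT inputs, not the configuration used by the authors:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Toy NN regression of PAR on assumed satellite-derived predictors:
# columns are solar zenith angle (deg), cloud index, surface albedo.
rng = np.random.default_rng(3)
X = rng.uniform([0.0, 0.0, 0.0], [80.0, 1.0, 0.3], size=(2000, 3))
par = 500 * np.cos(np.radians(X[:, 0])) * (1 - 0.7 * X[:, 1]) \
      + rng.normal(0, 10, 2000)          # fake PAR (W/m^2) with noise

X_tr, X_te, y_tr, y_te = train_test_split(X, par, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print(f"R^2 on held-out data: {net.score(X_te, y_te):.2f}")
```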

14.
15.
We examine the management of livestock diseases from the producers' perspective, incorporating information and incentive asymmetries between producers and regulators. Using a stochastic dynamic model, we examine responses to different policy options, including indemnity payments, subsidies to report at-risk animals, monitoring, and regulatory approaches to decreasing infection risks when perverse incentives and multiple policies interact. This conceptual analysis illustrates the importance of designing efficient combinations of regulatory and incentive-based policies.

16.
This paper addresses the problem of novelty detection in the case where the observed data are a mixture of a known ‘background’ process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework described here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, pdf) of the data generated by the ‘background’ process. The relative proportion of this ‘background’ component (the prior ‘background’ probability), and the pdf and prior probabilities of all other components, are all assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known ‘background’ distribution. The method exploits the Kolmogorov–Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the ‘joker’ data set. We propose this method as a reliable means of novelty detection in emergency situations, which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm.
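A simplified stand-in for the proportion-estimation step, assuming a standard normal background: it searches for the largest fraction of the sample, ranked by background likelihood, that a Kolmogorov–Smirnov test cannot distinguish from the known background pdf. The likelihood-ranking heuristic and the significance level are our assumptions, not the authors' exact estimator, and the Bayes-optimal separation step is omitted:

```python
import numpy as np
from scipy import stats

def max_background_fraction(x, background=stats.norm(), alpha=0.05):
    """Largest fraction of x (most background-like points first) that a
    KS test cannot distinguish from the known background distribution."""
    order = np.argsort(-background.pdf(x))      # most background-like first
    for k in range(len(x), 0, -1):
        _, p_value = stats.kstest(x[order[:k]], background.cdf)
        if p_value > alpha:
            return k / len(x)
    return 0.0

rng = np.random.default_rng(4)
data = np.concatenate([rng.normal(0, 1, 900),    # 'background' process
                       rng.normal(6, 0.5, 100)]) # contaminating process
print(f"estimated background proportion ~ {max_background_fraction(data):.2f}")
```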

17.
Recently, Batabyal and Nijkamp (Environ Econ Policy Stud 7:39–51, 2005) used a theoretical model of antibiotic use to study the relative merits of interventionist (antibiotics) and non-interventionist (no antibiotics) treatment options. A key assumption in their paper is that the default treatment option is the interventionist one. Because there are several instances in which this assumption is invalid, in this paper we suppose that the default treatment option is the non-interventionist one. Specifically, we first derive the long-run average cost of treating a common infection such as acute otitis media (AOM). Next, we show that there is a particular tolerance level such that, when a physician uses this tolerance level to determine when to administer the non-antibiotic medicine, the long-run average cost of treating the common infection under study is minimized.
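The paper's model is not reproduced here, but the tolerance idea can be illustrated with a generic renewal-reward sketch: the long-run average cost equals the expected cost per infection episode divided by the expected episode length, minimized over the waiting tolerance. All cost figures and functional forms below are hypothetical:

```python
import numpy as np

# Renewal-reward sketch: wait up to `tol` days for self-resolution;
# if the infection persists, administer the medicine. Symptom burden is
# assumed to grow quadratically with waiting time; all numbers are made up.
rng = np.random.default_rng(5)
recovery = rng.exponential(scale=5.0, size=200_000)  # days to self-resolve

def long_run_average_cost(tol, med_cost=40.0, med_days=3.0):
    waited = np.minimum(recovery, tol)
    resolved = recovery <= tol
    cost = waited**2 + np.where(resolved, 0.0, med_cost)
    length = waited + np.where(resolved, 0.0, med_days)
    return cost.sum() / length.sum()   # expected cost rate per day

tols = np.linspace(0.0, 15.0, 151)
costs = [long_run_average_cost(t) for t in tols]
print(f"cost-minimizing tolerance ~ {tols[int(np.argmin(costs))]:.1f} days")
```

Under these assumptions the grid search finds an interior minimum: waiting a few days lets cheap self-resolution occur, while waiting too long lets the quadratic symptom cost dominate.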

18.
19.
Morphological changes in coastal areas, especially in river estuaries, are of high interest in many parts of the world. Satellite data from both optical and radar sensors can help to monitor and investigate these changes. Data from both kinds of sensors, now available for up to 30 years, allow examinations over long timescales, while the high-resolution sensors developed within the last decade allow increased accuracy. The creation of digital elevation models (DEMs) of, for example, the Wadden Sea from a series of satellite images is thus already possible. ENVISAT, successfully launched on March 1, 2002, continues the line of higher-resolution synthetic aperture radar (SAR) imaging sensors with its ASAR instrument, and now also offers several polarization modes for better separation of land and water areas. This article gives an overview of sensors and algorithms for waterline determination, as well as several applications. Both optical and SAR images are considered. Applications include morphodynamic monitoring studies and DEM generation.
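A baseline waterline-determination sketch on a synthetic speckled image: Gaussian smoothing, Otsu thresholding to separate land and water, then contour tracing. Otsu is a common baseline rather than necessarily one of the article's algorithms, and the image is synthetic:

```python
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage import measure

# Fake SAR-like amplitude image: bright land, dark water, Rayleigh speckle.
rng = np.random.default_rng(6)
y, x = np.mgrid[0:200, 0:200]
land = y > 100 + 15 * np.sin(x / 20.0)                  # fake coastline
img = np.where(land, 0.8, 0.1) * rng.rayleigh(1.0, land.shape)

smoothed = gaussian(img, sigma=3)                        # damp speckle
mask = smoothed < threshold_otsu(smoothed)               # water: low backscatter
waterlines = measure.find_contours(mask.astype(float), 0.5)
print(f"{len(waterlines)} waterline segment(s) traced")
```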

20.
The geological model of volcanism in the Green Tuff geosyncline deduced from field observations consists of the following processes:
  1. Dome-shaped uplift with a mean diameter of 30 km.
  2. Collapse of the central part of the domes forming basins with a mean diameter of 10 km.
  3. Volcanic activity inside the collapse basins. It is considered that these consecutive processes resulted from magmatic uplift from a deep part of the crust.
In finite element analyses performed as plane-strain problems, the earth's crust is assumed to be an elasto-plastic homogeneous layer undergoing sinusoidal vertical displacement at the base of the layer due to an ascending magma reservoir. These analyses reveal that the diameter of the dome is proportional to the depth of the magma reservoir rather than to its size. The magma reservoir is estimated at 12–24 km in depth. Scale-model experiments using powdered material were performed in order to reproduce a collapse basin. These three-dimensional models are reduced to a scale of 1:200,000 of the natural size. The experiments show that radial and concentric cracks are produced on top of the dome, and that a central part encircled by the concentric cracks collapses to form a basin. The boundary of the collapsed portion forms a steep cliff about 2 mm high, equivalent to 400 m at natural scale and closely similar to the field observations.
