Similar Articles
Found 20 similar articles (search time: 62 ms)
1.
Astronomy & Geophysics, 2000, 41(4): 4.14-4.17
An RAS Meeting was held on 22 June in London to discuss joining the European Southern Observatory and the future of the UK ground-based astronomy programme. Eric Priest reports.

2.
Domenico solution: is it valid?
The Domenico solution is widely used in several analytical models for simulating ground water contaminant transport scenarios. Unfortunately, many textbook as well as journal article treatments of this approximate solution are full of empirical statements that are developed without mathematical rigor. For this reason, a rigorous analysis of this solution is warranted. In this article, we present a mathematical method to derive the Domenico solution and explore its limits. Our analysis shows that the Domenico solution is a true analytical solution when the value of longitudinal dispersivity is zero. For nonzero longitudinal dispersivity values, the Domenico solution will introduce a finite amount of error. We use an example problem to quantify the nature of this error and suggest some general guidelines for the appropriate use of this solution.
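To make the role of the longitudinal dispersivity concrete, a minimal numerical sketch of a Domenico-type solution is given below. The centerline form, the source geometry, and all parameter values here are illustrative assumptions, not the exact expression analyzed in the article.

```python
import math

def domenico_centerline(x, t, v, alpha_x, alpha_y, alpha_z, Y, Z, C0=1.0):
    """Domenico-type approximate solution evaluated on the plume
    centerline (y = 0, z = 0) for a continuous planar source of
    width Y and thickness Z, without decay or retardation.
    Illustrative sketch only."""
    # Longitudinal term: 1-D advection-dispersion front (erfc)
    longitudinal = math.erfc((x - v * t) / (2.0 * math.sqrt(alpha_x * v * t)))
    # Transverse and vertical spreading, collapsed to the centerline
    transverse = math.erf(Y / (4.0 * math.sqrt(alpha_y * x)))
    vertical = math.erf(Z / (2.0 * math.sqrt(alpha_z * x)))
    return (C0 / 2.0) * longitudinal * transverse * vertical

# Behind the advective front (x < v*t) the concentration is finite;
# far ahead of the front the erfc term drives it toward zero.
print(domenico_centerline(x=10.0, t=100.0, v=1.0, alpha_x=1.0,
                          alpha_y=0.1, alpha_z=0.01, Y=10.0, Z=5.0))
```

As alpha_x approaches zero the longitudinal term collapses to a sharp step front, which is the limit in which the article shows the solution to be exact.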

3.
4.
5.
6.
Considerable quantities of food waste can be generated at a rapid rate on ships, particularly those with large numbers of people on board. By virtue of the amounts involved and its nature, food waste is potentially the most difficult-to-manage component of a ship's garbage stream; in most sea areas, however, it may be dealt with by the simple expedient of direct discharge to sea. As a consequence, many ship and port operators and advisory bodies pay only minimal attention to food waste management, and there is a paucity of information in the available literature. The determination that management of ships' food waste is inconsequential is, however, incorrect in many circumstances. Disposal to sea is not always possible because of restrictions imposed by MARPOL 73/78 and other marine pollution control instruments. Effective management of food waste can be critical for ships that operate in areas where disposal is restricted or totally prohibited.

7.
The joint occurrence of extreme hydroclimatic events, such as simultaneous precipitation deficit and high temperature, results in the so-called compound events, and has a serious impact on risk assessment and mitigation strategies. Multivariate frequency analysis (MFA) allows a probabilistic quantitative assessment of this risk under uncertainty. Analyzing precipitation and temperature records in the contiguous United States (CONUS), and focusing on the assessment of the degree of rarity of the 2014 California drought, we highlight some critical aspects of MFA that are often overlooked and should be carefully taken into account for a correct interpretation of the results. In particular, we show that an informative exploratory data analysis (EDA) devised to check the basic hypotheses of MFA, a suitable assessment of the sampling uncertainty, and a better understanding of probabilistic concepts can help to avoid misinterpretation of univariate and multivariate return periods, and incoherent conclusions concerning the risk of compound extreme hydroclimatic events. Empirical results show that the dependence between precipitation deficit and temperature across the CONUS can be positive, negative or not significant and does not exhibit significant changes in the last three decades. Focusing on the 2014 California drought as a compound event and based on the data used, the probability of occurrence strongly depends on the selected variables and how they are combined, and is affected by large uncertainty, thus preventing definite conclusions about the actual degree of rarity of this event.
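As a concrete illustration of the "AND" joint return period used in multivariate frequency analysis, the sketch below estimates it with a plug-in empirical probability on a toy, hypothetical annual series; an actual MFA study would fit marginal distributions and a copula and would quantify the sampling uncertainty the abstract warns about.

```python
def joint_return_period_and(samples, x_thr, y_thr, mu=1.0):
    """Empirical 'AND' joint return period of a compound event
    (e.g. precipitation deficit >= x_thr AND temperature anomaly
    >= y_thr).  mu is the mean interarrival time of the samples in
    years (1.0 for an annual series).  Plug-in estimate only."""
    exceed = sum(1 for x, y in samples if x >= x_thr and y >= y_thr)
    if exceed == 0:
        raise ValueError("no joint exceedances; cannot estimate")
    return mu / (exceed / len(samples))

# Hypothetical annual (deficit, temperature-anomaly) pairs:
data = [(0.2, 0.1), (1.5, 0.8), (0.9, 1.2), (2.1, 1.9), (0.4, -0.3),
        (1.8, 2.2), (0.7, 0.5), (2.5, 1.1), (1.1, 1.6), (0.3, 0.2)]
print(joint_return_period_and(data, x_thr=1.5, y_thr=1.0))  # ~3.3 years
```

Note how the estimate depends entirely on which variables are selected and how the thresholds combine them, which is exactly the sensitivity the study highlights.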

8.
Controlled-source seismology (CSS) is the primary source of information about the fine structure of the lithosphere. The aim of this paper is to provide an overview of the methods commonly used to derive Earth models from CSS data, with a focus on the wide-angle reflection/refraction method. Some outlook on the future of CSS is presented, with special emphasis on full-wavefield methods such as full-waveform inversion, which bring a high level of objectivity into modelling and significantly increase spatial resolution. It is stressed that researchers should be aware of the limits of how elastic parameters translate into actual rock properties; this should stimulate them to go beyond simple P-wave modelling and to build multiparameter Earth models, based either on the seismic data alone or constrained by additional geophysical fields, in order to derive sound geological interpretations of their models.

9.
The intensity of the geomagnetic field varies over different time scales. Yet, constraints on the maximum intensity of the field as well as on its maximum rate of change are inadequate due to poor temporal resolution and large uncertainties in the geomagnetic record. The purpose of this study is to place firm limits on these fundamental properties by constructing a high-resolution archaeointensity record of the Levant from the 11th century to the early 9th century BCE, a period over which the geomagnetic field reached its maximum intensity in Eurasia over the past 50,000 years. We investigate a 14C-dated sequence of ten layers of slag material, which accumulated within an ancient industrial waste mound of an Iron Age copper-smelting site in southern Israel. Depositional stratigraphy constrains relative ages of samples analyzed for paleointensity, and 14C dates from different horizons of the mound constrain the age of the whole sequence. The analysis yielded 35 paleointensity data points with accuracy better than 94% and precision better than 6%, covering a period of less than 350 years, most probably 200 years. We construct a new high-resolution quasi-continuous archaeointensity curve of the Levant that displays two dramatic spikes in geomagnetic intensity, each corresponding to a virtual axial dipole moment (VADM) in excess of 200 ZAm2. The geomagnetic spikes rise and fall over a period of less than 30 years and are associated with VADM fluctuations of at least 70 ZAm2. Thus, the Levantine archaeomagnetic record places new constraints on maximum geomagnetic intensity as well as on its rate of change. Yet, it is not clear whether the geomagnetic spikes are local non-dipolar features or a geomagnetic dipolar phenomenon.
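The conversion from a measured surface intensity to a virtual axial dipole moment uses the standard geocentric axial dipole relation. The sketch below applies it with an illustrative spike-level intensity; the 120 µT value and the 30 N latitude are assumptions for demonstration, not data from the study.

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, T m / A
R_EARTH = 6.371e6      # Earth mean radius, m

def vadm(intensity_tesla, latitude_deg):
    """Virtual axial dipole moment (A m^2) from a surface intensity,
    via the geocentric axial dipole relation
    F = (mu0 m / (4 pi R^3)) * sqrt(1 + 3 sin^2(latitude))."""
    lat = math.radians(latitude_deg)
    return (4.0 * math.pi * R_EARTH ** 3 * intensity_tesla) / (
        MU0 * math.sqrt(1.0 + 3.0 * math.sin(lat) ** 2))

# An assumed spike-level intensity of 120 microtesla near 30 N
# corresponds to a VADM above the 200 ZAm^2 level (1 ZAm^2 = 1e21 A m^2):
print(vadm(120e-6, 30.0) / 1e21, "ZAm^2")
```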

10.
It may be possible to calculate the rate of reconnection in the corona by measuring the rate at which the temporary coronal hole formed by a coronal mass ejection (CME) disappears. This calculation is possible if the disappearance of the hole is caused by the same reconnection process which creates the giant X-ray arches associated with CMEs. These arches form just below the vertical current sheet that is created as the CME drags magnetic field lines out into interplanetary space, and they are similar in form to ‘post’-flare loops, except that they often have an upward motion that is different. Instead of continually slowing with time as ‘post’-flare loops do, they move upwards at a rate which increases, or remains nearly constant, with time. This difference has raised doubts about the relevance of reconnection to the formation and propagation of the arches. Using a two-dimensional flux rope model to calculate the size and location of the current sheet as a function of time, we find that the difference between the motion of ‘post’-flare loops and giant arches can be explained simply by the variation of the coronal Alfvén speed with height.
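The key quantity in this explanation is the coronal Alfvén speed, v_A = B/sqrt(mu0 rho). The sketch below computes it for a hydrogen plasma; the field-strength and density profiles are hypothetical placeholders, chosen only to show how the competition between the fall-off of B and of rho shapes v_A with height.

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, T m / A
M_P = 1.6726e-27       # proton mass, kg

def alfven_speed(b_tesla, n_per_m3):
    """Alfven speed v_A = B / sqrt(mu0 * rho) for a hydrogen plasma,
    approximating the mass density as rho = n * m_p."""
    return b_tesla / math.sqrt(MU0 * n_per_m3 * M_P)

# Hypothetical coronal profiles (assumed, for illustration only):
def model_b(h_mm):   # field strength (T) vs height in Mm
    return 1e-3 * (1.0 + h_mm / 70.0) ** -2

def model_n(h_mm):   # number density (m^-3) vs height in Mm
    return 1e15 * math.exp(-h_mm / 60.0)

for h in (0.0, 50.0, 100.0, 200.0):
    print(h, round(alfven_speed(model_b(h), model_n(h)) / 1e3), "km/s")
```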

11.
This paper empirically investigates the asymptotic behaviour of the flood probability distribution and, more precisely, the possible occurrence of heavy-tailed distributions, generally predicted by multiplicative cascades. Since heavy tails considerably increase the frequency of extremes, they have many practical and societal consequences. A French database of 173 daily discharge time series is analyzed. These series correspond to various climatic and hydrological conditions, drainage areas ranging from 10 to 10⁵ km², and record lengths from 22 to 95 years. The peaks-over-threshold method has been used with a set of semi-parametric estimators (Hill and generalized Hill estimators) and parametric estimators (maximum likelihood and L-moments). We discuss the respective merits of the estimators and compare their estimates of the shape parameter of the probability distribution of the peaks. We emphasize the influence of the number of highest observations selected for the estimation procedure and, in this respect, the particular interest of the semi-parametric estimators. Nevertheless, the various estimators agree on the prevalence of heavy tails, and we point out some links between their presence and hydrological and climatic conditions.
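The Hill estimator named above can be stated compactly: it averages log-excesses over the k largest order statistics. The sketch below is a generic implementation (not the authors' code), checked against exact Pareto quantiles with tail index 0.5; the sensitivity of the result to k is precisely the issue the paper emphasizes.

```python
import math

def hill_estimator(data, k):
    """Hill estimator of the tail index xi from the k largest
    observations: xi_hat = (1/k) * sum_{i=1..k} ln(X_(i) / X_(k+1)),
    with X_(1) >= X_(2) >= ... the descending order statistics.
    A stable xi_hat > 0 across k indicates a heavy (Pareto-type) tail."""
    xs = sorted(data, reverse=True)
    if k >= len(xs):
        raise ValueError("k must be smaller than the sample size")
    x_k1 = xs[k]  # the (k+1)-th largest observation
    return sum(math.log(xs[i] / x_k1) for i in range(k)) / k

# Exact quantiles of a Pareto law with tail index 0.5 (u ** -0.5):
pareto = [(i / 101.0) ** -0.5 for i in range(1, 101)]
print(hill_estimator(pareto, 30))  # close to the true value 0.5
```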

12.
13.
14.
Recent observations and missions to Mars have provided us with new insight into the planet's past habitability and history. At the same time, they have raised many questions about its evolution. We show that, even with the few data available, we can propose a scenario for the evolution of the Martian atmosphere over the last three billion years. Our model is obtained by back-integrating the Martian atmosphere, taking into account volcanic degassing, which constitutes an input of volatiles, and atmospheric escape to space. We focus on CO2, the predominant Martian atmospheric gas. Volcanic CO2 degassing rates are obtained for different numerical models of crust production rates [Breuer, D., Spohn, T., 2003. Early plate tectonics versus single-plate tectonics on Mars: Evidence from magnetic field history and crust evolution. J. Geophys. Res. - Planets 108 (E7), 5072; Breuer, D., Spohn, T., 2006. Viscosity of the Martian mantle and its initial temperature: Constraints from crust formation history and the evolution of the magnetic field. Planet. Space Sci. 54, 153–169; Manga, M., Wenzel, M., Zaranek, S.E., 2006. Mantle plumes and long-lived volcanism on Mars as a result of a layered mantle. American Geophysical Union Fall Meeting 2006, Abstract #P31C-0149] and constrained by observations. By estimating the volatile contents of the lavas, the amount of volatiles released into the atmosphere is estimated for different scenarios. Both non-thermal processes (related to solar activity) and thermal processes are studied, and non-thermal processes are incorporated in our modelling of the escape [Chassefière, E., Leblanc, F., Langlais, B., 2006. The combined effects of escape and magnetic field history at Mars. Planet. Space Sci. 55 (3), 343–357].
We used measurements from ASPERA on Mars Express together with these models to estimate the amount of atmosphere lost. An evolution of the CO2 pressure consistent with its present state is then obtained. A crustal production rate of at least 0.01 km3/year is needed for the atmosphere to be at steady state. Moreover, we show that for most scenarios a rapid loss of the primary (and primordial) atmosphere due to atmospheric escape is required in the first 2 Gyr in order to obtain the present-day atmosphere. When the CO2 concentration in the mantle is high enough (i.e., more than 800 ppm), our results imply that the present-day atmosphere would have a volcanic origin and would have been created between 1 and 2 Gyr ago, even for models with low volcanic activity. If the volcanic activity and the degassing are intense enough, the atmosphere can even be entirely secondary and as young as 1 Gyr. However, with low activity and low CO2 concentration (less than 600 ppm), the present-day atmosphere is likely to be largely primordial.
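A zeroth-order version of such a back-integration is easy to state: the atmospheric CO2 budget changes at the degassing rate minus the escape rate. The sketch below uses constant, hypothetical rates purely to show the mechanics; the actual model ties degassing to crust-production history and escape to solar activity.

```python
def integrate_pressure(p_now_mbar, degas_rate, escape_rate,
                       dt_myr=1.0, total_gyr=3.0):
    """Back-integrate a zeroth-order CO2 pressure budget
        dP/dt = degassing - escape   (mbar per Myr)
    from the present backwards over total_gyr.  Constant rates are
    hypothetical placeholders for the model's time-dependent ones."""
    steps = int(total_gyr * 1000.0 / dt_myr)
    p = p_now_mbar
    history = [p]
    for _ in range(steps):
        # Going backwards in time, undo one forward Euler step
        p -= (degas_rate - escape_rate) * dt_myr
        p = max(p, 0.0)  # pressure cannot be negative
        history.append(p)
    return history

# Present CO2 pressure ~6 mbar; a net volcanic source slightly exceeding
# escape implies a thinner atmosphere 3 Gyr ago under these toy rates:
past = integrate_pressure(6.0, degas_rate=0.004, escape_rate=0.003)
print(past[-1])
```

When the two toy rates balance exactly, the pressure is at steady state, which is the regime the abstract links to a minimum crustal production rate.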

15.
16.
Preliminary discussion on earthquake prediction research: whether it relies on experience or determinacy. LIUQIAO WANG (王六桥), Seismological Bu...

17.
ABSTRACT

From ancient times dice have been used to denote randomness. A dice-throw experiment is set up in order to examine the predictability of the die orientation through time using visualization techniques. We apply and compare a deterministic-chaotic model and a stochastic model, and we show that both suggest predictability in die motion that deteriorates with time, just as in hydro-meteorological processes. Namely, a die's trajectory can be predictable for short horizons and unpredictable for long ones. Furthermore, we show that the same models can be applied, with satisfactory results, to high-temporal-resolution time series of rainfall intensity and wind speed magnitude recorded during mild and strong weather conditions. The difference between the experimental process and the two natural processes lies in the length of the high-predictability window, which is of the order of 0.1 s, 10 min and 1 h for the dice, rainfall and wind processes, respectively.
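The notion of a finite high-predictability window can be illustrated with any chaotic system. The toy sketch below (not the paper's die model) tracks how long two nearby logistic-map trajectories stay close: the horizon grows only logarithmically as the initial uncertainty shrinks, which is why short horizons are predictable and long ones are not.

```python
def divergence_time(x0, delta, tol=0.1, r=4.0, max_steps=500):
    """Number of iterations until two logistic-map trajectories that
    start delta apart separate by tol.  A toy stand-in for the die's
    finite predictability window: for the fully chaotic map (r = 4)
    the Lyapunov exponent is ln 2, so the horizon grows only like
    log(1/delta)."""
    a, b = x0, x0 + delta
    for step in range(max_steps):
        if abs(a - b) >= tol:
            return step
        a = r * a * (1.0 - a)
        b = r * b * (1.0 - b)
    return max_steps

# Shrinking the initial uncertainty by six orders of magnitude only
# roughly doubles the predictability horizon:
print(divergence_time(0.3, 1e-6), divergence_time(0.3, 1e-12))
```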

18.
Natural ponds are perceived as spatially and temporally highly variable ecosystems. This perception is in contrast to the often-applied sampling design with high spatial but low temporal replication. Based on a data set covering a period of six years and 20 permanently to periodically inundated ponds, we investigated whether this widely applied sampling design is sufficient to identify differences between single ponds or single years with regard to water quality and macrophyte community composition as measures of ecosystem integrity.
In our study, the factor "pond", which describes differences between individual ponds, explained 56 % and 63 %, respectively, of the variance in water quality and macrophyte composition. In contrast, the factor "year", which refers to changes between individual years, contributed less to explaining the observed variability in water quality and macrophyte composition (10 % and 7 % of the variance explained, respectively). The low explained variance for "year" and the low year-to-year correlations for the single water quality parameters and macrophyte coverage values indicated high but non-consistent temporal variability affecting individual pond patterns.
In general, the results largely supported the ability of the widely applied sampling strategy of about one sampling date per year to capture differences in water quality and macrophyte community composition between ponds. Hence, future research can rest upon sampling designs that give more weight to the number of ponds than to the number of years, depending on the research question and the available resources. Nonetheless, pond research would miss a substantial amount of information (7 to 10 % of the variance explained) if sampling were generally restricted to one year. Moreover, we expect that the importance of multiple-year sampling will likely increase in periods and regions of higher hydrological variability compared to the average hydrological conditions encountered in the studied period.

19.
The value of a formally defined Anthropocene for geomorphologists is discussed. Human impacts have been diachronistic, multifaceted and episodic, as demonstrated by the record of alluvial deposition in the UK. Rather than boxing time into discrete eras or periods, modern research uses calendar dates and multiple dating techniques to explore co‐trajectories for a range of human impacts. Despite the value of ‘The Anthropocene’ as an informal concept and as a prompt to useful debate, arriving at a single, generally acceptable formal definition is impractical, and has some disadvantages. Copyright © 2013 John Wiley & Sons, Ltd.

20.
New concepts in ecological risk assessment: where do we go from here?
Through the use of safety factors, single-species test data have been adequate for protective hazard assessments and criteria setting; but because hazard quotients consider neither the presence of multiple species, each with its particular sensitivity, nor the interactions that can occur between these species in a functioning community, they are ill-suited to environmental risk assessment. Significant functional redundancy occurs in most ecosystems, but this is poorly captured in single-species tests conducted under laboratory conditions. A significant advance in effects assessment was the use of the microcosm as a unit within which to test interacting populations of organisms. The microcosm has allowed the measurement of environmental effect measures such as the NOAEC (community) under laboratory or field conditions and the application of this and similarly derived measures to ecological risk assessment (ERA). More recently, distributions of single-species laboratory test data have been used for criteria setting and, combined with distributions of exposure concentrations, for risk assessment. Distributions of species sensitivity values have been used a priori for setting environmental quality criteria, such as the final acute value (FAV) derived for water quality criteria. Similar distributional approaches have been combined with modeled or measured concentrations to produce estimates of the joint probability of a single species being affected, or of a proportion of organisms in a community being impacted, in a posteriori risk assessments. These techniques have not been widely applied to risk assessment of dredged materials; however, with appropriate consideration of bioavailability and of the spatial character and nature of the data, these techniques can be applied to soils and sediments.
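A minimal sketch of the species-sensitivity-distribution approach described above: fit a log-normal distribution to single-species toxicity endpoints and read off the concentration expected to protect 95% of species (HC5). The endpoint values below are hypothetical, and regulatory derivations would add confidence limits rather than use a bare point estimate.

```python
import math
from statistics import NormalDist

def hc5(endpoints):
    """Hazardous concentration for 5% of species from a log-normal
    species sensitivity distribution fitted to single-species
    toxicity endpoints (e.g. EC50 values).  Point estimate only."""
    logs = [math.log10(v) for v in endpoints]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / (n - 1))
    z05 = NormalDist().inv_cdf(0.05)   # 5th percentile, ~ -1.645
    return 10.0 ** (mu + z05 * sigma)

# Hypothetical EC50s (mg/L) for eight species:
print(hc5([0.5, 1.2, 2.0, 3.5, 5.0, 8.0, 12.0, 20.0]))  # ~0.5 mg/L
```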

