Similar Literature
20 similar documents found.
1.
The increasing production, and therefore sea traffic, of vegetable oils has regularly led to spillages over the past 40 years. The accident of the Allegra on 1 October 1997 in the English Channel gave rise to a spillage of 900 tonnes of palm nut oil. The drift of this solid vegetable oil was followed by aerial observations, and samples of oil were collected in order to analyse its chemical evolution. This study, together with several published cases of pollution by non-petroleum oils, shows that drifting oils can mix with floating material to sink or form a crust; they can also be oxidized, dispersed and/or degraded by bacteria, and they may polymerise. The coating properties of vegetable oils, like those of crude oils, affect sea life, tourism and yachting. As a result, it is necessary to collect the oil quickly after a spillage, using standard equipment (booms and pumps).

2.
A database of earthquake-induced landslides has been compiled which extends the work of Keefer (Keefer DK. Landslides caused by earthquakes. Bulletin of the Geological Society of America 1984;95:406–421), who covered the period 1811–1980, up to 1997. A total of 36 earthquakes worldwide are included, so the new database contains about the same number of earthquakes as reported by Keefer. Correlations derived from the new database are compared with those of Keefer. Generally the results are very similar, though the presence of extreme outliers in some of the correlations emphasises the need to be aware of special cases, particularly those involving quick clay landslides. Seismological features, including multiple earthquakes and the simultaneous arrival of different phases of seismic waves, also influence the outliers. The correlations between earthquake magnitude and total landslide area, however, differ somewhat from Keefer's, and for the intermediate magnitude range 5.3–7.0 a modified correlation is suggested. The scatter of the data from which the correlations are derived is greater than that found by Keefer; this is ascribed to the different geographic locations of the earthquakes in the two data sets.
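Correlations of this type are, in essence, log-linear regressions fitted to (magnitude, total landslide area) pairs. A minimal sketch of that fitting step, using hypothetical values rather than either Keefer's data or the new database:

    # Sketch: fit log10(total landslide area) against magnitude, in the spirit
    # of Keefer-type correlations. The (M, area_km2) pairs are hypothetical.
    import numpy as np

    events = np.array([
        # (magnitude, total landslide area in km^2) -- illustrative values only
        (5.5,    40.0),
        (6.0,   200.0),
        (6.5,   900.0),
        (7.0,  4000.0),
        (7.5, 18000.0),
    ])

    M = events[:, 0]
    logA = np.log10(events[:, 1])

    # Least-squares fit: log10(A) = a*M + b
    a, b = np.polyfit(M, logA, 1)
    print(f"log10(A) = {a:.2f} * M + {b:.2f}")

    # Predicted total landslide area for a hypothetical M 6.8 event
    print(f"Predicted area for M 6.8: {10 ** (a * 6.8 + b):.0f} km^2")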

3.
A new approach for streamflow simulation using nonparametric methods was described in a recent publication (Sharma et al. 1997). Nonparametric methods have the advantage that they avoid the issue of selecting a probability distribution and can represent nonlinear features, such as asymmetry and bimodality, that hitherto were difficult to represent in the probability structure of hydrologic variables such as streamflow and precipitation. The nonparametric method used was kernel density estimation, which requires the selection of bandwidth (smoothing) parameters. This study documents some of the tests that were conducted to evaluate the performance of bandwidth estimation methods for kernel density estimation. Issues related to the selection of optimal smoothing parameters for kernel density estimation with small samples (200 or fewer data points) are examined. Both reference to a Gaussian density and data-based specifications are applied to estimate bandwidths for samples from bivariate normal mixture densities. The three data-based methods studied are Maximum Likelihood Cross Validation (MLCV), Least Square Cross Validation (LSCV) and Biased Cross Validation (BCV2). Modifications for estimating optimal local bandwidths using MLCV and LSCV are also examined. We found that the use of local bandwidths does not necessarily improve the density estimate with small samples. Of the global bandwidth estimators compared, MLCV and LSCV performed better because they show lower variability and higher accuracy, while Biased Cross Validation suffers from multiple optimal bandwidths for samples from strongly bimodal densities. These results, of particular interest in stochastic hydrology where small samples are common, may also be important in other applications of nonparametric density estimation with similar sample sizes and distribution shapes.
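Maximum-likelihood cross-validation of the kind compared above can be sketched with scikit-learn, whose KernelDensity.score returns the log-likelihood of held-out points; the bimodal sample, bandwidth grid and Gaussian-reference rule below are illustrative assumptions, not the study's configuration.

    # Sketch: choose a global Gaussian-kernel bandwidth by maximum-likelihood
    # cross-validation (MLCV) for a small bivariate, bimodal sample.
    import numpy as np
    from sklearn.model_selection import GridSearchCV
    from sklearn.neighbors import KernelDensity

    rng = np.random.default_rng(0)

    # Small sample (n = 200) from a bivariate normal mixture (two modes)
    n = 200
    comp = rng.random(n) < 0.5
    sample = np.where(comp[:, None],
                      rng.normal(loc=[-2.0, 0.0], scale=0.8, size=(n, 2)),
                      rng.normal(loc=[+2.0, 0.5], scale=1.2, size=(n, 2)))

    # MLCV: maximise the held-out log-likelihood over a bandwidth grid
    grid = GridSearchCV(KernelDensity(kernel="gaussian"),
                        {"bandwidth": np.linspace(0.1, 1.5, 29)},
                        cv=10)  # KernelDensity.score = total log-likelihood
    grid.fit(sample)
    print("MLCV bandwidth:", grid.best_params_["bandwidth"])

    # Reference-to-Gaussian (Silverman-type) bandwidth for comparison
    d, sigma = sample.shape[1], sample.std(axis=0).mean()
    h_ref = sigma * (4.0 / ((d + 2) * n)) ** (1.0 / (d + 4))
    print("Gaussian-reference bandwidth:", round(h_ref, 3))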

5.
Geomagnetism and Aeronomy - The Forbush decreases for the period from 1997 to 2020 were studied based on data from the database on Forbush effects and interplanetary disturbances created and...

6.
Geomagnetism and Aeronomy - Forbush decreases occurring from 1997 to 2017 (1055 events in total) have been analyzed with the use of a database of Forbush effects and interplanetary disturbances...

7.
《Journal of Geodynamics》2010,49(3-5):299-304
The Global Geodynamics Project (GGP) started on July 1, 1997 and is now in its 11th year of operation. It has a relatively small number of stations (24) compared to seismic (GSN) or geodetic (GPS) networks, but it is the only database that is accumulating relative gravity measurements worldwide. As any scientific organization matures, the culture of the project and the people involved changes; to remain viable, it is necessary not only to maintain the original goals but also to incorporate new ideas and applications of the science involved. The main challenges within GGP are to ensure: (a) that the instruments are properly calibrated, (b) that data are recorded with the highest accuracy and with appropriate hydrological instrumentation, and (c) that the flow of data from all recording stations to the ICET database continues as agreed within the GGP framework. These practical matters are the basis for providing high-quality recordings that will extend the usefulness of the network into the future to meet new challenges in the geosciences. Several new stations have been brought into operation in the past few years, but data availability from some of these stations still leaves room for improvement. Nevertheless, the core group of stations established more than 10 years ago has maintained the high standards of the original concept, and much research has been published using network data in areas as diverse as hydrology, polar motion, and Earth's normal modes. GGP will also participate in some of the scientific tasks of the Global Geodetic Observing System program, at least initially by providing relative gravity measurements for collocation with other high-precision geodetic measurements.

8.
The Chinese Earthquake Science Thesaurus (《中国地震科学主题词表》) is now widely used in building earthquake science and technology information databases and in publishing seismological journals. During more than ten years of constructing the China Earthquake Science and Technology Literature Database (English edition), the author used the thesaurus to index the subject terms of more than 13,000 documents and compared it with the Chinese Thesaurus (《汉语主题词表》). On this basis, suggestions for revising the Chinese Earthquake Science Thesaurus are put forward, and it is argued that a revised and republished edition of the thesaurus is urgently needed.

9.
This paper presents EOF analysis results for lightning density (LD) anomalies in different seasons over southeastern China and the Indochina Peninsula, using the OTD/LIS global LD database (June 1995 to February 2003) at 2.5°×2.5° resolution provided by the Global Hydrology Resource Center. Positive LD anomalies in the region appeared at the same time as the steep increase of the NINO3 SSTA in the spring of 1997 and remained at a high level until the following spring; the corresponding seasonal maxima of the anomaly percentage were 89%, 30%, 45%, 498% and 55% from the beginning to the end of the 1997/98 El Niño (ENSO) event. The centre of the positive LD anomalies in spring and winter is located over southeastern China and the adjacent coastal areas, whereas in summer and autumn it is located over the southern Indochina Peninsula and the Gulf of Thailand; compared with normal years, the centre in each season is shifted westward during the ENSO event, and in winter and spring it is also shifted northward. In addition, an analysis of the interannual variations of the LD anomaly percentage, convective precipitation and H-CAPE days in southern China shows that the three anomaly percentages are mutually correlated in the positive-anomaly zone and the Kuroshio area. The relative variation of LD during the El Niño period is the largest of the three and exceeds that during non-El Niño periods, indicating that lightning activity responds most sensitively to ENSO in both areas. However, the response of lightning activity and precipitation to ENSO appears more complex and diversified over the Kuroshio area as well as over the Qinghai-Tibet Plateau and northwestern and northeastern China.
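Computationally, an EOF analysis of this kind is a singular value decomposition of the space–time anomaly matrix; a minimal sketch on a synthetic gridded field (not the OTD/LIS data, and with an arbitrary grid size):

    # Sketch: EOF (empirical orthogonal function) analysis of a gridded anomaly
    # field via SVD. The anomaly field below is synthetic, for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n_time, n_lat, n_lon = 32, 10, 14          # e.g. seasons x grid cells

    field = rng.normal(size=(n_time, n_lat, n_lon))

    # Build the space-time matrix and remove the time mean at every grid point
    X = field.reshape(n_time, n_lat * n_lon)
    X = X - X.mean(axis=0)

    # SVD: rows of Vt are the spatial EOF patterns, U*S their time series
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    explained = S**2 / np.sum(S**2)

    eof1 = Vt[0].reshape(n_lat, n_lon)          # leading spatial pattern
    pc1 = U[:, 0] * S[0]                        # its principal-component series
    print("variance explained by EOF-1:", round(float(explained[0]), 3))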

10.
The article analyses worldwide research patterns concerning ground penetrating radar (GPR) during 1995–2014. To do this, the Thomson Reuters Science Citation Index Expanded (SCI-EXPANDED) and the Social Sciences Citation Index, accessed via the Web of Science Core Collection, were taken as the two reference bibliographic databases. We examine document typology and language, publication trends and citations, subject categories and journals, collaborations between authors, author productivity, the most cited articles, the countries and institutions involved, and other hot issues. Concerning the main research subfields involving GPR use, five were identified: physical–mathematical, sedimentological–stratigraphical, civil engineering/engineering geology/cultural heritage, hydrological (HD), and glaciological (GL).

11.
Strong-motion data from eight significant, well-documented earthquakes in Iran have been simulated using the stochastic finite-fault modeling technique proposed by Beresnev and Atkinson [Bull Seismol Soc Am 87 (1997) 67–84; Seism Res Lett 69 (1998) 27–32]. The database consists of 61 three-component records from eight earthquakes with magnitudes ranging from M 6.3 to M 7.4, recorded at hypocentral distances up to 200 km. The model predictions are in good agreement with the available Iranian strong-motion data, as evidenced by a near-zero average of the differences between the logarithms of observed and predicted values at all frequencies. The strength factor, sfact, a quantity that controls the high-frequency radiation from the source, is determined on an event-by-event basis by fitting simulated to observed response spectra.
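The "near-zero average of differences between logarithms" quoted above is a standard model-bias measure; a minimal sketch of how such a bias is computed, with hypothetical spectral amplitudes rather than the Iranian records or the authors' simulations:

    # Sketch: model bias as the mean difference of log10 observed vs. simulated
    # response-spectral amplitudes per frequency. All values are hypothetical.
    import numpy as np

    freqs     = np.array([0.5, 1.0, 2.0, 5.0, 10.0])    # Hz
    observed  = np.array([12.0, 30.0, 55.0, 80.0, 60.0])  # e.g. PSA, cm/s^2
    simulated = np.array([10.0, 33.0, 50.0, 85.0, 66.0])

    residual = np.log10(observed) - np.log10(simulated)  # per-frequency residual
    print("bias per frequency:", np.round(residual, 3))
    print("average bias over all frequencies:", round(float(residual.mean()), 3))
    # A near-zero average indicates that, on average, the simulations neither
    # over- nor under-predict the recorded motions.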

12.
The first phase of the Next Generation Attenuation (NGA) project has now finished, resulting in the publication of five new sets of empirical ground-motion models for PGA, PGV and response spectral ordinates. These models mark a significant advancement in the state-of-the-art in empirical ground-motion modelling and include many effects that are not accounted for in existing European equations. Under the assumption that the Euro-Mediterranean database from which the European relationships are derived is unlikely to drastically change in the near future, a prudent question to ask is: can the NGA models be applied in Europe? In order to answer this question, the NGA model of Boore and Atkinson (PEER Report 2007/01, Pacific Earthquake Engineering Research Center, Berkeley, CA, 234 pp., 2007), which is shown to be representative of the NGA models as a suite, is compared with the dataset used for the development of the most recent European empirical ground-motion models for response spectral ordinates and peak ground velocity. The comparisons are made using analyses of model residuals and the likelihood approach of Scherbaum et al. (Bull Seism Soc Am 94(6):2164–2185, 2004). The analyses indicate that for most engineering applications, and particularly for displacement-based approaches to seismic design, the NGA models may confidently be applied within Europe. Furthermore, it is recommended that they be used in conjunction with existing European models to provide constraint on finite-fault effects and non-linear site response within logic-tree frameworks. The findings also point to the potential benefits of merging the NGA and European datasets.
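The likelihood (LH) measure of Scherbaum et al. (2004) scores how well residuals, normalized by the model's standard deviation, conform to a standard normal distribution; a sketch assuming the commonly cited form LH = 2[1 − Φ(|z|)], with hypothetical observations, predictions and sigma:

    # Sketch of the likelihood (LH) goodness-of-fit measure: each record's
    # normalized residual z = (ln(obs) - ln(pred)) / sigma is scored by the
    # probability that |Z| >= |z| under a standard normal. Values hypothetical.
    import numpy as np
    from scipy.stats import norm

    ln_obs  = np.log([0.12, 0.30, 0.08, 0.22])   # observed PGA (g), hypothetical
    ln_pred = np.log([0.10, 0.35, 0.09, 0.18])   # median model predictions
    sigma   = 0.60                               # model's total ln-sigma (assumed)

    z = (ln_obs - ln_pred) / sigma
    LH = 2.0 * (1.0 - norm.cdf(np.abs(z)))       # LH = 1 only when z = 0

    print("normalized residuals:", np.round(z, 2))
    print("LH values:", np.round(LH, 2))
    print("median LH:", round(float(np.median(LH)), 2))
    # Models are then compared using the distribution of LH values (e.g. its
    # median) together with the mean and spread of the normalized residuals.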

13.
Satellite data offer a means of supplementing ground-based monitoring during volcanic eruptions, especially at times or locations where ground-based monitoring is difficult. Being directly and freely available several times a day, data from the Advanced Very High Resolution Radiometer (AVHRR) offer great potential for near real-time monitoring of all volcanoes across large (3000×3000 km) areas. Herein we describe techniques to detect and locate activity; estimate lava area, thermal flux, effusion rates and cumulative volume; and distinguish types of activity. Application is demonstrated using data for active lavas at Krafla, Etna, Fogo, Cerro Negro and Erebus; a pyroclastic flow at Lascar; and open-vent systems at Etna and Stromboli. Automated near real-time analysis of AVHRR data could be achieved at existing, or cheap-to-install, receiving stations, offering a supplement to conventional monitoring methods.
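One common way such thermal estimates are made is to attribute the above-background radiance of a pixel to a subpixel hot component. The sketch below illustrates that two-component idea only; the temperatures, pixel radiance and pixel size are assumed round numbers, not the authors' calibrated procedure.

    # Sketch: estimate the subpixel area occupied by active lava from excess
    # mid-infrared radiance in one AVHRR pixel, assuming a single hot component
    # at an assumed lava surface temperature. All numbers are illustrative.
    import numpy as np

    H, C, K = 6.626e-34, 2.998e8, 1.381e-23      # Planck, light speed, Boltzmann

    def planck_radiance(wavelength_m, temp_k):
        """Spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
        a = 2.0 * H * C**2 / wavelength_m**5
        return a / (np.exp(H * C / (wavelength_m * K * temp_k)) - 1.0)

    wavelength = 3.75e-6        # AVHRR band 3 (~3.75 micrometres)
    pixel_area = 1.1e3 * 1.1e3  # nominal 1.1 km x 1.1 km pixel, m^2

    T_background = 290.0        # assumed ambient surface temperature, K
    T_hot = 1000.0              # assumed lava surface temperature, K
    L_pixel = 1.2 * planck_radiance(wavelength, T_background)   # hypothetical

    # Two-component pixel: L = p*B(T_hot) + (1-p)*B(T_background) -> solve for p
    p = ((L_pixel - planck_radiance(wavelength, T_background)) /
         (planck_radiance(wavelength, T_hot) - planck_radiance(wavelength, T_background)))

    lava_area = p * pixel_area                                   # m^2
    sigma_sb = 5.67e-8
    radiative_flux = lava_area * sigma_sb * (T_hot**4 - T_background**4)  # W

    print(f"hot fraction p = {p:.2e}, lava area = {lava_area:.1f} m^2")
    print(f"radiative heat flux = {radiative_flux / 1e6:.2f} MW")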

14.
15.
Scherbaum et al. [(2004) Bull Seismol Soc Am 94(6):2164–2185] proposed a likelihood-based approach to select and rank ground-motion models for seismic hazard analysis in regions of low seismicity. The results of their analysis were first used within the PEGASOS project [Abrahamson et al. (2002), Proceedings of the 12th ECEE, London, Paper no. 633], so far the only application of a probabilistic seismic hazard analysis (PSHA) in Europe based on a SSHAC Level 4 procedure [Budnitz et al. (1997), Recommendations for PSHA: guidance on uncertainty and use of experts, NUREG/CR-6372-V1]. The outcome of this project has generated considerable discussion [Klügel (2005a, b) Eng Geol 78:285–307; Klügel (2005c) Eng Geol 82:79–85; Musson et al. (2005) Eng Geol 82(1):43–55; Budnitz et al. (2005) Eng Geol 78(3–4):285–307], a central part of which relates to the selection and ranking of ground-motion models. Since at the time of the study by Scherbaum et al. (2004) only records from one earthquake were available for the study area, here we test the stability of their results using more recent data. Increasing the data set from 12 records of one earthquake in Scherbaum et al. (2004) to 61 records of 5 earthquakes, most of which occurred after the publication of the original study, does not change the set of the three top-ranked ground-motion models [Abrahamson and Silva (1997) Seismol Res Lett 68(1):94–127; Lussou et al. (2001) J Earthquake Eng 5(1):13–33; Berge-Thierry et al. (2003) Bull Seismol Soc Am 95(2):377–389]. Only for the lower-ranked models do we obtain modifications in the ranking order. Furthermore, the records from the Waldkirch earthquake (December 5, 2004, Mw = 4.9) enabled us to develop a new stochastic model parameter set for applying Campbell's [(2003) Bull Seismol Soc Am 93(3):1012–1033] hybrid empirical model to SW Germany and neighbouring regions.

16.
General database for ground water site information
In most cases, analysis and modeling of flow and transport dynamics in ground water systems require long-term, high-quality, and multisource data sets. This paper discusses the structure of a multisite database (the H+ database) developed within the scope of the ERO program (French Environmental Research Observatory, http://www.ore.fr). The database provides an interface between field experimentalists and modelers, which can be used on a daily basis. The database structure enables the storage of a large number of data and data types collected from a given site or multiple-site network. The database is well suited to the integration, backup, and retrieval of data for flow and transport modeling in heterogeneous aquifers. It relies on the definition of standards and uses a templated structure, such that any type of geolocalized data obtained from wells, hydrological stations, and meteorological stations can be handled. New types of platforms other than wells, hydrological stations, and meteorological stations, and new types of experiments and/or parameters could easily be added without modifying the database structure. Thus, we propose that the database structure could be used as a template for designing databases for complex sites. An example application is the H+ database, which gathers data collected from a network of hydrogeological sites associated with the French Environmental Research Observatory.
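A templated structure of the kind described can be illustrated with a generic relational schema in which any geolocalized platform (well, hydrological or meteorological station) stores typed time series, so new platform kinds or parameters are added as rows rather than as schema changes. All table, column and example names below are hypothetical illustrations, not the actual H+ tables.

    # Sketch: a generic, templated schema for a multi-site ground water database.
    # Table and column names are hypothetical, for illustration only.
    import sqlite3

    DDL = """
    CREATE TABLE site      (site_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE platform  (platform_id INTEGER PRIMARY KEY,
                            site_id INTEGER REFERENCES site(site_id),
                            kind TEXT,   -- 'well' | 'hydro_station' | 'meteo_station' | ...
                            lon REAL, lat REAL, elevation_m REAL);
    CREATE TABLE parameter (parameter_id INTEGER PRIMARY KEY,
                            name TEXT, unit TEXT);   -- e.g. 'hydraulic head', 'm'
    CREATE TABLE measurement (platform_id INTEGER REFERENCES platform(platform_id),
                              parameter_id INTEGER REFERENCES parameter(parameter_id),
                              timestamp TEXT, value REAL, quality_flag TEXT);
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)

    # New platform kinds or parameters become new rows, not schema changes.
    conn.execute("INSERT INTO site VALUES (1, 'example_site', 'example_region')")
    conn.execute("INSERT INTO platform VALUES (1, 1, 'well', -3.40, 47.70, 30.0)")
    conn.execute("INSERT INTO parameter VALUES (1, 'hydraulic head', 'm')")
    conn.execute("INSERT INTO measurement VALUES (1, 1, '1997-06-01T00:00', 12.34, 'good')")

    for row in conn.execute("""SELECT p.kind, pa.name, m.timestamp, m.value, pa.unit
                               FROM measurement m
                               JOIN platform p   ON p.platform_id   = m.platform_id
                               JOIN parameter pa ON pa.parameter_id = m.parameter_id"""):
        print(row)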

17.
A database of intensity observations from instrumentally recorded earthquakes in South Africa has been compiled as a contribution to the characterisation of seismic hazard. The database contains about 1,000 intensity data points (IDPs) that have been assigned from macroseismic observations retrieved from newspaper reports and questionnaires, and also digitised from previously published isoseismal maps. The database includes IDPs from 57 earthquakes with magnitudes in the range Mw 2.2 to 6.4, at epicentral distances up to 1,000 km. Sixteen events have 20 or more IDPs, and half of these have more than 80 IDPs. The database is dominated by relatively low intensity values, mostly determined from human perception of shaking rather than structural damage; however, 19 IDPs correspond to intensity values greater than VI on the MMI-56 scale. Using geological maps of South Africa, the sites of 60% of the IDPs were classified as either 'rock' or 'soil', the uncertainty in locations precluding such a classification for the remaining data points. A few of the IDPs identified as being from soil sites appear to be strongly influenced by site effects, and these were removed from the trimmed database created for exploring ground-motion levels. The trimmed database includes 15 earthquakes which have a minimum of five useful IDPs, excluding those with intensity MMI = I and those based on a single observation. After removing such points, and those identified as clear 'outliers', a total of 436 useful IDPs were selected.

18.
Over the last decades, cosmogenic exposure dating has permitted major advances in many fields of Earth surface sciences, and particularly in paleoglaciology. Yet exposure age calculation remains a complicated and dense procedure: it requires numerous parameterization choices and the use of an accurate production rate. This study describes the CREp program (http://crep.crpg.cnrs-nancy.fr) and the ICE-D production rate online database (http://calibration.ice-d.org). The system is designed so that the CREp calculator automatically reflects the current state of the global ICE-D calibration database, which will be regularly updated to incorporate new calibration data and reflect the available literature. CREp is an Octave/Matlab© online code that computes Cosmic Ray Exposure (CRE) ages for 3He and 10Be. A stand-alone version of the CREp code is also released with the present article; note, however, that only the online version is connected to the ICE-D database. The CREp program offers the possibility to calculate ages with two scaling models: the empirical Lal-Stone time-dependent model (Balco et al., 2008; Lal, 1991; Stone, 2000) with the muon parameters of Braucher et al. (2011), and the Lifton-Sato-Dunai (LSD) theoretical model (Lifton et al., 2014). The default atmosphere model is the ERA-40 database (Uppala et al., 2005), but the standard atmosphere may also be used for comparison (N.O.A.A., 1976). To perform the time-dependent correction, users may import their own geomagnetic database for paleomagnetic corrections or opt for one of the three proposed datasets (Lifton, 2016; Lifton et al., 2014; Muscheler et al., 2005). For the important choice of the production rate, CREp is linked to a database of production rate calibration data that is part of the ICE-D (Informal Cosmogenic-nuclide Exposure-age Database) project (http://calibration.ice-d.org). This database includes the published, publicly available empirical calibration studies, comprising those of the CRONUS-Earth and CRONUS-EU projects as well as studies from other projects. In the present study, the efficacy of the different scaling models has also been evaluated by examining the statistical dispersion of the computed Sea Level High Latitude (SLHL) production rates: the Lal/Stone and LSD models have comparable efficacies, and the impact of the tested atmospheric model and geomagnetic database is limited. Users have several possibilities for selecting the production rate: 1) a worldwide mean value, 2) a regionally averaged value (not available in regions with no data), 3) a single local value, chosen from the existing dataset or imported by the user, or 4) any combination of multiple calibration data. If a global mean is chosen, the 1σ uncertainty arising from the production rate is about 5% for 10Be and 10% for 3He; if a regional production rate is picked, these uncertainties are potentially lower. CREp is able to calculate a large number of ages in a reasonable time (typically < 30 s for 50 samples), and the user may export a summary table of the computed ages and the probability density function associated with each age (in the form of a spreadsheet).
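For a surface with constant production, no erosion and no burial, the exposure age follows directly from the measured nuclide concentration, the scaled production rate and the decay constant. The sketch below shows only that simplified relation, not CREp's scaling schemes; the production rate, scaling factor and 10Be half-life are assumed round numbers.

    # Sketch: simple 10Be exposure-age calculation under constant production and
    # zero erosion. All parameter values are assumed, for illustration only.
    import math

    N = 6.0e5            # measured 10Be concentration, atoms g^-1 (hypothetical)
    P_slhl = 4.0         # assumed SLHL spallation production rate, atoms g^-1 yr^-1
    scaling = 2.5        # assumed site scaling factor (altitude/latitude)
    half_life = 1.387e6  # assumed 10Be half-life, years

    P = P_slhl * scaling                    # local production rate
    lam = math.log(2.0) / half_life         # decay constant, yr^-1

    # N(t) = (P/lam) * (1 - exp(-lam*t))  =>  t = -ln(1 - lam*N/P) / lam
    t = -math.log(1.0 - lam * N / P) / lam
    print(f"exposure age ~ {t / 1e3:.1f} ka")

    # For a stable nuclide such as 3He, lam*t -> 0 and the age reduces to N/P:
    print(f"no-decay approximation: {N / P / 1e3:.1f} ka")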

19.
The interaction between a gaining stream and a water-table aquifer is studied at an outwash plain where the aquifer is hydraulically well connected to the stream. Pumping tests were carried out in 1997 and 1998 in two wells 60 m from the stream, screened at different depths of the aquifer. Drawdown was measured on both sides of the stream. Hydraulic head, drawdown, and stream depletion data were analyzed using numerical flow models. Similar models were fitted to each of two different data sets: model A was fitted to steady-state hydraulic head and streamflow gain data not influenced by pumping, and model B was fitted to drawdown data measured during the 1998 pumping test. Each calibrated model closely fits its calibration data; however, predictions were biased if model A was used to predict the calibration data of model B, and vice versa. To further test the models, they were used to predict streamflow depletion during the two pumping tests as well as the drawdown during the 1997 test; neither of these data sets was used for calibration. Model A predicted the measured depletions fairly accurately during both tests, whereas its predicted drawdowns in 1997 were significantly larger than actually measured. In contrast, the 1997 drawdowns predicted by model B were nearly unbiased; its predicted depletions deviate significantly from the measured depletions in 1997 but compare well with the observations in 1998. Thus, although field work and analyses were extensive and carried out carefully to develop a ground water flow model that could predict both drawdown and streamflow depletion, the model predictions are biased. Analyses indicate that the deviations between model and data may be due to errors in the models' representation of either the release of water from storage or the hydrology of the riparian zone.
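As an analytical counterpoint to the numerical models described above, the classical Glover–Balmer solution gives the fraction of the pumping rate captured from a fully penetrating stream in an idealized homogeneous aquifer. The sketch keeps the 60 m well-to-stream distance from the abstract, but the transmissivity and storage values are assumed for illustration.

    # Sketch: Glover-Balmer stream depletion for a well at distance d from a
    # fully penetrating stream in an ideal aquifer. Parameter values assumed;
    # the study itself used calibrated numerical models.
    import math

    def depletion_fraction(d_m, T_m2_per_d, S, t_days):
        """Fraction of the pumping rate supplied by stream depletion at time t."""
        return math.erfc(math.sqrt(d_m**2 * S / (4.0 * T_m2_per_d * t_days)))

    d = 60.0        # well-to-stream distance, m (as in the abstract)
    T = 500.0       # assumed transmissivity, m^2/day
    S = 0.2         # assumed specific yield (water-table aquifer)

    for t in (1, 5, 10, 30):
        frac = depletion_fraction(d, T, S, t)
        print(f"t = {t:2d} d: stream depletion = {frac:.2f} of pumping rate")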

20.
An introduction to and user guide for the World Stress Map 2000 release (WSM2000)
The state of crustal stress worldwide is of great significance to the Earth sciences and related fields, so measuring global crustal stress and compiling global stress maps has long attracted broad attention from the international geoscience community. The World Stress Map Project (WSMP, hereafter WSM), launched in 1986, has, through many years of collaboration among experts from many countries, released the latest WSM2000 stress database. Building on the WSM1997 release, the database adds 4,216 new observations, bringing the total to 10,920 reliable stress measurements worldwide, and provides a global stress map as well as regional stress maps for selected areas, supplying accurate and detailed stress data for geological and geophysical research. Drawing on material from the official WSM website, this paper introduces the WSM2000 database in terms of its structure, data sources, data quality, and data summary descriptions, so that researchers in China can quickly make use of this valuable data resource.
