Similar Literature
1.
Six national-scale, or near national-scale, geochemical data sets for soils or stream sediments exist for the United States. The earliest of these, here termed the ‘Shacklette’ data set, was generated by a U.S. Geological Survey (USGS) project conducted from 1961 to 1975. This project used soil collected from a depth of about 20 cm as the sampling medium at 1323 sites throughout the conterminous U.S. The National Uranium Resource Evaluation Hydrogeochemical and Stream Sediment Reconnaissance (NURE-HSSR) Program of the U.S. Department of Energy was conducted from 1975 to 1984 and collected either stream sediments, lake sediments, or soils at more than 378,000 sites in both the conterminous U.S. and Alaska. The sampled area represented about 65% of the nation. The Natural Resources Conservation Service (NRCS), from 1978 to 1982, collected samples from multiple soil horizons at sites within the major crop-growing regions of the conterminous U.S. This data set contains analyses of more than 3000 samples. The National Geochemical Survey, a USGS project conducted from 1997 to 2009, used a subset of the NURE-HSSR archival samples as its starting point and then collected primarily stream sediments, with occasional soils, in the parts of the U.S. not covered by the NURE-HSSR Program. This data set contains chemical analyses for more than 70,000 samples. The USGS, in collaboration with the Mexican Geological Survey and the Geological Survey of Canada, initiated soil sampling for the North American Soil Geochemical Landscapes Project in 2007. Sampling of three horizons or depths at more than 4800 sites in the U.S. was completed in 2010, and chemical analyses are currently ongoing. The NRCS initiated a project in the 1990s to analyze the various soil horizons from selected pedons throughout the U.S. This data set currently contains data from more than 1400 sites. 
This paper (1) discusses each data set in terms of its purpose, sample collection protocols, and analytical methods; and (2) evaluates each data set in terms of its appropriateness as a national-scale geochemical database and its usefulness for national-scale geochemical mapping.

2.
Progress in groundwater organic contamination surveys and assessments in Europe and America   (total citations: 5; self-citations: 0; citations by others: 5)
刘菲  王苏明  陈鸿汉 《地质通报》(Geological Bulletin of China), 2010, 29(5): 907-917
In 1999, the China Geological Survey launched its first project to investigate organic contamination of groundwater; at that time only 20 indicators were tested (11 volatile organic pollutants, 8 organochlorine pesticides, and 1 polycyclic aromatic hydrocarbon). The mandatory indicators for the groundwater organic-contamination surveys of the 11th Five-Year Plan comprised 38 items (28 volatile indicators, 9 organochlorine pesticides, and 1 polycyclic aromatic hydrocarbon), which yielded baseline data on organic contamination of groundwater. A review of the foreign literature shows, however, that the organic pollutants found in groundwater number far more than 38. To assess the quality of China's groundwater more comprehensively, the organic pollutants typical of groundwater in different regions and of different types should be studied, in preparation for adding indicators to subsequent surveys. By searching recent annual groundwater-quality reports and related literature of the U.S. Environmental Protection Agency (USEPA), the U.S. Geological Survey (USGS), and the European Union (EU), this paper reviews the types of organic pollutants typically found in groundwater, selects those most frequently detected, ranks them by detection rate, and lists the 50 pollutants with the highest detection rates.

4.
In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western States of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic-range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments have been integrated into the seismic network using Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington within 2 minutes, compared to an average response time of over 10 minutes for the previous 18 years.

5.
Concentration data on 73 individual constituents in United States Geological Survey (USGS) Geochemical Exploration Reference Materials GXR-1 to GXR-6 have been collected from 131 journal articles and technical reports. These data are summarized as consensus (mean) values with uncertainties expressed as one standard deviation. Mean values are also calculated as a function of analytical procedure and all raw data are given in the tables. Recommended values are proposed based upon data criteria used by NIST (National Institute of Standards and Technology, formerly the National Bureau of Standards or NBS).
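The consensus-value convention described above (mean of independently reported concentrations, uncertainty as one standard deviation) can be sketched as follows. This is an illustration only, not the compilers' actual procedure, and the reported values are hypothetical:

```python
import statistics

# Hypothetical independently reported Cu concentrations (ppm) for one
# GXR reference material, as might be gathered from the literature.
reported = [1110.0, 1080.0, 1125.0, 1090.0, 1105.0]

consensus = statistics.mean(reported)     # consensus (mean) value
uncertainty = statistics.stdev(reported)  # one (sample) standard deviation

print(f"Cu: {consensus:.1f} +/- {uncertainty:.1f} ppm")
```

In practice such compilations typically also screen outliers and group results by analytical method before averaging, which this sketch omits.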

6.
This paper contains the results of an extensive isotopic study of United States Geological Survey GSD-1G and MPI-DING reference glasses. Thirteen different laboratories were involved using high-precision bulk (TIMS, MC-ICP-MS) and microanalytical (LA-MC-ICP-MS, LA-ICP-MS) techniques. Detailed studies were performed to demonstrate the large-scale and small-scale homogeneity of the reference glasses. Together with previously published isotopic data from ten other laboratories, preliminary reference and information values as well as their uncertainties at the 95% confidence level were determined for H, O, Li, B, Si, Ca, Sr, Nd, Hf, Pb, Th and U isotopes using the recommendations of the International Association of Geoanalysts for certification of reference materials. Our results indicate that GSD-1G and the MPI-DING glasses are suitable reference materials for microanalytical and bulk analytical purposes.

7.
The U.S. Department of Energy is implementing the Frontier Observatory for Research in Geothermal Energy (FORGE) program for hot dry rock. Constrained by exploration and development of hot dry rock in its classical definition, the program aims to build a new-generation Enhanced Geothermal System (EGS) test platform through construction of a demonstration EGS project. Guided by the principle that "reproducible results = enormous potential", the U.S. seeks new breakthroughs in hot dry rock exploration and development technology, with the practical goal of supplying green electricity to 100 million American households. A comparison of typical EGS sites in China and the U.S. shows that the Milford site in Utah and the Qiabuqia site in Gonghe County, Qinghai Province, are comparable and roughly at the same ("running in parallel") stage, although Milford is somewhat ahead in characterizing the natural fracture system and in-situ stress field, acquiring fracturing parameters, and designing fracturing schemes. It is therefore recommended that the relevant authorities accelerate the advance of the Qiabuqia EGS site into the exploration and development stage, so as to raise China's level of hot dry rock exploration and development technology and realize engineering-scale EGS at an early date.

8.
Reference samples of soils from the Institute of Applied Physics, Irkutsk (RIAP), the Institute of Geochemistry, Irkutsk (IGI) and the United States Geological Survey, Reston (USGS) were analysed with the aim of determining Ag, B, Ge, Mo, Sn, Tl and W abundances by an atomic emission method with air-stabilised D.C. arc excitation. Two series of reference samples of soils and bottom sediments, GSS-1-8 and GSD-1-12 (IGGE), were used to ensure the traceability link for the analytical results. Traceability was also demonstrated through the comparison of measured results by AES and ICP-MS methods. It is shown that the reference samples GSS-1-8 and GSD-1-12 satisfied the "fitness-for-purpose" criterion (the uncertainty U of the certified value should be one-third to one-tenth the magnitude of the routine laboratory data uncertainty S, i.e. S/U > 3-10) and can be applied for calibrating AES techniques.
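The "fitness-for-purpose" criterion quoted above reduces to a simple ratio test, sketched here with hypothetical uncertainty values (this is an illustration of the stated S/U rule, not code from the study):

```python
def fit_for_purpose(s_routine: float, u_certified: float) -> bool:
    """True when the certified uncertainty U is at most one-third of the
    routine laboratory uncertainty S, i.e. S/U >= 3 (same units for both)."""
    return s_routine / u_certified >= 3.0

# Hypothetical uncertainties for one analyte, in the same concentration units.
print(fit_for_purpose(s_routine=0.9, u_certified=0.2))  # S/U = 4.5: usable
print(fit_for_purpose(s_routine=0.9, u_certified=0.5))  # S/U = 1.8: too coarse
```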

9.
Based on criteria developed by the International Atomic Energy Agency (IAEA), potential disposal sites for defueled, decommissioned nuclear submarines appear to exist in deep water south of the Mendocino Fracture Zone within 200 nautical miles of the United States. Oceanographic measurements in the water column and at the sea floor in a study area (W-N) at 39.5°N, 127.5°W will allow the operational and radiological consequences of deep-sea disposal to be compared with land burial of old submarines. The W-N studies also are yielding new data that will provide insights into the deposition and early diagenesis of distal hemipelagic sediments. Royalty-free reproduction of this article by the U.S. Government or by the authors for U.S. Government purposes is permitted.

10.
The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is a State/Federal partnership created to reduce tsunami hazards along U.S. coastlines. Established in 1996, NTHMP coordinates the efforts of five Pacific States: Alaska, California, Hawaii, Oregon, and Washington, with the three Federal agencies responsible for tsunami hazard mitigation: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the U.S. Geological Survey (USGS). In the 7 years of the program it has: (1) established a tsunami forecasting capability for the two tsunami warning centers through the combined use of deep-ocean tsunami data and numerical models; (2) upgraded the seismic network, enabling the tsunami warning centers to locate and size earthquakes faster and more accurately; (3) produced 22 tsunami inundation maps covering 113 coastal communities with a population at risk of over a million people; (4) initiated a program to develop tsunami-resilient communities through awareness, education, warning dissemination, mitigation incentives, coastal planning, and construction guidelines; and (5) conducted surveys that indicate a positive impact of the program's activities in raising tsunami awareness. A 17-member Steering Group consisting of representatives from the five Pacific States, NOAA, FEMA, USGS, and the National Science Foundation (NSF) guides NTHMP. The success of the program has been the result of a personal commitment by steering group members that has leveraged the total Federal funding by contributions from the States and Federal Agencies at a ratio of over six matching dollars to every NTHMP dollar. Twice-yearly meetings of the steering group promote communication between scientists and emergency managers, and among the State and Federal agencies.
From its initiation, NTHMP has been based on the needs of coastal communities and emergency managers and has been results driven because of the cycle of year-to-year funding for the first 5 years. A major impact of the program occurred on 17 November 2003, when an Alaskan tsunami warning was canceled because real-time, deep ocean tsunami data indicated the tsunami would be non-damaging. Canceling this warning averted an evacuation in Hawaii, avoiding a loss in productivity valued at $68M.

11.
This paper presents a combined approach to achieving best practice volcano monitoring through a review of New Zealand’s volcano-monitoring capability as established under the GeoNet project. A series of benchmark, consultation and network performance studies were undertaken to provide a comprehensive review of volcano monitoring in New Zealand and to establish plans for future improvements in capability. The United States Geological Survey National Volcano Early Warning System method was applied to benchmark the established monitoring networks against recommendations for instrumentation based on a volcano’s threat level. Next, a consultative study of New Zealand’s volcanology research community was undertaken to canvass opinions on what future directions GeoNet volcano monitoring should take. Once the seismic network infrastructure had been built, a noise floor analysis was conducted to identify stations with poor site noise characteristics. Noise remediation for poor sites has been implemented by either re-locating the site or placing sensors in boreholes. Quality control of Global Navigation Satellite System networks is undertaken through the use of multipath parameters derived from routine processing. Finally, the performance of the monitoring networks is assessed against two recent eruptions at Mount Tongariro and White Island. This combined approach can be used as a model to assess the need for future monitoring levels on any volcano.

12.
Bond distances and angles in isostructural, ordered clinopyroxenes are compared for eight compositions, based on five new and three published crystal-structure refinements from X-ray diffraction data. Unit-cell parameters and configuration of the silicate chains are directly correlated with cation composition and distribution in the M2 and M1 sites. Publication authorized by the Director, U.S. Geological Survey. Our thanks go to D. B. Stewart, C. Milton, and C. S. Ross, U.S. Geological Survey, who supplied crystals of the minerals; to D. B. Stewart for synthesis of iron spodumene; to Dr. L. Fuchs, Argonne National Laboratory, for synthesis of ureyite; and to Dr. B. Mason, U.S. National Museum, who transmitted the ureyite crystals.

13.
Radon buildup in homes is now recognized throughout the world as a potentially major health hazard. The U.S. Nuclear Regulatory Commission and the U.S. Environmental Protection Agency estimate 8,000–30,000 fatalities per year in the United States due to indoor radon. The Albuquerque, New Mexico area was chosen for study because it is representative of metropolitan areas in the southwestern United States where slightly uraniferous source rocks (Sandia granite) have provided the very immature soil for much of the area. The granite contains 4.7 ppm U, and the limestone capping the granite 5.7 ppm U. Soils in the area average 4.24 ppm U, and Th/U ratios average 3.2. These data suggest some removal of U from the source rocks but fixation of the U in the soils (that is, as opposed to widespread removal of the U by solution), thus providing a ready source for soil radon. A pilot study of soil radon in the area in the winter of 1983–1984 shows high values, 180 pCi/l, relative to the U.S. average (about 100 pCi/l). In the winter of 1986–1987, 180 dwellings were surveyed for their indoor radon levels, including 20 that had been surveyed in the summer of 1986. Twenty-eight percent of those in the winter study yielded indoor radon above the EPA suggested maximum permissible level of 4 pCi/l air, well above the EPA estimate of 10–15% of dwellings for the U.S. The indoor radon levels show positive correlation with proximity to the Sandia Mountains, with soil radon, with excess insulation, with homes with solar capacities, and with other factors. Building materials may provide a very minor source of some indoor radon. Summer readings are lower than winter readings except when the houses have refrigerated air conditioning.

14.
Relations between the United States and Europe have been quite volatile over the past five years. This volatility is not just a product of disagreements over the American invasion of Iraq. It is tied to a set of fundamental challenges to the geopolitical arrangements and understandings that emerged in the wake of World War II. Three challenges were of particular importance: the fall of the Iron Curtain, the Balkan crisis of the 1990s, and the election of a presidential administration in Washington, DC, which adopted a neoconservative geopolitical agenda. The global impacts of this agenda were heightened by the September 11, 2001, attacks on the United States of America. The U.S. response exposed fundamental differences between the U.S. and Europe on the use of international military forces in the “war on terrorism,” the role of NATO, and the U.S. government’s effort to force “regime change” in Iraq. Europe’s reaction to U.S. policy has not been uniform, however. At the governmental level, fundamental differences have emerged among European countries. The United States has sought to highlight those differences, suggesting that the U.S. favors “disaggregation” in Europe, even as it trumpets the virtues of a uniform response to the threat of terrorism. The future trajectory of U.S.-European relations is likely to be shaped by intersections between Europe’s struggles with integration and the U.S.’s evolving global geopolitical posture, which could move in either a hegemonic or a globalist direction.

15.
Probabilistic methodology used by the U.S. Geological Survey is described for estimating the quantity of undiscovered recoverable conventional resources of oil and gas in the United States. A judgmental probability distribution of the quantity of resource and its properties is determined for a geologic province or basin. From this distribution, point and interval estimates of the quantity of undiscovered resource are obtained. Distributions and their properties are established for each of the following resources: (1) oil and nonassociated gas from estimates of the probability of the resource being present and the conditional probability distribution of the quantity of resource given that the resource is present, (2) associated-dissolved gas from its corresponding oil distribution, (3) total gas, (4) oil and total gas in two or more provinces. Computer graphics routines are illustrated with examples from the U.S. Geological Survey Circular 860.
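The two-part model in item (1) above, a probability that the resource is present combined with a conditional distribution of quantity given presence, can be sketched with a small Monte Carlo simulation. This is an illustration only, not the USGS implementation; the presence probability and lognormal parameters are hypothetical:

```python
import random

random.seed(42)

P_PRESENT = 0.6       # hypothetical probability the resource is present
MU, SIGMA = 2.0, 0.8  # hypothetical lognormal parameters for quantity given presence

# Simulate the unconditional distribution of the undiscovered quantity:
# zero when absent, a lognormal draw when present.
draws = []
for _ in range(100_000):
    if random.random() < P_PRESENT:
        draws.append(random.lognormvariate(MU, SIGMA))
    else:
        draws.append(0.0)  # resource absent in this trial

draws.sort()
mean_estimate = sum(draws) / len(draws)  # point estimate (expected quantity)
f95 = draws[int(0.05 * len(draws))]      # quantity exceeded with 95% probability
f05 = draws[int(0.95 * len(draws))]      # quantity exceeded with 5% probability
print(f"mean={mean_estimate:.1f}, F95={f95:.2f}, F05={f05:.1f}")
```

Note that with a 60% presence probability, F95 is zero: there is more than a 5% chance the resource is simply absent, which is exactly the kind of result the two-part formulation is designed to express.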

16.
New dating in the Carson Sink at the termini of the Humboldt and Carson rivers in the Great Basin of the western United States indicates that lakes reached elevations of 1204 and 1198 m between 915 and 652 cal yr B.P. and between 1519 and 1308 cal yr B.P., respectively. These dates confirm Morrison's original interpretation (Lake Lahontan: Geology of the Southern Carson Desert, Professional Paper 40, U.S. Geol. Survey, 1964) that these shorelines are late Holocene features, rather than late Pleistocene as interpreted by later researchers. Paleohydrologic modeling suggests that discharge into the Carson Sink must have been increased by a factor of about four, and maintained for decades, to account for the 1204-m lake stand. The hydrologic effects of diversions of the Walker River to the Carson Sink were probably not sufficient, by themselves, to account for the late Holocene lake-level rises. The decadal-long period of increased runoff represented by the 1204-m lake is also reflected in other lake records and in tree ring records from the western United States.

17.
Ice-core samples from Upper Fremont Glacier (UFG), Wyoming, were used as proxy records for the chemical composition of atmospheric deposition. Results of analysis of the ice-core samples for stable isotopes of nitrogen (δ15N of nitrate) and sulfur (δ34S of sulfate), as well as nitrate and sulfate deposition rates from the late 1940s through the early 1990s, were used to enhance and extend existing National Atmospheric Deposition Program/National Trends Network (NADP/NTN) data in western Wyoming. The most enriched δ34S value in the UFG ice-core samples coincided with snow deposited during the 1980 eruption of Mt. St. Helens, Washington. The remaining δ34S values were similar to the isotopic composition of coal from southern Wyoming. The δ15N values in ice-core samples representing a similar period of snow deposition were negative, ranging from -5.9 to -3.2 ‰, and fall within the δ15N values expected from vehicle emissions. Ice-core nitrate and sulfate deposition data reflect the sharply increasing U.S. emissions data from 1950 to the mid-1970s.

18.
A disproportionate increase in the share of precipitation coming from intense rain events, combined with general warming (and thus an extension of the vegetation period, with intensive transpiration) and an insignificant change in total precipitation, could lead to an increase in the frequency of a potentially serious type of extreme event: prolonged periods without precipitation (even when mean seasonal rainfall totals increase). This paper investigates whether this development has already occurred during the past several decades over North America south of 55°N, over the same period in which changes in the frequency of intense precipitation events have been observed. Lengthy strings of “dry” days without sizeable (>1.0 mm) precipitation were assessed only during the warm season (defined as the period when mean daily temperature is above the 5 °C threshold), when water is intensively used for transpiration and prolonged periods without sizeable rainfall represent a hazard for terrestrial ecosystem health and agriculture. During the past four decades, the mean duration of prolonged dry episodes (20 days or longer in southeastern Canada, 1 month or longer in the eastern United States and along the Gulf Coast of Mexico, and 2 months or longer in the southwestern United States and northern Mexico) has significantly increased. As a consequence, the return period of month-long dry episodes over the eastern U.S. has been reduced more than twofold, from 15 to 6–7 years. The longer average duration of dry episodes has occurred during a relatively wet period around most of the continent south of 55°N, but is not observed over the northwestern U.S. and adjacent regions of southern Canada.
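The dry-episode measure described above can be sketched as a simple run-length scan over daily records. This is an assumed minimal version, not the authors' code: keep only warm-season days (mean daily temperature above 5 °C) and count consecutive days whose precipitation does not exceed the 1.0 mm "sizeable" threshold:

```python
def longest_dry_run(temps_c, precip_mm, warm_threshold=5.0, wet_threshold=1.0):
    """Length of the longest warm-season run of days with <= 1.0 mm of rain."""
    longest = current = 0
    for t, p in zip(temps_c, precip_mm):
        if t > warm_threshold and p <= wet_threshold:
            current += 1
            longest = max(longest, current)
        else:
            current = 0  # a wet day, or a day outside the warm season, breaks the run
    return longest

# Ten hypothetical days containing a 4-day dry spell inside the warm season.
temps = [12, 14, 15, 16, 15, 14, 13, 4, 14, 15]
rain  = [0.0, 5.2, 0.4, 0.0, 0.9, 0.2, 3.1, 0.0, 0.5, 0.3]
print(longest_dry_run(temps, rain))
```

Treating a cold day as breaking the run is one possible convention; an analysis could instead simply skip days outside the warm season.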

19.
Atmospheric deposition of nitrogen (AD-N) is a significant source of nitrogen enrichment to nitrogen (N)-limited estuarine and coastal waters downwind of anthropogenic emissions. Along the eastern U.S. coast and eastern Gulf of Mexico, AD-N currently accounts for 10% to over 40% of new N loading to estuaries. Extension of the regional acid deposition model (RADM) to coastal shelf waters indicates that 11, 5.6, and 5.6 kg N ha−1 may be deposited on the continental shelf areas of the northeastern U.S. coast, southeast U.S. coast, and eastern Gulf of Mexico, respectively. AD-N approximates or exceeds riverine N inputs in many coastal regions. From a spatial perspective, AD-N is a unique source of N enrichment to estuarine and coastal waters because, for a receiving water body, the airshed may exceed the watershed by 10–20 fold. AD-N may originate far outside of the currently managed watersheds. AD-N may increase in importance as a new N source by affecting waters downstream of the oligohaline and mesohaline estuarine nutrient filters, where large amounts of terrestrially supplied N are assimilated and denitrified. Regionally and globally, N deposition associated with urbanization (NOx, peroxyacetyl nitrate, or PAN) and agricultural expansion (NH4+ and possibly organic N) has increased in coastal airsheds. Recent growth and intensification of animal (poultry, swine, cattle) operations in the midwest and mid-Atlantic regions have led to increasing amounts of NH4+ emission and deposition, according to a three-decade analysis of the National Acid Deposition Program network. In western Europe, where livestock operations have dominated agricultural production for the better part of this century, NH4+ is the most abundant form of AD-N. AD-N deposition in the U.S. is still dominated by oxides of N (NOx) emitted from fossil fuel combustion; annual NH4+ deposition is increasing, and in some regions is approaching total NO3 deposition.
In receiving estuarine and coastal waters, phytoplankton community structural and functional changes, associated water quality, and trophic and biogeochemical alterations (i.e., algal blooms, hypoxia, food web and fisheries habitat disruption) are frequent consequences of N-driven eutrophication. Increases in and changing proportions of various new N sources regulate phytoplankton competitive interactions, dominance, and successional patterns. These quantitative and qualitative aspects of AD-N and other atmospheric nutrient sources (e.g., iron) may promote biotic changes now apparent in estuarine and coastal waters, including the proliferation of harmful algal blooms, with cascading impacts on water quality and fisheries.

20.
Mercury is a federated metadata harvesting, search, and retrieval tool based on both open-source packages and custom software developed at Oak Ridge National Laboratory (ORNL). It was originally developed for the National Aeronautics and Space Administration (NASA), and the consortium now includes funding from NASA, the U.S. Geological Survey (USGS), and the U.S. Department of Energy (DOE). Mercury is itself a reusable software application which uses a service-oriented architecture (SOA) approach to capturing and managing metadata in support of twelve Earth science projects. Mercury also supports the reuse of metadata by enabling searches across a range of metadata specifications and standards, including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces allow users to perform simple, fielded, spatial, temporal, and other hierarchical searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results (Table 1) to the user, while allowing data providers to advertise the availability of their data and yet maintain complete control and ownership of that data.
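The harvest-then-index pattern described above, metadata pulled from distributed providers into one central index so searches run locally while the data stays with its owners, can be sketched in a toy form. This is an illustration of the pattern only, with invented provider names and records, not Mercury's actual code:

```python
# Metadata records as harvested from two hypothetical distributed providers.
harvested = {
    "provider_a": [{"id": "a1", "title": "Soil moisture grids", "keyword": "soil"}],
    "provider_b": [{"id": "b1", "title": "Stream gauge records", "keyword": "water"},
                   {"id": "b2", "title": "Soil carbon survey", "keyword": "soil"}],
}

# Build the centralized index: every record, tagged with its source provider.
index = [dict(rec, provider=src) for src, recs in harvested.items() for rec in recs]

def fielded_search(field, term):
    """Simple fielded search against the central index; the data itself
    never leaves the providers -- only their metadata is indexed here."""
    return [r["id"] for r in index if term in r[field].lower()]

print(fielded_search("keyword", "soil"))  # matches records from both providers
```

A production system would add harvesting schedules, format normalization (FGDC, Dublin-Core, etc.), and a real search engine, but the division of labor is the same: centralized metadata, distributed data.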
