1.
Although most of the world's uranium exists as pitchblende or uraninite, these minerals can be weathered to a great variety of secondary uranium minerals, most containing the uranyl cation. Anthropogenic uranium compounds can also react in the environment, leading to spatial–chemical alterations that could be useful for nuclear forensics analyses. Soft X-ray absorption spectroscopy (XAS) has the advantages of being non-destructive, element-specific and sensitive to electronic and physical structure. The soft X-ray probe can also be focused to a spot size on the order of tens of nanometres, providing chemical information with high spatial resolution. However, before XAS can be applied at high spatial resolution, it is necessary to find spectroscopic signatures for a variety of uranium compounds in the soft X-ray spectral region. To that end, we collected near-edge X-ray absorption fine structure (NEXAFS) spectra of a variety of common uranyl-bearing minerals, including uranyl carbonates, oxyhydroxides, phosphates and silicates. We find that uranyl compounds can be distinguished by class (carbonate, oxyhydroxide, phosphate or silicate) based on their oxygen K-edge absorption spectra. This work establishes a database of reference spectra for future spatially resolved analyses. We also present scanning transmission X-ray microscopy (STXM) data from a schoepite particle in the presence of an unknown contaminant.
2.
At the Earth Physics Branch in Ottawa, storage and retrieval of gravity data have progressed from the pencil-and-paper system used until the late 1950s to the current file-oriented gravity library operated on a large, high-speed computer. The main impetus for the development of this computer library has been the requirements of the petroleum and mineral exploration industry, but military requirements and those of physical geodesy, geology and geophysics have had a large influence on the system at some or all stages of its development. The current system is user-oriented and tends to be internally complex in order to preserve a simple and convenient interface for the user. The results of approximately 150,000 gravity measurements are contained on the main output file, the principal facts file. These observations are mainly those of the Earth Physics Branch, but a small percentage have been provided by commercial geophysical companies, other government agencies and universities. Current plans call for more emphasis on extending the system to serve as a national library for gravity and related data. Data reduction within the system is a partitioned process which employs a static model as the basis for the computation. Only a relatively minor change, eliminating the partitioned fashion in which field data are reduced, would be required to develop a dynamic model for data reduction. The capability of recognizing and accurately computing the magnitude of changes in the value of gravity, and their geographic distribution, will be a tremendous asset to studies of the dynamics of the Earth.
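The reduction step alluded to above can be illustrated with a short, hedged sketch: a standard free-air plus Bouguer-slab reduction to a simple Bouguer anomaly. The normal-gravity constants below are the commonly quoted GRS80-style values, the 0.3086 mGal/m and 0.04193·ρ coefficients are the textbook free-air and infinite-slab terms, and the station record is invented; this is not presented as the Earth Physics Branch's actual formulation.

```python
import math

def normal_gravity_mgal(lat_deg):
    """Closed-form (Somigliana-type) normal gravity on the ellipsoid, in mGal.
    GRS80-style constants; illustrative values only."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    gamma = 9.7803267715 * (1 + 0.001931851353 * s2) / math.sqrt(1 - 0.0066943800229 * s2)
    return gamma * 1e5  # convert m/s^2 to mGal

def simple_bouguer_anomaly(g_obs_mgal, lat_deg, elev_m, density=2.67):
    """Reduce one observation to a simple Bouguer anomaly (mGal).
    Free-air gradient 0.3086 mGal/m; slab term 0.04193*rho mGal/m (rho in g/cm^3)."""
    free_air = 0.3086 * elev_m
    bouguer_slab = 0.04193 * density * elev_m
    return g_obs_mgal - normal_gravity_mgal(lat_deg) + free_air - bouguer_slab

# Hypothetical "principal facts" record: station id, latitude, elevation, observed gravity.
station = {"id": "OTT-001", "lat": 45.4, "elev_m": 80.0, "g_obs_mgal": 980622.0}
print(round(simple_bouguer_anomaly(station["g_obs_mgal"], station["lat"], station["elev_m"]), 2))
```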
3.
Sulfide-oxidizing bacterial mats are common in regions of the continental shelves characterized by high primary production and the resultant oxygen minimum zone. These mats are made up of several species of Beggiatoa and/or Thioploca, which oxidize sulfide that is generated in the sediment. Thioploca spp. inhabit a large polysaccharide sheath that encompasses bundles of 1–20 filaments (trichomes), each ranging from 3 to 60 μm in diameter. This sheath has been shown to be a critical component of the autecology of Thioploca. Analysis of Thioploca from cold seeps in Monterey Bay using light and transmission electron microscopy identified new and diverse microbial assemblages associated with interstitial spaces between trichomes, inside the sheath. Small-diameter, non-vacuolate, filamentous prokaryotes were numerous. Amoebae, euglenozoa, ciliates and other protists of unknown affiliation were observed in sheaths. Most of the protists possessed food vacuoles and some showed ultrastructural evidence of endosymbionts. These observations suggest that Thioploca sheaths may serve as oases on the sea floor, providing nutritional and detoxification services to previously unrecognized microbial partners.
4.
5.
Carrying assorted cargo and covered with paints of varying toxicity, lost intermodal containers may take centuries to degrade on the deep seafloor. In June 2004, scientists from the Monterey Bay Aquarium Research Institute (MBARI) discovered a recently lost container during a Remotely Operated Vehicle (ROV) dive on a sediment-covered seabed at 1281 m depth in Monterey Bay National Marine Sanctuary (MBNMS). The site was revisited by ROV in March 2011. Analyses of sediment samples and high-definition video indicate that faunal assemblages on the container's exterior and on the seabed within 10 m of the container differed significantly from those at distances of up to 500 m away. The container surface provides hard substratum for colonization by taxa typically found in rocky habitats. However, some key taxa that dominate rocky areas were absent or rare on the container, perhaps because of its potential toxicity or the limited time available for colonization and growth. Ecological effects appear to be restricted to the container surface and the benthos within ∼10 m.
6.
Accurate and realistic characterizations of flood hazards on desert piedmonts and playas are increasingly important given the rapid urbanization of arid regions. Flood behavior in arid fluvial systems differs greatly from that of the perennial rivers upon which most conventional flood hazard assessment methods are based. Additionally, hazard assessments may vary widely between studies or even contradict one another. This study's chief objective was to compare and evaluate landscape interpretation and hazard assessment across different types of flood-risk maps, using Ivanpah Valley, NV, as a case study. As a secondary goal, we explain likely causes of discrepancy between data sets to ameliorate confusion for map users. Four maps, including three different flood hazard assessments of Ivanpah Valley, NV, were compared: (i) a regulatory map prepared by FEMA, (ii) a soil survey map prepared by NRCS, (iii) a surficial geologic map, and (iv) a flood hazard map derived from the surficial geologic map, the latter two prepared by NBMG. GIS comparisons revealed that only 3.4% (33.9 km²) of Ivanpah Valley was found to lie within a FEMA floodplain, while the geologic flood hazard map indicated that ~44% of Ivanpah Valley runs some risk of flooding (Fig. 2D). Due to differences in mapping methodology and scale, the NRCS data could not be quantitatively compared, and other comparisons were complicated by differences in flood hazard class criteria and terminology between maps. Owing to its scale and the scope of its attribute data, the surficial geologic map provides the most useful information on flood hazards for land-use planning. This research has implications for future soil-geomorphic mapping and flood risk mitigation on desert piedmonts and playas. The Ivanpah Valley study area also includes the location of a planned new international airport, so this study has immediate implications for urban development and land-use planning near Las Vegas, NV.
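As a rough illustration of the kind of GIS area comparison described above, the following sketch intersects two made-up polygons standing in for the FEMA floodplain and the geologically mapped flood-prone area within a study-area outline. The geometries, extents and resulting percentages are invented for illustration and are not the study's data.

```python
# pip install shapely
from shapely.geometry import box

# Invented rectangles standing in for mapped zones (units: km).
valley = box(0, 0, 40, 25)            # study-area outline (1000 km^2)
fema_zone = box(5, 5, 10, 12)         # regulatory floodplain polygon
geologic_zone = box(3, 2, 30, 18)     # flood-prone units from a surficial geologic map

def percent_of_valley(zone, outline):
    """Share of the study area covered by a hazard zone, in percent."""
    return 100.0 * zone.intersection(outline).area / outline.area

print(f"FEMA floodplain:     {percent_of_valley(fema_zone, valley):.1f}% of valley")
print(f"Geologic hazard map: {percent_of_valley(geologic_zone, valley):.1f}% of valley")

# Agreement between the two assessments, as a share of the geologic hazard area.
agree = fema_zone.intersection(geologic_zone).area / geologic_zone.area
print(f"{100 * agree:.1f}% of the geologic hazard zone also falls in the FEMA zone")
```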
7.
Field experiments were conducted in Nellis Dunes Recreational Area (Clark County, Nevada, USA) to investigate emission of dust produced by off-road driving. Experiments were carried out with three types of vehicles: 4-wheelers (quads), dirt bikes (motorcycles) and dune buggies, on 17 soil types characteristic of a desert environment. Tests were done at various driving speeds, and emissions were measured for a large number of grain-size fractions. This paper reports the results for two size fractions of the emissions: PM10 (particles < 10 μm) and PM60 (particles < 60 μm). The latter was considered in this study to be sufficiently representative of the total suspendable fraction (TSP). Off-road driving was found to be a significant source of dust. However, the amounts varied greatly with the type of soil and the characteristics of the top layer. Models predicting dust emission by off-road driving should thus consider a number of soil parameters, not just one key parameter. Vehicle type and driving speed are additional parameters that affect emission. In general, 4-wheelers produce more dust than dune buggies, and dune buggies more than dirt bikes. Higher speeds also result in higher emissions. Dust emitted by off-road driving is less coarse than the parent sediment on the road surface, so off-road driving results in a progressive coarsening of the top layer. Exceptions are silty surfaces with no, or almost no, vegetation; for such surfaces no substantial differences were observed between the grain-size distributions of road dust and emitted dust. Typical emission values for off-road driving on dry desert soils are: for sandy areas, 30–40 g km⁻¹ (PM10) and 150–250 g km⁻¹ (TSP); for silty areas, 100–200 g km⁻¹ (PM10) and 600–2000 g km⁻¹ (TSP); for drainages, 30–40 g km⁻¹ (PM10) and 100–400 g km⁻¹ (TSP); and for mixed terrain, 60–100 g km⁻¹ (PM10) and 300–800 g km⁻¹ (TSP). These values are for the types of vehicles tested in this study and do not apply to cars or trucks, which produce significantly more dust.
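The per-kilometre emission figures quoted above can be turned into a rough per-ride mass estimate, as in the sketch below. The terrain split of the route is invented, and using the midpoint of each published range is a simplifying assumption rather than a model from the paper.

```python
# Typical per-km emission factors from the abstract, given as (low, high) ranges in g/km,
# for PM10 and the total suspendable fraction (TSP, approximated by PM60).
EMISSION_FACTORS = {
    "sand":     {"PM10": (30, 40),   "TSP": (150, 250)},
    "silt":     {"PM10": (100, 200), "TSP": (600, 2000)},
    "drainage": {"PM10": (30, 40),   "TSP": (100, 400)},
    "mixed":    {"PM10": (60, 100),  "TSP": (300, 800)},
}

def ride_emissions(route_km, fraction="PM10"):
    """Rough emitted mass (g) for a route given as {terrain: distance_km},
    using the midpoint of each published range."""
    total = 0.0
    for terrain, km in route_km.items():
        low, high = EMISSION_FACTORS[terrain][fraction]
        total += km * (low + high) / 2.0
    return total

# Invented 20 km ride split across the four terrain classes.
route = {"sand": 8.0, "silt": 4.0, "drainage": 3.0, "mixed": 5.0}
print(f"PM10: {ride_emissions(route, 'PM10'):.0f} g")
print(f"TSP:  {ride_emissions(route, 'TSP'):.0f} g")
```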
8.
9.
We describe a new approach that allows for systematic causal attribution of weather and climate-related events, in near-real time. The method is designed so as to facilitate its implementation at meteorological centers by relying on data and methods that are routinely available when numerically forecasting the weather. We thus show that causal attribution can be obtained as a by-product of data assimilation procedures run on a daily basis to update numerical weather prediction (NWP) models with new atmospheric observations; hence, the proposed methodology can take advantage of the powerful computational and observational capacity of weather forecasting centers. We explain the theoretical rationale of this approach and sketch the most prominent features of a “data assimilation–based detection and attribution” (DADA) procedure. The proposal is illustrated in the context of the classical three-variable Lorenz model with additional forcing. The paper concludes by raising several theoretical and practical questions that need to be addressed to make the proposal operational within NWP centers.
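A minimal sketch of a forced three-variable Lorenz model, of the kind used as the illustration above, is given below. How and where the forcing enters (here, a constant term added to every tendency) and its magnitude are assumptions for illustration, not necessarily the formulation of the DADA proposal. Comparing a forced run with an unforced run from the same initial state mimics, in a very loose sense, the contrast that causal attribution relies on.

```python
import numpy as np

def lorenz63_forced(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0, forcing=0.0):
    """Tendencies of the classical Lorenz (1963) model plus a constant forcing term.
    Adding 'forcing' to every equation is an illustrative choice."""
    x, y, z = state
    dx = sigma * (y - x) + forcing
    dy = x * (rho - z) - y + forcing
    dz = x * y - beta * z + forcing
    return np.array([dx, dy, dz])

def integrate(state0, dt=0.01, n_steps=5000, **kwargs):
    """Fixed-step fourth-order Runge-Kutta integration; returns the full trajectory."""
    traj = np.empty((n_steps + 1, 3))
    traj[0] = state0
    for i in range(n_steps):
        s = traj[i]
        k1 = lorenz63_forced(s, **kwargs)
        k2 = lorenz63_forced(s + 0.5 * dt * k1, **kwargs)
        k3 = lorenz63_forced(s + 0.5 * dt * k2, **kwargs)
        k4 = lorenz63_forced(s + dt * k3, **kwargs)
        traj[i + 1] = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return traj

# "Factual" (forced) vs. "counterfactual" (unforced) runs from the same initial state.
x0 = np.array([1.0, 1.0, 1.0])
forced = integrate(x0, forcing=2.5)
unforced = integrate(x0, forcing=0.0)
print("Mean state, forced:  ", forced.mean(axis=0).round(2))
print("Mean state, unforced:", unforced.mean(axis=0).round(2))
```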
10.
The physical, chemical and biological perturbations in central California waters associated with the strong 1997–1998 El Niño are described and explained on the basis of time series collected from ships, moorings, tide gauges and satellites. The evolution of El Niño off California closely followed the pattern observed in the tropical Pacific. In June 1997 an anomalous influx of warm southerly waters, with weak signatures on coastal sea level and thermocline depth, marked the onset of El Niño in central California. The timing was consistent with propagation from the tropics via the equatorial and coastal wave-guide. By late 1997, the classical stratified ocean condition with a deep thermocline, high sea level, and warm sea surface temperature (SST) commonly associated with El Niño dominated the coastal zone. During the first half of 1998 the core of the California Current, which is normally detected several hundred kilometers from shore as a river of low salinity, low nutrient water, was hugging the coast. High nutrient, productive waters that occur in a north–south band from the coast to approximately 200 km offshore during cool years disappeared during El Niño. The nitrate in surface waters was less than 20% of normal and new production was reduced by close to 70%. The La Niña recovery phase began in the fall of 1998 when SSTs dropped below normal, and ocean productivity rebounded to higher than normal levels. The reduction in coastal California primary productivity associated with El Niño was estimated to be 50 million metric tons of carbon (5×10¹³ g C). This reduction certainly had deleterious effects on zooplankton, fish, and marine mammals. The 1992–1993 El Niño was more moderate than the 1997–1998 event, but because its duration was longer, its overall chemical and biological impact may have been comparable. How strongly the ecosystem responds to El Niño appears related to the longer-term background climatic state of the Pacific Ocean. The 1982–1983 and 1992–1993 El Niños occurred during the warm phase of the Pacific Decadal Oscillation (PDO). The PDO may have changed sign during the 1997–1998 El Niño, resulting in weaker ecological effects than would otherwise have been predicted based on the strength of the temperature anomaly.