51.
We present the first statistical analysis of 27 Ultraviolet/Optical Telescope (UVOT) optical/ultraviolet light curves of gamma-ray burst (GRB) afterglows. Through analysis of the light curves in the observer's frame, we have found that a significant fraction of them rise in the first 500 s after the GRB trigger, that all light curves decay after 500 s, typically as a power law with a relatively narrow distribution of decay indices, and that the brightest optical afterglows tend to decay the quickest. The rise could be produced either physically, by the start of the forward shock as the jet begins to plough into the external medium, or geometrically, where an off-axis observer sees a rising light curve as an increasing amount of emission enters the observer's line of sight while the jet slows. We find that at 99.8 per cent confidence there is a correlation, in the observed frame, between the apparent magnitude of the light curves at 400 s and the rate of decay after 500 s. However, in the rest frame a Spearman rank test shows only a weak correlation of low statistical significance between luminosity and decay rate. A correlation should be expected if the afterglows were produced by off-axis jets, suggesting that the jet is viewed from within the half-opening angle θ or within a core of uniform energy density θc. We also produced logarithmic luminosity distributions for three rest-frame epochs and find no evidence for bimodality in any of them. Finally, we compare our sample of UVOT light curves with the canonical X-ray Telescope (XRT) light-curve model. The range in decay indices seen in the UVOT light curves at any epoch is most similar to the range in decay of the shallow decay segment of the XRT canonical model. However, the XRT canonical model shows no indication of the rising behaviour observed in the UVOT light curves.
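The rest-frame correlation test described above can be sketched in a few lines. The data below are synthetic (the magnitudes, decay indices and the seeded trend are all invented for illustration), but the statistical machinery, a Spearman rank test plus a log-log linear fit for the power-law decay index, is the standard one:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

# Hypothetical sample of 27 afterglows: apparent magnitude at 400 s and
# post-500 s decay index alpha (all values synthetic, for illustration).
n = 27
mag_400s = rng.uniform(14.0, 20.0, n)
# Seed in the reported trend: brighter (smaller magnitude) -> faster decay.
alpha = 0.5 + 0.15 * (20.0 - mag_400s) + rng.normal(0.0, 0.2, n)

rho, p_value = spearmanr(mag_400s, alpha)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")

# A light curve decaying as a power law, F(t) ~ t^-alpha, is a straight
# line in log-log space, so the decay index follows from a linear fit.
t = np.logspace(np.log10(500.0), np.log10(5e4), 30)  # seconds after trigger
flux = t ** -1.2 * (1.0 + rng.normal(0.0, 0.02, t.size))
slope, _ = np.polyfit(np.log10(t), np.log10(flux), 1)
print(f"fitted decay index = {-slope:.2f}")
```

A significant negative rho here reflects the seeded brightness-decay trend; on real rest-frame data, the abstract reports only a weak, low-significance correlation.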
52.
Fluctuations of glaciers during the 20th century in Garibaldi Provincial Park, in the southern Coast Mountains of British Columbia, were reconstructed from historical documents, aerial photographs, and fieldwork. Over 505 km², or 26%, of the park was covered by glacier ice at the beginning of the 18th century. Ice cover decreased to 297 km² by 1987–1988 and to 245 km² (49% of the early 18th-century value) by 2005. Glacier recession was greatest between the 1920s and 1950s, with typical frontal retreat rates of 30 m/a. Many glaciers advanced between the 1960s and 1970s, but all glaciers have retreated over the last 20 years. Times of glacier recession coincide with warm and relatively dry periods, whereas advances occurred during relatively cold periods. Rapid recession between 1925 and 1946, and since 1977, coincided with the positive phase of the Pacific Decadal Oscillation (PDO), whereas glaciers advanced during its negative phases (1890–1924 and 1947–1976). The record of 20th-century glacier fluctuations in Garibaldi Park is similar to that in southern Europe, South America, and New Zealand, suggesting a common, global climatic cause. We conclude that global temperature change in the 20th century explains much of the behaviour of glaciers in Garibaldi Park and elsewhere.
53.
D. Markovic & M. Koch, Hydrological Processes, 2014, 28(4): 2202–2211
Long-term variations and temporal scaling of mean monthly time series of river flow, precipitation, temperature, relative humidity, air pressure, duration of bright sunshine, degree of cloud cover, short-wave radiation, wind speed and potential evaporation within, or in the vicinity of, the German part of the Elbe River Basin are analyzed. Statistically significant correlations are found between the 2–15 year scale-averaged wavelet spectra of the hydroclimatic variables and the North Atlantic Oscillation and Arctic Oscillation indices, suggesting that such long-term patterns in hydroclimatic time series are externally forced. The Hurst parameter estimates (H) based on Detrended Fluctuation Analysis (DFA) indicate persistence for discharge, precipitation, wind speed, air pressure and the degree of cloud cover, all of which have an annual cycle and a broad low-frequency distribution. The DFA H estimates are also higher for discharge than for precipitation. The major long-term quasi-periodic variability modes of precipitation detected using Singular Spectrum Analysis coincide with those detected in the discharge time series. Upon subtraction of these low-frequency quasi-periodic modes, the DFA H estimates suggest an absence of persistence for both precipitation and discharge. Copyright © 2013 John Wiley & Sons, Ltd.
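The DFA estimate of the Hurst parameter H used above can be sketched as follows. This is a minimal first-order DFA (DFA-1) on synthetic series, not the authors' code; the scale range and series lengths are chosen only for illustration. White noise should give H ≈ 0.5 (no persistence) and its running sum H ≈ 1.5:

```python
import numpy as np

def dfa_hurst(x, scales):
    """Estimate the Hurst parameter of series x by first-order
    Detrended Fluctuation Analysis (DFA-1)."""
    y = np.cumsum(x - np.mean(x))        # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            # Remove the local linear trend within each window.
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # F(s) ~ s^H, so H is the slope in log-log coordinates.
    h, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return h

rng = np.random.default_rng(0)
scales = [16, 32, 64, 128, 256]
white = rng.normal(size=4096)                # uncorrelated noise
walk = np.cumsum(rng.normal(size=4096))      # strongly persistent series
h_white = dfa_hurst(white, scales)
h_walk = dfa_hurst(walk, scales)
print(f"H(white noise) = {h_white:.2f}")     # ~0.5
print(f"H(random walk) = {h_walk:.2f}")      # ~1.5
```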
54.
Climate change impact assessments form the basis for the development of suitable climate change adaptation strategies. For this purpose, ensembles of stepwise coupled models are generally used [emission scenario → global circulation model → downscaling approach (DA) → bias correction → impact model (hydrological model)], in which every element is affected by considerable uncertainty. The aim of the current study is (1) to analyse the uncertainty related to the choice of the DA as well as the hydrological model and its parameterization, and (2) to evaluate the vulnerability of the studied catchment, a subcatchment of the highly anthropogenically impacted Spree River catchment, to hydrological change. Four different DAs are used to drive four different model configurations of two conceptually different hydrological models (the Water Balance Simulation Model developed at ETH Zürich and HBV-light). In total, 452 simulations are carried out. The results show that all simulations compute an increase in air temperature and potential evapotranspiration. For precipitation, runoff and actual evapotranspiration, opposing trends are computed depending on the DA used to drive the hydrological models. Overall, the largest source of uncertainty can be attributed to the choice of the DA, especially regarding whether it is statistical or dynamical. The choice of the hydrological model and its parameterization is of less importance when long-term mean annual changes are compared. The large bandwidth at the end of the modelling chain may complicate the formulation of suitable climate change adaptation strategies on the regional scale. Copyright © 2013 John Wiley & Sons, Ltd.
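The attribution of ensemble spread to the downscaling approach versus the hydrological model can be illustrated with a toy comparison of ranges across the two ensemble dimensions. The runoff-change numbers below are invented, chosen only so that the DA dimension dominates, as the study reports:

```python
import numpy as np

# Hypothetical mean annual runoff changes (%) from 4 downscaling
# approaches (rows) x 4 hydrological model configurations (columns).
# Values invented solely to illustrate the comparison of spreads.
change = np.array([
    [12.0, 10.5, 11.8, 12.6],   # DA 1 (statistical)
    [ 8.4,  7.9,  9.1,  8.8],   # DA 2 (statistical)
    [-6.2, -5.1, -6.8, -5.9],   # DA 3 (dynamical)
    [-2.3, -1.7, -2.9, -2.1],   # DA 4 (dynamical)
])

# Spread attributable to the DA: range across rows, averaged over models.
da_spread = np.mean(change.max(axis=0) - change.min(axis=0))
# Spread attributable to the hydrological model: range across columns.
model_spread = np.mean(change.max(axis=1) - change.min(axis=1))

print(f"spread across DAs:    {da_spread:.2f} %")
print(f"spread across models: {model_spread:.2f} %")
```

Even in this toy table, the statistical/dynamical DA split produces opposing signs of the runoff trend, while the model choice only shifts values within each row.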
55.
56.
Although agriculture could contribute substantially to European emission reductions, its mitigation potential lies untapped and dormant. Market-based instruments could be pivotal in incentivizing cost-effective abatement. However, sector specificities in transaction costs, leakage risks and distributional impacts impede their implementation. The significance of such barriers critically hinges on the dimensions of policy design. This article synthesizes the work on emissions pricing in agriculture together with the literature on the design of market-based instruments. To structure the discussion, an options space is suggested to map policy options, focusing on three key dimensions of policy design. More specifically, it examines the role of policy coverage, instruments and transfers to farmers in overcoming the barriers. First, the results show that a significant proportion of agricultural emissions and mitigation potential could be covered by a policy targeting large farms and few emission sources, thereby reducing transaction costs. Second, whether an instrument is voluntary or mandatory influences distributional outcomes and leakage. Voluntary instruments can mitigate distributional concerns and leakage risks but can lead to subsidy lock-in and carbon-price distortion. Third, the impact on transfers resulting from the interaction of the Common Agricultural Policy (CAP) with emissions pricing will play a key role in shaping political feasibility and has so far been underappreciated.

POLICY RELEVANCE

Following the 2015 Paris Agreement, European climate policy is at a crossroads. Achieving the 2030 and 2050 European targets cost-effectively requires all sectors to reduce their emissions. Yet the cornerstone of European climate policy, the European Union Emissions Trading System (EU ETS), covers only about half of European emissions. Major sectors have so far been largely exempted from carbon pricing, in particular transport and agriculture. While transport has been increasingly under the spotlight as a possible candidate for a sectoral expansion of the EU ETS, policy discussions on pricing agricultural emissions have been virtually absent. This article attempts to fill this gap by investigating options for market-based instruments to reduce agricultural emissions while taking barriers to implementation into account.

57.
Climate policy uncertainty significantly hinders investments in low-carbon technologies, and the global community is behind schedule to curb carbon emissions. Strong actions will be necessary to limit the increase in global temperatures, and continued delays create risks of escalating climate change damages and future policy costs. These risks are system-wide, long-term and large-scale and thus hard to diversify across firms. Because of its unique scale, cost structure and near-term availability, Reducing Emissions from Deforestation and forest Degradation in developing countries (REDD+) has significant potential to help manage climate policy risks and facilitate the transition to lower greenhouse gas emissions. ‘Call’ options contracts in the form of the right but not the obligation to buy high-quality emissions reduction credits from jurisdictional REDD+ programmes at a predetermined price per ton of CO2 could help unlock this potential despite the current lack of carbon markets that accept REDD+ for compliance. This approach could provide a globally important cost-containment mechanism and insurance for firms against higher future carbon prices, while channelling finance to avoid deforestation until policy uncertainties decline and carbon markets scale up.

Key policy insights

  • Climate policy uncertainty discourages abatement investments, exposing firms to an escalating systemic risk of future rapid increases in emission control expenditures.

  • This situation poses a risk of an abatement ‘short squeeze,’ paralleling the case in financial markets when prices jump sharply as investors rush to square accounts on an investment they have sold ‘short’, one they have bet against and promised to repay later in anticipation of falling prices.

  • There is likely to be a willingness to pay for mechanisms that hedge the risks of abruptly rising carbon prices, in particular for ‘call’ options, the right but not the obligation to buy high-quality emissions reduction credits at a predetermined price, due to the significantly lower upfront capital expenditure compared to other hedging alternatives.

  • Establishing rules as soon as possible for compliance market acceptance of high-quality emissions reductions credits from REDD+ would facilitate REDD+ transactions, including via options-based contracts, which could help fill the gap of uncertain climate policies in the short and medium term.
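The payoff structure of such a 'call' option can be sketched directly. The strike price and premium below are hypothetical; the example only illustrates how the hedge caps a firm's effective purchase price of credits at the strike, at the cost of the upfront premium:

```python
# A 'call' option gives the right, but not the obligation, to buy
# emissions-reduction credits at a predetermined strike price K.

def call_payoff(spot, strike):
    """Value of exercising the option when the market price is `spot`."""
    return max(spot - strike, 0.0)

strike = 10.0   # $/tCO2 agreed today (hypothetical)
premium = 2.0   # $/tCO2 paid upfront for the option (hypothetical)
for spot in (5.0, 10.0, 25.0, 50.0):   # possible future carbon prices
    # With the option, the effective purchase price is capped at the
    # strike; without it, the firm pays the full spot price.
    hedged = min(spot, strike) + premium
    print(f"spot {spot:5.1f}: payoff {call_payoff(spot, strike):5.1f}, "
          f"cost hedged {hedged:5.1f} vs unhedged {spot:5.1f}")
```

The asymmetry is the point made above: the option costs little when carbon prices stay low but provides insurance against sharply rising prices.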

58.
Using the database provided by the Reviewed Event Bulletins (REBs) for the first 2.5 years of the Group of Scientific Experts Technical Test-3 (GSETT-3) experiment, we compiled mislocation vectors for both arrays and selected three-component stations of the primary network from the published slowness and azimuth information gained through f-k and polarization analysis. By imposing constraints such as a minimum signal-to-noise ratio (SNR) and a minimum number of defining phases, we aimed to eliminate location bias, as the hypocentral parameters are taken from the REBs. Results are presented for 14 arrays with apertures from about 1 km to more than 20 km, as well as for 18 three-component stations; they indicate that the mislocation vectors can in many cases improve location accuracy considerably. If these mislocation vectors are compiled so as to cover a sufficient portion of the slowness domain, the empirical corrections can easily be applied prior to location processing. In the context of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), such mislocation patterns could be essential for providing accurate locations of suspicious low-magnitude events, as these location parameters will be used to pinpoint the area in which to conduct an on-site inspection.
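Applying precomputed slowness/azimuth corrections prior to location processing might look like the following sketch. The binning, the correction values and the nearest-bin lookup are all assumptions for illustration, not the GSETT-3 procedure:

```python
import numpy as np

# Empirical station corrections indexed by (slowness s/deg, azimuth deg)
# bins, as might be compiled from f-k analysis of well-located reference
# events. The values here are invented for illustration.
corrections = {
    # (slowness bin, azimuth bin): (d_slowness, d_azimuth)
    (6.0,  45.0): (-0.3,  2.1),
    (6.0, 135.0): ( 0.2, -1.4),
    (8.0,  45.0): (-0.1,  3.0),
    (8.0, 135.0): ( 0.4, -0.8),
}

def correct(slowness, azimuth):
    """Apply the correction from the nearest calibrated bin in the
    slowness domain (azimuth wrap-around ignored for brevity)."""
    keys = np.array(list(corrections))
    # Crude distance: 15 deg of azimuth weighted like 1 s/deg of slowness.
    d = np.hypot(keys[:, 0] - slowness, (keys[:, 1] - azimuth) / 15.0)
    ds, daz = corrections[tuple(keys[int(np.argmin(d))])]
    return slowness + ds, azimuth + daz

s, az = correct(6.2, 50.0)   # a raw f-k measurement
print(f"corrected slowness {s:.1f} s/deg, azimuth {az:.1f} deg")
```

A production implementation would interpolate between bins and only apply a correction when the observation falls close enough to a calibrated region of the slowness domain.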
59.
Expectation Maximization algorithm and its minimal detectable outliers
Minimal Detectable Biases (MDBs), or minimal detectable outliers, for the Expectation Maximization (EM) algorithm based on the variance-inflation and the mean-shift model are determined for an example. A Monte Carlo method is applied with no outlier and with one, two and three randomly chosen outliers. The outliers introduced are recovered, and the corresponding MDBs are almost independent of the number of outliers. The results are compared to the MDB derived earlier by the author, which approximately agrees with the MDB for one outlier of the EM algorithm. The MDBs for two and three outliers are considerably larger than the MDBs of the EM algorithm.
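For a single outlier in the mean-shift model, the conventional (Baarda-type) MDB follows from the non-centrality parameter of the normal test. The sketch below uses the common choices α = 0.001 (two-sided) and power 0.80, which are assumptions, not necessarily the paper's settings, and checks the resulting detection rate by Monte Carlo, echoing the abstract's approach:

```python
import numpy as np
from scipy.stats import norm

def mdb(sigma_w, alpha=0.001, power=0.80):
    """Minimal Detectable Bias for a mean-shift outlier test in the
    conventional Baarda-type formulation: the smallest shift detected
    with probability `power` at two-sided significance `alpha`, where
    sigma_w is the standard deviation of the estimated shift."""
    delta0 = norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)  # non-centrality
    return delta0 * sigma_w

print(f"MDB for sigma_w = 1: {mdb(1.0):.2f}")  # ~4.13 sigma

# Monte Carlo check of the detection rate at exactly the MDB shift.
rng = np.random.default_rng(3)
z_crit = norm.ppf(1.0 - 0.001 / 2.0)
w = rng.normal(mdb(1.0), 1.0, 100_000)  # test statistic under the shift
rate = np.mean(w > z_crit)
print(f"empirical detection rate: {rate:.3f}")  # ~0.80 by construction
```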
60.
Quantities like tropospheric zenith delays or station coordinates are repeatedly measured at permanent VLBI or GPS stations, so that time series for the quantities at each station are obtained. The covariances of these quantities can be estimated in a multivariate linear model; they are needed for computing the uncertainties of results derived from the quantities. The covariance matrix for many permanent stations becomes large, so the need may arise to simplify it, on the condition that the uncertainties of derived results still agree. This is accomplished by assuming that the different time series of a quantity like the station height at each permanent station can be combined into one time series. The covariance matrix then follows from the estimates of the auto- and cross-covariance functions of the combined time series. A further approximation is found if compactly supported covariance functions are fitted to an estimated autocovariance function, in order to obtain a covariance matrix which is representative of different kinds of measurements. The simplification of a covariance matrix estimated in a multivariate model is investigated here for the coordinates of points of a grid measured repeatedly by a laser scanner. The approximations are checked by determining the uncertainty of the sum of distances to the points of the grid. To obtain a realistic value for this uncertainty, the covariances of the measured coordinates have to be considered. Three different setups of measurements are analyzed, and a covariance matrix is found which is representative of all three setups. Covariance matrices for laser scanner measurements can therefore be determined in advance, without estimating them for each application.
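Estimating the autocovariance function of a combined time series, the building block of the simplified covariance matrix described above, can be sketched as follows. The AR(1) stand-in series and its parameters are invented for illustration; its theoretical autocovariance provides a check on the estimator:

```python
import numpy as np

def autocovariance(x, max_lag):
    """Biased empirical autocovariance C(h) = E[(x_t - m)(x_{t+h} - m)]
    for lags 0..max_lag, using the 1/N normalization."""
    x = np.asarray(x, dtype=float)
    n, m = len(x), np.mean(x)
    d = x - m
    return np.array([np.dot(d[:n - h], d[h:]) / n
                     for h in range(max_lag + 1)])

rng = np.random.default_rng(2)
# Stand-in for a combined station-height series: an AR(1) process,
# whose theoretical autocovariance is phi**h / (1 - phi**2).
phi, n = 0.7, 20_000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

c = autocovariance(x, 5)
print("empirical C(0..5):", np.round(c, 2))
print("theoretical      :", np.round(phi ** np.arange(6) / (1 - phi ** 2), 2))
```

In the application above, a compactly supported covariance function would then be fitted to such an estimated C(h) to obtain a representative, positive-definite covariance matrix.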