Similar Literature (20 results)
1.
Developing economy greenhouse gas emissions are growing rapidly relative to developed economy emissions (Boden et al. 2010), and developing economies as a group now have greater emissions than developed economies. These developments are expected to continue (U.S. Energy Information Administration 2010), which has led some to question the effectiveness of emissions mitigation in developed economies without a commitment to extensive mitigation action from developing economies. One often-heard argument against proposed U.S. legislation to limit carbon emissions to mitigate climate change is that, without participation from large developing economies like China and India, stabilizing temperature at 2 degrees Celsius above preindustrial (United Nations 2009), or even reducing global emissions levels, would be impossible (Driessen 2009; RPC Energy Facts 2009) or prohibitively expensive (Clarke et al. 2009). Here we show that significantly delayed action by rapidly developing countries is not a reason to forgo mitigation efforts in developed economies. This letter examines the effect of a scenario with no explicit international climate policy and two policy scenarios, full global action and a developing economy delay, on the probability of exceeding various global average temperature changes by 2100. It demonstrates that even when developing economies delay all mitigation efforts until 2050, action by developed economies appreciably reduces the probability of more extreme levels of temperature change. We conclude that early carbon mitigation efforts by developed economies will considerably shift the distribution of future climate change, whether or not developing countries begin mitigation efforts in the near term.

2.
Expert elicitation studies have become important barometers of scientific knowledge about future climate change (Morgan and Keith, Environ Sci Technol 29(10), 1995; Reilly et al., Science 293(5529):430–433, 2001; Morgan et al., Climatic Change 75(1–2):195–214, 2006; Zickfeld et al., Climatic Change 82(3–4):235–265, 2007, Proc Natl Acad Sci 2010; Kriegler et al., Proc Natl Acad Sci 106(13):5041–5046, 2009). Elicitations incorporate experts’ understanding of known flaws in climate models, thus potentially providing a more comprehensive picture of uncertainty than model-driven methods. The goal of standard elicitation procedures is to determine experts’ subjective probabilities for the values of key climate variables. These methods assume that experts’ knowledge can be captured by subjective probabilities; however, foundational work in decision theory has demonstrated this need not be the case when their information is ambiguous (Ellsberg, Q J Econ 75(4):643–669, 1961). We show that existing elicitation studies may qualitatively understate the extent of experts’ uncertainty about climate change. We designed a choice experiment that allows us to empirically determine whether experts’ knowledge about climate sensitivity (the equilibrium surface warming that results from a doubling of atmospheric CO2 concentration) can be captured by subjective probabilities. Our results show that, even for this much-studied and well-understood quantity, a non-negligible proportion of climate scientists violate the choice axioms that must be satisfied for subjective probabilities to adequately describe their beliefs. Moreover, these violations stem from ambiguity in the experts’ knowledge. We expect these results to hold to a greater extent for less understood climate variables, calling into question the reliability of previous elicitations for those quantities. Our experimental design provides an instrument for detecting ambiguity, a valuable new source of information when linking climate science and climate policy, which can help policy makers select decision tools appropriate to our true state of knowledge.
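One simple way the Ellsberg-style point above can surface in practice is through subadditive betting rates: if an expert's stated maximum buying prices for a $1 bet on an event and on its complement sum to less than 1, no single additive subjective probability can represent both prices. The sketch below illustrates that diagnostic only; it is an assumption-labelled toy, not the authors' actual choice instrument.

```python
# Illustrative sketch (not the authors' instrument): detecting ambiguity via
# subadditive betting rates. For an additive subjective probability,
# p(E) + p(not E) = 1, so elicited prices that sum to less than 1 signal
# an Ellsberg-style departure from probabilistic beliefs.

def is_ambiguity_sensitive(price_E: float, price_not_E: float, tol: float = 1e-9) -> bool:
    """True if the two elicited prices cannot come from one additive
    subjective probability."""
    return price_E + price_not_E < 1.0 - tol

# Hypothetical example: an expert pays at most $0.35 for a bet on
# "climate sensitivity > 4.5 K" and at most $0.45 for its complement.
print(is_ambiguity_sensitive(0.35, 0.45))  # True: 0.35 + 0.45 = 0.80 < 1
```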

3.
We discuss here a mistake in the analysis of Previdi and Liepert (Clim Dyn, 2011). In that article, the surface albedo radiative kernels were calculated incorrectly. We present in this brief comment the corrected albedo kernels. We then use these kernels to compute the surface albedo radiative feedback in climate model simulations driven by increasing carbon dioxide, as in Previdi and Liepert (Clim Dyn, 2011). We find that the use of the corrected albedo kernels does not change the conclusions of our earlier work.
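For context, the standard kernel calculation the comment refers to multiplies the albedo kernel by the surface-albedo change and normalizes by global-mean warming. The sketch below uses synthetic placeholder arrays, not the corrected kernels themselves, and is only meant to show the arithmetic.

```python
import numpy as np

# Sketch of the standard radiative-kernel method (synthetic data): the
# surface-albedo feedback is the albedo kernel K_alpha [W m-2 per % albedo]
# times the albedo change, area-averaged and divided by global-mean warming.

nlat, nlon = 90, 180
lat = np.linspace(-89, 89, nlat)
K_alpha = np.random.uniform(0.0, 2.0, (nlat, nlon))   # kernel, W m-2 %-1 (placeholder)
dalpha = np.random.uniform(-0.5, 0.0, (nlat, nlon))   # albedo change, % (placeholder)
dT_global = 3.0                                       # global-mean warming, K (assumed)

w = np.cos(np.deg2rad(lat))[:, None] * np.ones((1, nlon))   # area weights
feedback = np.sum(K_alpha * dalpha * w) / np.sum(w) / dT_global
print(f"albedo feedback ~ {feedback:.3f} W m-2 K-1")
```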

4.
Kleidon (2009) concludes that warm climates impose important constraints on the evolution of large brains relative to body size, confirming our previous hypothesis (Schwartzman and Middendorf 2000). Here we update the case for our hypothesis and present a first-approximation estimate of the cooling required for hominin brain-size increase, using a simple model of heat loss. We conclude that Pleistocene glacial episodes were likely sufficient to serve as prime releasers for the emergence of Homo habilis and Homo erectus. In addition, we propose that atmospheric oxygen levels may have been an analogous constraint on insect encephalization.

5.
Fifty-four broadband models for computing solar diffuse irradiation on a horizontal surface were tested in Romania (South-Eastern Europe). The input data consist of surface meteorological data, column-integrated data, and data derived from satellite measurements. The testing procedure was performed in 21 stages intended to provide information about the sensitivity of the models to various sets of input data. No single model can be ranked “the best” for all sets of input data. However, some models performed better than others, in the sense that they were ranked among the best for most of the testing stages. The best models for computing solar diffuse radiation are, on an equal footing, the ASHRAE 2005 model (ASHRAE 2005) and the King model (King and Buckius, Solar Energy 22:297–301, 1979). The second best is the MAC model (Davies, Bound Layer Meteor 9:33–52, 1975). Details about the performance of each model in the 21 testing stages are given in the Electronic Supplementary Material.

6.
Gary Yohe, Climatic Change, 2010, 99(1–2):295–302
Article 2 of the United Nations Framework Convention on Climate Change commits its parties to stabilizing greenhouse gas concentrations in the atmosphere at a level that “would prevent dangerous anthropogenic interference with the climate system.” Authors of the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC 2001a, b) offered some insight into what negotiators might consider dangerous by highlighting five “reasons for concern” (RFCs) and tracking concern against changes in global mean temperature; they illustrated their assessments in the now-iconic “burning embers” diagram. The Fourth Assessment Report reaffirmed the value of plotting RFCs against temperature change (IPCC 2007a, b), and Smith et al. (2009) produced an updated embers visualization for the globe. This paper applies the same assessment and communication strategies to calibrate comparable RFCs for the United States. It adds “National Security Concern” as a sixth RFC because many now see changes in the intensity and/or frequency of extreme events around the world as “risk enhancers” that deserve attention at the highest levels of the US policy and research communities. The US embers portrayed here suggest that: (1) US policy-makers will not discover anything really “dangerous” over the near to medium term if they consider only economic impacts aggregated across the entire country, but that (2) they could easily uncover “dangerous anthropogenic interference with the climate system” by focusing their attention on changes in the intensities, frequencies, and regional distributions of extreme weather events driven by climate change.

7.
A new approach is proposed to predict concentration fluctuations in the framework of one-particle Lagrangian stochastic models. The approach is innovative in that it allows the computation of concentration fluctuations in dispersing plumes using a Lagrangian one-particle model with micromixing, but without the need to simulate background particles. The model is also extended to treat chemically reactive plumes, allowing plume-related chemical reactions to be computed in a Lagrangian one-particle framework separately from the background chemical reactions, and accounting for the effect of concentration fluctuations on chemical reactions in a general, albeit approximate, manner. These characteristics should make the proposed approach an ideal tool for plume-in-grid calculations in chemistry transport models. The results are compared with the wind-tunnel experiments of Fackrell and Robins (J Fluid Mech 117:1–26, 1982) for plume dispersion in a neutral boundary layer and with the measurements of Legg et al. (Boundary-Layer Meteorol 35:277–302, 1986) for line-source dispersion in and above a model canopy. Preliminary reacting-plume simulations comparing the model with the experimental results of Brown and Bilger (J Fluid Mech 312:373–407, 1996; Atmos Environ 32:611–628, 1998) are also shown, to demonstrate the feasibility of computing chemical reactions in the proposed framework.
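To make the one-particle-with-micromixing idea concrete, the toy below steps particle positions with a Langevin model and relaxes each particle's carried concentration toward the mean of particles in the same position bin, so concentration fluctuations are represented without a background-particle ensemble. It is a minimal 1-D homogeneous-turbulence sketch with assumed time scales, not the paper's scheme.

```python
import numpy as np

# Minimal sketch (1-D homogeneous turbulence, IEM/IECM-style micromixing;
# an illustration, not the paper's model). Each particle carries position z,
# velocity w and scalar concentration c; c relaxes toward the local mean on
# a mixing time scale t_m.

rng = np.random.default_rng(0)
n, dt, nsteps = 20000, 0.01, 300
sigma_w, T_L, t_m = 1.0, 1.0, 1.0      # velocity scale, Lagrangian & mixing times (assumed)

z = rng.normal(0.0, 1.0, n)            # initial positions
w = rng.normal(0.0, sigma_w, n)
c = (np.abs(z) < 0.5).astype(float)    # unit concentration inside a slab source

edges = np.linspace(-6, 6, 41)
for _ in range(nsteps):
    dW = rng.normal(0.0, np.sqrt(dt), n)
    w += -w / T_L * dt + sigma_w * np.sqrt(2.0 / T_L) * dW   # Langevin velocity step
    z += w * dt
    idx = np.clip(np.digitize(z, edges) - 1, 0, len(edges) - 2)
    cmean = np.bincount(idx, weights=c, minlength=len(edges) - 1) / \
            np.maximum(np.bincount(idx, minlength=len(edges) - 1), 1)
    c += -(c - cmean[idx]) / t_m * dt                         # micromixing toward local mean

core = np.abs(z) < 0.5
print("fluctuation intensity near source:",
      round(c[core].std() / max(c[core].mean(), 1e-12), 3))
```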

8.
Communicating information about consistency in projections is crucial to the successful understanding, interpretation and appropriate application of information from climate models about future climate and its uncertainties. However, mapping the consistency of model projections in such a way that this information is communicated clearly remains a challenge that several recently published papers have sought to address in the run-up to the IPCC AR5. We highlight three remaining issues that have not been fully addressed by the literature to date. (1) While additional information about regions where projected changes in rainfall are not ‘statistically significant’ can provide useful information for policy, the spatial scale at which changes are assessed has a substantial impact on the signal-to-noise ratio, and thus the detectability of changes (Allen and Ingram, Nature 419:224, 2002). We demonstrate that by spatially smoothing the model projections we can provide more information about the nature of the signal for larger regions of the world. (2) Combining information about magnitude, consistency and statistical significance of projected changes in a single map can reduce legibility (Christensen et al. 2007). We demonstrate the difficulty of finding a ‘universal’ method suitable for a wide range of audiences. (3) In regions where projected changes in average rainfall are not statistically significant, changes in variability may still cause significant impacts (DEFRA 2012). We stress the need to communicate this effectively in order to avoid misleading users. Finally, we comment on regions of the world where messages for users of climate information about ensemble consistency have changed since AR4, noting that these changes are due largely to changes in the methods of measuring consistency rather than any discernible differences between the CMIP3 and CMIP5 ensembles.
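The smoothing argument in point (1) can be demonstrated numerically: averaging each ensemble member over larger boxes suppresses internal variability while preserving the large-scale signal, raising the signal-to-noise ratio. The sketch below uses a synthetic ensemble, not CMIP data.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Sketch with a synthetic ensemble: smoothing each member's projected
# rainfall change over larger regions averages out noise, so the
# ensemble-mean signal stands out more clearly against member spread.

rng = np.random.default_rng(1)
members, ny, nx = 20, 60, 120
signal = 0.3 * np.sin(np.linspace(0, 3 * np.pi, nx))[None, :] * np.ones((ny, 1))
ens = signal[None] + rng.normal(0, 1.0, (members, ny, nx))   # change + noise

for box in (1, 5, 15):                        # grid-point vs regional scales
    sm = uniform_filter(ens, size=(1, box, box))
    s2n = np.abs(sm.mean(0)) / sm.std(0)
    print(f"box {box:2d}: median signal-to-noise = {np.median(s2n):.2f}")
```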

9.
For many decades, attempts have been made to find a universal value of the critical bulk Richardson number ($Ri_{Bc}$; defined over the entire stable boundary layer). By analyzing an extensive large-eddy simulation database and various published wind-tunnel data, we show that $Ri_{Bc}$ is not a constant; rather, it strongly depends on bulk atmospheric stability. A (qualitatively) similar dependency, based on the well-known resistance laws, was reported by Melgarejo and Deardorff (J Atmos Sci 31:1324–1333, 1974) about forty years ago. To the best of our knowledge, this result has largely been ignored. Based on data analysis, we find that the stability-dependent $Ri_{Bc}$ estimates boundary-layer height more accurately than the conventional constant-$Ri_{Bc}$ approach. Furthermore, our results indicate that the common practice of setting $Ri_{Bc}$ to a constant in numerical modelling studies implicitly constrains the bulk stability of the simulated boundary layer. The proposed stability-dependent $Ri_{Bc}$ does not suffer from such an inappropriate constraint.
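The conventional diagnostic the paper revisits is easy to state: compute the bulk Richardson number profile and take the boundary-layer height as the first level where it exceeds the critical value. The sketch below uses synthetic profiles and the common constant $Ri_{Bc} = 0.25$ purely for illustration; that constant is exactly what the paper argues against.

```python
import numpy as np

# Sketch of the constant-Ri_Bc boundary-layer-height diagnostic
# (synthetic profiles; Ri_Bc = 0.25 is the conventional illustrative value).

g = 9.81
z = np.arange(10.0, 1000.0, 10.0)                 # heights (m)
theta_v = 300.0 + 0.01 * z                        # virtual potential temperature (K)
U = 8.0 * np.log(z / 0.1) / np.log(100.0 / 0.1)   # wind speed (m/s), toy profile
theta_s = theta_v[0]                              # near-surface reference

Ri_B = g * (theta_v - theta_s) * z / (theta_s * U**2)
Ri_Bc = 0.25                                      # conventional constant value
h = z[np.argmax(Ri_B > Ri_Bc)] if np.any(Ri_B > Ri_Bc) else z[-1]
print(f"diagnosed boundary-layer height ~ {h:.0f} m")
```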

10.
The scientific community is developing new global, regional, and sectoral scenarios to facilitate interdisciplinary research and assessment to explore the range of possible future climates and related physical changes that could pose risks to human and natural systems; how these changes could interact with social, economic, and environmental development pathways; the degree to which mitigation and adaptation policies can avoid and reduce risks; the costs and benefits of various policy mixes; and the relationship of future climate change adaptation and mitigation policy responses with sustainable development. This paper provides the background to and process of developing the conceptual framework for these scenarios, as described in the three subsequent papers in this Special Issue (Van Vuuren et al., 2013; O’Neill et al., 2013; Kriegler et al., submitted for publication in this Special Issue). The paper also discusses research needs to further develop, apply, and revise this framework in an iterative and open-ended process. A key goal of the framework design and its future development is to facilitate the collaboration of climate change researchers from a broad range of perspectives and disciplines to develop policy- and decision-relevant scenarios and explore the challenges and opportunities human and natural systems could face with additional climate change.

11.
We determine the parameters of the semi-empirical link between global temperature and global sea level in a wide variety of ways, using different equations, different data sets for temperature and sea level, and different statistical techniques. We then compare the projections of all these model versions (over 30) for a moderate global warming scenario over the period 2000–2100. We find the projections are robust and mostly within ±20% of that obtained with the method of Vermeer and Rahmstorf (Proc Natl Acad Sci USA 106:21527–21532, 2009), namely ~1 m for the given warming of 1.8°C. Lower projections are obtained only if the correction for reservoir storage is ignored and/or the sea-level data set of Church and White (Surv Geophys, 2011) is used. However, the latter provides an estimate of the base temperature $T_0$ that conflicts with the constraints from three other data sets, in particular with proxy data showing stable sea level over the period 1400–1800. Our new best-estimate model, which also accounts for groundwater pumping, is very close to the model of Vermeer and Rahmstorf (Proc Natl Acad Sci USA 106:21527–21532, 2009).
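The cited Vermeer and Rahmstorf relation has the form dH/dt = a(T − T0) + b dT/dt, which can be integrated directly for a given warming path. The sketch below uses parameter values of the order reported in that literature, labelled as illustrative rather than this paper's fits; reassuringly, it lands near the ~1 m figure quoted above.

```python
import numpy as np

# Sketch of the semi-empirical relation dH/dt = a (T - T0) + b dT/dt,
# integrated with yearly Euler steps over 2000-2100 for a smooth 1.8 C
# warming. Parameter values are illustrative (order of magnitude of the
# published fits), not this paper's estimates.

a, b, T0 = 5.6, -49.0, -0.4              # mm yr-1 K-1, mm K-1, K (illustrative)
years = np.arange(2000, 2101)
T = 0.6 + 1.8 * (years - 2000) / 100.0   # warming above the reference period (toy path)

H = 0.0
for i in range(len(years) - 1):
    dTdt = T[i + 1] - T[i]               # K per year
    H += a * (T[i] - T0) + b * dTdt      # mm contributed this year
print(f"projected rise 2000-2100 ~ {H / 1000:.2f} m")
```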

12.
With very few exceptions, limited area models (LAMs) used in operational NWP and regional climate modeling rely on the Davies (Q J R Meteorol Soc 102:405–418, 1976) relaxation lateral boundary conditions (LBCs), even though these make no effort to respect the basic mathematics of the problem. Although numerous schemes were proposed and tested in the early stages of primitive-equation LAM development in the seventies, LAM communities eventually, for the most part, settled on the relaxation LBCs with few questions asked. An exception is the Eta model used extensively at NCEP and several other centers, which employs the Mesinger (Contrib Atmos Phys 50:200–210, 1977) LBCs, designed on the basis of knowledge available before the introduction of the relaxation scheme. These prescribe variables along the outermost row of grid points only: all of them at inflow points, and one fewer at outflow points, where the tangential velocity components are extrapolated from the interior of the model domain. Additional schemes are in place to suppress separation of gravity-wave solutions on C-subgrids of the model’s E-grid. A recent paper by Veljovic et al. (Meteorol Zeitschrift 19:237–246, 2010) included three 32-day forecasts run with both the Eta and the relaxation LBCs, and compared some of their verification results. Here we extend this experiment with three additional forecasts to arrive at an ensemble of six members run with both schemes, and present a more complete discussion of the results. In addition, we show results of one of these forecasts in which the linear change of relaxation coefficients was replaced by the change recommended by Lehmann (Meteorol Atmos Phys 52:1–14, 1993). We feel that the results of our two verification schemes strongly suggest the advantage of the Eta scheme over conventional relaxation, raising doubts about the justification for the latter’s continued use.
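For readers unfamiliar with relaxation LBCs, the basic operation is a nudging of the model field toward the driving field in a zone next to the boundary, with weights decaying inward. The toy below shows that blend on a 1-D field, with the common linear weight profile and a smooth tanh-shaped alternative in the spirit of Lehmann's recommendation; the exact published coefficients are not reproduced here, and none of this is NWP code.

```python
import numpy as np

# Toy sketch of Davies-type relaxation: in a zone of nrelax points next to
# each boundary, blend the model field toward the driving (host) field with
# weights alpha that fall from 1 at the boundary to 0 in the interior.

def relax(field, driving, nrelax=8, profile="linear"):
    field = field.copy()
    k = np.arange(1, nrelax + 1)                   # 1 = outermost point
    if profile == "linear":
        alpha = 1.0 - (k - 1) / nrelax             # linear decay (common default)
    else:                                          # smooth, tanh-shaped alternative
        alpha = 0.5 * (1.0 - np.tanh(3.0 * (k - 1 - nrelax / 2) / nrelax))
    for side in (slice(0, nrelax), slice(-1, -nrelax - 1, -1)):
        field[side] = (1 - alpha) * field[side] + alpha * driving[side]
    return field

x = np.linspace(0, 2 * np.pi, 100)
model, host = np.sin(x), np.sin(x) + 0.5           # interior solution vs driving data
print(relax(model, host)[:8].round(2))             # blended values near one boundary
```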

13.
Chris Hope, Climatic Change, 2013, 117(3):531–543
PAGE09 is an updated version of the PAGE2002 integrated assessment model (Hope 2011a). The default PAGE09 model gives a mean estimate of the social cost of CO2 (SCCO2) of $106 per tonne of CO2, compared to $81 from the PAGE2002 model used in the Stern review (Stern 2007). The increase is the net result of several improvements incorporated into the PAGE09 model in response to the critical debate around the Stern review: the adoption of the A1B socio-economic scenario, rather than A2, whose population assumptions are now thought to be implausible; the use of ranges for the two components of the discount rate, rather than the single values used in the Stern review; a distribution for the climate sensitivity that is consistent with the latest estimates from IPCC (2007a); less adaptation than in PAGE2002, particularly in the economic sector, which was criticised as possibly over-optimistic; and a more theoretically justified basis of valuation that gives results appropriate to a representative agent from the focus region, the EU. The effect of each of these adjustments is quantified and explained.

14.
Greenhouse gas emission inventories are computed with rather low precision, and their uncertainty distributions may be asymmetric. This should be accounted for in compliance and trading rules. In this paper we model the uncertainty of inventories as intervals or as fuzzy numbers; the latter allows us to better shape the uncertainty distributions. The compliance and emission trading rules obtained generalize the results for symmetric uncertainty distributions considered in earlier papers by the present authors (Nahorski et al., Water Air Soil Pollut Focus 7(4–5):539–558, 2007; Nahorski and Horabik, J Energy Eng 134(2):47–52, 2008). However, unlike in the symmetric case, in the asymmetric fuzzy case approximations must be applied because of nonlinearities in the formulas. The final conclusion is that the interval uncertainty rules can be applied, but with a much higher substitutional noncompliance risk, which is a parameter of the rules.
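As a flavour of how asymmetric fuzzy uncertainty enters such rules, the toy below represents an inventory as a triangular fuzzy number and reads off the possibility that true emissions exceed a cap. This is a generic textbook construction under stated assumptions, not the authors' compliance or trading rules.

```python
# Toy illustration (not the authors' rules): an inventory reported as a
# triangular fuzzy number (lo, mode, hi) is compared with an emission cap K.
# The possibility that true emissions exceed K is the supremum of the
# membership function over values above K, a simple noncompliance indicator.

def exceedance_possibility(lo: float, mode: float, hi: float, K: float) -> float:
    """Possibility that a triangular fuzzy emission estimate exceeds cap K."""
    if K <= mode:
        return 1.0                      # the most plausible value already exceeds K
    if K >= hi:
        return 0.0                      # even the upper bound complies
    return (hi - K) / (hi - mode)       # membership of K on the falling branch

# Asymmetric uncertainty: mode 100, lower bound 90, but upper tail reaching 140.
print(exceedance_possibility(90.0, 100.0, 140.0, K=110.0))  # 0.75
```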

15.
Heat flux density at the soil surface ($G_0$) was evaluated hourly during daylight hours over a vegetation cover 0.08 m high with a leaf area index of 1.07 m² m⁻², using the Choudhury et al. (Agric For Meteorol 39:283–297, 1987) ($G_0^{\text{rn}}$), Santanello and Friedl (J Appl Meteorol 42:851–862, 2003) ($G_0^{\text{s}}$), and force-restore ($G_0^{\text{fr}}$) models and the plate calorimetry methodology ($G_0^{\text{pco}}$), with the gradient calorimetry methodology ($G_{0R}$) serving as the reference for determining $G_0$. It was found that $G_{0R}$ peaked at 1 p.m., with values between 60 and 100 W m⁻², and that the $G_0/R_n$ ratio varied during the day, from values close to zero in the early morning to close to 0.25 in the last hours of daylight. The $G_0^{\text{s}}$ model showed the best performance, followed by the $G_0^{\text{rn}}$ and $G_0^{\text{fr}}$ models. The plate calorimetry methodology behaved similarly to the gradient calorimetry reference.
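The diurnal behaviour of the $G_0/R_n$ ratio reported above (near zero in the early morning, near 0.25 late in the day) can be turned into a quick worked example. The linear ramp and sinusoidal net radiation below are assumptions for illustration only; they are not one of the tested models.

```python
import numpy as np

# Worked toy based on the reported ratio range: G0/Rn rising from ~0 at
# dawn to ~0.25 at the end of daylight, applied to a toy Rn curve.
# Both the linear ramp and the Rn shape are assumed, for illustration.

hours = np.arange(7, 19)                           # daylight hours
Rn = 400.0 * np.sin(np.pi * (hours - 6) / 12.0)    # net radiation, W m-2 (toy)
ratio = 0.25 * (hours - 7) / (18 - 7)              # G0/Rn from 0 to 0.25
G0 = ratio * Rn
for h, g in zip(hours, G0):
    print(f"{h:02d}:00  G0 ~ {g:6.1f} W m-2")
```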

16.
The Weibull distribution is commonly used to describe climatological wind-speed distributions in the atmospheric boundary layer. While vertical profiles of mean wind speed in the atmospheric boundary layer have received significant attention, the variation of the shape of the wind distribution with height is less well understood. Previously we derived a probabilistic model based on similarity theory for calculating the effects of stability and planetary boundary-layer depth upon long-term mean wind profiles. However, some applications (e.g. wind energy estimation) require the Weibull shape parameter ($k$) as well as the mean wind speed. Towards the aim of improving predictions of the Weibull-$k$ profile, we develop expressions for the profile of the long-term variance of wind speed, including a method extending our probabilistic wind-profile theory; together these two profiles lead to a profile of the Weibull shape parameter. Further, an alternative model for the vertical profile of the Weibull shape parameter is developed, improving upon a basis set forth by Wieringa (Boundary-Layer Meteorol 47:85–110, 1989) and connecting with a newly corrected corollary of the perturbed geostrophic-drag theory of Troen and Petersen (European Wind Atlas, Risø National Laboratory, Roskilde, 1989). Comparing the models for Weibull-$k$ profiles, a new interpretation and explanation is given for the vertical variation of the shape of wind-speed distributions. Results of the modelling are shown for a number of sites, with a discussion of the models’ efficacy and applicability. The latter includes a comparative evaluation of Wieringa-type empirical models and perturbed-geostrophic forms with regard to surface-layer behaviour, as well as for heights where climatological wind-speed variability is not dominated by surface effects.
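The link between the two profiles and $k$ rests on the standard Weibull moment identities: mean $\mu = A\,\Gamma(1+1/k)$ and variance $\sigma^2 = A^2[\Gamma(1+2/k) - \Gamma(1+1/k)^2]$, so $k$ follows from the ratio $\sigma/\mu$ alone. The sketch below solves that relation numerically; the profile values fed in are synthetic, not the paper's.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma

# Sketch of the moment relation: Gamma(1+2/k)/Gamma(1+1/k)^2 = 1 + (sigma/mu)^2,
# solved for k at each height (synthetic mu/sigma values for illustration).

def weibull_k(mu: float, sigma: float) -> float:
    f = lambda k: gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1 - (sigma / mu) ** 2
    return brentq(f, 0.3, 20.0)        # bracket chosen to cover realistic winds

for z, mu, sigma in [(10, 5.0, 2.6), (60, 7.0, 3.2), (120, 8.0, 3.4)]:
    print(f"z = {z:3d} m: k ~ {weibull_k(mu, sigma):.2f}")
```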

17.
Laboratory Experiments on Convective Entrainment Using a Saline Water Tank
Entrainment fluxes in a shear-free convective boundary layer have been measured with a saline water-tank set-up. The experiments were targeted at measuring entrainment behaviour for medium to high Richardson numbers and used a two-layer design, i.e. two stacked non-stratified (neutral) layers with different densities. With laser-induced fluorescence (LIF), the entrainment flux of a fluorescent dye was measured for bulk Richardson numbers in the range 30–260. It is proposed that a carefully chosen combination of top-down and bottom-up processes improves the accuracy of LIF-based entrainment observations. The observed entrainment fluxes are about an order of magnitude lower than reported for thermal water tanks: the derived buoyancy entrainment ratio, $A$, is found to be $A \approx 0.02$, compared with $A \approx 0.25$ for a thermal convection tank (Deardorff et al., J Fluid Mech 100:41–64, 1980). An extensive discussion is devoted to the influence of the Reynolds and Prandtl numbers on entrainment in laboratory experiments.

18.
We evaluate the claim by Gay et al. (Clim Change 94:333–349, 2009) that “surface temperature can be better described as a trend stationary process with a one-time permanent shock” than efforts by Kaufmann et al. (Clim Change 77:249–278, 2006) to model surface temperature as a time series that contains a stochastic trend that is imparted by the time series for radiative forcing. We test this claim by comparing the in-sample forecast generated by the trend stationary model with a one-time permanent shock to the in-sample forecast generated by a cointegration/error correction model that is assumed to be stable over the 1870–2000 sample period. Results indicate that the in-sample forecast generated by the cointegration/error correction model is more accurate than the in-sample forecast generated by the trend stationary model with a one-time permanent shock. Furthermore, Monte Carlo simulations of the cointegration/error correction model generate time series for temperature that are consistent with the trend-stationary-with-a-break result generated by Gay et al. (Clim Change 94:333–349, 2009), while the time series for radiative forcing cannot be modeled as trend stationary with a one-time shock. Based on these results, we argue that modeling surface temperature as a time series that shares a stochastic trend with radiative forcing offers the possibility of greater insights regarding the potential causes of climate change and efforts to slow its progression.
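The core statistical question (do temperature and forcing share a stochastic trend?) is the classic cointegration setup, and a minimal version can be run in a few lines. The sketch below applies an Engle-Granger test to synthetic series standing in for the actual records; it illustrates the test, not the paper's estimation.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

# Sketch of the shared-stochastic-trend test (synthetic series, not the
# actual forcing/temperature records): Engle-Granger cointegration asks
# whether a linear combination of the two integrated series is stationary.

rng = np.random.default_rng(42)
n = 130                                        # ~1870-2000, annual steps
forcing = np.cumsum(rng.normal(0.02, 0.1, n))  # random-walk "forcing" (toy)
temp = 0.5 * forcing + rng.normal(0, 0.1, n)   # temperature tied to forcing

t_stat, p_value, _ = coint(temp, forcing)
print(f"Engle-Granger t = {t_stat:.2f}, p = {p_value:.3f}")
# a small p-value supports the shared-stochastic-trend description
```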

19.
Static flux chamber measurements of CCl4 uptake by soils in boreal, subtropical and tropical forests have been used to reassess the sink strength for this ozone-depleting chemical. Happell and Roche (Geophys Res Lett 30(2):1088–1091, 2003) used flux estimates from soil concentration gradients to calculate a partial CCl4 atmospheric lifetime (τsoil) of 90 years. More recently, it has been assumed that a better estimate of τsoil is 195 years (Montzka et al. 2011). In the work here, the rate of CCl4 uptake was calculated from 453 flux chamber measurements using an exponential fit to the change of chamber CCl4 concentration with time. This analysis indicated that the flux rate in Happell and Roche (2003) was overestimated by a factor of 2.75, yielding a new estimate of τsoil for CCl4 of 245 years. No significant correlations of CCl4 uptake with temperature, soil moisture, or time of year were observed. This work provides additional evidence that CCl4 uptake by soils is a common process that needs to be considered when developing an atmospheric budget for this compound.
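The exponential-fit step described above is straightforward to reproduce in outline: fit C(t) = C0·exp(−kt) to the chamber concentration record, then convert the decay rate to an areal flux. The chamber volume, footprint and data below are assumed for illustration; only the fitting procedure mirrors the description in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the exponential-fit step (synthetic chamber data): concentration
# in a closed chamber over a soil sink is modelled as C(t) = C0*exp(-k t);
# the initial uptake flux then follows as F = k * C0 * V / A for chamber
# volume V and footprint area A (both assumed here).

def decay(t, C0, k):
    return C0 * np.exp(-k * t)

t = np.linspace(0, 3600, 13)                     # seconds, samples every 5 min
true = decay(t, 100.0, 2e-4)                     # ppt, toy "truth"
C_obs = true * (1 + np.random.default_rng(3).normal(0, 0.01, t.size))

(C0, k), _ = curve_fit(decay, t, C_obs, p0=(90.0, 1e-4))
V, A = 0.03, 0.1                                 # m3, m2 (illustrative chamber)
print(f"k = {k:.2e} s-1, initial uptake flux ~ {k * C0 * V / A:.3g} ppt m s-1")
```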

20.
A high-quality inventory is an important step toward greenhouse gas emission mitigation. Inventory quality is estimated by means of uncertainty analysis, and the level of uncertainty depends upon the reliability of the activity data and parameters used. An attempt has been made to improve the accuracy of the estimates through a shift from a production-based method (IPCC Tier 1) (IPCC 2000) to an enhanced combination of production-based and mass-balance methods (IPCC Tier 2) (IPCC 2006) in the estimation of emissions from oil operations, which are a key category in the national greenhouse gas inventory of the Russian Federation. The IPCC Tier 2 method (IPCC 2006) was adapted to national conditions. Greenhouse gas emissions were calculated for 1990 to 2009 with both methods. A quantitative uncertainty assessment of the calculations was performed and the outcomes were compared. The comparison showed that the estimates made with the higher-tier method had higher accuracy and lower uncertainty (26 %, compared to the previously derived 54 %).
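The mechanics of such an uncertainty comparison can be shown with a small Monte Carlo: emissions are activity data times an emission factor, and tighter, country-specific Tier 2 inputs shrink the 95 % interval of the total. The input spreads below are illustrative assumptions, not the inventory's actual figures.

```python
import numpy as np

# Sketch of the Tier 1 vs Tier 2 uncertainty comparison (illustrative input
# spreads, not the inventory's): propagate activity * emission factor by
# Monte Carlo and report the half-width of the 95 % interval.

rng = np.random.default_rng(7)
n = 100_000
activity = rng.normal(1.0, 0.05, n)            # well-constrained activity data

tier1 = activity * rng.normal(1.0, 0.25, n)    # default (Tier 1) emission factor
tier2 = activity * rng.normal(1.0, 0.10, n)    # country-specific (Tier 2) factor

for name, e in (("Tier 1", tier1), ("Tier 2", tier2)):
    half = (np.percentile(e, 97.5) - np.percentile(e, 2.5)) / 2
    print(f"{name}: ±{100 * half / e.mean():.0f} % (95 % interval)")
```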
