Limiting global warming to ‘well below’ 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase even further to 1.5°C is an integral part of the 2015 Paris Agreement. To achieve these aims, cumulative global carbon emissions after 2016 should not exceed 940 to 390 Gt of CO2 (for the 2°C target) and 167 to −48 Gt of CO2 (for the 1.5°C target) by the end of the century. This paper analyses the EU’s cumulative carbon emissions in different models and scenarios (global models, EU-focused models and national carbon mitigation scenarios). Due to the higher reductions in energy use and carbon intensity of the end-use sectors in the national scenarios, we identify an additional mitigation potential of 26–37 Gt of cumulative CO2 emissions up to 2050 compared to what is currently included in global or EU scenarios. These additional reductions could help both to reduce the need for carbon dioxide removals and to bring cumulative emissions in global and EU scenarios in line with a fairness-based domestic EU budget for a 2°C target, while still remaining well above the budget for 1.5°C.

Key policy insights
Models used for policy advice, such as global integrated assessment models or EU models, fail to capture certain mitigation potential available at the sectoral level.
Global and EU models assume significant CO2 emission reductions from carbon capture and storage, not only to reach the 1.5°C target but also to reach the 2°C target.
Global and EU model scenarios are not compatible with a fair domestic EU share in the global carbon budget for either the 2°C or the 1.5°C target.
Integrating additional sectoral mitigation potential from detailed national models can help bring down cumulative emissions in global and EU models to a level comparable to a fairness-based domestic EU share compatible with the 2°C target, but not the 1.5°C aspiration.
Finding the shortest path through open spaces is a well-known challenge for pedestrian routing engines. A common solution is routing along the open-space boundary, which in most cases produces an unnecessarily long route. A possible alternative is to create a subgraph within the open space. This paper assesses this approach and investigates its implications for routing engines. A number of algorithms (Grid, Spider-Grid, Visibility, Delaunay, Voronoi, Skeleton) have been evaluated by four different criteria: (i) number of additional graph edges created, (ii) additional graph-creation time, (iii) route-computation time, (iv) routing quality. We show that each algorithm has advantages and disadvantages depending on the use case. We identify the Visibility algorithm with a reduced number of edges in the subgraph, and the Spider-Grid algorithm with a large grid size, as good compromises in many scenarios.
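As a hedged illustration of the subgraph idea, the simplest of the listed strategies (the Grid algorithm) can be sketched for an obstacle-free rectangular open space; the function names, parameters, and 8-connected neighbourhood below are illustrative assumptions, not taken from any of the evaluated implementations:

```python
import heapq

def grid_subgraph(width, height, cell):
    # Create grid nodes inside a rectangular open space and connect each
    # node to its 8 neighbours, weighted by Euclidean distance.
    nodes = [(x, y) for x in range(0, width + 1, cell)
                    for y in range(0, height + 1, cell)]
    idx = {n: i for i, n in enumerate(nodes)}
    edges = {i: [] for i in range(len(nodes))}
    for (x, y), i in idx.items():
        for dx in (-cell, 0, cell):
            for dy in (-cell, 0, cell):
                nb = (x + dx, y + dy)
                if nb != (x, y) and nb in idx:
                    edges[i].append((idx[nb], (dx * dx + dy * dy) ** 0.5))
    return nodes, idx, edges

def dijkstra(edges, src, dst):
    # Standard Dijkstra shortest-path search on the subgraph.
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in edges[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")
```

On a 100 m × 100 m square with 10 m cells, the grid route from one corner to the opposite corner is about 141.4 m, versus 200 m when routing along the boundary, which is exactly the detour the subgraph approach avoids. A finer cell size improves route quality (criterion iv) at the cost of more edges and longer creation and query times (criteria i–iii).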
The risk of flooding in Venice has increased strongly since the beginning of the century. To reduce the damage to the city and the negative impact on activities in the lagoon, an accurate flood warning system is necessary. This system will also be fundamental during the construction, and for the efficient operation, of storm surge barriers covering the three existing inlets of the lagoon. In this context, new operational statistical and hydrodynamic models have been developed. Forecast winds and pressure fields, which constitute basic information for the warning system, have been obtained through an ad hoc Limited Area Meteorological model. It has been demonstrated that, provided this information is available on an operational basis, the implementation of a flood warning system for Venice using the models developed is feasible. The statistical model, which is based on a multiple regression technique, extends the forecasting range of the model presently in operation at the Centro Previsioni e Segnalazioni Maree del Comune di Venezia from 3 hrs up to 24 hrs, and presents good accuracy (estimated mean absolute errors smaller than 10 cm) for short-term forecasts up to 9 hrs. The hydrodynamic model includes all the physical processes important for the simulation of water levels and currents in coastal and marine environments. The model set-up adopted covers the entire Adriatic Sea, with a grid spacing of 6 km. Special attention has been given to the positioning of the open boundary and to the correct reproduction of the main free oscillation of the Adriatic, which is responsible for the possible recurrence of flooding after the main storm has passed. The inclusion of this model in a flood warning system is mainly intended for long-term forecasts (> 24 hrs), and can typically be used to forecast up to 3–4 days ahead, with an estimated mean absolute error smaller than 20 cm.
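The statistical component rests on multiple regression: the forecast water level is fitted as a linear combination of lagged predictors (e.g. observed levels, pressure, wind). A minimal sketch of such a fit via the normal equations follows; the predictors and data are purely illustrative and have no connection to the operational Venice model:

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system A x = b.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(X, y):
    # Ordinary least squares via the normal equations (X^T X) beta = X^T y.
    # Each row of X holds the predictors for one observation, with a leading 1
    # for the intercept term.
    p = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    return solve(XtX, Xty)
```

In an operational setting, each row of X would hold lagged observations (water level, pressure, wind) and y the surge measured some hours later; the fitted coefficients then turn the latest observations into a forecast.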
The biogeochemistry of sedimentary sulfur was investigated on the continental shelf off central Chile at water depths between 24 and 88 m under partial influence of an oxygen minimum zone. Dissolved and solid iron and sulfur species, including the sulfur intermediates sulfite, thiosulfate, and elemental sulfur, were analyzed at high resolution in the top 20 cm. All stations were characterized by high rates of sulfate reduction, but only the sediments within the Bay of Concepción contained dissolved sulfide. Due to advection and/or in-situ reoxidation of sulfide, dissolved sulfate was close to bottom water values. Whereas the concentrations of sulfite and thiosulfate were mostly in the submicromolar range, elemental sulfur was by far the dominant sulfur intermediate. Although the large nitrate- and sulfur-storing bacteria Thioploca were abundant, the major part of S0 was located extracellularly. The distribution of sulfur species and dissolved iron suggests the reaction of sulfide with FeOOH as an important pathway for sulfide oxidation and sulfur intermediate formation. This is in agreement with the sulfur isotope composition of co-existing elemental sulfur and iron monosulfides. In the Bay of Concepción, sulfur isotope data suggest that pyrite formation proceeds via the reaction of FeS with polysulfides or H2S. At the shelf stations, on the other hand, pyrite was significantly depleted in 34S relative to its potential precursors FeS and S0. Isotope mass balance considerations suggest further that pyritization at depth includes light sulfide, potentially originating from bacterial sulfur disproportionation. The δ34S-values of pyrite down to −38‰ vs. V-CDT are among the lightest found in organic-rich marine sediments. Seasonal variations in the sulfur isotope composition of dissolved sulfate indicated a dynamic non-steady-state sulfur cycle in the surface sediments. 
The 18O content of porewater sulfate increased with depth at all sites compared to the bottom water composition due to intracellular isotope exchange reactions during microbial sulfur transformations.
At the beginning of the twenty-first century, a technological change took place in geodetic astronomy with the development of Digital Zenith Camera Systems (DZCS). Such instruments provide vertical deflection data at an angular accuracy level of 0.1″ and better. Recently, DZCS have been employed for the collection of dense sets of astrogeodetic vertical deflection data in several test areas in Germany where high-resolution digital terrain model (DTM) data (10–50 m resolution) are available. These considerable advancements motivate a new analysis of the method of astronomical-topographic levelling, which uses DTM data for the interpolation between the astrogeodetic stations. We present and analyse a least-squares collocation technique that uses DTM data for the accurate interpolation of vertical deflection data. The combination of both data sets allows a precise determination of the gravity field along profiles, even in regions with rugged topography. The accuracy of the method is studied with particular attention to the density of astrogeodetic stations. The error propagation rule of astronomical levelling is derived empirically; it accounts for the signal omission that increases with the station spacing. In a test area located in the German Alps, the method was successfully applied to the determination of a quasigeoid profile of 23 km length. For a station spacing from a few hundred metres to about 2 km, the accuracy of the quasigeoid was found to be about 1–2 mm, which corresponds to a relative accuracy of about 0.05–0.1 ppm. Application examples are given, such as the local and regional validation of gravity field models computed from gravimetric data, and economic gravity field determination in geodetically less covered regions.
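The interpolation step can be illustrated with a toy one-dimensional least-squares collocation, ignoring the DTM-based topographic reduction and the noise covariance that the actual method includes; the Gaussian covariance model and its parameters below are illustrative assumptions only:

```python
import math

def lsc_predict(xs, ls, xp, c0=1.0, corr_len=1.5):
    # Least-squares collocation in one dimension with a Gaussian covariance
    # model C(d) = c0 * exp(-(d / corr_len)**2). The prediction at xp is
    # s_p = c_p^T C^{-1} l, where C is the covariance matrix of the
    # observations and c_p the cross-covariance vector to the new point.
    n = len(xs)
    cov = lambda a, b: c0 * math.exp(-((a - b) / corr_len) ** 2)
    C = [[cov(xs[i], xs[j]) for j in range(n)] for i in range(n)]
    # Solve C w = l by Gaussian elimination with partial pivoting.
    M = [row[:] + [li] for row, li in zip(C, ls)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][k] * w[k] for k in range(r + 1, n))) / M[r][r]
    return sum(cov(xp, xi) * wi for xi, wi in zip(xs, w))
```

With zero observation noise, collocation reproduces the data exactly at the station locations and interpolates smoothly between them; in the profile application, xs would correspond to station positions along the profile and ls to the topographically reduced vertical deflections.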