Similar Documents
20 similar documents retrieved (search time: 31 ms)
1.
Wildfire is a natural component of sagebrush (Artemisia spp.) steppe rangelands that induces temporal shifts in plant community physiognomy, ground surface conditions, and erosion rates. Fire alteration of the vegetation structure and ground cover in these ecosystems commonly amplifies soil losses by wind- and water-driven erosion. Much of the fire-related erosion research for sagebrush steppe has focused on either erosion by wind over gentle terrain or water-driven erosion under high-intensity rainfall on complex topography. However, many sagebrush rangelands are geographically positioned in snow-dominated uplands with complex terrain in which runoff and sediment delivery occur primarily in winter months associated with cold-season hydrology. Current understanding is limited regarding fire effects on the interaction of wind- and cold-season hydrologic-driven erosion processes for these ecosystems. In this study, we evaluated fire impacts on vegetation, ground cover, soils, and erosion across spatial scales at a snow-dominated mountainous sagebrush site over a 2-year period post-fire. Vegetation, ground cover, and soil conditions were assessed at various plot scales (8 m2 to 3.42 ha) through standard field measures. Erosion was quantified through a network of silt fences (n = 24) spanning hillslope and side channel or swale areas, ranging from 0.003 to 3.42 ha in size. Sediment delivery at the watershed scale (129 ha) was assessed by suspended sediment samples of streamflow through a drop-box v-notch weir. Wildfire consumed nearly all above-ground live vegetation at the site and resulted in more than 60% bare ground (bare soil, ash, and rock) in the immediate post-fire period. Widespread wind-driven sediment loading of swales was observed over the first month post-fire and extensive snow drifts were formed in these swales each winter season during the study. 
In the first year, sediment yields from north- and south-facing aspects averaged 0.99–8.62 t ha−1 at the short-hillslope scale (~0.004 ha), 0.02–1.65 t ha−1 at the long-hillslope scale (0.02–0.46 ha), and 0.24–0.71 t ha−1 at the swale scale (0.65–3.42 ha), and watershed-scale sediment yield was 2.47 t ha−1. By the second year post-fire, foliar cover exceeded 120% across the site, but bare ground remained more than 60%. Sediment yield in the second year was greatly reduced across short- to long-hillslope scales (0.02–0.04 t ha−1), but was similar to first-year measures for swale plots (0.24–0.61 t ha−1) and at the watershed scale (3.05 t ha−1). Nearly all the sediment collected across all spatial scales was delivered during runoff events associated with cold-season hydrologic processes, including rain-on-snow, rain-on-frozen soils, and snowmelt runoff. Approximately 85–99% of the annual sediment collected across all silt fence plots each year came from swales. The high levels of sediment delivered across hillslope to watershed scales in this study are attributed to observed preferential loading of fine sediments into swale channels by aeolian processes in the immediate post-fire period and subsequent flushing of these sediments by runoff from cold-season hydrologic processes. Our results suggest that the interaction of aeolian and cold-season hydrologic-driven erosion processes is an important component for consideration in post-fire erosion assessment and prediction, and can have profound implications for soil loss from these ecosystems. © 2019 John Wiley & Sons, Ltd.

2.
Large wildfires can have profound and lasting impacts, not only from direct consumption of vegetation but also through longer-term effects such as persistent soil erosion. The 2002 Hayman Fire burned in one of the watersheds supplying water to the Denver metropolitan area; thus there was concern regarding hillslope erosion and sedimentation in the reservoirs. The efficacy of various treatments for reducing erosion was tested, including hand scarification on contour, agricultural straw mulch, wood mulch, burned controls, and unburned reference plots. Simulated rill erosion experiments were used both immediately after the fire and again 10 years post-fire. To better understand untreated recovery, the same experiments were applied to control plots in post-fire years 1, 2, 3, and 4, and in unburned reference plots in years 4 and 10. Results indicate that control and scarified plots produced significantly greater sediment flux rates – 1.9 and 2.8 g s−1 respectively – than the straw and wood mulch treatments – 0.9 and 1.1 g s−1 – immediately after the fire. Mulch treatments reduced runoff rate, runoff velocity, and sediment concentration and flux rate. Vegetation regrowth was slow, and the mulch treatments provided effective cover that reduced sediment immediately after the fire. By year 10 the straw mulch cover was no longer present, whereas the wood mulch persisted. In post-fire year 10 there were no significant differences in sediment flux rates across treatments; it is notable, however, that the wood mulch treatment (0.09 g s−1) most closely approached the unburned condition (0.07 g s−1). The burned control plots had high sediment flux rates until post-fire year 3, when flux rates decreased significantly and were statistically no longer higher than the unburned levels measured in years 4 and 10. These results will inform managers of longer-term post-fire sediment delivery rates and of the ability of post-fire emergency hillslope treatments to mitigate erosion. Published 2019.
This article is a U.S. Government work and is in the public domain in the USA.

3.
Sediment delivery following post-fire logging is a concern relative to water quality. While studies have assessed the effect of post-fire logging on sediment yields at different spatial scales, none have explicitly identified sediment sources. Our goal was to quantify post-fire and post-salvage-logging sediment yields and use rill patterns to identify sediment sources. We measured the extent and type of logging disturbance, length of rills per unit area or “rill density”, ground cover, and sediment yields in nine logged and five control small catchments or “swales”, 0.09 to 0.81 ha, for 5 years after the 2013 Rim Fire in California's Sierra Nevada. The logged swales had a mean ground disturbance of 31%. After the first wet season following logging, there was no difference between the control and logged swales in either mean rill density (0.071 and 0.088 m m−2, respectively) or mean transformed, normalized sediment yields. Untransformed mean sediment yields across three sites ranged from 0.11–11.8 and 1.1–3.2 Mg ha−1 for the control and salvage-logged swales, respectively. Rill density was strongly related to sediment yield and increased significantly with the amount of high-traffic skid trail disturbance in logged swales. Rill density was not significantly related to the amount of bare soil, despite a significant relationship between sediment yields and bare soil. Rills usually initiated in bare soil and frequently connected high-traffic skid trails to the drainage network after being diverted by waterbars. Rill connectivity and sediment yields decreased in control and logged swales where vegetation or other surface cover was high, suggesting that this cover disconnected rills from the drainage network. Increasing ground cover on skid trails and between areas disturbed by post-fire logging and stream channels may reduce sediment yields as well as the hydrologic connectivity between hillslopes and the drainage network.

4.
Forest land affected by deforestation yields high soil and water losses. Suitable management practices are needed to reduce these losses and achieve ecological and hydrological sustainability of deforested areas. Mulch has been found to be effective in reducing soil losses; straw mulch is easy to apply, contributes soil organic matter, and is effective from the day of application. However, the combined effects of rice straw mulch at different application rates and lengths on surface runoff and soil loss have not been clarified in depth. The present paper evaluates the efficiency of rice straw mulch in reducing the hydrological response of a silty clay loam soil under high-intensity, low-frequency rainfall events (tap water with a total depth of 49 mm and an intensity of 98 mm/h) simulated in the laboratory. Surface runoff and soil loss at three straw lengths (10, 30, and 200 mm) and three application rates (1, 2, and 3 Mg/ha) were measured in 50 cm (width) × 100 cm (length) × 10 cm (depth) plots with disturbed soil samples (aggregate size < 4 mm) collected in a deforested area. Bare soil was used as the control. Runoff volume and erosion were significantly (at p < 0.05) lower in mulched soils than in control plots. These reductions were ascribed to the water absorption capacity of the rice straw and the protective cover of the mulch layer. The minimum runoff was observed for a mulch layer of 3 Mg/ha of straw with a length of 200 mm. The lowest soil losses were found with a straw length of 10 mm. The models developed predict runoff and erosion as simple linear functions of mulch application rate and length, and can be used for suitable hydrological management of soil. It is concluded that rice straw mulch used as an organic soil conditioner significantly (at p < 0.05) reduces soil erosion and surface runoff, and that the mulch cover helps reduce the risk of soil degradation. Further research is, however, needed to analyze the upscaling of the hydrological effects of mulching from the plot to the hillslope scale.
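The runoff and erosion models in the abstract above are described as simple linear functions of mulch application rate and straw length. A minimal sketch of fitting such a model by ordinary least squares, using hypothetical plot data (the coefficients and measurements below are illustrative assumptions, not values from the study):

```python
import numpy as np

# Hypothetical plot observations: mulch rate (Mg/ha), straw length (mm),
# and measured runoff (mm). The runoff values are generated from an assumed
# linear relation runoff = 20 - 3*rate - 0.02*length so the fit can be checked.
rate   = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
length = np.array([10.0, 200.0, 10.0, 200.0, 10.0, 200.0])
runoff = 20.0 - 3.0 * rate - 0.02 * length

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones_like(rate), rate, length])
coef, *_ = np.linalg.lstsq(X, runoff, rcond=None)
intercept, b_rate, b_length = coef
print(intercept, b_rate, b_length)  # recovers 20, -3, -0.02 exactly
```

With real plot data the recovered coefficients would carry scatter, and separate fits would be made for runoff and for soil loss.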

5.
Continuing long and extensive wildfire seasons in the Western US emphasize the need for better understanding of wildfire impacts, including post-fire management scenarios. Advancements in our understanding of post-fire hillslope erosion and watershed responses such as flooding, sediment yield, and debris flows have recently received considerable attention. The potential impacts of removing dead trees, called salvage logging, have been studied; however, the use of remotely sensed imagery after salvage logging to evaluate spatial patterns and recovery is novel. The 2015 North Star Fire provided an opportunity to evaluate hillslope erosion reduction using two field experiments and coincident remotely sensed imagery over 3 years. Simulated rill experiments with four flow rates were used to quantify hillslope erosion on skidder trails with and without added logging slash, compared with a burned-only control. Seven replicated hillslope silt fence plots with the same treatments were also evaluated under natural rainfall events. WorldView-2 satellite imagery was used to relate ground cover and erodible bare soil between the two experiments using multi-temporal Normalized Difference Vegetation Index (NDVI) values. Results indicate that the skid trails produced significantly more sediment (0.70 g s−1) than either the slash-treated skid trail (0.34 g s−1) or controls (0.04 g s−1) in the simulated rill experiment. Similarly, under natural rainfall conditions, sediment yield from hillslope silt fence plots was significantly greater for the skid trail (3.42 Mg ha−1) than for either the slash-treated skid trail (0.18 Mg ha−1) or controls (0 Mg ha−1). An NDVI value of 0.32 on all plots over all years corresponded to a ground cover of about 60%, which is an established threshold for erosion reduction. Significant relationships between NDVI, ground cover, and sediment values suggest that NDVI may help managers remotely evaluate ground cover and erosion potential after disturbances such as wildfire or salvage logging.
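NDVI, the index used in the study above, is computed from red and near-infrared surface reflectance as (NIR − Red)/(NIR + Red). A small sketch applying the reported 0.32 cover threshold to hypothetical pixel reflectances (the reflectance values here are illustrative, not from the imagery used in the study):

```python
import numpy as np

# Hypothetical red and near-infrared surface reflectances for four pixels.
red = np.array([0.30, 0.12, 0.25, 0.08])
nir = np.array([0.35, 0.40, 0.30, 0.45])

# Normalized Difference Vegetation Index.
ndvi = (nir - red) / (nir + red)

# The study linked NDVI >= 0.32 to roughly 60% ground cover,
# an established threshold for erosion reduction.
above_threshold = ndvi >= 0.32
print(np.round(ndvi, 3), above_threshold)
```

Applied per pixel to each image date, this yields the multi-temporal recovery maps the abstract describes.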

6.
Reliable quantitative data on the extent and rates of soil erosion are needed to understand the global significance of soil-erosion-induced carbon exchange and to underpin the development of science-based mitigation strategies, but large uncertainties remain. Existing estimates of agricultural soil and soil organic carbon (SOC) erosion are very divergent and span two orders of magnitude. The main objective of this study was to test the assumptions underlying existing assessments and to reduce the uncertainty associated with global estimates of agricultural soil and SOC erosion. We parameterized a simplified erosion model driven by coarse global databases using an empirical database that covers the conterminous USA. The good agreement between our model results and empirical estimates indicates that the approach presented here captures the essence of agricultural erosion at the scale of continents and that it may be used to predict the significance of erosion for the global carbon cycle and its impact on soil functions. We obtained a global soil erosion rate of 10.5 Mg ha−1 y−1 for cropland and 1.7 Mg ha−1 y−1 for pastures. This corresponds to SOC erosion rates of 193 kg C ha−1 y−1 for cropland and 40.4 kg C ha−1 y−1 for eroding pastures, and results in a global flux of 20.5 (±10.3) Pg y−1 of soil and 403.5 (±201.8) Tg C y−1. Although it is difficult to accurately assess the uncertainty associated with our estimates of global agricultural erosion, mainly due to the lack of model testing in (sub)tropical regions, our estimates are significantly lower than former assessments based on the extrapolation of plot experiments or global application of erosion models. Our approach has the potential to quantify the rate and spatial signature of the erosion-induced disturbance at continental and global scales: by linking our model with a global soil profile database, we estimated soil profile modifications induced by agriculture. This showed that erosion-induced changes in topsoil SOC content are significant at a global scale (an average SOC loss of 22% in 50 years) and that agricultural soils should therefore be considered dynamic systems that can change rapidly. Copyright © 2011 John Wiley & Sons, Ltd.
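The global fluxes reported above follow from multiplying the per-hectare rates by global land areas. A rough consistency check, assuming approximate global areas of 1.5 × 10⁹ ha of cropland and 2.8 × 10⁹ ha of pasture (these area figures are my assumption, roughly FAO-scale, and are not stated in the abstract):

```python
# Per-area erosion rates reported above.
soil_rate_crop, soil_rate_past = 10.5, 1.7     # Mg ha^-1 yr^-1
soc_rate_crop, soc_rate_past = 0.193, 0.0404   # Mg C ha^-1 yr^-1

# Assumed global areas (ha); roughly FAO-scale figures, not from the abstract.
area_crop, area_past = 1.5e9, 2.8e9

# 1 Pg = 1e9 Mg; 1 Tg = 1e6 Mg.
soil_flux_pg = (soil_rate_crop * area_crop + soil_rate_past * area_past) / 1e9
soc_flux_tg = (soc_rate_crop * area_crop + soc_rate_past * area_past) / 1e6

print(round(soil_flux_pg, 1), round(soc_flux_tg, 1))
# → 20.5 402.6, close to the reported 20.5 Pg y^-1 and 403.5 Tg C y^-1
```

The implied average SOC content of eroded cropland soil, 193/10500 ≈ 1.8%, is also physically plausible for topsoil.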

7.
Post-fire rehabilitation treatments are commonly implemented after high-severity wildfires, but few data are available about the efficacy of these treatments. This study assessed post-fire erosion rates and the effectiveness of seeding, straw mulching, and contour felling in reducing erosion after a June 2000 wildfire northwest of Loveland, Colorado. Site characteristics and sediment yields were measured on 12 burned and untreated control plots and 22 burned and treated plots from 2000 to 2003. The size of the hillslope plots ranged from 0.015 to 0.86 ha. Sediment yields varied significantly by treatment and were most closely correlated with the amount of ground cover. On the control plots the mean sediment yield declined from 6–10 Mg ha−1 in the first two years after burning to 1.2 Mg ha−1 in 2002 and 0.7 Mg ha−1 in 2003. Natural regrowth caused the amount of ground cover on the control plots to increase progressively from 33% in fall 2000 to 88% in fall 2003. Seeding had no effect on either the amount of ground cover or sediment yields. Mulching reduced sediment yields by at least 95% relative to the control plots in 2001, 2002, and 2003, and the lower sediment yields are attributed to an immediate increase in the amount of ground cover in the mulched plots. The contour-felling treatments varied considerably in the quality of installation, and sediment storage capacities ranged from 7 to 32 m3 ha−1. The initial contour-felling treatment did not reduce sediment yields when subjected to a very large storm event, but sediment yields were significantly reduced by a contour-felling treatment installed after this large storm. The results indicate that contour felling may be able to store much of the sediment generated in an average year, but will not reduce sediment yields from larger storms. Copyright © 2006 John Wiley & Sons, Ltd.

8.
Changes in land use are common in Mediterranean areas and are reported to have produced changes in the intensity of soil erosion. Dehesas are rangelands with a disperse tree cover, widespread in the south-western Iberian Peninsula; similar ecosystems are also common in other areas with a Mediterranean climate. The aim of the present study is to analyse temporal and spatial variations of soil erosion rates estimated along three hillsides located in two farms (Buitrera and Parapuños) in southwest Spain. To understand the temporal variation, soil erosion rates were studied in light of the land use-management changes that took place during the last few centuries. Results indicate very low erosion rates prior to the 18th century in both farms. In Buitrera, a first increase of soil loss rates was identified during the period 1831-1897, amounting to 7.4 t ha−1 y−1. A further increase took place during the 20th century, reaching a mean erosion rate of 29.1 t ha−1 y−1. In Parapuños, the data point to a significant increase from 1881 onwards, with an estimated mean erosion rate of 18.5 t ha−1 y−1. These increases were presumably connected with an intensification of land use, such as cultivation and excessive livestock populations. Regarding spatial variation, the bare surface and the erosive power of run-off along the hillsides account for 76% of the dispersion in soil erosion rates. At a local scale, the variability of erosion rates could not be explained, because of (i) uncertainty related to the micromorphology of the past soil surface and (ii) the role of tillage erosion in the past. However, the results obtained offer valuable data on the temporal and spatial variation of erosion rates in dehesas at the hillslope scale, and a similar approach could be used for other rangelands with a disperse tree cover. © 2019 John Wiley & Sons, Ltd.

9.
The effects of timber-cutting on sediment concentrations, soil loss, and overland flow in an insigne pine (Pinus radiata) plantation were studied in a mountain watershed of the Cordillera de la Costa, central Chile. Soil formation rates for the lithological conditions of the watershed were estimated. Soil loss measurements on the plantation were taken in 100 m2 plots, equipped with Coshocton samplers, during 1991 and 1992. Treatments were: clear-cutting with no residues/burned, clear-cutting with residues, and undisturbed controls. First-year soil losses were greater from the no residues/burned plots (2128 kg ha−1) than from the residues (1219 kg ha−1) or undisturbed (48 kg ha−1) plots. During the second post-treatment year, soil loss was greater from the burned plots (1349 kg ha−1) than from the residues (243 kg ha−1) or undisturbed (72 kg ha−1) plots. Sediment concentrations for the three treatments were 561, 340, and 59 mg l−1 during the first year, and 400, 150, and 83 mg l−1 in the second year. Runoff from the no residues/burned plots was greater than from the residues or undisturbed plots during the two post-treatment years. Long-term soil losses were projected to average 240 kg ha−1 yr−1 from areas without residues/burned and 120 kg ha−1 yr−1 in areas with the residues treatment over a 25-year rotation period, whereas control areas were projected to average 60 kg ha−1 yr−1.

10.
Two principal groups of processes shape mass fluxes from and into a soil: vertical profile development and lateral soil redistribution. Periods with predominantly progressive soil-forming processes (soil profile development) alternate with periods with predominantly regressive processes (erosion). As a result, short-term soil redistribution – years to decades – can differ substantially from long-term soil redistribution – centuries to millennia. However, the quantification of these processes is difficult, and consequently their rates are poorly understood. To assess the competing roles of erosion and deposition we determined short- and long-term soil redistribution rates in a formerly glaciated area of the Uckermark, northeast Germany. We compared short-term erosion or accumulation rates using plutonium-239 and -240 (239+240Pu) and long-term rates using both in situ and meteoric cosmogenic beryllium-10 (10Be). Three characteristic process domains were analysed in detail: a flat landscape position with no erosion/deposition, an erosion-dominated mid-slope, and a deposition-dominated lower-slope site. We show that short-term mass erosion and accumulation rates are about one order of magnitude higher than long-term redistribution rates. Both in situ and meteoric 10Be provide comparable results. Depth functions, rather than a single averaged topsoil value, give the most meaningful rates. The long-term soil redistribution rates were in the range of −2.1 t ha−1 yr−1 (erosion) to +0.26 t ha−1 yr−1 (accumulation), whereas the short-term rates indicated strong erosion of up to 25 t ha−1 yr−1 and accumulation of 7.6 t ha−1 yr−1. Our multi-isotope method identifies periods of erosion and deposition, confirming the ‘time-split approach’ of distinct phases (progressive/regressive) in soil evolution. With such an approach, temporally-changing processes can be disentangled, allowing identification of both the dimensions of, and the increase in, soil erosion due to human influence. © 2019 John Wiley & Sons, Ltd.

11.
Knowledge of soil loss rates by water erosion under given climate, soil, topography, and management conditions is important for establishing soil conservation schemes. In Galicia, a region of Spain with Atlantic climatic conditions, field observations over the last decade indicate that interrill, rill, and ephemeral gully erosion may be an important sediment source. The aim of this work was to assess concentrated erosion rates, describe types of rills and ephemeral gullies, and determine their origin, evolution, and importance as sediment sources. Soil surface state and concentrated flow erosion were surveyed from 1997 to 2006 on medium-textured soils developed over basic schists of the Ordenes Complex series (Coruña province, Spain). Soil surface state was characterized by crust development, tillage features, and degree of roughness. Soil erosion rate was measured directly in the field. Concentrated flow erosion took place mainly on seedbeds and recently tilled surfaces in late spring and by autumn or early winter. During the study period, erosion rates were highly variable and the following situations could be distinguished: (a) no incision or limited rill incision, i.e. below 2 Mg ha−1 year−1; (b) generalized rill and ephemeral gully incision, with mean values between 2.5 and 6.25 Mg ha−1 year−1, the most common erosion pattern; and (c) heavy erosion, as observed during an extremely wet winter period between October 2000 and February 2001, with erosion figures roughly an order of magnitude higher, up to 55–60 Mg ha−1 year−1. Therefore, low values of soil loss were dominant, but large values of rill and ephemeral gully erosion also occurred during the study period. Copyright © 2009 John Wiley & Sons, Ltd.

12.
The Brazilian savanna (cerrado) is a large region of economic and environmental importance that is experiencing significant loss of its natural landscapes due to pressures of food and energy production, which in turn has caused large increases in soil erosion. However, the magnitude of the soil erosion increases in this region is not well understood, in part because scientific studies of surface runoff and soil erosion are scarce or nonexistent in the cerrado, as in other savannas of the world. To understand the effects of deforestation we assessed natural rainfall-driven rates of runoff and soil erosion on an undisturbed tropical woodland classified as ‘cerrado sensu stricto denso’ and on bare soil. Results were evaluated and quantified in the context of the cover and management factor (C-factor) of the Universal Soil Loss Equation (USLE). Replicated data on precipitation, runoff, and soil loss on plots (5 × 20 m) under undisturbed cerrado and bare soil were collected for 77 erosive storms that occurred over 3 years (2012 through 2014). The C-factor was computed annually using values of rainfall erosivity and soil loss rate. We found an average runoff coefficient of ~20% for the plots under bare soil and less than 1% under undisturbed cerrado. The mean annual soil losses in the plots under bare soil and cerrado were 12.4 t ha−1 yr−1 and 0.1 t ha−1 yr−1, respectively. The erosivity-weighted C-factor for the undisturbed cerrado was 0.013. Surface runoff, soil loss, and C-factor were greatest in the summer and fall. Our results suggest that shifts in land use from native to cultivated vegetation result in order-of-magnitude increases in soil loss rates. These results provide benchmark values that will be useful for evaluating past and future land use changes using soil erosion models, and have significance for undisturbed savanna regions worldwide. Copyright © 2015 John Wiley & Sons, Ltd.
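The erosivity-weighted C-factor described above is a mean of per-storm soil-loss ratios (cerrado loss divided by bare-soil loss) weighted by each storm's rainfall erosivity. A minimal sketch with hypothetical per-storm values (the storm data below are illustrative, not the study's measurements):

```python
import numpy as np

# Hypothetical per-storm rainfall erosivity (MJ mm ha^-1 h^-1) and
# soil losses (Mg ha^-1) on paired cerrado and bare-soil plots.
erosivity = np.array([500.0, 1200.0, 800.0, 2000.0])
loss_cerrado = np.array([0.005, 0.030, 0.010, 0.080])
loss_bare = np.array([0.60, 1.90, 1.10, 4.00])

# Per-storm soil-loss ratio, then its erosivity-weighted mean.
slr = loss_cerrado / loss_bare
c_factor = np.sum(erosivity * slr) / np.sum(erosivity)
print(round(c_factor, 4))  # → 0.0156 for these illustrative storms
```

Weighting by erosivity is what lets the result (0.013 in the study) differ from the simple annual loss ratio (0.1/12.4 ≈ 0.008): storms with high erosivity pull the average toward their own loss ratios.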

13.
After the Valley Complex Fire burned 86 000 ha in western Montana in 2000, two studies were conducted to determine the effectiveness of contour-felled log, straw wattle, and hand-dug contour trench erosion barriers in mitigating postfire runoff and erosion. Sixteen plots were located across a steep, severely burned slope, with a single barrier installed in 12 plots (four per treatment) and four plots left untreated as controls. In a rainfall-plus-inflow simulation, 26 mm h−1 rainfall was applied to each plot for 1 h and 48 L min−1 of overland flow was added for the last 15 min. Total runoff from the contour-felled log (0.58 mm) and straw wattle (0.40 mm) plots was significantly less than from the control plots (2.0 mm), but the contour trench plots (1.3 mm) showed no difference. The total sediment yield from the straw wattle plots (0.21 Mg ha−1) was significantly less than from the control plots (2.2 Mg ha−1); the sediment yields in the contour-felled log plots (0.58 Mg ha−1) and the contour trench plots (2.5 Mg ha−1) were not significantly different. After the simulations, sediment fences were installed to trap sediment eroded by natural rainfall. During the subsequent 3 years, sediment yields from individual events increased significantly with increasing 10-min maximum intensity and rainfall amounts. High-intensity rainfall occurred early in the study and the erosion barriers were filled with sediment. There were no significant differences in event or annual sediment yields among treated and control plots. In 2001, the overall mean annual sediment yield was 21 Mg ha−1; this value declined significantly to 0.6 Mg ha−1 in 2002 and 0.2 Mg ha−1 in 2003. The erosion barrier sediment storage used was less than the total available storage capacity; runoff and sediment were observed going over the top and around the ends of the barriers even when the barriers were less than half full. Published in 2007 by John Wiley & Sons, Ltd.

14.
A sediment budget was developed for the 1.7 km2 Maluna Creek drainage basin, located in the Hunter Valley, New South Wales, Australia, for the period 1971-86. The impact of viticulture, which commenced at Maluna in 1971, was studied using erosion plots, with caesium-137 as an indicator of both soil erosion and sedimentation. Two methods were used to estimate vineyard soil losses from caesium-137 measurements. Sediment output from the catchment was measured for three years and extrapolated from readings taken at a nearby long-term stream flow gauging station for the remaining 13 years. Relative amounts of soil loss from forest (60 per cent of basin area), grazing land (30 per cent), and vineyards (10 per cent) were calculated. Soil losses by rain splash detachment were ten times greater from bare/cultivated surfaces than from the forest. Erosion plots of area 2 m2 showed no significant differences in soil loss between forest and grassland but, under bare soil, losses were 100 times greater. The 137Cs method was employed to calculate net soil loss from all vineyard blocks using both a previously established calibration curve and a proportional model. The latter method gave estimates of soil loss 3–9 times greater than the calibration curve, and indicated that average soil losses from the vineyard were equivalent to 62 t ha−1 y−1 (1971-86). It was estimated that the forest contributed 1.8 per cent, the grazing land 1.6 per cent, and the vineyard 96.6 per cent of the total soil loss during that period. Sediment storage within the fluvial system adjacent to the vineyard was 9460 t for the period, whereas sediment output was equivalent to 215 t km−2 y−1. Independent measurements of soil erosion, storage, and output showed that 56 per cent of the eroded sediment remained in the catchment and 34 per cent was transported out by Maluna Creek. The budget balanced to within 10 per cent.
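The budget closure can be checked with simple arithmetic: reading the catchment output as 215 t per km² per year over the 1.7 km² basin and 16 years, and taking a total eroded mass of roughly 16 900 t (this total is my back-calculation from the stated percentages, not a figure given in the abstract):

```python
# Masses for the 1971-86 budget period (t).
stored = 9460.0               # reported in-catchment fluvial storage
exported = 215.0 * 1.7 * 16   # output rate x catchment area x years = 5848 t
eroded_total = 16_900.0       # assumed total erosion, back-calculated

stored_frac = stored / eroded_total      # fraction retained in the catchment
exported_frac = exported / eroded_total  # fraction exported by the creek
closure = stored_frac + exported_frac    # budget closure

print(round(stored_frac, 2), round(exported_frac, 2), round(closure, 2))
# → 0.56 0.35 0.91: ~56% stored, ~34-35% exported, balanced within 10%
```

The residual |1 − closure| of about 9% matches the abstract's statement that the budget balanced to within 10 per cent.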

15.
Fire severity is recognized as a key factor in explaining post-fire soil erosion. However, the relationship between soil burn severity and soil loss has not been fully established until now. Sediment availability may also affect the extent of post-fire soil erosion. The objective of this study was to determine whether soil burn severity, estimated by an operational classification system based on visual indicators, can significantly explain soil loss in the first year after wildfire in shrubland and other areas affected by crown fires in northwest (NW) Spain. An additional aim was to establish indicators of sediment availability for use as explanatory variables for post-fire soil loss. For these purposes, we measured hillslope-scale sediment production rates and site characteristics during the first year after wildfire in 15 experimental sites using 65 plots. Sediment yields varied from 0.2 Mg ha−1 to 50.1 Mg ha−1 and soil burn severity ranged from low (1.8) to very high (4.5) in the study period. A model that included soil burn severity, annual precipitation, and a land use factor (as a surrogate for sediment availability) as explanatory variables reasonably explained the erosion losses measured during the first year after fire. Model validation confirmed the usefulness of this empirical model. The proposed empirical model could be used by forest managers to help evaluate erosion risks and to plan post-fire stabilization activities. Copyright © 2015 John Wiley & Sons, Ltd.

16.
The prediction of wind erosion and dust emissions is important for controlling erosion and identifying dust sources in arid and semiarid regions of the world. This study quantitatively predicts wind erosion and dust emissions in Xinjiang Province, central Asia. The Wind Erosion Prediction System (WEPS) was used to simulate annual soil and PM10 (particulate matter ≤10 μm in aerodynamic diameter) loss at 64 meteorological stations across the province. Soil and PM10 loss were simulated from bare surfaces at all 64 stations and from cotton and wheat fields at 11 stations. Simulated annual bare-surface soil and PM10 loss were lowest in the Junggar Basin (soil and PM10 loss were, respectively, 121.7 and 7.6 kg m−2) and Tarim Basin (soil loss was 78.2 kg m−2 and PM10 loss was 6.5 kg m−2) and highest in the Tu-ha Basin (soil and PM10 loss were, respectively, 638.2 and 37.7 kg m−2). Stations with the highest annual soil loss in the Tarim and Tu-ha basins also had the highest number of days with wind speeds >8 m s−1. This indicated that wind influenced erosion, but other factors such as soil type also affect wind erosion. The maximum monthly bare-surface soil and PM10 loss occurred in May in all three basins, substantiating that dust storms occur most frequently during spring in the region. Simulated soil and PM10 loss were lower for cotton and wheat than for bare soil, suggesting that maintaining vegetative cover during a portion of the year provides some protection of the soil surface from wind erosion. © 2018 John Wiley & Sons, Ltd.

17.
Runoff and erosion processes can increase after wildfire and post-fire salvage logging, but little is known about the specific effects of the soil compaction and surface cover produced by post-fire salvage logging activities. We carried out rainfall simulations after a high-severity wildfire and post-fire salvage logging to assess the effects of compaction (uncompacted, or compacted by skid traffic during post-fire salvage logging) and surface cover (bare, or covered with logging slash). Runoff after 71 mm of rainfall across two 30-min simulations was similar for the bare plots regardless of compaction status (mean 33 mm). In comparison, runoff in the slash-covered plots averaged only 22 mm. Rainsplash in the downslope direction averaged 30 g for the bare plots across compaction levels and decreased significantly, by 70%, on the slash-covered plots. Sediment yield totalled 460 and 818 g m−2 for the uncompacted and compacted bare plots, respectively, and slash significantly reduced these amounts by an average of 71%. Our results showed that soil erosion was still high two years after the high-severity burning, and that soil compaction nearly doubled soil erosion via nonsignificant increases in runoff and sediment concentration. Antecedent soil moisture (dry or wet) was the dominant factor controlling runoff, while surface cover was the dominant factor for rainsplash and sediment yield. Saturated hydraulic conductivity and interrill erodibility calculated from these rainfall simulations confirmed previous laboratory research and will support hydrologic and erosion modelling efforts related to wildfire and post-fire salvage logging. Covering the soil with slash mitigated runoff and significantly reduced soil erosion, demonstrating the potential of this practice to reduce sediment yield and soil degradation in burned and logged areas.

18.
Four techniques for soil erosion assessment were compared over two consecutive seasons for bare-fallow plots and a maize–cowpea sequence in 1985 at IITA, Ibadan, Nigeria. The techniques used were: tracer (aluminium paint), nails (16 and 25), the rill method, and the Universal Soil Loss Equation (USLE). Soil loss estimated by these techniques was compared with that determined using the runoff plot technique. There was significantly more soil loss (P < 0.01) in bare-fallow plots than in plots under maize (Zea mays) or cowpea (Vigna unguiculata). In the first season, soil loss from plots sown to maize was 40.2 Mg ha−1 compared with 153.3 Mg ha−1 from bare-fallow plots. In the second season, bare-fallow plots lost 87.5 Mg ha−1 against 39.4 Mg ha−1 from plots growing cowpea. The techniques used for assessing erosion had no influence on the magnitude of soil erosion and did not interfere with the processes of erosion. There was no significant difference (P < 0.05) between soil erosion determined by the nails and by the runoff plot technique. Soil loss determined on six plots (three under maize, three bare-fallow) by the rill technique at the end of the season was significantly lower (P < 0.05) than that determined by the runoff plot technique. The soil loss estimated by the rill method was 143.2, 108.8, and 121.9 Mg ha−1 for 11, 11, and 8 per cent slopes, respectively, in comparison with 201.5, 162.0, and 166.4 Mg ha−1 measured by the runoff plot method. Soil loss measured on three bare-fallow plots on 10 different dates by the rill technique was also significantly lower (P < 0.01) than that measured by the runoff plot. In the first season the USLE significantly underestimated soil loss: on 11, 11, and 8 per cent slopes, respectively, soil loss determined by the USLE was 77, 92, and 63 per cent of that measured by the runoff plot. However, in the second season there was no significant difference between soil loss determined by the USLE and that determined by the conventional runoff plot technique.  相似文献   
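A hedged sketch of the Universal Soil Loss Equation compared above, A = R × K × LS × C × P. The factor values in the example call are purely illustrative placeholders, not values calibrated for the Ibadan plots.

```python
def usle(R, K, LS, C, P):
    """Annual soil loss A (Mg ha^-1) from rainfall erosivity R, soil
    erodibility K, slope length-steepness LS, cover-management C, and
    support-practice P."""
    return R * K * LS * C * P

# Illustrative factor values only (assumed, not from the study):
print(round(usle(R=550.0, K=0.28, LS=1.8, C=0.5, P=1.0), 1))  # 138.6
```

Because A is a simple product, any single underestimated factor (e.g. R derived from sparse rainfall records) scales the whole prediction, which is one way a 63–92% underestimate like the one reported can arise.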

19.
Accelerated runoff and erosion commonly occur following forest fires due to combustion of protective forest floor material, which results in bare soil being exposed to overland flow and raindrop impact, as well as water-repellent soil conditions. After the 2000 Valley Complex Fires in the Bitterroot National Forest of west-central Montana, four sets of six hillslope plots were established to measure first-year post-wildfire erosion rates on steep slopes (greater than 50%) that had burned with high severity. Silt fences were installed at the base of each plot to trap eroded sediment from a contributing area of 100 m2. Rain gauges were installed to correlate rain event characteristics to the event sediment yield. After each sediment-producing rain event, the collected sediment was removed from the silt fence and weighed on site, and a sub-sample was taken to determine dry weight, particle size distribution, organic matter content, and nutrient content of the eroded material. Rainfall intensity was the only significant factor in determining post-fire erosion rates from individual storm events. Short-duration, high-intensity thunderstorms with a maximum 10-min rainfall intensity of 75 mm h−1 caused the highest erosion rates (greater than 20 t ha−1). Long-duration, low-intensity rains produced little erosion (less than 0.01 t ha−1). Total C and N in the collected sediment varied directly with the organic matter; because the collected sediment was mostly mineral soil, the C and N content was small. Minimal amounts of Mg, Ca, and K were detected in the eroded sediments. The mean annual erosion rate predicted by Disturbed WEPP (Water Erosion Prediction Project) was 15% less than the measured mean annual erosion rate, which is within the accuracy range of the model. Published in 2007 by John Wiley & Sons, Ltd.  相似文献   
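A sketch of deriving a maximum 10-min rainfall intensity (mm h−1) from a timestamped tipping-bucket record, the kind of gauge analysis the event correlation above requires. The record below is synthetic, and the 0.254 mm-per-tip resolution is an assumption, not a detail from the study.

```python
def max_i10(tip_times_s, mm_per_tip=0.254, window_s=600):
    # Largest tip count in any sliding 10-min window, scaled to mm h^-1.
    times = sorted(tip_times_s)
    best, j = 0, 0
    for i, t in enumerate(times):
        while t - times[j] > window_s:  # drop tips older than the window
            j += 1
        best = max(best, i - j + 1)
    return best * mm_per_tip * 3600.0 / window_s

# One tip every 6 s for 5 min: 12.7 mm falls inside a 10-min window.
tips = [i * 6 for i in range(50)]
print(round(max_i10(tips), 1))  # 76.2
```

A short, intense burst like this synthetic one yields an I10 near the 75 mm h−1 threshold the study associates with the highest erosion rates, even though the storm total is small.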

20.
Reliable assessment of the spatial distribution of soil erosion is important for making land management decisions, but it has not been thoroughly evaluated in karst geo-environments. The objective of this study was to modify a physically based, spatially distributed erosion model, the revised Morgan, Morgan and Finney (RMMF) model, to estimate superficial (as opposed to subsurface creep) soil erosion rates and their spatial patterns in a 1022 ha karst catchment in northwest Guangxi, China. Model parameters were calculated using local data in a raster geographic information system (GIS) framework. The cumulative runoff on each grid cell, an input to the RMMF model for erosion computations, was computed using a combined flow algorithm that allowed for flow into multiple cells, with a transfer grid accounting for infiltration and runoff seepage to the subsurface. The predicted spatial distributions of soil erosion rates were analyzed relative to land uses and slope zones. Results showed that the simulated effective runoff and annual soil erosion rates on hillslopes agreed well with field observations and with redistribution rates previously quantified with caesium-137 (137Cs). The estimated average effective runoff and annual erosion rate on hillslopes of the study catchment were 18 mm and 0.27 Mg ha−1 yr−1 during 2006–2007. Human disturbance played an important role in accelerating soil erosion, with average rates ranging from 0.1 to 3.02 Mg ha−1 yr−1 across land uses. The study indicated that the modified model was effective in predicting superficial soil erosion rates in karst regions and that the spatial distribution results could provide useful information for developing local soil and water conservation plans. Copyright © 2014 John Wiley & Sons, Ltd.  相似文献   
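A simplified sketch of raster flow accumulation, in the spirit of the cumulative-runoff step described above. For brevity, each cell here routes all of its runoff to its single steepest downslope neighbour (D8), whereas the study used a combined multiple-flow algorithm with an infiltration/seepage transfer grid; this is an assumption-laden stand-in, not the paper's method.

```python
def flow_accumulation_d8(dem, runoff):
    rows, cols = len(dem), len(dem[0])
    acc = [row[:] for row in runoff]
    # Process cells from highest to lowest so upslope water arrives first.
    order = sorted(((dem[r][c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for z, r, c in order:
        best_drop, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (dr, dc) != (0, 0) and 0 <= nr < rows and 0 <= nc < cols:
                    drop = z - dem[nr][nc]
                    if drop > best_drop:        # steepest downslope neighbour
                        best_drop, target = drop, (nr, nc)
        if target is not None:                  # sinks/outlets keep their water
            acc[target[0]][target[1]] += acc[r][c]
    return acc

dem = [[3.0, 2.0],
       [2.0, 1.0]]
unit = [[1.0, 1.0], [1.0, 1.0]]
acc = flow_accumulation_d8(dem, unit)
print(acc)  # [[1.0, 1.0], [1.0, 4.0]] -- everything drains to the low cell
```

In the study's version, each cell's outflow would additionally be split among several downslope neighbours and reduced by the infiltration/seepage transfer grid before accumulating.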
