Similar Literature
20 similar records retrieved.
1.
After the Valley Complex Fire burned 86 000 ha in western Montana in 2000, two studies were conducted to determine the effectiveness of contour-felled log, straw wattle, and hand-dug contour trench erosion barriers in mitigating postfire runoff and erosion. Sixteen plots were located across a steep, severely burned slope, with a single barrier installed in 12 plots (four per treatment) and four plots left untreated as controls. In a rainfall-plus-inflow simulation, 26 mm h−1 rainfall was applied to each plot for 1 h and 48 L min−1 of overland flow was added for the last 15 min. Total runoff from the contour-felled log (0·58 mm) and straw wattle (0·40 mm) plots was significantly less than from the control plots (2·0 mm), but the contour trench plots (1·3 mm) showed no difference. The total sediment yield from the straw wattle plots (0·21 Mg ha−1) was significantly less than the control plots (2·2 Mg ha−1); the sediment yields in the contour-felled log plots (0·58 Mg ha−1) and the contour trench plots (2·5 Mg ha−1) were not significantly different. After the simulations, sediment fences were installed to trap sediment eroded by natural rainfall. During the subsequent 3 years, sediment yields from individual events increased significantly with increasing 10 min maximum intensity and rainfall amounts. High-intensity rainfall occurred early in the study and the erosion barriers were filled with sediment. There were no significant differences in event or annual sediment yields among treated and control plots. In 2001, the overall mean annual sediment yield was 21 Mg ha−1; this value declined significantly to 0·6 Mg ha−1 in 2002 and 0·2 Mg ha−1 in 2003. The erosion barrier sediment storage used was less than the total available storage capacity; runoff and sediment were observed going over the top and around the ends of the barriers even when the barriers were less than half filled. Published in 2007 by John Wiley & Sons, Ltd.
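The reported plot means lend themselves to a quick effectiveness calculation. The short Python sketch below is not from the paper; it simply expresses each barrier's runoff and sediment yield as a percent reduction relative to the untreated control plots.

# Quick arithmetic check (not from the paper): percent reduction relative to the control plots.
runoff_mm = {"control": 2.0, "contour-felled log": 0.58, "straw wattle": 0.40, "contour trench": 1.3}
sediment_Mg_ha = {"control": 2.2, "contour-felled log": 0.58, "straw wattle": 0.21, "contour trench": 2.5}

def pct_reduction(values, control="control"):
    base = values[control]
    return {k: round(100.0 * (base - v) / base) for k, v in values.items() if k != control}

print(pct_reduction(runoff_mm))        # straw wattle ~80%, felled log ~71%, trench ~35%
print(pct_reduction(sediment_Mg_ha))   # straw wattle ~90%, felled log ~74%, trench ~ -14% (an increase)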

2.
Soil erosion by water is a pressing environmental problem caused and suffered by agriculture in Mediterranean environments. Soil conservation practices can contribute to alleviating this problem. The aim of this study is to gain more profound knowledge of the effects of conservation practices on soil losses by linking crop management and soil status to runoff and sediment losses measured at the outlet of a catchment over seven years. The 27.42 ha catchment is located on a commercial farm in southern Spain, where a package of soil conservation practices is an essential component of the farming system. The catchment is devoted to irrigated annual crops with maize–cotton–wheat as the primary rotation. The mean annual rainfall-induced runoff coefficient was 0.14 and mean annual soil loss was 2.4 Mg ha−1 y−1. Irrigation contributed 40% of the crop water supply, but the amount of runoff and sediment yield that it generated was negligible. A Principal Components Analysis showed that total soil loss is determined by the magnitude of the event (rainfall and runoff depths, duration) and by factors related to the aggressiveness of the events (rainfall intensity and preceding soil moisture). A third component showed the importance of crop coverage in reducing sediment losses. Cover crops grown during autumn and early winter and crop residues protecting the soil surface enhanced soil conservation notably. The role of irrigation in facilitating cover crops in Mediterranean environments is discussed. Copyright © 2016 John Wiley & Sons, Ltd.
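As a minimal illustration of the two headline figures, the Python sketch below (not the authors' code; the event records are invented) computes an annual rainfall-induced runoff coefficient and a unit-area soil loss from event totals.

# Sketch (not the authors' code): annual runoff coefficient and unit-area soil loss
# from event records; all numbers below are illustrative, not the study's data.
catchment_area_ha = 27.42
events = [  # (rainfall_mm, runoff_mm, sediment_yield_Mg)
    (35.0, 6.1, 18.2),
    (22.0, 2.4, 7.9),
    (55.0, 11.0, 31.5),
]
rain_mm = sum(e[0] for e in events)
runoff_mm = sum(e[1] for e in events)
sediment_Mg = sum(e[2] for e in events)

runoff_coefficient = runoff_mm / rain_mm            # compare with the reported 0.14
soil_loss_Mg_ha = sediment_Mg / catchment_area_ha   # compare with the reported 2.4 Mg ha-1 y-1
print(round(runoff_coefficient, 2), round(soil_loss_Mg_ha, 2))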

3.
The 3-D spatial distribution of vegetation is of great significance for water and soil conservation but has rarely been addressed in the literature. The live vegetation volume (LVV) was related to water/soil loss under 144 natural erosive rainfall events from 2007 to 2010 in a typical water-eroded area of southern China. Quadratic polynomial regression models were established for five pure tree (Pinus massoniana Lamb) plots between LVV and water (runoff)/soil conservation effects (RE/SE). RE/SE corresponds to the ratio of runoff depth/soil loss of the pure tree plots to that of the control plot under each rainfall event. Increasing LVV exhibits descending (DS), descending-ascending (DA), ascending-descending (AD), and ascending (AS) trends in the LVV-RE and LVV-SE curves. The soil conservation effects of the plots were generally more noticeable than the water conservation effects, and most of the RE and SE values reflected positive water and soil conservation. The effects were mainly positive under heavy rains (e.g., rainfall erosivity R = 140 MJ mm ha−1 h−1, maximum 30 min intensity I30 = 16 mm h−1), whereas the effects were mainly negative under light rains (e.g., R = 45 MJ mm ha−1 h−1, I30 = 8 mm h−1). The trees' water/soil conservation effects shifted notably between positive and negative once rainfall erosivity and intensity fell below certain thresholds. About 50% of rainfall events led to obvious shifts in effect when LVVs were near 0.5 or 0.6. These results can aid decision making on forest reconstruction in water-eroded areas.
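To make the RE/SE definition concrete, here is a small Python sketch (not the authors' code; event values and the use of numpy.polyfit are my own illustration) that forms the per-event ratios against a control plot and fits the quadratic polynomial regression the study describes.

# Sketch (not the authors' code): per-event RE/SE ratios and quadratic fits against LVV.
# All numbers are illustrative.
import numpy as np

events = [  # (runoff_tree_mm, runoff_control_mm, soilloss_tree_t_ha, soilloss_control_t_ha, lvv)
    (3.2, 5.0, 0.21, 0.60, 0.35),
    (6.8, 6.5, 0.45, 0.55, 0.50),
    (1.1, 4.2, 0.05, 0.70, 0.80),
]
lvv = np.array([e[4] for e in events])
RE = np.array([e[0] / e[1] for e in events])   # runoff ratio, tree plot / control (<1 = less runoff)
SE = np.array([e[2] / e[3] for e in events])   # soil-loss ratio, tree plot / control

print(np.polyfit(lvv, RE, 2))   # quadratic coefficients for the LVV-RE curve
print(np.polyfit(lvv, SE, 2))   # quadratic coefficients for the LVV-SE curve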

4.
The Brazilian savanna (cerrado) is a large and important economic and environmental region that is experiencing significant loss of its natural landscapes due to pressures of food and energy production, which in turn has caused large increases in soil erosion. However, the magnitude of the soil erosion increases in this region is not well understood, in part because scientific studies of surface runoff and soil erosion are scarce or nonexistent in the cerrado as well as in other savannahs of the world. To understand the effects of deforestation we assessed natural rainfall-driven rates of runoff and soil erosion on an undisturbed tropical woodland classified as 'cerrado sensu stricto denso' and on bare soil. Results were evaluated and quantified in the context of the cover and management factor (C-factor) of the Universal Soil Loss Equation (USLE). Replicated data on precipitation, runoff, and soil loss on plots (5 × 20 m) under undisturbed cerrado and bare soil were collected for 77 erosive storms that occurred over 3 years (2012 through 2014). The C-factor was computed annually using values of rainfall erosivity and soil loss rate. We found an average runoff coefficient of ~20% for the plots under bare soil and less than 1% under undisturbed cerrado. The mean annual soil losses in the plots under bare soil and cerrado were 12.4 t ha−1 yr−1 and 0.1 t ha−1 yr−1, respectively. The erosivity-weighted C-factor for the undisturbed cerrado was 0.013. Surface runoff, soil loss and C-factor were greatest in the summer and fall. Our results suggest that shifts in land use from native to cultivated vegetation result in orders of magnitude increases in soil loss rates. These results provide benchmark values that will be useful to evaluate past and future land use changes using soil erosion models and have significance for undisturbed savanna regions worldwide. Copyright © 2015 John Wiley & Sons, Ltd.
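For readers unfamiliar with how an erosivity-weighted C-factor is assembled, the Python sketch below (not the authors' code; storm values are invented) applies the usual USLE formulation C = sum(SLR_i * EI_i) / sum(EI_i), where SLR_i is the per-storm ratio of soil loss under cerrado to soil loss from the bare plot and EI_i is the storm erosivity.

# Sketch (not the authors' code): an erosivity-weighted C-factor,
# C = sum(SLR_i * EI_i) / sum(EI_i). Storm values are illustrative.
storms = [  # (EI30 in MJ mm ha-1 h-1, soil loss under cerrado t/ha, soil loss on bare plot t/ha)
    (850.0, 0.010, 1.9),
    (320.0, 0.002, 0.4),
    (1400.0, 0.060, 3.8),
]
weighted = sum(ei * (cover / bare) for ei, cover, bare in storms)
total_erosivity = sum(ei for ei, _, _ in storms)
print(round(weighted / total_erosivity, 4))   # compare with the 0.013 reported for undisturbed cerrado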

5.
Although the impact of sheet erosion on the evolution of soils, soil properties and associated ecosystem services across landscapes is undisputed, there are still large uncertainties in the estimation of sheet erosion, as the results obtained are highly scale dependent. Consequently, there is a need to develop a scale-explicit understanding of sediment erosion yields, from microplot to hillslope through to plot, to surmount actual erosion modelling flaws and to improve guidance for erosion mitigation. The main objective of this study was to compare sediment yields from small and large plots installed under different environmental conditions and to interpret these results in terms of the main mechanisms and controlling factors of sheet erosion. Fifteen 1 × 1 m² and ten 2 × 5 m² plots were installed on a hillslope in the foothills of the Drakensberg, South Africa. Data on runoff, sediment concentration (SC), soil loss (SL) and rainfall characteristics obtained during the 2009–2010 rainy season at the two spatial scales and from different soils, vegetation cover, geology and topographic conditions were used to identify the main controlling factors of sheet erosion. Scale ratios for SC and SL were subsequently calculated to assess the level of contribution of rain-impacted flow (RIF) to overall sheet erosion. The average runoff rate (n = 17 events) ranged between 4.9 ± 0.4 L m−2 on 1 m2 and 5.4 ± 0.6 L m−2 on 10 m2, which did not correspond to significant differences at the P < 0.05 level. Sediment losses were significantly higher on the 10 m2 plots, compared with the 1 m2 plots (2.2 ± 0.4 vs 1.5 ± 0.2 g L−1 for SC; 9.8 ± 1.8 vs 3.2 ± 0.3 g m−2 for SL), which illustrated a greater efficiency of sheet erosion on longer slopes. Results from a principal component analysis, whose first two axes explained 60% of the data variance, suggested that sheet erosion is mainly controlled by rainfall characteristics (rainfall intensity and amount) and soil surface features (crusting and vegetation coverage). The contribution of RIF to sheet erosion was lowest at high soil clay content (r = 0.26) and highest at high crusting and bulk density (r = 0.22), cumulative rainfall amount in the season and associated rise in the soil water table (r = 0.29). Such an explicit consideration of the role of scale on sediment yields and process domination by either in situ (soil and soil surface conditions) or ex situ (rainfall characteristics and antecedent rainfall) factors is expected to contribute to process-based modelling and erosion mitigation. Copyright © 2012 John Wiley & Sons, Ltd.
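The scale-ratio idea can be reproduced directly from the reported plot means; the short Python check below (not from the paper) shows the ratios that support the "greater efficiency on longer slopes" statement.

# Quick check (not from the paper): scale ratios of sediment concentration (SC) and
# soil loss (SL) between the 10 m2 and 1 m2 plots, from the reported means.
sc_1m2, sc_10m2 = 1.5, 2.2   # g L-1
sl_1m2, sl_10m2 = 3.2, 9.8   # g m-2
print(round(sc_10m2 / sc_1m2, 2), round(sl_10m2 / sl_1m2, 2))   # ~1.47 and ~3.06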

6.
Ten representative research sites were selected in eastern Spain to assess soil erosion rates and processes in new citrus orchards on sloping soils. The experimental plots were located at representative sites on limestone, in areas with 498 to 715 mm year−1 mean annual rainfall, on north-facing slopes, herbicide treated, and in new (less than 3 years old) plantations. Ten rainfall simulation experiments (1 h at 55 mm h−1 on 0·25 m2 plots) were carried out at each of the 10 selected study sites to determine the interrill soil erosion and runoff rates. The 100 rainfall simulation tests (10 sites × 10 tests) showed that ponding and runoff occurred in all the plots, and quickly: 121 and 195 s, respectively, following rainfall initiation. Runoff discharge was one third of the rainfall, and sediment concentration reached 10·4 g L−1. The soil erosion rates were 2·4 Mg ha−1 h−1 under 5-year return period rainfall thunderstorms. These are among the highest soil erosion rates measured in the western Mediterranean basin, similar to badland, mine spoil and road embankment land surfaces. The positive relationship between runoff discharge and sediment concentration (r2 = 0·83) shows that the sediment availability is very high. Soil erosion rates on new citrus orchards growing on sloping soils are neither tolerable nor sustainable. Copyright © 2009 John Wiley & Sons, Ltd.
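The headline erosion rate can be cross-checked from the runoff fraction and sediment concentration reported above; the arithmetic sketch below (not from the paper) reproduces the order of magnitude.

# Arithmetic check (not from the paper): soil loss rate implied by the reported
# runoff fraction and sediment concentration.
rain_mm_h = 55.0
runoff_mm_h = rain_mm_h / 3.0            # runoff discharge was about one third of rainfall
runoff_L_ha_h = runoff_mm_h * 10_000.0   # 1 mm of runoff over 1 ha = 10,000 L
sediment_g_L = 10.4
erosion_Mg_ha_h = runoff_L_ha_h * sediment_g_L / 1e6
print(round(erosion_Mg_ha_h, 2))         # ~1.91 Mg ha-1 h-1, the same order as the reported 2.4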

7.
A field experiment was conducted on a sloping grassland soil in southwest England to investigate the downslope transport of nitrogen in soil water following the application of cattle manure, slurry and inorganic fertilizer. Transport of nitrogen (N) species was monitored on hydrologically isolated plots. Manure (50 t ha−1), slurry (50 m3 ha−1) and fertilizer (250 kg N ha−1) were applied in February/March 1992. Subsurface water movement, by both matrix and preferential flow, was the dominant flow route during the experiment. Subsurface and surface nutrient flow pathways were monitored by analysing soil water and surface runoff for NO3-N, NH4-N and total N. Subsurface flow chemistry was dominated by NO3-N, with concentrations usually between 2 and 5 mg NO3-N dm−3. Differences between fertilizer and manure treatments and the untreated control were not significant. Significantly elevated NO3-N concentrations were observed in soil water in the buffer zone, indicating the importance of a buffer zone at least 10 m wide between manure spreading zones and an adjacent water course.

8.
Surfactants are chemical compounds that can change the contact angle of a water drop on solid surfaces and are commonly used to increase infiltration into water repellent soil. Since production fields with water repellent soil often contain areas of wettable soil, surfactants applied to such fields worldwide will likely be applied to wettable soil, with unknown consequences for irrigation-induced erosion, runoff, or soil water relations. We evaluated surfactant and simulated sprinkler irrigation effects on these responses for three wettable, Pacific Northwest soils: Latahco and Rad silt loams, and Quincy sand. Along with an untreated control, we studied three surfactants: an alkyl polyglycoside (APG) in solution at a concentration of 18 g active ingredient (AI) kg−1, a block copolymer at 26 g kg−1, and a blend of the two at 43 g kg−1. From 2005 to 2009 in the laboratory, each surfactant was sprayed at a rate of 46·8 L ha−1 onto each soil packed by tamping into 1·2- by 1·5-m steel boxes. Thereafter, each treated soil was irrigated twice at 88 mm h−1 with surfactant-free well water. After each irrigation, runoff and sediment loss were measured and soil samples were collected. While measured properties differed among soils and irrigations, surfactants had no effect on runoff, sediment loss, splash loss, or tension infiltration, compared to the control. Across all soils, however, the APG increased volumetric water contents by about 3% (significant at p ≤ 0·08) at matric potentials from 0 to −20 kPa compared to the control. With a decrease in the liquid–solid contact angle on treated soil surfaces, surfactant-free water appeared able to enter, and be retained in, pores with diameters ≥ 15 µm. All told, surfactants applied at economic rates to these wettable Pacific Northwest soils posed little risk of increasing either runoff or erosion or harming soil water relations. Moreover, by increasing water retention at high potentials, surfactants applied to wettable soils may allow water containing pesticides or other agricultural chemicals to better penetrate soil pores, thereby increasing the efficacy of the co-applied materials. Copyright © 2010 John Wiley & Sons, Ltd.
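The ≥ 15 µm pore-diameter statement follows from the capillary equation relating matric potential to equivalent pore size; the short check below (not from the paper; the surface-tension value and a zero contact angle are assumed) reproduces it for −20 kPa.

# Check (not from the paper): equivalent pore diameter from the capillary equation
# |psi| = 4 * sigma * cos(theta) / d, so d = 4 * sigma * cos(theta) / |psi|.
sigma = 0.0728        # N m-1, surface tension of water near 20 degrees C (assumed)
cos_theta = 1.0       # assume a fully wettable pore wall (contact angle ~ 0)
psi_pa = 20_000.0     # |-20 kPa| expressed in Pa
d_m = 4.0 * sigma * cos_theta / psi_pa
print(round(d_m * 1e6, 1))   # ~14.6 micrometres, consistent with the ~15 um cited above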

9.
Exceptional rainfall events cause significant losses of soil, although few studies have addressed the validation of model predictions at field scale during severe erosive episodes. In this study, we evaluate the predictive ability of the enhanced Soil Erosion and Redistribution Tool (SERT-2014) model for mapping and quantifying soil erosion during the exceptional rainfall event (~235 mm) that affected the Central Spanish Pyrenees in October 2012. The capacity of the simulation model is evaluated in a fallow cereal field (1.9 ha) at a high spatial scale (1 × 1 m). Validation was performed with field-quantified rates of soil loss in the rills and ephemeral gullies and also with a detailed map of soil redistribution. The SERT-2014 model was run for the six rainfall sub-events that made up the exceptional event, simulating the different hydrological responses of soils with maximum runoff depths ranging between 40 and 1017 mm. Predicted average and maximum soil erosion was 11 and 117 Mg ha−1 event−1, respectively. Total soil loss and sediment yield to the La Reina gully amounted to 16.3 and 9.0 Mg event−1. These rates are in agreement with field estimations of soil loss of 20.0 Mg event−1. Most soil loss (86%) occurred during the first sub-event. Although soil accumulation was overestimated in the first sub-event because of the large amount of detached soil, the enhanced SERT-2014 model successfully predicted the different spatial patterns and values of soil redistribution for each sub-event. Further research should focus on stream transport capacity. Copyright © 2014 John Wiley & Sons, Ltd.

10.
This study examines runoff generated under simulated rainfall on Summerford bajada in the Jornada Basin, New Mexico, USA. Forty-five simulation experiments were conducted on 1 m2 and 2 m2 runoff plots on grassland, degraded grassland, shrub and intershrub environments located in grassland and shrubland communities. Average hydrographs generated for each environment show that runoff originates earlier on the vegetated plots than on the unvegetated plots. This early generation of runoff is attributed to soil infiltration rates being overwhelmed by the rapid concentration of water at the base of plants by stemflow. Hydrographs from the degraded grassland and intershrub plots rise continuously throughout the 30 min simulation events, indicating that these plots do not achieve equilibrium runoff. This continuously rising form is attributed to the progressive development of raindrop-induced surface seals. Most grassland and shrub plots level out after the initial early rise, indicating equilibrium runoff is achieved. Some shrub plots, however, display a decline in discharge after the early rise. The delayed infiltration of water into macropores beneath shrubs with vegetation in their understories is proposed to explain this declining form. Water yields predicted at the community level indicate that the shrubland sheds 150 per cent more water for a given storm event than the grassland. Copyright © 2002 John Wiley & Sons, Ltd.

11.
Crop residue burning and the imbalanced use of chemical fertilizers in intensive cereal–cereal rotations are ecological threats present in agro-ecosystems around the world. Therefore, identification of the best suited agricultural practices can be a feasible option. The present experiment was initiated in 2013 and consisted of four residue levels (0, 2, 4, and 6 Mg ha−1) and five potassium (K) levels (0, 50, 100, and 150% of the recommended dose of K (RDK), and 50% RDK + K-solubilizing bacteria, KSB). Crop residue (CR) and K management significantly improve crop and soil quality associated parameters. Among the treatments, the maximum increase in crop growth, physiological parameters, grain yield, quality aspects, and water productivity is recorded with the application of 4–6 Mg ha−1 CR. Application of 50% RDK + KSB also significantly increases crop and soil related parameters. Soil quality indicators (bulk density, pH, electrical conductivity, and available micronutrients) do not vary significantly with CR and K management. Changes in soil organic carbon status, soil enzymes, and the potassium-solubilizing bacterial count are significantly increased with 4–6 Mg ha−1 CR and the application of 50% RDK + KSB, in accordance with the correlation study carried out. Therefore, it is concluded that CR retention (4–6 Mg ha−1), reduction of inorganic K fertilizer by 50%, and inoculation of KSB enhance the soil quality indicators and thereby improve crop growth, physiological parameters, grain yield, and quality aspects along with water productivity under zero-till maize–wheat rotation.

12.
Accelerated runoff and erosion commonly occur following forest fires due to combustion of protective forest floor material, which results in bare soil being exposed to overland flow and raindrop impact, as well as water repellent soil conditions. After the 2000 Valley Complex Fires in the Bitterroot National Forest of west-central Montana, four sets of six hillslope plots were established to measure first-year post-wildfire erosion rates on steep slopes (greater than 50%) that had burned with high severity. Silt fences were installed at the base of each plot to trap eroded sediment from a contributing area of 100 m2. Rain gauges were installed to correlate rain event characteristics to the event sediment yield. After each sediment-producing rain event, the collected sediment was removed from the silt fence and weighed on site, and a sub-sample taken to determine dry weight, particle size distribution, organic matter content, and nutrient content of the eroded material. Rainfall intensity was the only significant factor in determining post-fire erosion rates from individual storm events. Short duration, high intensity thunderstorms with a maximum 10-min rainfall intensity of 75 mm h−1 caused the highest erosion rates (greater than 20 t ha−1). Long duration, low intensity rains produced little erosion (less than 0·01 t ha−1). Total C and N in the collected sediment varied directly with the organic matter; because the collected sediment was mostly mineral soil, the C and N content was small. Minimal amounts of Mg, Ca, and K were detected in the eroded sediments. The mean annual erosion rate predicted by Disturbed WEPP (Water Erosion Prediction Project) was 15% less than the mean annual erosion rate measured, which is within the accuracy range of the model. Published in 2007 by John Wiley & Sons, Ltd.
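Since maximum 10-min intensity was the only significant predictor, a simple intensity–yield regression illustrates the kind of relationship involved. The Python sketch below is not the study's analysis and uses invented event data.

# Sketch (not the study's analysis): event sediment yield against maximum 10-min
# rainfall intensity, using invented data to illustrate the steep intensity response.
import numpy as np

i10_mm_h = np.array([5.0, 12.0, 30.0, 55.0, 75.0])        # maximum 10-min intensity
yield_t_ha = np.array([0.005, 0.05, 1.2, 8.0, 22.0])      # event sediment yield

# fit log10(yield) as a linear function of intensity, one simple way to describe the response
slope, intercept = np.polyfit(i10_mm_h, np.log10(yield_t_ha), 1)
print(round(slope, 3), round(intercept, 2))
print(round(10 ** (slope * 75.0 + intercept), 1))          # fitted yield at 75 mm h-1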

13.
The loss of P in overland flow from most cultivated soils is controlled by erosion and, in turn, by soil moisture. We evaluated the effect of soil moisture on erosion and P transport in overland flow by applying rainfall (7 cm h−1) to packed soil boxes (1 m long and 0·15 m wide) and field plots (1 and 10 m long by 1 m wide) of silt loams in a central Pennsylvania (USA) catchment. Flow from packed soil boxes took longer to initiate as antecedent soil moisture decreased from field capacity (2 min) to air dried (8 to 9 min). Even in the more complex field plots (i.e. soil heterogeneity and topography), the wetter site (1 by 10 m plot; 70% field capacity) produced flow more quickly (3 min) and in greater volume (439 L) than the drier site (1 by 10 m plot; 40% field capacity; 15 min and 214 L, respectively). However, less suspended sediment was transported from the wetter soil boxes (1·6 to 2·5 g L−1) and field plots (0·9 g L−1) than the drier boxes (2·9 to 4·2 g L−1) and plots (1·2 g L−1). Differences are attributed to the potential for soil aggregate breakdown, slaking and dispersion, which contribute to surface soil sealing and crusting, as dry soils are subject to rapid wetting (by rainfall). During flow, selective erosion and antecedent moisture conditions affected P transport. At field capacity, DRP and PP transport varied little during overland flow. In contrast, P transport from previously dry soil decreased rapidly after the initiation of flow (from 6 to 1·5 mg TP L−1), owing to greater slaking and dispersion of P-rich particles into flow at the beginning than at the end of the flow event. These results indicate that soil moisture fluctuations greatly affect erosion and P transport potential and that management to decrease the potential for loss should consider practices such as conservation tillage and cover crops, particularly on areas where high soil P and erosion coincide. Copyright © 2002 John Wiley & Sons, Ltd.
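One way to summarise such event samples is a flow-weighted total-P concentration and load; the Python sketch below (not from the paper; sample values are invented) shows the calculation.

# Sketch (not from the paper): flow-weighted total P concentration and event load from
# paired runoff-volume and concentration samples. All values are illustrative.
samples = [  # (runoff volume in L, total P concentration in mg L-1)
    (20.0, 6.0),
    (60.0, 3.0),
    (120.0, 1.5),
]
load_mg = sum(v * c for v, c in samples)
flow_weighted_mg_L = load_mg / sum(v for v, _ in samples)
print(round(load_mg / 1000.0, 2), round(flow_weighted_mg_L, 2))   # load in g; mean concentration in mg L-1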

14.
To predict the long-term sustainability of water resources in the Boreal Plain region of northern Alberta, it is critical to understand when hillslopes generate runoff and connect with surface waters. The sub-humid climate, in which potential evapotranspiration (PET) can exceed precipitation, and the deep glacial sediments of this region result in large available soil storage capacity relative to moisture surpluses or deficits, leading to threshold-dependent rainfall-runoff relationships. Rainfall simulation experiments were conducted using large magnitude and high intensity applications to examine the thresholds in precipitation and soil moisture that are necessary to generate lateral flow from hillslope runoff plots representative of Luvisolic soils and an aspen canopy. Two adjacent plots (areas of 2·95 and 3·4 m2) of contrasting antecedent moisture conditions were examined; one had tree root uptake excluded for two months to increase soil moisture content, while the second plot allowed tree uptake over the growing season, resulting in drier soils. Vertical flow as drainage and soil moisture storage dominated the water balances of both plots. Greater lateral flow occurred from the plot with higher antecedent moisture content. Results indicate that a minimum of 15–20 mm of rainfall is required to generate lateral flow, and only after the soils have been wetted to a depth of 0·75 m (C-horizon). The depth and intensity of rainfall events that generated runoff > 1 mm have return periods of 25 years or greater and, when combined with the need for wet antecedent conditions, indicate that lateral flow generation on these hillslopes will occur infrequently. Copyright © 2008 John Wiley & Sons, Ltd.

15.
Increased soil erosion in immediate post-wildfire years has been well documented in the literature, but many unanswered questions remain about the factors controlling erosional responses in different regional settings. The field site for the present study was located in a closed canopy, subalpine forest in Kootenay National Park, British Columbia that was subjected to a high-intensity crown fire in the summer of 2003. Low soil erosion values were documented at the study site in the years immediately following the 2003 wildfire, with estimates ranging from approximately 0.1 up to 1 t ha−1. Following the wildfire, notable duff coverage (the duff layer is the combined fermentation and humus soil organic layers) remained above the mineral soil. This finding supports earlier studies documenting only partial duff consumption by high-intensity wildfires in the boreal forest of Canada. It is postulated that remnant duff coverage after many high-intensity wildfires impacts the hydrological and soil erosional response to rainstorm events in post-wildfire years. In particular, duff provides detention storage for infiltrating rainfall and, therefore, may inhibit the generation of overland flow. Furthermore, duff also provides a physical barrier to soil erosion. The Green–Ampt model of rainfall infiltration is employed to better assess how interactions between rainfall duration/intensity and soil/duff properties affect hydrological response and the generation of overland flow. Model results show that duff provides an effective zone for detention storage and that duff accommodates all rainfall intensities to which it was subjected without the occurrence of surface ponding. In addition, the penetration of the wetting front is relatively slow in duff due to its high porosity and water storage potential. Copyright © 2011 John Wiley & Sons, Ltd.
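For readers who want the Green–Ampt ponding logic spelled out, the sketch below is a minimal Python illustration, not the authors' implementation, and all parameter values are assumed. It checks whether a constant-intensity storm ever ponds: ponding begins when the infiltration capacity f = Ks(1 + psi * d_theta / F) falls to the rainfall intensity.

# Minimal Green-Ampt ponding check (not the authors' code; parameters are assumed).
# Infiltration capacity: f = Ks * (1 + psi * d_theta / F), with F the cumulative infiltration.
def time_to_ponding(i_mm_h, Ks_mm_h, psi_mm, d_theta):
    """Return (cumulative infiltration at ponding in mm, time to ponding in h),
    or None if rainfall intensity never exceeds infiltration capacity (i <= Ks)."""
    if i_mm_h <= Ks_mm_h:
        return None
    F_p = Ks_mm_h * psi_mm * d_theta / (i_mm_h - Ks_mm_h)
    return F_p, F_p / i_mm_h

# illustrative contrast: a highly permeable duff layer vs a finer mineral soil
print(time_to_ponding(i_mm_h=50.0, Ks_mm_h=80.0, psi_mm=20.0, d_theta=0.40))   # None: no ponding
print(time_to_ponding(i_mm_h=50.0, Ks_mm_h=5.0, psi_mm=110.0, d_theta=0.30))   # ponds within minutes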

16.
Changing fire regimes and prescribed-fire use in invasive species management on rangelands require improved understanding of fire effects on runoff and erosion from steeply sloping sagebrush-steppe. Small (0·5 m2) and large (32·5 m2) plot rainfall simulations (85 mm h−1, 1 h) and concentrated flow methodologies were employed immediately following burning and 1 and 2 years post-fire to investigate infiltration, runoff and erosion from interrill (rainsplash, sheetwash) and rill (concentrated flow) processes on unburned and burned areas of a steeply sloped sagebrush site on coarse-textured soils. Soil water repellency and vegetation were assessed to infer relationships in soil and vegetation factors that influence runoff and erosion. Runoff and erosion from rainfall simulations and concentrated flow experiments increased immediately following burning. Runoff returned to near pre-burn levels and sediment yield was greatly reduced with ground cover recovery to 40 per cent 1 year post-fire. Erosion remained above pre-burn levels on large rainfall simulation and concentrated flow plots until ground cover reached 60 per cent two growing seasons post-fire. The greatest impact of the fire was the threefold reduction of ground cover. Removal of vegetation and ground cover and the influence of pre-existing strong soil-water repellency increased the spatial continuity of overland flow, reduced the runoff and sediment filtering effects of vegetation and ground cover, and facilitated increased velocity and transport capacity of overland flow. Small plot rainfall simulations suggest that ground cover recovery to 40 per cent probably protected the site from low-return-interval storms, while large plot rainfall and concentrated flow experiments indicate the site remained susceptible to elevated erosion rates during high-intensity or long duration events until ground cover levels reached 60 per cent. The data demonstrate that the persistence of fire effects on steeply-sloped, sandy sagebrush sites depends on the time period required for ground cover to recover to near 60 per cent and on the strength and persistence of 'background' or fire-induced soil water repellency. Published in 2009 by John Wiley & Sons, Ltd.

17.
G. A. Lehrsch, Hydrological Processes, 2013, 27(12): 1739–1750
Surfactants may affect soil structure differently depending upon the soil or the quality of rainfall or irrigation water. This study examined whether the water-stable aggregation of 11 wettable soils was affected by surfactants and the water in which the soils were sieved. The study also examined whether the wettable soils' water drop penetration time (WDPT) was affected by surfactants, water drop quality, and elapsed time since the surfactants were applied. Two nonionic surfactants and a surfactant-free water control were sprayed (by misting) upon air-dry soil, then WDPT was measured 1 and 72 h thereafter. Subsequently, this treated soil was slowly wetted with an aerosol to its water content at a matric potential of −3 kPa, then immediately sieved for 600 s in water that contained either appreciable or few electrolytes. Water-stable aggregation, quantified as mean weight diameter (MWD), varied widely among soils, ranging from 0.10 to 1.36 mm. The MWDs were affected (at p = 0.06) by surfactant treatments, depending upon the soil but not sieving water quality. Surfactants affected the MWD of an Adkins loamy sand and Feltham sand, two of the three coarsest-textured soils. Although WDPTs never exceeded 5 s, depending upon the soil WDPTs were affected by surfactant treatments but not by water drop quality. After surfactant application, WDPTs generally decreased with time for three soils but increased with time for one soil. Findings suggested that surfactants interacted (1) with clay mineralogy to affect MWD and (2) with soluble calcium to affect WDPT for certain soils. Surfactant treatments but not water quality affected both MWD and WDPT for some but not all of 11 wettable, US soils. Published 2012. This article is a US Government work and is in the public domain in the USA.
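Mean weight diameter is simply the size-weighted mean of the aggregate fractions retained on a nest of sieves; the short Python sketch below (not the study's code; the fractions are invented) shows the calculation.

# Sketch (not the study's code): mean weight diameter (MWD) of water-stable aggregates,
# MWD = sum(mean diameter of each sieve fraction * mass proportion in that fraction).
fractions = [  # (mean aggregate diameter of fraction in mm, mass proportion of sample)
    (3.5, 0.10),
    (1.5, 0.20),
    (0.5, 0.30),
    (0.15, 0.40),
]
mwd_mm = sum(d * w for d, w in fractions)
print(round(mwd_mm, 2))   # 0.86 mm, within the 0.10-1.36 mm range reported across the 11 soils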

18.
Despite widespread bench-terracing, stream sediment yields from agricultural hillsides in upland West Java remain high. We studied the causes of this lack of effect by combining measurements at different spatial scales using an erosion process model. Event runoff and sediment yield from two 4-ha terraced hillside subcatchments were measured and field surveys of land use, bench-terrace geometry and storage of sediment in the drainage network were conducted for two consecutive years. Runoff was 3·0–3·9% of rainfall and sediment yield was 11–30 t ha−1 yr−1 for different years, subcatchments and calculation techniques. Sediment storage changes in the subcatchment drainage network were less than 2 t ha−1, whereas an additional 0·3–1·5 t ha−1 was stored in the gully between the subcatchment flumes and the main stream. This suggests mean annual sediment delivery ratios of 86–125%, or 80–104% if this additional storage is included. The Terrace Erosion and Sediment Transport (TEST) model developed and validated for the studied environment was parameterized using erosion plot studies, land use surveys and digital terrain analysis to simulate runoff and sediment generation on the terraced hillsides. This resulted in over-estimates of runoff and under-estimates of runoff sediment concentration. Relatively poor model performance was attributed to sample bias in the six erosion plots used for model calibration and unaccounted covariance between important terrain attributes such as slope, infiltration capacity, soil conservation works and vegetation cover. Copyright © 2005 John Wiley & Sons, Ltd.
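A sediment delivery ratio here compares the yield measured at the outlet with the sediment generated on the hillslopes, optionally accounting for storage between the flume and the main stream. The Python sketch below (not from the paper; values are invented) shows one common way to form it.

# Sketch (not from the paper): sediment delivery ratio (SDR) at the subcatchment outlet,
# with and without the extra storage in the gully downstream of the flume. Values are illustrative.
hillslope_production_t_ha = 20.0   # sediment generated on the terraced hillslopes
outlet_yield_t_ha = 18.0           # sediment yield measured at the subcatchment flume
gully_storage_t_ha = 1.0           # deposition between the flume and the main stream

sdr_at_flume = outlet_yield_t_ha / hillslope_production_t_ha
sdr_to_stream = (outlet_yield_t_ha - gully_storage_t_ha) / hillslope_production_t_ha
print(round(100 * sdr_at_flume), round(100 * sdr_to_stream))   # as percentages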

19.
Irrigation experiments on 12 instrumented field plots were used to assess the impact of dynamic soil crack networks on infiltration and run-off. During applications of intensity similar to a heavy rainstorm, water was seen being preferentially delivered within the soil profile. However, run-off was not observed until the soil water content of the profile reached field capacity, and the apertures of surface-connected cracks had closed >60%. Electrical resistivity measurements suggested that subsurface cracks persisted and enhanced lateral transport, even in wet conditions. Likewise, single-ring infiltration measurements taken before and after irrigation indicated that infiltration remained an important component of the water budget at high soil water content values, despite apparent surface sealing. Overall, although the wetting and sealing of the soil profile showed considerable complexity, an emergent property at the hillslope scale was observed: all of the plots demonstrated a strikingly similar threshold run-off response to the cumulative precipitation amount. Copyright © 2014 John Wiley & Sons, Ltd.

20.
Water and nutrient budgets in dryland agroecosystems are difficult to manage for efficiency and water quality. This is particularly true where complex terrain and soilscapes interact with pronounced hydrologic seasonality. The purpose of this research was to understand water and hydrologic nitrogen (N) export from a hillslope dryland agroecosystem in a semiarid region where most precipitation occurs outside the growing season. We studied 13 years (2001–2013) of records of water and N inputs and outputs from a 12 ha no-till artificially drained catchment in the semiarid Palouse Basin of eastern Washington State, USA. Fall- and winter-dominated annual precipitation averaged 462 mm. About 350 mm went to evapotranspiration; crops used ~160 mm from stored soil water during the summer dry-down season. Soil water replenishment after crop senescence, during the fall wet-up season, delayed the threshold onset of the high-discharge season until December. Winter-dominated drainage fluxes averaged 111 mm or 24% of annual precipitation. Nitrate export in drainage averaged 15 kg·N·ha−1·year−1, which was about 10 times the average rate of dissolved organic N export and 15% of the average rate of N application in chemical fertilizer. Fertilizer applications to the catchment were reduced, due to cropping changes, by 1/3 during the last 5 years of the study; however, no corresponding reduction was observed in the nitrate export flux. This lack of change could not be attributed to mineralization of the soil-organic N legacy of fertilization nor to hydrologic lag of the catchment. Likeliest explanations are (a) despite the reduction, N application continued to exceed crop uptake and accumulation in organic matter; (b) seasonal and interannual variability of catchment connectivity resulted in year-to-year field-scale nitrate storage and carryover. Water and N use efficiencies observed here may be near maximum obtainable for existing crops in this climate. Substantial improvements that would also address multiple environmental issues associated with the N cascade may involve shifts to perennial systems and/or rotations in which N is fixed biologically.
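The reported fluxes can be checked for internal consistency with simple arithmetic; the Python sketch below (not from the paper) closes the annual water budget and derives the flow-weighted nitrate-N concentration implied by the drainage depth and N export.

# Arithmetic check (not from the paper): water-budget closure and the flow-weighted
# nitrate-N concentration implied by the reported drainage and export figures.
precip_mm, et_mm, drainage_mm = 462.0, 350.0, 111.0
print(precip_mm - (et_mm + drainage_mm))      # ~1 mm residual: the budget roughly closes

n_export_kg_ha = 15.0
drainage_L_ha = drainage_mm * 10_000.0        # 1 mm over 1 ha = 10,000 L
conc_mg_L = n_export_kg_ha * 1e6 / drainage_L_ha
print(round(conc_mg_L, 1))                    # ~13.5 mg NO3-N per litre on a flow-weighted basis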
