Similar articles
20 similar articles found.
1.
The objective of this research was to characterise annual precipitation extremes in a Mediterranean vineyard region. The number of exceptional events (P > 95th percentile) and annual extreme events (P > 99th percentile), as well as their strength, erosive character and return period were analysed for 2000–2004. The erosive character was evaluated according to the R‐factor (kinetic energy × maximum intensity in 30‐min periods). Soil and nutrient losses caused by these events were evaluated by combining field sampling and a hydrological model to estimate total runoff in a vineyard plot. The results show a clear increase in the number of very wet days and extreme events (P > 95th percentile), which represented up to 88% of annual rainfall. The severity of the extreme events (TS = precipitation event P > 99th percentile) reached values higher than 50 mm almost every year. These values were far exceeded in 2000, when one extraordinary event recorded 50% of the annual rainfall, with TS of 189 mm, about 80% of total rainfall being lost as runoff. Annual erosivity was driven not only by extreme events, but also by short events of less depth but high intensity. During some of the years analysed, rainfall erosivity was two or three times the average in the area. Most soil and nutrient losses occurred in a small number of events: one or two events every year were responsible for more than 75% of the annual soil and nutrient losses on average. Antecedent soil moisture conditions, runoff rates, and events with a return period higher than two years were responsible for the higher erosion rates. Apart from an exceptional event recorded in 2000, which produced more than 200 Mg ha⁻¹ soil losses, annual soil losses up to 25 Mg ha⁻¹ were recorded, which are much higher than the soil loss tolerance. Copyright © 2008 John Wiley & Sons, Ltd.
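As a minimal sketch of the percentile-based screening described above (the wet-day threshold and the synthetic rainfall series are illustrative assumptions, not the study's data), the following Python snippet flags exceptional (>95th percentile) wet days and reports their share of annual rainfall:

```python
import numpy as np

def extreme_event_summary(daily_rain_mm, wet_threshold=1.0):
    """Flag exceptional (>95th percentile) and extreme (>99th percentile)
    wet days and report the exceptional days' share of annual rainfall.
    Percentiles are computed over wet days only (an assumption)."""
    wet = daily_rain_mm[daily_rain_mm >= wet_threshold]
    p95, p99 = np.percentile(wet, [95, 99])
    exceptional = wet[wet > p95]
    return {
        "p95_mm": float(p95),
        "p99_mm": float(p99),
        "n_exceptional": int(exceptional.size),
        "share_of_annual_rain": float(exceptional.sum() / daily_rain_mm.sum()),
    }

# Hypothetical one-year daily record (mm), not the vineyard data
rng = np.random.default_rng(0)
year = rng.gamma(shape=0.4, scale=12.0, size=365)
print(extreme_event_summary(year))
```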

2.
David Dunkerley, Hydrological Processes, 2008, 22(22): 4415–4435
In hydrology and geomorphology, less attention has been paid to rain event properties such as duration, mean and peak rain rate than to rain properties such as drop size or kinetic energy. A literature review shows a lack of correspondence between natural and simulated rain events. For example, 26 studies that report event statistics from substantial records of natural rain reveal a mean rain rate of just 3·47 mm h⁻¹ (s.d. 2·38 mm h⁻¹). In 17 comparable studies dealing with extreme rain rates, including events in cyclonic, tropical convective, and typhoon conditions, a mean maximum rain rate (either hourly or mean event rain rate) of 86·3 mm h⁻¹ (s.d. 57·7 mm h⁻¹) is demonstrated. However, 49 studies using rainfall simulation involve a mean maximum rain rate of 103·1 mm h⁻¹ (s.d. 81·3 mm h⁻¹), often sustained for > 1 h, exceeding even that of extreme rain events, and nearly 30 times the mean rain rate in ordinary, non‐exceptional rain events. Thus rainfall simulation is often biased toward high rain rates, and many of the rates employed (in several instances exceeding 150 mm h⁻¹) appear to have limited relevance to ordinary field conditions. Generally, simulations should resemble natural rain events in each study region. Attention is also drawn to the raindrop arrival rate at the surface. In natural rain, this is known to vary from < 100 m⁻² s⁻¹ to > 5000 m⁻² s⁻¹. Arrival rate may need to be added to the list of parameters that must be reproduced realistically in rainfall simulation studies. Copyright © 2008 John Wiley & Sons, Ltd.
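The event statistics discussed here (event depth, duration, and mean rain rate) can only be computed once a record is split into discrete events. A rough sketch, assuming a tipping-bucket record and a hypothetical 6 h minimum inter-event dry gap (the gap criterion is an assumption, not a value taken from the review):

```python
import numpy as np

def event_stats(tip_times_h, tip_depths_mm, min_gap_h=6.0):
    """Split a tipping-bucket record into events separated by at least
    min_gap_h without rain, then return (depth, duration, mean rate) per event.
    The 6-hour gap is an assumed inter-event criterion."""
    t = np.asarray(tip_times_h, float)
    d = np.asarray(tip_depths_mm, float)
    breaks = [0] + [i for i in range(1, t.size) if t[i] - t[i - 1] >= min_gap_h] + [t.size]
    stats = []
    for a, b in zip(breaks[:-1], breaks[1:]):
        depth = d[a:b].sum()
        duration = max(t[b - 1] - t[a], 0.01)   # avoid zero-length events
        stats.append((depth, duration, depth / duration))
    return stats  # list of (depth mm, duration h, mean rate mm/h)
```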

3.
This paper describes the design, operation and performance of a field‐portable ‘drip‐type’ simulator and erosion measurement system. The system was constructed specifically for soil erosion research in the humid tropics and has been used extensively in Malaysian Borneo. The simulator is capable of producing replicable storms of up to 200 mm h⁻¹ intensity and 20–30 minutes duration with a drop‐size distribution close to that of natural storms of such intensity (D50 of simulated rainfall is 4·15 mm at 200 mm h⁻¹ and 3·65 mm at 160 mm h⁻¹; D50 measured during natural rainfall = 3·25 mm). The simulator is portable and simply constructed and operates without a motor or electronics, thus making it particularly useful in remote, mountainous areas. The erosion measurement system allows assessment of: (1) rainsplash detachment and net downslope transport from the erosion plot; (2) slopewash (erosion transported by overland flow); and (3) infiltration capacity and overland flow. The performance of the simulator–erosion system compared with previous systems is assessed with reference to experiments carried out in primary and regenerating tropical rainforest at Danum Valley (Malaysian Borneo). The system was found to compare favourably with previous field simulators, producing a total storm kinetic energy of 727 J m⁻² (over a 20‐minute storm event) and a kinetic energy rate of 0·61 J m⁻² s⁻¹, approximately half that experienced on the ground during a natural rainfall event of similar intensity, despite the shorter distance to the ground. Copyright © 2007 John Wiley & Sons, Ltd.
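The reported kinetic-energy figures are internally consistent: a rate of 0·61 J m⁻² s⁻¹ sustained over a 20-minute storm amounts to roughly 730 J m⁻², close to the quoted total of 727 J m⁻². A one-line check in Python (values copied from the abstract):

```python
# Kinetic-energy bookkeeping for the simulated storm (sketch only).
ke_rate_J_m2_s = 0.61          # kinetic energy flux of simulated rain
duration_s = 20 * 60           # 20-minute storm event
total_ke = ke_rate_J_m2_s * duration_s
print(f"Total storm KE ~ {total_ke:.0f} J m^-2")   # ~732, close to the reported 727
```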

4.
Reliable assessment of the spatial distribution of soil erosion is important for making land management decisions, but it has not been thoroughly evaluated in karst geo‐environments. The objective of this study was to modify a physically based, spatially distributed erosion model, the revised Morgan, Morgan and Finney (RMMF) model, to estimate the superficial (as opposed to subsurface creep) soil erosion rates and their spatial patterns in a 1022 ha karst catchment in northwest Guangxi, China. Model parameters were calculated using local data in a raster geographic information system (GIS) framework. The cumulative runoff on each grid cell, as an input to the RMMF model for erosion computations, was computed using a combined flow algorithm that allowed for flow into multiple cells, with a transfer grid accounting for infiltration and runoff seepage to the subsurface. The predicted spatial distributions of soil erosion rates were analyzed relative to land uses and slope zones. Results showed that the simulated effective runoff and annual soil erosion rates of hillslopes agreed well with the field observations and with redistribution rates previously quantified with caesium‐137 (¹³⁷Cs). The estimated average effective runoff and annual erosion rate on hillslopes of the study catchment were 18 mm and 0.27 Mg ha⁻¹ yr⁻¹ during 2006–2007. Human disturbances played an important role in accelerating soil erosion rates, with average values ranging from 0.1 to 3.02 Mg ha⁻¹ yr⁻¹ for different land uses. The study indicated that the modified model was effective in predicting superficial soil erosion rates in karst regions and that the spatial distribution results could provide useful information for developing local soil and water conservation plans. Copyright © 2014 John Wiley & Sons, Ltd.
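The combined flow algorithm described above routes cumulative runoff cell by cell while a transfer grid removes water lost to infiltration and subsurface seepage. The sketch below is a deliberately simplified one-dimensional, single-flow-path stand-in for that idea; the coefficients and the hillslope are hypothetical, and a real RMMF/GIS implementation distributes flow to multiple downslope cells:

```python
import numpy as np

def route_runoff(excess_mm, transfer):
    """Accumulate runoff down a 1-D hillslope of cells, multiplying the inflow
    plus local runoff excess by a per-cell transfer fraction that mimics
    seepage to the karst subsurface. A simplified stand-in, not the study's
    multiple-flow-direction algorithm."""
    cum = np.zeros_like(excess_mm, dtype=float)
    upslope = 0.0
    for i, (e, t) in enumerate(zip(excess_mm, transfer)):
        cum[i] = (upslope + e) * t      # part of the water is lost to the subsurface
        upslope = cum[i]
    return cum

# Hypothetical four-cell hillslope: runoff excess (mm) and transfer fractions
print(route_runoff(np.array([5.0, 3.0, 8.0, 2.0]), np.array([0.9, 0.7, 0.8, 0.95])))
```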

5.
Interpreting rainfall‐runoff erosivity by a process‐oriented scheme makes it possible to combine the physical approach to soil loss estimation with the empirical one. Including the effect of runoff in the model permits a distinction between detachment and transport in the soil erosion process. In this paper, a general definition of the rainfall‐runoff erosivity factor REFe, including a power of both the event runoff coefficient QR and the event rainfall erosivity index EI30 of the Universal Soil Loss Equation (USLE), is first proposed. The REFe factor is applicable to all USLE‐based models (USLE, Modified USLE [USLE‐M] and Modified USLE‐M [USLE‐MM]) and it allows a distinction between purely empirical models (e.g., the USLE‐MM) and those supported by applying theoretical dimensional analysis and self‐similarity to the Wischmeier and Smith scheme. This last model category includes the USLE, the USLE‐M, and a new model, named USLE‐M based (USLE‐MB), that uses a rainfall‐runoff erosivity factor in which a power of the runoff coefficient multiplies EI30. Using the database of the Sparacia experimental site, the USLE‐MB is parameterized and a comparison with soil loss data is carried out. The analysis shows that the USLE‐MB (characterized by a Nash–Sutcliffe Efficiency Index NSEI equal to 0.73 and a root mean square error RMSE = 11.7 Mg ha⁻¹) has soil loss estimation performance very similar to that of the USLE‐M (NSEI = 0.72 and RMSE = 12.0 Mg ha⁻¹). However, the USLE‐MB yields a maximum discrepancy factor between predicted and measured soil loss values (176) that is much lower than that of the USLE‐M (291). In conclusion, the USLE‐MB should be preferred in the context of theoretically supported USLE‐type models.
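For readers who want to reproduce this kind of model comparison, below is a small sketch of a USLE-MB-style event erosivity factor (a power of the runoff coefficient multiplying EI30) together with the two goodness-of-fit statistics reported above; the exponent and any data passed in are user-supplied assumptions, not the Sparacia calibration:

```python
import numpy as np

def usle_mb_erosivity(QR, EI30, b1):
    """USLE-MB-style event erosivity: a power of the runoff coefficient QR
    multiplying EI30. b1 is a fitted exponent (illustrative here)."""
    return (np.asarray(QR, float) ** b1) * np.asarray(EI30, float)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency index."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    """Root mean square error, in the units of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))
```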

6.
Runoff and sediment lost due to water erosion were recorded for 36 (1 m²) plots with varying types of vegetative cover located on sloping gypsiferous fields in the south of Madrid. In the period studied (1994–2005), 75% of the events had a maximum 30‐minute intensity (I30) of less than 10 mm h⁻¹. As for the vegetative cover, maximum correlation between runoff and soil loss was found in the least protected plots (0–40% cover) during the most intense rainfall events; however, a significant positive correlation was also observed in plots with greater coverage (40–60%). If coverage exceeded 60%, rainfall erosivity declined. The average amount of sediment produced in high‐intensity events was significantly greater (approximately 7 g m⁻² per I30 event >10 mm h⁻¹) than that produced in the rest of the moderate‐intensity events (approximately 3 g m⁻² per I30 event <10 mm h⁻¹), but due to the high rate of occurrence of the latter throughout the year, sediment loss from moderate events during the period studied totaled 128 g m⁻². By comparison, only 40 g m⁻² was produced by the I30 events greater than 10 mm h⁻¹. Even though the amount of soil lost is relatively insignificant from a quantitative standpoint, the organic matter content lost in the sediment (six times more than in the soil) is a permanent loss that threatens the development of the surface of the soil in this area when the vegetative cover is less than 40%. The soil here experiences a chronic loss of 0·02 mm annually as a consequence of frequent, moderate events, in addition to any loss produced by extraordinary events, which, though less frequent, are much more erosive. If moderate events are ignored, an important part of the long‐term soil loss will be overlooked. Copyright © 2007 John Wiley & Sons, Ltd.
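Splitting the event record at the 10 mm h⁻¹ I30 threshold and totalling sediment per class, as done above, is a simple aggregation; a sketch follows (the function and argument names are illustrative, not from the study):

```python
import numpy as np

def sediment_by_intensity_class(I30_mm_h, sediment_g_m2, threshold=10.0):
    """Split events at an I30 threshold (10 mm/h in the abstract) and total
    the sediment produced by moderate vs. high-intensity events."""
    I30 = np.asarray(I30_mm_h, float)
    sed = np.asarray(sediment_g_m2, float)
    high = I30 > threshold
    return {"high_intensity_g_m2": float(sed[high].sum()),
            "moderate_g_m2": float(sed[~high].sum())}
```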

7.
Four techniques for soil erosion assessment were compared over two consecutive seasons for bare-fallow plots and a maize-cowpea sequence in 1985 at IITA, Ibadan, Nigeria. The techniques used were: tracer (aluminium paint), nails (16 and 25), the rill method, and the Universal Soil Loss Equation (USLE). Soil loss estimated by these techniques was compared with that determined using the runoff plot technique. There was significantly more soil loss (P < 0·01) in bare-fallow than in plots under maize (Zea mays) or cowpea (Vigna unguiculata). In the first season, soil loss from plots sown to maize was 40·2 Mg ha⁻¹ compared with 153·3 Mg ha⁻¹ from bare-fallow plots. In the second season, bare-fallow plots lost 87·5 Mg ha⁻¹ against 39·4 Mg ha⁻¹ lost from plots growing cowpea. The techniques used for assessing erosion had no influence on the magnitude of soil erosion and did not interfere with the processes of erosion. There was no significant difference (P < 0·05) between soil erosion determined by the nails and the runoff plot technique. Soil loss determined on six plots (three under maize, three bare-fallow) by the rill technique, at the end of the season, was significantly lower (P < 0·05) than that determined by the runoff plot technique. The soil loss estimated by the rill method was 143·2, 108·8 and 121·9 Mg ha⁻¹ for 11, 11, and 8 per cent slopes respectively, in comparison with 201·5, 162·0, and 166·4 Mg ha⁻¹ measured by the runoff plot method. Soil loss measured on three bare-fallow plots on 10 different dates by the rill technique was also significantly lower (P < 0·01) than that measured by the runoff plot. In the first season the USLE significantly underestimated soil loss. On 11, 11, and 8 per cent slopes, respectively, soil loss determined by the USLE was 77, 92, and 63 per cent of that measured by the runoff plot. However, in the second season there was no significant difference between soil loss determined by the USLE and that determined by the conventional runoff plot technique.
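For reference, the USLE used here is the multiplicative factor model A = R·K·L·S·C·P; a minimal sketch with purely hypothetical factor values (not the Ibadan plot parameters):

```python
def usle_soil_loss(R, K, L, S, C, P):
    """Classic USLE: A = R * K * L * S * C * P, giving average annual soil
    loss when the factors are expressed in consistent units. Illustrative only."""
    return R * K * L * S * C * P

# Hypothetical factor values, not the Ibadan calibration
print(usle_soil_loss(R=6000, K=0.03, L=1.2, S=1.5, C=0.3, P=1.0))
```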

8.
Point measurement of soil properties makes it possible to explain and simulate plot‐scale hydrological processes. An intensive sampling was carried out at the surface of an unsaturated clay soil to measure, on two adjacent plots of 4 × 11 m² and on two different dates (May 2007 and February–March 2008), dry soil bulk density, ρb, and antecedent soil water content, θi, at 88 points. Field‐saturated soil hydraulic conductivity, Kfs, was also measured at 176 points by the transient Simplified Falling Head technique to determine the soil water permeability characteristics at the beginning of a possible rainfall event yielding measurable runoff. The ρb values did not differ significantly between the two dates, but wetter soil conditions (by 31%) and lower conductivities (by a factor of 1.95) were detected on the second date as compared with the first one. Significantly higher (by a factor of 1.8) Kfs values were obtained with the 0.30‐m‐diameter ring compared with the 0.15‐m‐diameter ring. A high Kfs (> 100 mm h⁻¹) was generally obtained for low θi values (< 0.3 m³ m⁻³), whereas a high θi yielded an increased percentage of low Kfs data (1–100 mm h⁻¹). The median of Kfs for each plot/sampling date combination was not lower than 600 mm h⁻¹, and rainfall intensities rarely exceeded 100 mm h⁻¹ at the site. The occurrence of runoff at the base of the plot therefore requires a substantial reduction of the surface soil permeability characteristics during the event, probably promoted by a higher water content than in this investigation (saturation degree = 0.44–0.62) and some soil compaction due to rainfall impact. An intensive soil sampling reduces the risk of an erroneous interpretation of hydrological processes. In an unstable clay soil, changes in Kfs during the event seem to have a noticeable effect on runoff generation, and they should be considered when modeling hydrological processes. Copyright © 2012 John Wiley & Sons, Ltd.
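One way to read the Kfs statistics against rainfall intensities is to ask what fraction of the measured conductivities falls below a given intensity, i.e. at how many points infiltration-excess runoff is even possible at that rate; a small sketch (the threshold and any data passed in are placeholders, not the study's measurements):

```python
import numpy as np

def infiltration_excess_fraction(kfs_mm_h, rain_intensity_mm_h=100.0):
    """Fraction of measured Kfs values below a given rainfall intensity,
    i.e. sample points that could generate infiltration-excess runoff
    at that rate if Kfs did not change during the event."""
    k = np.asarray(kfs_mm_h, float)
    return float(np.mean(k < rain_intensity_mm_h))
```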

9.
Accelerated runoff and erosion commonly occur following forest fires due to combustion of protective forest floor material, which results in bare soil being exposed to overland flow and raindrop impact, as well as water repellent soil conditions. After the 2000 Valley Complex Fires in the Bitterroot National Forest of west‐central Montana, four sets of six hillslope plots were established to measure first‐year post‐wildfire erosion rates on steep slopes (greater than 50%) that had burned with high severity. Silt fences were installed at the base of each plot to trap eroded sediment from a contributing area of 100 m². Rain gauges were installed to correlate rain event characteristics to the event sediment yield. After each sediment‐producing rain event, the collected sediment was removed from the silt fence and weighed on site, and a sub‐sample taken to determine dry weight, particle size distribution, organic matter content, and nutrient content of the eroded material. Rainfall intensity was the only significant factor in determining post‐fire erosion rates from individual storm events. Short duration, high intensity thunderstorms with a maximum 10‐min rainfall intensity of 75 mm h⁻¹ caused the highest erosion rates (greater than 20 t ha⁻¹). Long duration, low intensity rains produced little erosion (less than 0·01 t ha⁻¹). Total C and N in the collected sediment varied directly with the organic matter; because the collected sediment was mostly mineral soil, the C and N content was small. Minimal amounts of Mg, Ca, and K were detected in the eroded sediments. The mean annual erosion rate predicted by Disturbed WEPP (Water Erosion Prediction Project) was 15% less than the mean annual erosion rate measured, which is within the accuracy range of the model. Published in 2007 by John Wiley & Sons, Ltd.

10.
Flow diversion terraces (FDT) are a commonly used beneficial management practice (BMP) for soil conservation on sloped terrain susceptible to water erosion. A simple GIS‐based soil erosion model was designed to assess the effectiveness of the FDT system under different climatic, topographic, and soil conditions at a sub‐basin level. The model was used to estimate the soil conservation support practice factor (P‐factor), which inherently considered two major outcomes of its implementation, namely (1) reduced slope length, and (2) sediment deposition in terraced channels. A benchmark site, an agriculture‐dominated watershed in northwestern New Brunswick (NB), was selected to test the performance of the model and the estimated P‐factors. The estimated P‐factors ranged from 0·38 to 1·0 for soil conservation planning objectives and from 0·001 to 0·45 in sediment yield calculations for water‐quality assessment. The model estimated that the average annual sediment yield was 773 kg ha⁻¹ yr⁻¹ compared with a measured value of 641 kg ha⁻¹ yr⁻¹. The P‐factors estimated in this study were comparable with predicted values obtained with the revised universal soil loss equation (RUSLE2). The P‐factors from this study have the potential to be directly used as input in hydrological models, such as the soil and water assessment tool (SWAT), or in soil conservation planning where only conventional digital elevation models (DEMs) are available. Copyright © 2009 John Wiley & Sons, Ltd.

11.
The variability of rainfall in space and time is an essential driver of many processes in nature, but little is known about its extent at the sub‐kilometre scale, despite many agricultural and environmental experiments on this scale. A network of 13 tipping‐bucket rain gauges was operated on a 1·4 km² test site in southern Germany for four years to quantify spatial trends in rainfall depth, intensity, erosivity, and predicted runoff. The random measuring error ranged from 10% to 0·1% for 1 mm and 100 mm rainfall, respectively. The wind effects could be well described by the mean slope of the horizon at the stations. Except for one station, which was excluded from further analysis, the relative differences due to wind were at most ±5%. Gradients in rainfall depth representing the 1‐km² scale, derived by linear regressions, were much larger and ranged from 1·0 to 15·7 mm km⁻¹ with a mean of 4·2 mm km⁻¹ (median 3·3 mm km⁻¹). They mainly developed during short bursts of rain; thus gradients were even larger for rain intensities and caused a variation in rain erosivity of up to 255% for an individual event. The trends did not have a single primary direction and thus level out in the long term, but for short time periods or for single events the assumption of spatially uniform rainfall is invalid at the sub‐kilometre scale. The strength of the spatial trend increased with rain intensity. This has important implications for any hydrological or geomorphological process sensitive to maximum rain intensities, especially when focusing on large, rare events. These sub‐kilometre scale differences are hence highly relevant for environmental processes acting on short time scales such as flooding or erosion. They should be considered when establishing, validating and applying any event‐based runoff or erosion model. Copyright © 2009 John Wiley & Sons, Ltd.
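The per-event rainfall gradients reported here come from fitting a linear trend to gauge depths; a compact sketch, with a hypothetical five-gauge event, of how such a gradient (in mm km⁻¹) can be estimated by least squares:

```python
import numpy as np

def rainfall_gradient(x_km, y_km, depth_mm):
    """Fit depth = a + b*x + c*y by least squares over the gauge network and
    return the magnitude of the spatial gradient in mm per km."""
    A = np.column_stack([np.ones_like(x_km), x_km, y_km])
    coef, *_ = np.linalg.lstsq(A, depth_mm, rcond=None)
    _, b, c = coef
    return float(np.hypot(b, c))   # mm km^-1

# Hypothetical five-gauge event (coordinates in km, depths in mm)
x = np.array([0.0, 0.4, 0.9, 1.1, 0.6])
y = np.array([0.0, 0.8, 0.2, 1.0, 0.5])
d = np.array([12.1, 14.0, 12.8, 15.2, 13.3])
print(rainfall_gradient(x, y, d))
```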

12.
In this study, we examined the year 2011 characteristics of energy flux partitioning and evapotranspiration of a sub‐alpine spruce forest underlain by permafrost on the Qinghai–Tibet Plateau (QPT). Energy balance closure on a half‐hourly basis was H + λE = 0.81 × (Rn − G − S) + 3.48 (W m⁻²) (r² = 0.83, n = 14938), where H, λE, Rn, G and S are the sensible heat, latent heat, net radiation, soil heat and air‐column heat storage fluxes, respectively. Maximum H was higher than maximum λE, and H dominated the energy budget at midday during the whole year, even in summer time. However, rainfall events significantly affected energy flux partitioning and evapotranspiration. The mean value of the evaporative fraction (Λ = λE/(λE + H)) during the growth period on zero precipitation days and non‐zero precipitation days was 0.40 and 0.61, respectively. The mean daily evapotranspiration of this sub‐alpine forest during summer time was 2.56 mm day⁻¹. The annual evapotranspiration and sublimation was 417 ± 8 mm year⁻¹, which was very similar to the annual precipitation of 428 mm. Sublimation accounted for 7.1% (30 ± 2 mm year⁻¹) of annual evapotranspiration and sublimation, indicating that sublimation is not negligible in the annual water balance in sub‐alpine forests on the QPT. The low values of the Priestley–Taylor coefficient (α) and the very low value of the decoupling coefficient (Ω) during most of the growing season suggested low soil water content and conservative water loss in this sub‐alpine forest. Copyright © 2013 John Wiley & Sons, Ltd.
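The closure relation and the evaporative fraction used above are straightforward to compute from half-hourly flux series; a brief sketch (inputs are assumed to be numpy arrays of equal length, in W m⁻²):

```python
import numpy as np

def closure_fit(H, LE, Rn, G, S):
    """Regress the turbulent fluxes (H + LE) against available energy
    (Rn - G - S) to obtain the energy-balance-closure slope and intercept,
    as in H + LE = a*(Rn - G - S) + b."""
    x = Rn - G - S
    y = H + LE
    slope, intercept = np.polyfit(x, y, 1)
    return float(slope), float(intercept)

def evaporative_fraction(LE, H):
    """Evaporative fraction Lambda = LE / (LE + H)."""
    return LE / (LE + H)
```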

13.
Improving empirical prediction of plot soil erosion at the event temporal scale has both scientific and practical importance. In this investigation, 492 runoff and soil loss data from plots of different lengths, λ (11 ≤ λ ≤ 44 m), and steepness, s (14.9 ≤ s ≤ 26.0%), established at the Sparacia experimental station, in Sicily, South Italy, were used to derive a new version of the Universal Soil Loss Equation (USLE)‐MM model, by only assuming a value of one for the topographic length, L, and steepness, S, factors for λ = 22 m and s = 9%, respectively. An erosivity index equal to (QR·EI30)^b1, QR and EI30 being the runoff coefficient and the event rainfall erosivity index, respectively, with b1 > 1, was found to be an appropriate choice for the Sparacia area. The specifically developed functions for L and S did not differ appreciably from other, more widely accepted relationships (maximum differences by a factor of 1.22 for L and 1.09 for S). The new version of the USLE‐MM performed particularly well for highly erosive events, because predicted soil loss differed by not more than a factor of 1.19 from the measured soil loss for measured values of more than 100 Mg ha⁻¹. The choice of the relationships to predict topographic effects on plot soil loss should not represent a point of particular concern in the application of the USLE‐MM in other environments. However, tests of the empirical approach should be carried out in other experimental areas in an attempt to develop analytical tools, usable at the event temporal scale, reasonably simple and of wide validity. Copyright © 2015 John Wiley & Sons, Ltd.
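The discrepancy factor quoted here and in the related USLE-MM/USLE-MB comparisons is simply the worst-case ratio between predicted and measured event soil loss in either direction; a sketch (input arrays are hypothetical and assumed strictly positive):

```python
import numpy as np

def max_discrepancy_factor(obs, sim):
    """Largest ratio between predicted and measured event soil loss,
    whichever direction is worse. Assumes strictly positive values."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    ratio = np.maximum(sim / obs, obs / sim)
    return float(ratio.max())
```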

14.
Leaf litter interception of water is an integral component of the water budget for some vegetated ecosystems. However, loss of rainfall to litter receives considerably less attention than canopy interception due to the lack of suitable sensors to measure changes in litter water content. In this study, a commercially available leaf wetness sensor was calibrated to the gravimetric water content of eastern redcedar (Juniperus virginiana) litter and used to estimate litter interception in a subhumid eastern redcedar woodland in north‐central Oklahoma. Under controlled laboratory conditions, a strong positive correlation between the leaf wetness sensor output voltage (mV) and the measured gravimetric litter water content (θg) was determined: θg = (0.0009 × mV²) − (0.14 × mV) − 11.41 (R² = 0.94, p < 0.0001). This relationship was validated with field sampling, and the output voltage (mV) accounted for 48% of the observed variance in the measured water content. The maximum and minimum interception storage capacities ranged from 1.16 to 12.04 mm and from 1.12 to 9.62 mm, respectively. The maximum and minimum amounts of intercepted rain were positively correlated to rainfall amount and intensity. The continuous field measurements demonstrated that eastern redcedar litter intercepted approximately 8% of the gross rainfall that fell between December 16, 2014 and May 31, 2015. Therefore, rainfall loss to litter can constitute a substantial component of the annual water budget. Long‐term in situ measurement of litter interception loss is necessary to gain a better estimate of water availability for streamflow and recharge. This is critical for managing water resources in the south‐central Great Plains, USA, where grasslands are rapidly being transformed to woodland or woody‐dominated savanna.
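The calibration polynomial reported above translates directly into code; a sketch that simply evaluates it for a given sensor output (the example voltage is arbitrary, and units follow the original study):

```python
def litter_water_content(mV):
    """Calibration reported in the abstract: theta_g as a function of the
    leaf wetness sensor output voltage (mV),
    theta_g = 0.0009*mV^2 - 0.14*mV - 11.41."""
    return 0.0009 * mV**2 - 0.14 * mV - 11.41

# Arbitrary example voltage, not a measured value
print(litter_water_content(300.0))
```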

15.
In most regions of the world overgrazing plays a major role in land degradation and thus creates a major threat to natural ecosystems. Several feedbacks exist between overgrazing, vegetation, soil infiltration by water and soil erosion that need to be better understood. In this study of a sub‐humid overgrazed rangeland in South Africa, the main objective was to evaluate the impact of grass cover on soil infiltration by water and soil detachment. Artificial rains of 30 and 60 mm h⁻¹ were applied for 30 min on 1 m² micro‐plots showing similar sandy‐loam Acrisols with different proportions of soil surface coverage by grass (Class A: 75–100%; B: 50–75%; C: 25–50%; D: 5–25%; E: 0–5% with an outcropping A horizon; F: 0% with an outcropping B horizon) to evaluate pre‐runoff rainfall (Pr), steady state water infiltration (I), sediment concentration (SC) and soil losses (SL). Whatever the class of vegetal cover and the rainfall intensity, and with the exception of two plots probably affected by biological activity, I decreased regularly to a steady rate <2 mm h⁻¹ after 15 min of rain. There was no significant correlation of I or Pr with vegetal cover. The average SC computed from the two rains increased from 0·16 g L⁻¹ (class A) to 48·5 g L⁻¹ (class F), while SL varied between 4 g m⁻² h⁻¹ for A and 1883 g m⁻² h⁻¹ for F. SL increased exponentially with decreasing vegetal cover, and the removal of the A horizon increased SC and SL by a factor of 4. The results support the belief that soil vegetation cover and overgrazing play a major role in soil infiltration by water but also suggest that the interrill erosion process is self‐reinforcing. Abandoned cultivated lands and animal‐preferred pathways are more vulnerable to erosive processes than simply overgrazed rangelands. Copyright © 2011 John Wiley & Sons, Ltd.

16.
A catalogue of historical landslides, 1951–2002, for three provinces in the Emilia‐Romagna region of northern Italy is presented and its statistical properties studied. The catalogue consists of 2255 reported landslides and is based on historical archives and chronicles. We use two measures for the intensity of landsliding over time: (i) the number of reported landslides in a day (DL) and (ii) the number of reported landslides in an event (Sevent), where an event is one or more consecutive days with landsliding. From 1951–2002 in our study area there were 1057 days with 1 ≤ DL ≤ 45 landslides per day, and 596 events with 1 ≤ Sevent ≤ 129 landslides per event. In the first set of analyses, we find that the probability densities of landslide intensities in the time series are power‐law distributed over at least two orders of magnitude, with an exponent of about −2·0. Although our data are a proxy for landsliding built from newspaper reports, this is the first tentative evidence that the frequency–size distribution of triggered landslide events over time (not just the landslides in a given triggered event), like that of earthquakes, scales as a power law or other heavy‐tailed distribution. If confirmed, this could have important implications for risk assessment and erosion modelling in a given area. In our second set of analyses, we find that for short antecedent rainfall periods, the minimum amount of rainfall necessary to trigger landslides varies considerably with the intensity of the landsliding (DL and Sevent); whereas for long antecedent periods the magnitude is largely independent of the cumulative amount of rainfall, and the largest values of landslide intensity are always preceded by abundant rainfall. Further, the analysis of the rainfall trend suggests that the triggering of landslides in the study area is related to seasonal rainfall. Copyright © 2010 John Wiley & Sons, Ltd.
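Power-law exponents for heavy-tailed event-size data such as DL or Sevent are usually estimated by maximum likelihood rather than by fitting a line to a histogram; a sketch of the standard continuous MLE (the xmin cutoff is an assumption), where a density exponent near −2, as reported above, corresponds to alpha ≈ 2:

```python
import numpy as np

def powerlaw_exponent_mle(samples, xmin=1.0):
    """Continuous maximum-likelihood estimate of a power-law exponent
    (alpha = 1 + n / sum(ln(x/xmin))) for event sizes >= xmin; the
    probability density then scales as x^(-alpha)."""
    x = np.asarray([s for s in samples if s >= xmin], float)
    return 1.0 + x.size / np.sum(np.log(x / xmin))
```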

17.
Nonionic surfactants have been well researched in turf grass environments as a tool to ameliorate water‐repellent conditions. However, few studies have evaluated the risks and benefits of nonionic surfactant applications in row‐crop agricultural systems. The objective of this study was to evaluate the impact of a nonionic surfactant on cotton (Gossypium hirsutum L.) production on a Faceville loamy sand (fine, kaolinitic, thermic Typic Kandiudult) in the coastal plain region of Georgia. The experiment consisted of two components: (1) on‐site rainfall simulation and (2) agronomic cotton field trials. Treatments were designed to test the impact of rate and frequency of surfactant applications using six combinations of application rates and timings. For the rainfall simulation component, only the control (0·0 L ha⁻¹) and high rate (0·51 L ha⁻¹) of surfactant applications were evaluated. During the field trial, soil water content, cotton stand counts, and yield were measured. Rainfall simulations showed that the addition of surfactant increased runoff, decreased infiltration, and promoted surface sealing. Despite the demonstrated potential for water loss, agronomic field trials showed that crop yields were not significantly different between surfactant‐treated and untreated plots. No differences in soil water content were observed between treatments at 5 and 15 cm depths; however, soil water content was significantly higher in untreated control plots at the 30 cm depth. The data demonstrate the need to clarify the soil physical/chemical properties and surfactant interactions that may lend themselves to the creation of surface seals, and how these seals affect soil/water conservation and crop yield. Published in 2009 by John Wiley & Sons, Ltd.

18.
The loss of P in overland flow from most cultivated soils is controlled by erosion and, in turn, soil moisture. We evaluated the effect of soil moisture on erosion and P transport in overland flow by applying rainfall (7 cm h⁻¹) to packed soil boxes (1 m long and 0·15 m wide) and field plots (1 and 10 m long by 1 m wide) of silt loams in a central Pennsylvania (USA) catchment. Flow from packed soil boxes took longer to initiate as antecedent soil moisture decreased from field capacity (2 min) to air dried (8 to 9 min). Even in the more complex field plots (i.e. with soil heterogeneity and topography), the wetter site (1 by 10 m plot; 70% field capacity) produced flow more quickly (3 min) and in greater volume (439 L) than the drier site (1 by 10 m plot; 40% field capacity, 15 min, and 214 L, respectively). However, less suspended sediment was transported from wetter soil boxes (1·6 to 2·5 g L⁻¹) and field plots (0·9 g L⁻¹) than from drier boxes (2·9 to 4·2 g L⁻¹) and plots (1·2 g L⁻¹). Differences are attributed to the potential for soil aggregate breakdown, slaking and dispersion, which contribute to surface soil sealing and crusting, as dry soils are subject to rapid wetting (by rainfall). During flow, selective erosion and antecedent moisture conditions affected P transport. At field capacity, DRP and PP transport varied little during overland flow, whereas P transport from previously dry soil decreased rapidly after the initiation of flow (6 to 1·5 mg TP L⁻¹), owing to the greater slaking and dispersion of P‐rich particles into flow at the beginning than at the end of the flow event. These results indicate that soil moisture fluctuations greatly affect erosion and P transport potential, and that management to decrease the potential for loss should consider practices such as conservation tillage and cover crops, particularly in areas where high soil P and erosion coincide. Copyright © 2002 John Wiley & Sons, Ltd.

19.
Phosphorus (P) loss from agricultural watersheds has long been a critical water quality problem, the control of which has been the focus of considerable research and investment. Preventing P loss depends on accurately representing the hydrological and chemical processes governing P mobilization and transport. The Soil and Water Assessment Tool (SWAT) is a watershed model commonly used to predict run‐off and non‐point source pollution transport. SWAT simulates run‐off employing either the curve number (CN) or the Green and Ampt method, both of which assume infiltration‐excess run‐off, although shallow soils underlain by a restricting layer commonly generate saturation‐excess run‐off from variable source areas (VSA). In this study, we compared traditional SWAT with a re‐conceptualized version, SWAT‐VSA, that represents VSA hydrology, in a complex agricultural watershed in east central Pennsylvania. The objectives of this research were to provide further evidence of SWAT‐VSA's integrated and distributed predictive capabilities against measured surface run‐off and stream P loads and to highlight the model's ability to drive sub‐field management of P. Thus, we relied on a detailed field management database to parameterize the models. SWAT and SWAT‐VSA predicted discharge similarly well (daily Nash–Sutcliffe efficiencies of 0.61 and 0.66, respectively), but SWAT‐VSA outperformed SWAT in predicting P export from the watershed. SWAT estimated lower P loss (0.0–0.25 kg ha⁻¹) from agricultural fields than SWAT‐VSA (0.0–1.0+ kg ha⁻¹), which also identified critical source areas – those areas generating large run‐off and P losses at the sub‐field level. These results support the use of SWAT‐VSA in predicting watershed‐scale P losses and identifying critical source areas of P loss in landscapes with VSA hydrology. Copyright © 2014 John Wiley & Sons, Ltd.
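For context, the curve-number option mentioned above computes event runoff depth from precipitation and a CN value; a sketch of that standard SCS relation (the CN and rainfall values in the example are arbitrary), noting that SWAT-VSA's change lies in how CN is distributed across wetness classes rather than in this equation:

```python
def scs_runoff_mm(P_mm, CN):
    """SCS curve-number runoff depth: S = 25400/CN - 254 (mm), Ia = 0.2*S,
    Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0. This is the
    infiltration-excess formulation used by the CN option."""
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S
    return 0.0 if P_mm <= Ia else (P_mm - Ia) ** 2 / (P_mm - Ia + S)

# Arbitrary example: 50 mm storm on a CN = 78 field
print(scs_runoff_mm(50.0, 78))
```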

20.
After the Valley Complex Fire burned 86 000 ha in western Montana in 2000, two studies were conducted to determine the effectiveness of contour‐felled log, straw wattle, and hand‐dug contour trench erosion barriers in mitigating postfire runoff and erosion. Sixteen plots were located across a steep, severely burned slope, with a single barrier installed in 12 plots (four per treatment) and four plots left untreated as controls. In a rainfall‐plus‐inflow simulation, 26 mm h⁻¹ rainfall was applied to each plot for 1 h and 48 L min⁻¹ of overland flow was added for the last 15 min. Total runoff from the contour‐felled log (0·58 mm) and straw wattle (0·40 mm) plots was significantly less than from the control plots (2·0 mm), but the contour trench plots (1·3 mm) showed no difference. The total sediment yield from the straw wattle plots (0·21 Mg ha⁻¹) was significantly less than the control plots (2·2 Mg ha⁻¹); the sediment yields in the contour‐felled log plots (0·58 Mg ha⁻¹) and the contour trench plots (2·5 Mg ha⁻¹) were not significantly different. After the simulations, sediment fences were installed to trap sediment eroded by natural rainfall. During the subsequent 3 years, sediment yields from individual events increased significantly with increasing 10 min maximum intensity and rainfall amounts. High‐intensity rainfall occurred early in the study and the erosion barriers were filled with sediment. There were no significant differences in event or annual sediment yields among treated and control plots. In 2001, the overall mean annual sediment yield was 21 Mg ha⁻¹; this value declined significantly to 0·6 Mg ha⁻¹ in 2002 and 0·2 Mg ha⁻¹ in 2003. The erosion barrier sediment storage used was less than the total available storage capacity; runoff and sediment were observed going over the top and around the ends of the barriers even when the barriers were less than half filled. Published in 2007 by John Wiley & Sons, Ltd.
