1.
The authors discuss the current measurement accuracy of the RD Instruments 1200-kHz acoustic Doppler current profiler (ADCP) near the surface and bottom boundaries. Data are presented from a series of tests. In the first series, an ADCP mounted on a bottom sled in an upward-looking mode was towed at known speeds with and without surface waves. Additional tests were conducted with several acoustic baffle designs installed in the transducer head. The 1200-kHz ADCP can accurately measure mean currents in a dynamic wave-induced flow field. Sidelobes can bias the measurements at 85% of the range when bottom or surface boundaries are present, and the amount of bias depends strongly on surface wave characteristics. Sidelobe bias can be eliminated with a properly designed baffle system, and a properly configured profiler can also measure wave particle velocities.
2.
Thomas R. Metcalf 《Solar physics》1994,155(2):235-242
I present a robust algorithm that resolves the 180-deg ambiguity in measurements of the solar vector magnetic field. The technique simultaneously minimizes both the divergence of the magnetic field and the electric current density using a simulated annealing algorithm. This results in the field orientation with approximately minimum free energy. The technique is well-founded physically and is simple to implement.
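The abstract above names simulated annealing as its minimizer. As a generic illustration of that technique (not the author's actual code), the sketch below resolves a toy sign-ambiguity: each "pixel" carries a value whose sign is unknown, and annealing picks the signs that make neighboring resolved values smoothest, a loose stand-in for minimizing divergence. All names and data here are hypothetical.

```python
import math
import random

def simulated_annealing(energy, state, neighbor, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    """Generic simulated annealing: always accept downhill moves,
    accept uphill moves with Boltzmann probability exp(-dE/T)."""
    rng = random.Random(seed)
    e = energy(state)
    best, best_e = state, e
    t = t0
    for _ in range(steps):
        cand = neighbor(state, rng)
        ce = energy(cand)
        if ce < e or rng.random() < math.exp(-(ce - e) / t):
            state, e = cand, ce
            if e < best_e:
                best, best_e = state, e
        t *= cooling  # geometric cooling schedule
    return best, best_e

# Toy analogue of the 180-deg problem: choose a sign s_i in {+1, -1} per
# pixel so neighboring resolved values are as smooth as possible.
values = [0.9, -1.1, 1.0, 1.05, -0.95, 1.0]  # hypothetical ambiguous data

def energy(signs):
    resolved = [s * v for s, v in zip(signs, values)]
    return sum((a - b) ** 2 for a, b in zip(resolved, resolved[1:]))

def neighbor(signs, rng):
    flipped = list(signs)
    i = rng.randrange(len(flipped))
    flipped[i] = -flipped[i]  # flip one pixel's sign (one "180-deg" choice)
    return flipped

best, best_e = simulated_annealing(energy, [1] * len(values), neighbor)
```

With only a handful of pixels the annealer easily finds a sign assignment that makes all resolved values agree; the real algorithm works on a full magnetogram with a physically motivated energy (divergence plus current density).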
3.
Thomas R. Metcalf Marc L. DeRosa Carolus J. Schrijver Graham Barnes Adriaan A. van Ballegooijen Thomas Wiegelmann Michael S. Wheatland Gherardo Valori James M. McTiernan 《Solar physics》2008,247(2):269-299
We compare a variety of nonlinear force-free field (NLFFF) extrapolation algorithms, including optimization, magneto-frictional, and Grad-Rubin-like codes, applied to a solar-like reference model. The model used to test the algorithms includes realistic photospheric Lorentz forces and a complex field including a weakly twisted, right-helical flux bundle. The codes were applied to both forced “photospheric” and more force-free “chromospheric” vector magnetic field boundary data derived from the model. When applied to the chromospheric boundary data, the codes are able to recover the presence of the flux bundle and the field’s free energy, though some details of the field connectivity are lost. When the codes are applied to the forced photospheric boundary data, the reference model field is not well recovered, indicating that the combination of Lorentz forces and small spatial-scale structure at the photosphere severely impacts the extrapolation of the field. Preprocessing of the forced photospheric boundary does improve the extrapolations considerably for the layers above the chromosphere, but the extrapolations are sensitive to the details of the numerical codes, and neither the field connectivity nor the free magnetic energy in the full volume is well recovered. The magnetic virial theorem gives a rapid measure of the total magnetic energy without extrapolation though, like the NLFFF codes, it is sensitive to the Lorentz forces in the coronal volume. Both the magnetic virial theorem and the Wiegelmann extrapolation, when applied to the preprocessed photospheric boundary, give a magnetic energy nearly equivalent to the value derived from the chromospheric boundary, but both underestimate the free energy above the photosphere by at least a factor of two. We discuss the interpretation of the preprocessed field in this context. When applying the NLFFF codes to solar data, the problems associated with Lorentz forces present in the low solar atmosphere must be recognized: the various codes will not necessarily converge to the correct, or even the same, solution.
On 07/07/2007, the NLFFF team was saddened by the news that Tom Metcalf had died as the result of an accident. We remain grateful for having had the opportunity to benefit from his unwavering dedication to the problems encountered in attempting to understand the Sun’s magnetic field; Tom had completed this paper several months before his death, leading the team through the many steps described above.
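For reference, the magnetic virial theorem invoked in the abstract above has a standard form for a force-free field above the plane $z=0$ (Gaussian units; stated here as background, not taken from the paper itself): the total magnetic energy in the half-space follows from the boundary field alone,

```latex
E \;=\; \frac{1}{4\pi} \int_{z=0} \left( x\,B_x + y\,B_y \right) B_z \; dx\, dy .
```

This is why it gives a "rapid measure" of the energy without any volume extrapolation, and also why residual Lorentz forces on the boundary (which violate the force-free assumption) bias the estimate.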
4.
This study presents an approach for delineating groundwater basins and estimating rates of recharge to fractured crystalline bedrock. It entailed the use of completion report data (boring logs) from 2500 domestic wells in bedrock from the Coventry Quadrangle, which is located in northeastern Connecticut and characterized by metamorphic gneiss and schist. Completion report data were digitized and imported into ArcGIS® for analysis. The data were processed to delineate groundwater drainage basins for the fractured rock based on flow conditions and to estimate groundwater recharge to the bedrock. Results indicate that drainage basins derived from surface topography, in general, may not correspond to bedrock drainage basins due to scale. Estimates of recharge to the bedrock for the study area indicate that only a small fraction of the precipitation, or of the water that enters the overburden, recharges the rock. The approach presented here can be a useful method for water resource-related assessments that involve fractured rock aquifers.
5.
In rural areas of New England, groundwater from fractured crystalline and sedimentary bedrock is a critical water resource. Increasingly, studies have shown that development occurring in rural areas is resulting in the impairment of water quality in fractured rock aquifers. The objective of this study was to evaluate the spatial and temporal variations in groundwater quality associated with development and to evaluate the extent to which common groundwater contaminants associated with rural development may be naturally buffered. The study entailed a compilation and synthesis of over 2500 reports on domestic water quality that spanned a 30-year period. Focus was placed on the spatial distribution and temporal variations in sodium, chloride, iron, manganese, nitrate, and nitrite. Results indicate that despite significant levels of development, the amount of contamination to the bedrock has been minimal. Of the constituents examined, only the chloride concentration exhibits a systematic increase over time, but the level of chloride remained relatively low. The flux of chloride to the bedrock from deicing appears minimal despite the significant amounts of road salt used in the study area. Sodium concentrations in the bedrock remained relatively constant and appear to be buffered by ion exchange with calcium, as suggested by the increase in hardness with time. Iron and manganese were present at relatively low levels but did not show any systematic trends over time. Nitrate and nitrite concentrations were very low and found to be inversely correlated with manganese and iron concentrations, suggesting that the presence of iron and manganese contributes to denitrification. This study indicates that both geochemical and biogeochemical processes are active buffering mechanisms that help shield the bedrock from contaminants associated with development.
6.
7.
J. B. Metcalf 《Geotechnical and Geological Engineering》1991,9(3-4):155-165
Summary: This paper briefly outlines the use of non-standard materials in road base construction in Australia. After a short survey of the history of usage and the parallel development of current rigorous standards for base course materials, a selection of examples of the successful use of materials not meeting those standards is given. Such materials include decomposed rocks, soft rocks, fine-grained materials, loams, and sands. Some comments are made on construction and maintenance practices appropriate to such usage.
8.
K. D. Leka Graham Barnes A. D. Crouch Thomas R. Metcalf G. Allen Gary Ju Jing Y. Liu 《Solar physics》2009,260(1):83-108
The objective testing of algorithms for performing ambiguity resolution in vector magnetic field data is continued, with an examination of the effects of noise in the data. Through the use of analytic magnetic field models, two types of noise are “added” prior to resolving: noise to simulate Poisson photon noise in the observed polarization spectra, and a spatial binning to simulate the effects of unresolved structure. The results are compared through the use of quantitative metrics and performance maps. We find that while no algorithm severely propagates the effects of Poisson noise beyond very local influences, some algorithms are more robust against high photon-noise levels than others. In the case of limited spatial resolution, loss of information regarding fine-scale structure can easily result in erroneous solutions. Our tests imply that photon noise and limited spatial resolution can act so as to make assumptions used in some ambiguity resolution algorithms no longer consistent with the observed magnetogram. We confirm a finding of the earlier comparison study that results can be very sensitive to the details of the treatment of the observed boundary and the assumptions governing that treatment. We discuss the implications of these findings, given the relative sensitivities of the algorithms to the two sources of noise tested here. We also touch on further implications for interpreting observational vector magnetic field data for general solar physics research.
9.
10.