Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
Multivariate numerical analyses (DCA, CCA) were used to study the distribution of chironomids from the surface sediments of 100 lakes spanning broad ecoclimatic conditions in northern Swedish Lapland. The study sites range from boreal forest to alpine tundra and are located in a region of relatively low human impact. Of the 19 environmental variables measured, ordination by CCA identified mean July air temperature as one of the most significant variables explaining the distribution and abundance of chironomids. Loss-on-ignition (LOI), maximum lake depth and mean January air temperature also accounted for significant variation in chironomid assemblages. A quantitative transfer function was created to estimate mean July air temperature from sedimentary chironomid assemblages using weighted-averaging partial least squares (WA-PLS) regression. The coefficient of determination was relatively high (r2 = 0.65), with a root mean squared error of prediction (RMSEP, based on jack-knifing) of 1.13 °C and a maximum bias of 2.1 °C, indicating that chironomids can provide useful quantitative estimates of past changes in mean July air temperature. The paper focuses mainly on the relationship between chironomid composition and July air temperature, but the relationships to LOI and depth are also discussed.
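The transfer-function idea above can be illustrated with a minimal sketch. This uses plain weighted averaging (a simpler cousin of the WA-PLS used in the study) and a leave-one-out (jack-knife) RMSEP; the taxon response curves, temperatures, and the 1.5 °C tolerance are all synthetic illustration values, not the study's data:

```python
import numpy as np

def wa_predict(train_abund, train_temp, test_abund):
    """Weighted-averaging (WA) transfer function: each taxon's optimum
    is the abundance-weighted mean of the training temperatures, and a
    prediction is the abundance-weighted mean of those optima."""
    optima = (train_abund.T @ train_temp) / train_abund.sum(axis=0)
    return (test_abund @ optima) / test_abund.sum(axis=1)

def jackknife_rmsep(abund, temp):
    """Leave-one-out (jack-knife) root mean squared error of prediction."""
    n = len(temp)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        preds[i] = wa_predict(abund[mask], temp[mask], abund[i:i + 1])[0]
    return np.sqrt(np.mean((preds - temp) ** 2))

# synthetic calibration set: 60 lakes, 12 taxa with Gaussian responses
rng = np.random.default_rng(0)
temp = rng.uniform(8.0, 14.0, 60)
optima_true = np.linspace(7.0, 15.0, 12)
abund = np.exp(-0.5 * ((temp[:, None] - optima_true) / 1.5) ** 2)
rmsep = jackknife_rmsep(abund, temp)
```

WA-PLS adds partial-least-squares components on top of this basic WA step; the jack-knifed RMSEP is computed the same way in either case.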

2.
We report on an objective methodology, referred to as intrinsic sample methodology, for the delineation of exploration target areas or resource areas for assessment. Important features of the methodology include (1) identification of recognition criteria for critical genetic factors, (2) synthesis of new variables from enhanced geodata, (3) estimation of logit probability models, and (4) cutting of estimated logit probabilities to delineate exploration targets or resource areas. The methodology is demonstrated on the Walker Lake quadrangle of Nevada and California.

3.
The object of this study is to build a three-dimensional (3D) geometric model of the stratigraphic units of the margin of the Rhone River on the basis of geophysical investigations by a network of seismic profiles at sea. The geometry of these units is described by depth charts of each surface identified by seismic profiling, which is done by geostatistics. The modeling starts with a statistical analysis by which we determine the parameters that enable us to calculate the variograms of the identified surfaces. After having determined the statistical parameters, we calculate the variograms of the variable Depth. By analyzing the behavior of the variogram we then can deduce whether the situation is stationary and whether the variable behaves anisotropically. We tried the following two nonstationary methods to obtain our estimates: (a) the method of universal kriging, if the underlying variogram was directly accessible; (b) the method of increments, if the underlying variogram was not directly accessible. After having modeled the variograms of the increments and of the variable itself, we calculated the surfaces by kriging the variable Depth on a small-mesh estimation grid. The two methods then are compared and their respective advantages and disadvantages are discussed, as well as their fields of application. These methods are capable of being used widely in earth sciences for automatic mapping of geometric surfaces or for variables such as a piezometric surface or a concentration, which are not stationary, that is, essentially, possess a gradient or a tendency to develop systematically in space.
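The variogram-analysis step described above can be sketched with the classical (Matheron) estimator. The coordinates, the linear trend in Depth, and the lag bins below are synthetic illustration values; the rising variogram reflects the non-stationarity (trend) that the abstract discusses:

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    """Classical semivariogram estimator: gamma(h) is half the mean
    squared increment over all point pairs whose separation distance
    lies within tol of the lag h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    gammas = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)  # each pair once
        gammas.append(0.5 * sq[mask].mean())
    return np.array(gammas)

# synthetic Depth data with a spatial trend (a non-stationary variable)
rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, (200, 2))
depth = 0.05 * coords[:, 0] + rng.normal(0, 1.0, 200)
gamma = empirical_variogram(coords, depth, lags=[5, 20, 50], tol=5)
```

A variogram that keeps climbing with lag, as here, is the signal that the stationarity assumption fails and that universal kriging or the method of increments is needed.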

4.
Categorical spatial data, such as land use classes and socioeconomic statistics, are important data sources in geographical information science (GIS). The investigation of spatial patterns implied in these data can benefit many aspects of GIS research, such as classification of spatial data, spatial data mining, and spatial uncertainty modeling. However, the discrete nature of categorical data limits the application of traditional kriging methods widely used in Gaussian random fields. In this article, we present a new probabilistic method for modeling the posterior probability of class occurrence at any target location in space, given known class labels at source data locations within a neighborhood around that prediction location. In the proposed method, transition probabilities rather than indicator covariances or variograms are used as measures of spatial structure, and the conditional or posterior (multi-point) probability is approximated by a weighted combination of preposterior (two-point) transition probabilities, while accounting for spatial interdependencies often ignored by existing approaches. In addition, the connections of the proposed method with probabilistic graphical models (Bayesian networks) and the weights of evidence method are discussed. The advantages of the proposed approach are analyzed and highlighted through a case study involving the generation of spatial patterns via sequential indicator simulation.
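The core approximation, a posterior built from a weighted combination of two-point transition probabilities, can be sketched as follows. The neighbour transition probabilities and weights are invented illustration values, and the exact weighting scheme of the paper is not reproduced here:

```python
import numpy as np

def approx_posterior(trans_probs, weights):
    """Approximate the multi-point posterior for each class as a
    weighted combination of two-point transition probabilities from
    the neighbouring source data, then renormalise to sum to one.

    trans_probs[i, c] = P(class c at target | class observed at neighbour i)
    weights[i]        = weight of neighbour i (e.g. distance-based)
    """
    p = np.average(trans_probs, axis=0, weights=weights)
    return p / p.sum()

# hypothetical: three neighbours informing a two-class prediction
trans_probs = np.array([[0.7, 0.3],
                        [0.6, 0.4],
                        [0.2, 0.8]])
weights = np.array([0.5, 0.3, 0.2])  # e.g. nearer neighbours weigh more
post = approx_posterior(trans_probs, weights)
```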

5.
Binary predictor patterns of geological features are integrated based on a probabilistic approach known as weights of evidence modeling to predict gold potential. In weights of evidence modeling, the natural logarithm of the posterior odds of a mineral occurrence in a unit cell is obtained by adding a weight, W+ or W−, for the presence or absence of a binary predictor pattern, to the natural logarithm of the prior odds. The weights are calculated as natural-log ratios of conditional probabilities. The contrast, C = W+ − W−, provides a measure of the spatial association between the occurrences and the binary predictor patterns. Addition of the weights of the input binary predictor patterns results in an integrated map of posterior probabilities representing gold potential. Combining the input binary predictor patterns assumes that they are conditionally independent of one another with respect to the occurrences.
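The weight and posterior calculations described above can be sketched directly from the stated definitions (W+ and W− as natural-log ratios of conditional probabilities, weights added to the prior logit). The cell counts below are hypothetical illustration values:

```python
import math

def woe_weights(n11, n10, n01, n00):
    """Weights of evidence for one binary predictor pattern.
    n11: cells with pattern and deposit; n10: pattern, no deposit;
    n01: no pattern, deposit;           n00: no pattern, no deposit.
    W+ = ln[P(B|D)/P(B|~D)],  W- = ln[P(~B|D)/P(~B|~D)]."""
    w_plus = math.log((n11 / (n11 + n01)) / (n10 / (n10 + n00)))
    w_minus = math.log((n01 / (n11 + n01)) / (n00 / (n10 + n00)))
    return w_plus, w_minus

def posterior_prob(prior, weights):
    """Add the weights to the prior logit, then invert the logit."""
    logit = math.log(prior / (1 - prior)) + sum(weights)
    return 1 / (1 + math.exp(-logit))

# hypothetical counts for two (assumed conditionally independent) patterns
wp1, wm1 = woe_weights(n11=40, n10=200, n01=10, n00=750)
wp2, wm2 = woe_weights(n11=30, n10=300, n01=20, n00=650)
prior = 50 / 1000                      # 50 deposits in 1000 unit cells
p = posterior_prob(prior, [wp1, wp2])  # both patterns present
```

The contrast C = W+ − W− is then simply `wp1 - wm1` for the first pattern; a positive contrast indicates positive spatial association.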

6.
In binary spatial pattern recognition, there are many situations in which the researcher is interested in a number of dependent variables that are themselves correlated. For instance, different types of crime often coexist in the same area, or different species may share the same habitat. In cases like these, a natural correlation exists amongst the dependent variables of interest and is informative for spatial probability mapping. Weights of evidence (WE) modelling is a popular Bayesian probability method for binary pattern recognition, but it deals with only a single dependent variable at a time and ignores the correlation between the dependent variables. In this article, a multiple dependent variable weights of evidence (MDVWE) model will be developed. It will be shown that the new MDVWE model can be viewed as a restricted version of the conditional dependence-adjusted weights of evidence (CDAWE) model of Deng (Nat Resour Res 18(4):249–258, 2009). The MDVWE model is easy to program and implement. By means of a simulation study, it will be shown that the MDVWE model outperforms the traditional WE model both in terms of in-sample fit and out-of-sample prediction accuracy.

7.
Two widely used techniques to estimate the volume of remaining oil and gas resources are discovery process modeling and geologic assessment. Both were used in a recent national assessment of oil and gas resources of the United States, and parallel estimates were obtained for 27 provinces. Geologically based estimates can typically see into areas not available to discovery process models (that is, areas with little or no exploration history) and thus, on average, yield higher estimates. However, a linear relation does exist between the mean estimates obtained from these two methods. In addition, other variables were found in a multiple regression model that explained much of the difference. Thus, it is possible to perform discovery process modeling and adjust the estimates to yield results that might be expected from geologically based assessments.

8.
Quantitative mineral resource assessments used by the United States Geological Survey are based on deposit models. These assessments consist of three parts: (1) selecting appropriate deposit models and delineating on maps areas permissive for each type of deposit; (2) constructing a grade-tonnage model for each deposit model; and (3) estimating the number of undiscovered deposits of each type. In this article, I focus on the estimation of undiscovered deposits using two methods: the deposit density method and the target counting method. In the deposit density method, estimates are made by analogy with well-explored areas that are geologically similar to the study area and that contain a known density of deposits per unit area. The deposit density method is useful for regions where there is little or no data. This method was used to estimate undiscovered low-sulfide gold-quartz vein deposits in Venezuela. Estimates can also be made by counting targets such as mineral occurrences, geophysical or geochemical anomalies, or exploration plays, and by assigning to each target a probability that it represents an undiscovered deposit that is a member of the grade-tonnage distribution. This method is useful in areas where detailed geological, geophysical, geochemical, and mineral occurrence data exist. Using this method, porphyry copper-gold deposits were estimated in Puerto Rico.
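The deposit density method reduces to a simple proportionality, which the worked numbers below illustrate. All figures (analog deposit count, areas, known deposits) are hypothetical, not values from the assessments cited:

```python
# Deposit-density estimate: by analogy with a well-explored, geologically
# similar region, the expected total number of deposits in the study area
# is the analog's deposits-per-unit-area times the permissive area.
analog_deposits = 18        # hypothetical well-explored analog region
analog_area_km2 = 60_000
study_area_km2 = 25_000     # hypothetical permissive area being assessed
known_deposits = 3          # already discovered in the study area

density = analog_deposits / analog_area_km2        # deposits per km^2
expected_total = density * study_area_km2          # 7.5 deposits
expected_undiscovered = expected_total - known_deposits
```

In practice the point estimate is usually wrapped in a probability distribution (e.g. percentiles of a count distribution) rather than reported as a single number.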

9.
A Conditional Dependence Adjusted Weights of Evidence Model   (cited 3 times: 0 self-citations, 3 by others)
One of the key assumptions in weights of evidence (WE) modelling is that the predictor patterns have to be conditionally independent. When this assumption is violated, WE posterior probability estimates are likely to be biased upwards. In this paper, a formal expression for the bias of the contrasts will be derived. It will be shown that this bias has an intuitive and convenient interpretation. A modified WE model will then be developed, where the bias is corrected using the correlation structure of the predictor patterns. The new model is termed the conditional dependence adjusted weights of evidence (CDAWE) model. It will be demonstrated via a simulation study that the CDAWE model significantly outperforms the existing WE model when conditional independence is violated, and it is on par with logistic regression, which does not assume conditional independence. Furthermore, it will be argued that, in the presence of conditional dependence between predictor patterns, weights variance estimates from WE are likely to understate the true level of uncertainty. It will be argued that weights variance estimates from CDAWE, which are also bias-corrected, can properly address this issue.

10.
This paper proposes a new weights-of-evidence approach based on fuzzy sets and fuzzy probabilities for mineral potential mapping. It can be considered a generalization of the ordinary weights of evidence method, which is based on binary or ternary patterns of evidence and has been used in conjunction with geographic information systems for mineral potential mapping during the past few years. In the newly proposed method, instead of separating evidence into binary or ternary form, fuzzy sets containing more subjective genetic elements are created; fuzzy probabilities are defined to construct a model for calculating the posterior probability of a unit area containing mineral deposits on the basis of the fuzzy evidence for the unit area. The method can be treated as a hybrid method, which allows objective or subjective definition of a fuzzy membership function of evidence augmented by objective definition of fuzzy or conditional probabilities. Posterior probabilities calculated by this method depend on existing data in a totally data-driven approach, but depend partly on the expert's knowledge when the hybrid method is used. A case study for demonstration purposes consists of application of the method to gold deposits in Meguma Terrane, Nova Scotia, Canada.

11.
Oxygen isotopes and geochemistry from lake sediments are commonly used as proxies of past hydrologic and climatic conditions, but the importance of present-day hydrologic processes in controlling these proxies is sometimes not well established and understood. Here we use present-day hydrochemical data from 13 lakes in a hydrologically connected lake chain in the northern Great Plains (NGP) to investigate isotopic and solute evolution along a hydrologic gradient. The δ18O and δ2H of water from the chain of lakes, when plotted in δ2H–δ18O space, form a line with a slope of 5.9, indicating that these waters fall on an evaporation trend. However, 10 of the 13 lakes are isotopically similar (δ18O = −6 ± 1‰ VSMOW) and show no correlation with salinity (which ranges from 1 to 65). The lack of correlation implies that the isotopic composition of the various source waters, rather than in-lake evaporation, is the main control on the δ18O of the lakes. Groundwater, an important input in the water budget of this chain of lakes, has a lower δ18O value (−16.7‰ in 1998) than that of mean annual precipitation (−11‰) owing to selective recharge from snow melt. For the lakes in this chain with salinity < 15, the water Mg/Ca ratios are strongly correlated with salinity, whereas Sr/Ca is not. The poor correlation between Sr/Ca and salinity results from uptake of Sr by endogenic aragonite. These new results indicate that δ18O records may not be interpreted simply in terms of climate in the NGP, and that local hydrology needs to be adequately investigated before a meaningful interpretation of sedimentary records can be reached.
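The evaporation-line slope quoted above comes from a simple least-squares fit of δ2H against δ18O. The paired values below are invented to lie near a slope of 5.9 and are not the study's measurements:

```python
import numpy as np

# hypothetical paired lake-water isotope measurements (per mil, VSMOW)
d18O = np.array([-7.1, -6.5, -6.0, -5.2, -4.0, -2.5])
d2H = np.array([-72.0, -68.5, -65.6, -60.9, -53.8, -44.9])

# Least-squares line in d2H-d18O space: evaporated waters plot along a
# local evaporation line whose slope (here ~5.9) falls well below the
# slope of ~8 of the global meteoric water line.
slope, intercept = np.polyfit(d18O, d2H, 1)
```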

12.
In the oil industry, uncertainty about hydrocarbon volumes in undrilled prospects generally is expressed as an expectation curve. The curve indicates the probability of exceeding a given amount. After drilling a number of prospects conclusively, that is, once we know the amount of reserves in the targets, if any, the question arises about the validity of the prediction. Since the prediction was in the form of a probability distribution, the comparison with a single actual outcome of the process is not straightforward. I propose a specific combination of mainly well-known tests that can be applied in this hindsight analysis to address the following: (1) the measure of location or expectation, (2) the probability of success, (3) the shape of the distribution of the nonzero outcomes or success cases, and (4) a measure of rank correlation between predictions and outcomes. Even small numbers of drilled structures may suffice for obtaining conclusive results. Such statistical analysis provides useful feedback for those concerned with the maintenance and control of the prediction system.
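Item (4), rank correlation between predictions and outcomes, can be sketched with a Spearman coefficient computed from scratch. The predicted and actual volumes below are hypothetical illustration values (and the sketch assumes no tied values):

```python
import numpy as np

def spearman_rho(pred, actual):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Assumes no tied values, which keeps the ranking trivial."""
    def ranks(x):
        order = np.argsort(x)
        r = np.empty(len(x))
        r[order] = np.arange(1, len(x) + 1)
        return r
    return np.corrcoef(ranks(pred), ranks(actual))[0, 1]

# hypothetical predicted mean volumes vs drilled outcomes
predicted = np.array([120.0, 45.0, 80.0, 10.0, 200.0, 60.0])
actual = np.array([90.0, 30.0, 110.0, 0.0, 150.0, 55.0])
rho = spearman_rho(predicted, actual)
```

A rank correlation near 1 says the prediction system at least orders the prospects correctly, even if the absolute volumes are off.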

13.
Conditional Independence Test for Weights-of-Evidence Modeling   (cited 19 times: 0 self-citations, 19 by others)
Weights-of-evidence modeling is a GIS-based technique for relating a point pattern for locations of discrete events with several map layers. In general, the map layers are binary or ternary. Weights for presence, absence or missing data are added to a prior logit. Updating with two or more map layers is allowed only if the map layers are approximately conditionally independent of the point pattern. The final product is a map of posterior probabilities of occurrence of the discrete event within a small unit cell. This paper contains a formal proof that conditional independence of map layers implies that T, the sum of the posterior probabilities weighted according to unit cell area, is equal to n, the total number of discrete events. This result is used in the overall or omnibus test for conditional independence. In practical applications, T generally exceeds n, indicating a possible lack of conditional independence. Estimation of the standard deviation of T allows performance of a one-tailed test to check whether or not T − n is significantly greater than zero. This new test is exact and simpler to use than other tests, including the Kolmogorov-Smirnov test and various chi-squared tests adapted from discrete multivariate statistics.
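The omnibus test reduces to a one-tailed z-test on T − n once the standard deviation of T is in hand. A minimal sketch follows; the posterior values, event count, and standard deviation are hypothetical, and estimating sd(T) itself (which the paper addresses) is assumed done elsewhere:

```python
import math

def omnibus_ci_test(posteriors, n_events, sd_T):
    """Overall ("omnibus") conditional-independence check: under
    conditional independence, E[T] = n, where T is the sum of the
    unit-cell posterior probabilities and n the number of events.
    Returns T, the z statistic, and a one-tailed p-value for T - n > 0."""
    T = sum(posteriors)
    z = (T - n_events) / sd_T
    p = 0.5 * math.erfc(z / math.sqrt(2))  # standard normal survival fn
    return T, z, p

# hypothetical: 1000 unit cells whose posteriors sum to 60, against
# n = 50 observed events, with an assumed sd(T) of 4.0
T, z, p = omnibus_ci_test([0.06] * 1000, n_events=50, sd_T=4.0)
```

Here T exceeds n by 2.5 standard deviations, so conditional independence would be rejected at the usual levels.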

14.
The basal portion of the Ogallala Formation (= Laverne Formation; Lower Pliocene), Beaver County, Oklahoma, contains an interesting assemblage of non-marine fossil molluscs that includes both spinose and non-spinose forms of the aquatic gastropod species Pyrgophorus hibbardi. The origin and paleolimnological significance of the spinose morph have been a source of much conjecture that has influenced environmental reconstructions of this assemblage. In one hypothesis the spinose forms of P. hibbardi are assumed to be associated with brackish-water conditions by analogy with some populations of a related hydrobiid, Potamopyrgus jenkinsi. To test the hypothesis that the spinose forms lived under different water conditions than the non-spinose morphs, we analyzed 10 specimens each of the two varieties for stable oxygen and carbon isotope ratios in the shell aragonite. The mean isotope ratios for the smooth and spinose morphs show no significant difference (oxygen: t = 0.28, df = 18, P = 0.78, n.s.; carbon: t = 0.96, df = 18, P = 0.35, n.s.). We conclude that the lack of a statistically significant difference between the means of the oxygen and carbon isotope values for the smooth and spinose morphs suggests that the two forms lived in waters having similar isotope signatures. The considerable range in oxygen isotope values recorded by both morphs of P. hibbardi, including values as high as 5–6‰, suggests that both morphs were associated with waters which were periodically evaporatively enriched in 18O.
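The comparison reported above (t statistics with df = 18 for two groups of 10) is a pooled-variance two-sample t-test, sketched below. The isotope values are invented illustration data chosen to give similar group means, not the measured shell values:

```python
import math

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance, df = n1 + n2 - 2,
    as used to compare the smooth and spinose morph isotope means."""
    n1, n2 = len(a), len(b)
    m1, m2 = sum(a) / n1, sum(b) / n2
    s1 = sum((x - m1) ** 2 for x in a) / (n1 - 1)  # sample variances
    s2 = sum((x - m2) ** 2 for x in b) / (n2 - 1)
    sp = math.sqrt(((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2))
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# hypothetical d18O values for 10 shells of each morph
smooth = [2.1, 3.4, 4.0, 5.1, 2.8, 3.9, 4.6, 5.8, 3.2, 4.4]
spinose = [2.4, 3.1, 4.2, 5.5, 2.6, 3.7, 4.9, 5.6, 3.0, 4.1]
t, df = pooled_t(smooth, spinose)
```

A small |t| at df = 18, as here, is exactly the "no significant difference" outcome the abstract describes.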

15.
During the late Wisconsin, glacial flour from alpine glaciers along the east side of the Cascade Range in southern Oregon was deposited in Upper Klamath Lake. Quantitative interpretation of magnetic properties and grain-size data of cored sediments from Caledonia Marsh on the west side of the lake provides a continuous record of the flux of glacial flour spanning the last 37 000 calendar years. For modeling purposes, the lake sediments from the 13-m core were divided into three sedimentary components defined from magnetic, geochemical, petrographic, and grain-size data. The components are (1) strongly magnetic glacial flour made up of extremely fine-grained, fresh volcanic rock particles, (2) less magnetic lithic material made up of coarser, weathered volcanic detritus, and (3) non-magnetic biogenic material (largely biogenic silica). Quantitative interpretation is possible because there has been no significant postdepositional destruction or formation of magnetic minerals, nor alteration affecting grain-size distributions. Major steps involved in the interpretation include: (1) computation of biogenic and lithic components; (2) determination of magnetic properties and grain-size distributions of the non-glacial and glacial flour end-members; (3) computation of the contents of weathered and glacial flour components for each sample; (4) development of an age model based on the mass accumulation of the non-glacial lithic component; and (5) use of the age model and glacial flour contents to compute the flux of glacial flour. Comparison of the glacial flour record from Upper Klamath Lake to mapped glacial features suggests a nearly linear relation between flux of glacial flour and the extent of nearby glaciers. At 22 ka, following an extended period during which glaciers of limited size waxed and waned, late Wisconsin (Waban) glaciers began to grow, reaching their maximum extent at 19 ka. Glaciers remained near their maximum extent for 1000 years. During this period, lake sediments were made up of 80% glacial flour. The content of glacial flour decreased as the glaciers receded, and reached undetectable levels by 14 ka.

16.
Favorability methods produce a unique measure for mineral potential mapping and quantitative estimation of mineral resources. Indicator favorability theory is developed in this study to account for spatial (auto- and cross-) correlations of regionalized geological, geochemical, and geophysical fields based on the indicator concept. Target and explanatory indicators are introduced to describe, respectively, direct and indirect evidence of the mineralization of interest. Mineralization is represented by a combination of a set of target indicators. Indicator favorability theory estimates a regionalized favorability function in two stages: (1) estimate a linear combination of target indicators by maximizing its variance, and (2) estimate the favorability function F by minimizing the variance of the estimation error between F and that combination. The model is established on the basis of a conceptual model of the target. The favorability estimates can be justified by correlation analysis and cross-validation in control areas. The indicator favorability theory is demonstrated in a case study of gold-silver mineral potential mapping based on geophysical, structural, and geochemical fields.

17.
In a previous paper we attempted to assess the contribution of red bacteria of the Halobacterium–Haloferax–Haloarcula group and of the β-carotene-rich green alga Dunaliella salina to the red colour of saltern crystallizer ponds. By means of light absorption measurements, we showed that the bacterioruberin contained in the bacteria was mainly responsible for the colour of the brines, in spite of the fact that β-carotene derived from Dunaliella was the pigment present in the greatest amount. This apparent discrepancy was explained by the very small in vivo optical cross-section of β-carotene, which is densely packed in globules inside the D. salina cells. We recently observed that the centrifugation technique used in the previous study to collect biomass from the ponds was unsuitable for this type of measurement, as a substantial part of the Dunaliella cells present did not sediment upon centrifugation owing to the low specific gravity caused by the high β-carotene content. Therefore similar measurements were performed with biomass collected by filtration. Again, in vivo absorption spectra were dominated by the absorption peaks of bacterioruberin. The results reported here show that, in spite of the methodological problem associated with the earlier study, all views and conclusions expressed in our earlier paper retain their validity.

18.
A portion of a sedimentary basin is subdivided conceptually into hexagons of equal area. The area of each hexagon is equal to the minimum area an oil field should have to be commercial. Hexagons can be full of oil or empty. A field of size 1 consists of a cell with oil surrounded by six empty cells; a field of size 2 consists of two adjacent cells with oil surrounded by eight empty cells, and so on. Principles of percolation theory are used to determine the probability distribution of the areas of the oil fields existing in this portion of the basin. The only piece of information necessary to determine this probability distribution is the success ratio (number of successful exploration wells / total number of exploration wells drilled in this portion of the basin). This approach has several practical applications. A probabilistic model is introduced to predict to what extent potential oil traps are filled with oil. The model assumes that the probability that an oil unit will end up in a particular trap is proportional to the surface area of the trap. The model predicts that, independently of the distribution of the trap volumes, there will be a critical trap volume. All the traps having a volume less than this critical volume will be filled to spill point. An equation is deduced to predict, for all traps having a volume greater than the critical volume, the volume of oil that can be encountered in the trap, provided the volume of the trap is known.
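The trap-filling model can be explored with a small Monte Carlo sketch: oil units land in traps with probability proportional to surface area, and a trap that reaches its volume stays at spill point. This is an illustrative simulation under stated assumptions (hypothetical areas, volumes, and total oil; overflowed oil is simply discarded), not the paper's analytical equation:

```python
import random

def fill_traps(areas, volumes, total_oil, n_units=100_000, seed=7):
    """Monte Carlo sketch of the trap-filling model: each oil unit
    migrates into a trap with probability proportional to the trap's
    surface area; once a trap reaches its volume (spill point), any
    further oil into it is lost in this simplified sketch."""
    unit = total_oil / n_units
    total_area = sum(areas)
    cum, acc = [], 0.0
    for a in areas:
        acc += a / total_area
        cum.append(acc)
    cum[-1] = 1.0  # guard against floating-point round-off
    fills = [0.0] * len(areas)
    rng = random.Random(seed)
    for _ in range(n_units):
        u = rng.random()
        i = next(k for k, c in enumerate(cum) if u <= c)
        fills[i] = min(volumes[i], fills[i] + unit)
    return fills

# hypothetical trap geometries: the small trap sits below the critical
# volume and fills to spill point; the larger traps end up partly filled
areas = [1.0, 2.0, 4.0]
volumes = [5.0, 30.0, 50.0]
fills = fill_traps(areas, volumes, total_oil=70.0)
```

With these numbers the smallest trap receives an expected 10 units of oil against a 5-unit capacity, so it sits at spill point, reproducing the model's critical-volume behaviour.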

19.
We investigated paleolimnological records from a series of river deltas around the northeastern rim of Lake Tanganyika, East Africa (Tanzania and Burundi), in order to understand the history of anthropogenic activity in the lake's catchment over the last several centuries, and to determine the impact of these activities on the biodiversity of littoral and sublittoral lake communities. Sediment pollution caused by increased rates of soil erosion in deforested watersheds has caused significant changes in aquatic communities along much of the lake's shoreline. We analyzed the effects of sediment discharge on biodiversity around six deltas or delta complexes on the east coast of Lake Tanganyika: the Lubulungu River delta, Kabesi River delta, Nyasanga/Kahama River deltas, and Mwamgongo River delta in Tanzania; and the Nyamuseni River delta and Karonge/Kirasa River deltas in Burundi. Collectively, these deltas and their associated rivers were chosen to represent a spectrum of drainage-basin sizes and disturbance levels. By comparing deltas that are similar in watershed attributes (other than disturbance levels), our goal was to explore a series of historical experiments at the watershed scale, with which we could more clearly evaluate hypotheses of land use or other effects on nearshore ecosystems. Here we discuss these deltas, their geologic and physiographic characteristics, the field procedures used for coring and sampling the deltas, and various indicators of anthropogenic impact.

20.
A combined bulk and detailed geochemical study of the sedimentary organic matter in Lake Albano, central Italy, provides critical data to track the response of this aquatic system to the environmental changes of variable amplitude that occurred during the Holocene. Rock-Eval pyrolysis of this predominantly laminated, organic carbon-rich sedimentary sequence shows changes in hydrogen and oxygen indices that are related to variations in the dominance of the primary producers. These variations are further confirmed by the pigments and the carbon isotopic composition of bulk organic matter, showing that cyanobacteria dominated the lake waters during the early and late Holocene, whereas diatoms were the main producers during the middle Holocene. Sharp decreases in productivity, 2–3 centuries long, are identified at ca. 8.2, 6.4 and 3.8 ka BP. Changes in temperature and/or effective moisture are suggested as the most probable causes, although human impact cannot be ruled out for the latest part of the Holocene.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号