1.
The National Park Service needs to establish, for each national park, how large parking lots should be so that visitors can enjoy natural resources while those resources are preserved; Delicate Arch in Arches National Park is an example. Probabilistic and statistical relationships were developed between the number of vehicles (N) at one time in the Wolfe Ranch parking lot and the number of visitors (X) at Delicate Arch, 1.5 miles away, in Arches National Park, southeastern Utah. The value of N is determined such that 30 or more visitors are at the arch only 10% of the time.
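A minimal sketch of the kind of calculation the abstract describes: choose the largest N for which the 10% crowding criterion still holds. The Poisson link between vehicles and arch visitors and the rate r are assumptions made only for illustration; the paper's fitted relationship is not reproduced here.

```python
# Hypothetical sketch: pick the largest parking-lot occupancy N such that the
# chance of 30 or more visitors being at the arch at one time is at most 10%.
# The Poisson model and the rate r are assumptions, not the paper's fit.
from scipy.stats import poisson

r = 1.6  # assumed mean visitors at the arch per vehicle parked at Wolfe Ranch

def prob_crowded(n_vehicles, rate=r, threshold=30):
    """P(X >= threshold) when X | N ~ Poisson(rate * N)."""
    return poisson.sf(threshold - 1, rate * n_vehicles)

n = 0
while prob_crowded(n + 1) <= 0.10:   # stop before the criterion is violated
    n += 1
print(n, prob_crowded(n))
```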
2.
Probabilistic methodology used by the U.S. Geological Survey is described for estimating the quantity of undiscovered recoverable conventional resources of oil and gas in the United States. A judgmental probability distribution of the quantity of resource and its properties is determined for a geologic province or basin. From this distribution, point and interval estimates of the quantity of undiscovered resource are obtained. Distributions and their properties are established for each of the following resources: (1) oil and nonassociated gas from estimates of the probability of the resource being present and the conditional probability distribution of the quantity of resource given that the resource is present, (2) associated-dissolved gas from its corresponding oil distribution, (3) total gas, (4) oil and total gas in two or more provinces. Computer graphics routines are illustrated with examples from the U.S. Geological Survey Circular 860.
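A hedged illustration of the construction the abstract outlines: the unconditional resource distribution is a mixture of a point mass at zero (resource absent) and the conditional distribution given presence, and associated-dissolved gas is derived from the oil distribution. All numbers are invented, and the simulation is only a convenient way to display the mixture, not the Survey's computational method.

```python
# Sketch of the mixture construction: oil = 0 with probability (1 - p_present),
# otherwise a draw from a judgmental conditional distribution; associated gas
# follows from an assumed gas-oil ratio. Numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
p_present = 0.7                                   # probability the resource is present
n = 100_000

present = rng.random(n) < p_present
oil_given_present = rng.lognormal(mean=0.5, sigma=0.8, size=n)   # billions of barrels, assumed
oil = np.where(present, oil_given_present, 0.0)   # unconditional mixture
gas_assoc = 0.6 * oil                             # assumed gas-oil ratio, Tcf per BBO

for name, x in [("oil", oil), ("associated gas", gas_assoc)]:
    lo, hi = np.percentile(x, [5, 95])
    print(f"{name}: mean {x.mean():.2f}, 5th {lo:.2f}, 95th {hi:.2f}")
```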
3.
An objective replacement method for censored geochemical data
Geochemical data are commonly censored; that is, concentrations for some samples are reported only as less than or greater than some value. Censored data hamper statistical analysis because many computational techniques require a complete set of uncensored values. We show that the simple substitution method for creating an uncensored dataset, e.g., replacement by 3/4 times the detection limit, has serious flaws, and we present an objective method to determine the replacement value. Our basic premise is that the replacement value should equal the mean of the actual values represented by the qualified data. We adapt the maximum likelihood approach (Cohen, 1961) to estimate this mean. This method reproduces the mean and skewness as well as or better than a simple substitution method using 3/4 of the lower detection limit or 3/4 of the upper detection limit. For a small proportion of less-than substitutions, a simple-substitution replacement factor of 0.55 is preferable to 3/4; for a small proportion of greater-than substitutions, a simple-substitution replacement factor of 1.7 is preferable to 4/3, provided the resulting replacement value does not exceed 100%. For more than 10% replacement, a mean empirical factor may be used. However, empirically determined simple-substitution replacement factors usually vary among different data sets and are less reliable with more replacements. Therefore, a maximum likelihood method is superior in general. Theoretical and empirical analyses show that true replacement factors for less-thans decrease in magnitude with more replacements and larger standard deviation; those for greater-thans increase in magnitude with more replacements and larger standard deviation. In contrast to any simple substitution method, the maximum likelihood method reproduces these variations. Using the maximum likelihood method for replacing less-thans in our sample data set, correlation coefficients were estimated reasonably accurately in 90% of the cases for as much as 40% replacement and in 60% of the cases for 80% replacement. These results suggest that censored data can be utilized more than is commonly realized.
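A sketch of the maximum-likelihood idea, in the spirit of the Cohen (1961) approach the abstract cites but not the authors' exact procedure: fit a lognormal to data left-censored at a detection limit, then take the replacement value as the mean of the censored portion.

```python
# Censored-data MLE sketch: observed log-concentrations contribute density
# terms, values reported as "< detection limit" contribute CDF terms; the
# replacement value is the mean of the fitted lognormal truncated above at
# the detection limit. Illustrative only.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def fit_censored_lognormal(observed, n_censored, detection_limit):
    """MLE of (mu, sigma) on the log scale with n_censored values below detection_limit."""
    logs = np.log(observed)
    log_dl = np.log(detection_limit)

    def neg_loglik(params):
        mu, log_sigma = params
        sigma = np.exp(log_sigma)
        ll = norm.logpdf(logs, mu, sigma).sum()            # fully observed values
        ll += n_censored * norm.logcdf(log_dl, mu, sigma)  # "< detection limit" values
        return -ll

    res = minimize(neg_loglik, x0=[logs.mean(), np.log(logs.std() + 1e-3)])
    mu, sigma = res.x[0], np.exp(res.x[1])
    # Replacement value = mean of the lognormal truncated above at the detection limit
    a = (log_dl - mu) / sigma
    replacement = np.exp(mu + 0.5 * sigma**2) * norm.cdf(a - sigma) / norm.cdf(a)
    return replacement, mu, sigma

# Example: three reported concentrations plus two results qualified as "< 0.5"
print(fit_censored_lognormal(np.array([0.8, 1.2, 2.5]), n_censored=2, detection_limit=0.5))
```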
4.
The U.S. Geological Survey periodically makes appraisals of the oil and gas resources of the Nation. In its 1995 National Assessment the onshore areas and adjoining State waters of the Nation were assessed. As part of the 1995 National Assessment, 274 conventional oil plays and 239 conventional nonassociated-gas plays were assessed. The two datasets of estimates studied herein are the following: (1) the mean, undiscovered, technically recoverable oil resources estimated for each of the 274 conventional oil plays, and (2) the mean, undiscovered, technically recoverable gas resources estimated for each of the 239 conventional nonassociated-gas plays. It was found that the two populations of petroleum estimates are both distributed approximately as lognormal distributions. Fractal lognormal percentage theory is developed and applied to the two populations of petroleum estimates. In both cases the theoretical percentages of total resources using the lognormal distribution are extremely close to the empirical percentages from the oil and nonassociated-gas data. For example, 20% of the 274 oil plays account for 73.05% of the total oil resources of the plays if the lognormal distribution is used, or for 75.52% if the data are used; 20% of the 239 nonassociated-gas plays account for 76.32% of the total nonassociated-gas resources of the plays if the lognormal distribution is used, or for 78.87% if the data are used.
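A sketch of the lognormal percentage calculation behind statements such as "20% of the plays account for about 73% of the resource": for a lognormal population with log-standard deviation sigma, the largest fraction p of plays holds Phi(Phi^-1(p) + sigma) of the total (a standard Lorenz-curve result). The sigma below is illustrative, chosen only so the top-20% share lands near 73%; it is not the value fitted to the 274 oil plays.

```python
# Theoretical vs. empirical share of total resources held by the top 20% of
# plays, assuming play sizes are lognormal. Parameters are illustrative.
import numpy as np
from scipy.stats import norm, lognorm

sigma = 1.45            # assumed log-standard deviation of play-size estimates
p = 0.20                # top 20% of plays

theoretical_share = norm.cdf(norm.ppf(p) + sigma)

# Empirical check on simulated "play estimates"
rng = np.random.default_rng(0)
plays = lognorm.rvs(s=sigma, scale=np.exp(3.0), size=274, random_state=rng)
top = np.sort(plays)[::-1][: int(p * plays.size)]
empirical_share = top.sum() / plays.sum()

print(f"theoretical {theoretical_share:.3f}, empirical {empirical_share:.3f}")
```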
5.
The U.S. Geological Survey assessed all significant sedimentary basins in the world for undiscovered conventionally recoverable crude-oil resources. Probabilistic methodology was applied to each basin assessment to produce estimates in the form of probability distributions. Basin probability distributions were computer aggregated to produce resource estimates for the entire world. Aggregation was approximated by a three-parameter lognormal distribution by combining the first three central moments of the basin distributions. For purposes of experiment and study, world aggregation was conducted under four different sets of assumptions. The four cases are (1) dependent assessments of all basins, (2) dependent assessments within continental areas, but independent assessments among continental areas, (3) dependent assessments within countries, but independent assessments among countries, and (4) independent assessments of all basins. The mean estimate remained the same in all four cases, but the width of the interval estimate formed using the 95th and 5th fractiles decreased with reduced dependency, in going from the first to the fourth case.
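A sketch of the moment-combination step under the fully independent case (case 4): means, variances, and third central moments of the basin distributions are added, and a three-parameter (shifted) lognormal is matched to the aggregate moments. The basin moments below are invented; the dependent cases combine moments differently and are not shown.

```python
# Aggregate independent basin distributions by adding their first three
# central moments, then match a shifted lognormal X = shift + exp(mu + sigma*Z).
import numpy as np
from scipy.optimize import brentq

# (mean, variance, third central moment) for each basin, invented values
basins = [(10.0, 25.0, 150.0), (4.0, 9.0, 40.0), (7.0, 16.0, 90.0)]

mean = sum(m for m, v, t in basins)
var = sum(v for m, v, t in basins)       # variances add under independence
third = sum(t for m, v, t in basins)     # third central moments add under independence
skew = third / var**1.5

# Lognormal skewness = (w + 2) * sqrt(w - 1), with w = exp(sigma^2)
w = brentq(lambda w: (w + 2.0) * np.sqrt(w - 1.0) - skew, 1.0 + 1e-9, 50.0)
sigma = np.sqrt(np.log(w))
scale = np.sqrt(var / (w * (w - 1.0)))   # scale = exp(mu), from var = scale^2 * w * (w - 1)
shift = mean - scale * np.sqrt(w)        # since E[exp(mu + sigma*Z)] = scale * sqrt(w)
mu = np.log(scale)
print(mu, sigma, shift)
```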
6.
Fractal properties of the Pareto probability distribution are used to generalize “the 20/80 law.” The 20/80 law is a heuristic law that has evolved over the years into the following rule of thumb for many populations: 20 percent of the population accounts for 80 percent of the total value. The general p100/q100 law in probabilistic form is defined with q as a function of p, where p is the population proportion and q is the proportion of total value. Using the Pareto distribution, the p100/q100 law in fractal form is derived with the parameter q being a fractal, where q unexpectedly possesses the scale-invariance property. The 20/80 law is a special case of the p100/q100 law in fractal form. The p100/q100 law in fractal form is applied to petroleum field-size data to obtain p and q such that p100% of the oil fields greater than any specified scale or size in a geologic play account for q100% of the total oil of the fields. The theoretical percentages of total resources of oil using the fractal q are extremely close to the empirical percentages from the data using the statistic q. Also, the empirical scale-invariance property of the statistic q for the petroleum field-size data is in excellent agreement with the theoretical scale-invariance property of the fractal q.
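A sketch of the fractal form of the p100/q100 law for a Pareto field-size distribution: the largest fraction p of fields accounts for q = p^(1 - 1/alpha) of the total, and alpha = ln 5 / ln 4 (about 1.16) recovers the classic 20/80 case. The simulated field sizes are illustrative only.

```python
# p100/q100 law for a Pareto(alpha) field-size distribution, alpha > 1.
import numpy as np

def q_of_p(p, alpha):
    """Share of total field size held by the largest fraction p of fields."""
    return p ** (1.0 - 1.0 / alpha)

alpha = np.log(5.0) / np.log(4.0)    # ~1.161: the shape giving the 20/80 law
print(q_of_p(0.20, alpha))           # 0.80

# Empirical check against simulated Pareto field sizes (minimum size 1)
rng = np.random.default_rng(1)
sizes = (1.0 - rng.random(100_000)) ** (-1.0 / alpha)   # inverse-CDF sampling
top = np.sort(sizes)[::-1][:20_000]                     # largest 20% of fields
print(top.sum() / sizes.sum())       # near 0.80, though noisy for so heavy a tail
```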
7.
A geostochastic system called FASPF was developed by the U.S. Geological Survey for its 1989 assessment of undiscovered petroleum resources in the United States. FASPF is a fast appraisal system for petroleum play analysis that uses a field-size geological model and an analytic probabilistic methodology. The geological model is a particular type of probability model in which the volumes of oil and gas accumulations are modeled as statistical distributions in the form of probability histograms, and the risk structure is bilevel (play and accumulation) in terms of conditional probability. The probabilistic methodology is an analytic method derived from probability theory rather than Monte Carlo simulation. The resource estimates of crude oil and natural gas are calculated and expressed in terms of probability distributions. The probabilistic methodology developed by the author is explained. The analytic system resulted in a probabilistic methodology for play analysis, subplay analysis, economic analysis, and aggregation analysis. Subplay analysis included the estimation of petroleum resources in non-Federal offshore areas. Economic analysis involved truncation of the field size at a minimum economic cutoff value. Aggregation analysis was needed to aggregate individual play and subplay estimates of oil and gas, respectively, at the provincial, regional, and national levels.
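A hedged sketch of the analytic (non-Monte-Carlo) style of calculation the abstract describes: accumulation sizes and counts are carried as probability histograms, the play total is built by convolution, and the bilevel risk enters as a play-level probability. The histograms are invented and the code is not FASPF.

```python
# Analytic play calculation by convolving histograms instead of simulating.
import numpy as np

play_prob = 0.6                                   # play-level risk
size_pmf = np.array([0.0, 0.5, 0.3, 0.2])         # P(accumulation size = 0, 1, 2, 3 units)
count_pmf = {0: 0.2, 1: 0.5, 2: 0.3}              # P(number of accumulations | play favorable)

max_total = (size_pmf.size - 1) * max(count_pmf)  # largest possible play total
total = np.zeros(max_total + 1)
for n_acc, p_n in count_pmf.items():
    conv = np.array([1.0])                        # pmf of the sum of n_acc accumulation sizes
    for _ in range(n_acc):
        conv = np.convolve(conv, size_pmf)
    total[: conv.size] += p_n * conv

total *= play_prob                                # fold in play-level risk:
total[0] += 1.0 - play_prob                       # with prob 1 - play_prob the play yields nothing
print(total, total.sum())                         # distribution of play resources; sums to 1
```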
8.
Reserve growth refers to the typical increases in estimated sizes of fields that occur through time as oil and gas fields are developed and produced. Projections of the future reserve growth of known fields have become important components of hydrocarbon resource assessments. In this paper, we present an algorithm for estimating the future reserve growth of known fields. The algorithm, which incorporates fundamental reserve-growth assumptions used by others in the past, is programmed for a personal computer in the form of formulas for a spreadsheet. The primary advantages of this spreadsheet program lie in its simplicity and ease of use. We also present a library of 17 different growth functions that provides numerical models for predicting the future sizes of existing oil and gas fields in various regions of the United States. These growth functions are formatted for use in the spreadsheet program.
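A sketch of the reserve-growth bookkeeping such spreadsheet formulas encode: a cumulative growth function G(age) gives the expected ratio of a field's ultimate size to its early estimate, and a known field is projected forward by the ratio G(future age) / G(current age). The power-law G below is a hypothetical stand-in, not one of the paper's 17 library functions.

```python
# Project a known field's estimated size forward using a cumulative growth
# function. The power-law form and its coefficients are assumptions.

def growth_factor(age_years: float, a: float = 1.0, b: float = 0.35) -> float:
    """Cumulative growth factor at a given field age (assumed power-law form)."""
    return a * max(age_years, 1.0) ** b

def projected_size(current_size: float, current_age: float, future_age: float) -> float:
    """Project a field's estimated size from its current age to a future age."""
    return current_size * growth_factor(future_age) / growth_factor(current_age)

# Example: a field estimated at 120 units after 10 years, projected 30 years ahead
print(projected_size(120.0, current_age=10.0, future_age=40.0))
```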
9.
The U.S. Geological Survey recently assessed undiscovered conventional gas and oil resources in eight regions of the world outside the U.S. The resources assessed were those estimated to have the potential to be added to reserves within the next thirty years. This study is a worldwide analysis of the estimated volumes and distribution of deep (>4.5 km or about 15,000 ft), undiscovered conventional natural gas resources based on this assessment. Two hundred forty-six assessment units in 128 priority geologic provinces, 96 countries, and two jointly held areas were assessed using a probabilistic Total Petroleum System approach. Priority geologic provinces were selected from a ranking of 937 provinces worldwide. The U.S. Geological Survey World Petroleum Assessment Team did not assess undiscovered petroleum resources in the U.S. For this report, mean estimated volumes of deep conventional undiscovered gas resources in the U.S. are taken from estimates of 101 deep plays (out of a total of 550 conventional plays in the U.S.) from the U.S. Geological Survey's 1995 National Assessment of Oil and Gas Resources. A probabilistic method was designed to subdivide gas resources into depth slices, using a median-based triangular probability distribution as a model for drilling depth to estimate the percentages of estimated gas resources below various depths. For both the World Petroleum Assessment 2000 and the 1995 National Assessment of Oil and Gas Resources, minimum, median, and maximum depths were assigned to each assessment unit and play; these depths were used in our analysis. Two hundred seventy-four deep assessment units and plays in 124 petroleum provinces were identified for the U.S. and the world. These assessment units and plays contain a mean undiscovered conventional gas resource of 844 trillion cubic ft (Tcf) occurring at depths below 4.5 km. The deep undiscovered conventional gas resource (844 Tcf) is about 17% of the total world gas resource (4,928 Tcf) based on the provinces assessed and includes a mean estimate of 259 Tcf of U.S. gas from the U.S. 1995 National Assessment. Of the eight regions, the Former Soviet Union (Region 1) contains the largest estimated volume of undiscovered deep gas, with a mean resource of 343 Tcf.
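A sketch of the depth-slice calculation: approximate drilling depth in an assessment unit with a triangular distribution over its minimum and maximum depths and read off the fraction of the unit's mean gas resource deeper than 4.5 km. Treating the assigned median depth as the triangle's mode is an assumption of this sketch, and the depths and resource volume are invented.

```python
# Fraction of an assessment unit's gas lying below a depth cutoff, using a
# triangular depth model (min, assumed mode at the assigned median, max).
from scipy.stats import triang

def fraction_below_depth(d_min, d_median, d_max, cutoff_km):
    """Fraction of resource deeper than cutoff_km under the triangular depth model."""
    c = (d_median - d_min) / (d_max - d_min)      # mode position as a fraction of the range
    return triang.sf(cutoff_km, c, loc=d_min, scale=d_max - d_min)

# Example assessment unit: depths from 1 to 7 km, assigned median 3 km
frac_deep = fraction_below_depth(1.0, 3.0, 7.0, cutoff_km=4.5)
deep_gas = frac_deep * 12.0   # Tcf, if the unit holds a mean 12 Tcf in total (assumed)
print(frac_deep, deep_gas)
```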
10.
The geologic appraisal model selected for a petroleum resource assessment depends upon the purpose of the assessment, the basic geologic assumptions for the area, the type of available data, the time available before deadlines, the available human and financial resources, the available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) it includes a variety of geologic models, (2) it uses an analytic methodology instead of Monte Carlo simulation, (3) it can aggregate estimates from many areas that have been assessed by different geologic models, and (4) it runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. This paper was presented at Emerging Concepts, MGLIS-87, Redwood City, California, 13–15 April 1987.