11.
12.
The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796–817] is an efficient tool for constructing logic trees that focus attention on the parameters with the greatest impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach of Rabinowitz and Steinberg (1991) and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., the 16th and 84th percentiles). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583–1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8–21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency–magnitude parameters, maximum earthquake magnitude values and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with the greatest influence on the hazard. Only these parameters need to be subjected to careful discussion or further research in order to reduce the uncertainty in the hazard, while those with little or no effect can be excluded from subsequent logic-tree-based seismic hazard analyses.
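As a quick aid to the hazard metric used in this abstract (the PGA with a 10% probability of exceedance in 50 years), the following Python sketch, not taken from the paper, converts that probability criterion into an annual exceedance rate under a Poisson occurrence assumption and interpolates a hazard curve at that rate. The hazard-curve values are illustrative placeholders.

```python
# Hedged sketch: "p% in t years" -> annual exceedance rate under a Poisson
# model, then read the corresponding PGA off an illustrative hazard curve.
import numpy as np

def annual_rate(p_exceed: float, t_years: float) -> float:
    """Annual exceedance rate lambda such that 1 - exp(-lambda*t) = p_exceed."""
    return -np.log(1.0 - p_exceed) / t_years

def pga_at_probability(pga_grid, annual_rates, p_exceed=0.10, t_years=50.0):
    """Interpolate a hazard curve (annual exceedance rate vs. PGA) at the
    target rate; assumes rates decrease monotonically with PGA."""
    target = annual_rate(p_exceed, t_years)      # ~1/475 per year for 10%/50 yr
    # interpolate in log-rate space, which is closer to linear for hazard curves
    return float(np.interp(np.log(target), np.log(annual_rates[::-1]), pga_grid[::-1]))

# illustrative hazard curve (PGA in g vs. annual exceedance rate)
pga = np.array([0.05, 0.10, 0.20, 0.30, 0.40])
rates = np.array([5e-2, 1.5e-2, 3e-3, 9e-4, 3e-4])
print(round(annual_rate(0.10, 50.0), 5))         # ~0.00211 -> ~475-year return period
print(round(pga_at_probability(pga, rates), 3))
```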
13.
14.
A solution to the fixed-time minimum-fuel two-impulse rendezvous problem for general non-coplanar elliptical orbits is provided. The optimal transfer orbit is obtained using the constrained multiple-revolution Lambert solution, with constraints consisting of a lower bound on the perigee altitude and an upper bound on the apogee altitude. The optimal time-free two-impulse transfer problem between two fixed endpoints reduces to finding the roots of an eighth-order polynomial, which is done using a numerical iterative technique. The set of feasible solutions is determined by using the constraint conditions to solve for the semimajor-axis ranges of the short-path and long-path orbits. Then, by comparing the optimal time-free solution with the feasible solutions, the optimal semimajor axis for the two-fixed-endpoint transfer is identified. Based on the proposed solution procedure for the optimal two-fixed-endpoint transfer, a contour of the minimum cost for different initial and final coasting parameters is obtained. Finally, a numerical optimization algorithm (e.g., an evolutionary algorithm) can be used to solve this global minimization problem. A numerical example is provided to show how to apply the proposed technique.
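To illustrate the root-finding step described above, the short Python sketch below (a hedged illustration, not the paper's algorithm) extracts the real roots of an eighth-order polynomial with numpy and keeps only candidates whose transfer orbit respects the perigee and apogee bounds. The polynomial coefficients, the bounds, and the `e_of_a` eccentricity mapping are hypothetical placeholders.

```python
# Hedged sketch: real roots of a degree-8 polynomial filtered by altitude bounds.
import numpy as np

def feasible_semimajor_axes(coeffs, r_p_min, r_a_max, e_of_a):
    """Return real roots a of the polynomial that respect the altitude bounds.
    `e_of_a` maps a candidate semimajor axis to the transfer-orbit eccentricity
    (problem-specific; supplied by the caller)."""
    roots = np.roots(coeffs)                       # all 8 complex roots
    real = roots[np.abs(roots.imag) < 1e-9].real   # keep the (numerically) real ones
    keep = []
    for a in real:
        if a <= 0:
            continue
        e = e_of_a(a)
        r_p, r_a = a * (1 - e), a * (1 + e)        # perigee / apogee radii
        if r_p >= r_p_min and r_a <= r_a_max:
            keep.append(float(a))
    return keep

# toy example: placeholder degree-8 polynomial and a fake eccentricity model
coeffs = [1, 0, -5, 0, 4, 0, 0, 0, -1]
print(feasible_semimajor_axes(coeffs, r_p_min=0.9, r_a_max=3.0,
                              e_of_a=lambda a: 0.1))
```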
15.
Since the birth of X-ray astronomy, spectral, spatial and timing observations have improved dramatically, yielding a wealth of information on most classes of celestial sources. Polarimetry, by contrast, has remained essentially unexplored. X-ray polarimetry promises to provide additional information through two new observable quantities, the degree and the angle of polarization. Polarization from celestial X-ray sources may derive from the emission mechanisms themselves, such as cyclotron, synchrotron and non-thermal bremsstrahlung; from scattering in aspherical accreting plasmas, such as disks, blobs and columns; and from the presence of extreme magnetic fields, by means of vacuum polarization and birefringence. Matter in strong gravity fields and Quantum Gravity effects can also be studied by X-ray polarimetry. POLARIX is a mission dedicated to X-ray polarimetry. It exploits the polarimetric response of a Gas Pixel Detector which, combined with position sensitivity at the focus of a telescope, results in a huge increase in sensitivity. The heart of the detector is an Application-Specific Integrated Circuit (ASIC) chip with 105,600 pixels, each containing a complete electronic chain to image the track produced by the photoelectron. Three Gas Pixel Detectors are coupled with three X-ray optics inherited from the JET-X mission. A filter wheel hosting both unpolarized and polarized calibration sources is dedicated to each detector for periodic on-ground and in-flight calibration. POLARIX will measure time-resolved X-ray polarization with an angular resolution of about 20 arcsec over a field of view of 15 × 15 arcmin and with an energy resolution of 20% at 6 keV. The Minimum Detectable Polarization is 12% for a source with a flux of 1 mCrab and 10^5 s of observing time. The satellite will be placed in an equatorial orbit at 505 km altitude by a Vega launcher. The telemetry down-link station will be Malindi. The pointing of the POLARIX satellite will be gyroless, and it will perform double pointing during the Earth occultation of one source, thereby maximizing the scientific return. POLARIX data are 75% open to the community, while 25%, plus the SVP (Science Verification Phase, one month of operations), are dedicated to a core-program activity open to the contribution of associated scientists. The planned duration of the mission is one year, plus three months of commissioning and SVP, suitable for performing most of the basic science within the reach of this instrument. A possible extension is to use the same existing mandrels to build two additional iridium telescopes with carbon coating, plus two more detectors; the effective area would then be almost doubled.
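The Minimum Detectable Polarization figure quoted above can be related to source statistics through the standard MDP expression used in X-ray polarimetry. The sketch below applies that formula with illustrative numbers; the modulation factor and count rates are assumptions, not POLARIX specifications.

```python
# Hedged sketch: 99%-confidence Minimum Detectable Polarization,
# MDP_99 = 4.29 / (mu * R_S) * sqrt((R_S + R_B) / T).
import math

def mdp99(mu: float, rate_src: float, rate_bkg: float, t_obs: float) -> float:
    """mu: modulation factor, rates in counts/s, t_obs in seconds."""
    return 4.29 / (mu * rate_src) * math.sqrt((rate_src + rate_bkg) / t_obs)

# illustrative numbers: modulation factor 0.3, 0.12 source counts/s,
# negligible background, 1e5 s exposure -> MDP of order 10%
print(f"MDP99 ≈ {mdp99(0.3, 0.12, 0.0, 1e5):.1%}")
```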
16.
Quality assessment of the Italian Landslide Inventory using GIS processing (cited 4 times: 1 self-citation, 3 by others)
Landslides constitute one of the most important natural hazards in Italy, as they are widespread and result in considerable damage and fatalities every year. The Italian Landslide Inventory (IFFI) Project was launched in 1999 with the aim of identifying and mapping landslides over the entire Italian territory. The inventory currently holds over 480,000 landslides and has been available by means of Web services since 2005. The aim of this study is to define quality indices for evaluating the homogeneity and completeness of the IFFI database. In order to estimate the completeness of the landslide attribute information, a heuristic approach has been used to assign weighting values to significant parameters selected from the landslide data sheet. The completeness and homogeneity of the landslide mapping has been evaluated by means of three different methods: an area–frequency distribution analysis; the proximity of the surveyed landslides to urban areas; and the variation of the landslide index within the same lithology. The quality indices have allowed identification of areas with a high level of completeness as well as critical areas in which the data collected have been underestimated or are not very accurate. The quality assessment of collected and stored data is essential in order to use the IFFI database for the definition and implementation of landslide susceptibility models and for land-use planning and management.
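As a toy illustration of the heuristic weighting described above (the actual IFFI weights and attribute fields are not given here), the following Python sketch computes a completeness index for a single landslide record as the weighted share of filled attributes. Field names and weights are hypothetical.

```python
# Hedged sketch: weighted attribute-completeness index for one landslide record.
def completeness_index(record: dict, weights: dict) -> float:
    """Weighted share of non-empty attributes, in [0, 1]."""
    total = sum(weights.values())
    filled = sum(w for field, w in weights.items()
                 if record.get(field) not in (None, "", "unknown"))
    return filled / total

# hypothetical weights and record
weights = {"type_of_movement": 3, "state_of_activity": 2, "date_of_activation": 2,
           "damage": 1, "lithology": 1}
record = {"type_of_movement": "rotational slide", "state_of_activity": "dormant",
          "date_of_activation": None, "damage": "road", "lithology": ""}
print(round(completeness_index(record, weights), 2))   # 0.67 for this toy record
```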
17.
We present experimental results showing the impact of the proposed LightSquared (LS) Long-Term Evolution (LTE) signals on both GPS and Galileo civil modulations in the L1/E1 band. The experiments were conducted in radiated mode in a large anechoic chamber. Three Galileo-enabled receivers were chosen for the tests, and a state-of-the-art GNSS signal generator was used to simulate both GPS and Galileo signals. The LTE signals were generated by an Agilent Programmable Signal Generator with a license to generate signals according to the 3GPP LTE FDD standard. The interference impact was measured in terms of the carrier-to-noise power spectral density ratio (C/N0) degradation, in accordance with the methodology that the LS/GPS Technical Working Group (TWG) established by mandate of the FCC. A model for determining the impact of the LS signal on the considered GNSS signals is provided and validated against experimental data. It is shown that the Galileo E1 Open Service (OS) signal is marginally more susceptible to this form of interference than the GPS L1 C/A signal, due to its greater proximity to the lower edge of the L1 band. The impact of LS interference was further analyzed in terms of pseudorange and position errors; despite its relevance for most GNSS users, this aspect was not considered by the TWG. The measurement- and position-domain analyses, along with the study of the LS impact on the Galileo OS signals, are the major contributions. The analysis confirms the results obtained by the TWG and shows that the receiver front-end plays a major role in protecting GNSS signals against RF interference. While it appears that, for now, the LS network will not be deployed, the approach taken and the results obtained herein can be readily adapted for any future terrestrial mobile network that may take the place of LS.
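The C/N0 degradation metric used in such interference assessments is commonly computed by treating the filtered interference as an equivalent increase of the receiver noise floor. The sketch below shows this standard relation with illustrative numbers; the density values are assumptions, not measurements from the paper.

```python
# Hedged sketch: effective C/N0 degradation when an interferer of equivalent
# white-noise density I0_eff (after the front-end filter) adds to the thermal
# noise density N0.
import math

def cn0_degradation_db(n0_dbw_hz: float, i0_eff_dbw_hz: float) -> float:
    """Degradation (dB) = 10*log10(1 + I0_eff/N0), densities in dBW/Hz."""
    n0 = 10 ** (n0_dbw_hz / 10.0)
    i0 = 10 ** (i0_eff_dbw_hz / 10.0)
    return 10.0 * math.log10(1.0 + i0 / n0)

# thermal noise density ~ -204 dBW/Hz; an interference density 3 dB below it
print(round(cn0_degradation_db(-204.0, -207.0), 2))   # ~1.76 dB degradation
```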
18.
The paper deals with the automatic determination of the grid azimuth of an object on the Earth using the Kern E2 electronic theodolite, by observing the sun, a star, or a planet. The observation time and the readings of the horizontal and vertical circles of the electronic theodolite are transferred automatically to the Hewlett-Packard HP41CX calculator. The calculator computes the grid azimuth of the terrestrial object and tells the observer what to do next; therefore, even personnel without specific higher education can make the observations. For solar observations an astronomical almanac is not necessary, because the program computes the solar coordinates. The general input data, together with the measured ones, can be stored on a magnetic cassette; later, if necessary, the general data can be corrected and the azimuth recalculated. The method proved to be very practical in field work.
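The astronomical core of such a program is the conversion from the sun's hour angle and declination to horizontal coordinates. The Python sketch below shows only that standard conversion (it is not the HP41CX program); deriving the grid azimuth of the terrestrial object from the horizontal-circle readings is omitted.

```python
# Hedged sketch: equatorial (hour angle, declination) -> horizontal (azimuth,
# altitude) for an observer at latitude phi, azimuth measured from north.
import math

def sun_azimuth_altitude(hour_angle_deg, declination_deg, latitude_deg):
    """Return (azimuth from north, clockwise, deg; altitude, deg)."""
    h, d, phi = (math.radians(x) for x in (hour_angle_deg, declination_deg, latitude_deg))
    sin_alt = math.sin(phi) * math.sin(d) + math.cos(phi) * math.cos(d) * math.cos(h)
    north = math.cos(phi) * math.sin(d) - math.sin(phi) * math.cos(d) * math.cos(h)
    east = -math.cos(d) * math.sin(h)
    az = math.degrees(math.atan2(east, north)) % 360.0
    return az, math.degrees(math.asin(sin_alt))

# sun two hours past the local meridian (H = +30 deg) in mid-northern latitudes
print(tuple(round(v, 1) for v in sun_azimuth_altitude(30.0, 10.0, 45.0)))
# -> azimuth ≈ 225.7 deg (west of south), altitude ≈ 46.5 deg
```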
19.
Sampling exploration of uncertain functions to locate critical contour levels is most effective if sampling decisions are made sequentially. A simple sequential exploration strategy, based on pseudo-Bayesian second-moment analysis, is proposed and compared with non-sequential systematic sampling. Repeated application to functions simulated pseudorandomly from stationary random processes on the line and on the plane indicates uniform superiority of the sequential strategy. The method is particularly advantageous when the function of interest, h(X), has an uncertain trend, and in general when the random process that quantifies prior uncertainty on h(X) is highly correlated.
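Below is a minimal sketch of one sequential step, assuming a second-moment (simple-kriging) model with an exponential covariance and a misclassification-probability criterion for choosing the next sample. It is an illustration in the spirit of the strategy, not the paper's exact rule; the covariance parameters, the criterion, and the test function are placeholders.

```python
# Hedged sketch: place the next sample where the second-moment model says the
# function is most likely to be misclassified relative to the critical level c.
import numpy as np
from scipy.stats import norm

def simple_kriging(x_obs, y_obs, x_grid, sill=1.0, rng_par=0.3):
    cov = lambda a, b: sill * np.exp(-np.abs(a[:, None] - b[None, :]) / rng_par)
    K = cov(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))
    k = cov(x_grid, x_obs)
    w = k @ np.linalg.inv(K)
    mean = w @ y_obs                                  # zero prior mean assumed
    var = sill - np.sum(w * k, axis=1)
    return mean, np.sqrt(np.maximum(var, 1e-12))

def next_sample(x_obs, y_obs, x_grid, c):
    mean, sd = simple_kriging(x_obs, y_obs, x_grid)
    p_misclass = norm.cdf(-np.abs(mean - c) / sd)     # high near the contour and
    return x_grid[np.argmax(p_misclass)]              # where uncertainty is large

h = lambda x: np.sin(3 * x)                           # toy function to explore
x_obs = np.array([0.1, 0.5, 0.9])
x_grid = np.linspace(0.0, 1.0, 201)
print(round(float(next_sample(x_obs, h(x_obs), x_grid, c=0.0)), 3))
```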
20.
Evidence of climate change within the Adamello Glacier of Italy (cited 2 times: 2 self-citations, 0 by others)
We analyze daily series of rainfall, snowfall, air temperature, and snow water equivalent at fixed dates from 40 high-altitude stations in the Adamello Glacier area (Italian Alps) for the period 1965–2007. The purposes of the study are (1) to investigate significant variation in time, (2) to evaluate the effect of temperature changes on the cryospheric water cycle, and (3) to evaluate the underlying climate patterns and the most significant variables for climate change studies. We detect the presence of a trend using linear regression, moving-window averaging, and the Mann-Kendall test. The linear dependence of water-related variables on temperature is assessed. We find a substantially unchanged atmospheric water input along with increasing temperature and rainfall, decreasing snowfall and snow water equivalent at thaw, and a shortening of snow cover extent and duration. We carry out a principal component analysis, which highlights patterns of precipitation distribution resulting from local temperature and external forcing. A set of the most representative variables for climate and glacier studies is then assessed. A comparison with three nearby southern Alpine glacierized areas in Italy and Switzerland shows substantial agreement. In spite of the relative shortness of the series, the results are of interest and can be used as a benchmark for climate change impact assessment for the Adamello Glacier area and the southern Alps.
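Of the trend tests listed above, the Mann-Kendall test is easy to reproduce. The sketch below implements its basic form (S statistic with the normal approximation and no tie correction) on a synthetic warming series; the data are simulated, not the Adamello observations.

```python
# Hedged sketch: basic Mann-Kendall trend test on a synthetic temperature series.
import numpy as np
from scipy.stats import norm

def mann_kendall(series):
    """Return the S statistic, the Z score and the two-sided p-value."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance with no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, z, 2 * (1 - norm.cdf(abs(z)))

rng = np.random.default_rng(0)
years = np.arange(1965, 2008)
temps = 0.03 * (years - 1965) + rng.normal(0.0, 0.5, len(years))  # warming + noise
print(mann_kendall(temps))
```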