Similar literature
 20 similar documents found (search time: 46 ms)
1.
The Fisher matrix approach allows one to calculate in advance how well a given experiment will be able to estimate model parameters, and has been an invaluable tool in experimental design. In the same spirit, we present here a method to predict how well a given experiment can distinguish between different models, regardless of their parameters. From a Bayesian viewpoint, this involves computation of the Bayesian evidence. In this paper, we generalize the Fisher matrix approach from the context of parameter fitting to that of model testing, and show how the expected evidence can be computed under the same simplifying assumption of a Gaussian likelihood as the Fisher matrix approach for parameter estimation. With this 'Laplace approximation' all that is needed to compute the expected evidence is the Fisher matrix itself. We illustrate the method with a study of how well upcoming and planned experiments should perform at distinguishing between dark energy models and modified gravity theories. In particular, we consider the combination of 3D weak lensing, for which planned and proposed wide-field multiband imaging surveys will provide suitable data, and probes of the expansion history of the Universe, such as proposed supernova and baryonic acoustic oscillation surveys. We find that proposed large-scale weak-lensing surveys from space should readily be able to distinguish General Relativity from modified gravity models.
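Under the Laplace approximation with flat priors, the log-evidence reduces to the maximum log-likelihood plus an Occam term built from the Fisher matrix determinant. A minimal numerical sketch of this idea follows; the Fisher matrices and prior volumes are purely illustrative, not taken from the paper:

```python
import numpy as np

def log_evidence_laplace(lnL_max, fisher, prior_volume):
    """Laplace-approximated log-evidence for flat priors:
    ln E ~= ln L_max + (M/2) ln(2*pi) - 0.5 ln det F - ln V_prior,
    where F is the Fisher matrix at the best-fitting point."""
    M = fisher.shape[0]
    sign, logdet = np.linalg.slogdet(fisher)
    return lnL_max + 0.5 * M * np.log(2 * np.pi) - 0.5 * logdet - np.log(prior_volume)

# Toy comparison of two nested models with equal best fits: the extra
# parameter of the second model is penalised through the Occam factor.
F1 = np.array([[100.0]])                      # 1-parameter model, prior width 10
F2 = np.array([[100.0, 10.0], [10.0, 50.0]])  # 2-parameter model, prior volume 100
lnB = log_evidence_laplace(0.0, F1, 10.0) - log_evidence_laplace(0.0, F2, 100.0)
# lnB > 0: the evidence favours the simpler model when the fits are equally good.
```

The same expected-evidence machinery, fed with forecast Fisher matrices, is what allows model-distinguishing power to be predicted before any data are taken.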

2.
The subject of this paper is a quantification of the impact of uncertainties in bias and bias evolution on the interpretation of measurements of the integrated Sachs–Wolfe (ISW) effect, in particular on the estimation of cosmological parameters. We carry out a Fisher matrix analysis for quantifying the degeneracies between the parameters of a dark energy cosmology and bias evolution, for the combination of the PLANCK microwave sky survey with the EUCLID main galaxy sample, where bias evolution b(a) = b0 + (1 − a) ba is modelled with two parameters b0 and ba. Using a realistic bias model introduces a characteristic suppression of the ISW spectrum on large angular scales, due to the altered distance-weighting functions. The errors in estimating cosmological parameters, if data with evolving bias are interpreted in the framework of cosmologies with constant bias, are quantified in an extended Fisher formalism. We find that the best-fitting values of all parameters are shifted by an amount comparable to the statistical accuracy: the estimation bias in units of the statistical accuracy amounts to 1.19 for Ωm, 0.27 for σ8 and 0.72 for w for bias evolution with ba = 1. Leaving ba open as a free parameter deteriorates the statistical accuracy, in particular on Ωm and w.
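The two quantities compared in the abstract, the statistical error and the systematic parameter shift, both come out of the Fisher matrix. A sketch with a hypothetical 3x3 Fisher matrix (the numbers are invented for illustration, not the paper's values) shows the mechanics:

```python
import numpy as np

# Hypothetical Fisher matrix for (Omega_m, sigma_8, w); values illustrative only.
F = np.array([[4.0e4, 1.0e4, 5.0e3],
              [1.0e4, 2.0e4, 2.0e3],
              [5.0e3, 2.0e3, 1.0e4]])
cov = np.linalg.inv(F)

# Marginalised 1-sigma errors are the square roots of the covariance diagonal.
sigma = np.sqrt(np.diag(cov))

# A mismodelled quantity (e.g. ignored bias evolution) shifts the best fit by
# delta_theta = F^{-1} B, where B projects the systematic onto the parameter
# directions (hypothetical vector here).
B = np.array([50.0, -20.0, 30.0])
delta = cov @ B
bias_in_sigma = delta / sigma   # estimation bias in units of the statistical error
```

Ratios of order unity in `bias_in_sigma`, as found in the paper for Ωm and w, mean the systematic shift is comparable to the statistical accuracy.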

3.
We introduce four common storage formats for sparse matrices and the application of sparse matrix techniques to astrometric and geodetic VLBI data processing. We derive the algorithms, in sparse-matrix form, for solving for the estimated parameters and for evaluating the covariance matrix in the combined solution of astrometric/geodetic VLBI data. By estimating and comparing the number of operations (multiplications and additions) required to solve the equations with and without sparse matrix techniques, we show that the operation count of ordinary least squares scales roughly as the cube of the total number of parameters, whereas with sparse matrix techniques it scales approximately linearly with the number of parameters. This can significantly reduce computation time in the large-sample data processing of modern space-geodetic observation techniques.
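The cubic-versus-linear scaling argument can be illustrated with a small sketch: a banded normal-equation system (a toy stand-in for the arc-parameter structure that makes VLBI normal matrices sparse) solved with SciPy's sparse solver, which exploits the band structure instead of dense elimination. The matrix and right-hand side here are hypothetical:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

# Hypothetical tridiagonal normal-equation matrix N and right-hand side b.
n = 1000
diag = 4.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
N = sparse.diags([off, diag, off], offsets=[-1, 0, 1], format="csr")
b = N @ np.ones(n)          # choose b so the exact solution is all ones

# Sparse factorization touches only the stored non-zeros: roughly O(n) work
# for a banded system, versus O(n^3) for dense Gaussian elimination.
x = spsolve(N, b)
```

The same principle, applied to the block-sparse normal equations of a multi-session VLBI solution, is what turns a cubic operation count into a nearly linear one.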

4.
LISA Pathfinder is a science and technology demonstrator of the European Space Agency within the framework of its LISA mission, which aims to be the first space-borne gravitational wave observatory. The payload of LISA Pathfinder is the so-called LISA Technology Package, which is designed to measure relative accelerations between two test masses in nominal free fall. Its disturbances are monitored and dealt with by the diagnostics subsystem. This subsystem consists of several modules, and one of these is the magnetic diagnostics system, which includes a set of four tri-axial fluxgate magnetometers, intended to measure with high precision the magnetic field at the positions of the test masses. However, since the magnetometers are located far from the positions of the test masses, the magnetic field at their positions must be interpolated. It has been recently shown that because there are not enough magnetic channels, classical interpolation methods fail to derive reliable measurements at the positions of the test masses, while neural network interpolation can provide the required measurements at the desired accuracy. In this paper we expand these studies and we assess the reliability and robustness of the neural network interpolation scheme for variations of the locations and possible offsets of the magnetometers, as well as for changes in environmental conditions. We find that neural networks are robust enough to derive accurate measurements of the magnetic field at the positions of the test masses in most circumstances.
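The core idea, a small feed-forward network regressing a field value from sparse inputs, can be sketched from scratch. This toy replaces the four tri-axial magnetometer channels with a single 1-D position input and an invented smooth "field"; it is only a schematic of network-based interpolation, not the mission's actual scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented smooth stand-in for the magnetic field to be interpolated.
def field(x):
    return np.sin(3 * x) + 0.5 * x

X = rng.uniform(-1, 1, size=(256, 1))     # query positions
y = field(X)

# One-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(0, 1, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 1, (16, 1)); b2 = np.zeros(1)
lr = 0.05
losses = []
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation of the mean-squared-error gradient
    g_pred = 2 * err / len(X)
    g_W2 = h.T @ g_pred; g_b2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h ** 2)
    g_W1 = X.T @ g_h; g_b1 = g_h.sum(0)
    W2 -= lr * g_W2; b2 -= lr * g_b2
    W1 -= lr * g_W1; b1 -= lr * g_b1
```

A network like this, once trained on calibration data, can be evaluated at the test-mass positions where no sensor exists, which is the role the paper's interpolation scheme plays.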

5.
6.
The success of LISA Pathfinder in demonstrating the LISA drag-free requirement paved the way for using space interferometers to detect low-frequency and middle-frequency gravitational waves (GWs). The TAIJI GW mission and the new LISA GW mission propose arm lengths of 3 Gm (1 Gm = 10^6 km) and 2.5 Gm, respectively. For a space laser-interferometric GW antenna, due to astrodynamical orbit variation, time delay interferometry (TDI) is needed to achieve nearly equivalent equal arms for suppressing the laser frequency noise below the level of optical path noise, acceleration noise, etc., in order to attain the requisite sensitivity. In this paper, we simulate TDI numerically for the TAIJI mission and the new LISA mission. To do this, we work out a set of 2200-day (6-year) optimized science orbits for each mission starting on 2028 March 22 using the CGC 2.7.1 ephemeris framework. We then use the numerical method to calculate the residual optical path differences of the first-generation TDI configurations and the selected second-generation TDI configurations. The resulting optical path differences of the second-generation TDI configurations calculated for TAIJI, new LISA and eLISA are well below their respective requirements for laser frequency noise cancellation. However, for the first-generation TDI configurations, the original requirements need to be relaxed by 3- to 30-fold to be satisfied. For TAIJI and the new LISA, about one order of magnitude of relaxation would be adequate and is recommended; this could be imposed on the laser stability requirement in view of recent progress in laser stability, or else the GW detection sensitivities of the second-generation TDIs would have to be used in the diagnosis of the observed data instead of the commonly used X, Y and Z TDIs.
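Why first-generation TDI works for static arms but not for orbiting spacecraft can be seen in a highly idealized sketch: with constant, unequal arm lengths (integer-sample delays here), the Michelson X combination cancels the laser phase noise identically, and it is precisely the orbital variation of the delays that leaves the residual quantified in the paper. All numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Laser phase noise C(t) on a uniform time grid; delays in integer samples.
n = 20000
C = rng.normal(size=n)
L1, L2 = 83, 100          # one-way light times of the two (unequal) arms, samples

def delay(x, d):
    """Shift x by d samples (zero-padded at the start)."""
    out = np.zeros_like(x)
    out[d:] = x[:len(x) - d]
    return out

# Round-trip Michelson signals for each arm (laser noise only):
a = delay(C, 2 * L1) - C
b = delay(C, 2 * L2) - C

# First-generation TDI X combination: for static arms the laser noise
# cancels exactly after the transient of length 2*(L1+L2) samples.
X = a + delay(b, 2 * L1) - b - delay(a, 2 * L2)
```

With time-varying delays the four terms no longer cancel term-by-term, which is what forces the second-generation combinations simulated in the paper.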

7.
We develop a general formalism for analysing parameter information from non-Gaussian cosmic fields. The method can be adapted to include the non-linear effects in galaxy redshift surveys, weak lensing surveys and cosmic velocity field surveys as part of parameter estimation. It can also be used as a test of non-Gaussianity of the cosmic microwave background. Generalizing maximum-likelihood analysis to second order, we calculate the non-linear Fisher information matrix and likelihood surfaces in parameter space. To this order we find that the information content is always increased by including non-linearity. Our methods are applied to a realistic model of a galaxy redshift survey, including non-linear evolution, galaxy bias, shot-noise and redshift-space distortions to second order. We find that including non-linearities allows all of the degeneracies between parameters to be lifted. Marginalized parameter uncertainties of a few per cent will then be obtainable using forthcoming galaxy redshift surveys.

8.
We optimise the parameters of the Population Monte Carlo algorithm using numerical simulations. The optimisation is based on an efficiency statistic related to the number of samples evaluated prior to convergence, and is applied to a D-dimensional Gaussian distribution to derive optimal scaling laws for the algorithm parameters. More complex distributions such as the banana and bimodal distributions are also studied. We apply these results to a cosmological parameter estimation problem that uses CMB anisotropy data from the WMAP nine-year release to constrain a six-parameter adiabatic model and a fifteen-parameter admixture model, consisting of correlated adiabatic and isocurvature perturbations. For the adiabatic model and the admixture model we find that the number of sample points increases by factors of 3 and 20, respectively, relative to the optimal Gaussian case. This is due to degeneracies in the underlying parameter space. The WMAP nine-year data constrain the admixture model to have an isocurvature fraction of 36.3 ± 2.8%. (© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
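Population Monte Carlo is adaptive importance sampling: draw from a proposal, weight by target over proposal, then refit the proposal to the weighted sample and repeat. A minimal single-Gaussian-proposal sketch on an invented 2-D Gaussian target (real PMC typically uses mixture proposals, and the paper's targets are cosmological posteriors) looks like this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: 2-D Gaussian with known mean/covariance (stand-in for a posterior).
mu_t = np.array([1.0, -2.0])
cov_t = np.array([[1.0, 0.6], [0.6, 2.0]])
cov_t_inv = np.linalg.inv(cov_t)

def log_target(x):
    d = x - mu_t
    return -0.5 * np.einsum("ij,jk,ik->i", d, cov_t_inv, d)

# PMC loop: sample, importance-weight, refit the proposal moments.
mu, cov = np.zeros(2), 4.0 * np.eye(2)      # deliberately poor starting proposal
for _ in range(10):
    x = rng.multivariate_normal(mu, cov, size=2000)
    d = x - mu
    log_q = -0.5 * np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d) \
            - 0.5 * np.log(np.linalg.det(cov))
    w = np.exp(log_target(x) - log_q)
    w /= w.sum()                             # self-normalised importance weights
    mu = w @ x                               # weighted mean
    d = x - mu
    cov = (w[:, None] * d).T @ d             # weighted covariance
```

The efficiency statistic optimised in the paper is essentially how many such draws are spent before the proposal has locked onto the target; degenerate posteriors inflate that count.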

9.
The traditional least square estimation (LSE) method for orbit determination will not be optimal if the error of the observational data does not obey a Gaussian distribution. In order to solve this problem, the least p-norm (Lp) estimation method is presented in this paper to deal with non-Gaussian cases. We show that a suitable selection of the parameter p may guarantee a reasonable orbit determination result. The character of Lp estimation is analyzed, and it is shown that the traditional Lp estimation method is not robust. A stable Lp estimation based on data-depth weighting is therefore put forward to deal with model errors and outliers. In the orbit determination process, outliers in the observational data and coarse model errors can be quantitatively described by their weights: the farther the data are from the data center, the smaller the data depth and, accordingly, the smaller the assigned weight. The result of the new Lp method is more stable than that of the traditional Lp estimation, and the breakdown point can reach 1/2. In addition, the orbit parameters are adaptively estimated by residual analysis and a matrix estimation method, which enhances the estimation efficiency. Finally, taking the Space-based Space Surveillance System as an example and performing simulation experiments, we show that if there are systematic errors or abnormal values in the observational data, or systematic errors in the satellite dynamical model or the space-based observation platform, LSE will not be optimal even if the observational data obey a Gaussian distribution, and the orbit determination precision of the self-adaptive robust Lp estimation method is much better than that of the traditional LSE method.
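The standard way to minimise a p-norm residual for 1 < p < 2 is iteratively reweighted least squares (IRLS), where each pass solves a weighted LSE with weights |r|^(p-2) that shrink the influence of large residuals. A self-contained toy (straight-line fit with one gross outlier, invented data, not the paper's orbit problem) shows the robustness gain over plain LSE:

```python
import numpy as np

def lp_fit(A, y, p=1.1, n_iter=50, eps=1e-8):
    """Minimise sum |y - A x|^p by iteratively reweighted least squares.
    For p < 2, large residuals are down-weighted, giving robustness to outliers."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]         # LSE starting point
    for _ in range(n_iter):
        r = np.abs(y - A @ x) + eps                  # eps avoids division by zero
        w = r ** (p - 2)                             # IRLS weights
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y)      # weighted normal equations
    return x

# Straight-line data y = 2 + 3 t with one gross outlier.
t = np.linspace(0, 1, 20)
y = 2.0 + 3.0 * t
y[5] += 50.0                                         # outlier
A = np.column_stack([np.ones_like(t), t])

x_ls = np.linalg.lstsq(A, y, rcond=None)[0]          # pulled far off by the outlier
x_lp = lp_fit(A, y, p=1.1)                           # stays near (2, 3)
```

The paper's data-depth weighting goes further, weighting points by their depth in the sample so the breakdown point can reach 1/2, but the down-weighting mechanism is of the same flavour.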

10.
Binary black hole coalescences emit gravitational waves that will be measurable by the space-based detector LISA to large redshifts. This suggests that LISA may be able to observe black holes grow and evolve as the Universe evolves, mapping the distribution of black hole masses as a function of redshift. An immediate difficulty with this idea is that LISA measures certain redshifted combinations of masses with good accuracy: if a system has some mass parameter m, then LISA measures (1 + z)m. This mass–redshift degeneracy makes it difficult to follow the mass evolution. In many cases, LISA will also measure the luminosity distance D of a coalescence accurately. Since cosmological parameters (particularly the mean density, the cosmological constant and the Hubble constant) are now known with moderate precision, we can obtain z from D and break the degeneracy. This makes it possible to untangle the mass and redshift and to study the mass and merger history of black holes. Mapping the black hole mass distribution could open a window on to an early epoch of structure formation.
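The degeneracy-breaking step is just inverting the distance–redshift relation: given a measured luminosity distance and an assumed cosmology, solve D_L(z) = D for z, then divide the measured redshifted mass by (1 + z). A sketch for flat ΛCDM with illustrative parameter values (H0 = 70, Ωm = 0.3):

```python
import numpy as np

C_KM_S = 299792.458          # speed of light [km/s]
H0, OM, OL = 70.0, 0.3, 0.7  # flat LCDM, illustrative values

def lum_distance(z, n=2000):
    """Luminosity distance [Mpc] for flat LCDM via a trapezoid integral."""
    zz = np.linspace(0.0, z, n)
    f = 1.0 / np.sqrt(OM * (1 + zz) ** 3 + OL)
    dc = C_KM_S / H0 * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(zz))
    return (1 + z) * dc

def redshift_from_distance(DL, lo=1e-6, hi=20.0, tol=1e-8):
    """Invert the monotonic D_L(z) by bisection."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if lum_distance(mid) < DL:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# LISA measures (1+z)m and D_L; recover z, then the intrinsic mass.
z_true = 1.5
m_obs = (1 + z_true) * 1.0e6          # redshifted mass, solar masses
z = redshift_from_distance(lum_distance(z_true))
m = m_obs / (1 + z)
```

In practice the uncertainty on the recovered mass inherits the distance error and the cosmological parameter uncertainties, which is why the abstract stresses that the latter are now moderately well known.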

11.
Attila Elteto, Owen B. Toon. Icarus, 2010, 210(2): 566–588
We present a new parameter retrieval algorithm for Mars Global Surveyor Thermal Emission Spectrometer (TES) data. The algorithm uses Newtonian first-order sensitivity functions of the infrared spectrum in response to variations in physical parameters to fit a model spectrum to the data at 499, 1099, and 1301 cm−1. The algorithm iteratively fits the model spectrum to the data to simultaneously retrieve dust extinction optical depth, effective radius, and surface temperature. There are several sources of uncertainty in the results. The assumed dust vertical distribution can introduce errors in retrieved optical depth of a few tens of percent. The assumed dust optical constants can introduce errors in both optical depth and effective radius, although the systematic nature of these errors will not affect retrieval of trends in these parameters. The algorithm does not include the spectral signature of water ice, and hence the data need to be filtered against this parameter before the algorithm is applied. The algorithm also needs sufficient dust spectral signature, and hence surface-to-atmosphere temperature contrast, to successfully retrieve the parameters. After the application of data filters the algorithm is both relatively accurate and very fast, successfully retrieving parameters, as well as meaningful parameter variability and trends, from tens of thousands of individual spectra on a global scale (Elteto, A., Toon, O.B. [2010]. Icarus, this issue). Our results for optical depth compare well with TES archive values when corrected by the single scattering albedo. Our surface temperatures are on average 1–4 K higher than the TES archive values, with greater differences at higher optical depths. Our retrievals of dust effective radius compare well with the retrievals of Wolff and Clancy (Wolff, M.J., Clancy, R.T. [2003]. J. Geophys. Res. 108 (E9), 5097) for the corresponding data selections from the same orbits.

12.
This paper presents an 'adaptive probability of crossover' technique, as a variation of the differential evolution algorithm (ACDE), for optimal parameter estimation in the general curve-fitting problem. The technique is applied to the determination of the orbital elements of a spectroscopic binary system (eta Bootis). In the ACDE, varying the crossover probability rate (Cr) provides faster convergence than keeping it constant. The Cr is determined for each trial parameter vector ('individual') as a function of fit goodness. The adaptation automatically updates the control parameter to an appropriate value, without requiring prior knowledge of the relationship between particular parameter settings and the optimization characteristics of a given problem. The presented analysis of eta Bootis derives best-fitting Keplerian and phasing curves. Error estimation of the optimal parameters is also included. Comparison of the results with previously published values suggests that the ACDE technique has useful applicability to astrophysical data analysis.
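The structure of differential evolution with a per-individual, fitness-dependent Cr can be sketched compactly. The rank-based Cr rule below is one simple illustrative variant (the published ACDE rule may differ in detail), and the test problem is the standard sphere function rather than the binary-orbit fit:

```python
import numpy as np

rng = np.random.default_rng(3)

def sphere(x):
    return float(np.sum(x ** 2))

def de_adaptive_cr(func, dim=5, pop=30, gens=200, f_scale=0.7):
    """Differential evolution (DE/rand/1/bin) with a fitness-rank-adapted Cr."""
    x = rng.uniform(-5, 5, (pop, dim))
    fit = np.array([func(v) for v in x])
    for _ in range(gens):
        ranks = np.argsort(np.argsort(fit))          # 0 = best individual
        cr = 0.1 + 0.8 * ranks / (pop - 1)           # better fit -> lower Cr
        for i in range(pop):
            a, b, c = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            mutant = x[a] + f_scale * (x[b] - x[c])
            mask = rng.random(dim) < cr[i]
            mask[rng.integers(dim)] = True           # guarantee >=1 crossed gene
            trial = np.where(mask, mutant, x[i])
            f_trial = func(trial)
            if f_trial <= fit[i]:                    # greedy selection
                x[i], fit[i] = trial, f_trial
    return x[np.argmin(fit)], float(np.min(fit))

best_x, best_f = de_adaptive_cr(sphere)
```

For a curve-fitting application like the paper's, `func` would be the chi-square of the radial-velocity model against the observed data, with the orbital elements as the parameter vector.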

13.
A continuation of V. Yu. Terebizh, Astrofizika, 40, 139, 273, 413 (1997). An explicit representation is found for the Fisher matrix for spectral density, enabling one to calculate the lower limit of the variance of an arbitrary unbiased density estimate. Basic equations describing smoothed density estimates are given for comparison with exact results. By numerical modeling based on the example of an AR-1 process, it is shown that the relative accuracy q of estimation of density is some universal function of the parameter w = (F − 1)/N, where F is the number of parameters underlying the estimate and N is the length of the time series. The relationship q = θ(w), a similarity law, explains why a number of density estimates proposed earlier (Schuster's periodogram, in particular) proved to be statistically inconsistent. It is just these estimates that presume an extremely detailed model of spectral density. The need for the complexity of the model to be consistent with the observational data follows from the limitation of information about the spectrum of the random process contained in a sample of readings from a series of fixed volume. Translated from Astrofizika, Vol. 41, No. 1, pp. 113–122, January–March, 1998.

14.
We analyse the effects of the detector response time on bolometric measurements of the anisotropy of the cosmic microwave background (CMB). We quantify the effect in terms of a single dimensionless parameter L defined as the ratio between the time the beam sweeps its own size and the bolometer response time. As L decreases below ∼ 2.5, the point-source response of the experiment becomes elongated. We introduce a window function matrix based on the timestream data to assess the effects of the elongated beam. We find that the values of the window function matrix elements decrease slowly as a function of L . Our analysis and results apply to other cases of beam asymmetry. For the High Frequency Instrument on board the Planck Surveyor satellite we show that, for a broad range of L , the ability of the experiment to extract the cosmological parameters is not degraded. Our analysis enhances the flexibility in tuning the design parameters of CMB anisotropy experiments.

15.
In this paper, we study the extended Chaplygin gas as a candidate for inflation and predict the values of the gas parameters for a physically viable cosmological model. The extended Chaplygin gas, which was proposed recently, has n+2 free parameters. When n=1 there are three parameters, corresponding to the modified Chaplygin gas. Here we focus on the second-order equation of state, where n=2, so we generally have four free parameters. Under some assumptions, we reduce the free parameters of the model to a single parameter and fix it using the dimensionless age parameter. We also check the validity of our calculations against recent BICEP2 observations.

16.
We present a new algorithm, Eclipsing Binary Automated Solver (EBAS), to analyse light curves of eclipsing binaries. The algorithm is designed to analyse large numbers of light curves, and is therefore based on the relatively fast ebop code. To facilitate the search for the best solution, EBAS uses two parameter transformations. Instead of the radii of the two stellar components, EBAS uses the sum of radii and their ratio, while the inclination is transformed into the impact parameter. To replace human visual assessment, we introduce a new 'alarm' goodness-of-fit statistic that takes into account correlation between neighbouring residuals. We perform extensive tests and simulations that show that our algorithm converges well, finds a good set of parameters and provides reasonable error estimation.
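The two reparametrisations are simple invertible maps, and their value is that the fitted quantities are less correlated than the raw ones. A sketch of plausible forms (the exact definitions used by EBAS, in particular the normalisation of the impact parameter, are assumptions here) with a round-trip check:

```python
import math

def to_fit_params(r1, r2, inc_deg, a=1.0):
    """Map (r1, r2, inclination) -> (sum of radii, radius ratio, impact parameter).
    The impact-parameter form b = a*cos(i)/r1 is an assumed convention."""
    r_sum = r1 + r2
    k = r2 / r1
    b = a * math.cos(math.radians(inc_deg)) / r1
    return r_sum, k, b

def from_fit_params(r_sum, k, b, a=1.0):
    """Inverse map back to the physical parameters."""
    r1 = r_sum / (1 + k)
    r2 = r_sum - r1
    inc_deg = math.degrees(math.acos(b * r1 / a))
    return r1, r2, inc_deg

p = to_fit_params(0.2, 0.1, 87.0)
r1, r2, inc = from_fit_params(*p)
```

Because eclipse depth and duration constrain the radius sum and the impact parameter far more directly than r1, r2 and i individually, a fitter exploring the transformed space converges with fewer pathological degeneracies.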

17.
In the Brans-Dicke theory of gravity, from the nature of the scalar-field potential considered, the dark energy, dark matter and radiation densities predicted by different observations, and the assumed closedness of the universe, we can fix the Brans-Dicke parameter ωBD, bearing in mind only that various solar-system constraints require it to be greater than 5×10^5. Once we have a value satisfying this lower bound, we proceed to set the unknown parameters of the equation-of-state (EoS) parametrizations of different dark energy models. In this paper we work with three well-known redshift parametrizations of the dark energy EoS. To constrain their free parameters in Brans-Dicke gravity, we take twelve redshift versus Hubble-parameter data points and perform a χ² test. We present the observational data analysis for the Stern, Stern+BAO and Stern+BAO+CMB data sets. Minimising χ², we obtain the best-fit values and draw various confidence contours, which we analyse physically. We also examine the best fit of the distance modulus for our theoretical models against the Supernovae Type Ia Union2 sample. For Brans-Dicke gravity, the difference from the mainstream confidence-contouring method of data analysis is that the confidence contours obtained are not closed curves such as a circle or an ellipse; rather, they are open contours allowing the free parameters to float inside an infinite region of parameter space. However, negative EoS values are likely to evolve from the best-fit values.
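The χ²-minimisation and confidence-contour machinery is generic and can be sketched on a toy problem. The H(z) points below are invented stand-ins for the Stern-type compilation, and the model is plain flat ΛCDM rather than the paper's Brans-Dicke dark-energy parametrizations:

```python
import numpy as np

# Hypothetical H(z) data points: (z, H [km/s/Mpc], sigma).
data = np.array([
    [0.1, 69.0, 12.0], [0.4, 95.0, 17.0], [0.9, 117.0, 23.0],
    [1.3, 168.0, 17.0], [1.75, 202.0, 40.0],
])

def hubble(z, H0, om):
    """Flat LCDM H(z); the paper's models would replace this function."""
    return H0 * np.sqrt(om * (1 + z) ** 3 + (1 - om))

# Chi-square grid scan over (H0, Omega_m); for two parameters the
# Delta-chi2 < 2.3 region around the minimum is the 1-sigma contour.
H0s = np.linspace(50, 90, 81)
oms = np.linspace(0.05, 0.6, 56)
chi2 = np.zeros((len(H0s), len(oms)))
for i, H0 in enumerate(H0s):
    for j, om in enumerate(oms):
        r = (data[:, 1] - hubble(data[:, 0], H0, om)) / data[:, 2]
        chi2[i, j] = np.sum(r ** 2)

i_best, j_best = np.unravel_index(np.argmin(chi2), chi2.shape)
H0_best, om_best = H0s[i_best], oms[j_best]
```

An "open" contour of the kind the paper reports corresponds to a Δχ² valley that never closes within the scanned parameter range, so the confidence region extends to the edge of the grid.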

18.
The compilation of a central database for asteroid lightcurve data, i.e., rotation rate and amplitude along with ancillary information such as diameter and albedo (known or estimated), taxonomic class, etc., has been important to statistical studies for several decades. Having such a compilation saves the researcher hours of effort combing through any number of journals, some obvious and some not, to check on prior research. Harris has been compiling such data in the Asteroid Lightcurve Database (LCDB) for more than 25 years, with Warner and Pravec assisting for the past several years. The main data included in the LCDB are lightcurve rotation periods and amplitudes, color indices, H-G parameters, diameters (actual or derived), basic binary asteroid parameters, and spin axis and shape models. As time permits we are reviewing existing entries to enter data not previously recorded (e.g., phase angle data). As of 2008 December, data for 3741 asteroids based on more than 10650 separate detail records derived from entries in various journals were included in the LCDB. Of those 3741 asteroids, approximately 3100 have data of sufficient quality for statistical analysis, including 7 that have "dual citizenship" - meaning that they have (or had) asteroid designations as well as comet designations. Here we present a discussion of the nature of LCDB data, i.e., which values are actually measured and which are derived. For derived data, we give our justification for specific values. We also present some analysis based on the LCDB data, including new default albedo (pV) and phase slope parameter (G) values for the primary taxonomic classes and a review of the frequency-diameter distribution of all asteroids as well as some selected subsets. The most recent version of the data used in this analysis is available for download from the Collaborative Asteroid Lightcurve Link (CALL) site at http://www.MinorPlanetObserver.com/astlc/default.htm. Other data sets, some only subsets of the full LCDB, are available in the Ephemeris of Minor Planets, the Planetary Data System, and the Minor Planet Center web site.

19.
A novel artificial intelligence (AI) technique that combines several machine learning (ML) algorithms developed by ThetaRay, Inc. is applied to NASA's Transiting Exoplanet Survey Satellite (TESS) dataset to identify exoplanetary candidates. The AI/ML ThetaRay system is trained initially with Kepler exoplanetary data and validated with confirmed exoplanets before its application to TESS data. Existing and new features of the data, based on various observational parameters, are constructed and used in the AI/ML analysis by employing semi-supervised and unsupervised machine learning techniques. By applying the ThetaRay system to 10,803 light curves of threshold crossing events (TCEs) produced by the TESS mission, obtained from the Mikulski Archive for Space Telescopes, the algorithm yields about 50 targets for further analysis, and we uncover three new exoplanetary candidates by further manual vetting. This study demonstrates for the first time the successful application of this particular combination of multiple AI/ML-based methodologies to a large astrophysical dataset for rapid automated classification of TCEs.

20.
The absence of a compelling theoretical model requires parameterizing the dark energy in order to probe its properties. Parametrization of the equation of state of the dark energy is a common method. We explore the theoretical optimization of the parametrization based on the Fisher information matrix. A suitable parametrization should be stable at high redshift and should make the determinant of the Fisher matrix as large as possible. As an illustration, we propose one parametrization that satisfies both criteria. By using a proper parametrization, we can improve the constraints on the dark energy even for the same data. We also show the weakness of the so-called principal component analysis method.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号