Similar articles
20 similar articles found.
1.
V.R. Eke, L.F.A. Teodoro, Icarus, 2009, 200(1):12-18
A new analysis of the Lunar Prospector epithermal neutron data is presented, providing an improved map of the distribution of hydrogen near the lunar poles. This is achieved using a specially developed pixon image reconstruction algorithm to deconvolve the instrumental response of the Lunar Prospector's neutron spectrometer from the observed data, while simultaneously suppressing the statistical noise. The results show that these data alone require the hydrogen to be concentrated into the cold traps at up to 1 wt% water-equivalent hydrogen. This combination of localisation and high concentration suggests that the hydrogen is present either in the form of a volatile compound or as solar wind protons implanted into small regolith grains.

2.
A speedy pixon algorithm for image reconstruction is described. Two applications of the method to simulated astronomical data sets are also reported. In one case, galaxy clusters are extracted from multiwavelength microwave sky maps using the spectral dependence of the Sunyaev–Zel'dovich effect to distinguish them from the microwave background fluctuations and the instrumental noise. The second example involves the recovery of a sharply peaked emission profile, such as might be produced by a galaxy cluster observed in X-rays. These simulations show the ability of the technique both to detect sources in low signal-to-noise ratio data and to deconvolve a telescope beam in order to recover the internal structure of a source.
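The pixon machinery itself (locally adaptive smoothing kernels) is too involved for a short sketch. As a rough illustration of the underlying task, deconvolving a known instrument response while keeping noise amplification under control, here is a Wiener-regularised deconvolution in Python; the PSF, regularisation constant, and test image are hypothetical, and this is explicitly not the pixon algorithm.

```python
import numpy as np
from scipy.signal import fftconvolve

def wiener_deconvolve(observed, psf, reg=1e-2):
    """Deconvolve a known PSF with a Wiener-style regularisation.

    Not the pixon method: pixons use locally adaptive smoothing kernels,
    whereas this sketch suppresses noise with a single global
    regularisation constant `reg` (an assumed, tunable parameter).
    """
    # Pad the PSF to the image size and centre it so the FFT phases
    # correspond to convolution with a centred kernel.
    psf_padded = np.zeros_like(observed, dtype=float)
    psf_padded[:psf.shape[0], :psf.shape[1]] = psf
    psf_padded = np.roll(psf_padded,
                         (-(psf.shape[0] // 2), -(psf.shape[1] // 2)),
                         axis=(0, 1))
    H = np.fft.fft2(psf_padded)
    G = np.fft.fft2(observed)
    # Wiener-style filter: conj(H) / (|H|^2 + reg)
    F = np.conj(H) * G / (np.abs(H) ** 2 + reg)
    return np.real(np.fft.ifft2(F))

# Hypothetical usage: a blurred, noisy 128x128 map and a Gaussian beam.
rng = np.random.default_rng(0)
y, x = np.mgrid[-8:9, -8:9]
psf = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
psf /= psf.sum()
truth = np.zeros((128, 128))
truth[40:60, 70:90] = 1.0
observed = fftconvolve(truth, psf, mode="same") + 0.01 * rng.standard_normal((128, 128))
restored = wiener_deconvolve(observed, psf, reg=1e-2)
```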

3.
Taking CCD imagery as the technical background, this paper points out that CCD image noise exists mainly in the form of signal-dependent Poisson noise. Based on the image-formation model and the statistical properties of the image, maximum-likelihood (ML), maximum a posteriori (MAP), and maximum-entropy (ME) restoration methods are proposed. For the MAP method, a concrete algorithm and a parameter-estimation scheme are given. Image segmentation is also proposed to speed up the restoration and to save storage space.
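The abstract does not spell out the iteration; for signal-dependent Poisson noise with a known PSF, the maximum-likelihood restoration is commonly computed with the Richardson-Lucy iteration, sketched below with hypothetical inputs (a standard ML scheme, not necessarily the paper's exact algorithm).

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30):
    """Maximum-likelihood restoration under Poisson noise (Richardson-Lucy).

    `observed` is the noisy CCD frame and `psf` the point-spread function;
    both are hypothetical inputs used for illustration.
    """
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # avoid division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```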

4.
F.H. Cocks, S.A. Watkins, M.J. Walker, T.A. Lutz, J.C. Sussingham, Solar Physics, 2001, 198(2):211-222
A telescope based upon dark-lens diffractive optics would be a uniquely new instrument for solar astronomy. The image formation process in such a telescope gives an intrinsically higher resolving power and a greatly reduced image intensity compared to that of refracting or reflecting optical systems of similar lens dimension. This low image intensity would be an advantage for solar observations made using a very large imaging element. After a brief overview of the history of solar instrument development, a quantitative evaluation of the dark-lens diffracting solar telescope concept is presented, showing the potential of this imaging method to meet or even to exceed the most demanding resolution goals currently being considered for future space-borne solar telescopes.

5.
6.
We present a two-dimensional version of the classical one-dimensional Kolmogorov–Smirnov (KS) test, extending an earlier idea due to Peacock and an implementation proposed by Fasano and Franceschini. The two-dimensional KS test is used to optimize the goodness of fit in an iterative source-detection scheme for astronomical images. The method is applied to a ROSAT/HRI X-ray image of the post-core-collapse globular cluster NGC 6397 to determine the most probable source distribution in the cluster core. Comparisons to other widely used source-detection methods, and to a Chandra image of the same field, show that our iteration scheme is superior in measuring statistics-limited sources in severely crowded fields.
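A minimal sketch of a two-sample statistic in the Fasano-Franceschini spirit: for quadrants centred on every data point of either sample, compare the fractions of the two samples in each quadrant and keep the largest discrepancy. Significance calibration (and the per-sample averaging used by Fasano and Franceschini) is omitted; this is an illustrative O(n^2) implementation, not the paper's code.

```python
import numpy as np

def ks2d_statistic(x1, y1, x2, y2):
    """Two-sample 2D KS statistic (Fasano-Franceschini-style quadrant counts).

    Returns the maximum difference, over quadrants centred on every data
    point of either sample, between the fractions of the two samples in
    that quadrant. Significance estimation is not included.
    """
    def max_quadrant_diff(cx, cy):
        d = 0.0
        for sx in (np.greater, np.less_equal):
            for sy in (np.greater, np.less_equal):
                f1 = np.mean(sx(x1, cx) & sy(y1, cy))
                f2 = np.mean(sx(x2, cx) & sy(y2, cy))
                d = max(d, abs(f1 - f2))
        return d

    d = 0.0
    for cx, cy in zip(np.concatenate([x1, x2]), np.concatenate([y1, y2])):
        d = max(d, max_quadrant_diff(cx, cy))
    return d
```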

7.
Time-dependent magneto-hydrodynamic simulations of active-region coronal magnetic fields require the underlying photospheric magnetic footpoint velocities. The minimum energy fit (MEF), introduced by Longcope (2004), is a velocity-inversion technique that infers the photospheric magnetic footpoint velocities from a pair of vector magnetograms. The MEF selects the smallest overall flow from the set of consistent flows by minimizing an energy functional. The horizontal and vertical flow fields inferred by the MEF can be further constrained by incorporating partial or imperfect velocity information obtained through independent means. This hybrid method is expected to give a velocity close to the true magnetic footpoint velocity. Here, we demonstrate that a combination of the MEF, local correlation tracking (LCT) and Doppler velocities is capable of inferring a velocity close to the photospheric flow.

8.
Image restoration, computerized tomography, and other similar problems are considered as a unified class of stochastic inverse problems. The conventional approach to these problems, which proceeds from some integral or functional equation, suffers from three main shortcomings: (i) subjectivity, (ii) inability to account for the inner (radiational) noise, and (iii) inability to include the fundamental concept of a natural limit to the solution accuracy. A general approach is developed, the Statistical Parameterization of Inverse Problems (SPIPR), that takes into account both the inner and the external random noise and gives an explicit form for the above-mentioned natural limit. Applications of the SPIPR to various problems show that the maximum-likelihood method, as a concrete way to obtain an object estimate, achieves an efficiency that is practically at this limit. Two new fields of application of the SPIPR are outlined alongside the image restoration problem: the elimination of blurring due to atmospheric turbulence, and the reconstruction of object structure in computerized tomography. Expressions for the main distribution functions in all these problems are found, and the corresponding real examples and model cases are considered as well.

9.
A. Irbah, M. Bouzaria, L. Lakhal, R. Moussaoui, J. Borgnino, F. Laclare, C. Delmas, Solar Physics, 1999, 185(2):255-273
Good edge extraction from temporal series of solar images is fundamental to the solar astrolabe experiment. Noise and parasitic spots in the images, however, make it difficult to extract an accurate solar edge. We present in this paper a new image-processing method that solves this problem and thus improves the characteristics of the solar astrolabe experiment. The method is based on the use of the wavelet transform in solar image analysis and is developed to remove image defects (parasitic spots) and noise without reducing image resolution. Solar images obtained with the Calern Observatory astrolabe (France) are then processed using this method; their solar edges are extracted and the corresponding trajectories reconstructed.
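The paper's specific wavelet filtering of astrolabe images is not reproduced here; a generic wavelet soft-threshold denoiser built on PyWavelets illustrates the idea of suppressing noise and small parasitic features while keeping resolution. The wavelet family, decomposition level, and threshold rule are assumptions.

```python
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", level=3, sigma=None):
    """Soft-threshold the detail coefficients of a 2D wavelet transform.

    A generic sketch, not the astrolabe pipeline: resolution is preserved
    by thresholding only the detail bands and leaving the approximation
    band untouched.
    """
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    if sigma is None:
        # Robust noise estimate from the finest diagonal detail band.
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(image.size))   # universal threshold
    new_coeffs = [coeffs[0]]
    for details in coeffs[1:]:
        new_coeffs.append(tuple(pywt.threshold(d, threshold, mode="soft")
                                for d in details))
    denoised = pywt.waverec2(new_coeffs, wavelet)
    return denoised[:image.shape[0], :image.shape[1]]
```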

10.
Lucky imaging is a high-resolution astronomical image recovery technique with two classic implementation algorithms: image selecting, shifting and adding in image space, and data selecting and image synthesizing in Fourier space. This paper proposes a novel lucky imaging algorithm in which, with the space-domain and frequency-domain selection rates as a link, the two classic algorithms are combined, each becoming a proper subset of the hybrid algorithm. Experimental results show that, on the same dataset and platform, the high-resolution image obtained by the proposed algorithm is superior to those obtained by the two classic algorithms. This paper also proposes a new lucky image selection and storage scheme, which greatly reduces memory usage and allows the lucky imaging algorithm to run on an ordinary desktop or laptop with limited memory and to process astronomical image sequences with more frames and larger sizes. In addition, through simulation analysis, this paper discusses the binary-star detection limits of the novel lucky imaging algorithm and the traditional ones under different atmospheric conditions.
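A minimal sketch of the classic image-space branch (select, shift and add). Peak brightness is used as a crude sharpness metric and frames are registered on their brightest pixel; both are simplifying assumptions, not the paper's hybrid selection scheme.

```python
import numpy as np

def lucky_shift_and_add(frames, keep_fraction=0.1):
    """Classic image-space lucky imaging: select the sharpest frames,
    register them on their brightest pixel, and average.

    `frames` is an (n, ny, nx) array of short exposures; peak intensity is
    used as a crude sharpness metric (an assumption for this sketch).
    """
    frames = np.asarray(frames, dtype=float)
    n_keep = max(1, int(keep_fraction * len(frames)))
    # Rank frames by peak brightness and keep the best ones.
    order = np.argsort(frames.reshape(len(frames), -1).max(axis=1))[::-1]
    selected = frames[order[:n_keep]]

    ny, nx = frames.shape[1:]
    stack = np.zeros((ny, nx))
    for frame in selected:
        peak = np.unravel_index(np.argmax(frame), frame.shape)
        # Shift so that every frame's brightest speckle lands at the centre.
        shift = (ny // 2 - peak[0], nx // 2 - peak[1])
        stack += np.roll(frame, shift, axis=(0, 1))
    return stack / n_keep
```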

11.
We present a new method of image cleaning for imaging atmospheric Cherenkov telescopes. The method is based on the use of wavelets to identify noise pixels in images of gamma-ray and hadron-induced air showers. It selects more signal pixels containing Cherenkov photons than traditional image-processing techniques, while being equally efficient at rejecting pixels containing noise alone. The inclusion of more signal pixels in an image of an air shower allows a more accurate reconstruction, especially at the lower gamma-ray energies that produce low levels of light. We present the results of Monte Carlo simulations of gamma-ray and hadronic air showers which show improved angular resolution using this cleaning procedure. Data from the Whipple Observatory's 10-m telescope are used to show the efficacy of the method for extracting a gamma-ray signal from the background of hadron-generated images.

12.
With increasingly large data sets, weak lensing measurements are able to measure cosmological parameters with ever-greater precision. However, this increased accuracy also places greater demands on the statistical tools used to extract the available information. To date, the majority of lensing analyses use the two-point statistics of the cosmic shear field. These can be studied either directly, using the two-point correlation function, or in Fourier space, using the power spectrum. But analysing weak lensing data inevitably involves masking out regions, for example to remove bright stars from the field. Masking out the stars is common practice, but the gaps in the data need proper handling. In this paper, we show how an inpainting technique allows us to fill in these gaps with only N log N operations, leading to a new image from which we can compute both the power spectrum and the bispectrum straightforwardly and with very good accuracy. We then propose a new method to compute the bispectrum with a polar FFT algorithm, which has the main advantage of avoiding any interpolation in the Fourier domain. Finally, we propose a new method for dark matter mass map reconstruction from shear observations, which integrates this new inpainting concept. A range of examples based on 3D N-body simulations illustrates the results.
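The inpainting and polar-FFT bispectrum steps are beyond a short sketch; what the filled-in map feeds into, an azimuthally averaged power spectrum, can be estimated as below. This is a generic estimator with arbitrary binning, not the paper's pipeline.

```python
import numpy as np

def radial_power_spectrum(field, n_bins=50):
    """Azimuthally averaged power spectrum of a 2D map.

    A generic estimator applied after the masked regions have been filled
    in (the inpainting step itself is not reproduced here).
    """
    ny, nx = field.shape
    fk = np.fft.fftshift(np.fft.fft2(field))
    power = np.abs(fk) ** 2 / (nx * ny)

    ky, kx = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(ny)),
                         np.fft.fftshift(np.fft.fftfreq(nx)), indexing="ij")
    k = np.hypot(kx, ky)

    bins = np.linspace(0.0, k.max(), n_bins + 1)
    which = np.clip(np.digitize(k.ravel(), bins) - 1, 0, n_bins - 1)
    pk = np.array([power.ravel()[which == i].mean() if np.any(which == i) else np.nan
                   for i in range(n_bins)])
    k_centres = 0.5 * (bins[1:] + bins[:-1])
    return k_centres, pk
```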

13.
This paper discusses the degradation of astronomical image quality caused by the atmosphere and studies the seeing parameter r0, together with several other atmospheric optical parameters, used to describe this degradation. Several methods for measuring image-degradation parameters are reviewed, with emphasis on our attempt to apply the differential image motion method to daytime seeing measurements and on a newly designed seeing monitor that can operate both day and night; the instrument has already been used for daytime seeing measurements at the Yunnan Observatory. Finally, a quantitative analysis of the near-field approximation is carried out, and on this basis a newly determined range of validity of the near-field approximation is given, which is somewhat larger than previously reported.
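For reference, the Fried parameter r0 referred to above sets the seeing disc through the standard Kolmogorov relation, and the differential image motion method infers r0 from the variance of the differential motion between two sub-apertures; the dimensionless coefficients depend on the sub-aperture diameter D and separation d and are only indicated schematically here.

```latex
% Seeing (FWHM of the long-exposure image) in terms of the Fried parameter r_0
% at wavelength \lambda:
\varepsilon_{\mathrm{FWHM}} \;\simeq\; 0.98\,\frac{\lambda}{r_0}
% Differential image motion: the longitudinal/transverse variances of the
% relative motion between two sub-apertures (diameter D, separation d) scale as
\sigma_{l,t}^{2} \;=\; K_{l,t}\!\left(\tfrac{d}{D}\right)\,
\lambda^{2}\, r_0^{-5/3}\, D^{-1/3},
% so r_0 (and hence the seeing) follows from the measured variances once the
% tabulated dimensionless coefficients K_{l,t} are known
% (e.g. Sarazin & Roddier 1990).
```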

14.
E. Pagot, P. Lamy, A. Llebaria, B. Boclet, Solar Physics, 2014, 289(4):1433-1453
We report on automated procedures for correcting the images of the LASCO coronagraph for i) spurious quasi-point sources such as the impacts of cosmic rays, stars, and planets, and ii) the absence of signal due to transmission errors or dropouts, which results in blocks of missing information in the images. Correcting for these undesirable artifacts is mandatory for all quantitative work on the solar corona that requires data inversion and/or long series of images, for instance. The nonlinear filtering of spike noise or point-like objects is based on mathematical morphology and implements the procedure of opening by morphological reconstruction; however, a simple opening filter is applied whenever the fractional area of corrupted pixels exceeds 50% of the original image. We describe different strategies for reconstructing the missing information blocks. In general, it is possible to implement the method of averaged neighbors using the two images obtained immediately before and after the corrupted image. For the other cases, and in particular when missing blocks overlap in three images, we developed an original procedure of weighted interpolation along radial profiles from the center of the Sun that intercept the missing block(s). This procedure is also adequate for the saturated images of bright planets (such as Venus) that bleed into the neighboring pixels. Missing blocks in polarized images may generally be reconstructed using the associated unpolarized image of the same format, but in the case of overlapping missing blocks we again apply our procedure of weighted interpolation. All tests performed on numerous LASCO-C2 images at various periods of solar activity (i.e. varying complexity of the coronal structure) demonstrate the excellent performance of these new procedures, with results vastly superior to those of the methods implemented so far in the pipeline processing of the LASCO images.
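A minimal scikit-image sketch of opening by morphological reconstruction, the operation named above for isolating point-like spikes; the structuring-element radius is an assumed parameter, not a LASCO pipeline value.

```python
import numpy as np
from skimage.morphology import erosion, disk, reconstruction

def remove_point_spikes(image, radius=2):
    """Opening by morphological reconstruction.

    Erode the image, then reconstruct by dilation under the original:
    structures smaller than the structuring element (cosmic-ray hits,
    star or planet cores) cannot be rebuilt and are removed. The radius
    is an assumed parameter, not the LASCO pipeline value.
    """
    seed = erosion(image, disk(radius))
    opened = reconstruction(seed, image, method="dilation")
    spikes = image - opened          # the point-like residue that was removed
    return opened, spikes
```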

15.
A new approach is proposed and developed to handle pre-processed CCD frames in order to identify stellar images and derive their relevant parameters. The present method relies on: 1) identifying stellar images and assigning approximate positions to their centres using artificial-intelligence techniques; 2) accurately determining the centre coordinates by applying an elementary statistical concept; and 3) estimating the image peak intensity as a measure of stellar magnitude using a simple numerical-analysis approach. The method has been coded for personal-computer users. A CCD frame of the star cluster M67 was adopted as a test case. The results obtained are discussed in comparison with those of DAOPHOT II and the corresponding published data. The two sets of results coincide except in a very few cases, and these exceptions are discussed in view of the bases of both methods and of the cluster plates. The suggested method proves to be a very simple, extremely fast, and highly precise approach to stellar CCD photometry. Moreover, it is more capable of handling blended and distorted stellar images than DAOPHOT II. These characteristics show the usefulness of the present method in astronomical applications such as auto-focusing and auto-guiding sensing, besides its main purpose of stellar photometry.
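The "elementary statistical concept" used for the centre is not spelled out in the abstract; an intensity-weighted centroid over a background-subtracted window is a common centring recipe and is sketched below for illustration (window size and background estimate are assumptions, not the paper's prescription).

```python
import numpy as np

def windowed_centroid(frame, x0, y0, half_width=5):
    """Intensity-weighted centroid and peak estimate around an approximate
    star position (x0, y0).

    A common centring recipe shown for illustration; the window size and
    the median background subtraction are assumptions, not the paper's
    exact method.
    """
    y_lo, y_hi = int(y0) - half_width, int(y0) + half_width + 1
    x_lo, x_hi = int(x0) - half_width, int(x0) + half_width + 1
    window = frame[y_lo:y_hi, x_lo:x_hi].astype(float)
    window = np.clip(window - np.median(window), 0, None)   # crude background removal

    yy, xx = np.mgrid[y_lo:y_hi, x_lo:x_hi]
    total = window.sum()
    xc = (window * xx).sum() / total
    yc = (window * yy).sum() / total
    peak = window.max()          # peak intensity as a simple magnitude proxy
    return xc, yc, peak
```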

16.
We derive a generalized van Cittert-Zernike (vC-Z) theorem for radio astronomy that is valid for partially polarized sources over an arbitrarily wide field of view (FoV). The classical vC-Z theorem is the theoretical foundation of radio astronomical interferometry, and its application is the basis of interferometric imaging. Existing generalized vC-Z theorems in radio astronomy assume, however, either paraxiality (narrow FoV) or scalar (unpolarized) sources. Our theorem uses neither of these assumptions, which are seldom fulfilled in practice in radio astronomy, and treats the full electromagnetic field. To handle wide, partially polarized fields, we extend the two-dimensional (2D) electric field (Jones vector) formalism of the standard 'Measurement Equation' (ME) of radio astronomical interferometry to the full three-dimensional (3D) formalism developed in optical coherence theory. The resulting vC-Z theorem enables full-sky imaging in a single telescope pointing, and imaging based not only on standard dual-polarized interferometers (which measure 2D electric fields) but also on electric tripoles and electromagnetic vector-sensor interferometers. We show that the standard 2D ME is easily obtained from our formalism in the case of dual-polarized antenna-element interferometers. We also exploit an extended 2D ME to determine that dual-polarized interferometers can have polarimetric aberrations at the edges of a wide FoV. Our vC-Z theorem is particularly relevant to proposed, and recently developed, wide-FoV interferometers such as the Low Frequency Array (LOFAR) and the Square Kilometer Array (SKA), for which direction-dependent effects will be important.
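For context, the standard dual-polarization Measurement Equation that the paper generalizes relates the visibility measured by antennas p and q to the source coherency (brightness) matrix through per-antenna Jones matrices; sign and normalisation conventions vary between authors.

```latex
% Standard 2x2 Measurement Equation (single direction), with per-antenna
% Jones matrices J_p, J_q and source coherency matrix B:
\mathsf{V}_{pq} \;=\; \mathsf{J}_p\,\mathsf{B}\,\mathsf{J}_q^{\mathsf{H}},
\qquad
\mathsf{B} \;=\; \bigl\langle \mathbf{e}\,\mathbf{e}^{\mathsf{H}} \bigr\rangle
\;=\; \tfrac{1}{2}
\begin{pmatrix} I+Q & U+\mathrm{i}V \\ U-\mathrm{i}V & I-Q \end{pmatrix}
% (linear-feed basis; the factor 1/2 and the sign of V depend on convention).
```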

17.
This paper analyses the statistical properties of images and the models appropriate to them. The shortcomings of the previously common stationary image models are discussed, and an analysis of the statistics of real images shows that they are non-stationary, non-ergodic, and highly correlated spatially. The corresponding random-parameter statistical models and descriptive statistical models are then discussed. It is pointed out that "global" image-restoration algorithms, which act on the whole image, are superior to "point" restoration algorithms that compute each pixel independently.

18.
Noriaki Miura, Naoshi Baba, Takashi Sakurai, Kiyoshi Ichimoto, Dirk Soltau, Peter Brandt, Solar Physics, 1999, 187(2):347-356
A method for improving the resolution of observed solar images is proposed. A blind deconvolution method is used to restore an atmospherically degraded solar image, and a super-resolution method is then applied to the restored image to improve the resolution further. It is confirmed that the blind deconvolution process can restore fine structures that are blurred in the observed image, and that the super-resolution process can raise the cutoff frequency of the blind-deconvolved image. A time series of super-resolved images of a sunspot observed with the 70-cm Vacuum Tower Telescope at Teide Observatory is presented.

19.
The increasing use of data compression by space mission experiments raises the question of the quality of the images obtained after the compression-decompression process. Indeed, with an Image Compression Module (ICM) using the Discrete Cosine Transform (DCT) on 8×8-pixel sub-images (each pixel coded on eight bits), blocking effects appear on the sub-image boundaries. Avril and Nguyen (1992, hereafter ANG 1992) have shown that One Neighbour Accounting Filters (ONAF), used after image reconstruction without modifying the coding method, provide the best and fastest correction as far as linear filtering is concerned. We present here a non-linear method, also applied after image reconstruction, but working on spatial frequencies. It allows us to segregate, in Fourier space, the signal from the defect, and then to remove the defect by applying a filter adapted to the frequency spectrum of each spoiled image; applying the inverse Fourier transform then yields the corrected image. The efficiency of this new method was tested in three different ways: (i) when Fourier filtering is applied to a reference set of aerial photographs of the Earth, blocking effects are essentially indistinguishable to the human eye, even when zooming in on the images, which was not the case with ONAF; (ii) the improvement in the root-mean-square (RMS) error, calculated between the filtered and original images, is at least three times greater than that obtained with ONAF; and (iii) the reconstruction of a three-dimensional view of a landscape from two stereoscopic images that have undergone a compression-decompression process with a DCT-based algorithm and a compression rate of about 10 is possible only after Fourier filtering has been applied. The quite good preliminary results of applying Fourier filtering to the Clementine images of the Moon are also presented.
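The paper's filter is adapted to the spectrum of each spoiled image; as a simplified, static illustration of the frequency-domain idea, the sketch below attenuates narrow bands around the harmonics of the 8-pixel block period, where 8×8 blocking artifacts concentrate. The band width and attenuation are assumed parameters.

```python
import numpy as np

def suppress_blocking_artifacts(image, block=8, half_width=1, gain=0.1):
    """Attenuate the spatial frequencies where 8x8 DCT blocking artifacts live.

    Blocking effects repeat with the block period, so their energy sits near
    horizontal/vertical frequencies k/block cycles per pixel. This static
    notch filter is a simplification of the paper's adaptive, per-image
    filter; `half_width` and `gain` are assumed parameters.
    """
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]

    mask = np.ones((ny, nx))
    for k in range(1, block // 2 + 1):
        f0 = k / block
        near_fy = np.abs(np.abs(fy) - f0) < half_width / ny
        near_fx = np.abs(np.abs(fx) - f0) < half_width / nx
        # Attenuate the rows/columns of Fourier space at the block harmonics.
        mask[np.broadcast_to(near_fy, (ny, nx))] *= gain
        mask[np.broadcast_to(near_fx, (ny, nx))] *= gain

    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * mask))
```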

20.
The possibility of reconstructing surface topography from single images with the photometric method in the linear approximation is analyzed. The photometric method of surface topography reconstruction employs a statistical approach to the problem formulation and is the most mathematically rigorous: it allows determination of the most probable surface topography given specific observational data. When only one image is available, the photometric method is superior to the currently available photoclinometry. The processing of test surface topography with the photometric method shows that, under typical conditions, the error of surface relief reconstruction is no higher than 40% in terms of the standard deviation of the surface height. The surface relief of some Martian areas is reconstructed from HRSC images obtained by the Mars Express spacecraft. It is shown that the image-reconstructed surface topography is in good agreement with the topographic information for the same Martian areas obtained by the MOLA altimeter.
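For reference, the linear approximation mentioned above can be written, for a Lambertian surface with the Sun at incidence angle i and the x axis along the solar azimuth, as a first-order relation between relative brightness and slope; heights then follow by integrating the slopes. This is the generic photoclinometric linearization, not the paper's statistical formulation.

```latex
% First-order (small-slope) Lambertian photoclinometry, Sun toward +x at
% incidence angle i; I_flat is the brightness of a horizontal facet:
\frac{I(x,y)}{I_{\mathrm{flat}}} \;\approx\; 1 \;-\; \tan i\,\frac{\partial h}{\partial x}
\qquad\Longrightarrow\qquad
\frac{\partial h}{\partial x} \;\approx\;
\cot i \left(1 - \frac{I(x,y)}{I_{\mathrm{flat}}}\right),
% with the height h(x,y) recovered by integrating the slope along the solar
% azimuth direction.
```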
