Similar documents: 20 results found.
1.
We present a study of the dynamic range limitations in images produced with the proposed Square Kilometre Array (SKA) using the Cotton-Schwab CLEAN algorithm for data processing. The study is limited to the case of a small field of view and a snapshot observation. A new modification of the Cotton-Schwab algorithm involving optimization of the positions of clean components is suggested. This algorithm can reach a dynamic range as high as 10^6 even if the point source lies between image grid points, in contrast to about 10^3 for existing CLEAN-based algorithms in the same circumstances. It is shown that the positional accuracy of clean components, floating-point precision and the w-term are extremely important at high dynamic range. The influence of these factors can be reduced if the variance of the gradient of the point spread function is minimized during the array design.
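As an illustration only, here is a minimal Högbom-style CLEAN minor cycle in Python. It keeps components locked to image grid points, i.e., it does not include the paper's positional optimization or the Cotton-Schwab major cycles; the function name and parameters are hypothetical.

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=500, threshold=0.0):
    """Minimal Hogbom-style CLEAN minor cycle with grid-locked components.
    psf must be peak-normalised and twice the size of the dirty image,
    with its peak at the centre, so it can be subtracted at any position."""
    res = dirty.astype(float)
    ny, nx = dirty.shape
    cy, cx = psf.shape[0] // 2, psf.shape[1] // 2
    comps = np.zeros_like(res)
    for _ in range(niter):
        py, px = np.unravel_index(np.argmax(np.abs(res)), res.shape)
        if np.abs(res[py, px]) < threshold:
            break
        flux = gain * res[py, px]
        comps[py, px] += flux
        # subtract the scaled PSF, shifted so its peak lies on the component
        res -= flux * psf[cy - py:cy - py + ny, cx - px:cx - px + nx]
    return comps, res
```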

2.
Source finding is an important step in astronomical data-processing pipelines and one of the challenges that next-generation radio telescopes, represented by the Square Kilometre Array (SKA), face when handling massive data volumes. Automatic source-finding algorithms and software have matured and entered routine use, but there is still room for improvement in automation and compatibility. Aiming at source-finding methods that are more automated and better suited to massive data, and building on existing algorithms, an automatic source-finding software system has been designed and developed. The system provides a friendly interactive user interface, visual display of output data, compatibility with different input and output formats, and file-management functions for practical use. It can automatically process both large sky-area images and image collections. Test results show that these measures improve source finding to a certain extent. Further development of the integrated system is planned to meet additional requirements.
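As a rough sketch of the kind of automatic source finding described above (not the software system itself), the following Python function thresholds an image at n sigma above a robust noise estimate and labels the resulting islands; the function name and default threshold are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def find_sources(image, nsigma=5.0):
    """Threshold-based source finder: returns (row, col, peak) per island."""
    # robust rms estimate from the median absolute deviation
    rms = 1.4826 * np.median(np.abs(image - np.median(image)))
    labels, nsrc = ndimage.label(image > nsigma * rms)   # group pixels into islands
    if nsrc == 0:
        return []
    idx = range(1, nsrc + 1)
    peaks = ndimage.maximum(image, labels, idx)
    positions = ndimage.maximum_position(image, labels, idx)
    return [(int(y), int(x), float(p)) for (y, x), p in zip(positions, peaks)]
```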

3.
A sky model from CLEAN deconvolution is a particularly effective high dynamic range reconstruction in radio astronomy, which can effectively model the sky and remove the sidelobes of the point spread function (PSF) caused by incomplete sampling in the spatial frequency domain. Compared to scale-free and multi-scale sky models, adaptive-scale sky modeling, which can model both compact and diffuse features, has been shown to have better sky-modeling capabilities on narrowband simulated data, especially for large-scale features in high-sensitivity observations, which are exactly one of the challenges of data processing for the Square Kilometre Array (SKA). However, adaptive-scale CLEAN algorithms have not been verified on real observational data and allow negative components in the model. In this paper, we propose an adaptive-scale modeling algorithm with a non-negativity constraint and wideband imaging capability, and apply it to simulated SKA data and real observational data from the Karl G. Jansky Very Large Array (JVLA), an SKA precursor. Experiments show that the new algorithm can reconstruct more physical models with rich details. This work is a step forward for future SKA image reconstruction and the development of SKA imaging pipelines.
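To make the non-negativity constraint concrete, here is a toy Python fit of a single circular-Gaussian "scale" component whose amplitude is bounded below by zero. This is only a sketch of the idea, not the paper's adaptive-scale algorithm; the parameterization and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_scale_component(resid_img, y0, x0):
    """Fit one circular-Gaussian component at a fixed position (y0, x0),
    with the amplitude constrained to be non-negative."""
    ny, nx = resid_img.shape
    yy, xx = np.mgrid[:ny, :nx]
    r2 = (yy - y0) ** 2 + (xx - x0) ** 2

    def model(p):                       # p = (amplitude, scale in pixels)
        amp, scale = p
        return amp * np.exp(-r2 / (2.0 * scale ** 2))

    def errfunc(p):
        return (model(p) - resid_img).ravel()

    p0 = [max(resid_img[y0, x0], 1e-6), 2.0]
    fit = least_squares(errfunc, p0,
                        bounds=([0.0, 0.5], [np.inf, max(ny, nx)]))  # amp >= 0
    return fit.x  # best-fitting (amplitude, scale)
```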

4.
Wide-field imaging with low-frequency radio telescopes is subject to a number of difficult problems. One particularly pernicious problem is the non-coplanar baseline effect: the final image is distorted when the w-term, the phase associated with the w-direction, is ignored. The image degradation is amplified for telescopes with a wide field of view. This paper summarizes and analyzes several w-term correction methods and their technical principles, and compares their advantages and disadvantages in terms of computational cost and complexity. We conduct simulations with two of these methods, faceting and w-projection, based on the configuration of the first-phase Square Kilometre Array (SKA) low-frequency array. The resulting images are also compared with the result of the two-dimensional Fourier transform method. The results show that both faceting and w-projection yield better image quality and correctness than the two-dimensional Fourier transform in wide-field imaging. The effects of the number of facets and of the w-direction step length on image quality and running time are evaluated; both must be chosen reasonably. Finally, we analyze the effect of data size on the running times of the faceting and w-projection algorithms, and find that both need to be optimized before processing very large data volumes. This work initiates an analysis of wide-field imaging techniques and their applications in existing and future low-frequency arrays, and will foster their application in even broader fields.
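For reference, the w-term that both faceting and w-projection try to correct enters as a direction-dependent phase. The short sketch below evaluates that phase screen over a small field using the standard formula (sign conventions differ between packages); the example field size and w value are arbitrary.

```python
import numpy as np

def w_kernel_phase(w, l, m):
    """Phase screen of the w-term for direction cosines (l, m) and a
    baseline w-coordinate (in wavelengths):
        G(l, m; w) = exp(-2*pi*i * w * (sqrt(1 - l^2 - m^2) - 1)).
    w-projection convolves each visibility with the Fourier transform of G."""
    n = np.sqrt(1.0 - l ** 2 - m ** 2)
    return np.exp(-2j * np.pi * w * (n - 1.0))

# example: phase screen over a ~1-degree field for w = 1000 wavelengths
lm = np.radians(np.linspace(-0.5, 0.5, 256))
l, m = np.meshgrid(lm, lm)
screen = w_kernel_phase(1000.0, l, m)
```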

5.
The Square Kilometre Array (SKA) project is an international collaboration to build the world's largest radio telescope, whose sensitivity and survey speed will exceed those of all current radio telescopes by an order of magnitude. Continuum surveys are one of SKA's main observing modes; reference sky catalogues of the survey regions built from continuum imaging will lay an important foundation for subsequent astronomical research. The GaLactic and Extragalactic All-sky Murchison Widefield Array survey eXtended (GLEAM-X) is a new low-frequency radio continuum survey carried out in 2018-2020 with the extended Phase II array of the Murchison Widefield Array (MWA), an SKA precursor, and it accumulated a large volume of low-frequency survey data. Automated, large-scale processing of massive observational data is one of the biggest challenges facing the SKA; experience in optimising imaging pipelines on distributed execution frameworks will help address this problem. This paper describes the GLEAM-X imaging pipeline in detail and integrates and improves it on the China SKA Regional Centre Prototype,...

6.
Antenna gain calibration is a key step in processing radio astronomical observation data. This paper analyses the basic principle of the classical antenna gain calibration algorithm Antsol and presents a high-performance Python implementation of it. The resulting code has been integrated into the Radio Astronomy Simulation, Calibration and Imaging Library (RASCIL) of the Square Kilometre Array (SKA), providing support for current SKA data processing as well as an algorithmic reference for future performance optimisation.
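A minimal sketch of an Antsol-style iterative gain solution is given below, assuming the usual measurement model V_ij ≈ g_i · conj(g_j) · M_ij. The data layout (dicts keyed by antenna pair) and the damping factor are illustrative choices, not the RASCIL implementation.

```python
import numpy as np

def solve_gains(vis, model, n_ant, n_iter=30):
    """Iterative antenna-gain solution in the spirit of Antsol.
    vis, model: dicts keyed by antenna pair (i, j), i < j, holding the
    observed and model visibilities for one solution interval.
    Solves V_ij ~ g_i * conj(g_j) * M_ij for the complex gains g."""
    g = np.ones(n_ant, dtype=complex)
    for _ in range(n_iter):
        num = np.zeros(n_ant, dtype=complex)
        den = np.zeros(n_ant)
        for (i, j), v in vis.items():
            m = model[(i, j)]
            # least-squares update contributions from baseline (i, j)
            num[i] += v * g[j] * np.conj(m)
            den[i] += np.abs(g[j] * m) ** 2
            num[j] += np.conj(v) * g[i] * m
            den[j] += np.abs(g[i] * m) ** 2
        g = 0.5 * (g + num / den)        # damping for stable convergence
    return g
```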

7.
The performance goals of the Square Kilometre Array (SKA) are such that major departures from prior practice for imaging interferometer arrays are required. One class of solutions involves the construction of large numbers of stations, each composed of one or more small antennas. The advantages of such a "large-N" approach are already documented, but attention has recently been drawn to scaling relationships for SKA data processing that imply excessive computing costs associated with the use of small antennas. In this paper we examine the assumptions that lead to such scaling laws, and argue that in general they are unlikely to apply to the SKA situation. A variety of strategies for SKA imaging which exhibit better scaling behaviour are discussed. Particular attention is drawn to field-of-view issues, and the possibility of using weighting functions within an advanced correlator system to precisely control the field-of-view.

8.
The new generation of radio telescopes, such as the proposed Square Kilometre Array (SKA) and the Low-Frequency Array (LOFAR), rely heavily on the use of very large phased aperture arrays operating over wide bandwidths at frequencies up to approximately 1.4 GHz. The SKA in particular will include aperture arrays consisting of many thousands of elements per station, providing unparalleled survey speeds. Currently two different arrays (nominally 70 MHz to 450 MHz and 400 MHz to 1.4 GHz) are being studied for inclusion within the overall SKA configuration. In this paper we analyze the array contribution to system temperature for a number of regular and irregular planar antenna array configurations which are possible geometries for the low-frequency SKA (sparse disconnected arrays). We focus on the sub-500 MHz band, where the real-sky contribution to the system temperature (T_sys) is highly significant and dominates the overall system noise temperature. We compute the sky noise contribution to T_sys by simulating the far-field response of a number of SKA stations and convolving it with the sky brightness temperature distribution from the Haslam 408 MHz survey, scaled to observations at 100 MHz. Our analysis of array temperature assumes observations of three cold regions above and below the Galactic plane. The results show the advantages of regular arrays when sampled at the Nyquist rate, as well as their disadvantage, in the form of grating lobes, when under-sampled in comparison to non-regular arrays.
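A compact sketch of the computation described above is given below, with an assumed Galactic synchrotron spectral index of 2.55 for the scaling from 408 MHz to 100 MHz (the abstract does not quote the value actually used); map and beam are taken to be matched arrays over the same sky pixels.

```python
import numpy as np

def sky_temperature(t408_map, beam_power, freq_mhz=100.0, beta=2.55):
    """Beam-weighted sky noise temperature at freq_mhz, scaled from a
    408 MHz brightness-temperature map with a power-law spectral index.
    beta = 2.55 is an assumed typical Galactic synchrotron index."""
    t_map = t408_map * (freq_mhz / 408.0) ** (-beta)   # scale the Haslam map
    # beam-weighted average of the sky over the station power pattern
    return np.sum(t_map * beam_power) / np.sum(beam_power)
```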

9.
The future Square Kilometre Array (SKA) radio telescope is an interferometer array that will use a variety of collector types, including approximately 2500 dishes distributed with separations up to a few thousand kilometres, and about 250 aperture array (AA) stations located within 200 km of the core. The data rates associated with each individual collector are vast: around 10 GBytes/s for each dish and 2 TBytes/s for an AA station. As each of these must be connected directly to a central correlator, designing a cost-effective cabling and trenching infrastructure presents a great engineering challenge. In this paper we discuss approaches to performing this optimisation. In graph theory, the concept of a minimum spanning tree (MST) is equivalent to finding the minimum total trench length joining a set of n arbitrary points in the plane. We have developed a set of algorithms which iteratively optimise the infrastructure of any given telescope layout, taking into consideration not only trenching but also cabling and jointing costs. Solutions for a few example configurations of the telescope layout are presented. We find that these solutions depend significantly on the collectors' output data rates. When compared to a "traditional" MST-based approach which minimises trenching costs only, our algorithms can further reduce total costs by up to 15-20%. This can greatly influence the SKA infrastructure-related costs.
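For comparison with the paper's cost-aware algorithms, the "traditional" trench-only MST baseline can be computed in a few lines with SciPy. The function below is an illustrative sketch and ignores cabling and jointing costs.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def trench_mst(positions):
    """Minimum spanning tree over collector positions (N x 2 array, metres).
    Returns the total trench length and the list of connected pairs.
    Note: minimises trenching only, unlike the paper's full cost model."""
    dist = squareform(pdist(positions))      # full pairwise distance matrix
    mst = minimum_spanning_tree(dist)        # sparse matrix of kept edges
    i, j = mst.nonzero()
    return mst.sum(), list(zip(i, j))
```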

10.
Machine learning has achieved great success in many areas today. Boosting algorithms adapt well to a wide range of scenarios with high accuracy and have played a major role in many fields, but their application in astronomy is still rare. In response to the low classification accuracy of the faint star/galaxy source set in the Sloan Digital Sky Survey (SDSS), a recent development in machine learning, eXtreme Gradient Boosting (XGBoost), is introduced. The complete photometric data set is obtained from SDSS DR7 and divided into a bright source set and a faint source set according to magnitude. First, ten-fold cross-validation is applied to the bright and faint source sets respectively, and the XGBoost algorithm is used to build the star/galaxy classification model. Then, grid search and other methods are used to tune the XGBoost parameters. Finally, based on the galaxy classification accuracy and other indicators, the classification results are analyzed by comparison with models based on function trees (FT), Adaptive Boosting (AdaBoost), Random Forests (RF), Gradient Boosting Decision Trees (GBDT), Stacked Denoising AutoEncoders (SDAE), and Deep Belief Nets (DBN). The experimental results show that XGBoost improves the galaxy classification accuracy on the faint source set by nearly 10% compared with the function tree algorithm, and improves the classification accuracy of the faintest-magnitude sources in the faint source set by nearly 5%. Compared with other traditional machine learning algorithms and deep neural networks, XGBoost also shows improvements of varying degrees.
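A hedged sketch of the training setup described above (ten-fold cross-validation plus a grid search over XGBoost hyper-parameters) follows; the feature matrix, parameter grid and scoring choice are assumptions for illustration, not the paper's exact configuration.

```python
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from xgboost import XGBClassifier

def train_star_galaxy(X, y):
    """X: photometric features (e.g., model magnitudes and colours);
    y: 0 = star, 1 = galaxy. Returns the tuned model and its CV accuracy."""
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)  # ten-fold CV
    grid = {"max_depth": [4, 6, 8],          # illustrative grid, not the paper's
            "n_estimators": [200, 500],
            "learning_rate": [0.05, 0.1]}
    search = GridSearchCV(XGBClassifier(eval_metric="logloss"),
                          grid, cv=cv, scoring="accuracy", n_jobs=-1)
    search.fit(X, y)
    return search.best_estimator_, search.best_score_
```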

11.
Lucky imaging is a high-resolution image-recovery technique that selects a small number of "lucky" frames from a large set of short-exposure images and then registers and stacks them, effectively reducing the impact of atmospheric turbulence on image quality. However, traditional lucky imaging algorithms based on the Central Processing Unit (CPU) are difficult to run in real time. Exploiting the parallelism and flexibility of Field Programmable Gate Arrays (FPGAs), a new FPGA-based lucky imaging algorithm is proposed and an FPGA experimental system is built. The algorithm adopts a frame-selection strategy with a fixed number of selected frames and no sorting, together with a registration strategy referenced to row and column coordinates, which saves processing time and hardware resources and achieves real-time lucky imaging. The real-time algorithm can be implemented concisely on small and medium-scale FPGAs, and the resulting high-resolution images are identical to those produced by the traditional CPU-based algorithm. Experiments show that for 2000 input frames of 128x128 pixels, the algorithm runs 27 times faster than the algorithm previously proposed by our laboratory and more than 150 times faster than the traditional CPU+MATLAB lucky imaging algorithm, reaching a processing rate of 197 frames per second. The algorithm and its FPGA implementation can be used to build a truly real-time lucky imaging system.
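As a software illustration of the selection/registration/stacking steps (not the FPGA design described above), the following NumPy sketch keeps a fixed number of the sharpest frames, registers them on their brightest pixel and co-adds them; the quality metric and shift-to-centre convention are assumptions.

```python
import numpy as np

def lucky_stack(frames, n_select=100):
    """frames: array of shape (n_frames, ny, nx). Returns the stacked image."""
    # frame quality metric: peak intensity (a common lucky-imaging criterion)
    quality = frames.reshape(len(frames), -1).max(axis=1)
    best = np.argsort(quality)[-n_select:]
    ny, nx = frames.shape[1:]
    stack = np.zeros((ny, nx))
    for idx in best:
        py, px = np.unravel_index(np.argmax(frames[idx]), (ny, nx))
        # shift so the brightest pixel lands at the image centre, then co-add
        stack += np.roll(frames[idx], (ny // 2 - py, nx // 2 - px), axis=(0, 1))
    return stack / n_select
```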

12.
NASA is proposing a new receiving facility that needs to beamform broadband signals from hundreds of antennas. This is a similar problem to SKA beamforming, with the added requirement that the processing should not add significant noise or distortion that would interfere with processing spacecraft telemetry data. The proposed solution is based on an FX correlator architecture and uses oversampling polyphase filterbanks to avoid aliasing. Each beamformer/correlator module processes a small part of the total bandwidth for all antennas, eliminating interconnection problems. Processing the summed frequency data with a synthesis polyphase filterbank reconstructs the time series. A suitable choice of oversampling ratio and of analysis and synthesis filters can keep aliasing below −39 dB while keeping the passband ripple low. This approach is readily integrated into the currently proposed SKA correlator architecture.
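For orientation, a critically sampled polyphase analysis filterbank can be written as below; the oversampled analysis/synthesis pair used in the paper follows the same structure but advances the input by fewer than nchan samples per block. The prototype filter and tap count here are illustrative.

```python
import numpy as np
from scipy.signal import firwin

def pfb_analysis(x, nchan=32, ntap=8):
    """Critically sampled polyphase filterbank channelizer.
    Returns an array of shape (n_blocks, nchan) of channelized samples."""
    # prototype low-pass filter, split into ntap taps per channel branch
    win = firwin(nchan * ntap, cutoff=1.0 / nchan, window="hamming")
    win = win.reshape(ntap, nchan)
    nblocks = len(x) // nchan - ntap + 1
    out = np.empty((nblocks, nchan), dtype=complex)
    for i in range(nblocks):
        seg = x[i * nchan:(i + ntap) * nchan].reshape(ntap, nchan)
        # weighted overlap-add across the taps, then FFT across channels
        out[i] = np.fft.fft(np.sum(seg * win, axis=0))
    return out
```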

13.
14.
The acquisition of H I Parkes All Sky Survey (HIPASS) southern sky data commenced at the Australia Telescope National Facility's Parkes 64-m telescope in 1997 February, and was completed in 2000 March. HIPASS is the deepest H I survey yet of the sky south of declination +2°, and is sensitive to emission out to 170 h_75^-1 Mpc. The characteristic root mean square noise in the survey images is 13.3 mJy. This paper describes the survey observations, which comprise 23 020 eight-degree scans of 9-min duration, and details the techniques used to calibrate and image the data. The processing algorithms are successfully designed to be statistically robust to the presence of interference signals, and are tailored to imaging point (or nearly point-like) sources. Specifically, a major improvement in image quality is obtained by designing a median-gridding algorithm which uses the median estimator in place of the mean estimator.
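The idea of the median-gridding step can be illustrated with the toy Python function below, which assigns each sample to an output pixel and takes the median per pixel; the real HIPASS gridding additionally weights samples within a beam-sized smoothing radius, which is not reproduced here.

```python
import numpy as np
from collections import defaultdict

def median_grid(x, y, values, nx, ny, extent):
    """Grid scattered samples onto an (ny, nx) image using the median
    estimator per cell (robust to interference), instead of the mean."""
    x0, x1, y0, y1 = extent
    ix = np.clip(((x - x0) / (x1 - x0) * nx).astype(int), 0, nx - 1)
    iy = np.clip(((y - y0) / (y1 - y0) * ny).astype(int), 0, ny - 1)
    cells = defaultdict(list)
    for i, j, v in zip(iy, ix, values):
        cells[(i, j)].append(v)                  # collect samples per pixel
    image = np.full((ny, nx), np.nan)            # NaN marks empty pixels
    for (i, j), vals in cells.items():
        image[i, j] = np.median(vals)            # median, not mean
    return image
```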

15.
Detection of individual luminous sources during the reionization epoch and cosmic dawn through their signatures in the HI 21-cm signal is one of the direct approaches to probing the epoch. Here, we summarize our previous work on this and present preliminary results on the prospects of detecting such sources using the SKA1-low experiment. We first discuss the expected HI 21-cm signal around luminous sources at different stages of reionization and cosmic dawn. We then introduce two visibility-based estimators for detecting such signals: one based on the matched-filtering technique and the other relying on simply combining the visibility signal from different baselines and frequency channels. We find that SKA1-low should be able to detect ionized bubbles of radius R_b ≳ 10 Mpc with ~100 h of observations at redshift z ~ 8, provided that the mean outside neutral hydrogen fraction x_HI ≳ 0.5. We also investigate the possibility of detecting HII regions around known bright QSOs, such as ULASJ1120+0641 discovered by Mortlock et al. (Nature 474, 7353 (2011)). We find that a 5σ detection is possible with 600 h of SKA1-low observations if the QSO age and the outside x_HI are at least ~2x10^7 yr and ~0.2, respectively. Finally, we investigate the possibility of detecting the very first X-ray and Ly-α sources during the cosmic dawn. We consider mini-QSO-like sources which emit in the X-ray band. We find that with a total of ~1000 h of observations, SKA1-low should be able to detect those sources individually with ~9σ significance at redshift z = 15. We summarize how the SNR changes with various parameters related to the source properties.
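A generic matched-filter combination of visibilities, of the kind the first estimator above is built on, can be sketched as follows; the noise convention (sigma is the rms of the real and imaginary parts of each visibility) and the signal template are assumptions, not the paper's exact estimator.

```python
import numpy as np

def matched_filter_snr(vis, template, sigma):
    """vis: measured visibilities (complex array over baselines/channels),
    template: expected signal from the ionized bubble at the same points,
    sigma: per-visibility noise rms (real and imaginary parts).
    Returns a statistic that is ~N(0, 1) in the absence of signal."""
    w = 1.0 / sigma ** 2
    estimate = np.sum(w * np.real(vis * np.conj(template)))
    norm = np.sqrt(np.sum(w * np.abs(template) ** 2))
    return estimate / norm
```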

16.
The multiple signal classification (MUSIC) algorithm is introduced for estimating the optical variability periods of BL Lac objects. The principle of the MUSIC algorithm is given, together with a test of its spectral resolution using a simulated signal. From the literature, we have collected a large amount of effective observational data of the BL Lac object S5 0716+714 in the three optical bands V, R, and I from 1994 to 2008. The variability periods of S5 0716+714 are obtained by means of the MUSIC algorithm and the averaged periodogram algorithm, respectively. Two major periodic components are found: one with a period of (3.33±0.08) yr and another with a period of (1.24±0.01) yr. A comparison of the periodicity-analysis performance of the two algorithms indicates that the MUSIC algorithm requires a shorter sample length and has good spectral resolution and noise immunity, improving the accuracy of periodicity analysis when the sample length is short.
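A minimal MUSIC pseudospectrum for a uniformly sampled series is sketched below (real light curves are unevenly sampled, so the paper's application needs interpolation or resampling first, which is not shown); the snapshot length m and the evaluation grid are user choices.

```python
import numpy as np

def music_spectrum(x, n_signal, freqs, fs=1.0, m=None):
    """MUSIC pseudospectrum of a time series x.
    n_signal: assumed number of complex exponentials (2 per real sinusoid).
    freqs: frequencies (same units as fs) at which to evaluate the spectrum."""
    x = np.asarray(x, dtype=complex)
    m = m or len(x) // 3
    # estimate the m x m correlation matrix from overlapping snapshots
    snaps = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = snaps.conj().T @ snaps / snaps.shape[0]
    # eigendecomposition: the smallest eigenvalues span the noise subspace
    w, v = np.linalg.eigh(R)
    noise = v[:, : m - n_signal]
    k = np.arange(m)
    spectrum = []
    for f in freqs:
        a = np.exp(2j * np.pi * f / fs * k)          # steering vector
        spectrum.append(1.0 / np.linalg.norm(noise.conj().T @ a) ** 2)
    return np.array(spectrum)
```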

17.
Five out of six Square Kilometre Array (SKA) science programs need extensive surveys at frequencies below 1.4 GHz and only four need high-frequency observations. The latter drive the design toward expensive, high-surface-accuracy collecting area, while the former call for multi-beam receiver systems and extensive post-correlation processing. In this paper, we analyze the system cost of an SKA when the field-of-view (FoV) is extended from 1 deg^2 at 1.4 GHz to 200 deg^2 at 0.7 GHz for three different antenna concepts. We start our analysis by discussing the fundamental limitations and cost issues of wide-band focal plane arrays (FPA) in dishes and cylinders and of wide-band receptors in aperture arrays. We show that a hybrid SKA combining three different antenna technologies gives the highest effective sensitivity for all six key science programs.

18.
Atmospheric turbulence severely restricts the spatial resolution of astronomical images obtained by large ground-based telescopes. To reduce this effect effectively, we propose a blind deconvolution method based on maximum likelihood estimation, with a bandwidth constraint determined by the parameters of the telescope's optical system, in which the convolution error function is minimized with the conjugate gradient algorithm. A relation between the parameters of the telescope optical system and the image's frequency-domain bandwidth is established, and the convergence of the algorithm is accelerated by imposing a positivity constraint on the variables and a limited-bandwidth constraint on the point spread function. To prevent the effective Fourier frequencies from exceeding the cut-off frequency, each image element (e.g., a CCD pixel) in the sampling focal plane should be smaller than one quarter of the diameter of the diffraction spot. The algorithm uses no object-centered constraint, so the proposed method is suitable for restoring images of a whole field of objects. The effectiveness of the method is demonstrated by computer simulations and by the restoration of an observed image of α Piscium.
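The sketch below illustrates the same ingredients with plain alternating gradient steps instead of the paper's conjugate-gradient minimisation: positivity on both object and PSF, and a hard band-limit on the PSF implemented as a low-pass mask in the Fourier domain. The step size, iteration count and cutoff are illustrative, and the PSF is assumed to have the same array shape as the image.

```python
import numpy as np

def blind_deconv(image, psf0, cutoff_frac=0.5, n_iter=200, lr=1e-3):
    """Toy alternating projected-gradient blind deconvolution.
    cutoff_frac: PSF band-limit as a fraction of the Nyquist frequency."""
    obj = image.astype(float)
    psf = psf0.astype(float)
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    mask = np.hypot(fx, fy) <= 0.5 * cutoff_frac          # band-limit mask
    conv = lambda a, b: np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))
    corr = lambda a, b: np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))
    for _ in range(n_iter):
        resid = conv(obj, psf) - image        # model minus data
        obj -= lr * corr(resid, psf)          # gradient step in the object
        psf -= lr * corr(resid, obj)          # gradient step in the PSF
        obj = np.maximum(obj, 0.0)            # positivity constraints
        psf = np.maximum(psf, 0.0)
        psf = np.real(np.fft.ifft2(np.fft.fft2(psf) * mask))  # band-limit the PSF
        psf = np.maximum(psf, 0.0)
        psf /= psf.sum()                      # keep unit flux in the PSF
    return obj, psf
```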

19.
We present and discuss the characteristics and performance, in terms of both computational speed and precision, of a numerical code which integrates the equations of motion of N 'particles' interacting via Newtonian gravitation and moving in a smooth external galactic field. The force on every particle is evaluated by direct summation of the contributions of all the other particles in the system, avoiding truncation error. The time integration is done with second-order and sixth-order symplectic schemes. The code, NBSymple, has been parallelized twice: the Compute Unified Device Architecture (CUDA) is used to make the all-pairs force evaluation as fast as possible on high-performance NVIDIA Tesla C1060 Graphics Processing Units, while the O(N) computations are distributed over various CPUs by means of the OpenMP Application Program Interface. The code works in either single-precision or double-precision floating-point arithmetic. Single precision exploits the GPU performance best but, of course, limits the precision of the simulation in some critical situations. We find a good compromise in using a software reconstruction of double precision for those variables that are most critical for the overall precision of the code. The code is available on the web site astrowww.phys.uniroma1.it/dolcetta/nbsymple.html.
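The second-order symplectic scheme mentioned above is the familiar kick-drift-kick leapfrog; a minimal CPU sketch in G = 1 units is given below, with an assumed Plummer-type softening eps added for illustration. The GPU/OpenMP parallelisation, the external galactic field and the sixth-order scheme are not reproduced.

```python
import numpy as np

def pairwise_accel(pos, mass, eps=1e-3):
    """Direct-summation Newtonian accelerations, O(N^2), softened by eps."""
    d = pos[None, :, :] - pos[:, None, :]                 # displacement vectors
    r3 = (np.sum(d ** 2, axis=-1) + eps ** 2) ** 1.5
    np.fill_diagonal(r3, np.inf)                          # no self-force
    return np.sum(mass[None, :, None] * d / r3[:, :, None], axis=1)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Second-order symplectic (kick-drift-kick) integration."""
    a = pairwise_accel(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * a          # half kick
        pos += dt * vel              # full drift
        a = pairwise_accel(pos, mass)
        vel += 0.5 * dt * a          # half kick
    return pos, vel
```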

20.
We present empirical machine learning algorithms for measuring the probabilistic photometric redshifts (photo-z) of X-ray quasars based on quantile regression of ensembles of decision trees. Relying on the data of present-day photometric sky surveys (e.g., SDSS, GALEX, WISE, UKIDSS, 2MASS, FIRST), the proposed methods allow one to make high-quality photo-z point predictions for extragalactic objects, to estimate the confidence intervals, and to reconstruct the full probability distribution functions for all predictions. The quality of the photo-z predictions has been tested on samples of X-ray quasars from the RASS and 3XMM DR7 surveys which have spectroscopic redshift measurements in the SDSS DR14Q catalog. The proposed approaches achieve the following accuracy (the metrics are the normalized median absolute deviation σ_NMAD and the percentage of outliers n_>0.15): on the RASS sample, σ_NMAD = 0.043 and n_>0.15 = 12% (SDSS + WISE), 0.037 and 8% (SDSS + WISE + GALEX), and 0.032 and 8.6% (SDSS + WISE + GALEX + UKIDSS); on the 3XMM sample, σ_NMAD = 0.054 and n_>0.15 = 13% (SDSS + WISE), 0.045 and 7.6% (SDSS + WISE + GALEX), and 0.037 and 6.6% (SDSS + WISE + GALEX + UKIDSS). The presented photo-z algorithms will become an important tool for analyzing multi-wavelength data on X-ray quasars in the forthcoming Spectrum-Roentgen-Gamma sky survey.
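To illustrate quantile regression with tree ensembles and the quoted accuracy metric, here is a small sketch using scikit-learn's gradient boosting as an accessible stand-in for the paper's ensembles; the quantile set, hyper-parameters and the exact σ_NMAD convention (with median subtraction) are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def train_photoz_quantiles(X, z_spec, quantiles=(0.16, 0.5, 0.84)):
    """Fit one quantile-loss tree ensemble per requested quantile:
    the 0.5 model gives point predictions, the outer ones a confidence interval."""
    models = {}
    for q in quantiles:
        m = GradientBoostingRegressor(loss="quantile", alpha=q,
                                      n_estimators=300, max_depth=4)
        m.fit(X, z_spec)
        models[q] = m
    return models

def sigma_nmad(z_phot, z_spec):
    """Normalised median absolute deviation (one common convention)."""
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    return 1.48 * np.median(np.abs(dz - np.median(dz)))
```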
