Similar Documents
20 similar documents were retrieved.
1.
When the number of variables exceeds the number of samples, one method of multivariate discrimination is to use principal components analysis to reduce the dimensionality and then to perform canonical variates analysis (PC-CVA). This paper proposes an alternative approach in which discriminant analysis is carried out by a weighted principal component analysis of the group means (DPCA). This method does not require prior data reduction and produces discriminant factors that are orthogonal in the original data space. The theory and performance of the two methods are compared. Although the individual factors of DPCA are found to be less discriminating than those of PC-CVA, the overall discrimination, calculated by multivariate analysis of variance, and the predictive value, estimated by the leaving-one-out error rate, are broadly comparable.
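A minimal sketch of the PC-CVA route mentioned above (PCA for dimension reduction followed by canonical variates analysis, here via linear discriminant analysis), with the leave-one-out error rate as the predictive measure. The random data, the component count and the scikit-learn pipeline are illustrative assumptions, and the weighted DPCA variant is not shown.

```python
# PC-CVA sketch: PCA for dimension reduction, then LDA as the canonical
# variates step, evaluated by leave-one-out cross-validation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 200))          # more variables than samples
y = np.repeat([0, 1, 2], 10)            # three groups

pc_cva = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
# Leave-one-out error rate, as used in the paper to estimate predictive value
acc = cross_val_score(pc_cva, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```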

2.
Compositional data arise naturally in several branches of science, including chemistry, geology, biology, medicine, ecology and manufacturing design. In chemistry, these constrained data seem to occur typically when raw data are normalized or when output is obtained from a constrained estimation procedure, such as might be used in a source apportionment problem. It is important not only for chemists to be aware that the usual multivariate statistical techniques are not applicable to constrained data, but also to have access to appropriate techniques as they become available. The currently available methodology is due principally to Aitchison and is based on log-normal models. This paper suggests new parametric and non-parametric approaches to significantly improve the existing methodology. In the parametric setting, some recent work of Rayens and Srinivasan is extended and a practical regression model is proposed. In the development of the non-parametric approach, minimum distance methods coupled with multivariate bootstrap techniques are used to obtain point and region estimators.

3.
Digital filter smoothing methods for shot-noise-limited data are addressed in this study. The preferred method is based on a Gaussian filter in which the width of the Gaussian filter function is varied depending on an estimate of the second derivative of the raw data. This filter is developed from the standpoint of maximum likelihood parameter estimation of the probability density function that describes shot-noise-limited data. The smoothing filter is tested and compared with the conventional sequential regression filter. This adaptive Gaussian smoothing filter works better than both the sequential regression filter and an adaptive Gaussian filter derived for normally distributed noise. For data containing both high- and low-frequency components, the limiting step in the adaptive filter is the estimation of the smoothing interval. Methods for determining an optimum smoothing interval are discussed. With the optimized smoothing interval, the adaptive Gaussian filter works well for data sets with a wide range of varying frequency components. In particular, synthetic data typical of atomic emission spectra are used to test this smoothing filter.
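The following is a hedged sketch of an adaptive Gaussian smoother in the spirit of the filter described above: the kernel width is narrowed where the estimated second derivative of the data is large, so sharp features are preserved. The curvature-to-width mapping and the synthetic Poisson "emission line" are illustrative assumptions, not the maximum-likelihood derivation of the original paper.

```python
# Adaptive Gaussian smoothing: wide kernels in flat regions, narrow kernels
# where the local curvature (second derivative estimate) is large.
import numpy as np

def adaptive_gaussian_smooth(y, sigma_max=8.0, sigma_min=1.0):
    n = len(y)
    # Rough curvature estimate from a coarsely pre-smoothed copy of the data
    pre = np.convolve(y, np.ones(5) / 5, mode="same")
    curvature = np.abs(np.gradient(np.gradient(pre)))
    c = curvature / (curvature.max() + 1e-12)
    sigmas = sigma_max - (sigma_max - sigma_min) * c   # wide in flat regions
    out = np.empty(n)
    x = np.arange(n)
    for i in range(n):
        w = np.exp(-0.5 * ((x - i) / sigmas[i]) ** 2)  # local Gaussian kernel
        out[i] = np.dot(w, y) / w.sum()
    return out

# Synthetic shot-noise-limited "emission line" on a flat background
x = np.arange(500)
signal = 5 + 100 * np.exp(-0.5 * ((x - 250) / 4) ** 2)
noisy = np.random.default_rng(1).poisson(signal)
smoothed = adaptive_gaussian_smooth(noisy)
```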

4.
5.
Window factor analysis (WFA) is a self-modeling method for extracting the concentration profiles of individual components from evolutionary processes such as flow injection, chromatography, titrations and reaction kinetics. The method takes advantage of the fact that each component lies in a specific region along the evolutionary axis, called the 'window'. Theoretical equations are derived. The method is used to extract the concentration profiles and spectra of seven bismuth species from data obtained by Gemperline and Hamilton, who injected bismuth perchlorate into a flowing stream of hydrochloric acid.

6.
In the present paper, the possible analytical applications of two topological models, the DARC model and the group contribution model, are discussed. Both models are applied to obtain calibration laws that relate UV and IR characteristics to the chemical structure of ethylene oxide condensates. The group contribution model is also applied to determine the contribution of each part of the different compounds involved in a chemical interaction process, once the sensitization parameters of benzodiazepines and anionic surfactants have been established from micellar-enhanced fluorescence.

7.
Pavement snow and icing are worldwide problems, but effective countermeasures are only beginning to be developed in China. The two most common snow and ice removal methods are mechanical clearance and chemical melting; the advantages and disadvantages of each approach are discussed here, including the environmental and structural damage caused by corrosive snow-melting agents. New developments in chemical melting agents and mechanical equipment are discussed, and an overview of alternative thermal melting systems is presented, including the use of geothermal energy and of non-geothermal heating systems utilizing solar energy, electricity, conductive pavement materials, and infrared/microwave heating. Strategic recommendations are made for the continued enhancement of public safety in snow and ice conditions.

8.
The University of Antwerp is spread out over three campuses in the larger Antwerp area. The chemometrics group is situated at the UIA campus in the suburb of Wilrijk. The University of Antwerp has 6000 students and the number of personnel is 2000. For the UIA campus this is 1800 and 600, respectively. The faculties in Antwerp are: mathematics and natural sciences, medicine and pharmacy, law, applied economy and humanities. The location of the UIA

9.
The statistical analysis of compositional data is of fundamental importance to practitioners in general and to chemists in particular. The existing methodology is principally due to Aitchison, who effectively uses two transformations, a ratio followed by the logarithmic, to create a useful, coherent theory that in principle allows the plethora of normal-based multivariate techniques to be used on the transformed data. This paper suggests that the well-known class of Box-Cox transformations can be employed in place of the logarithmic to significantly improve the existing methodology. This is supported in part by showing that one of the most basic problems that Aitchison managed to overcome, namely the specification of an interpretable covariance structure for compositional data, can be resolved, or nearly resolved, once the ratio transformation has been applied. Hence the resolution is not directly dependent on the logarithmic transformation. It is then verified that access to the general Box-Cox family will allow a more accurate use of the normal-based multivariate techniques, simply because better fits to normality can be achieved. Finally, maximum likelihood estimation and some associated asymptotics are employed to construct confidence intervals for ratios of the true, unknown compositional constituents. Heretofore this had not been done even in the context of the logarithmic transformation. Applications to real data are presented.
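A minimal sketch of the core idea, assuming simulated four-part compositions and an arbitrary reference part: apply the ratio transformation first, then fit a Box-Cox exponent to each ratio by maximum likelihood (lambda = 0 corresponds to Aitchison's log-ratio) and check how close to normality the transformed values come.

```python
# Ratio transformation followed by a Box-Cox transformation of each ratio,
# with a crude normality check on the result. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
raw = rng.lognormal(mean=0.0, sigma=0.5, size=(100, 4))
comp = raw / raw.sum(axis=1, keepdims=True)        # compositions sum to 1

ratios = comp[:, 1:] / comp[:, [0]]                # ratio transformation
for j in range(ratios.shape[1]):
    transformed, lam = stats.boxcox(ratios[:, j])  # ML estimate of lambda
    p = stats.shapiro(transformed).pvalue          # crude normality check
    print(f"ratio {j + 1}: lambda = {lam:.2f}, Shapiro-Wilk p = {p:.3f}")
```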

10.
The Free-Wilson paradigm is an established and powerful tool for quantitatively relating activity with chemical structure. Current implementations of the paradigm, however, are flawed both conceptually and in execution. As part of an attempt to more fully realize the promise of the paradigm, it was necessary to examine these limitations in detail. This report introduces a robust, theory-founded Free-Wilson implementation: stepwise principal components regression analysis (SPCRA). SPCRA is computationally superior to previous implementations but does not in itself correct their conceptual flaws. The development of SPCRA did, however, facilitate derivation of a simple and chemically significant interpretation of the Free-Wilson structure-activity model. A number of statistical aspects of this model commonly misused in previous applications are discussed at length. These discussions provide critical background for the development of an alternative implementation of the Free-Wilson paradigm.

11.
In contrast with conventional PCA, a direct superposition and joint interpretation of loading plots is not possible in three-way PCA, since there may be data variance which is described by unequal components of different modes. The contributions to variance of all possible combinations of components are described in the core matrix. Body diagonalization, which is achieved by appropriate rotation of component matrices, is an essential tool for simplifying the core matrix structure. The maximum degree of body diagonality which may be obtained from such transformations is analysed from both the mathematical and simulation viewpoints. It is shown that, at least in the average case, high degrees can be expected, which makes the procedure reasonable for many practical applications. Furthermore, simulation as well as theoretical derivation show that the success of body diagonalization depends on the so-called polarity of the core array. The methodology is illustrated by a three-way data example from environmental chemistry.

12.
Orthogonal rotations, e.g. the varimax rotation, are common practice in factor analysis. However, the term varimax rotation does not refer to a unique procedure, since several different types of rotation are possible. In this paper six different types of rotation are examined (raw varimax of loadings, normal varimax of loadings, raw and normal varimax of scores, eigenvalue-weighted varimax of loadings and Arthur varimax) from both a theoretical and practical point of view. It can be concluded that an adequate application of these methods can often simplify the interpretation of the calculated factors.
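To make the distinction concrete, here is a short NumPy sketch of two of the six variants discussed above: raw varimax of loadings versus normal (Kaiser-normalized) varimax of loadings. The loadings come from a PCA of random data, so the numbers are purely illustrative.

```python
import numpy as np

def varimax(loadings, normalize=True, tol=1e-8, max_iter=500):
    """Rotate a loading matrix by the varimax criterion.

    normalize=False gives the 'raw' varimax of the loadings,
    normalize=True the 'normal' (Kaiser-normalized) varimax."""
    L = loadings.copy()
    if normalize:
        h = np.sqrt((L ** 2).sum(axis=1))      # row norms (communalities)
        L /= h[:, None]
    p, k = L.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        grad = L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p)
        U, s, Vt = np.linalg.svd(grad)
        R = U @ Vt
        if s.sum() < d_old * (1 + tol):
            break
        d_old = s.sum()
    L = L @ R
    if normalize:
        L *= h[:, None]
    return L

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
X -= X.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
loadings = Vt[:3].T * s[:3] / np.sqrt(len(X) - 1)   # first three PC loadings

raw_rotated = varimax(loadings, normalize=False)
normal_rotated = varimax(loadings, normalize=True)
```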

13.
The total content of nine trace elements (Mn, Fe, Co, Ni, Cu, Zn, Cd, Hg, Pb) in the soft part of mussels (Mytilus galloprovincialis Lamarck) sampled at two sites was considered. Wild, polluted molluscs were sampled in Muggia Bay (Gulf of Trieste, Northern Adriatic Sea) in the proximity of an important sewer of the city. Edible, unpolluted mussels were simultaneously sampled in a hatchery just off the Bay. Principal component analysis has been applied to correlation matrices obtained from data matrices from the literature. The nine variables were reduced to three or two principal components, which explained 70-80% of the total variance. The unrotated and orthogonally rotated matrices of the correlations of variables with principal components showed that the clusters of elements are positively associated with the first two eigenvectors. The origin of some toxic elements in the soft part of mussels from Muggia Bay is discussed. The projection of all data onto the first two eigenvectors as component scores allows a nearly complete separation of polluted from unpolluted molluscs.

14.
Landscape indices are popular for the quantification of landscape pattern, but all landscape indices used so far are scalar quantities, which measure patterns without sufficiently considering pattern size and directionality together. Based on planar characteristics defined in mechanics, such as the centroid, moment of inertia, product of inertia and principal axes, a vector analysis theory of landscape pattern (VATLP) is explored here. First, a coordinate system of centroidal principal axes (CSCPA) of a patch or patches is established. Several related new indices are then derived, including those describing the direction of the pattern distribution (patch orientation (PO) and vectorial patch orientation (VPO)) and those indicating the shape of a patch's equivalent ellipse (major axis (MJA), minor axis (MIA) and eccentric rate (ER)). These landscape metrics are then applied to pattern analysis of the Sanjiang plain marsh, the study area. Two temporal vector-based data sets of the study area come from the interpretation of MSS (1980) and TM (2000) remote sensing images. The application of the theory captures some shape properties of the riparian wetland in the Sanjiang plain marsh. The dissymmetrical features of the Sanjiang plain marsh around its principal axes, due to agricultural development, could also be explained.
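As a rough illustration of the planar quantities named above, the NumPy sketch below computes the centroid, principal-axis orientation and an equivalent ellipse for a single raster patch via the coordinate covariance matrix. The definitions used here for orientation, major/minor axis and eccentricity are illustrative assumptions and are not necessarily identical to the PO, VPO, MJA, MIA and ER indices defined in the paper.

```python
# Centroid, principal axes and equivalent ellipse of a binary raster patch.
import numpy as np

patch = np.zeros((60, 60), dtype=bool)
patch[20:40, 10:55] = True                      # a simple elongated patch

ys, xs = np.nonzero(patch)
centroid = np.array([xs.mean(), ys.mean()])     # centroid of the patch cells
cov = np.cov(np.vstack([xs, ys]))               # second moments about centroid
evals, evecs = np.linalg.eigh(cov)              # ascending eigenvalues

# Equivalent ellipse: a uniform ellipse with semi-axis a has variance a^2 / 4
minor, major = 2.0 * np.sqrt(evals)
orientation = np.degrees(np.arctan2(evecs[1, 1], evecs[0, 1]))  # major axis
eccentricity = np.sqrt(1.0 - (minor / major) ** 2)

print(f"centroid {centroid}, orientation {orientation:.1f} deg, "
      f"major {major:.1f}, minor {minor:.1f}, eccentricity {eccentricity:.2f}")
```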

15.
Carey et al. utilized principal components analysis (PCA) to analyze frequency shift data obtained from piezoelectric sensors formed by coating quartz crystals with 27 different GC stationary phases and tested using 14 analytes. The objective of the analysis was to determine an optimal reduced set of coatings for detection of the analytes. The results were correlated with those obtained from cluster analysis. In this paper the data are re-analyzed using correspondence analysis (CA). The advantages of using CA include a symmetric treatment of sensor coatings and analytes and better identification of the representation of the analytes in terms of the detection components. The results previously obtained by the conjunctive use of PCA, a varimax rotation and cluster analysis were reproduced by CA alone.
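For reference, a compact NumPy sketch of correspondence analysis on a non-negative table of the same shape as the study above (27 coatings by 14 analytes); the counts here are random placeholders, not the original sensor data. Row (coating) and column (analyte) points end up in the same factor space, which is the symmetric treatment the abstract refers to.

```python
# Correspondence analysis via SVD of the standardized residuals.
import numpy as np

rng = np.random.default_rng(0)
N = rng.integers(1, 50, size=(27, 14)).astype(float)   # 27 coatings x 14 analytes

P = N / N.sum()                       # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)   # row / column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))     # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * sv) / np.sqrt(r)[:, None]     # principal coordinates (rows)
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]  # principal coordinates (cols)
inertia = sv**2 / (sv**2).sum()                 # share of inertia per axis
print("first two axes explain", inertia[:2].sum().round(3), "of total inertia")
```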

16.
The automatic procedures for optimizing the composition of a binary mobile phase in reversed phase liquid chromatography have been intensively studied over the past ten years. The performance of these procedures, based either on the black box approach or on other methods such as experimental design, is very often limited by the large number of time-consuming experimental runs that are necessary for the determination of the analysis conditions to be optimized. The proposed method reduces this number of experiments: two experiments, run under linear variation of the composition of the binary mobile phase (the linear gradient elution mode), make it possible to determine the mobile phase composition corresponding to the maximum resolution between peaks, the final analysis being assumed to be carried out under isocratic conditions. The method requires two steps: the determination of the retention characteristics of each solute, which depend upon the composition of the mobile phase; and the selection of the optimum composition of the isocratic mobile phase, using a criterion such as the maximum resolution, normalized by the square root of the plate number, for the least separated pair of adjacent peaks. The interest, performance and limits of use of such an optimization procedure are discussed by means of the chromatographic analysis of different complex mixtures.

17.
Precipitation simulation with a weather generator for China
A stochastic model for daily precipitation simulation in China was developed based on the framework of a 'Richardson-type' weather generator, an important tool in studying the impacts of weather/climate on a variety of systems, including ecosystems and risk assessment. The purpose of this work is to develop a weather generator for applications in China. The focus is on precipitation simulation, since the determination of other weather variables such as temperature depends on the precipitation simulation. A framework of a first-order Markov chain with a Gamma distribution for daily precipitation is adopted in this work. Based on this framework, four precipitation parameters for each month at 672 stations all over China were determined using daily precipitation data from 1961 to 2000. Compared with previous work, our estimation of the parameters was made for more stations and longer observation records, which makes the weather generator more applicable and reliable. Spatial distributions of the four parameters are analyzed in a regional climate context. The seasonal variations of these parameters at five stations representing regional differences are discussed. Based on the estimated monthly parameters at the 672 stations, daily precipitation for any period can be simulated. A 30-year simulation was made and compared with observations during 1971-2000 in terms of annual and monthly statistics. The results are satisfactory, which demonstrates the usefulness of the weather generator.
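A minimal sketch of the occurrence/amount structure described above: wet or dry days follow a first-order Markov chain and wet-day amounts are drawn from a Gamma distribution. The four parameter values used here (p01, p11, shape, scale) are placeholders, not estimates from the 672 Chinese stations.

```python
# 'Richardson-type' daily precipitation: first-order Markov chain occurrence
# plus Gamma-distributed wet-day amounts.
import numpy as np

def simulate_precip(n_days, p01, p11, shape, scale, seed=0):
    """p01: P(wet | previous day dry); p11: P(wet | previous day wet)."""
    rng = np.random.default_rng(seed)
    precip = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p_wet = p11 if wet else p01
        wet = rng.random() < p_wet
        if wet:
            precip[t] = rng.gamma(shape, scale)   # wet-day amount (mm)
    return precip

# Example: one month with moderately persistent rainfall
series = simulate_precip(30, p01=0.25, p11=0.55, shape=0.8, scale=8.0)
print(f"wet days: {(series > 0).sum()}, monthly total: {series.sum():.1f} mm")
```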

18.
Slope is one of the crucial terrain variables in spatial analysis and land use planning, especially in the Loess Plateau area of China, which suffers from serious soil erosion. DEM-based slope extraction has been widely accepted and applied in practice; however, the accuracy of the slope derived in this way usually does not match its popularity. A quantitative assessment of slope data uncertainty is therefore important both theoretically and for applications. This paper focuses on how resolution and terrain complexity affect the accuracy of mean slope extracted from DEMs of different resolutions in the Loess Plateau of China. Six typical geomorphologic areas are selected as test areas, representing different terrain types from smooth to rough. Their DEMs are produced by digitizing the contours of 1:10,000 scale topographic maps. Field survey results show that 5 m should be the most suitable grid size for representing slope in the Loess Plateau area. A comparative and mathematical-simulation methodology was employed for data processing and analysis. A linear correlation between mean slope and DEM resolution was found in all test areas, but the regression coefficients are closely related to the terrain complexity of the test areas. Taking stream channel density to represent terrain complexity, the mean slope error could be regressed against DEM resolution (X) and stream channel density (S) at 8 resolution levels and expressed as (0.0015S² + 0.031S - 0.0325)X - 0.0045S² - 0.155S + 0.1625, with an R² value of over 0.98. Practical tests also show that this model is effective in applications. The methodology developed in this study should be helpful to similar research on spatial data uncertainty.
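For convenience, the regression above can be evaluated directly. Note that the '+' signs in the formula are reconstructed from a garbled source line, so the exact form should be checked against the original paper, and the value of S used below is only a placeholder.

```python
# Mean-slope error as a function of DEM resolution X (m) and stream channel
# density S, as reproduced above; the sign reconstruction is an assumption.
def mean_slope_error(X, S):
    return (0.0015 * S**2 + 0.031 * S - 0.0325) * X - 0.0045 * S**2 - 0.155 * S + 0.1625

# Illustrative evaluation over a few resolutions at a fixed channel density
for X in (5, 25, 50, 100):
    print(f"resolution {X:3d} m -> predicted mean slope error "
          f"{mean_slope_error(X, S=2.0):.2f}")
```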

19.
A non-parametric method for supervised pattern recognition is presented. The method is of the class modelling type, meaning that a classification rule is developed for each class, using the dissimilarities between the objects of the class. The dissimilarities between the objects within a class are related to the distances between all pairs of training objects. As distance metric, a measure is proposed that takes the correlation between the interval-scale variables into account and that, moreover, can be used for mixed types of variables. The classification rule is based on the construction of a boundary in the measurement space. For the determination of the class boundary, several strategies are proposed and compared. The performance of the technique is evaluated on the basis of several data sets. Comparison with the class modelling technique UNEQ shows its usefulness for practical applications.
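A hedged sketch of the general class-modelling idea: each class gets its own model, and an object is accepted by a class if its distance to that class lies within a boundary fitted on the training objects of the class. The Mahalanobis distance and the percentile boundary used here are simplifying assumptions; the paper's correlation-aware metric for mixed variable types and its boundary-construction strategies are more elaborate.

```python
# One-class "class model": accept an object if its distance to the class
# falls inside a boundary estimated from the training objects.
import numpy as np

class DistanceClassModel:
    def __init__(self, quantile=0.95):
        self.quantile = quantile

    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        self.inv_cov_ = np.linalg.pinv(np.cov(X, rowvar=False))
        d = self._distance(X)
        self.threshold_ = np.quantile(d, self.quantile)   # class boundary
        return self

    def _distance(self, X):
        diff = X - self.mean_
        return np.sqrt(np.einsum("ij,jk,ik->i", diff, self.inv_cov_, diff))

    def accepts(self, X):
        return self._distance(X) <= self.threshold_

rng = np.random.default_rng(0)
class_a = rng.normal(0.0, 1.0, size=(50, 3))
model_a = DistanceClassModel().fit(class_a)
print(model_a.accepts(rng.normal(0.0, 1.0, size=(5, 3))))   # mostly accepted
print(model_a.accepts(rng.normal(4.0, 1.0, size=(5, 3))))   # mostly rejected
```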

20.
Recent advances in precipitation-bias correction and application
Significant progress has been made in recent years in precipitation data analyses at regional to global scales. This paper reviews and synthesizes recent advances in precipitation-bias corrections and their applications in many countries and over the cold regions. The main objective of this review is to identify and examine gaps in regional and national precipitation-error analyses. This paper also discusses and recommends future research needs and directions. More effort and coordination are necessary in the determination of precipitation biases over large regions across national borders. It is important to emphasize that bias corrections of precipitation measurements affect both water budget and energy balance calculations, particularly over the cold regions.
