Multivariate Analysis The analysis of many variables
Multivariate Analysis: The analysis of many variables. More precisely, and also more traditionally, this term stands for the study of a random sample of $n$ objects (units or cases) such that on each object we measure $p$ variables or characteristics, so that each object yields a vector with $p$ components:

$x = (x_1, x_2, \dots, x_p)'$
The variables will be correlated, since they are measured on the same object. A common practice is to treat each variable separately, applying methods of univariate analysis; because this ignores the correlations, it may lead to incorrect and inadequate analysis.
The challenge of multivariate analysis is to untangle the overlapping information provided by a set of correlated variables and to reveal the underlying structure.
This is done by a variety of methods, some of which are
• generalizations of univariate methods, and some of which are
• genuinely multivariate, with no univariate counterparts.
The purpose of this course is to
• describe and perhaps justify these methods, and also to
• provide some guidance about how to select an appropriate method for a given multivariate data set.
Example: Randomly select n = 5 students as objects and for each student measure:
x1 = age (in years) at entry to university,
x2 = mark out of 100 in an exam at the end of the first year,
x3 = sex (0 = female, 1 = male).
The result may look something like this: It is of interest to note that the variables in the example are not all of the same type:
• x1 is a continuous variable,
• x2 is a discrete variable, and
• x3 is a binary variable.
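The example above can be sketched in code. The numbers below are hypothetical (the lecture's actual table is not reproduced here); the point is only how such a data set is stored as an n × p array:

```python
import numpy as np

# Hypothetical data for n = 5 students and p = 3 variables
# (illustrative values only, not the lecture's table).
# Columns: x1 = age at entry, x2 = exam mark out of 100, x3 = sex (0/1)
X = np.array([
    [18.9, 64.0, 0.0],
    [19.2, 71.0, 1.0],
    [18.5, 58.0, 0.0],
    [21.0, 80.0, 1.0],
    [18.7, 69.0, 0.0],
])

n, p = X.shape   # n objects as rows, p variables as columns
print(n, p)      # 5 3
```

Each row of `X` is the observation vector for one student, matching the vector notation above.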
The Data Matrix
The $n$ observations on $p$ variables can be arranged in an $n \times p$ data matrix $X = (x_{ij})$, whose rows correspond to objects and whose columns correspond to variables.
We can write

$X' = (x_1, x_2, \dots, x_n)$, where $x_i'$ = the $i$th row of $X$.

We can also write

$X = (x_{(1)}, x_{(2)}, \dots, x_{(p)})$, where $x_{(j)}$ = the $j$th column of $X$.

In this notation $x_1$ is the $p$-vector denoting the $p$ observations on the first object, while $x_{(1)}$ is the $n$-vector denoting the $n$ observations on the first variable. The rows form a random sample while the columns do not (this is emphasized in the notation by the use of parentheses around column subscripts).
Sometimes the objective of multivariate analysis is to find some feature of the variables (i.e. of the columns of the data matrix); at other times it is to find some feature of the individuals (i.e. of the rows). The feature we often look for is a grouping, either of the individuals or of the variables. We will give a classification of multivariate methods later.
Even when $n$ and $p$ are only moderately large, the amount of information (the $np$ elements of the data matrix) can be overwhelming, and it is necessary to find ways of summarizing the data. Later on we will discuss ways of representing the data graphically.
Definitions:
• The sample mean of the $i$th variable: $\bar{x}_i = \frac{1}{n}\sum_{r=1}^{n} x_{ri}$
• The sample variance of the $i$th variable: $s_{ii} = \frac{1}{n}\sum_{r=1}^{n} (x_{ri} - \bar{x}_i)^2$
• The sample covariance between the $i$th variable and the $j$th variable: $s_{ij} = \frac{1}{n}\sum_{r=1}^{n} (x_{ri} - \bar{x}_i)(x_{rj} - \bar{x}_j)$
Putting the definitions together we are led to the following definitions:
Defn: The sample mean vector $\bar{x} = (\bar{x}_1, \bar{x}_2, \dots, \bar{x}_p)'$
Defn: The sample covariance matrix $S = (s_{ij})$, the $p \times p$ matrix whose $(i,j)$ entry is $s_{ij}$
Expressing the sample mean vector and the sample covariance matrix in terms of the data matrix:

$\bar{x} = \frac{1}{n} X' \mathbf{1}$,

where $\mathbf{1}$ is the $n$-vector whose components are all equal to 1. Now introduce the centering matrix

$H = I_n - \frac{1}{n} \mathbf{1}\mathbf{1}'$.

It is easy to check that $H$ is symmetric and idempotent: $H' = H$ and $H^2 = H$. The final step is to realize that $HX$ is the centered data matrix,

$HX = X - \mathbf{1}\bar{x}'$,

i.e. each column of $X$ has its sample mean subtracted. Hence

$S = \frac{1}{n}(HX)'(HX) = \frac{1}{n} X' H X$.

(Some textbooks define the sample covariance matrix with divisor $n-1$ instead of $n$; the two versions differ only by the factor $n/(n-1)$.)
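A numerical check of these matrix formulas, $\bar{x} = \frac{1}{n}X'\mathbf{1}$ and $S = \frac{1}{n}X'HX$ with $H = I - \frac{1}{n}\mathbf{1}\mathbf{1}'$, is a short sketch (hypothetical data, divisor $n$ throughout):

```python
import numpy as np

# Small made-up data matrix: n = 4 objects, p = 2 variables.
X = np.array([[2.0, 4.0],
              [3.0, 7.0],
              [5.0, 5.0],
              [6.0, 8.0]])
n = X.shape[0]

one = np.ones((n, 1))
H = np.eye(n) - one @ one.T / n      # centering matrix H = I - (1/n) 1 1'

xbar = (X.T @ one / n).ravel()       # sample mean vector  x̄ = (1/n) X'1
S = X.T @ H @ X / n                  # covariance matrix   S = (1/n) X'H X

# H is idempotent, and the formulas agree with direct computation:
assert np.allclose(H @ H, H)
assert np.allclose(xbar, X.mean(axis=0))
assert np.allclose(S, np.cov(X, rowvar=False, bias=True))  # bias=True -> divisor n
```

Note `np.cov(..., bias=True)` is used so that numpy also divides by $n$ rather than $n-1$, matching the convention above.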
Data are frequently scaled as well as centered. The scaling is done by introducing:
Defn: the sample correlation coefficient between the $i$th and the $j$th variables, $r_{ij} = \dfrac{s_{ij}}{\sqrt{s_{ii}\, s_{jj}}}$
Defn: the sample correlation matrix $R = (r_{ij})$
Obviously $r_{ii} = 1$, and by the Cauchy–Schwarz inequality $|r_{ij}| \le 1$. If $R = I$ then we say the variables are uncorrelated.
Note: if we denote $D = \mathrm{diag}(s_{11}, s_{22}, \dots, s_{pp})$, then it can be checked that $R = D^{-1/2} S D^{-1/2}$.
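The rescaling $R = D^{-1/2} S D^{-1/2}$ can be verified numerically; this sketch (hypothetical data, divisor-$n$ convention) also confirms $r_{ii} = 1$ and $|r_{ij}| \le 1$:

```python
import numpy as np

# Made-up data: n = 4 objects, p = 3 variables.
X = np.array([[1.0, 2.0, 4.0],
              [2.0, 1.0, 3.0],
              [4.0, 5.0, 9.0],
              [5.0, 4.0, 8.0]])
n = X.shape[0]

Xc = X - X.mean(axis=0)              # centered data
S = Xc.T @ Xc / n                    # covariance matrix, divisor n

D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(S)))   # D^{-1/2}
R = D_inv_sqrt @ S @ D_inv_sqrt                   # R = D^{-1/2} S D^{-1/2}

assert np.allclose(np.diag(R), 1.0)               # r_ii = 1
assert np.abs(R).max() <= 1.0 + 1e-12             # |r_ij| <= 1 (Cauchy–Schwarz)
assert np.allclose(R, np.corrcoef(X, rowvar=False))
```

The last assertion works with either divisor convention, since the scaling factor cancels in $r_{ij}$.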
The sample variance-covariance matrix S is an obvious generalization of the univariate concept of variance, which measures scatter about the mean. Sometimes it is convenient to have a single number to measure the overall multivariate scatter.
There are two common measures of this type:
Defn: The generalized sample variance $= |S|$, the determinant of $S$
Defn: The total sample variance $= \mathrm{tr}(S) = s_{11} + s_{22} + \dots + s_{pp}$
In both cases, large values indicate a high degree of scatter about the centroid $\bar{x}$, while low values indicate concentration about it. Using the eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_p$ of the matrix $S$, it can be shown that

$|S| = \lambda_1 \lambda_2 \cdots \lambda_p$ and $\mathrm{tr}(S) = \lambda_1 + \lambda_2 + \dots + \lambda_p$.
If $\lambda_p = 0$ then $|S| = 0$. This says that there is a linear dependence amongst the variables: some linear combination of them has zero sample variance, i.e. is constant over the sample. Normally, $S$ is positive definite and all the eigenvalues are positive.
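The eigenvalue identities for the two scatter measures can be checked directly (a sketch with a made-up covariance matrix):

```python
import numpy as np

# A sample covariance matrix (hypothetical, symmetric positive definite).
S = np.array([[2.5, 1.5],
              [1.5, 2.5]])

eigvals = np.linalg.eigvalsh(S)      # eigenvalues of the symmetric matrix S

gen_var = np.linalg.det(S)           # generalized sample variance |S|
tot_var = np.trace(S)                # total sample variance tr(S)

assert np.isclose(gen_var, eigvals.prod())   # |S| = product of eigenvalues
assert np.isclose(tot_var, eigvals.sum())    # tr(S) = sum of eigenvalues
assert eigvals.min() > 0                     # S positive definite here
```

For this matrix the eigenvalues are 1 and 4, so $|S| = 4$ and $\mathrm{tr}(S) = 5$.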
Taking linear combinations of variables is one of the most important tools of multivariate analysis, for basically two reasons:
• A few appropriately chosen combinations may provide most of the information contained in a much larger number of original variables (this is called dimension reduction).
• Linear combinations can simplify the structure of the variance-covariance matrix, which can help in the interpretation of the data.
For a given vector of constants $a = (a_1, a_2, \dots, a_p)'$, we consider the linear combination

$y_i = a'x_i = a_1 x_{i1} + a_2 x_{i2} + \dots + a_p x_{ip}$, for $i = 1, 2, \dots, n$.

Then the sample mean and sample variance of $y_1, \dots, y_n$ are

$\bar{y} = a'\bar{x}$ and $s_y^2 = a'Sa$.
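The identities $\bar{y} = a'\bar{x}$ and $s_y^2 = a'Sa$ for a linear combination can be verified numerically (a sketch with hypothetical data and an arbitrary $a$, divisor-$n$ convention):

```python
import numpy as np

# Made-up data: n = 4 objects, p = 2 variables.
X = np.array([[2.0, 4.0],
              [3.0, 7.0],
              [5.0, 5.0],
              [6.0, 8.0]])
n = X.shape[0]
a = np.array([1.0, -0.5])            # an arbitrary vector of constants

y = X @ a                            # y_i = a'x_i, i = 1, ..., n
xbar = X.mean(axis=0)                # sample mean vector
S = (X - xbar).T @ (X - xbar) / n    # sample covariance matrix, divisor n

assert np.isclose(y.mean(), a @ xbar)                     # ȳ = a'x̄
assert np.isclose(((y - y.mean()) ** 2).mean(), a @ S @ a)  # s_y² = a'Sa
```

This is the computation that underlies dimension-reduction methods: the scatter of any derived variable $a'x$ is read off directly from $S$.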