
Analysis of RT distributions with R


Presentation Transcript


  1. Analysis of RT distributions with R
  Emil Ratko-Dehnert
  WS 2010/2011
  Session 02 – 16.11.2010

  2. Last time ...
  • Organisational information -> see webpage
  • Why response times? -> ratio-scaled, mathematical treatment
  • Why use R? -> standard, free, powerful, extensible
  • Sources of randomness in the brain -> neurons, bottom-up and top-down factors, measuring procedure
  • Mathematical modelling of phenomena in the world

  3. I Introduction to Probability theory

  4. I Probability space
  [Diagram: the sample space Ω, the subsets of interest A, and the probability measure P mapping events onto the interval [0, 1]]

  5. I Probability space (Ω, A, P)
  • Sample space Ω: the set of all possible outcomes, e.g. Ω = {1; 2; 3}
  • Set of events A: a collection of subsets of Ω (a σ-algebra), here { }, {1}, {2}, {3}, {1; 2}, {1; 3}, {2; 3}, {1; 2; 3}
  • Probability measure P: assigns each event a value in [0, 1], governed by the Kolmogorov axioms

  6. I Probability measure P
  • Is governed by the "Kolmogorov axioms":
  • P(A) ≥ 0 for every event A (non-negativity)
  • P({ }) = 0 and P(Ω) = 1 (normalization)
  • P(∪i Ai) = Σi P(Ai) for pairwise disjoint Ai (σ-additivity)

  7. I Example: Rolling a die
  • Ω = {1, 2, 3, 4, 5, 6}
  • A = Powerset(Ω) = { { }, {1}, {2}, ..., {6}, {1, 2}, {1, 3}, ..., {5, 6}, {1, 2, 3}, ..., {1, 2, 3, 4, 5, 6} }
  • P({ω}) = 1/6 for all ω ∈ Ω
  • E = { "even pips" } = {2, 4, 6}
  • P(E) = 3/6 = 1/2
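  A minimal R sketch of this example (seed and number of rolls are arbitrary choices): simulate fair die rolls and estimate P(even pips) by the relative frequency.
      set.seed(1)                                        # arbitrary seed, for reproducibility
      rolls <- sample(1:6, size = 10000, replace = TRUE) # 10,000 fair die rolls
      mean(rolls %in% c(2, 4, 6))                        # relative frequency of even pips, close to 1/2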

  8. I Example: RT distribution
  [Figure: an ex-Gaussian distribution, shown as an example of an RT distribution]
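  The ex-Gaussian is the sum of a normal and an exponential random variable; a minimal R sketch with illustrative parameters (mean 400 ms, sd 50 ms, exponential mean 100 ms):
      set.seed(2)
      n  <- 10000
      rt <- rnorm(n, mean = 400, sd = 50) + rexp(n, rate = 1/100)  # normal + exponential component
      hist(rt, breaks = 60, freq = FALSE, main = "Simulated ex-Gaussian RTs", xlab = "RT (ms)")
      lines(density(rt), lwd = 2)                                  # smoothed density estimate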

  9. I Modelling behavioural experiments
  "Response times in a pop-out experiment?"
  • What is the probability space (Ω, A, P)?
  • Ω_RT = ("all times between 0 and +∞ ms")
  • A = B(ℝ), the Borel σ-algebra generated by the intervals [x, y), x, y ∈ ℝ
  • P([x, y)) = ? -> this will be addressed in II

  10. I Important Laws in Probability theory
  • Law of large numbers
  • Central limit theorem

  11. I Law of large numbers
  • "The sample average X̄n (of n iid copies of a random variable X) converges towards the theoretical expectation μ of X"
  • Example:
  • The expected value of rolling a die is 3.5
  • The sum of 1000 die rolls should therefore be close to 3500, so their average is about 3500 / 1000 = 3.5
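  A minimal R sketch of the die example (seed and number of rolls are arbitrary): the running sample average approaches μ = 3.5.
      set.seed(3)
      rolls    <- sample(1:6, size = 1000, replace = TRUE)  # 1000 fair die rolls
      run.mean <- cumsum(rolls) / seq_along(rolls)          # sample average after 1, 2, ..., 1000 rolls
      plot(run.mean, type = "l", xlab = "n", ylab = "sample average")
      abline(h = 3.5, lty = 2)                              # theoretical expectation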

  12. I Importance of Law of large numbers
  • It justifies aggregation of data to its mean
  • (will be important again in III)

  13. I Central limit theorem
  • The average of many iid random variables with finite variance tends towards a normal distribution N as n -> ∞, irrespective of the distribution followed by the original random variables.

  14. Binomial distributions B(n, p)
  • e.g. tossing a coin n times with prob(head) = p
  • increasing n -> Normal distribution
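  A minimal R sketch (the sample sizes 5, 20, 100 are illustrative): histograms of B(n, 0.5) with the matching normal density overlaid, which fit better and better as n grows.
      set.seed(4)
      p <- 0.5
      par(mfrow = c(1, 3))                                 # three panels side by side
      for (n in c(5, 20, 100)) {
        heads <- rbinom(10000, size = n, prob = p)         # heads in n tosses, repeated 10,000 times
        hist(heads, freq = FALSE, main = paste0("B(", n, ", 0.5)"), xlab = "number of heads")
        curve(dnorm(x, mean = n * p, sd = sqrt(n * p * (1 - p))), add = TRUE, lwd = 2)
      }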

  15. I Importance of Central limit theorem
  • Why is this important?
  • It argues that the sum of many random processes (whatever distribution they may follow) behaves like a normal random process
  • i.e. if you have a system where many random processes interact, you can just treat the overall effect like normal error/noise(!)

  16. Excursion: Matrix Calculus

  17. Excursion: Matrix Calculus
  • Def: A matrix A = (ai,j) is an array of numbers
  • It has m rows and n columns (dimension m-by-n)
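  A minimal R sketch (the entries 1:6 are arbitrary): building a 2-by-3 matrix and querying its dimension.
      A <- matrix(1:6, nrow = 2, ncol = 3)   # 2-by-3 matrix, filled column by column
      A
      dim(A)                                 # 2 3, i.e. m rows and n columns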

  18. Matrix operations (I)
  • Addition of two 2-by-2 matrices A, B is performed component-wise
  • Note that "+" is commutative, i.e. A + B = B + A
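  In R (entries chosen arbitrarily), addition and its commutativity look like this:
      A <- matrix(c(1, 2, 3, 4), nrow = 2)   # two 2-by-2 matrices
      B <- matrix(c(5, 6, 7, 8), nrow = 2)
      A + B                                  # component-wise addition
      identical(A + B, B + A)                # TRUE: "+" is commutative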

  19. Matrix operations (II)
  • Scalar multiplication of a 2-by-2 matrix A with a scalar c (each entry is multiplied by c)
  • Again commutative, i.e. c*A = A*c
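  The same operation in R, with the scalar named k here (the slide calls it c):
      A <- matrix(c(1, 2, 3, 4), nrow = 2)
      k <- 3                                 # the scalar
      k * A                                  # every entry is multiplied by the scalar
      identical(k * A, A * k)                # TRUE: scalar multiplication is commutative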

  20. Matrix operations (III)
  • Transposition of a 2-by-3 matrix A -> A^T
  • It holds that (A^T)^T = A
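  A minimal R sketch using an arbitrary 2-by-3 matrix:
      A <- matrix(1:6, nrow = 2, ncol = 3)   # 2-by-3
      t(A)                                   # its 3-by-2 transpose
      identical(t(t(A)), A)                  # TRUE: transposing twice gives A back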

  21. Matrix operations (IV)
  • Matrix multiplication of matrices C (2-by-3) and D (3-by-2) gives E = C*D (2-by-2)
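  A minimal R sketch (entries are arbitrary); %*% is R's matrix product:
      C <- matrix(1:6, nrow = 2, ncol = 3)   # 2-by-3
      D <- matrix(1:6, nrow = 3, ncol = 2)   # 3-by-2
      E <- C %*% D                           # matrix product, a 2-by-2 matrix
      E
      dim(E)                                 # 2 2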

  22. Matrix operations (V)
  • !Warning! One can only multiply matrices if their dimensions correspond, i.e. (m-by-n) x (n-by-k) -> (m-by-k)
  • And generally: if A*B exists, B*A need not exist
  • Furthermore: if both A*B and B*A exist, they need not be equal!
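  A minimal R sketch (matrices chosen arbitrarily) illustrating both warnings:
      A <- matrix(c(1, 2, 3, 4), nrow = 2)             # 2-by-2
      B <- matrix(c(0, 1, 1, 0), nrow = 2)             # 2-by-2
      A %*% B                                          # exists ...
      B %*% A                                          # ... and so does this, but the two results differ
      try(matrix(1:6, 2, 3) %*% matrix(1:6, 2, 3))     # error: (2-by-3) x (2-by-3) is non-conformable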

  23. Geometric interpretation
  • Matrices can be interpreted as linear transformations in a vector space
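  As a minimal R sketch, a 2-by-2 rotation matrix acting on a vector in the plane (the angle is an arbitrary choice):
      theta <- pi / 2                                   # rotate by 90 degrees
      R <- matrix(c(cos(theta), sin(theta),
                    -sin(theta), cos(theta)), nrow = 2) # rotation matrix, filled column by column
      v <- c(1, 0)                                      # a vector in the plane
      R %*% v                                           # approximately (0, 1): v rotated by 90 degrees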

  24. Significance of matrices
  • Matrix calculus is relevant for
  • Algebra: solving linear equations (Ax = b), see the R sketch below
  • Statistics: linear least squares (LLS), covariance matrices of random variables
  • Calculus: differentiation of multidimensional functions
  • Physics: mechanics, linear combinations of quantum states
  • and many more ...
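  For the algebra case, a minimal R sketch with an arbitrary system A x = b, solved by solve():
      A <- matrix(c(2, 1, 1, 3), nrow = 2)   # coefficient matrix
      b <- c(5, 10)                          # right-hand side
      x <- solve(A, b)                       # solves A x = b
      x
      A %*% x                                # reproduces b (up to rounding)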

  25. And now to
