
ANOVA & sib analysis

This presentation covers the basics of ANOVA (revision), its application to sib analysis, and the intraclass correlation coefficient. Analysis of variance (ANOVA) is a way of comparing the ratio of systematic variance to unsystematic variance in a study.



Presentation Transcript


  1. ANOVA & sib analysis

  2. ANOVA & sib analysis • basics of ANOVA - revision • application to sib analysis • intraclass correlation coefficient

  3. analysis of variance (ANOVA) is a way of comparing the ratio of systematic variance to unsystematic variance in a study
     • ANOVA as regression
     • research question: does exposure to the content of Falconer & Mackay (1996) increase knowledge of quantitative genetics?
     • outcome_ij = model + error_ij
     [figure: knowledge score plotted per person, by condition]

  9. Dummy coding: outcome_ij = model + error_ij
     knowledge_ij = b0 + b1*dummy1_j + b2*dummy2_j + ε_ij
     i = 1, …, N; N = number of people per condition = 5
     j = 1, …, M; M = number of conditions = 3
     knowledge_i1 = b0 + b1*dummy1_1 + b2*dummy2_1 + ε_i1 = b0 + b1*0 + b2*0 + ε_i1 = b0 + ε_i1
     knowledge_i2 = b0 + b1*dummy1_2 + b2*dummy2_2 + ε_i2 = b0 + b1*1 + b2*0 + ε_i2 = b0 + b1 + ε_i2
     knowledge_i3 = b0 + b1*dummy1_3 + b2*dummy2_3 + ε_i3 = b0 + b1*0 + b2*1 + ε_i3 = b0 + b2 + ε_i3

  21. Therefore:
      → μ_condition1 = b0: b0 is the mean of condition 1 (N)
      → μ_condition2 = b0 + b1 = μ_condition1 + b1, so μ_condition2 − μ_condition1 = b1: b1 is the difference between the means of condition 2 (L) and condition 1 (N)
      → μ_condition3 = b0 + b2 = μ_condition1 + b2, so μ_condition3 − μ_condition1 = b2: b2 is the difference between the means of condition 3 (LB) and condition 1 (N)
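The dummy-coding argument above can be checked numerically. The sketch below fits the regression by ordinary least squares on made-up knowledge scores (the data are illustrative, not from the slides) for three conditions of five people each, and confirms that b0 recovers the condition-1 mean while b1 and b2 recover the differences of the other condition means from it.

```python
import numpy as np

# Illustrative (made-up) knowledge scores, 5 people per condition.
knowledge = np.array([3, 2, 4, 3, 3,   # condition 1 (N)
                      5, 4, 6, 5, 5,   # condition 2 (L)
                      7, 6, 8, 7, 7])  # condition 3 (LB)

# Design matrix: intercept, dummy1 (= 1 for condition 2), dummy2 (= 1 for condition 3).
dummy1 = np.repeat([0, 1, 0], 5)
dummy2 = np.repeat([0, 0, 1], 5)
X = np.column_stack([np.ones(15), dummy1, dummy2])

# Ordinary least squares fit of: knowledge_ij = b0 + b1*dummy1_j + b2*dummy2_j + e_ij
b0, b1, b2 = np.linalg.lstsq(X, knowledge, rcond=None)[0]

means = knowledge.reshape(3, 5).mean(axis=1)
# b0 = mean(condition 1); b1 = mean(2) - mean(1); b2 = mean(3) - mean(1)
print(b0, b1, b2, means)
```

With a balanced design like this, the least-squares coefficients equal the group means and mean differences exactly, which is the "Therefore" step on the slide.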

  27. [figure: group means μ_N, μ_L, μ_LB around the grand mean μ; b0 is the height of μ_N, b1 and b2 the offsets of μ_L and μ_LB from μ_N]

  30. Sums of squares
      SST = Σ (score_ij − μ)²        (total)
      SSB = Σ N_j (μ_j − μ)²        (between conditions)
      SSW = Σ (score_ij − μ_j)²      (within conditions)
      SST = SSB + SSW
      Degrees of freedom
      dfT = MN − 1
      dfB = M − 1
      dfW = M(N − 1)
      Mean squares
      MST = SST/dfT
      MSB = SSB/dfB
      MSW = SSW/dfW
      F-ratio
      F = MSB/MSW = MS_model/MS_error
      (N = number of people per condition, M = number of conditions)
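The sum-of-squares bookkeeping above can be verified directly from the definitions. This sketch (using the same kind of made-up scores: M = 3 conditions, N = 5 people each) computes SST, SSB and SSW, checks the partition SST = SSB + SSW, and forms the F-ratio.

```python
import numpy as np

# Illustrative (made-up) scores: rows are conditions, columns are people.
scores = np.array([[3, 2, 4, 3, 3],    # condition 1
                   [5, 4, 6, 5, 5],    # condition 2
                   [7, 6, 8, 7, 7]])   # condition 3
M, N = scores.shape

grand_mean = scores.mean()
group_means = scores.mean(axis=1)

SST = ((scores - grand_mean) ** 2).sum()              # total
SSB = (N * (group_means - grand_mean) ** 2).sum()     # between conditions
SSW = ((scores - group_means[:, None]) ** 2).sum()    # within conditions

dfB, dfW = M - 1, M * (N - 1)
MSB, MSW = SSB / dfB, SSW / dfW
F = MSB / MSW

print(SST, SSB + SSW)   # the partition: SST = SSB + SSW
print(F)                # systematic / unsystematic variance
```

A large F means the between-condition (model) variance dominates the within-condition (error) variance, which is exactly the ratio the opening slide describes.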

  41. Sib analysis

  42. Sib analysis
      • a number of males (sires), each mated to a number of females (dams)
      • mating and selection of sires and dams → random
      • thus: a population of full sibs (same father, same mother; same cell in the table) and half sibs (same father, different mother; same row in the table)
      • data: measurements of all offspring
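The full-sib/half-sib structure of this design can be made concrete with a small sketch. Assuming a hypothetical balanced design (2 sires, each mated to 2 dams, 2 offspring per dam; all numbers illustrative), the code enumerates sib pairs: full sibs share both sire and dam (same cell of the table), half sibs share only the sire (same row).

```python
from itertools import combinations

# Hypothetical balanced design: 2 sires x 2 dams x 2 offspring per dam.
offspring = [(sire, dam, k)
             for sire in range(2)
             for dam in range(2)
             for k in range(2)]

# Full sibs: same sire AND same dam (same cell in the table).
full_sibs = [(a, b) for a, b in combinations(offspring, 2)
             if a[0] == b[0] and a[1] == b[1]]
# Half sibs: same sire, different dam (same row in the table).
half_sibs = [(a, b) for a, b in combinations(offspring, 2)
             if a[0] == b[0] and a[1] != b[1]]

print(len(full_sibs), len(half_sibs))
```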

  46. Sib analysis • example with 3 sires [figure: score of offspring 1 of dam 1 under sire 1, the dam-family mean μ_dam1,sire1, and the sire-family mean μ_sire1]

  47. Sib analysis • ANOVA: partitioning the phenotypic variance (VP)
      • between-sire component: the component attributable to differences between the progeny of different males
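The between-sire component and the intraclass correlation can be sketched with a balanced one-way ANOVA on sire groups (a sires-only simplification of the nested sire/dam design; the scores are made up for illustration). For a balanced design with n offspring per sire, the between-sire variance component is estimated as (MSB − MSW)/n, and the intraclass correlation t is its share of the total.

```python
import numpy as np

# Made-up scores: 3 sires, n = 5 offspring each (one-way, sires-only
# simplification of the nested sire/dam design).
scores = np.array([[4, 5, 6, 5, 5],    # progeny of sire 1
                   [5, 6, 7, 6, 6],    # progeny of sire 2
                   [6, 5, 7, 6, 6]])   # progeny of sire 3
s, n = scores.shape

grand_mean = scores.mean()
sire_means = scores.mean(axis=1)

MSB = (n * (sire_means - grand_mean) ** 2).sum() / (s - 1)         # between sires
MSW = ((scores - sire_means[:, None]) ** 2).sum() / (s * (n - 1))  # within sires

# Variance components for a balanced one-way random-effects ANOVA:
sigma2_sire = (MSB - MSW) / n    # between-sire component
sigma2_within = MSW              # within-sire (progeny) component

# Intraclass correlation: proportion of phenotypic variance between sires.
t = sigma2_sire / (sigma2_sire + sigma2_within)
print(t)
```

For half sibs, Falconer & Mackay (1996) relate this intraclass correlation to the additive genetic variance (the half-sib covariance is ¼VA), which is what makes the between-sire component genetically informative.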

  50. Sib analysis μsire3 μsire2 μsire1
