Educational Research, Chapter 22: Evaluating a Research Report (Gay, Mills, and Airasian, 10th Edition)
Topics Discussed in this Chapter • Gathering information • General evaluation criteria • Design-specific evaluation criteria • Qualitative research in general • Observational research • Historical research • Survey – Questionnaire and Interview • Correlational – Relationship and Prediction • Causal-Comparative • Experimental
Gathering Information • Necessity of knowing what was done • Examples • What was the problem? • Who were the subjects? • What research design was used? • What were the results and conclusions? • What are the implications of the research? • Basic formats to collect information for quantitative and qualitative research
Gathering Information - Quantitative • Introduction • Problem • Provide a general statement of the problem that includes the variables and the relationships between them • State the importance of the study • Review of the Literature • List the major issues identified in the review • Hypothesis • State the specific hypothesis or hypotheses being investigated
Gathering Information - Quantitative • Method • Participants • Identify the population and sample • Describe the sampling and/or assignment procedures • Identify the size of the total sample and of each group if applicable • Describe the general characteristics of the subjects
Gathering Information - Quantitative • Method (continued) • Instruments • List the specific instruments used in the study • Describe the evidence of validity provided for each instrument • Describe the reliability evidence cited for each instrument • Describe the information needed to interpret the scores for each instrument
Gathering Information - Quantitative • Method (continued) • Design and Procedures • Identify the specific type of research design • Identify any threats to internal validity • Identify any threats to external validity • Results • Identify the specific analyses being used • A comparison between the mean scores for a control and experimental group • A correlation between students’ math attitudes and achievement • A survey of parental attitudes toward an extended school year
Gathering Information - Quantitative • Results (continued) • Identify any descriptive statistics used and summarize the results • Identify the specific statistical test of significance, report the test statistic itself, and report its level of significance • The experimental group means were significantly higher (t = 5.68, p = .023) than those for the control group • There was a significant (r² = 0.91, p = .001) positive relationship between students’ attitudes and achievement
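The two statistical examples above can be illustrated with standard tools. The following is a minimal Python sketch, not taken from the text, showing how an independent-samples t test and a Pearson correlation of the kind reported in a quantitative Results section might be computed; all data and variable names are hypothetical.

# Illustrative sketch with hypothetical data (not from the study examples above)
import numpy as np
from scipy import stats

# Hypothetical achievement scores for an experimental and a control group
experimental = np.array([78, 85, 91, 88, 76, 95, 82, 89])
control = np.array([70, 74, 81, 68, 77, 72, 79, 75])

# Independent-samples t test comparing the two group means
t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Hypothetical attitude scores paired with the experimental achievement scores
attitudes = np.array([3.1, 4.0, 4.5, 4.2, 2.9, 4.8, 3.6, 4.3])
r, p_corr = stats.pearsonr(attitudes, experimental)
print(f"r = {r:.2f}, r-squared = {r**2:.2f}, p = {p_corr:.3f}")

When evaluating a report, the reader would check that the reported test matches the design (for example, a t test for a two-group mean comparison) and that both the test statistic and its level of significance are given, as in the slide's examples.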
Gathering Information - Quantitative • Discussion • Identify the specific conclusions of the researchers • Discuss the implications described by the researchers
Gathering Information – Qualitative • Introduction • Research topic • Provide a statement of the general issue, topic, or question being investigated • Describe any reformulation of the topic on the basis of the ongoing interactive nature of the collection, analysis, and synthesis of data • Discuss the importance of the topic
Gathering Information – Qualitative • Introduction (continued) • Review of the literature • Describe the nature of the review of the literature • List the major issues identified in the review of the literature
Gathering Information – Qualitative • Method • Site and participant selection • Describe the strategies used to gain entry to the site • Describe the site • Identify the participant(s) and list the sampling strategies used to select them • Describe the characteristics of the participant(s)
Gathering Information – Qualitative • Method (continued) • Data collection and analysis • Describe the researcher’s role in the study • Report the data collection strategies used • Identify any instruments or protocols used by the researchers • Identify any threats to the quality of the data (i.e., observer bias and observer effect) • Describe the strategies used to enhance validity and reduce bias in data collection • Describe the strategies used to classify and interpret data
Gathering Information – Qualitative • Method (continued) • Research approach and procedures • Identify the research approach • Briefly describe the procedures used • Identify any ethical issues related to the study • Results • Report the findings • Describe the researcher’s interpretation of the findings
Gathering Information – Qualitative • Discussion • Report the researcher’s conclusions • State the relationship between the conclusions and the initial problem
Focus of General Evaluation Criteria • See the evaluation criteria in the text and on the web site • Introduction • Problem • Review of the related literature • Hypotheses • Methods • Participants • Instruments • Research design and procedures
Focus of General Evaluation Criteria • Results • Discussion • Abstract or summary
Type-Specific Evaluation Criteria • Descriptive research • Questionnaire studies • Are pilot study procedures and results described? • Are directions to questionnaire respondents clear? • Does each item in the questionnaire relate to one of the objectives of the study? • Does each questionnaire item deal with a single concept? • When necessary, is a point of reference given for questionnaire scales?
Type-Specific Evaluation Criteria • Descriptive research (continued) • Questionnaire studies (continued) • Are leading questions avoided in the questionnaire? • Are there sufficient alternatives for each questionnaire item? • Does the cover letter explain the purpose and importance of the study and give the potential respondent a good reason to cooperate? • If appropriate, is confidentiality or anonymity assured in the cover letter?
Type-Specific Evaluation Criteria • Descriptive research (continued) • Questionnaire studies (continued) • What is the percentage of returns, and how does this affect the study results? • Are follow-up activities to increase returns described? • If the response rate was low, was any attempt made to determine major differences between respondents and non-respondents? • Are data analyzed in groups or clusters rather than as a series of many single-variable analyses?
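As a concrete illustration of the return-rate and non-respondent checks listed above, the following Python sketch (hypothetical counts, not from the text) computes the percentage of returns and applies a chi-square test to look for differences between respondents and non-respondents on a demographic variable.

# Illustrative sketch with hypothetical questionnaire-return figures
from scipy.stats import chi2_contingency

surveys_sent = 400
surveys_returned = 252
response_rate = surveys_returned / surveys_sent * 100  # percentage of returns
print(f"Response rate: {response_rate:.1f}%")

# Hypothetical counts of respondents and non-respondents by school level,
# used to check whether non-respondents differ systematically
#                  elementary, middle, high
respondents = [110, 82, 60]
non_respondents = [45, 48, 55]
chi2, p, dof, expected = chi2_contingency([respondents, non_respondents])
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

A nonsignificant result offers some reassurance that respondents resemble non-respondents on that variable, though it cannot rule out differences on unmeasured characteristics.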
Type-Specific Evaluation Criteria • Correlational research • Relationships • Were variables carefully selected? • Is the rationale for variable selection described? • Do the conclusions avoid suggesting causal relationships between variables?
Type-Specific Evaluation Criteria • Causal-comparative research • Are the characteristics or experiences that differentiate the groups clearly defined or described? • Are critical extraneous variables identified? • Were any control procedures applied to equate the groups on extraneous variables? • Are plausible alternative hypotheses discussed?
Type-Specific Evaluation Criteria • Experimental research • Was an appropriate experimental design selected? • Is a rationale given for the design selected? • Is the method of group formation described? • Was the experimental group formed in the same way as the control group? • Were groups randomly formed and the use of existing groups avoided?
Type-Specific Evaluation Criteria • Experimental research (continued) • Were treatments randomly assigned to groups? • Were critical extraneous variables identified? • Were any control procedures applied to equate groups on extraneous variables? • Were the results generalized to the appropriate group?
Type-Specific Evaluation Criteria • Interview studies • Were the interview procedures pretested? • Are pilot study procedures and results described? • Does each item in the interview guide relate to a specific objective of the study? • When necessary, is a point of reference given in the guide for interview items? • Are leading questions avoided in the interview guide? • Are the language and complexity of the questions appropriate for the participants?
Type-Specific Evaluation Criteria • Interview studies (continued) • Does the interview guide indicate the type and amount of prompting and probing that was permitted? • Are the qualifications and special training of the interviewers described? • Is the method used to record responses described? • Did the researcher use the most reliable, unbiased method of recording responses?
Type-Specific Evaluation Criteria • Mixed methods research • Does the study use at least one quantitative and one qualitative research method? • Does the study include a rationale for using a mixed methods research design? • Does the study include a classification of the type of mixed methods research design?
Type-Specific Evaluation Criteria • Mixed methods research (continued) • Was the study feasible given the amount of data to be collected and concomitant issues of resources, time, and expertise? • Does the study include both quantitative and qualitative research questions? • Does the study clearly identify qualitative and quantitative data collection techniques? • Does the study use appropriate data analysis techniques for the type of mixed methods design?
Validity and Reliability • Threats to internal validity in qualitative studies • Did the researcher effectively deal with problems of history and maturation by documenting historical changes over time? • Did the researcher effectively deal with problems of mortality by using a large enough sample? • Was the researcher in the field long enough to effectively minimize observer effects? • Did the researcher take the time to become familiar and comfortable with participants?
Validity and Reliability • Threats to internal validity in qualitative studies (continued) • Were the interview questions pretested? • Did the researcher interview key informants to verify field observations? • Were participants demographically screened to ensure that they were representative of the larger population?
Validity and Reliability • Threats to internal validity in qualitative studies (continued) • Were the data collected using different media to facilitate cross-validation? • Were participants allowed to evaluate the researcher’s results before publication? • Are sufficient data presented to support the findings and conclusions?
Validity and Reliability • Threats to reliability in qualitative studies • Is the researcher’s relationship with the group and setting fully described? • Is all field documentation comprehensive, fully cross-referenced and annotated, and rigorously detailed?
Validity and Reliability • Threats to reliability in qualitative studies (continued) • Are the construction, planning, and testing of all instruments documented? • Are key informants fully described, including information on the groups they represent and their community status? • Are sampling techniques fully documented as being sufficient for the study?