
Investigating Scientifically



Presentation Transcript


  1. Investigating Scientifically

  2. Types of investigation • Observations • Controlled experiments • Surveys • Trial and error • Case studies • Longitudinal studies • Meta-analysis—a systematic method that takes data from a number of independent studies and integrates them using statistical analysis. • Double-blind studies—used to prevent research outcomes from being influenced by the placebo effect or observer bias.

  3. Scientific Method • Scientists investigate in a methodical and systematic way. • The method used will be the one that best suits the situation. • Many investigations lead to the testing of a hypothesis and will follow a pattern: 1. recognise a problem and define a question 2. collect as much information as possible relating to the problem 3. propose a hypothesis—a possible explanation for the problem 4. test the hypothesis using an experiment 5. analyse and interpret the data collected from the experiment 6. draw conclusions about whether the hypothesis was supported or disproved 7. report on the investigation.

  4. Planning an investigation Also includes: • Literature review—a survey of the material that has been written about the subject under consideration. • Safety—it is important that an investigation presents no danger to the participants or to the investigators. • Ethics—a set of moral principles or values that are observed by most people in our society.

  5. Ethics Some of the principles that an investigation involving humans must satisfy if it is to be ethically sound are: • voluntary participation—the subjects should not be pressured into taking part in the investigation • informed consent—the subjects should be fully informed about the objectives of the research, the procedures to be followed, any possible risks and the potential benefits of the research; consent should only be sought after all information has been given • risk of harm—as mentioned in the section on safety, there should be no risk of physical or psychological harm • confidentiality—the identities of participants will not be revealed except to people directly involved in the study. • anonymity—a stronger guarantee of privacy than confidentiality; the participants in the study remain anonymous, even to the researchers.

  6. Ethics • Just as humans must be treated in an ethical way, the same applies to animals. • The requirements for investigations involving animals are set out in the Australian Code of Practice for the Care and Use of Animals for Scientific Purposes. • The code sets out detailed requirements but in general terms any use of animals in research or teaching should be: • valid • humane • justifiable • considerate.

  7. Placebos • An inactive substance that looks like the real thing. • One group of subjects, the experimental group, takes the real thing that is being tested and the other group, the control, takes the placebo. • The placebo should look exactly the same, and be given in the same way, as the substance being tested. • Subjects do not know whether they are receiving the drug or the placebo. • If there is a clear difference in results between the new drug and the placebo, then the researcher can say with confidence that the difference was due to the effectiveness of the drug.
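The split into an experimental group and a control group described above can be sketched as a simple random assignment. This is an illustrative sketch only; the subject names and group size are hypothetical, and real trials use formal randomisation protocols.

```python
import random

# Hypothetical pool of 20 volunteers; seeded so the sketch is repeatable.
random.seed(42)
subjects = [f"subject_{i}" for i in range(1, 21)]
random.shuffle(subjects)

# First half receive the real drug; second half receive the
# identical-looking placebo. In a double-blind design, neither the
# subjects nor the researchers administering the doses know which
# group a subject is in.
experimental = subjects[:10]
control = subjects[10:]

print(len(experimental), len(control))
```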

  8. Controlling variables • A variable is any factor that may change during an experiment. • The independent variable is the factor that is being investigated—the factor that is deliberately changed to determine its effect. • It is deliberately different between the control and the experimental groups in an experiment. • The independent variable may also be called the experimental variable or the manipulated variable. • The dependent variable is the factor that changes in response to the changes made to the independent variable. • It is sometimes called the responding variable.

  9. Controlling variables • Controlled variables are the factors that are kept the same for both the control and experimental groups. • Uncontrolled variables are variables that were not kept the same for the control and experimental groups and may have been overlooked by the experimenter or they may have been impossible to control. • To ensure that the results of an experiment will either support or disprove the hypothesis, only one factor, or variable, is tested at a time. • A control, or comparison, experiment is done in which the only difference is in the one variable being tested.

  10. Repetition and replication • Repetition—doing the same experiment many times. • Replication—having a number of identical experiments running together, or performing the experiment on a large number of subjects at the same time. • Both help to demonstrate that results are consistent. • If results are different each time an experiment is performed they are of little value. • Repetition and replication can also help to overcome the effects of any uncontrolled variables e.g. if 10 subjects are used in an experiment and one of them is unusual in some way it will have a big effect on the overall result. If 100 subjects were used, 1 unusual person in 100 would not have much effect on the average result. • In designing any experiment, plan for as much repetition or replication as time and resources will allow.
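The 10-versus-100-subjects point above can be checked with a quick sketch. The scores are hypothetical: every typical subject scores 50 and one unusual subject scores 150.

```python
def mean(values):
    return sum(values) / len(values)

# One outlier among 10 subjects versus one among 100.
small_group = [50] * 9 + [150]     # 10 subjects, 1 unusual
large_group = [50] * 99 + [150]    # 100 subjects, 1 unusual

print(mean(small_group))   # 60.0 — the outlier shifts the average by 10
print(mean(large_group))   # 51.0 — the same outlier shifts it by only 1
```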

  11. Validity and reliability of results • An experiment is valid when it tests what it is supposed to test. • Reliability is the extent to which an experiment gives the same result each time it is performed. • The measuring instruments used in the experiment must also be reliable; that is, give the same measurement each time they are used. • Repetition and replication are used to make sure that results are reliable. • They do not improve the accuracy of the experiment.

  12. Experimental error There are three possible types of error that may occur in an experiment: 1. Human error—simply a mistake. 2. Random errors—unpredictable, and occur because no measurement can be made with absolute precision. Reduced by taking several measurements and averaging them. 3. Systematic errors—occur because of the way in which the experiment was designed. In this case a measurement will always be too high or too low. Systematic errors cannot be reduced by averaging; the only solution is to change the experimental procedure.
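The difference between random and systematic error can be sketched numerically. The true value, scatter and bias below are all made up for illustration; the point is that averaging cancels random scatter but leaves a systematic offset untouched.

```python
import random

random.seed(0)
true_value = 20.0

# Random error: each reading scatters unpredictably around the true value.
# Averaging many readings pulls the result close to the truth.
readings = [true_value + random.uniform(-0.5, 0.5) for _ in range(1000)]
avg = sum(readings) / len(readings)

# Systematic error: a mis-calibrated instrument that always reads 0.3 high.
# The average of the biased readings is still about 0.3 too high, no matter
# how many readings are taken.
biased = [r + 0.3 for r in readings]
biased_avg = sum(biased) / len(biased)

print(avg)         # close to 20.0
print(biased_avg)  # close to 20.3
```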

  13. Quantifying results Data from an investigation can be one of two types: • quantitative data—expressed in numbers and usually involving measurement; for example, ‘the students are 174 and 176 cm in height’ • qualitative data—observations that do not involve numbers or measurement; for example, ‘student A is taller than student B’. • Wherever possible, you should design an investigation so that results are quantifiable. • Numerical results can be ranked, averaged and manipulated in other ways. They can also be summarised using graphs.

  14. Processing data • Averages • Ranges • Ratios and rates: • A ratio is a numerical statement of how one variable relates to another. • A rate is a special kind of ratio that compares a quantity with the time taken; for example, metres per second or beats per minute. • Percentages • Percentage change • Frequencies
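Each of the processing steps listed above is a one-line calculation. This is a minimal sketch with made-up values (student heights, group sizes, test scores):

```python
from collections import Counter

heights = [160, 164, 170, 174, 176]           # cm, five hypothetical students

average = sum(heights) / len(heights)          # average (mean)
value_range = max(heights) - min(heights)      # range: largest minus smallest

boys, girls = 12, 18
ratio = boys / girls                           # ratio of boys to girls

distance_m, time_s = 100, 12.5
rate = distance_m / time_s                     # rate: metres per second

passed, total = 45, 60
percentage = passed / total * 100              # percentage passing

old, new = 50, 65
percentage_change = (new - old) / old * 100    # percentage change

eye_colours = ["brown", "blue", "brown", "green", "brown"]
frequencies = Counter(eye_colours)             # frequency of each observation

print(average, value_range, rate, percentage, percentage_change)
```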

  15. Errors and limitations in data • It is important that data be checked carefully for any errors. • It is also important to understand the limitations of any data obtained from an investigation. • You must not draw conclusions that go beyond the data.

  16. Presentation of data Tables • A table has a title. The title should state the variables represented by the data. • The data are presented in columns. Usually the independent variable is in the left-hand column and the dependent variable is in the right-hand column. • Each column has a heading and, where appropriate, the heading must state the units in which the data have been measured.
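The table rules above can be applied in a small sketch: a title stating the variables, the independent variable on the left, and column headings that include units. The measurements are hypothetical.

```python
# Made-up data: time in sunlight (independent) vs water temperature (dependent).
rows = [(0, 22.0), (5, 25.5), (10, 29.0)]

title = "Effect of time in sunlight on water temperature"
header = f"{'Time (min)':<12}{'Temperature (°C)'}"
table = [title, header] + [f"{t:<12}{c}" for t, c in rows]

print("\n".join(table))
```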

  17. Presentation of data Graphs When drawing a graph it is important to remember to: • label the axes with the names of the variables • indicate the units in which each variable is measured • give the graph a title that summarises the relationship illustrated by the graph • use equal intervals of units on each axis.

  18. Reporting Reports may be written using the following headings: • title and name of the author or authors • introduction, stating the nature of the problem that was investigated and the hypothesis that was tested • materials and equipment, listing the apparatus used, particularly any specialised items of equipment • procedure, describing the exact method that was used to carry out the investigation • results, often presented as tables, graphs, diagrams or photographs • discussion, including comments about the results and the way they relate to the hypothesis that was tested • conclusion, summarising the most important parts of the discussion and stating the success or otherwise of the investigation • further research, as scientific investigations often raise more questions than they answer—many reports suggest areas that need further investigation • references, which consist of a list of any reports, books, journal articles, websites or other sources of information that have been referred to in the report • acknowledgments, which are to people who have helped with the investigation or to organisations that have provided funds for the research.
