1. Research Methodology - an informal and personal view
Dr Brian Lings, Department of Computer Science, University of Exeter
BNCOD16, 5th July 1998
2. Aim
To consider research planning from the perspective of Ph.D. requirements in the UK within the area of Database research.
Also, to get feedback: how to refine the talk and make it more generally applicable.
3. Objectives
To explore the impact of the thesis statement on the methodological approach to, and presentation of, research
To consider a range of different methods in the Database context
4. Research Requirements (UK)
Original, significant contribution, knowledge of field, critical judgement, unified work
Contribution to learning, systematic study related to body of knowledge
5. Scientific Method (Traditional)
observation and experiment
inductive generalization
hypothesis
attempted verification of hypothesis
proof or disproof
knowledge
Note: induction is not valid logically - it is a psychological phenomenon
e.g. interface: use it yourself over and over: rugged; get a novice (or a supervisor) to look at it - it will crash instantly.
Counterexamples can disprove a hypothesis - note that step 4 psychologically predisposes one to find conforming instances; these prove nothing (cf. testing; see the sketch after these notes)
What does all this say about knowledge?
Important:
There are an infinite number of hypotheses which will account for the same set of results. Hence the inductive problem.
A false hypothesis may still lead to an infinite number of true deductions.
No number of affirmative results will in themselves increase the probability of the hypothesis being valid.
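The "cf. testing" remark above can be made concrete with a small sketch in Python. It is not from the talk: the merge_intervals routine and the claimed property are invented purely for illustration. The methodological point is to generate inputs in an attempt to refute the hypothesis, rather than re-running inputs already expected to confirm it.

```python
import random

def merge_intervals(intervals):
    """Hypothetical routine under test: merge overlapping (start, end) intervals."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def hypothesis_holds(intervals):
    """Claimed property (the hypothesis): no two output intervals overlap or touch."""
    out = merge_intervals(intervals)
    return all(out[i][1] < out[i + 1][0] for i in range(len(out) - 1))

# Attempted refutation: search randomly for a counter-example rather than
# re-running cases already expected to confirm the hypothesis.
random.seed(0)
for trial in range(10_000):
    case = [(a, a + random.randint(0, 5))
            for a in (random.randint(0, 20) for _ in range(random.randint(0, 6)))]
    if not hypothesis_holds(case):
        print("Counter-example found:", case)
        break
else:
    print("No counter-example in 10,000 trials: the hypothesis survives, unproven.")
```

The closing message deliberately echoes the note above: surviving many trials does not increase the probability that the hypothesis is valid, it merely leaves it unrefuted.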
6. Scientific Method (Popper 1)
problem
proposed solution (new theory)
deduction of testable propositions
tests: attempted refutations
preference established between competing theories
Closer to what is actually practised.
Proposed solution may be formulated through the psychological process of induction, but induction plays no part in the method itself.
Absolute importance of testable (falsifiable) - equated with scientific.
Note attempt to refute - predispose to find counter-examples.
Try to explain counter-examples rather than ruling them out by narrowing the range of applicability (this reduces information content of hypothesis and is methodologically poor).
Establish preference: more general; more economic;...
Expect any hypothesis to meet its Waterloo. This is the right way to view criticism - we learn from counter-examples, not from confirming examples.
7. The Research Problem
With which community do you identify?
What problems have you identified and with what current theories or techniques?
What hypotheses will you investigate?
Are these hypotheses falsifiable?
Importance of community - shared understanding, shared methods.
Within PhD unlikely to come up with entirely novel hypotheses - usually try to improve on results of others, e.g.
apply them elsewhere
generalise
Always ask yourself: How would my claim be falsified?
8. A Good Hypothesis
is falsifiable
is focused, related to other work but offers scope for originality
will result in a significant contribution whatever the outcome of the research
can be investigated in the time available
is broad enough to be interesting - and your goal should be to falsify it in order to improve it.
Significant whatever the outcome: information content is high for your community even for a negated hypothesis. e.g. thorough analysis of reasons for negation, or limits of hypothesis and reasons for non-generality.
Broad enough to be interesting: temptation to reduce applicability (constrain more and more) to fit the hypothesis - which then becomes uninteresting (low information content).
High information content: there are many ways of falsifying it. e.g. Einstein's general theory of relativity: observations of star positions by night and day.
9. Evaluating a Thesis Statement
for Tools development
for Theoretical advancement
for Technical advancement
for Methods development
for Human Factors research
No attempt to make these mutually exclusive: they are not a classification scheme.
Which best typifies yours?
Examples of good and bad thesis statements sought.
10. Evaluating a Thesis Statement for Tools development
11. Evaluating a Thesis Statement for Theoretical advancement
12. Evaluating a Thesis Statement for Technical advancement
The first is normally done through a review and classification of existing approaches, along the dimensions chosen for evaluation.
13. Evaluating a Thesis Statement for Methods development
14. Evaluating a Thesis Statement for Human Factors research
15. Research Methods
What subsidiary hypotheses will you be testing?
For each, what method will allow you to claim you have tested the hypothesis?
Do you have the knowledge and resources to address each hypothesis?
Methods should be explicit but not intrusive, i.e. given a community there should be a short-hand way of communicating which method is being applied.
This may be by reference to its use elsewhere.
If your method needs detailed elaboration, it is probably unacceptable.
Remember your audience is your chosen community.
16. Hypotheses and Methods
Methods follow from hypotheses, not the other way around
DO ask yourself what you need to implement and for what specific purpose
DO NOT decide to implement and then think about contribution
17. Methods (informal classification)
Formal
Case Based Reasoning
Empirical
Quantitative
Qualitative
Again, not a classification - these overlap (e.g. formal proof assisted by case-based reasoning, as in the proof of the four colour theorem).
The role of proof-of-principle systems - comes from an engineering rather than a science background.
18. Formal
Properties of systems
termination of ECA rule sets (see the sketch after this slide)
correctness of locking protocols
correctness of Join algorithms
Complexity measures
time complexity of temporal queries
efficient buffer strategies for synchronized data retrieval
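As a hedged illustration of the first example above (termination of ECA rule sets), the sketch below checks a standard sufficient condition from the active-database literature: if the triggering graph (rules as nodes, with an edge from r1 to r2 whenever r1's action can raise r2's event) is acyclic, every cascade of rule firings terminates. The rule names and edges are invented for the example.

```python
# A minimal sketch, assuming a made-up ECA rule set: an edge r1 -> r2 means
# that r1's action can raise the event that triggers r2. Acyclicity of this
# triggering graph is a sufficient (not necessary) condition for termination.
triggering_graph = {
    "audit_on_update": ["refresh_summary"],
    "refresh_summary": ["check_quota"],
    "check_quota": [],  # change to ["audit_on_update"] to introduce a cycle
}

def has_cycle(graph):
    """Depth-first search with three colours; a back edge means a cycle."""
    WHITE, GREY, BLACK = 0, 1, 2
    nodes = set(graph) | {succ for succs in graph.values() for succ in succs}
    colour = {node: WHITE for node in nodes}

    def visit(node):
        colour[node] = GREY
        for succ in graph.get(node, []):
            if colour[succ] == GREY:  # back edge: a triggering cycle exists
                return True
            if colour[succ] == WHITE and visit(succ):
                return True
        colour[node] = BLACK
        return False

    return any(colour[node] == WHITE and visit(node) for node in nodes)

if has_cycle(triggering_graph):
    print("Triggering cycle found: termination is not guaranteed by this test")
else:
    print("Triggering graph is acyclic: every rule cascade terminates")
```

A real study would also have to argue what happens when the graph does contain cycles, since acyclicity is only a sufficient condition.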
19. Case Based Reasoning
Properties of systems
complete semantic capture in ER to SQL3
UML repository design for F3
an improved API for temporal databases
Complexity measures
an improvement on a method for coupling of databases and expert systems
20. Empirical: Quantitative
Simulation
predicting the behaviour of a locking scheme or buffering algorithm
a comparative study of database caching algorithms in client-server architectures (see the sketch after this slide)
Profiling
benchmarking of trigger management in current DBMS
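A minimal sketch of the simulation idea above, assuming a synthetic workload: replay a skewed page-reference trace against two simple replacement policies (LRU and FIFO) and compare hit rates. The trace, buffer capacity, and skew are invented parameters chosen only to make the comparison runnable; they are not results from any real study.

```python
import random
from collections import OrderedDict, deque

def lru_hits(trace, capacity):
    """Hit count for a least-recently-used buffer of the given capacity."""
    buffer, hits = OrderedDict(), 0
    for page in trace:
        if page in buffer:
            hits += 1
            buffer.move_to_end(page)
        else:
            if len(buffer) >= capacity:
                buffer.popitem(last=False)  # evict least recently used page
            buffer[page] = True
    return hits

def fifo_hits(trace, capacity):
    """Hit count for a first-in-first-out buffer of the given capacity."""
    resident, queue, hits = set(), deque(), 0
    for page in trace:
        if page in resident:
            hits += 1
        else:
            if len(resident) >= capacity:
                resident.discard(queue.popleft())  # evict oldest resident page
            resident.add(page)
            queue.append(page)
    return hits

# Synthetic, skewed page-reference trace (assumed workload, for illustration only).
random.seed(42)
pages = list(range(200))
weights = [1.0 / (rank + 1) for rank in range(len(pages))]  # Zipf-like skew
trace = random.choices(pages, weights=weights, k=50_000)

capacity = 32
for name, policy in [("LRU", lru_hits), ("FIFO", fifo_hits)]:
    print(f"{name}: hit rate = {policy(trace, capacity) / len(trace):.3f}")
```

Under a skewed trace like this one would expect LRU to achieve the higher hit rate, but the point of the sketch is the method (controlled replay and comparison under stated assumptions), not the numbers.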
21. Empirical: Qualitative
Evaluation
a comparative study of the quality of data modelling notations for user feedback
visualisation in scientific databases: is it effective?
Diagnosis
why do CASE tools fail to improve DBA performance in schema maintenance?
22. Reporting on Methods
Show that you have understood the method(s) chosen
advantages and disadvantages
limitations of the findings due to the method or available resources
Be clear about your use of the method(s)
detail for reproducibility
honesty for credibility
23. Limitations of Methods
consider: scalability of techniques
generality of results
counter-indicators
affecting factors
24. References
'Popper'; Bryan Magee, Fontana Press, 1985
various, including 'Questions to Ask of AI Research' and 'A Scientific Checklist'; Alan Bundy, DAI Edinburgh
http://www.dai.ed.ac.uk/daidb/people/homes/bundy/how-tos/how-tos.html
25. References (cont.)
The Research Student's Guide to Success, Pat Cryer, OU Press, 1996
Supervising the Ph.D.: A guide to success, Sara Delamont et al, OU Press, 1997