
Research Methodology


  1. Research Methodology Lecture #3

  2. Roadmap • Understand what research is • Define a research problem • How to solve it? • Research methods • Experiment design • How to write and publish an IT paper?

  3. How to solve the problem? • Understanding the problem • Distinguishing the unknown, the data, and the condition • Devising a plan • Connecting the data to the unknown, finding related problems, relying on previous findings • Carrying out the plan • Validating each step and, if possible, proving correctness • Looking back • Checking the results, contemplating alternative solutions, exploring the further potential of the result/method

  4. Research Methods vs. Methodology • Research methods are the methods by which you conduct research into a subject or topic. • They involve conducting experiments, tests, surveys, and so on. • Research methodology is the way in which research problems are solved systematically. • It is the science of studying how research is conducted scientifically. • It involves learning the various techniques that can be used to conduct research and to conduct tests, experiments, surveys, and critical studies. http://www.differencebetween.com/difference-between-research-methods-and-vs-research-methodology/

  5. Research Approaches • Quantitative approach (uses experimental, inferential, and simulation approaches to research) • Qualitative approach (uses techniques like in-depth interviews and focus group interviews) Shashikant S. Kulkarni, Research Methodology: An Introduction

  6. Types of Research in General • Descriptive • Analytical • Applied • Fundamental • Quantitative • Qualitative • Conceptual • Empirical • Other types Shashikant S. Kulkarni, Research Methodology: An Introduction

  7. Descriptive vs. Analytical • In descriptive research, the researcher only has to report what is happening or what has happened. • In analytical research, the researcher uses already available facts or information and analyses them to make a critical evaluation of the subject. Shashikant S. Kulkarni, Research Methodology: An Introduction

  8. Applied vs. Fundamental • Applied research is an attempt to find a solution to an immediate problem encountered by a firm, an industry, a business organization, or society. • Fundamental research is the gathering of knowledge for knowledge's sake; it is also called 'pure' or 'basic' research. Shashikant S. Kulkarni, Research Methodology: An Introduction

  9. Quantitative vs. Qualitative • Quantitative research involves the measurement of quantity or amount (e.g., economic and statistical methods). • Qualitative research is concerned with aspects related to or involving quality or kind (e.g., motivational research involving the behavioural sciences). Shashikant S. Kulkarni, Research Methodology: An Introduction

  10. Conceptual vs. Empirical • Conceptual research relates to some abstract idea or theory (e.g., philosophers and thinkers use it to develop new concepts). • Empirical research relies on observation or experience, with hardly any regard for theory and system. Shashikant S. Kulkarni, Research Methodology: An Introduction

  11. Other Types of Research • One-time or longitudinal research (on the basis of time) • Laboratory, field-setting, or simulation research (on the basis of environment) • Historical research Shashikant S. Kulkarni, Research Methodology: An Introduction

  12. Research Method Classification in Computer Science • Scientific: understanding nature • Engineering: providing solutions • Empirical: data-centric models • Analytical: theoretical formalism • Computing: hybrid of methods From W.R. Adrion, Research Methodology in Software Engineering, ACM SIGSOFT Software Engineering Notes, Jan. 1993

  13. Scientist vs. Engineer • A scientist sees a phenomenon, asks "why?", and proceeds to research the answer to the question. • An engineer sees a practical problem and wants to know "how" to solve it and "how" to implement that solution, or "how" to do it better if a solution already exists. • A scientist builds in order to learn; an engineer learns in order to build.

  14. The Scientific Method • Observe the real world • Propose a model or theory of some real-world phenomenon • Measure and analyze the above • Validate the hypotheses of the model or theory • If possible, repeat

  15. The Engineering Method • Observe existing solutions • Propose better solutions • Build or develop the better solution • Measure, analyze, and evaluate • Repeat until no further improvements are possible

  16. The Empirical Method • Propose a model • Develop a statistical or other basis for the model • Apply to case studies • Measure and analyze • Validate, and then repeat
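
As a rough illustration of this loop, here is a minimal Python sketch: propose a model (defects grow linearly with program size), develop a statistical basis for it (a least-squares fit), apply it to case-study data, and measure how well it fits. The data points are hypothetical, invented purely for illustration.

```python
import numpy as np

# Hypothetical case-study data: program size (KLOC) vs. observed defect count
size_kloc = np.array([1.2, 3.5, 5.1, 8.0, 12.4, 20.3])
defects = np.array([9.0, 25.0, 41.0, 60.0, 95.0, 150.0])

# Propose a model: defects = a * size + b; develop its statistical basis
# with an ordinary least-squares fit.
a, b = np.polyfit(size_kloc, defects, deg=1)

# Measure and analyze: how well does the model explain the case studies?
predicted = a * size_kloc + b
ss_res = float(np.sum((defects - predicted) ** 2))
ss_tot = float(np.sum((defects - defects.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot

print(f"defects ~= {a:.1f} * KLOC + {b:.1f}  (R^2 = {r_squared:.3f})")
# Validate: apply the fitted model to *new* case studies before trusting it,
# then repeat the cycle with a refined model if the fit is poor.
```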

  17. The Analytical Method • Propose a formal theory or set of axioms • Develop the theory • Derive results • If possible, compare with empirical observations • Refine the theory if necessary
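
A small illustrative sketch (Python, chosen here only for illustration) makes the "derive, then compare" step concrete: theory says binary search on n sorted elements needs at most floor(log2 n) + 1 probes, and we can check that derived bound against empirical observations.

```python
import math
import random

def probes(xs, target):
    """Count the probes binary search makes on a sorted list xs."""
    lo, hi, count = 0, len(xs) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        count += 1
        if xs[mid] == target:
            break
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return count

n = 1_000_000
xs = list(range(n))

# Derived result: at most floor(log2 n) + 1 probes for any search.
derived_bound = math.floor(math.log2(n)) + 1

# Empirical observation: worst case over many random searches.
observed = max(probes(xs, random.randrange(n)) for _ in range(10_000))

print(f"derived bound: {derived_bound}, observed worst case: {observed}")
```

If the observations ever exceeded the bound, the theory (or the implementation) would have to be refined, which is exactly the last step of the method.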

  18. The Computing Method

  19. Empirical Method example (1) • Do algorithm animations assist learning?: an empirical study and analysis • Algorithm animations are dynamic graphical illustrations of computer algorithms, and they are used as teaching aids to help explain how the algorithms work. Although many people believe that algorithm animations are useful this way, no empirical evidence has ever been presented supporting this belief. We have conducted an empirical study of a priority queue algorithm animation, and the study's results indicate that the animation only slightly assisted student understanding. In this article, we analyze those results and hypothesize why algorithm animations may not be as helpful as was initially hoped. We also develop guidelines for making algorithm animations more useful in the future.

  20. Empirical Method example (2) • An empirical study of FORTRAN programs • A sample of programs, written in FORTRAN by a wide variety of people for a wide variety of applications, was chosen ‘at random’ in an attempt to discover quantitatively ‘what programmers really do’. Statistical results of this survey are presented here, together with some of their apparent implications for future work in compiler design. The principal conclusion which may be drawn is the importance of a program ‘profile’, namely a table of frequency counts which record how often each statement is performed in a typical run; there are strong indications that profile-keeping should become a standard practice in all computer systems, for casual users as well as system programmers. This paper is the report of a three month study undertaken by the author and about a dozen students and representatives of the software industry during the summer of 1970. It is hoped that a reader who studies this report will obtain a fairly clear conception of how FORTRAN is being used, and what compilers can do about it.
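
Knuth's central notion of a program "profile", a table of frequency counts recording how often each part of a program runs, has a direct modern counterpart. As a minimal sketch, Python's built-in cProfile collects per-function call counts and times for a typical run; the fib function here is just a stand-in workload, not anything from the paper.

```python
import cProfile
import pstats

def fib(n):
    """Deliberately naive recursive Fibonacci: a stand-in 'typical run'."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

profiler = cProfile.Profile()
profiler.enable()
fib(25)
profiler.disable()

# The "profile" in Knuth's sense: frequency counts per function,
# most frequently called first.
pstats.Stats(profiler).sort_stats("ncalls").print_stats(5)
```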

  21. Research Phases • Informational: gathering information through reflection, literature, and surveys of people • Propositional: proposing/formulating a hypothesis, method, algorithm, theory, or solution • Analytical: analyzing and exploring the proposition, leading to a formulation, principle, or theory • Evaluative: evaluating the proposal R.L. Glass, A structure-based critique of contemporary computing research, Journal of Systems and Software, Jan. 1995

  22. Method-Phase Matrix

  23. Experimenting: experiment design

  24. Scientific method in one minute • Use experience and observations to gain insight about a phenomenon • Construct a hypothesis • Use hypothesis to predict outcomes • Test hypothesis by experimenting • Analyze outcome of experiment • Go back to step 1 http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  25. Typical computer science scenario • A particular task needs to be solved by a software system • This task is currently solved by an existing system (a baseline) • You propose a new system that is, in your opinion, better • You argue why your proposed system is better than the baseline • You support your arguments by providing evidence that your system is indeed better http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  26. Running example in this lecture Text entry on a Tablet PC: • Handwriting recognition • Software keyboard http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  27. Why experiments? • Substantiate claims • A research paper needs to provide evidence to convince other researchers of the paper’s main points • Strengthen or falsify hypotheses • “My system/technique/algorithm is [in some aspect] better than previously published systems/techniques/algorithms” • Evaluate and improve/revise/reject models • “The published model predicts users will type at 80 wpm on average after 40 minutes of practice with a thumb keyboard. In our experiment no one surpassed 25 wpm after several hours of practice.” • Gain further insights, stimulate thinking and creativity http://www.cl.cam.ac.uk/teaching/0910/C00/L10/
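
The wpm (words per minute) figures quoted above follow the common text-entry convention that a "word" is five characters, spaces included. A minimal helper, with hypothetical session data, shows how such a measure is computed:

```python
def words_per_minute(chars_entered: int, seconds: float) -> float:
    # Text-entry convention: one "word" = five characters (spaces included).
    return (chars_entered / 5.0) / (seconds / 60.0)

# Hypothetical session: 120 characters transcribed in 90 seconds
print(f"{words_per_minute(120, 90.0):.1f} wpm")  # -> 16.0 wpm
```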

  28. Experiments in Computer Science

  29. Experiments in Computer Science

  30. Experiment example http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  31. Research Method: Feasibility Study • Metaphor: Christopher Columbus and the western route to India • Is it possible to solve a specific kind of problem effectively? • computer science perspective (Turing test, …) • engineering perspective (build efficiently; fast, small) • economic perspective (cost effective; profitable) • Is the technique new/novel/innovative? • compare against alternatives • see literature survey; comparative study • Proof by construction • build a prototype • often by applying it to a "CASE" • primarily qualitative; "lessons learned" • quantitative • economic perspective: cost vs. benefit • engineering perspective: speed, memory footprint http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  32. Feasibility Study • Example: A feasibility study for power management in LAN switches We examine the feasibility of introducing power management schemes in network devices in the LAN. Specifically, we investigate the possibility of putting various components on LAN switches to sleep during periods of low traffic activity. Traffic collected in our LAN indicates that there are significant periods of inactivity on specific switch interfaces. Using an abstract sleep model devised for LAN switches, we examine the potential energy savings possible for different times of day and different interfaces (e.g., interfaces connecting hosts to switches, interfaces connecting switches, or interfaces connecting switches and routers). Algorithms developed for sleeping, based on periodic protocol behavior as well as traffic estimation, are shown to be capable of conserving significant amounts of energy. Our results show that sleeping is indeed feasible in the LAN, in some cases with very little impact on other protocols. However, we note that in order to maximize energy savings while minimizing sleep-related losses, we need hardware that supports sleeping. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1348125&tag=1

  33. Pilot Case • Metaphor: Portugal (Amerigo Vespucci) explores the western route • Here is an idea that has proven valuable; does it work for us? • Proven valuable • accepted merits (e.g. "lessons learned" from a feasibility study) • there is some (implicit) theory explaining why the idea has merit • Does it work for us? • context is very important • demonstrated on a simple yet representative "CASE" • "Pilot case" ≠ "pilot study" • Proof by construction • build a prototype • apply it on a "case" http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  34. Pilot Case • Example: Code quality analysis in open source software development • Abstract: Proponents of open source style software development claim that better software is produced using this model compared with the traditional closed model. However, there is little empirical evidence in support of these claims. In this paper, we present the results of a pilot case study aiming: (a) to understand the implications of structural quality; and (b) to figure out the benefits of structural quality analysis of the code delivered by open source style development. To this end, we have measured quality characteristics of 100 applications written for Linux, using a software measurement tool, and compared the results with the industrial standard that is proposed by the tool. Another target of this case study was to investigate the issue of modularity in open source, as this characteristic is being considered crucial by the proponents of open source for this type of software development. We have empirically assessed the relationship between the size of the application components and the delivered quality measured through user satisfaction. We have determined that, up to a certain extent, the average component size of an application is negatively related to the user satisfaction for this application. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.102.7392

  35. Comparative Study • Here are two techniques; which one is better? • For a given purpose! • Where are the differences? What are the tradeoffs? • Criteria checklist • qualitative and quantitative • qualitative: how do you remain unbiased? • quantitative: do the measures represent what you want to know? • Often by applying the techniques on a "CASE" • Compare • typically in the form of a table http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  36. Comparative Study • A comparative study of fuzzy rough sets • Abstract http://www.sciencedirect.com/science/article/pii/S016501140100032X

  37. Observational Study • Understand phenomena through observations • Metaphor: Dian Fossey, "Gorillas in the Mist" • Systematic collection of data derived from direct observation of everyday life • phenomena are best understood in the fullest possible context • observation & participation • interviews & questionnaires http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  38. Observational Study • Example: Action Research • Action research is carried out by people who usually recognize a problem or limitation in their workplace situation and, together, devise a plan to counteract the problem, implement the plan, observe what happens, reflect on these outcomes, revise the plan, implement it, reflect, revise and so on. • Conclusions • primarily qualitative: classifications/observations/… http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  39. Literature Survey • What is known? What questions are still open? • Systematic • comprehensive • a precise research question is a prerequisite • a defined search strategy (rigor, completeness, replication) • clearly defined scope • criteria for inclusion and exclusion • specify the information to be obtained • the "CASES" are the selected papers

  40. Formal Model • How can we understand/explain the world? • make a mathematical abstraction of a certain problem • analytical model, stochastic model, logical model, rewrite system, ... • prove some important characteristics • Example: A Formal Model of Crash Recovery in a Distributed System • Abstract: A formal model for atomic commit protocols for a distributed database system is introduced. The model is used to prove existence results about resilient protocols for site failures that do not partition the network, and then for partitioned networks. For site failures, a pessimistic recovery technique, called independent recovery, is introduced and the class of failures for which resilient protocols exist is identified. For partitioned networks, two cases are studied: the pessimistic case in which messages are lost, and the optimistic case in which no messages are lost. In all cases, fundamental limitations on the resiliency of protocols are derived. http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/

  41. Simulation • What would happen if …? • study the circumstances of phenomena in detail • simulate because the real world is too expensive, too slow, or impossible to observe • make prognoses about what can happen in certain situations • test against real observations, typically obtained via a "CASE" • Examples • distributed systems (grids); network protocols: too expensive or too slow to test in real life • embedded systems, simulating hardware platforms: impossible to observe at real clock speed http://win.ua.ac.be/~sdemey/Tutorial_ResearchMethods/
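
To make this concrete, here is a minimal simulation sketch in Python (an illustrative single-server FIFO queue, not tied to any system named above): it answers a "what would happen if traffic grew?" question that would be slow or expensive to test on real hardware.

```python
import random

def mean_wait(arrival_rate: float, service_rate: float, n_jobs: int = 100_000) -> float:
    """Mean waiting time in a single-server FIFO queue, estimated by simulation."""
    random.seed(42)
    clock = 0.0        # arrival time of the current job
    server_free = 0.0  # time at which the server next becomes idle
    total_wait = 0.0
    for _ in range(n_jobs):
        clock += random.expovariate(arrival_rate)    # next job arrives
        start = max(clock, server_free)              # wait if the server is busy
        total_wait += start - clock
        server_free = start + random.expovariate(service_rate)
    return total_wait / n_jobs

# "What would happen if load rose from 50% to 90% of capacity?"
print(f"load 0.5: mean wait {mean_wait(0.5, 1.0):.2f}")
print(f"load 0.9: mean wait {mean_wait(0.9, 1.0):.2f}")  # waits grow sharply
```

For this queue, standard queueing theory (M/M/1) predicts mean waits of 1.0 and 9.0 time units respectively, so the simulation itself can be validated against an analytical model, tying back to the previous slide.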

  42. Back to our example • Why this experiment? • Despite decades of research, there is no empirical data on the text entry performance of handwriting recognition • An inappropriate study of handwriting (sans recognition) from 1967 keeps getting cited in the literature, often through secondary or tertiary sources (handbooks, etc.) • Based on these numerous citations in research papers, handwriting recognition is perceived to be rather slow • However, there is no empirical evidence that supports this claim http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  43. Controlled experiments and hypotheses • A controlled experiment tests the validity of one or more hypotheses • Here we consider the simplest case: one method vs. another method • Each method is referred to as a condition • The null hypothesis H0 states there is no difference between the conditions • Our hypothesis H1 states there is a difference between the conditions • To show a statistically significant difference, the null hypothesis H0 needs to be rejected http://www.cl.cam.ac.uk/teaching/0910/C00/L10/
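
In practice, rejecting H0 means running a statistical test on the measurements from the two conditions. A minimal sketch with SciPy, using invented wpm data for the two conditions of the running example; the independent-samples t-test used here presumes a between-subjects design and roughly normal data:

```python
from scipy import stats

# Hypothetical wpm measurements from the two conditions
handwriting = [14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 13.2, 14.7]
keyboard = [22.5, 24.1, 21.8, 23.0, 25.2, 22.9, 23.7, 24.4]

# H0: no difference between the conditions; H1: there is a difference
t_stat, p_value = stats.ttest_ind(handwriting, keyboard)

alpha = 0.05  # conventional significance level
if p_value < alpha:
    print(f"p = {p_value:.4f} < {alpha}: reject H0, significant difference")
else:
    print(f"p = {p_value:.4f} >= {alpha}: cannot reject H0")
```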

  44. Choice of baseline • A baseline needs to be accepted by your readers as a suitable baseline • Preferably, the baseline is the best method currently available • In practice, the baseline is often a standard method that is well understood but not representative of the state of the art http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  45. Example • Our example: two conditions http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  46. Why this baseline? • The software keyboard is well understood • many empirical studies of its performance • expert computational performance models also exist • The software keyboard is the de facto standard text entry method on tablets • The literature compares handwriting recognition text entry performance against measures of the software keyboard http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  47. Aim of controlled experiment • To measure effects of the different conditions • To control for all other confounding factors • To be internally valid • To be externally valid • To be reproducible http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  48. Experimental design • Dependent and independent variables • Within-subjects vs. between-subjects • Mixed designs • Single session vs. longitudinal experiments http://www.cl.cam.ac.uk/teaching/0910/C00/L10/

  49. Dependent and independent variables • Dependent variable: what is measured • typical examples (in CS): time, accuracy, memory usage • Independent variable: what is manipulated • typical examples (in CS): the system used by participants, the feedback given to the participant (e.g. a beep versus a visual flash) http://www.cl.cam.ac.uk/teaching/0910/C00/L10/
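
A toy benchmark makes the distinction concrete (illustrative only; the two "conditions" are arbitrary): the independent variable is which condition runs, the dependent variable is the execution time we measure.

```python
import random
import timeit

random.seed(1)
data = [random.random() for _ in range(10_000)]

# Independent variable (manipulated): which of two conditions is used
conditions = {
    "plain sort": lambda: sorted(data),
    "keyed sort": lambda: sorted(data, key=abs),  # same output, extra key calls
}

# Dependent variable (measured): execution time over 100 repetitions
for name, run in conditions.items():
    seconds = timeit.timeit(run, number=100)
    print(f"{name}: {seconds:.3f}s")
```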

  50. Deciding what to manipulate and what to measure • This is a key issue in research • It boils down to your hypothesis: • What do you believe? • How can you substantiate your claim by making measurements? • What can you measure? • Is it possible to protect internal validity without sacrificing external validity? http://www.cl.cam.ac.uk/teaching/0910/C00/L10/
