

Special Plenary Session: Social Network Analysis & Research Ethics: Ethical issues for conducting SNA in healthcare and educational fields. Mandy S. Lee (Trinity College Dublin) and Filipa M. Ribeiro (University of Porto). 1st European Social Networks Conference (EUSN)


Presentation Transcript


  1. Special Plenary Session: Social Network Analysis & Research Ethics: Ethical issues for conducting SNA in healthcare and educational fields. Mandy S. Lee (Trinity College Dublin), Filipa M. Ribeiro (University of Porto). 1st European Social Networks Conference (EUSN), UAB, Barcelona, Thursday, 3rd July, 2014

  2. Why? • “Knowledge is powerful, and particularly so with information that we amass from conducting SNA. So, where are the ethical considerations in our academic papers? Are there ethical publications like position pieces about our academic responsibility as SNA researchers, in particular? Do members of the SNA community include ethical discussions in their workshops, classrooms, and publications?” (Marya Doerfel, 2002)

  3. Citizens’ concerns • 9 out of 10 Europeans (92%) say they are concerned about mobile apps collecting their data without their consent. • 7 out of 10 Europeans are concerned about the potential use that companies may make of the information disclosed. Source: Special Eurobarometer 359: Attitudes on Data Protection and Electronic Identity in the European Union, June 2011

  4. EU Data Protection Regulation 2014 • To replace the EU Data Protection Directive of 1995 • Covers the use of personal data across a wide range of sectors and affects how patient data are used in research. • Key innovations: • One continent, one law; • One-stop-shop for users and businesses; • Same rules for all companies regardless of original HQ (e.g. US internet companies); • Right to be forgotten; • Easier access to own data – data portability; • User control of data – specific, explicit consent; • Data protection first – “privacy by design & by default”

  5. EU amendments regarding “specific, explicit consent” • Explicit Consent • Already covered in standard informed consent procedures and tightly regulated under strict research governance frameworks (e.g. research ethics committees) • Specific Consent • “In many studies that will be affected, individuals have voluntarily given broad consent for their data to be used in research to further our understanding of society, health and disease. Their valuable contributions could be wasted if the amendments become law.” • (Wellcome Trust-led non-commercial research organisations and academics’ Position Statement April 2014)

  6. Key principles in research ethics in health and educational settings relevant to SNA • “Do Good” (Beneficence) • “Do No Harm” (Non-maleficence) • Right to Autonomy: • Voluntary Participation • Informed Consent • Right to Privacy: • Participant Confidentiality • Data Security and Protection Balancing Harm (probability & severity) with Benefits

  7. SNA and “Do No Harm” • Categories of “Harm” in Social Science Research • Research Coercion / Involuntary Participation • Inducement – manipulation by reward • E.g. Giving away personal details for a free service • Coercion – manipulation by force • E.g. Participation induced by social or institutional pressure • Intrusion into Privacy and the Right to Personal Integrity and Dignity • Revelation of more information about the individual than s/he volunteered to provide • E.g. identification of social positions / membership of subgroups / prediction of personal attributes not volunteered by participants themselves • Correct, Comprehensive or Sensitive Interpretation? • Perpetual Revelation?

  8. Is this “informed, broad consent”? (from “Experimental evidence of massive-scale emotional contagion through social networks”, Kramer et al., 2014) • Service Agreements ≠ “Informed Consent” for Research! • “LIWC [Linguistic Inquiry and Word Count software] was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers. As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” • REALLY??? • Conflict of Interest? • Lead Author: Adam Kramer, Core Data Science Team, Facebook, Inc. • Data Confidentiality? • “Data processing systems, per-user aggregates, & anonymized results available upon request.”

  9. Lessons (Crudely Put) Still to be Learnt: 1. “Don’t be a dick” 2. “Don’t shit the bed” • Damage done to trustworthiness of academic & other non-commercial research • The kicker: the PNAS editor understood the study as approved by the Cornell IRB “for data analysis”, not for “data collection”, on a claimed “pre-existing dataset”! • Wellcome Trust et al. say “Thanks a lot, Facebook” • “Informed Consent” does NOT mean “Service Agreement / Data Use Policy” • “Broad” does NOT mean “Perpetual” Consent • Scientific knowledge does not trump participant well-being, especially without due consideration by independent ethics review Source: The Guardian online poll “Facebook's secret mood experiment: have you lost trust in the social network?” http://www.theguardian.com/technology/poll/2014/jun/30/facebook-secret-mood-experiment-social-network

  10. Privacy concerns from SNA predictive modelling Researchers were able to accurately infer a Facebook user's race, IQ, sexuality, substance use, personality or political views using only a record of the subjects and items they had "liked" on Facebook (a “User-Like Matrix” with 10m User-Like pairs, i.e. “preference networks”) – even if users had chosen not to reveal that information (The Guardian, 11th March, 2013; Kosinski et al., 2013)
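The inference pipeline behind such predictions is simple enough to sketch: reduce the sparse user-Like matrix by SVD, then fit a linear model from the components to the target attribute. Kosinski et al. used this general approach; everything below, including the synthetic data, the hidden attribute, and the number of components, is illustrative and not the study's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic user-Like matrix: 200 users x 50 Likes (1 = user "liked" the item).
n_users, n_likes = 200, 50
likes = (rng.random((n_users, n_likes)) < 0.2).astype(float)

# A hidden attribute loosely correlated with liking the first 5 items
# (purely illustrative; in the real study this was e.g. personality or politics).
attr = (likes[:, :5].sum(axis=1) + rng.normal(0, 0.5, n_users)) > 1

# Reduce dimensionality with SVD; keep the top components as user features.
U, s, Vt = np.linalg.svd(likes, full_matrices=False)
k = 10  # illustrative component count
components = U[:, :k] * s[:k]

# Fit a simple linear predictor (least squares) on the components.
X = np.hstack([components, np.ones((n_users, 1))])
w, *_ = np.linalg.lstsq(X, attr.astype(float), rcond=None)
pred = X @ w > 0.5

# The attribute is never disclosed directly, yet it can be estimated
# from Like behaviour alone - the core of the privacy concern.
accuracy = (pred == attr).mean()
```

The point of the sketch is the structure, not the numbers: any attribute correlated with behavioural traces can be estimated this way, which is why "I never revealed that" offers weaker protection than participants assume.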

  11. SNA could also be a force for public good and in the public interest • E.g. Revealing nepotism, i.e. structural correlation between kinship and affiliation networks • The Wen Family Empire: corruption via family “guanxi” linked to a top leader of Communist China, Premier Wen Jiabao. Analysis by the New York Times, reported from China. “The family built a fortune by forming business networks aided by wealthy tycoons.” Reportage suppressed within China by the Communist regime.

  12. How do we balance risks and benefits in SNA studies? • Our 4-step approach • Identify new risks posed by SNA over and above standard concerns in more traditional research • Outline how various SNA researchers have managed these risks in health and education fields • Instigate debate and share learning across academic fields • Engage in positive dialogue with REC/IRBs to develop an ethical review process that is fit for purpose • Ensures beneficial research gets done while protecting and respecting participants’ rights

  13. SNA and Voluntary Participation: How SNA poses new risks • Network Sampling • a.k.a. snowball sampling; respondent-driven sampling • Previously, the informant-driven sampling frame was not itself the data, merely the means of contacting and inviting the right people into the study • Now the contact database IS the data • Name Generator • Cannot protect participant identity in SNA • Co-option into research without a formal consent process • Dealing with Unknown Network Boundaries • May not be able to initiate consent in “hidden” populations

  14. SNA and Informed Consent: How SNA poses new risks • Visualisation of Networks • Information revealed about participant social group membership and social position beyond individual narrative description • Key Player Metrics and Sub-group Analysis • “Objective” calculations of individual social standing and position beyond individual narrative description • “Objective” revelations of existence of cliques and their membership beyond individual narrative description • Perpetual Revelation? • Identification of attributes and/or network membership may be irreversible / have irreversible consequences (e.g. sexual orientation; terrorist networks) • Young People / Parental/Guardian Assent • Lack of understanding of wealth of information derived from sharing details on online social networks

  15. Informed Consent: How can we handle such risks in SNA? • E.g. Full explanation of risks by experienced field recruiters who have already developed trust and rapport with the target participants • E.g. Colorado Springs Study (Potterat et al., 1990s): • “It is our opinion that STD/HIV contact interviewing and contact tracing experience is crucial for fieldwork studies that explore sensitive personal behaviours and their social context… Participants were informed that positive results would be reported by name to health authorities and that risk partners would require notification. Because participants could not be given the option not to know their results, this presented a potential clash with legal requirements that subjects be able to “withdraw” from the study at any time. At no time did any participant elect to withdraw from the study.” (Potterat et al., 2004: 94).

  16. Research where risk of harm may be legitimate • Research which is deliberately opposed to the interests of the research subjects • E.g. studies of power or inequality • aim to reveal and critique economic, political, or cultural disadvantage • may have negative impact on some subjects • Research which balances short-term risks to subjects against longer-term gains to beneficiaries

  17. Voluntary participation • Developing Trust with Target Participants • Definition of system boundaries and relevant actors; • Nonresponse -> is network sampling a suitable alternative? • E.g. Use positional and resource generators instead of name-generators in ego-network studies (Knoke and Yang, 2004: 25-26) • Respondent inaccuracy; • Who benefits from network analysis? Who bears the cost? (Kadushin, 2005). Ethical <-> Methodological issues

  18. Informed Consent: How can we handle such risks in SNA? • Visualisation of Networks • E.g. Include sample visualisation in study consent documentation (Borgatti and Molina, 2005) • Key Player Metrics and Sub-Group Analysis • E.g. Include description in consent documentation • E.g. Exclude certain metrics and sub-group analysis from dissemination of findings (see “Participant Confidentiality”) • Non-Perpetual Consent • E.g. Respect of Right to be Forgotten / Right to Erasure (e.g. Removal of Google search results) • Young People / Parental / Guardian Consent • E.g. Increase Public Awareness • E.g. Researcher Demos and Exploratory Meetings as part of consent • E.g. Exclusion from Research

  19. Participant Confidentiality: How SNA poses new risks • Right to privacy and protection of personal integrity: • Data can be confidential but not anonymous (especially in longitudinal studies): “Individuals who opt out may be identified by someone who has consented to participate, and opting out of a study that involves SNA may not preclude them from being portrayed in a sociogram” (Borgatti & Molina, 2003). • Standard Qual Tactic: Anonymisation / Pseudonymisation and removal of personal identifying particulars from raw data • Standard Quant Tactic: Aggregated data presentation • The more detailed the information participants provide, the more difficult it is to protect respondent confidentiality

  20. Participant Confidentiality: How can we handle such risks? • E.g. Show only group-level rather than node-level ties

  21. SNA and Participant Confidentiality: How can we handle such risks? • E.g. Provide network-level statistics only, not node-level metrics • E.g. Provide selective node-level information about who is in which network, plus information about the resources embedded in each cluster, but do not reveal the maps • E.g. Provide each respondent with a unique sociogram indicating only where that person is located (“you are here”); the figure omits the lines of the sociogram, so respondents cannot infer others’ responses
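The "network-level statistics only" tactic can likewise be operationalised as a reporting function that computes whole-network measures and never returns per-node metrics or identities. A sketch, assuming networkx; the particular statistics chosen are illustrative.

```python
import networkx as nx

def network_level_report(G):
    """Return only whole-network statistics: no node identities, no node metrics."""
    return {
        "n_nodes": G.number_of_nodes(),
        "n_edges": G.number_of_edges(),
        "density": round(nx.density(G), 3),
        "components": nx.number_connected_components(G),
        "transitivity": round(nx.transitivity(G), 3),
    }

# Demonstrated on a classic public network (Zachary's karate club),
# standing in for a confidential study dataset.
G = nx.karate_club_graph()
report = network_level_report(G)
```

Because the function's return value is a small dictionary of aggregates, it can safely be the only thing exported from a secure analysis environment.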

  22. SNA and Data Protection: How SNA poses new risks • Identification Key being kept separate from Raw Data • Are identities traceable in the analysis itself? • Secure Data Storage • Do organisational networks that grant researchers access breach individuals’ right to secure data storage? • “The Internet of Things” and “Big Data” • Passive collection of even more personal data and increased risk of data leakage • Right to be Forgotten / Right to Erasure • Potentially harder to honour when the sociometric data about an individual come from shared group data

  23. Secondary Data • Ethical review of original research does not rule out issues over secondary use. • Researchers who initially collect data need to expect others to want to re-use it • Needs careful consideration – • Risks of disclosure - e.g. where birth date is included • Issues of presumed consent • Potential issues of ownership and control • Risks in archiving - e.g. inappropriate access

  24. SNA and Data Protection: How can we handle such risks? • E.g. Network identifier separate from personal identifier and raw data • E.g. Permanent destruction of raw data and any personal identifying information in hard copy and electronic form “After participants were assigned unique network identifiers, untraceable to personally identifying information, the data security protocol called for permanent destruction of personal data from hard copy and electronic records, including signed consent forms, thus preventing possible abuse” (Potterat et al., 2004: 95 “History and Lessons of the Colorado Springs Study”)
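The Colorado Springs protocol quoted above has two mechanical steps: assign untraceable network identifiers, then permanently destroy the link back to personal data once linkage is complete. A schematic sketch in Python; the record structure and field names are hypothetical.

```python
import secrets

# Raw records with personal identifiers (illustrative data only).
raw = [
    {"name": "Anna B.", "dob": "1970-01-01", "contacts": ["Ben C."]},
    {"name": "Ben C.", "dob": "1968-05-12", "contacts": ["Anna B."]},
]

# Step 1: assign random network identifiers. The key linking names to IDs
# is held separately from the data for the duration of linkage only.
key = {rec["name"]: f"N{secrets.token_hex(4)}" for rec in raw}

# Step 2: build the analysis dataset using network IDs only - no names,
# no dates of birth, no other personal identifiers.
network_data = [
    {"id": key[rec["name"]],
     "contacts": [key[c] for c in rec["contacts"]]}
    for rec in raw
]

# Step 3: destroy the identification key and the raw records, so the
# network data can no longer be traced back to named individuals.
# (In a real study this also covers hard copies and signed consent forms.)
del key
del raw
```

The essential property is that the tie structure survives (Anna's contact resolves to Ben's network ID) while the mapping back to persons does not.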

  25. Voluntary Participation and Informed Consent: How to translate principles into SNA research practice? • E.g. Developing Trust with Target Participants in Sustained Relationships in the Field • E.g. Colorado Springs Study (Potterat et al., 1990s) • “The crucial initial finding was that members of stigmatized groups in Colorado – many of whose behaviours were illegal – were willing to provide complete identifying information on their social, drug, and sexual partners for research purposes. Expecting members of marginalized populations to provide such information for disease control purposes may be rational… yet willingness to cooperate in the absence of disease was unexpected. We believe their cooperation was due to the trust our staff earned through sustained and sympathetic outreach efforts prior to and during the study. Thus, we learn that contact interviewing could succeed even in the absence of sexually transmissible or blood-borne infection; properly approached, members of our study populations willingly revealed the most intimate secrets of their lives” (Potterat et al., 2004: 106).

  26. Benefits of SNA in health and education research • Learning in Low-Performing School Districts: Conceptual and Methodological Challenges Resulting from Network Churn (Kara Finnigan & Alan Daly, ICLS, 2014) • SNOPM (Social Networks of People Living with Multiple Chronic Illness, UK)

  27. Conclusion… Balancing Risks and Benefits in SNA Research Our suggestions: SNA researchers should: • Be proactive and take the lead in addressing research ethics governance issues in conducting SNA research; • Share learning across academic fields and different institutional settings; • Engage in open, constructive and productive dialogues with RECs / IRBs and other stakeholders (participant communities, research funders) to: • Update the ways & means by which we can appropriately translate ethical principles into feasible SNA research practices • Work together to develop a research ethics review process that is fit for purpose If we want to ensure our SNA studies do good work and minimise harm, what would an ideal research ethics review process look like if we were to start from scratch?

  28. SNA and Research Ethics How can we balance the risks and benefits of SNA for the public, our research participants and the wider research community? To help, please take part in our SNA and Ethics Survey at: http://snaethics.wordpress.com/ Thank you, Filipa and Mandy. (Lee and Ribeiro, 2014)

  29. European Group of Ethics on “Balancing Rights” • “The rights we are discussing in the context of security and surveillance technologies, such as the right to privacy and the right to data protection, or the right to information and transparency, are not absolute rights; they must be balanced against other rights and balanced against the rights of other persons or groups.” • “Balancing” does NOT mean “trade-off”: • “Human dignity is the core principle of the European moral framework, and as such it cannot be ‘traded off’.” (Opinion no. 28, 28th May, 2014 – Ethics of Security and Surveillance Technologies) Supplementary Slides

  30. Definition of Personal Data by European Group of Ethics • Personal Data refers to: • “Person identifiable data includes, as in the terms of the Directive 95/46/EEC, any data which either directly or indirectly identifies an individual by reference to her/his name, identification number or to one or more factors specific to her/his physical, physiological, mental, economic, cultural or social identity.” • Personal health data encompass a wide range of information about an individual, which all touch upon an individual’s private life. A health biography could include not only basic medical data: a history of all medical diagnoses, diseases and medical interventions, medications prescribed, test results, including imaging, etc., but could also include sensitive data: on mental health, relevant to family history, behavioral patterns, sexual life, social and economic factors, etc., and healthcare administrative data: admissions and discharge data, routine operational data, insurance and financial transactional data, etc. Almost all such data can be recorded in digital form and processed electronically and remain sensitive even after the death of an individual. (Opinion no. 13, 1999, Ethical issues of Healthcare in the Information Society; original emphasis) Supplementary Slides

  31. SNA and Voluntary Participation: How can we handle such risks? • E.g. Refrain from using name generators • E.g. NHSLS and CHSLS studies (Laumann et al., 1990s) • “The NHSLS and CHSLS raised special concerns about confidentiality both because of the sensitive nature of the questions asked, and because respondents are asked to reveal information not only about themselves, but also about their partners… in the egocentric approach used in these surveys… respondents were not asked to give uniquely identifying information, such as names, addresses, or telephone numbers of their partners. In the sociometric approach, in which the respondent’s network partners must also be contacted by the researcher, such questions are likely to be viewed as highly intrusive by the respondent and therefore met with non-disclosure, raising questions of validity and reliability“ (Laumann et al., 2004: 33). • E.g. Use positional and resource generators in ego-network studies (Knoke and Yang, 2004: 25-26)

  32. Key Research Governance Frameworks / Guidelines for Research with People • Healthcare Settings • Nuremberg Code (1948) • Declaration of Helsinki (1964) • WHO Ethical Standards • EU Legislation on Biomedical Ethics • EU Data Protection Policies • Data Commissioner Rulings in Individual Countries • Educational Settings • University-based Research Ethics Committees • Data Protection Act 1998 • The Joint Committee on Standards for Educational Evaluation, 1994 • Ad hoc Codes of Practice… • External regulation and governance of good practice in research • Sponsors introduce guidelines and new terms & conditions based on principles of “good practice”. Supplementary Slides

  33. EU Data Protection Regulation 2014 • To replace the EU Data Protection Directive of 1995 • The Regulation is now being considered and amended by the European Parliament and Council before it is adopted. This process may take until 2015. • The Regulation covers the use of personal data across a wide range of sectors and will affect how patient data are used in research. • 3 main innovations: • One continent, one law: The Regulation will establish a single, pan-European law for data protection, replacing the current inconsistent patchwork of national laws. • One-stop-shop: The Regulation will establish a 'one-stop-shop' for businesses: companies will only have to deal with one single supervisory authority, not 28. • The same rules for all companies – regardless of their establishment: Today European companies have to adhere to stricter standards than their competitors established outside the EU but also doing business on our Single Market. With the reform, companies based outside of Europe will have to apply the same rules. European regulators will be equipped with strong powers to enforce this: data protection authorities will be able to fine companies who do not comply with EU rules with up to 2% of their global annual turnover. Supplementary Slides

  34. EU Data Protection Regulation 2014 • New rules will “put citizens back in control of their data”, notably through: • A right to be forgotten: • When you no longer want your data to be processed and there are no legitimate grounds for retaining it, the data will be deleted. This is about empowering individuals, not about erasing past events or restricting freedom of the press. • Easier access to your own data: • A right to data portability will make it easier for you to transfer your personal data between service providers. • Putting you in control (Specific, Explicit Consent): • When your consent is required to process your data, you must be asked to give it explicitly. It cannot be assumed. Saying nothing is not the same thing as saying yes. Businesses and organisations will also need to inform you without undue delay about data breaches that could adversely affect you. • Data protection first, not an afterthought: • ‘Privacy by design’ and ‘privacy by default’ will also become essential principles in EU data protection rules – this means that data protection safeguards should be built into products and services from the earliest stage of development, and that privacy-friendly default settings should be the norm – for example on social networks. Supplementary Slides

  35. Eurobarometer Survey Findings on EU Citizens’ Attitudes on Data Protection & Electronic Identity (2011) (Special Eurobarometer 359) • 43% of Internet users say they have been asked for more personal information than necessary when they proposed to obtain access to or use an online service. • Almost six in ten Internet users usually read privacy statements (58%) and the majority of those who read them adapt their behaviour on the Internet (70%). • Over half of Internet users are informed about the data collection conditions and the further uses of their data when joining a social networking site or registering for a service online (54%). • Just over a quarter of social network users (26%) and even fewer online shoppers (18%) feel in “complete control” of their data. • To protect their identity in daily life, 62% of Europeans give the minimum required information. • Less than one-third trust phone companies, mobile phone companies and Internet service providers (32%); and just over one-fifth trust Internet companies such as search engines, social networking sites and e-mail services (22%). • 70% of Europeans are concerned that their personal data held by companies may be used for a purpose other than that for which it was collected. • As regards the “right to be forgotten”, a clear majority of Europeans (75%) want to delete personal information on a website whenever they decide to do so. • Even though a majority of European Internet users feel responsible themselves for the safe handling of their personal data, almost all Europeans are in favour of equal protection rights across the EU (90%). Supplementary Slides

  36. Coercion in Contemporary Digital Life: “there is no alternative but to disclose PI” to obtain services (58% vs 32%), yet EU citizens feel disclosing PI is a big issue (63% vs 33%) and mind the fact that they were asked to disclose PI in return for “free services online” (51% vs 29%) • Source for the graphics below and on later slides about EU citizens’ experiences online and attitudes to data protection: Special Eurobarometer 359 (EU, 2011) Supplementary Slides

  37. Level of concern with online disclosure of personal information across EU-27 (Red bars indicate “Big issue” with disclosure – average 64%)

  38. What about this idea that you voluntarily give away your personal information to get a “free”, “personalised” service? Supplementary Slides

  39. Level of concern with having to disclose personal info for “free services online” across EU-27 (Those who “mind” vs those who don’t ~2:1)

  40. Are EU citizens comfortable of giving away personal information for a “better personalised experience” online? Supplementary Slides

  41. To what extent have EU citizens experienced unnecessary over-disclosure of personal information online, and how concerned are those who experienced this more than occasionally?

  42. Level of concern about over-disclosure of personal information online for those who experienced it more than occasionally across EU-27

  43. What worries most EU internet users about disclosure of personal information online -> importance of explicit consent Supplementary Slides

  44. Do different generations / sexes / people with different levels of education have different views of online risks? (Short answer: No, EU citizens have remarkably consistent risk perceptions across the demographics. It’s not a question of cool digital natives vs stick-in-the-mud Luddites or tech-savvy men vs clueless women!)

  45. How do EU citizens protect their privacy – the importance of trust for us researchers who want more than minimum info from our participants

  46. SNA and “Do No Harm” – How SNA poses new risks • Ego/Node-level Analysis • Affect perception by others • Revelation of relational vulnerabilities not volunteered by the participants themselves • Revelation of relational advantages that may cause problems for participants in their relationships • Potential false identification / prediction of egos about their network membership (e.g. terror networks) • Group/Network-level Analysis • Identification of cliques and their membership may be problematic for the group as a whole & for sub-groups • Degree of cohesion may contradict group narratives about their functioning Supplementary Slides

  47. Open Access and Public Scrutiny of Data – Data Protection • Increasing pressure on academics to publish their research on an open access basis; • Open data with difficult access; • New types of publications (e.g.: data papers, data descriptors); • Data-centric enterprises ??? Supplementary Slides Lee and Ribeiro, 2014

  48. Right to Respect of Personal Integrity and Dignity: How can we translate this into SNA research practice? • Avoid Reductionism • Refrain from overly simplistic interpretations of social relations and being complacent about the validity (accuracy, comprehensiveness, sensitivity) of network statistics • Do not reduce human beings into mere sums of their existing relations • Do not reduce human will to merely an error term in a predictive model (with thanks to Prof. Lomos) • Refrain from fixed labelling and categorising of people on the basis of their current activities and affiliations • Enable Participants to Talk Back • Preserving participants’ own voices and interpretations of their social relations via mixed methods approaches Supplementary Slides

  49. Our interactions & relations with others help us become who we are in a never-ending process… Thanks to my partner, I haven’t finished growing and learning and having fun! "Don't be afraid." That's what Ruby Bridges's mother told her on November 4, 1960. Little Ruby listened carefully to the advice. Soon, four United States federal court marshals, or officers, arrived at the Bridges family home in New Orleans, La., to drive the first grader to William Frantz Public School. A screaming mob was waiting. People stood near the building shouting. Ruby held her head high. With the marshals surrounding her, the 6-year-old walked into the school. Supplementary Slides

  50. Conclusion… Balancing Risks and Benefits in SNA Research “On the benefit side, academic researchers always benefit, organizations, society and science may benefit, but individual respondents rarely do” (Kadushin, 2005) No research is entirely without risks; the challenge is to identify and manage the risks appropriately and adequately so that beneficial research gets done. Our suggestions: SNA researchers should: • Be proactive and take the lead in addressing research ethics governance issues in conducting SNA research; • Share learning across academic fields and different institutional settings; • Engage in open, constructive and productive dialogues with RECs / IRBs and other stakeholders (participant communities, research funders) to: • Update the ways & means by which we can appropriately translate ethical principles into feasible SNA research practices • Work together to develop a research ethics review process that is fit for purpose Supplementary Slides
