RESEARCH SKILLS
Ethics in practice
- Make ethics more than a box-ticking exercise
- Seek professional counsel when you really need it
HOW TO
Research tends to be costly and risky; its results often emerge slowly and with uncertain value. What’s more, positive outcomes rarely benefit the “subjects” who lend their time, bodies, and life stories to a research project. How can we ensure, at the very least, that we cause them no harm? And how can we be more accountable to the people we encounter through our research, as well as to the taxpayers who underwrite it?
These nagging questions have produced a wealth of ethical principles, guidelines, policies, and review processes. So many of them, in fact, that they often overwhelm researchers. Many find it impossible to reconcile abstract standards such as “beneficence” with the messy circumstances in which they work. You’d be surprised at how many researchers see formal ethical reviews as a box-ticking exercise—a set of performative constraints that offer little in the face of real-life dilemmas.
We must rank concerns to know which deserve most attention
The solution is twofold. First, we must rank concerns, so that researchers know which obligations deserve most attention, especially at the start of their careers. Second, we must make space for more dynamic, open-minded discussions that are informed by moral principles but also focus on the unforeseen, ambiguous, or highly specific situations that never fail to arise. Even obvious priorities like security are not necessarily clear-cut, and may conflict with other ethical best practices. In the context of authoritarian regimes, for example, asking interviewees to fill out a well-intentioned consent form may actually put them at serious risk.
A practical and sensible hierarchy always starts with protection from harm. Other concerns can be sequenced somewhat differently depending on the nature of the research. The order presented here applies generally to the social sciences.
1. Protection from harm
The researcher’s utmost duty is to cause no harm to the people involved in their research. The methodology must therefore spell out how to ensure the safety of remote communications, in-person meetings, and data collection and retention. Given today’s abundance of cybersecurity risks, for example, a good first step is to fully anonymize digital files containing interview transcripts and survey results. Any such strategy should be reviewed by others and refined based on their feedback.
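For digital transcripts, one concrete (if partial) step is scripted pseudonymization before files are stored or shared. The sketch below is a minimal illustration in Python, not a vetted anonymization pipeline: it assumes plain-text transcripts in a hypothetical "transcripts" folder and a hand-built mapping of participant names (fictional here) to pseudonyms. A real project would also scrub indirect identifiers (places, employers, dates) and keep the mapping itself under separate, restricted storage.

    # Minimal sketch: replace known participant names with stable pseudonyms.
    # Folder names and the name list are illustrative assumptions, not a standard.
    import re
    from pathlib import Path

    PSEUDONYMS = {
        "Amina Diallo": "Participant 01",   # fictional example names
        "Jonas Weber": "Participant 02",
    }

    def pseudonymize(text: str) -> str:
        """Swap each known name for its pseudonym, case-insensitively."""
        for real_name, alias in PSEUDONYMS.items():
            text = re.sub(re.escape(real_name), alias, text, flags=re.IGNORECASE)
        return text

    if __name__ == "__main__":
        out_dir = Path("anonymized")
        out_dir.mkdir(exist_ok=True)
        for transcript in Path("transcripts").glob("*.txt"):
            cleaned = pseudonymize(transcript.read_text(encoding="utf-8"))
            (out_dir / transcript.name).write_text(cleaned, encoding="utf-8")

Even this modest step is best reviewed by someone with data-protection expertise, in the spirit of the feedback loop described above.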
Many other, more subtle forms of harm must also be accounted for. Research may reinforce existing stigmas surrounding vulnerable groups; exacerbate tensions between communities; or re-traumatize interviewees who are victims of violence. The more sensitive a topic or environment, the more we must seek specialized advice, and avoid relying on generic rules.
2. Falsehoods
Research, by its nature, involves a moral obligation to pursue truth, and therefore to banish falsehoods. Some falsehoods are self-evident, as with fabricating or falsifying data. Others are subtler: endorsing unverified facts, selecting information to fit one’s biases, distorting interviewees’ narratives through subjective note-taking, circulating unsourced images, exaggerating the expected benefits of a research project, and plagiarizing other sources are all forms of intellectual dishonesty.
Sadly, some researchers believe that such practices are minor sins. In reality, they undermine the methodological foundations and ethical underpinnings of scientific research. If a gray zone does exist, it concerns how researchers present themselves and their projects: At times, partial disclosure of our identity, goals, and methodology may be advisable, if only to protect interviewees from possible retribution. But this is a delicate and dangerous line to tread, and in such cases, researchers must proactively seek out competent oversight.
3. Positionality
Our position within society has countless facets, making our interactions with others complex and uncertain. We must therefore analyze how our own position shapes the kinds of information we receive and the understandings we reach. Our research can easily and unwittingly reproduce gender biases, class distinctions, colonial legacies, and so on.
To offset our own influence on the research, it helps to let our interviewees define the rules. Asking them to choose the setting of an encounter, for instance, will likely produce a different kind of discussion than imposing our own preferences. Our language and vocabulary should adapt to theirs, to avoid talking down to them or overwhelming them with unnecessary jargon. Our dress code may have to shift slightly too, as a sign of respect, humility, or professionalism. We may also have to endure (and still analyze) harsh criticism and some amount of venting. Self-reflection becomes an even more stringent responsibility when approaching vulnerable groups suffering from ostracism, exclusion, or violence. This implies not just an extra dose of sensitivity, but greater competence and more careful oversight, which we must secure from our peers.
Being aware of positionality isn’t restricted to extreme cases. It amounts to reflecting, throughout a research project, on how we select interviewees, how we address them, how we write our findings, how we make use of other materials such as photography (of poverty, say), and how we compensate and credit the people who have assisted us in our work.
4. Accountability
Accountability, as an ethical principle, is essential but poorly defined. It involves proving the social value of our work. If indeed our research does no harm, but does absolutely no good either, is it anything more than a waste of precious intellectual and financial resources? To prevent this, we must ask ourselves: Are our results made public in ways that are truly accessible, respectful to interviewees, and useful to them, at least as a form of open knowledge? Have funds been employed in ways that could satisfy ordinary taxpayers (and not just the institutional patrons that funnel them)? Is our work reaching the right people, to increase the chances it will produce tangible effects? Do our contributions fit into a collective enterprise, in which we build on others’ efforts and credit them, rather than just reinvent the wheel?
Accountability is essential yet poorly defined
Some accountability problems are all too familiar: researchers who refuse to collaborate with their peers; cursory desk research that ignores most of the existing literature; datasets that remain private property when they could serve the public interest; articles that are never translated into the languages of interviewees; academic books that are unaffordable to anyone but universities; publications brimming with jargon to the point of being unreadable; and larger budgets for events and gatherings than for the research that informs them.
Such basic lapses in accountability are so frequent that they jeopardize the sector as a whole, reinforcing a growing disaffection with research that can be felt across most societies—with dire consequences for our future.
* * *
When we’re satisfied with filling out an ethics form, that’s a cop-out. A real ethical posture would see us disclose and reflect upon our hesitations, and be honest and open about our mistakes, especially the more serious ones. Above all, we must be open to learning from others, seeking out role models who have most likely faced—and overcome—similar problems before. So, we might neatly encapsulate the principles above in the form of a likeable archetype: a researcher careful not to harm people, exacting and attentive to detail, aware of the sensitivities involved in studying others, and driven to make their work useful.
No one starts out like this. Instead, we get there by treating ethics as an ongoing discussion, with both our research subjects and our professional peers, throughout our careers. We look up to role models for inspiration. And only then, in turn, may we become one of them.
20 September 2022