Know Your Cognitive Biases
Understanding why we believe what we believe—and why others see things differently—is essential in discussions about MAiD. Our thinking is shaped not just by facts, but by cognitive biases that influence how we interpret information and form conclusions. Recognizing these biases can help us reflect on our own views, better understand opposing perspectives, and spot when messaging is appealing to emotion rather than evidence.
For example, the illusory truth effect shows how repeated claims can start to feel true over time—even when they aren’t—something that appears regularly in public conversations about MAiD.
This is not an exhaustive list, but below are some common cognitive biases, with definitions, explanations, and examples of how they can appear in discussions about MAiD.
Bias Blind Spot
What it is:
It’s important to recognize, first, that we don’t always notice our own biases, and second, that the people we disagree with are in the same boat.
Bias blind spot is the tendency to recognize cognitive bias in other people more readily than in ourselves. People often accept that bias exists in general while treating their own judgments as relatively objective.
How it is used / how it shows up:
This is especially important in MAiD discussions because both opponents and supporters can become highly confident that only the other side is emotionally distorted, selective, ideological, or captured by narrative. Once that happens, the concept of bias becomes a weapon rather than a tool for self-examination.
Confirmation Bias
What it is:
Confirmation bias is the tendency to seek out, notice, interpret, and remember information in ways that confirm what we already believe, while giving less weight to information that complicates or challenges those beliefs. The bias does not just affect what evidence we accept; it also affects what evidence we even bother to look for in the first place.
How it is used / how it shows up:
In MAiD debates, opponents may collect only stories that seem alarming and ignore the larger legal and clinical context, while supporters may focus only on well-functioning cases and too quickly dismiss troubling edge cases or implementation problems. This is one of the most important biases in misinformation because it can make people’s beliefs feel highly “evidence-based” while they are actually curating a one-sided evidentiary world for themselves.
Availability Heuristic
What it is:
The availability heuristic is the tendency to judge how common, likely, or important something is based on how easily examples come to mind. Events that are vivid, emotional, recent, dramatic, or heavily covered are easier to recall and therefore feel more frequent or representative than they really are.
How it is used / how it shows up:
A shocking MAiD story, or even a disputed one, can loom much larger in public consciousness than ordinary cases that draw no attention. Opponents may take a handful of vivid stories as proof of the whole system. Availability is one reason emotionally gripping anecdotes can overpower dry but important legal or statistical context. Fear resonates more than facts.
Illusory Truth Effect
What it is:
The illusory truth effect is the tendency to judge repeated statements as more true simply because they are familiar. Repetition increases perceived accuracy, even when the claim is false and even when people have reasons to know better.
How it is used / how it shows up:
This is one of the most important mechanisms in misinformation ecosystems. If a false MAiD claim is repeated across podcasts, activist sites, social posts, clips, and interviews, it starts to feel established. The danger is not only that people believe it; repetition can also increase willingness to share it, which helps misinformation propagate. The false claim that MAiD medications cause drowning is a perfect example. The Euthanasia Prevention Coalition continues to spread this claim, even though their own co-founder and medical doctor, Dr. Johnston, said that it isn’t true. Repetition = truth.
Framing Effect
What it is:
The framing effect is the tendency for choices and judgments to change depending on how the same information is presented. Equivalent facts can produce different reactions when cast in terms of gains versus losses, autonomy versus vulnerability, rights versus risks, or compassion versus protection.
How it is used / how it shows up:
In MAiD, one side may frame an issue as autonomy and relief from suffering; the other may frame the same issue as social abandonment or failure of care.
Neither frame is necessarily fabricated, but framing determines which moral features become psychologically salient. Because people often mistake frames for complete reality, framing can function as a subtle but powerful persuasion tool.
In 2024, 16,499 people made the autonomous choice to end their suffering. Opponents of MAiD say those same 16,499 people were abandoned and coerced into having MAiD. Same facts, two different frames.
The framing effect is not always that obvious:
MAiD vs. Euthanasia
the provision of MAiD vs. the administration of a lethal injection
legal medical practice vs. legalized killing
Each of those is technically true, but the framing changes how the listener or reader experiences it.
Motivated Reasoning
What it is:
Motivated reasoning is the tendency to process information in a biased way because we are motivated to reach a preferred conclusion, protect an identity, defend a moral worldview, or avoid psychological discomfort. In other words, reasoning becomes partly goal-directed toward a desired end state, not merely truth-tracking.
How it is used / how it shows up:
Someone who sees MAiD as fundamentally compassionate may interpret every criticism as bad-faith propaganda. Someone who sees MAiD as fundamentally wrong may interpret every expansion, policy dispute, or anecdote as proof of moral collapse. Motivated reasoning is particularly powerful in culture-war issues because facts are often filtered through identity, loyalty, fear, and moral meaning before they are consciously “analysed.”
Negativity Bias
What it is:
Negativity bias is the tendency for negative information, threats, harms, and losses to attract more attention and carry more psychological weight than neutral or positive information. Bad news sticks harder.
How it is used / how it shows up:
Misinformation campaigns often exploit negativity because fear travels well. A grisly or morally shocking MAiD claim will usually spread farther than a careful explanation of law, safeguards, or patient experience.
False Consensus Effect
What it is:
The false consensus effect is the tendency to overestimate how many other people share our views, reactions, or values. Our own social circles often become an echo chamber that feels like society at large.
How it is used / how it shows up:
A strongly anti-MAiD network may convince itself that “everyone” is horrified and that only elites support MAiD. A strongly pro-MAiD network may convince itself that opposition is fringe, unserious, or merely performative. False consensus distorts both advocacy strategy and reading of public opinion. For anti-MAiD groups, maintaining that echo chamber is critical to maintaining the false consensus effect. That is probably why most of them have banned us from commenting in their social media spaces.
Authority Bias
What it is:
Authority bias is the tendency to give undue weight to statements from people seen as authoritative, expert, credentialed, or high-status, even when the topic lies outside their expertise or the evidence is thin.
How it is used / how it shows up:
A physician, lawyer, academic, or commentator can become unusually persuasive on MAiD simply because of title or confidence. This is one reason misinformation can spread through impressive messengers.
The engineering professors at Western University who published a speculative paper on potential MAiD-related cost savings are a clear example: despite operating well outside their area of expertise, their work was treated as authoritative—and in some cases distorted into the claim that it reflected actual Health Canada policy, rather than a hypothetical analysis. A published paper by university professors was taken as truth.
Dunning–Kruger Effect
What it is:
The Dunning–Kruger effect describes a pattern in which people with low competence in a domain may overestimate their understanding because the skills needed to perform well are often also needed to recognize one’s own errors. More broadly, overconfidence is the tendency to overestimate the accuracy of our judgments or abilities.
How it is used / how it shows up:
Complex legal-clinical topics like MAiD can produce strong opinions after only a little exposure. A person who has watched a few clips, read a few anecdotes, or absorbed a few slogans may feel unusually certain that they understand eligibility, safeguards, coercion, capacity, or clinical practice.
Kelsi Sheren is a perfect example of this. Despite claiming to be an expert, Sheren makes simple factual errors about MAiD on a nearly daily basis. For example, Sheren repeatedly claims the MAiD medications are dangerous because they are not FDA approved for use in MAiD. Despite multiple efforts to explain to Sheren that the FDA does not regulate medications in Canada, she calls us “idiots” and restates the claim with even more confidence. That’s the Dunning–Kruger effect.
Identifiable Victim Effect
What it is:
The identifiable victim effect is the tendency to respond more strongly to a vivid, specific individual than to abstract groups, statistics, or structural patterns. One face often outweighs many numbers.
How it is used / how it shows up:
The classic example: a single starving child shown on television will garner more support and donations than a statistic reporting that 10,000 children are dying of starvation.
In MAiD discourse, a single emotionally compelling personal story can dominate public understanding, whether it is used to argue that the system is merciful or that it is dangerous. Personal stories matter enormously, but this bias reminds us that a single case should not automatically be treated as the whole landscape.
In-Group Bias
What it is:
In-group bias is the tendency to evaluate our own group more favourably and interpret its motives more charitably than those of outsiders. Out-groups are more easily caricatured, distrusted, and morally flattened.
How it is used / how it shows up:
Pro-MAiD communities may see critics as uniformly religious, reactionary, or dishonest. Anti-MAiD communities may see supporters as uniformly callous, utilitarian, or captured by ideology. Once that divide hardens, people stop hearing claims as claims and start hearing them as tribal signals.
In a discussion with a disability rights advocate, we agreed with their concern that allowing MAiD might send the message that life with a disability is not worth living. But that concession was viewed with suspicion and criticism because we were part of an out-group.
This is also why it was important to share the story about the anti-MAiD doctor saying people do not drown to death when they have MAiD. He was part of the in-group, and therefore his opinion was more likely to be listened to.
Belief Perseverance
What it is:
Belief perseverance is the tendency for beliefs to persist even after the evidence that originally supported them has been discredited or contradicted. Once a story has “taken root,” it can survive the collapse of its factual foundation.
How it is used / how it shows up:
A dramatic but false MAiD claim can continue shaping opinion long after it has been corrected. That matters because public memory often preserves the emotional gist of a claim more than the later correction. Once a person has built a moral narrative around an alarming story, merely disproving one factual plank often does not fully undo the conclusion.
A perfect example of this is the false claim that the MAiD medications cause fluid to collect in the person’s lungs, causing them to drown. The claim persists despite being demonstrably wrong and disproven by reliable sources.
Naive Realism
What it is:
Naive realism is the tendency to believe that we see the world as it really is, and that people who disagree are uninformed, irrational, or biased. It is the quiet assumption that our own perspective is simply the unvarnished truth.
How it is used / how it shows up:
In MAiD debates, naive realism leads people to interpret disagreement as evidence of stupidity, malice, or corruption rather than as the predictable result of different experiences, values, fears, and informational environments. It fuels contempt, polarization, and the collapse of charitable interpretation. We should try to take the most charitable interpretation of opponents’ views, beliefs, and statements, while understanding that good faith can only be assumed up to a point.
Omission Bias
What it is:
Omission bias is the tendency to judge harmful actions as worse than equally harmful omissions, even when the omission is foreseeable and consequential. People often feel more morally responsible for what they do than for what they knowingly allow.
How it is used / how it shows up:
In MAiD, omission bias can shape how people compare actively providing MAiD with allowing prolonged suffering, delayed access, forced transfers, or non-intervention. Some people treat action as morally contaminated and omission as morally cleaner, even when the omission also predictably causes harm. That does not settle the moral issue, but it does explain why some intuitions feel stronger than their underlying logic.
Hindsight Bias
What it is:
Hindsight bias is the tendency, after an outcome is known, to see it as having been more predictable than it really was. People reconstruct the past so that warning signs look more obvious and uncertainty seems smaller than it was.
How it is used / how it shows up:
After a disputed MAiD case becomes public, critics may say the risks were always obvious, while supporters may say the controversy was unforeseeable and manufactured. Hindsight bias encourages blame narratives and false certainty, especially in emotionally intense cases where everyone now knows the ending.
Affect Heuristic
What it is:
The affect heuristic is the tendency to let our immediate emotional reaction guide judgments about risk, truth, and value. If something feels frightening, disgusting, compassionate, noble, or sinister, that feeling can quietly do much of the reasoning for us.
How it is used / how it shows up:
MAiD is especially vulnerable to this because it touches death, suffering, dependency, medicine, religion, and vulnerability. A single word choice can trigger a moral-emotional state that changes how all later information is processed. Language such as “care,” “choice,” “killing,” “poison,” or “dignity” does not merely describe; it often pre-loads judgment.
Sunk Cost Fallacy
What it is:
The sunk cost fallacy is the tendency to continue investing in a course of action because of resources already spent, even when those past investments should not determine the best current decision.
How it is used / how it shows up:
Organisations, campaigns, and public figures can become attached to narratives they have invested heavily in. Once someone has built a brand, movement, or identity around a particular story of MAiD, backing away becomes psychologically and socially expensive. That can keep bad arguments alive long after they should have been retired. This is one reason it is very difficult to convince someone they are mistaken, even when you have conclusive evidence.
If you’ve made it all the way down here, there’s a decent chance you’ve just experienced the sunk cost fallacy in real time—you got halfway through and figured you might as well keep going.
Empathy Gap
What it is:
The empathy gap is the tendency to underestimate how much another person’s emotional or physical state affects their reasoning, choices, and priorities, or to mispredict how our own state would change our judgment in the same circumstances.
How it is used / how it shows up:
People not living with severe suffering, dependency, fear of deterioration, or loss of bodily function may badly misjudge why someone would request or refuse MAiD. Conversely, those who are deeply familiar with end-of-life suffering may underestimate how frightening MAiD feels to others. The empathy gap can make each side interpret the other as morally incomprehensible.