Algorithmic Amplification
One of the easiest mistakes to make online is assuming that if something is widely seen, widely shared, or constantly appearing in your feed, it must have some credibility behind it. But on modern platforms, visibility is often driven less by accuracy than by engagement. Content rises because it gets clicks, comments, outrage, reactions, and shares—not because it is true.
Posts that frame MAiD in shocking or alarming ways often perform better than careful explanations of law, safeguards, or clinical reality. A dramatic anecdote, an angry accusation, or a misleading headline can spread far more quickly than a nuanced discussion of eligibility criteria or oversight processes. The result is that many people encounter the most emotionally charged version of the topic first, and see it most often.
Over time, repetition creates its own kind of credibility. If people repeatedly see claims that MAiD is “out of control,” that vulnerable people are routinely being pressured, or that extreme cases are common, those claims can begin to feel established simply because they are familiar. In many cases, what people are responding to is not evidence, but algorithmic exposure.
This does not require a conspiracy or deliberate platform bias. It is often the predictable outcome of systems built to maximize attention. Content that provokes fear, anger, or moral conflict tends to outperform content that is cautious, technical, or evidence-based. That means misinformation can be rewarded by design, even when no one explicitly intends it.
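The dynamic can be made concrete with a deliberately simplified sketch. This is not any real platform's algorithm; the posts, weights, and scoring function below are invented for illustration. The point is structural: when the ranking objective counts only attention signals, an inaccurate but provocative post outranks an accurate but measured one, because accuracy never enters the objective at all.

```python
# Toy illustration of engagement-only ranking (hypothetical data and weights,
# not any platform's actual system).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int
    shares: int
    angry_reactions: int
    accuracy: float  # 0.0-1.0; known here only so we can see what ranking ignores

def engagement_score(p: Post) -> float:
    # Every term rewards attention; the accuracy field is never consulted.
    return p.clicks + 3 * p.shares + 5 * p.angry_reactions

posts = [
    Post("Careful explainer on eligibility criteria",
         clicks=120, shares=4, angry_reactions=1, accuracy=0.95),
    Post("Shocking anecdote with misleading headline",
         clicks=300, shares=90, angry_reactions=150, accuracy=0.20),
]

# Rank the feed purely by predicted engagement.
feed = sorted(posts, key=engagement_score, reverse=True)

for p in feed:
    print(f"{engagement_score(p):>6.0f}  {p.text}")
```

Under these made-up numbers, the misleading post tops the feed despite its low accuracy. No one coded "promote misinformation"; it falls out of the objective function.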
Recognizing this dynamic is important. A claim appearing everywhere online does not tell you whether it is accurate. It may only tell you that it is effective at capturing attention. In debates about MAiD, where the stakes are personal and emotional, understanding that difference can help people pause, verify, and look beyond what the algorithm chose to show them.