The word “cult” is often bandied about by critics of the transgender movement, but is it an accurate or useful descriptor? What is a cult, exactly, and how can an understanding of cult dynamics help us understand the tenacity of gender ideology in our culture?
Cults are often sensationalised in the media as fringe religious groups led by smooth-talking gurus or frenzied doomsday prophets. The reality, however, is much more complex and often far more disturbing. Behind every cult is a system of control, manipulation, and psychological influence. Members become embroiled, but their families and surrounding society are affected too. A cult might be ruled by doctrine dictated by a charismatic leader or an ideology, but the emotional and psychological needs of members are the lifeblood that keeps it going.
What Is a Cult?
Broadly defined, a cult is a group or movement with a shared commitment to a leader or an ideology. Cults use manipulative techniques of control and persuasion to isolate and dominate members. They can be religious, political, therapeutic, ideological, commercial, or even self-improvement-based.
Sociologist Janja Lalich, a former cult member herself, defines a cult as a “self-sealing social system”, meaning that once someone is in a cult, all information and relationships are interpreted in a way that supports the cult’s ideology. Cults don’t tolerate outside views or questioning.
Psychiatrist Robert Jay Lifton has defined a cult as a group or movement that exhibits excessive devotion to a person, idea, or cause, and employs unethical and manipulative techniques of persuasion and control to further the goals of the group’s leaders. Lifton’s work is based on his research into Chinese Communist re-education programmes during the 1950s. His analysis draws on interviews with Western missionaries, doctors, and teachers who had been imprisoned by the Chinese Communist regime, American POWs from the Korean War, and Chinese intellectuals and students who had gone through “thought reform” or ideological training - what some would call brainwashing. In fact, the term “brainwashing” comes from Chinese informants describing indoctrination techniques used following the communist takeover.
Defining Characteristics
The defining characteristics of a cult, as identified by experts including Lifton, Lalich, and Steve Hassan - a mental health counsellor and former member of the Unification Church (aka the “Moonies”) - centre around coercive control, ideological totalism, and the suppression of critical thinking. Their models differ slightly in focus and terminology but all describe how cults manipulate behaviour, belief, and identity to serve a central authority or ideology.
Hassan developed the BITE Model, which outlines how cults control members through:
Behaviour control - regulating where members live and work, who they associate with, and how they spend their time.
Information control - censoring or distorting information, limiting access to outside sources, and promoting propaganda.
Thought control - requiring members to internalise the group’s beliefs and encouraging an “us versus them” mentality.
Emotional control - using guilt, fear, and love-bombing to keep members obedient and emotionally dependent.
Lifton’s model emphasises psychological manipulation and ideological rigidity and has been applied to cults, terrorist organisations, political movements, and high-control therapy or corporate groups. He identified eight criteria of thought reform:
Milieu control - control of communication within an environment (both information and social contact).
Mystical manipulation - leaders claim to be divinely guided or to have a higher purpose.
Demand for purity - a sharp boundary between “good” and “evil”, with pressure for moral perfection.
Confession - forced or encouraged disclosure of past wrongs or thoughts, often used to shame or control.
Sacred science - the group’s ideology is considered the ultimate truth and beyond questioning.
Loading the language - use of jargon to shape thought and limit independent reasoning.
Doctrine over person - personal experiences are subordinated to group beliefs.
Dispensing of existence - outsiders are seen as lesser, evil, or not fully human.
Janja Lalich, in her classic book on cult recovery, Take Back Your Life, focuses on how people become trapped in cults by internalising ideological systems, even when they appear to be acting voluntarily. The cultic influence is not just external pressure but internalised mental confinement, according to Lalich. Her core concepts are:
Bounded choice - over time, a member’s perception of reality is reshaped so that all decisions feel internally motivated but are actually constrained or bounded by the group’s ideology.
Charismatic authority - the leader becomes the ultimate source of truth and purpose.
Transcendent belief system - the group claims to have absolute truth, purpose, or salvation.
Systems of control and influence - behavioural and emotional reinforcement mechanisms maintain loyalty.
Who Joins Cults?
It’s tempting to think that people who get drawn into cults are weak and gullible, or not very sharp, but that’s generally not the case. Research shows that people who join cults are often idealistic, educated, and seeking meaning, purpose, or belonging. The key is not weakness, but vulnerability. According to Steve Hassan in his book Combating Cult Mind Control, people are most vulnerable to cult recruitment during periods of transition or distress such as bereavement, divorce, relocation, or career uncertainty. Challenging or precarious circumstances can, understandably, draw a person towards groups that offer certainty, community, and a sense of purpose.
Indoctrination into a cult is rarely a sudden and obvious process. It’s more likely to be gradual, relying on incremental commitment and emotional bonding. Cults seldom disclose their true beliefs or structure upfront, so vulnerable people may be recruited and caught in the web before they, or anybody around them, realise what has happened.
A Brief History
High-control groups built around a charismatic leader or rigid ideology have probably always existed in some form, but the 1960s-80s saw them flourish for a few reasons. The social upheaval of the ’60s - the Vietnam War, the civil rights movement, and a growing distrust of institutions - led many young people to seek alternative lifestyles and belief systems. The growth of mass media, particularly television and radio, meant ideas and messages could spread quickly and widely.
A few particularly high profile cults have captured public attention since that era due to their destructive practices and tragic outcomes:
The People’s Temple, a religious community founded in the US in the 1950s and led by Jim Jones, started as a progressive, racially integrated church but evolved into an authoritarian cult. In 1978 over 900 of its members, including Jones himself, died in a mass murder-suicide orchestrated by Jones in Jonestown, Guyana.
Heaven’s Gate was another religious group founded in the US in the 1970s by Marshall Applewhite and Bonnie Nettles. They preached that followers would ascend to a spaceship trailing the Hale-Bopp comet. In 1997, 39 members died by suicide, believing they’d be beamed up to a higher existence.
In recent years NXIVM, marketed as a self-help and business development organisation, has come to public attention. NXIVM was founded by Keith Raniere in 1998 and used self-improvement courses to recruit members - many of whom were young women - into practices that involved branding, coercion, and sexual exploitation. In 2020, Raniere was sentenced to 120 years in prison for sex trafficking and forced labour, among other crimes.
Access All Areas
Times have changed, though. The advent of the internet and social media has dramatically altered how we communicate, socialise, learn, and navigate the world. Access to ideas, organisations, and people has become almost limitless for anyone with an internet connection, but the scale and speed of change in how we live have far outpaced our ability to understand or manage the consequences.
The opportunities to manipulate and be manipulated are endless. Social media is a ubiquitous presence in young people’s lives, and they are particularly vulnerable to its appeal and its dangers. The prefrontal cortex - the part of the brain responsible for self-regulation, impulse control, and critical thinking - doesn’t fully mature until the mid-20s, making young people more susceptible to addictive platform design, persuasive marketing, pornography, ideological messaging, and extremist political content. Today’s youngsters spend large portions of their time online, mostly unsupervised, interacting with friends and strangers, being influenced by algorithms, and absorbing ideas they are often not equipped to critically evaluate. Irish research shows that 74% of 12-year-olds own a smartphone, with the figure rising to 90% by the age of 14. Irish children are, on average, nine years old when they get their first phone.
Psychologist Jonathan Haidt has written about how the explosion of smartphones and social media around 2010 coincided with a steep rise in mental health issues among teenagers. Haidt suggests that platforms like Instagram and TikTok have rewired the adolescent experience. Real-life personal interaction and unstructured play have been replaced with curated self-presentation, comparison, and exposure to potentially harmful content.
The digital landscape is fertile ground for modern cults. A massive audience of susceptible people, combined with algorithms, targeted marketing, and manipulative online techniques similar to those used in traditional cults, creates the perfect storm. We are human. We’re wired for connection, community, and creativity, but our devices are designed to hook our attention. If we can’t or won’t get off them, we are likely to be drawn to inadequate and potentially dangerous substitutes online - groups and ideologies disguised as self-help, activism, or community become almost irresistible.
A Rose by Any Other Name
So what might a cult look like in this age of connectivity? Cult-like communities are emerging left, right, and centre, and they’re highly effective at recruiting followers without any physical or formal infrastructure. They might be based around diet, health and wellness, or parenting, for example, and consist of influencers marketing themselves and their ideas to followers who exhibit intense devotion, conformity, and intolerance of dissent. These communities share cultic traits - often charismatic leadership, rigid beliefs and adherence to rules, and an expectation of all-or-nothing commitment.
While cult-like thinking around food, health, or parenting might be irritating for those around the devotee, these groups mostly attract adults and are relatively harmless. Far more sinister and harmful ideologies are capturing young people - groups based on misogyny and the manosphere (think Andrew Tate), eating disorders, self-harm, extreme politics, and trans ideology. Young people are being radicalised in the privacy of their own bedrooms.
Trans ideology is an outlier in that it is now somewhat embedded as a positive in popular culture. We are told, and we are mostly inclined, to “be kind” and make allowances for people to “be themselves”. But this idea - that being a man, a woman, both, or neither is based on how you feel rather than on your biology - has no basis in fact or science. It is a notion that some people, particularly young people, hold uncritically. And yet, in certain activist and institutional contexts, trans ideology is adopted and enforced in ways that closely resemble the cult dynamics described above by Lalich, Lifton, and Hassan. There is a demand for total belief, silencing of dissent, control of language, and psychological pressure on people - especially detransitioners and feminists - who speak out.
As the saying goes, if it looks like a duck, walks like a duck, and quacks like a duck, then it just may be a duck.
In part 2, we’ll examine how the defining features of cults, as described above by the experts, manifest in the context of trans ideology.
Catherine Monaghan is an Irish women’s rights activist and founding member of Wicklow Women 4 Women.
Genspect publishes a variety of authors with different perspectives. Any opinions expressed in this article are the author’s and do not necessarily reflect Genspect’s official position. For more on Genspect, visit our FAQs.
Who Put the Cult in Culture?
Join Genspect at The Bigger Picture conference in Albuquerque, New Mexico, September 27–28, 2025, where we’ll dig deeper into how trans ideology became internalised by masses of people - and what we can do about it. Register now at genspect.org
Tickets selling fast - secure your seat now.
Excellent article. During the Eighties I volunteered with an organization that sought to lead young people out of the cults that were flourishing at the time. The anti-cultists in organizations of this type tended to argue that cult members were victims of predatory cult leaders, rather than having agency and reasons of their own for joining the cult and working their way up the hierarchy. I found that many of the anti-cult people were highly authoritarian, and wanted their family members to return to the authoritarian family system or to a more acceptable authoritarian religion, but did not want cult members to question and come to their own conclusions. There was a lot of emphasis on the idea that "Anyone can be brainwashed into a cult."
Eventually, there was some peer reviewed research into the psychology of cult members, and unsurprisingly, the investigators found that people with certain kinds of mental illnesses, particularly Cluster B personality disorders, were overrepresented in the membership of cults. I no longer have these references on file, but I encourage everyone to look deeper into how members create cults and cult leaders as much or more than vice-versa. Luke Conway's book "Liberal Bullies," which focuses on leftist authoritarianism, explains how people with pre-existing authoritarian personality traits predominate in authoritarian mass movements.
Calling trans ideology a cult might feel clarifying or cathartic to some—especially detransitioners, distressed parents, or others who’ve experienced firsthand what they see as psychological manipulation or coercion. But as a strategy for persuasion or policy reform, the term is likely to do more harm than good.
First, there’s no identifiable cult leader, which undermines the analogy at the outset. Unlike classical cults built around charismatic figures like Jim Jones or Keith Raniere, gender ideology is a decentralized belief system enforced through institutions, professional bodies, and social media platforms. That makes the “cult” label less precise and more vulnerable to being dismissed as hyperbolic or conspiratorial.
Second, the term doesn’t map neatly onto legal or policy mechanisms. The U.S. has no statutes criminalizing “being a cult,” and prosecutors can only act when there are concrete violations (fraud, abuse, coercive confinement). No Republican attorney general is going to launch a prosecution based on cult analysis, nor will Democratic legislators rethink their positions because someone called gender ideology a cult. If anything, using that label plays into existing narratives that critics are hysterical or bigoted.
Third, it risks alienating the very people reformers need to reach. Centrist Democrats, liberal professionals, and institutional actors might agree that gender ideology has gone too far—particularly around youth medicalization, language policing, and the erosion of women’s rights—but they are unlikely to respond to language that sounds moralistic or sectarian. Calling something a cult doesn’t invite dialogue; it closes it down. Even people with deep concerns may tune out if the rhetoric feels more like denunciation than analysis.
Fourth, the cult label doesn’t help craft solutions. While it might illuminate certain mechanisms (thought reform, language control, emotional coercion), these insights can be expressed in clinical or psychological terms that are more digestible to professionals and policymakers. If the goal is to regulate medical practices, revise school policies, or challenge professional guidelines, the language of institutional capture, unquestionable dogma, and suppression of dissent is more likely to resonate than “cult.”
Lastly, there are more strategic ways to use cultic insight. One can describe the psychological dynamics—bounded choice, confession rituals, us-vs-them thinking—without ever using the word. This makes the critique accessible to a broader audience and harder to dismiss. It also shows respect for those who’ve adopted these beliefs out of vulnerability or idealism rather than malice or manipulation.
In short, calling trans ideology a cult may feel like a rhetorical win, but it’s unlikely to persuade wavering centrists, change Democratic minds, or create new legal avenues. If the goal is reform, not just condemnation, then it’s more effective to describe the dynamics soberly, avoid moral panic language, and frame the conversation in terms that policymakers and professionals can hear.