Why cult members go along with demands that run against their own self-interest

We recently consulted with Dr. Janja Lalich, an internationally recognized expert on cults and coercion, about how cults can harm their youngest members, and about the roles that parents and other adults in their lives often play in keeping them in abusive systems of control. Among Dr. Lalich’s insights:

My book, Bounded Choice: True Believers and Charismatic Cults,1 builds on the foundational work of classic researchers Solomon Asch,2 Stanley Milgram,3 and Leon Festinger.4 Their work demonstrates that humans are strongly inclined to act on the basis of authority and that we change our behavior to be more consistent with that of other group members. Bounded Choice addresses the confounding question of why cult members go along with the demands made on them, many of which run against their own self-interest.

I explain how a charismatic leader is able to dominate his devotees by leading them through an indoctrination, or resocialization, program of concerted manipulations meant to deprive the individual of critical-thinking abilities. This program convinces him or her that the professed path to “salvation” (with all its twists and turns) is an absolute necessity, both for acceptance by the leader and one’s peers in the group and for eventually reaching the purported salvation. By establishing internal discipline, complete with rules and behavioral norms, the leader is able to enforce compliance and conformity, obedience and loyalty. To accomplish this, the cult’s social structure comprises four specific dimensions:

  • the charismatic authority,
  • a transcendent belief system,
  • systems of control, and
  • systems of influence.

A bounded-choice environment is one in which only a limited and tightly regulated repertoire of beliefs, behaviors, and emotions may be expressed. The illusion of choice is created for the subjects in this controlled milieu (which I call “bounded reality”). The imbalance of power between leader and followers is maintained through a battering, debilitating assault on the member’s identity, which is systematically broken down and remolded to fit the preferred image. This resocialization process (see, e.g., Goffman5) is also known as thought reform (see Lifton6) or coercive persuasion (see Schein7). One consequence of this process is a state called “hyper-compliance,”8 an unquestioning belief in and blind obedience to the demands of the cult or its leader.

While every member’s experience is unique, certain factors shape the kinds of post-cult aftereffects a former member may encounter. These factors include:

  • the age of the person while in the cult and at the time of leaving;
  • prior history of emotional problems;
  • coping skills learned or not learned;
  • length of time in the group;
  • intensity of the indoctrination process;
  • extent of psychological and emotional manipulation;
  • severity of daily life in the cult;
  • physical and/or sexual harm and threats of violence;
  • poor or inadequate medical care;
  • lack of outside contact and/or support; and
  • lack of empathy and/or support within the group.

Especially for those who were raised in a cult, independent thinking is an unknown practice and has to be learned after leaving the cultic environment. Becoming an independent thinker can take time because of typical aftereffects, such as confusion, difficulty making decisions, anxiety, fear and paranoia, guilt and shame, reacting to triggers (reminders of the cult or the cult leader), skewed or lost memories, emotional uncertainty, depression and a sense of loss, and an inability to trust others or oneself. The cult has instilled a habit of seeing the world, other people, and oneself in black-and-white terms; grey areas do not exist and are not allowed.

1 Lalich, J. Bounded Choice: True Believers and Charismatic Cults (Berkeley: University of California Press, 2004).

2 Asch, S. “Effects of Group Pressure upon the Modification and Distortion of Judgment,” in H. Guetzkow, ed., Groups, Leadership, and Men (Pittsburgh: Carnegie Press, 1951).

3 Milgram, S. Obedience to Authority: An Experimental View (New York: Harper & Row, 1974).

4 Festinger, L. A Theory of Cognitive Dissonance (Evanston, IL: Row, Peterson, 1957); Festinger, L., H. W. Riecken, and S. Schachter, When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World (Minneapolis: University of Minnesota Press, 1956).

5 Goffman, E. Asylums (New York: Anchor Books, 1961).

6 Lifton, R. J. Thought Reform and the Psychology of Totalism: A Study of “Brainwashing” in China (New York: Norton, 1961).

7 Schein, E. H. Coercive Persuasion (New York: Norton, 1961). Also “Groups and Intergroup Relationships,” in W. E. Natemeyer and J. S. Gilberg, eds., Classics of Organizational Behavior (Danville, IL: Interstate Printers & Publishers, 1970), pp. 172-178.

8 Zablocki, B. “Hyper Compliance in Charismatic Groups,” in D. D. Franks and T. S. Smith, eds., Mind, Brain, and Society: Toward a Neurosociology of Emotion, Social Perspectives on Emotion series (Stamford, CT: JAI Press, 1999), pp. 287-310.