Brainwashing, also known as thought reform or re-education, is the application of persuasive techniques to change the beliefs or behavior of one or more people, usually for political or religious purposes. Whether any techniques exist that can actually change thought and behavior to the degree that the term “brainwashing” connotes remains a controversial and at times hotly debated question.
Origin of the term
The term brainwashing is relatively new in English; it did not exist before 1950. Earlier forms of coercive persuasion had been seen during the Inquisition and in the show trials against “enemies of the state” in the Soviet Union, but no specific term emerged until the methodologies of these earlier movements were systematized during the early decades of the People’s Republic of China for use in its struggles against internal class enemies and foreign invaders. Until that time, accounts were limited to concrete descriptions of specific techniques.
The term xǐ nǎo (洗脑, the Chinese term literally translated as “to wash the brain”) was first applied to methodologies of coercive persuasion used in the “reconstruction” (改造 gǎi zào) of the so-called feudal (封建 fēng jiàn) thought patterns of Chinese citizens raised under prerevolutionary regimes. The term first came into use in the United States in the 1950s during the Korean War, to describe those same methods as applied by the Chinese communists to attempt deep and permanent behavioral changes in foreign prisoners, and especially to disrupt the ability of captured United Nations troops to effectively organize and resist their imprisonment.
It was consequently used in the U.S. to explain why, compared to earlier wars, a relatively high percentage of American GIs defected to the Communists after becoming prisoners of war. Later analysis determined that some of the primary methodologies employed on them during their imprisonment included sleep deprivation and other intense psychological manipulations designed to break down the autonomy of individuals. American alarm at the new phenomenon of substantial numbers of U.S. troops switching their allegiances to the enemy was ameliorated after prisoners were repatriated and it was learned that few of them retained allegiance to the Marxist and anti-American doctrines that had been inculcated during their incarcerations. The key finding was that when rigid control of information was terminated and the former prisoners’ natural methods of reality testing could resume functioning, the superimposed values and judgments were rapidly attenuated.
Although the use of brainwashing on United Nations prisoners during the Korean War produced some propaganda benefits, its main utility to the Chinese lay in the fact that it significantly increased the maximum number of prisoners that one guard could control, thus freeing other Chinese soldiers to go to the battlefield.
In later times the term “brainwashing” came to apply to other methods of coercive persuasion, and even to the effective use of ordinary propaganda and indoctrination. In the formal discourses of the Chinese Communist Party, the more clinical-sounding term “sī xiǎng gǎi zào” (thought reform) came to be preferred.
Present use of the term
Many people have come to use the terms “brainwashing” or “mind control” to explain the otherwise intuitively puzzling success of some methodologies for the religious conversion of inductees to new religious movements (including cults).
The term “brainwashing” is not widely used in psychology and other sciences, because of its vagueness and history of being used in propaganda, not to mention its association with hysterical fears of people being taken over by foreign ideologies. It is often more helpful to analyze “brainwashing” as a combination of manipulations to promote persuasion and attitude change, propaganda, coercion, capture-bonding, and restriction of access to neutral sources of information. Note that many of these techniques are more subtly used (usually unconsciously) by advertisers, governments, schools, parents and peers, so the aura of exoticism around “brainwashing” is undeserved. At the same time, nuanced forms of indoctrination and propaganda in religious, political and commercial venues may occasion wider and deeper impacts than do outright coercive tactics. Mirroring George Orwell’s doublespeak, strategists of indoctrination and propaganda frequently disguise themselves as promoters of freedom and liberation.
Thought reform is the alteration of a person’s basic attitudes and beliefs by outside manipulation. The term usually relates closely to brainwashing and mind control.
One of the first published uses of the term thought reform occurred in the title of the book by Robert Jay Lifton (a professor of psychology and psychiatry at John Jay College and at the Graduate Center of the City University of New York): Thought Reform and the Psychology of Totalism: A Study of ‘Brainwashing’ in China (1961). (Lifton also testified at the 1976 trial of Patty Hearst.) In that book he used the term thought reform as a synonym for brainwashing, though he preferred the first term. The elements of thought reform as published in that book are sometimes used as a basis for cult checklists and are as follows.
• Milieu Control
• Mystical Manipulation
• The Demand For Purity
• Sacred Science
• Loading the Language
• Doctrine Over Person
• Dispensing of Existence
Benjamin Zablocki sees brainwashing as a “term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocialization” and states that this same concept has historically also been called thought reform and coercive persuasion.
Popular speech continues to use the word brainwashed informally and pejoratively to describe persons subjected to intensive influence resulting in the rejection of old beliefs and in the acceptance of new ones; or to account for someone who holds strong ideas considered to be implausible and that seem resistant to evidence, common sense, experience, and logic. Such popular usage often implies a belief that the ideas of the allegedly brainwashed person developed under some external influence such as books, television programs, television commercials (as producing brainwashed consumers), video games, religious groups, political groups, or other people. Mind control expresses a conception only mildly less dramatic than brainwashing, with thought control slightly milder again. With thought reform and coercion we start to move into acceptably neutral academic jargon and into the areas of propaganda, influence and persuasion.
Studies of the Korean War
The Communist Party of China used the phrase “xǐ nǎo” (“wash brain”) to describe its methods of persuasion in ensuring that members who did not conform to the Party message were brought into orthodoxy. The phrase was a play on “xǐ xīn” (洗心, “wash heart”), an admonition found in many Daoist temples exhorting the faithful to cleanse their hearts of impure desires before entering.
In September 1950, the Miami Daily News published an article by Edward Hunter (1902-1978) titled “‘Brain-Washing’ Tactics Force Chinese into Ranks of Communist Party.” It contained the first printed use of the English-language term “brainwashing,” which quickly became a stock phrase in Cold War headlines. Hunter, a CIA propaganda operator who worked undercover as a journalist, turned out a steady stream of books and articles on the subject. A further article by Hunter on the same subject appeared in New Leader magazine in 1951. In 1953 Allen Welsh Dulles, then the CIA director, explained that “the brain under [Communist influence] becomes a phonograph playing a disc put on its spindle by an outside genius over which it has no control.”
In his 1956 book “Brain-Washing: The Story of the Men Who Defied It”, Edward Hunter described “a system of befogging the brain so a person can be seduced into acceptance of what otherwise would be abhorrent to him.” According to Hunter, the process is so destructive of physical and mental health that many of his interviewees had not fully recovered after several years of freedom from Chinese captivity.
Later, two studies of the Korean War defections, by Robert Lifton and Edgar Schein, concluded that brainwashing had a transient effect when used on prisoners of war. Lifton and Schein found that the Chinese did not engage in any systematic re-education of prisoners, but generally used their techniques of coercive persuasion to disrupt the prisoners’ ability to organize, maintain morale, and attempt escape. The Chinese did, however, succeed in getting some of the prisoners to make anti-American statements by placing them under harsh conditions of physical and social deprivation and disruption, and then offering them more comfortable situations such as better sleeping quarters, better food, or warmer clothes and blankets. Nevertheless, the psychiatrists noted that even these measures of coercion proved quite ineffective at changing basic attitudes for most people. In essence, the prisoners did not actually adopt Communist beliefs; rather, many of them behaved as though they did in order to avoid the plausible threat of extreme physical abuse. Moreover, the few prisoners influenced by Communist indoctrination apparently succumbed as a result of the confluence of the coercive persuasion with motives and personality characteristics that had existed before imprisonment. In particular, individuals with very rigid belief systems tended to snap and realign, whereas individuals with more flexible belief systems tended to bend under pressure and then restore themselves when the external pressures were removed.
Working independently, Lifton and Schein both discussed coercive persuasion in their analyses of the treatment of Korean War POWs. They defined coercive persuasion as a mixture of social, psychological, and physical pressures applied to produce changes in an individual’s beliefs, attitudes, and behavior. Both concluded that such coercive persuasion can succeed in the presence of a physical element of confinement, “forcing the individual into a situation in which he must, in order to survive physically and psychologically, expose himself to persuasive attempts.” They also concluded that such coercive persuasion succeeded only with a minority of POWs, and that the end result remained very unstable, as most of the individuals reverted to their previous condition soon after they left the coercive environment.
The use of coercive persuasion techniques in China
Following the armistice that interrupted hostilities in the Korean War, a large group of intelligence officers, psychiatrists, and psychologists was assigned to debrief United Nations soldiers being repatriated. The government of the United States wanted to understand the unprecedented level of collaboration, the breakdown of trust among prisoners, and other indications that the Chinese were doing something new and effective in their handling of prisoners of war. Formal studies in academic journals began to appear in the mid-1950s, as well as some first-person reports from former prisoners. In 1961, two specialists in the field published books that synthesized these studies for non-specialists concerned with issues of national security and social policy: Edgar H. Schein wrote Coercive Persuasion, and Robert J. Lifton wrote Thought Reform and the Psychology of Totalism. Both books were primarily concerned with the techniques called “xǐ nǎo” or, more formally, “sī xiǎng gǎi zào” (reconstructing or remodeling thought). The following discussion is based in large part on their studies.
Although American attention came to bear on thought reconstruction or brainwashing as one result of the Korean War, the techniques had already been used on ordinary Chinese citizens after the establishment of the People’s Republic of China. The PRC had refined and extended techniques earlier used in the Soviet Union to prepare prisoners for show trials, which in turn owed much to the Inquisition. In the Chinese context, these techniques had multiple goals that went far beyond the simple control of subjects in the prison camps of North Korea. They aimed to produce confessions, to convince the accused that they had indeed perpetrated anti-social acts, to make them feel guilty of these crimes against the state, to make them desirous of a fundamental change in outlook toward the institutions of the new communist society, and, finally, to actually accomplish these desired changes in the recipients of the brainwashing/thought reform. To that end, the practitioners sought techniques that would break down the psychic integrity of the individual with regard to information processing, with regard to information retained in the mind, and with regard to values. The chosen techniques included dehumanizing individuals by keeping them in filth, sleep deprivation, partial sensory deprivation, psychological harassment, inculcation of guilt, and group social pressure. The ultimate goal that drove these extreme efforts was the transformation of an individual with a “feudal” or capitalist mindset into a “right-thinking” member of the new social system, or, in other words, to transform what the state regarded as a criminal mind into what the state could regard as a non-criminal mind.
The methods of thought control proved extremely useful when they came to be employed for gaining the compliance of prisoners of war. Key elements in their success included tight control of the information available to the individual and tight control over the behavior of the individual. When, after repatriation, close control of information ceased and reality testing could resume, former prisoners fairly quickly regained a close approximation of their original picture of the world and of the societies from which they had come. Furthermore, prisoners subject to thought control often had simply behaved in ways that pleased their captors, without changing their fundamental beliefs. So the fear of brainwashed sleeper agents, such as that dramatized in the novel and the films The Manchurian Candidate, never materialized.
Terrible though the process frequently seemed to individuals imprisoned by the Chinese Communist Party, these attempts at extreme coercive persuasion ended with a reassuring result: they showed that the human mind has an enormous ability to adapt to stress as well as a powerful homeostatic capacity. John Clifford, S.J., gives an account of one man’s adamant resistance to brainwashing in In the Presence of My Enemies that substantiates the picture drawn from the studies of large groups reported by Lifton and Schein. Allyn and Adele Rickett wrote a more penitent account of their imprisonment (Allyn Rickett had by his own admission broken PRC laws against espionage) in Prisoners of the Liberation, but it too details techniques such as the “struggle groups” described in other accounts. Between these opposite reactions to attempts by the state to reform them, experience showed that most people would change under pressure and change back when the pressure was removed. The other interesting result was that some individuals derived benefit from these coercive procedures, because the interactions, perhaps as an unintended side effect, actually promoted insight into dysfunctional behaviors that were then abandoned.
In societies where the government maintains tight control of both the mass media and the education system, and uses this control to disseminate propaganda on a particularly intensive scale, the overall effect can be to brainwash large sections of the population. This is particularly effective where nationalist or religious sentiment is invoked and where the population is poorly educated and has limited access to independent or foreign media.
Refutation of political brainwashing
According to research and forensic psychologist Dick Anthony, the CIA invented the brainwashing ideology as a propaganda strategy to undercut communist claims that American POWs in Korean communist camps had voluntarily expressed sympathy for communism; definitive research, he maintains, demonstrated that collaboration by Western POWs had been caused by fear and duress, not by brainwashing. He argues that the CIA brainwashing theory was pushed to the general public through the books of Edward Hunter, a secret CIA “psychological warfare specialist” passing as a journalist. He further asserts that for twenty years starting in the early 1950s, the CIA and the Defense Department conducted secret research (notably including Project MKULTRA) in an attempt to develop practical brainwashing techniques (possibly to counteract the brainwashing efforts of the Chinese), and that the attempt was a failure.
Brainwashing controversy in new religious movements and cults
The main disputes regarding brainwashing exist in the field of cults and new religious movements (NRMs). The controversy over the existence of cultic brainwashing is one of the most polarizing issues separating the camps of cult sympathizers and cult critics. There is no agreement about the existence of a social process attempting coercive influence, nor about the existence of the claimed social outcome, namely that people are influenced against their will.
The issue is further complicated by the existence of several definitions of brainwashing, some of them almost strawman caricatures, and by the introduction of the similarly controversial concept of mind control in the 1990s, which is at times used interchangeably with brainwashing and at other times differentiated from it. Additionally, some authors refer to brainwashing as a recruitment method (Barker), while others refer to it as a method of retaining existing members (Kent 1997, Zablocki 2001).
Another factor is that brainwashing theories have been argued in court, where expert witnesses had to present their views to juries in simpler terms than those used in academic publications, and where the issue had to be framed in rather black-and-white terms to make the case. Such cases, together with their black-and-white framing, have been taken up by the media.
In 1984 the British sociologist Eileen Barker stated in her book The Making of a Moonie: Choice or Brainwashing?, which was based on her firsthand studies of British Unification Church members, that she had found no extraordinary persuasion techniques being used to recruit or retain members.
The APA, DIMPAC, and the brainwashing theories
In the early 1980s, some U.S. mental health professionals became controversial figures due to their involvement as expert witnesses in court cases against new religious movements. In their testimony, they stated that anti-cult theories of brainwashing, mind control, or coercive persuasion were generally accepted concepts within the scientific community. In 1983 the American Psychological Association (APA) asked Margaret Singer, one of the leading proponents of coercive persuasion theories, to chair a task force called DIMPAC to investigate whether brainwashing or “coercive persuasion” did indeed play a role in recruitment by such movements. Before the task force had submitted its final report, however, the APA filed an amicus curiae brief on February 10, 1987 in an ongoing case. The brief stated that
[t]he methodology of Drs. Singer and Benson has been repudiated by the scientific community, that the hypotheses advanced by Singer were little more than uninformed speculation, based on skewed data, and that “[t]he coercive persuasion theory … is not a meaningful scientific concept.”
The brief characterized the theory of brainwashing as not scientifically proven and suggested the hypothesis that cult recruitment techniques might prove coercive for certain subgroups while not affecting others coercively. On March 24, 1987, the APA filed a motion to withdraw its signature from the brief, as it considered the conclusion premature in view of the ongoing work of the DIMPAC task force. The amicus brief as such remained in place, as only the APA withdrew its signature, not the co-signing scholars, among them Jeffrey Hadden, Eileen Barker, David Bromley, and J. Gordon Melton. On May 11, 1987, the APA’s Board of Social and Ethical Responsibility for Psychology (BSERP) rejected the DIMPAC report because
“the brainwashing theory espoused lacks the scientific rigor and evenhanded critical approach necessary for APA imprimatur”, and concluded: “Finally, after much consideration, BSERP does not believe that we have sufficient information available to guide us in taking a position on this issue.”
Several scholars in the NRM-sympathizer camp have since interpreted this to mean that the APA had rejected the brainwashing theories and that there was no scientific support for them (e.g., Introvigne, 1998; Bromley and Hadden in their 1993 Handbook of Cults and Sects in America).
Zablocki (1997) and Amitrani (2001) cite APA boards and scholars on the subject and conclude that there was no unanimous decision of the APA regarding this issue. They also write that, despite the rejection of the DIMPAC report, Margaret Singer continued her work and remained respected in the psychological community, which they corroborate by noting that in the 1987 edition of the peer-reviewed Merck’s Manual, Singer was the author of the article “Group Psychodynamics and Cults” (Singer, 1987).
Benjamin Zablocki, professor of sociology and one of the reviewers of the rejected DIMPAC report, writes in 1997:
“Many people have been misled about the true position of the APA and the ASA with regard to brainwashing. Like so many other theories in the behavioral sciences, the jury is still out on this one. The APA and the ASA acknowledge that some scholars believe that brainwashing exists but others believe that it does not exist. The ASA and the APA acknowledge that nobody is currently in a position to make a Solomonic decision as to which group is right and which group is wrong. Instead they urge scholars to do further research to throw more light on this matter. I think this is a reasonable position to take.”
APA Division 36 (then Psychologists Interested in Religious Issues, today Psychology of Religion) at its 1990 annual convention approved the following resolution:
“The Executive Committee of the Division of Psychologists Interested in Religious Issues supports the conclusion that, at this time, there is no consensus that sufficient psychological research exists to scientifically equate undue non-physical persuasion (otherwise known as “coercive persuasion”, “mind control”, or “brainwashing”) with techniques of influence as typically practiced by one or more religious groups. Further, the Executive Committee invites those with research on this topic to submit proposals to present their work at Divisional programs.” (PIRI Executive Committee Adopts Position on Non-Physical Persuasion Winter, 1991, in Amitrano and Di Marzio, 2001)
In 2002, the APA’s then president, Philip Zimbardo, wrote in Monitor on Psychology:
“A body of social science evidence shows that when systematically practiced by state-sanctioned police, military or destructive cults, mind control can induce false confessions, create converts who willingly torture or kill “invented enemies,” engage indoctrinated members to work tirelessly, give up their money–and even their lives–for “the cause.” (Zimbardo, 2002)
In the often-quoted Fishman case, the court concluded:
“At best, the evidence establishes that psychiatrists, psychologists, and sociologists disagree as to whether or not there is agreement regarding the Singer-Ofshe thesis.”
Social scientists who study new religious movements, such as Jeffrey K. Hadden (see References), acknowledge the general proposition that religious groups can have considerable influence over their members, and that such influence may have come about through deception and indoctrination. Indeed, many sociologists observe that “influence” occurs ubiquitously in human cultures, and some argue that the influence exerted in “cults” or new religious movements does not differ greatly from the influence present in practically every domain of human action and endeavor.
The Association of World Academics for Religious Education states that “… without the legitimating umbrella of brainwashing ideology, deprogramming — the practice of kidnapping members of NRMs and destroying their religious faith — cannot be justified, either legally or morally.”
The American Civil Liberties Union (ACLU) published a statement in 1977 related to brainwashing and mind control, in which it opposed certain methods “depriving people of the free exercise of religion.” The ACLU also rejected (under certain conditions) the idea that claims of the use of ‘brainwashing’ or of ‘mind control’ should overcome the free exercise of religion.
In the 1960s, after coming into contact with new religious movements (NRMs, popularly referred to as “cults”), some young people suddenly adopted faiths, beliefs, and behavior that differed markedly from their previous lifestyles and seemed at variance with their upbringing. In some cases, these people neglected or even broke contact with their families. All of these changes appeared very strange and upsetting to their families. To explain these phenomena, the theory was advanced that these young people had been brainwashed by the new religious movements, which had isolated them from family and friends (by inviting them to an end-of-term camp after university, for example), arranged a sleep-deprivation program (3 a.m. prayer meetings), and exposed them to loud and repetitive chanting. Another alleged technique of religious brainwashing involved love bombing rather than torture.
James Richardson, a professor of sociology and judicial studies at the University of Nevada, argues that if NRMs had access to powerful brainwashing techniques, one would expect them to show high growth rates, whereas in fact most have had no notable success in recruitment, most adherents participate for only a short time, and their success in retaining members has been limited. This claim has been rejected by Langone, who compared figures for various movements, some of which by common consent do not use brainwashing and others of which some authors report to use it. (Langone, 1993)
In their Handbook of Cults and Sects in America, Bromley and Hadden present one possible ideological foundation of brainwashing theories which, they claim, demonstrates the lack of scientific support: they argue that the simplistic perspective inherent in the brainwashing metaphor appeals to those seeking an effective social weapon to use against disfavored groups, and that any relative success of such efforts at social control should not obscure the lack of scientific basis for such opinions.
Note that some religious groups, especially those of Hindu and Buddhist origin, openly state that they seek to improve the natural human mind through spiritual exercises. Intense spiritual exercises have an effect on the mind, for example by leading to an altered state of consciousness. These groups state, however, that they do not use coercive techniques to acquire or to retain converts.
On the other hand, several scholars in sociology and psychology have in recent years claimed that many scholars of NRMs are biased toward denying any possibility of brainwashing and toward disregarding actual evidence. (Zablocki 1997, Amitrani 1998, Kent 1998, Beit-Hallahmi 2001)
Psychologist Steven Hassan, author of the book Combatting Cult Mind Control, has suggested that the influence of sincere but misled people can be a significant factor in the process of thought reform. However, many scholars in the field of new religious movements do not accept Hassan’s BITE model for understanding cults.
Brainwashing in fiction
• In George Orwell’s Nineteen Eighty-Four, brainwashing is used by the totalitarian government of Oceania to erase nonconformist thought and rebellious personalities.
• The alarmist concept of brainwashing was a central theme of the 1962 movie The Manchurian Candidate, and again of the 2004 remake, in which Communist brainwashers turn a soldier into an assassin through something akin to hypnosis. The idea that one person could be so enslaved to another as to do their bidding even when no longer under duress has long fascinated dramatists and movie audiences.
• The Charles Bronson movie Telefon had a plot similar to that of The Manchurian Candidate, with brainwashed sleeper agents activated by a hypnotic trigger phrase delivered over the telephone.
• In the Stanley Kubrick film A Clockwork Orange, criminals are re-educated in an attempt to remove their violent tendencies.
• The Ipcress File, in which Michael Caine’s character tries to resist his re-programming.
• The first film in The Naked Gun trilogy, where Reggie Jackson, among others, becomes a tool in an effort to kill Queen Elizabeth II.
• The NBC miniseries V, where the alien Visitors use a “conversion chamber” to turn humans into obedient allies.
• Zoolander, which depicts male model Derek Zoolander (Ben Stiller) being brainwashed/hypnotized into trying to kill a fictional Prime Minister of Malaysia. Of the films listed here, Zoolander is probably the closest to the Chinese methods: Zoolander is sleep-deprived and isolated. However, the film also uses the less realistic hypnotic-trigger idea.
• In the 1978-81 BBC series Blake’s 7, former freedom fighter Roj Blake undergoes brainwashing therapy (referred to as “the treatment”) to eradicate his revolutionary ideals and turn him into a model citizen. The treatment wears off, however, and he continues fighting against the corrupt Federation that gave him “tranquillised dreams”.
• In the video game Psychonauts, Boyd Cooper, the security guard at Thorney Towers, was hypnotised and had a second personality dubbed “The Milkman” implanted into his mind, that would be triggered when certain actions were performed, or certain commands given to Boyd.
• Amitrani, Alberto et al.: Blind, or just don’t want to see? “Brainwashing”, mystification and suspicion, 1998,  • Amitrani, Alberto et al.: Blind, or just don’t want to see? “”Mind Control” in New Religious Movements and the American Psychological Association, 2001, Cultic Studies Review  • Anthony, Dick. 1990. “Religious Movements and ‘Brainwashing’ Litigation” in Dick Anthony and Thomas Robbins, In Gods We Trust. New Brunswick, NJ: Transaction. Excerpt
• APA Amicus curiae, February 11, 1987  • APA Motion to withdraw amicus curiae March 27, 1987 • APA Board of Social and Ethical Responsibility for Psychology, Memorandum on Brainwashing: Final Report of the Task Force, May 11, 1987  • Bardin, David, Mind Control (“Brainwashing”) Exists, in Psychological Coercion & Human Rights, April 1994,  • Benjamin Beith-Hallahmi: Dear Colleagues: Integrity and Suspicion in NRM Research, 2001  • David Bromley, A Tale of Two Theories: Brainwashing and Conversion as Competing Political Narratives in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
• Hadden, Jeffrey K., "The Brainwashing Controversy", November 2000
• Hadden, Jeffrey K., and Bromley, David, eds., The Handbook of Cults and Sects in America, Greenwich, CT: JAI Press, Inc., 1993, pp. 75-97
• Hassan, Steven, Releasing The Bonds: Empowering People to Think for Themselves, 2000, ISBN 0-9670688-0-0
• Hindery, Roderick, Indoctrination and Self-deception or Free and Critical Thought? 2001.
• Huxley, Aldous, Brave New World Revisited. Perennial (2000); ISBN 0-06-095551-1
• Introvigne, Massimo, "'Liar, Liar': Brainwashing, CESNUR and APA", 1998
• Kent, Stephen A., "Brainwashing in Scientology's Rehabilitation Project Force (RPF)", November 7, 1997
• Kent, Stephen A., and Krebs, Theresa, "When Scholars Know Sin", Skeptic Magazine (Vol. 6, No. 3, 1998)
• Kent, Stephen A., "Brainwashing Programs in The Family/Children of God and Scientology", in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
• Langone, Michael: Recovering from Cults, 1993
• Robert J. Lifton, Thought Reform and the Psychology of Totalism (1961), ISBN 0-8078-4253-2
• Richardson, James T., “Brainwashing Claims and Minority Religions Outside the United States: Cultural Diffusion of a Questionable Concept in the Legal Arena”, Brigham Young University Law Review circa 1994
• Scheflin, Alan W and Opton, Edward M. Jr., The Mind Manipulators. A Non-Fiction Account, (1978), p. 437
• Schein, Edgar H. et al., Coercive persuasion;: A socio-psychological analysis of the “brainwashing” of American civilian prisoners by the Chinese Communists, (1961)
• Shapiro, K. A. et al., "Grammatical Distinctions in the Left Frontal Cortex", J. Cogn. Neurosci. 13, pp. 713-720 (2001)
• Singer, Margaret, "Group Psychodynamics", in Merck's Manual, 1987
• Wakefield, Hollida, M.A. and Underwager, Ralph, Ph.D., Coerced or Nonvoluntary Confessions, Institute for Psychological Therapies, 1998
• West, Louis J., "Persuasive Techniques in Religious Cults", 1989
• Zablocki, Benjamin, "The Blacklisting of a Concept: The Strange History of the Brainwashing Conjecture in the Sociology of Religion", Nova Religio, October 1997
• Zablocki, Benjamin, Towards a Demystified and Disinterested Scientific Theory of Brainwashing, in Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
• Zablocki, Benjamin, "Methodological Fallacies in Anthony's Critique of Exit Cost Analysis", ca. 2002
• Zimbardo, Philip, "Mind Control: Psychological Reality or Mindless Rhetoric?", in Monitor on Psychology, November 2002
See also
• Aversion therapy
• A Clockwork Orange
• Cults and mind control controversies
• Milgram experiment
• Opposition to cults and new religious movements
• Religious conversion
• Stockholm syndrome
• Media and ethnicity
• Theory of conversion exit tactics
Bibliography
• Anthony, Dick, Brainwashing and Totalitarian Influence: An Exploration of Admissibility Criteria for Testimony in Brainwashing Trials, Ph.D. diss., Berkeley, CA: Graduate Theological Union, 1996, p. 165
• Barker, Eileen, The Making of a Moonie: Choice or Brainwashing, Oxford, UK : Blackwell Publishers, 1984 ISBN 0-631-13246-5
• Committee on Un-American Activities (HUAC), Communist Psychological Warfare (Brainwashing), United States House of Representatives, Washington, D. C., Tuesday, March 13, 1958
• Hassan, Steven. Releasing The Bonds: Empowering People to Think for Themselves, 2000. ISBN 0-9670688-0-0.
• Hunter, Edward, Brain-Washing in Red China. The Calculated Destruction of Men’s Minds, New York: The Vanguard Press, 1951; 2nd expanded ed.: New York: The Vanguard Press, 1953
• Robert J. Lifton, Thought Reform and the Psychology of Totalism (1961), ISBN 0-8078-4253-2
• Sargant, William, Battle for the Mind: A Physiology of Conversion and Brainwashing, 1996, ISBN 1-883536-06-5
• Taylor, Kathleen, Brainwashing: The Science Of Thought Control, 2005, ISBN 0-19-280496-0
• Benjamin Zablocki and Thomas Robbins (ed.), Misunderstanding Cults, 2001, ISBN 0-8020-8188-6
External links
• Mind Control and Ritual Abuse
• Thought Reform: A Brief History of the Model and Related Issues, Part I, by Lawrence A. Pile. Pile works for the Wellspring Retreat & Resource Center, a residential treatment facility in the USA for victims of thought reform and cultic abuse.
• Brainwashing: a Synthesis of the Communist Textbook on Psychopolitics, with an Introduction by Eric D. Butler
• Brainwashing and the Cults: The Rise and Fall of a Theory (lengthy essay) by J. Gordon Melton
• Report of the APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control, November 1986
• "Brainwashing": Career of a Myth in the United States and Europe, paper delivered by Dr Massimo Introvigne at the CESNUR-REMID conference held in Marburg, Germany, March 27-29, 1998
• Communist Psychological Warfare (Brainwashing): Consultation with Edward Hunter, Author and Foreign Correspondent, Committee on Un-American Activities, House of Representatives, Eighty-Fifth Congress, Second Session, March 13, 1958
• Lifton’s research on “thought reform”
• Brainwashing @ pHinnWeb
• SSSR Resolution on New Religious Groups
• Marci Hamilton, The Elizabeth Smart Case: Why We Need Specific Laws Against Brainwashing
• Emile Coue’s book on Autosuggestion
• NSA Whistleblower John St. Clair Akwei's Lawsuit Against the NSA and Its Remote Neural Monitoring Brainwashing Technology