The Culture of Cults
Michael Martella
Introduction
Cult Mind Control
Religious Freedom & Moral Codes
The Nature of Personal Belief
Recruitment By Cults
Leaving A Cult
Problems in Exposing Cults
The Culture of Cults
Summary
Religious organizations and movements are free to practice their
religion as they choose, subject to the laws of the land. In practice, this
means that cults, in promoting their religious beliefs and gaining adherents,
are free to use deception, misrepresentation, psychological coercion or any
other techniques which do not leave physical traces and are difficult to prove
in a court of law.
Initially using conventional marketing techniques, cults promote their
particular belief systems. The trick is that through influencing a person’s
beliefs, it is possible to influence or indirectly control a person’s mind. The
actual controlling of mind is done by the person themselves, as they attempt to
train and discipline their mind in accordance with the tenets of their new
belief system. It is the belief system itself which is the primary active agent
in cult mind control.
Cult belief systems differ from conventional belief systems in a number
of subtle but significant ways, which may not be apparent to an outsider. To
understand the nature of these differences is to understand the nature of a
cult.
Cult belief systems are typically:
Independent and non-accountable - believers follow their own self-justifying moral codes: e.g. a Moonie may, in
their own mind, justify deceptive recruiting as ‘deceiving evil into goodness’.
Aspirational - they appeal to ambitious, idealistic people. The
assumption that only weak, gullible people join cults is not necessarily true.
Personal and experiential - it is not possible to exercise informed
free choice in advance, about whether the belief system is valid or not, or
about the benefits of following the study and training opportunities offered by
the group. The benefits, if any, of group involvement can only be evaluated
after a suitable period of time spent with the group. How long a suitable
period of time might be, depends on the individual, and cannot be determined in
advance.
Hierarchical and dualistic - cult belief systems revolve around ideas
about higher and lower levels of understanding. There is a hierarchy of
awareness, and a path from lower to higher levels. Believers tend to divide the
world into the saved and the fallen, the awakened and the deluded, etc.
Bi-polar - believers experience alternating episodes of faith and
doubt, confidence and anxiety, self-righteousness and guilt, depending on how well
or how badly they feel they are progressing along the path.
Addictive - believers may become intoxicated with the ideals of the
belief system, and feel a vicarious pride in being associated with these
ideals. Cults tend to be cliquey and elitist, and believers can become
dependent on the approval of the group’s elite to maintain their own
self-esteem. At an extreme, believers fear they will fall into hell if they
leave the group.
Psychologically damaging - when established members leave or are
expelled, they may develop a particular kind of cult-induced mental disorder,
marked by anxiety and difficulty in making decisions. The disorder exhibits
similarities to (but is not identical to) post-traumatic stress disorder, and
certain types of adjustment disorders. [ICD 10, F60.6, F66.1, etc.]
Non-falsifiable - a cult belief system can never be shown to be invalid
or wrong. This is partly why critics have low credibility, and why it can be
difficult to warn people of the dangers of a cult.
The Culture of Cults
Introduction
Preamble
The intended purpose of this analysis, written by a former cult member,
is to explain the nature of a cult, to warn others of the dangers of
involvement with a cult group, and to support calls for society to be more
pro-active in protecting the rights of individuals targeted by cults.
Preliminary Definitions
A cult can be defined in general as any group of people holding to a
common belief system, but in practice the term cult is often used pejoratively,
to refer specifically to ‘a quasi-religious organization using devious psychological
techniques to gain and control adherents’ (Collins English Dictionary), and
this is the sense in which ‘cult’ is used in this analysis. This analysis will
seek to explain what those ‘devious psychological techniques’ are, how they
work, and why they are devious.
Various terms have been used to describe the devious psychological
techniques allegedly used by cults, the most common being ‘brainwashing’,
‘mind-control’, ‘thought reform’, and ‘mental manipulation’.
The term 'mind control' can be misleading. It suggests that a person's
mind can be robotically controlled by some outside agency, or that thoughts can
somehow be hypnotically implanted in a person’s mind. This is not at all what
happens in a cult. This analysis will argue that in fact a cult controls its
members primarily through the promotion and inculcation of a hierarchical,
cult-type belief system within a person’s own mind, rather than by means of
external, physical restraints. It is the belief system itself which is the
primary active agent in cult mind control.
Cults actively promote and market their belief systems. Commercial
companies use marketing and public relations techniques to promote an idealized
image of their product or service to potential consumers, and cults do much the
same.
However, the difference with a cult is that both their products, and
any consequences resulting from purchase and use of their product, are entirely
subjective and intangible in nature. The ‘product’ that is marketed by a cult
is its belief system, together with the attitudes and behavior codes that are
part of that belief system. Because of the nature of their product, cults do
not really operate in the public domain. They operate in a private world,
within an individual’s personal religious framework or set of beliefs, and
within an individual’s own subjective world of self-esteem and self-confidence.
They operate within a person’s mind.
The fact that cults operate within a person’s mind has a number of
consequences. A person’s mind (or consciousness) is something which is
difficult to define or to measure, and so it is something which tends to be
outside the scope of scientific and academic inquiry. From a legal point of
view, a concept like 'freedom of mind' is equally difficult to define, and therefore
it is difficult to specifically protect such a freedom. Personal free will is a
cherished axiom of Western democracies, but neither individual free will, nor
its restriction, can actually be objectively verified or measured with any
certainty. It is never possible to know for sure to what degree a person is
acting out of their own free will, or not. It is always partly a matter of
opinion.
Consequently, the whole area of cults and mind control is contentious
and difficult. Cults as organizations are usually opaque to outside scrutiny,
and the actual process of mind control is difficult to define or to analyze. It
is possible to identify and classify a number of apparent ‘techniques’ of mind
control as used by cults, and among the best known of these classifications are
Robert J. Lifton’s ‘eight criteria of mind control’, and Steven Hassan’s BITE
acronym - control of behavior, information, thoughts, and emotions.
Outwardly, these techniques may appear similar to many of the
techniques and strategies of social compliance experienced within society at
large. (For example, some years ago a Canadian government inquiry into cults
concluded that the techniques used by cults to secure the loyalty of their
members were essentially the same as those used in mainstream society.)
However, it is important to consider the possibility that these
apparent techniques of mind control may not actually be the primary causative
factors, but may only be the secondary symptoms, or external indicators, of
other hidden and subjective processes. If they are, then an external
examination of an alleged cult group, which relies on objectively demonstrable
factors, such as the publicly stated belief system or the social organization
of the group, may yield little insight into the subjective psychological
processes occurring within a mind control environment, and may be of little
help in distinguishing between a cult and a non-cult organization.
It is possible to take the view that society at large embraces a
variety of organizations and institutions which could be interpreted as being
somewhat cult like in their nature. It is often argued that there is a fine
line between socialization and indoctrination, or between persuasion and mind
control. Nevertheless, society does attempt to make a distinction between
acceptable and unacceptable behavior: Persuasion through physical force or
through the denial of food and water, for example, is clearly illegal. In the
view of this writer, there is a need for society to be more active in
protecting its citizens from other more subtle processes of persuasion, which
can be equally abusive of personal freedoms, and which are sometimes made use
of by individuals and organizations in order to gain personal power over those
they claim to help, but whom they often merely manipulate and exploit. This
analysis will seek to unravel some of the processes of persuasion used by
cults, and to show in what ways they may be devious and an infringement of
personal liberty.
Cult Mind Control
Outline of a Cult Persuasion Process - ‘Bi-polar Mind Control’
Some cults promote an overtly religious type of belief system. Others,
such as so-called therapy cults, promote a secular type of belief system, based
on quasi-scientific or quasi-psychological principles. Some so-called New Age
cults combine religious and secular elements in their belief system. In
general, cult organizations promote utopian ideals of self awareness or
self-transcendence, ostensibly for the benefit both of the individual and of
the world at large. For example:
‘The central teaching of the
Buddha is that we can change our lives. Buddhism offers clear and practical
guidelines as to how men and women can realize their full potential for
understanding and kindness. Meditation is a direct way of working on ourselves
to bring about positive transformation. We teach two simple and complementary
meditations. One helps us develop a calm, clear, focused mind; the other
transforms our emotional life, enabling us to enjoy greater self-confidence and
positivity towards others.’
Of course, not every organization which promotes this kind of ideal is
necessarily a cult. However, it can be quite difficult to tell from the outside
whether a group is a cult or not. In general, as long as a cult maintains a
respectable public image, it will attract followers who aspire to this kind of
ideal.
Cult belief systems present a vision in which any individual, through
following the group’s teachings, can begin to realize their own higher
potential. Believers begin to aspire to a ‘new life’ or a ‘new self’, based on
these ideals. At the same time as they begin to aspire to this improved new
self, believers begin to see their old self, their pre-cult personality, as
having fallen short of the ideal. An old self - new self dichotomy can grow up
within a cult member’s mind, as they gradually eschew beliefs and behavior
associated with their old self, and adopt attitudes and affiliations that seem
appropriate for their new self. They may even come to see their unreformed old
self as the enemy of their emerging new self.
A cult does not control its members by using external coercion. It is
the belief system itself which is the primary active agent in cult mind
control. The controlling of mind is done by the person themselves, as they
attempt to discipline their mind and reform their personality, in accordance
with the tenets of their new belief system. Effectively, a cult uses a person’s
own energy and aspirations against them.
This analysis proposes the term ‘Bi-polar mind control’ to denote a
generic class of ‘devious psychological techniques’ used by cult organizations
to gain and control adherents. Essentially, bi-polar mind control works by
encouraging an aspirant to identify with an imagined ideal new self, and then,
from the perspective of this new self, to see their old self as comparatively
inferior and flawed. It is ego-utopia or hubris for the new self, and
ego-dystonia or shame for the old self.
Bi-polar mind control was identified in all but name by Dr. R. J. Lifton
in his seminal work ‘Thought Reform and the Psychology of Totalism’. One of the
criteria Lifton proposed for determining whether or not a given environment has
the potential to exert ‘thought-reform’ or mind control is:
‘The Demand for Purity: The creation of a guilt and shame milieu by
holding up standards of perfection that no human being can accomplish. People
are punished and learn to punish themselves for not living up to the group’s
ideals.’
A cult belief system ‘guilt-trips’ aspiring individuals, by first
holding up a utopian goal, and then encouraging aspirants to feel ashamed when
they are unable to fully realize this goal. Aspirants are encouraged to see
their recalcitrant old self as an obstacle and a hindrance, preventing them
from realizing their full potential. This type of old self - new self dichotomy
is implicit in cult-type belief systems.
While they are in the process of training their mind and transforming
their personality in accordance with the tenets of their new belief system,
believers are in the position of students, and are therefore somewhat dependent
on the guidance of their teachers. Cults gain influence over their members by
promoting a belief system which undermines members’ confidence in their own
judgment, or more specifically in the judgment of their unreformed old self, so
that they find it difficult to make decisions for themselves, independently of
guidance from the group’s teachers and preceptors.
Marketing a Cult Belief System
Of course, no-one is forced to join a cult. No-one is forced to adopt a
new belief system, either as a whole or in part. Equally, no-one can make an
informed assessment of a belief system in advance, without first having had
some personal experience of it.
Cults have to compete to market their belief systems and gain
adherents, just as ordinary commercial organizations have to compete to market
their products or services and gain customers. Indeed some of the marketing
techniques are not entirely dissimilar. Commercial businesses often use aspirational
marketing techniques, promoting their products and services to potential
customers by implying that purchase of a particular product will enhance an
owner’s self esteem and social status. Consumers are sometimes encouraged to
measure their own self-worth in terms of the quality of their possessions.
However, cults have two significant marketing advantages compared with
a normal commercial organization, because of the intangible nature of the ‘product’
they market. The product which a cult markets is its belief system, together
with the attitudes and behavior codes that are part of that belief system.
The first marketing advantage enjoyed by a cult is that, as a
quasi-religious organization, it is protected from outside investigation, by a
legal system which attempts to protect freedom of religion and freedom of
belief. Broadly, freedom of religion allows cults to use their own
self-referential ethical codes to justify their own behavior, and to remain
unaccountable to any outside agency. There are no consumer protection laws to
regulate the marketing of personal or religious belief, and no independent
quality control of the product.
A second advantage enjoyed by a cult stems from the fact that it does
not really operate in the public domain; it operates primarily within the
private and subjective realm of a person’s mind. Both the actual product
marketed by a cult, and any consequences resulting from purchase or use of the
product, are largely subjective and intangible in nature. This means that no
criticisms of the allegedly harmful effect that a cult’s belief system may have
had upon a member’s mind or behavior can ever be proved objectively, because
the whole subject of personal belief is by nature largely or entirely
subjective and therefore unprovable either way. So long as the burden of proof
remains with the critic, a cult can never lose.
In order to examine how these two characteristic advantages can give a cult organization an unfair advantage over the individuals to whom it promotes and markets its belief system, and can enable it to use possibly devious processes of persuasion when recruiting new members, this
analysis now moves on to look more closely at the nature of cult belief
systems, and how they differ from conventional belief systems.
Religious Freedom & Moral Codes
The Quasi-Religious Spectrum
As discussed in note 1, words like religion, sect, and cult have
complex derivations, related to ideas about culture and way of life, and their
meanings are subject to change over time. This analysis acknowledges the range
of meanings that these terms may be expected to carry, and has no wish to
exclude any nuances of meaning. This analysis interprets the various dictionary
definitions not as separate and distinct meanings, but more like reference
points along a continuum or spectrum of meaning.
Towards one end of this spectrum are established mainstream religious
and secular/humanist systems of belief and practice, in the middle are
non-conformist sects and fashionable fads of various kinds, and towards the
other end are various religious and secular groups which tend towards being
exclusive coteries. These more cultish groups are sometimes characterized by a
collective hubris among their members, who, seeing themselves as part of an
elect, look down rather sniffily upon the mores and values of established
mainstream institutions.
With this perspective in mind, this analysis uses the term cult
primarily in the Collins 3 sense of: ‘a quasi-religious organization using
devious psychological techniques to gain and control adherents’. Leaving aside
for now the alleged ‘devious psychological techniques’, cult is here defined as
being a quasi-religious organization as distinct from a religious one. The
cultish-ness is related to the quasi-ness of the religion or belief system.
Therefore, a theological definition of a cult, based on examination of the
group’s belief system (or a tenelogical definition, based on examining the
tenets of a secular belief system), may be more useful here than a sociological
definition, based on the observation of outward characteristics. For example,
Alan Gomes, in his book ‘Unmasking the cults’ gives the following definition of
a (Christian - based) cult:
‘A cult of Christianity is a group of people, which claiming to be
Christian, embraces a particular doctrinal system taught by an individual
leader, group of leaders, or organization, which denies (either explicitly or
implicitly) one or more of the central doctrines of the Christian Faith as
taught in the sixty-six books of the Bible.’
The main criterion being put forward here for a cult is that it
‘denies’ (either explicitly or implicitly) one or more of the central doctrines
of the related mainstream belief system. To deny something is to declare it
untrue, or to refuse to accept the existence, validity, or truth of something.
Denial is more than merely quibbling over the fine details of a belief system.
Denial implies a distinctly different belief system.
A theological/tenelogical definition of cult provides a means of
broadly differentiating between cults, sects, and mainstream religious or
secular belief systems, by considering the degree to which a particular group’s
belief system and culture originates from within the group, and is separate and
distinct from the relevant mainstream belief system and culture. From this
perspective, sects can be characterized as tending to disagree with some
details of the relevant mainstream belief system, while cults can be
characterized as tending to deny and reject outright significant parts of the
relevant mainstream belief system.
In this perspective, a sect (religious or secular) tends to distinguish
itself from the mainstream by having an individual interpretation of some or
all of an agreed set of scriptures, or secular tenets, but a sect doesn’t tend
to invent entirely new scriptures or tenets of belief. In general, they seek to
improve upon existing traditions, rather than to replace them entirely. A sect
might perhaps be said to have an underlying respect for the relevant mainstream
belief system and culture, in that a sect tends to hanker after some degree of
respectability and acceptance (on its own terms of course) in the eyes of the
mainstream belief system.
A cult, in comparison, tends to have little or no underlying respect
for established belief systems. Holding their own partly or wholly
self-originated belief system in high esteem, cult leaders and their followers
tend to disdain existing belief systems as inferior and outmoded, and will
consequently tend to separate off or isolate themselves from the mainstream
more than a sect. A cult will tend to either invent completely new scriptures
or tenets of belief, or at least to radically reinterpret existing scriptures
and tenets. Cult leaders may claim some special revelation or insight which is
accessible to them, but not to those outside the group. They may claim a
special ability to go back to first principles and to practice a more pure
version of the tradition, or claim a special ability to re-interpret
traditional teachings in a way which is more appropriate for the modern world.
Cults tend to be cliquey, elitist, and hierarchical, and there is usually a
distinct difference of status (in the eyes of cult leaders and their followers)
between believers and unbelievers, between the committed and the uncommitted,
and between the saved and the fallen.
Thus it may be difficult for an outsider, as an unbeliever, to
investigate a group’s belief system. Any group of people, cult or non-cult, may
be economical with the truth, presenting a sanitized version of its belief
system to public gaze, while keeping some aspects confidential to trusted
members.
It is easier to estimate to what extent a particular group is
unorthodox, if its belief system is based on a religion like Christianity, and
the group explicitly denies one or more of the central doctrines of the source
religion. Often it is not so easy, and denial may be only implicit; or there may
be no commonly agreed checklist by which to compare what is orthodox and what
is not. There is no reliable orthodoxy meter available to measure a group’s
belief system, and the process of investigation is likely to be complex and
difficult.
The purpose of this kind of investigation, attempting to locate a
particular group’s belief system along a spectrum between orthodox belief
systems and increasingly quasi or unorthodox belief systems, is not to make
value judgments between orthodox and unorthodox beliefs as such. On the one
hand, orthodox belief systems have evolved over time, and incorporate
experience and understanding distilled over generations. On the other hand, it
is important to acknowledge both that personal freedom is an established tenet
of Western democracy, and also that various forms of unorthodoxy (in the sense
of deviation from established norms) may well have evolutionary benefits for
society as a whole. Both tradition and innovation have their respective merits.
Rather, the intention in the present context is to point to a
particular characteristic of independent, self-originated and innovative belief
systems, which can give a considerable advantage to an organization promoting
such a belief system. The particular characteristic of independent belief
systems is that they can set their own moral codes.
Religious Freedom & Moral Codes
A belief system implies some kind of ethical or moral code, whether
explicit or implicit. A moral code is concerned with the distinction between
right and wrong, and with the goodness or badness of human behavior.
Depending on how strongly a particular group, in terms of its belief
system, denies particular tenets of the relevant mainstream belief system, and
how strongly it distinguishes and separates itself off from the mainstream
belief system, to that degree its belief system may be said to be
self-originated, in the sense that the belief system originates primarily from,
and is also interpreted by, the group’s leadership.
If the belief system originates primarily from the group’s leadership,
then the group’s leadership is also the ethical preceptors and moral arbiters.
They act as both law maker and judge, and can therefore make up the rules as
they go along. The danger is that their ethical standards may become malleable,
if they succumb to the temptation to deflect any criticism of themselves or
their behavior, by adjusting definitions of right and wrong to put themselves
in the right. If the moral arbiters are unwilling to modify their behavior,
they can instead modify their moral codes to justify their behavior. Freedom of
belief can become freedom without responsibility.
The former Bishop of Durham, UK, the Revd Dr David Jenkins, usually
regarded as someone whose views tend to the heterodox as much as to the
orthodox, commented that: 'Nowadays, freedom of belief is defined in terms of a
post-modern relativism. Freedom of belief can mean: anything goes.'
Many organized groups holding to wholly or partly self-originated
belief systems are keen to defend religious freedom, because of the ‘anything
goes’ potential inherent in freedom of belief. For example, a lawsuit filed in
1999 by a coalition of plaintiffs, including the Seventh-day Adventists and the
‘International Coalition for Religious Freedom’ (Moonies), claims that the State
of Maryland’s task force studying religious cults on college campuses is
violating constitutional rights and conducting a ‘religious inquisition’.
If a group is classified as a religion, then leaders of that group have
considerable freedom, within the group at least, to set their own moral codes
and to adjudicate between right and wrong. Potentially, any methods of
persuasion, short of those which involve actual, legally provable physical
coercion, may be considered reasonable within the ethical codes of a
self-originated belief system. A sincere believer may feel that, in religious
matters, the end justifies the means, and therefore various deceptive and
devious practices may, in the mind of a believer, be justified as skillful
means, crazy wisdom, or heavenly deception.
In most cases, cult leaders probably do not set out to establish cults.
However, in practice they can tend to fall into cultish patterns almost by
default, as they attempt to assert and defend unorthodox positive visions of
the world, in the face of potentially destructive (to them) cynical or
questioning orthodoxies. Actions which, to an outsider, might seem devious or
immoral, may, in the mind of a believer, seem perfectly just and ethical. For
example:
‘Outright lying means to tell people things that you do know are not
true - Scientologists do that at times, when they are honestly convinced that
this is better for Scientology. …
‘Scientologists do have their own definition about ethics which does not fully
correspond with the general understanding about ethics. …
‘Scientologists are formally and informally told how to best explain
Scientology to others, what to mention, how to mention it, what not to mention
- all with the best intent, but the result is still, that you cannot, as an
outsider, get fully informed about Scientology by a Scientologist. I do know
that, because I did it myself and I taught it myself for years.’
Moonies, in promoting their belief system, may justify deception and
misrepresentation as part of a greater process of ‘deceiving evil into
goodness’. The FWBO rejects ‘conventional morality’ in favor of its own
self-serving ‘natural morality’, and advises senior members to be discreet
about discussing the movement’s more radical principles with the public. This
sort of secrecy and deception means that it is not really possible for a
newcomer to exercise informed free choice about the benefits or otherwise of
becoming involved with such a group.
This characteristic of independent belief systems, that they
independently set their own moral codes, does not necessarily mean that devious
psychological techniques will be employed to promote the belief system and gain
adherents, only that they can be, and that there is little to stop them.
Religious and quasi-religious groups alike are largely left to regulate
themselves, and to adjudicate for themselves on what is devious and what is
not.
Organizations and their Belief Systems
It is the combination of an independent belief system, together with an
organization devoted to promoting that belief system and gaining adherents that
creates a situation in which cultish patterns of behavior may develop. In order
to identify whether a group or organization might be a cult, it is necessary to
inquire both into the group’s organizational structure, and also into the
underlying belief system.
Questions which might be asked about a group’s organizational structure
include: how tightly organized is the group? What sort of leadership structure
does the group have? Is it a loose and informal association, or does the group
have a significant proportion of full-time members? Does the group actively
seek new members? Are there formal rituals for admitting new members? How
strongly are new members encouraged to orientate their lives full time around
the group’s belief system?
Questions which might be asked of a group’s belief system include: What
sort of position does the belief system occupy along the quasi-religious
spectrum? Does the group maintain good communication with a mainstream source
belief system, or are they independent and self-referential? Is the group and
its belief system open to investigation by outside agencies, or is the group
inclined to resist this and to cry ‘religious inquisition’? Who are the
preceptors for the group? Who decides what is devious and what is not?
Moral non-accountability is one advantage of following an independent,
self-originated belief system. There are other advantages which a cult enjoys,
in terms of defending itself against investigation and criticism. These
advantages stem from the subjective, intangible nature of personal belief and
free will.
The Nature of Personal Belief
Free Will, Free Choice, and Personal Belief
The subjective, non-material nature of free will means that, strictly
speaking, neither free will itself, nor any restriction of free will, can be
objectively proven. In any given case, it is always at least partly a matter of
opinion as to whether a person might have acted out of their own free (and
informed) choice, or whether their mental liberty and freedom of choice might
have been restricted or impeded in some way. It is ultimately unprovable either
way.
The personal and experiential nature of the belief system promoted by a
cult means that it is not possible for a person to exercise informed free
choice in advance, about whether the belief system is valid or not, or about
the benefits of following the study and training opportunities offered by the
group. The benefits, if any, of group involvement can only be evaluated through
personal experience, through spending a suitable period of time with the group.
How long a suitable period of time might be, depends on the individual, and
cannot be determined in advance.
Unfortunately, the subjective, non-material nature of free will means
that a person who becomes involved with a cult and its belief system, and who
subsequently comes to regret this, can never actually prove that they had not
been acting entirely out of their own free will in becoming involved, or that
their free choice had been in any way restricted, even if it had.
If they claim that the cult’s descriptions of its belief system were
false or misleading, preventing them from making a reasonable and informed free
choice before becoming involved, they will not be able to prove this. This is
because the subjective, non-material nature of personal belief is such that any
descriptions of a belief system are also subjective and a matter of personal
interpretation. Therefore it is virtually impossible to prove that a belief
system might have been misrepresented or falsely described. An allegation of
misrepresentation can always be countered with an allegation of misinterpretation.
A cult can always claim that critics have misunderstood the belief system
itself, and have therefore simply misunderstood and misinterpreted the cult’s
description of its belief system.
The subjective, non-material nature of personal belief also means that
any criticisms of the effect that exposure to a cult belief system may have had
upon a person’s mind or behavior are unprovable, since it is impossible to
prove what someone does or doesn’t believe, and therefore it is impossible to
prove any consequences. As long as the burden of proof remains with the critic,
a cult can never lose, and criticism is impotent.
The net result is twofold: firstly, cults are spared any obligation to
prove to the outside world that their members became involved purely out of
their own free will and choice, and secondly, they are not obliged to prove
that involvement is safe and not psychologically damaging.
It is possible to broadly place a group’s belief system along the
quasi-religious spectrum, based on investigating the belief system from the
outside, as a non-believer. However, it requires more of an insider’s
perspective to understand the interior dynamics of the belief system, and in
particular the effect that involvement with a cult belief system may have upon
a person’s mind or behavior.
The Hermeneutics of Personal Belief
This kind of investigation, into the inner workings of a group’s belief
system, could be described as hermeneutical. Hermeneutics, derived from the
Greek for ‘interpret’, is a philosophical tradition concerned with the nature
of interpretation and understanding of human behavior and social traditions. An
influential contributor to this tradition was the German philosopher Wilhelm
Dilthey (1833 - 1911), who argued that the ‘human sciences’ could not employ
the same methods as the natural sciences, but needed to use the procedure of
‘understanding’ (Verstehen) to grasp the ‘inner life’ of an alien culture or
historical period. To understand the inner life of a belief system, an
investigator has to go native and enter into the belief system to some extent.
They have to become a believer, or at least suspend disbelief.
A belief system has both objective and subjective aspects; there are
the formal doctrines and tacitly agreed behavior codes of a particular faith or
culture or group, and there is the complex of sometimes conflicting ideas,
convictions, attitudes, and affiliations within the mind of an individual
member of that particular faith or culture or group. Understanding a belief
system requires some ability to empathize with the mind and outlook of a
believer, and some ability to experience the emotions of a believer.
Clearly, there is a tension between being an impartial investigator and
being a believer. Hermeneutics is sometimes divided into the ‘hermeneutics of
suspicion’ and the ‘hermeneutics of faith’. The French philosopher, Paul
Ricoeur, wrote of this tension: ‘Hermeneutics seems to me to be animated by
this double motivation: willingness to suspect, willingness to listen; vow of
rigor, vow of obedience.’
A hermeneutic of faith allows an investigator to enter the inner world
of a belief system. However, there is the danger that this may compromise an
investigator’s impartiality and objectivity, because an investigator has to
make a paradigm shift and adopt, at least temporarily, a new set of beliefs.
They have to go native to some extent, if they hope to understand the inner
life of the belief system and see it through the eyes of a believer. Otherwise,
they will always be an outsider, and not really able to comprehend the belief
system. However, if they do adopt a new belief system, even partially and
tentatively, how can they at the same time maintain an impartial and objective
perspective on it? Their old belief system may be incompatible with the new
belief system, and in time the original reasons for undertaking an
investigation may no longer seem entirely valid under the new belief system. An
investigator needs some equivalent of Ariadne's thread, if they want to be
confident of finding their way back out of the new belief system.
Of course, no-one is forced to join a cult. No-one is forced to adopt a
new belief system, either as a whole or in part. Equally however, no-one, be
they an independent academic investigator, a curious onlooker, or a potential
new member, can understand a belief system, without trying it out first. It’s
not really a belief system, if you don’t believe in it. Without some ability to
see a belief system through the eyes of a believer, and to experience the
emotions of a believer, an investigator will always remain an outsider. Without
a hermeneutic of faith, and a ‘vow of obedience’, in Paul Ricoeur’s phrase, an
objective investigation (‘vow of rigor’) will be impotent and ineffectual.
Consequently, it is not really possible to make an informed choice in
advance about whether or not to adopt, either wholly or in part, a new belief
system. It is only really possible to make an informed evaluation of a belief
system after having tried it out and lived it for a period of time. However,
cult belief systems tend to have a particular set of characteristics which make
it dangerous even to experiment with them. The dangerous characteristics of
cult belief systems are that they are hierarchical and bi-polar in nature.
Hierarchical, Bi-polar Belief Systems
In general, cult organizations promote utopian ideals of self awareness
or self-transcendence, ostensibly for the benefit both of the individual and of
the world at large. For example:
‘The central teaching of the Buddha is that we can change our lives.
Buddhism offers clear and practical guidelines as to how men and women can
realize their full potential for understanding and kindness. Meditation is a
direct way of working on ourselves to bring about positive transformation. We
teach two simple and complementary meditations. One helps us develop a calm,
clear, focused mind; the other transforms our emotional life, enabling us to
enjoy greater self-confidence and positivity towards others.’
The type of belief system implied above is not unique to cults. Many
belief systems could be described as aspirational and soteriological, and even
utopian, in the sense that they proclaim an ideal to be realized, and propose a
path or a lifestyle for believers that leads towards realization of that ideal.
However, cult belief systems have two additional characteristics.
Firstly, they tend to be strongly hierarchical in perspective, revolving around
ideas about lower and higher levels of personal insight. Secondly, cult belief
systems also tend to be dualistic and bi-polar, in the sense that they make a
clear distinction between lower and higher, between the mundane and the
ultimate. For example:
'I see it [the spiritual path] in terms of a very definite transition
from what we regard as a mundane way of seeing the world and experiencing the
world, to what we would describe as a transcendental way, seeing it in terms of
wisdom, seeing it in terms of real knowledge, seeing it in terms of ultimate reality,
seeing it in terms of a truer, wider perspective.'
However, this type of belief can be insidious, like a Trojan horse,
because it carries the hidden corollary that the believer’s present level of
awareness and understanding is inferior to the ideal to which they newly
aspire. Aspiration has its shadow: self-doubt. For example:
'spiritual life begins with awareness, when one becomes aware that one
is unaware, or when one wakes up to the fact that one is asleep.'
'We always have to be aware that our . . .um . . . what we think, is
not true, until enlightenment.'
It is this ego-dystonic undertow, this artificially created self-doubt,
which is characteristic of cult-type belief systems. Essentially, cults
undermine members’ confidence in their own judgment, or more specifically in
the judgment of their unreformed old self, so that they find it difficult to
make decisions for themselves, independently of guidance from the group’s
teachers and preceptors.
Cult belief systems promote a utopian vision in which any individual,
through following the group’s teachings, can begin to realize their own higher
potential, and can ultimately transcend the mundane. As outlined earlier,
believers begin to aspire to a ‘new life’ or a ‘new self’, one which embodies
the ideals and insights of the belief system. Simultaneously, believers begin
to see their old self, their pre-cult personality, as having fallen short of
these ideals. They begin to see the thoughts and opinions of their old self as
flawed and unreliable, and may even begin to see their recalcitrant old self as
a drain and a hindrance.
There are two intertwined aspects to this process:
From the ego-utopian side, a cult-type belief system presents a vision
of an ideal new self, and this vision can become a source of vicarious pride
for believers, as they identify with the ideal and bask in its reflected glory.
Believers can begin to experience a feeling of intoxication with the ideals of
the belief system, and a sense of pride in being associated with these ideals.
As their commitment is recognized and acknowledged by the group’s leaders, they
may also develop a sense of pride in being admitted into an exclusive coterie.
Often, established cult members will tend to divide the world into the saved
and the fallen, and seeing themselves as members of an elect, will look down
compassionately upon those not yet fortunate enough to be initiated into their
belief system. This vicarious pride or hubris-by-proxy may possibly be one of
the most attractive and even addictive aspects of cult involvement.
From the ego-dystonic side, a hierarchical cult-type belief system,
combined with the aspirational and idealistic ethos of a community of
believers, tends to create an ego-dystonic dynamic within the group.
Ego-dystonia tends to perpetuate itself given the two necessary conditions,
which are a hierarchical, bi-polar belief system and a community of believers.
The doctrines of a hierarchical belief system encourage ego-dystonia and also
define the community; the community perpetuates the doctrines, and the
aspirational nature of the doctrines consolidates the community’s appeal as a
refuge for ego-dystonics. In other words, the belief system both creates a
problem (ego-dystonia) and simultaneously offers a solution (ego-utopia). Like
the proverbial chicken and egg, the whole thing can tend to become a
self-perpetuating symbiosis.
In the absence of alternative sources of emotional nourishment, a
believer can develop a psychological dependence on the feeling of enhanced
self-confidence associated with being accepted as a member of an elite group.
Like any addict, they may become dependent on their supplier. A believer can
become dependent on the granting of recognition and appreciation by the
believer’s adoptive peer group, and by its leaders and hierarchs. This
appreciation and recognition can usually be earned, like brownie points, by
supporting the group financially or by working for the group in various ways.
Cults don’t usually try to induce extreme or pathological levels of
ego-dystonic guilt in their members; milder levels can be just as effective.
Mild guilt tends to be a good motivator, while excessive guilt tends to be
disabling, and a disabled, de-motivated believer is of no use to a cult.
Cult leaders don’t necessarily plan all this out in advance; these
processes tend to occur naturally, given the necessary conditions. In fact,
given the necessary conditions, it takes a positive effort to avoid becoming a
cult.
In general, believers feel a pleasant ego-utopia or hubris so long as
they remain in favor with the group and unpleasant ego-dystonic guilt if they
are out of favor. For an outsider to be able to understand whether this type of
psychological dependency may be a factor or not, it is necessary to have some
understanding of the ‘inner life’ of the group’s belief system, and to be able
to empathize with the mind of a believer.
Hierarchies and the Politics of Personal Belief
The two conditions necessary for a cult to form are: a hierarchical,
bi-polar type of belief system, and an organization devoted to promoting that
belief system.
Many belief systems, both cult and non-cult, are associated with a
church or with an organized community of believers, which ostensibly supports
and encourages individual believers in their efforts to realize the ideal for
themselves. In the earlier section ‘Organizations and their belief systems’,
various criteria were proposed as a means of investigating the nature of a
particular community of believers. Inquiring about roughly where the belief
system fits in the quasi-religious spectrum, and about who the preceptors and
moral arbiters of the group are, can be helpful in gaining an understanding of
the group. The particular area of inquiry proposed in this section, is into the
interior dynamics or politics of how an individual believer interacts with the
organized body of believers.
Is the group’s hierarchical belief system, with its beliefs about
higher and lower levels of personal realization, used to justify a hierarchical
power structure within the organization? Do the institutions of the belief
system tend to support the aspirations of believers, or do they tend to
subordinate believers to the interests of the organization and its leaders?
It is sometimes suggested that Christianity began as a cult. The
Christianity of Jesus and the disciples was unorthodox in its time, and might
have met several of the criteria proposed so far for identifying a cult.
However, a saying such as ‘The Sabbath was made for man, and not man for the
Sabbath.’ does imply a spirit and an
ethos centered on the needs of a believer, rather than on the needs of the
belief system and its institutions. While it is admittedly rather difficult to
know the ‘inner life’ of believers two thousand years ago, early Christianity
would appear to fail Lifton’s criterion for a cult of ‘Doctrine over Person’.
Potentially, any belief system can be interpreted either in a cultish
or in a non-cultish manner. Personal belief can sometimes become
institutionalized and harden into group ideology, and it is not always easy for
an outsider to know to what extent this has happened. The Uruguayan theologian
Juan Segundo, who considers that ‘the alienating sin of the world is ideology’,
writes that:
‘liberation from ideology requires opting for the exercise of an
ideological suspicion in order to unmask the unconscious ideological structures
which dominate and which favor a powerful, privileged minority.’
In the case of a cult, the ‘powerful, privileged minority’ are the cult
leaders and hierarchs. The difficulty is that an investigator (or potential new
member) has to exercise a hermeneutic of faith as well as of suspicion, if they
are to succeed in penetrating into the mind set of a cult member and in
unmasking therein any ‘unconscious ideological structures’. Only someone with
hands-on experience of the ethos and interior dynamics of the group actually
knows the point of view of an engaged believer. Only an insider can really tell
us if the ‘inner life’ of the belief system serves the members or a privileged
hierarchy.
This places an investigator in a dangerous paradox. On the one hand,
they have to go native and enter into the belief system to some extent, if they
want to know whether the ‘inner life’ of the belief system serves the members
or the hierarchy. But the difficulty with experimenting with this sort of
belief system is that at no point is it possible for an investigator to know
for sure when to stop. Having begun to adopt, cautiously and on a trial basis,
elements of a hierarchical, dualistic belief system, it is never possible to
know when the belief system has been given a fair trial.
At no stage can an investigator or a newcomer
eliminate the possibility that they have failed to attain any more than a
mundane level of insight into the group’s beliefs. They can never be sure that
a breakthrough to a deeper level of understanding is impossible, or that
valuable insights will definitely not result from attending the next training
course or residential weekend offered by the group. Or from the next course
after that. A hierarchical cult-type belief system is like an endless road to
an uncertain destination.
This is because a hierarchical type of belief system,
with its ideas about lower and higher levels of awareness and understanding, is
intrinsically non-falsifiable. No
counter observations or criticisms of a hierarchical type of belief system can
ever be established as objectively true. It is impossible for an investigator
to prove any fault with the tenets of a hierarchical belief system, even after
long term personal experience of the belief system, or to censure any of the
methods (short of physical force) which might be used to promote such a belief
system. It is never actually possible to prove that a group promoting such a
belief system has used ‘devious psychological techniques to gain and control
adherents’, even if they have, because critics can never prove that their
criticisms are not based merely on mundane ignorance and misunderstanding. From
the perspective of a hierarchical, dualistic type of belief system, critics are
deemed to be at a lower level of awareness, and are thus effectively
disenfranchised.
Any attempt at debate with the hierarchs of a cult is
doomed, because a critic can never disprove the hierarchs’ claim to a special
revelation, or to a more profound understanding of the group’s core beliefs. So
attempts to reform a cult from within tend to be futile. It may also be
difficult to warn outsiders what the ‘inner life’ of the belief system is
actually like, because critics can never actually prove that their criticisms
are objectively valid, not just personal and subjective.
These difficulties tend to be characteristic of
cult-type belief systems, and help to put cult organizations beyond the reach
of any outside authority.
Recruitment by Cults
By no means everyone who encounters a cult will be
drawn in, so clearly mind control ‘techniques’ are not all-powerful. In
general, less than 10%, and probably closer to 1% of people who attend a cult’s
introductory talk or a short course, might go on to become full members of the
group. The process of recruitment involves befriending and then mentoring or
discipling a newcomer, and this takes time. An established member may only be
able to effectively befriend and mentor two or three newcomers at a time, so
there is an arithmetical limit to the rate at which a cult can recruit new
members, however many people may attend their introductory events.
There is often an element of deception or
disingenuousness in the way that cults present themselves to the public.
Someone encountering a group such as 'Sterling Management' (Scientologists) or
'Women's Federation for World Peace' (Moonies) may have no particular reason to
be cautious of the group. Initial contact is usually achieved via an ostensibly
neutral agency which has no visible cult associations, such as a meditation
center or a stress management course. Once initial contact has been
established, selected individuals are targeted by the group’s recruiters. In
that sense, a person doesn't choose a cult, the cult chooses them.
Established members acting as recruiters will not wish
to feel that their efforts have been wasted, and will tend to target
individuals who appear more open to the ideals of the group. Recruiters are
instinctively able to spot people who are similar in outlook and temperament to
themselves, with whom they can simply re-enact the same processes by which they
themselves were originally drawn into the group. Of course, recruiters are
unlikely to consciously think of themselves as ‘recruiters’; they are more
likely to see themselves as altruists, reaching out to share their aspirations
and beliefs with others. It can be rather like a chain letter or pyramid sales
scheme.
A cult recruiter’s role is essentially to make a
newcomer feel welcome and appreciated, and to encourage them to feel an
affinity for the idealistic belief system of the group. If this can be
achieved, the belief system itself will largely do the rest. It is the belief
system itself which is the primary active agent in cult mind-control.
One of the ways in which established members may gain
ego-utopian brownie points is by attracting new members to the group. A
successful recruitment tends both to enhance a recruiter’s status within the
group, and also to confirm their own faith and confidence in the group’s belief
system. This ego-utopian feedback process provides cults with a well motivated
sales force that would be the envy of many conventional businesses.
The young and idealistic may be vulnerable to
recruitment, as may individuals who are undergoing some change or uncertainty
or re-evaluation in their lives, for example when leaving home to begin
college, leaving college to enter the job market, changing jobs, or after a
bereavement. This kind of situation can present a chance for a cult recruiter.
People who maintain an established career and circle of friends are less likely
to be drawn in to any depth.
The processes of recruitment are largely invisible to
an onlooker. To understand the kinds of processes that occur within the minds
of recruiter and recruit, it is necessary to have some understanding of, and
empathy for, the ‘inner life’ of the group’s belief system.
Leaving a Cult
Disability Arising from Cult Involvement
As discussed earlier, the personal and experiential
nature of belief systems in general means that it is not really possible to
exercise informed free choice in advance, about the merits or otherwise of a
belief system and set of attitudes promoted by a particular group. In order to
make a meaningful evaluation of a belief system, it is necessary to go native
and become a believer to some extent. Some affirmation of faith in the belief
system is usually required of a believer, and there may be formal rituals of
initiation into membership.
However, the hierarchical, bi-polar nature of cult
belief systems in particular makes it dangerous even to experiment with them.
Unless they are an investigator who became involved with a cult purely as a
research project, with the clear intention of rejecting the belief system and
leaving the cult environment after a set period of time, it may not be all that
easy for someone to escape from a cult belief system.
Cults will do their best to ensure initial apparent
benefits for new members. A cult might be compared to a card sharp, who will
let a newcomer win the first few games in order to take all their money in the
long run. There is no problem, so long as a member is happy to continue their involvement.
However, should a member become unhappy with their involvement, or develop
serious doubts about the belief system, the process of disentanglement may not
be very straightforward. Rejecting the belief system in its entirety may not be
easy, or even desirable. Even after physical contact with the group has ceased,
elements of the cult belief system are likely to linger in the mind of an
ex-member for some time, depending on how deeply and for how long they were
involved with the group. They may experience feelings of anxiety and
disorientation as they try to rid themselves of the unwanted remnants of the
cult belief system and the cult way of relating to the world, while
simultaneously trying to regain some confidence either in their old, pre-cult belief
system and ways of relating to the world, or alternatively, in some new,
post-cult belief system.
For a while, an ex-member may exist in a sort of limbo
between the cult world and the outside world, unsure which to believe in. To the
extent that the cult belief system retains any degree of respect or credibility
within an ex-member’s mind, then to that extent leaving the group will seem
like abandoning the ideals and aspirations of the group’s belief system, and
therefore a failure. On the other hand, to the extent that the cult belief
system fails to retain credibility and is eschewed, to that extent an ex-member
will tend to feel shame at their foolishness and gullibility in having once
adopted beliefs and aspired to ideals which they now regard as unrealistic. So
either they are a failure, or a gullible fool. Either way their self-esteem
takes a knock, and they may find it difficult to have confidence in their own
judgment, or in their ability to come to reasonable decisions.
In general, there appears to be little systematic
research done into the after effects of cult involvement. Although there is a
good deal of anecdotal evidence, it is difficult to quantify this or to put it
into an appropriate perspective. It is difficult to agree on the definition of
terms, and some researchers consider the testimonies of ex-cult members to be
inherently unreliable. The threat of litigation may also inhibit research in
some cases. Consequently, it is not clear whether adverse reactions to cult
involvement are relatively common and typical, or uncommon and atypical. It is
difficult to know to what extent any adverse reactions may be due to cult
involvement and to what extent they may be due to other independent factors.
Nevertheless there is a substantial body of concern
about cults and the effect that they can have upon their victims. This concern
has been expressed, for example, by the former RAF psychiatrist Dr. Gordon
Turnbull, who has debriefed released hostages:
'There are, obviously, great similarities between
hostages who have been kidnapped, who have been rescued, and cult victims who are
re-emerging into normal life. The phenomena, the features of their regaining of
control and regaining of responsibility are very similar. The symptoms that
they display are similar too. The great difficulty is, however, that the
symptoms closely resemble symptoms of major psychiatric illness.'
The difficulties which ex-members may face in
regaining a sense of control and responsibility in their lives tend to result
from their difficulties in finding a belief system they can believe in, rather
than from weakness of character. A person’s belief system includes moral codes
and codes of conduct; it includes their beliefs about right and wrong, and
about what constitutes responsible behavior. A person’s belief system
determines the way they behave. When there are two distinct and somewhat
incompatible and conflicting belief systems, the cult and the non-cult, vying
for supremacy in an ex-member’s mind, the conflict between them may cause
mental disorientation. It may cause uncertainty and confusion about right and
wrong and about what is an appropriate and responsible way to behave in a given
set of circumstances.
The two rival belief systems, the cult and the
non-cult, may coexist in an ex-member’s mind for some time, and an ex-member may
oscillate or flip-flop between the two. For a while, they may not know what to
believe, what to think, or who to trust. Using the metaphor of two selves, the
cult self and the non-cult self, to represent these two states of belief, the
psychiatrist Robert J. Lifton commented:
'The two selves [cult and non-cult] can exist
simultaneously and confusedly for a considerable time, and it may be that the
transition periods are the most intense and psychologically painful as well as
the most potentially harmful.’
This analysis argues that understanding the ‘inner
life’ of a belief system is crucial to discriminating between cults and
non-cult organizations, and that it is also a key to understanding the
particular nature of a cult. This analysis also suggests that the concept of
‘bi-polar mind-control’, with its implicit pressure to reject the flawed ‘old
self’ in favor of an ideal ‘new self’ (or cult self), can be a useful tool in
understanding the sort of mental space a cult member or former member will be
operating in. It can be helpful to understand which self, either the old self
with its old set of beliefs or the new self with its new set of cult beliefs,
is more dominant at any particular time.
If you criticize a cult member, this may just
encourage their tendency to see themselves (their old self) as flawed, and may
push them further into the cult. If you criticize their church or group, the
cult-member will go into cult-self mode and will see your criticisms as tending
to confirm the cult’s warnings about the outside world and its negative
effects. A better approach may be to acknowledge and encourage a cult member’s
old self, without criticizing or threatening the new cult self. If a cult
member feels valued in themselves, and their old self does not feel devalued,
then this weakens the cult’s attraction for them.
As former Moonie Steven Hassan writes:
‘I will never forget the simple gift of a cold drink on a hot day from a
stranger as I sold flowers on a street corner.’
Problems in Exposing Cults
Difficulties in Identifying a Cult
It is difficult for an outsider to know whether a
particular group is a cult, or may have developed cultish undercurrents.
Although there are some pointers and external indicators, it really takes an
insider’s perspective to know what goes on inside a particular group. Only
insiders can really blow the whistle on any abuses within cults.
In theory, it is possible for a cult to be a harmless
or even a beneficial organization. Mind control can be used beneficially, for
example to cure people of drug addiction, through reorienting their beliefs and
self-image away from addiction. One of the UK's leading cult experts said that
she first became interested in cults when she became aware that cults were
using techniques similar to those that were being used therapeutically within
the medical profession in order to cure people of drug addiction. Rev. Jim
Jones (of the Jonestown massacre) started off as a drugs counselor.
The problem is that abuses can occur when powerful
techniques are used in a situation without proper checks and balances. So while
it may be theoretically possible for a cult to be entirely beneficent, given
human nature and the non-accountability of cult leaders, such cults are comparatively
rare. Most cults sooner or later are revealed to have fallen prey to some
degree to their leadership's desires for adulation, money, power, or sex.
A cult will tend to deny and cover up any abuses by
its leadership, and details may only emerge years later. A cult is more or less
immune from outside investigation or regulation, because psychological coercion
in the form of brainwashing or mind control is almost impossible to prove. This
difficulty of proof stems mostly from the subjective nature of personal belief
itself, as discussed earlier, but there are some additional practical obstacles
which may face a whistleblower, someone who becomes openly critical of the cult
they were once a member of.
Difficulties Facing Critical Ex-members
In general, cults have a hierarchical or pyramid type
of structure. At the lowest level, members are part-timers who are only
partially committed to the group and who are only lightly brainwashed. All
the cult leadership really requires of this level is that members should speak
well of the group and be generally positive. Members at this level have little
power or influence, and are unlikely to be aware of the full range of the
cult's teachings, knowledge of which is restricted to a trusted inner circle of
committed, full-time members.
Members at a part-time level of commitment are less
likely to be manipulated or abused to any significant extent, because achieving
strong influence over a person really requires that they be exposed to a mind
control environment on a more full-time basis. Mind control only works on a
foundation of personal friendship and trust, and it takes time and effort to
establish this foundation. Strong mind control is partly a one on one process,
in which the controlee is assigned a personal mentor, a more senior and
experienced member, who is willing to devote the patience and effort needed to
coach the aspirant/controlee in the beliefs and practices of the group.
For this practical reason, therefore, strong mind
control is generally only applied to selected individuals who are perceived to
be not only receptive, but who also have something in particular that the group
leadership wants. Sometimes this is money or sex, or it may be some practical
or business skill which is desired by the group leadership in order to expand
the group or to raise money. The great majority of members are not specially
targeted, and are only relatively lightly brainwashed.
A person involved at a more superficial level may find
it genuinely difficult to believe what goes on in some of the more committed
levels of membership. Members who have not been specially targeted, and who
have enjoyed the warmth and friendship of the group without having been exposed
to its darker side, will tend to think well of the group, and may be puzzled by
criticisms of it. These positive and supportive members can be used as a public
relations shield, to counter any allegations against the group, and to reassure
new members. Individual critics can be simply outnumbered and their criticisms
discredited.
Even if a member involved at a less committed level is
not swayed by the general air of positivity, and does develop suspicions about
the group, they are unlikely to have enough inside information about the group
to be able to verify their suspicions, or to be in a position to effectively
warn others of potential problems. Nevertheless, the mere suspicion that a
group might be a cult can be enough to deter a person from becoming involved,
and so it can still be worth making relevant criticisms and sowing the seeds of
suspicion.
If a critic is an insider, someone who has been more
deeply involved and who has enough inside knowledge about a cult to be able to
make detailed criticisms, they will still be unable to prove anything (because
of the subjective nature of personal belief in general, and the non-falsifiable
nature of cult belief systems in particular). They will be unable to prove that
the group used deception or misrepresentation in marketing the benefits of
participation in group run courses and activities.
If an ex-member claims that they were subjected to
brainwashing or mind-control techniques, not only is this again unprovable, but
it is tantamount to admitting that they are a gullible and easily led person
whose opinions, consequently, can’t be worth much. If an ex-member suffers from
any mental disorientation or evident psychiatric symptoms, this is likely to
further diminish their credibility as a reliable informant.
Additionally, dissatisfied members or other critics
have great difficulty in disproving ad hominem arguments, such as that they
just have a personal axe to grind, that they are trying to find a scapegoat to
excuse their own failure or deficiency, or that they are simply being
subjective and emotional. Cults have a vested interest in challenging the
personal credibility of their critics, and may cultivate academic researchers
who attack the credibility and motives of ex-members.
In general, the public credibility of critical
ex-cultists seems to be somewhere in between that of Estate Agents and flying
saucer abductees.
Summary of Advantages Enjoyed by Cult Organizations
To summarize, a cult - defined as an identifiable,
organized group of people holding to an independent belief system which
primarily originates or is primarily interpreted from within the group, and
which has a hierarchical organizational structure based on that belief system -
is to a large extent immune from outside criticism, either of its belief system
or of the methods used to recruit followers, because:
1. Legal criticism is ineffectual, firstly because freedom of belief laws largely protect cults from outside investigation or regulation, and secondly because of the subjective, non-provable nature of personal belief itself.
2. Moral criticism is ineffectual, because a cult belief system can set its own self-justifying moral codes.
3. Philosophical or theological criticism is ineffectual, because a cult belief system follows its own internal logic, which is impenetrable to an outsider.
4. Empirical or scientific criticism is ineffectual, because the tenets of a cult belief system are non-falsifiable.
5. Criticism by ex-members is ineffectual, because apostates tend to lack credibility for a variety of reasons.
Immunity from outside criticism and regulation does
not, in itself, necessarily mean that a group will develop and use what might
be considered, by the standards of the mainstream, deceptive or devious
psychological techniques to gain or control adherents. It only means that they
can, and that there is little come-back if they do. Religious freedom and
freedom of belief laws tend to protect the rights of religious and
quasi-religious organizations, much more than they protect the rights of
individuals who may become involved with those organizations and their belief
systems.
Text © Mark Dunlop 2001.
Notes
1 Collins English Dictionary definition of cult:
1. a specific system
of religious worship, esp. with reference to its rites and deity.
2. a sect devoted to such a system.
3. a quasi-religious organization using devious psychological techniques to
gain and control adherents.
4. Sociol. a group having an exclusive ideology and ritual practices centered
on sacred symbols, esp. one characterized by lack of organizational structure.
5. Intense interest in and devotion to a person, idea, or activity: the cult of
yoga.
6. the person, idea, etc., arousing such devotion.
7a. something regarded as fashionable or significant by a particular group. b.
(as modifier): a cult show.
8. (modifier) of, relating to, or characteristic of a cult or cults: a cult
figure.
[from Latin cultus cultivation, refinement, from colere to till].
Comments:
The Collins 1 definition refers to the term religious
worship. Religion (meaning a particular system of faith and worship) derives
originally from the Latin religio, -onis, which meant ‘obligation, bond, and
reverence’.
The Collins 2 definition refers to the term sect. This
word comes from the Late Latin secta, which means an "organized church
body." That in turn is rooted in the Latin sequi, which means ‘to follow,’
and is used of a ‘way of life’, or a ‘class of persons’. Sect can refer to:
· a religious denomination
· a dissenting religious group, formed as the result of schism (division;
separation, from Greek skhisma -atos ‘cleft’, from skhizo ‘to split’). In this
case, the term sect also borrows from the Latin sectus, which means ‘cut’ or
‘divide’.
· a group adhering to a distinctive doctrine or leader
Theologically, sect is used of a group which has
divided from a larger body or movement - generally over minor differences in
doctrine and/or practice - but whose teachings and practices are not considered
unorthodox or cultic (theologically and/or sociologically). However, in some
countries sect is used instead of - or interchangeably with - cult.
In general, there are two main ways in which the word
cult is used: the academic and the popular. Academic usage tends to try and
make cult cover all eight (in Collins’ classification) shades of meaning, while
popular usage is more specific, and tends to equate cult with either the
Collins 3 or the Collins 7 meaning. Likewise, popular usage of the word
religion tends to be more specific, and tends to imply an established system of
faith and worship, like Christianity. It is arguable that, in the West, the
modern usage of the term religion, to imply an established system of faith and
worship (usually one which believes in a single creator Deity) dates from the
fourth century AD. In 313 Constantine the Great issued an edict of toleration
for all religions, and in 380, Theodosius I made Christianity the official
religion of the Roman Empire.
For a long time subsequently, within Western academic and scholastic
traditions, religion meant Christianity, and anything else tended to be
dismissed as a ‘native cult’ (eg. ‘the cult of Lamaism’). Over the last century
or so, the West has gained a greater understanding of foreign cultures, and
systems of faith like Islam and Buddhism have, in both popular and academic
usage, been promoted from ‘native cults’ to the status of ‘world religions’.
The faith systems of even fairly obscure and only recently discovered ethnic
tribes are nowadays mostly referred to as native ‘religions’. The term cult in
this context, while possibly technically correct, tends to be viewed as
disparaging and carrying undesirable overtones of cultural imperialism.
Religion comes from the Latin religio, -onis, which meant ‘obligation,
bond, reverence’, and did not originally necessarily imply belief in a single
creator God. If the terms ‘religion’ and ‘worship’ can legitimately be used in
a secular context (eg. ‘Football is their religion’), as a synonym for the
honor, respect, and reverence paid, in varying degrees, to various secular
ideals and personalities, the second and subsequent meanings of cult given
above can, with only a little stretching, be interpreted as referring to
specific systems of secular ‘worship’. For example, Collins gives: ‘7a.
something regarded as fashionable or significant by a particular group. b. (as
modifier): a cult show.’
An example in this context might be J.D. Salinger's 'Catcher in the Rye'.
Existentialism can be related both to later philosophical attitudes,
such as the alienated vision of Generation X, and to earlier cultural and
artistic movements like Surrealism and Dada, which had in turn partly arisen
out of a desire to make sense of the experiences of the first world war. The
term Existentialism, much like the terms religion and cult, can acquire
different shades of meaning, depending on whether the term is being used in a
popular, academic, literary, artistic, or political context. The various
flavors of Existentialism coexist with assorted flavors of other current
philosophies, like post-modernism, structuralism and situationism. None are
easy to define.
Existentialism and its cousins can be regarded as sub-sets of the
mainstream Western secular ideal of democracy, the ideal of the responsible
individual exercising informed freedom of choice. While it is probably
stretching the language a little to say that mainstream Western secular society
actually ‘worships’ this ideal, nevertheless the ideal is granted considerable
honor, respect, and even reverence. (For example, from the American Declaration
of Independence, 1776: ‘…We hold these truths to be self-evident, that all men
are created equal, that they are endowed by their Creator with certain
unalienable Rights, that among these are Life, Liberty and the pursuit of
Happiness. That to secure these rights, Governments are instituted among Men,
deriving their just powers from the consent of the governed…’ or, for example,
J.S. Mill, On Liberty.)
This ideal, of the responsible individual, was born out of the
secular-humanist concept of the individual personality, or self. This self or
individual personality had, in turn, been the replacement (radical at the time)
for the ‘soul’ of the Christian ideology that had previously dominated Western culture.
Both culture and personal belief are complex matters, and this
complexity is reflected in the difficulty of precisely defining terms such as
cult, sect, religion, soul, self, liberty and responsibility. These terms refer
to processes and behaviors as much as to finite states. They are all ‘big’ words
which are difficult to define neatly or precisely, and whose meaning may
include various emotional associations or contextual nuances which may change
over time. Language is a fluid medium.
The intention of the above comments on the definition of ‘cult’ and
related terms is neither to set out a precise terminology, nor to excuse sloppy
terminology, but somewhere in between. This analysis acknowledges the range of
meanings that the term cult may be expected to carry, and has no wish to
exclude any nuances of meaning. The main text introduces the notion of the
‘quasi-religious spectrum’ (page 6) as a means of allowing an appropriate
flexibility or ‘fuzziness’ in the definitions of the terms cult, sect, and
religion.
2 The Concise Oxford Dictionary defines brainwash as: to ‘subject
[a person] to a prolonged process by which ideas other than and at variance
with those already held are implanted in the mind’. The term 'brainwashing'
was first used in 1953 to describe techniques used by the Chinese Communists to
subvert the loyalty of American prisoners captured in Korea. Brainwashing in
this original sense involved physical coercion: imprisonment, food and sleep
deprivation, and sometimes torture. In recent years, various people concerned
about cults have tended to use terms like 'mind control' or ‘thought reform’ to
describe a brainwashing or indoctrination process which does not involve
physical coercion. This kind of non-coercive process has the great advantage
(from a cult's point of view) of not leaving any physical evidence, and of
therefore being very difficult to prove. Whilst there is evidence that some
cults have used physical coercion, in general cults are keen to distance themselves
from such practices. There has been a kind of Darwinian evolution among cults,
in that those which have survived and prospered have tended to be those which
have succeeded in developing effective, but non-physically coercive, processes
to ‘brainwash’ a person.
3 Philip G. Zimbardo, PhD, professor of psychology at Stanford University, wrote:
‘A remarkable thing about cult mind control is that it’s so ordinary in
the tactics and strategies of social influence employed. They are variants of
well-known social psychological principles of compliance, conformity,
persuasion, dissonance, reactance, framing, emotional manipulation, and others
that are used on all of us daily to entice us: to buy, to try, to donate, to
vote, to join, to change, to believe, to love, to hate the enemy.’
There are a number of
interesting books on the subjects of marketing and persuasion, eg. 'Influence,
the Psychology of Persuasion', by Robert Cialdini, and work by Wolff Olins.
4 see: ‘Possible Legal Protection against Cults using Mind Control -
Allen v Flood, AC 1898, p 72 -74’
5 Robert J. Lifton’s
eight criteria of mind control:
http://www.freedomofmind.com/srlifton.htm
Adapted from Robert Jay Lifton’s Thought Reform and the Psychology of Totalism
(Norton, 1961, now reprinted by the University of North Carolina Press)
Dr. Lifton’s work was the outgrowth of his studies for military
intelligence of Mao Tse-Tung’s "thought-reform programs" commonly
known as "brainwashing." In Chapter 22, Lifton outlines eight
criteria which can be used as indicators when investigating whether an
environment can be understood as exercising "thought-reform" or mind
control. Lifton wrote that any group has some aspects of these indicators.
However, if an environment exhibits all eight of these indicators and
implements them in the extreme, then there is the possibility of unhealthy
thought reform taking place.
Milieu Control
Environment control and the control of human communication. Not just
communication between people but communication within people’s minds to
themselves.
Mystical Manipulation
Everyone is manipulating everyone, under the belief that it advances the
"ultimate purpose." Experiences are engineered to appear to be
spontaneous, when, in fact, they are contrived to have a deliberate effect.
People misattribute their experiences to spiritual causes when, in fact, they
are concocted by human beings.
Loading the Language
Controlling words help to control people’s thoughts. A totalist group uses
totalist language to make reality compressed into black or white -
"thought-terminating clichés." Non-members cannot simply understand
what believers are talking about. The words constrict rather than expand human
understanding.
Doctrine Over Person
No matter what a person experiences, it is the belief of the dogma which is
important. Group belief supersedes conscience and integrity.
Sacred Science
The group’s belief is that their dogma is absolutely scientific and morally
true. No alternative viewpoint is allowed. No questions of the dogma are
permitted.
The Cult of Confession
The environment demands that personal boundaries are destroyed and that every
thought, feeling, or action that does not conform with the group’s rules be
confessed; little or no privacy.
The Demand for Purity
The creation of a guilt and shame milieu by holding up standards of perfection
that no human being can accomplish. People are punished and learn to punish
themselves for not living up to the group’s ideals.
The Dispensing of Existence
The group decides who has a right to exist and who does not. There is no other
legitimate alternative to the group. In political regimes, this permits state
executions.
It could
be argued that all eight of Lifton’s criteria (for example, milieu control or
information control) are applicable to society at large, and can be observed in
operation within various groups, both cult and non-cult. It could equally be
argued that all eight of Lifton’s criteria in fact primarily reflect the nature
and interior dynamics of a hierarchical belief system, one which includes
beliefs about higher and lower levels of personal awareness and understanding,
and ideas about rejecting the old self and developing a new self. Lifton’s
criteria may be more illuminating about cults when the criteria are interpreted
as descriptions of the interior world or self-view of a person who believes in
such a hierarchical, cult-type belief system. In this perspective, Lifton’s
Demand for Purity could be broadly interpreted as the desire of a believer for
the purification of their old self and the creation of a pure new self. The
term ‘ego-dystonic mind control’ effectively means the same thing.
Lifton also wrote the following about ‘The demand for
purity’ in the essay ‘Cults: Religious Totalism and Civil Liberties’ (included
in the book ‘The Future of Immortality and Other Essays for a Nuclear Age’, by
Robert J. Lifton):
‘The demand for purity can create a Manichean quality
in cults, as in some other religious and political groups. Such a demand calls
for radical separation of pure and impure, of good and evil, within an
environment and within oneself. Absolute purification is a continuing process.
It is often institutionalized; and, as a source of stimulation of guilt and
shame, it ties in with the confession process. Ideological movements, at
whatever level of intensity, take hold of an individual's guilt and shame
mechanisms to achieve intense influence over the changes he or she undergoes.
This is done within a confession process that has its own structure. Sessions
in which one confesses to one's sins are accompanied by patterns of criticism
and self-criticism, generally transpiring within small groups and with an
active and dynamic thrust toward personal change.‘
6 ‘Combatting Cult Mind Control’, Steven Hassan, pub. Aquarian Press, 1988.
http://www.freedomofmind.com
7
8 A utopian vision of an ideal ‘new world’ or ‘new
self’ does imply an opposite pole, a dystopian or dystonic vision of a dysfunctional
old world or old self. Applying these terms to the inner world of a person’s
ego, to their subjective sense of themselves as an individual, rather than to
the world at large, ‘ego-utopia’ could be said to be a tendency towards an
unreasonably high level of self-esteem or hubris (overweening personal pride
and arrogance), with ‘ego-dystonia’ as its opposite, a tendency towards shame
and guilt and an unreasonably low level of self-esteem.
It might seem more logical to adopt the term
ego-dystopic rather than ego-dystonic. The word dystopia is used, for example
in ‘a dystopian, nightmarish vision of society’, but there is also the
established term ‘ego-dystonic sexual orientation’, and it seems more sensible
to be consistent with the latter usage.
The term ego-dystonia is used by the American
Psychiatric Association (APA) in their Diagnostic and Statistical Manual of
Mental Disorders. For example, when homosexuality was declassified as a
‘disorder’ in 1973, a diagnosis was left as a residual of the former; that is,
the diagnosis of ‘Ego-dystonic Homosexuality.’ Ego-dystonic here means that the
person's (homo) sexual orientation is not compatible with where they think they
‘should’ be to such an extent that it is causing ‘marked decline in social functioning,’
or ‘dysfunctionality’. While it may result in shame, the suffering does not
come from being gay in and of itself, but rather from the fear and
self-loathing learned within a hostile family and/or social system which
defines gay people as second class citizens. Ego-dystonic sexual orientation is
also recognized in the International Classification of Diseases, where it is
classified as an adjustment disorder (F66.1, ICD 10).
Ego-dystonia, or a diminished sense of self worth
compared to a peer group, is not necessarily confined to the area of sexual
orientation. Ego-dystonia can result from exposure to a variety of situations
encountered within society at large. It can result from experiences of sexual
or racial stereotyping, or as a result of bullying at school or in the
workplace, or sometimes as a result of social deprivation or a difficult family
background. Of course, individual experiences do vary; some people seem more
robust and thick-skinned than others, and better able to withstand a difficult
environment. It is rarely possible to make any direct causal or predictive link
between the environment and the personal psychology of a given individual,
because similar situations may affect different individuals in different ways.
One person may react to a feeling of social exclusion, for example, by becoming
hostile and blaming others, perhaps developing ideas of revenge, while another
may interiorize their reaction and, tending to blame themselves, may develop
some form of ego-dystonic depression, and may possibly turn to self-destructive
activities.
Personal self-esteem is a fluid and subjective factor,
and it is never possible to measure self-esteem objectively. It is often
difficult to assess another person’s level of self-esteem, or to place it on a
scale between excessive self-esteem (hubris) and inadequate self-esteem
(ego-dystonia). Nevertheless, individuals clearly do experience various and
varying levels of self-esteem, and it seems reasonable to suggest that
inappropriate levels of self-esteem may tend to result in inappropriate
behavior. This analysis puts forward the hypothesis that cults can indirectly
control behavior, by inducing and then exploiting an inappropriately low level
of personal self-esteem in the minds of their followers. Once some degree of
ego-dystonic guilt has been established in a person’s mind, then the behavior
of that person can to some extent be manipulated, by that person’s peer group
granting or withholding emotional approval and support.
Ego-dystonia is not to be confused with the medical
condition dystonia, which is an illness characterized by involuntary spasms and
muscle contractions that induce abnormal movements and postures. This is
obviously different from the psychological condition of ego-dystonic sexual
orientation. There are other instances where the same term is used to denote
two quite different and distinct conditions; for example, the term
‘hypertension’ may be used either to describe a particular measurable level of
high blood pressure, or it may be used to describe a psychological state which
cannot be measured by material indicators.
9 Alan Gomes, Unmasking the Cults, Zondervan 1995
10 Panel discussion with Dr David Jenkins et al,
broadcast in July 1999 as part of the ITV series on the history of
Christianity, ‘2000 Years’, hosted by Melvyn Bragg.
11 Lawsuit filed on 16 August 1999 in Baltimore,
Maryland, USA, by a coalition of plaintiffs, including the Seventh-day
Adventists and the ‘International Coalition for Religious Freedom’ (funded primarily
by the Unification Church or Moonies)
12 Re: justification of lies and deception. From a conversation posted on the
Internet newsgroup alt.support.ex-cult:
Poster 1
>>Lying implies some kind of malicious intention <<
Poster 2
'Lying does not by definition imply malicious intent - you can lie out of
politeness, out of pity, because you think the truth would be bad for that
person, etc. etc. etc.
'Outright lying means to tell people things that you
do know are not true - Scientologists do that at times, when they are honestly
convinced that this is better for Scientology.
'But there are finer variations where one can convince
oneself, that one is not really lying: You can tell only a small part of the
truth, you can tell the truth in a way that the other person does understand it
differently as it is, you can consciously omit important facts, you can
formulate things generally and unspecific - all with the intent that the other
person judges the situation according to your wishes while you objectively do
not give the person the information necessary to judge the situation
impartially.
'Sure you can say that you did not lie - but you did
not tell what is, in your conviction, the full truth.'
Poster 1
>>Because scientologists believe in their tech does not make them liars.
For the most part, scientologists are honest and well-meaning. Just like most
critics. <<
Poster 2
'The point is not, that scientologists believe in their tech - they have the
right to that.
'Also Scientologists themselves do try their best to
act ethical (as they define it) and they are sincere in that.
'But Scientologists do have their own definition about
ethics which does not fully correspond with the general understanding about
ethics.
'And also Scientologists do have their own
understanding about reality, about what to tell other people as truth about
Scientology - again their view of these things is in conflict with the general
understanding.
'This does not only concern staff - also public
Scientologists are formally and informally told how to best explain Scientology
to others, what to mention, how to mention it, what not to mention - all with the
best intent, but the result is still, that you cannot, as an outsider, get
fully informed about Scientology by a Scientologist. I do know that, because I
did it myself and I taught it myself for years.
'Hardly any Scientologist lies consciously to you -
but he tells you only a truth he thinks acceptable to you (and this might be so
small a part of truth that it results in disinformation, not information)'
Poster 1
>>Maybe this is true in any religion. If there is really no heaven, have the
priests all lied to us? <<
Poster 2
'A voodoo adherent who firmly believes in voodoo, does not lie to you, when he
tells you about voodoo, if he is sincere in telling you what he believes.
'A Moslem who believes that non-Moslems will go to
hell, does not lie, if he tells what he sincerely believes - no matter, if the
Moslem hell factually exists or not.
'A Protestant priest who does not believe that Jesus
rose from the dead and still preaches he sure did is lying - he tells something
as truth which he does not believe in.
'A Scientologist who says Scientology is a religion
and privately thinks it is a technology for self-betterment does lie to you -
what he is saying is not what he believes. A Scientologist who believes
Scientology is a religion and says it is a philosophy, because as a religion it
would not be acceptable in some countries, is also lying.
'One of the problems of Scientology is, that people
are actually taught to tell outsiders not what insiders see as the truth - and
that Scientologists feel it is ethical to do that. This sort of re-education
about what is ethical or not does lead to conflicts with non-Scientologists.'
13 Jayamati, in Shabda, August 1998, p.59. Also:
Ratnavira, in Shabda, April 1998. ‘There is … the question of public image and
reality. There is a public image of the Order which is presented through our
publications … and there is what the Order really is … the two are quite
different.’
14 Re: Organizations and their belief systems. The
first three definitions of cult given in Collins Dictionary all imply an
element of organization, while Collins’ fourth and subsequent definitions of
cult refer to phenomena which are characterized by a relative lack of
organization: ‘4. Sociol. a group having an exclusive ideology and ritual
practices centered on sacred symbols, esp. one characterized by lack of
organizational structure.’
Cults in the sense of fads and fashions (as in: 7a.
something regarded as fashionable or significant by a particular group. b. (as
modifier): a cult show.) tend to be relatively disorganized phenomena; there
may be a belief system, or a set of attitudes, but there tends to be relatively
little desire to set up an organization to promote those beliefs and attitudes.
Again, there is something of a spectrum among ‘fads and fashions’, ranging from
the anarchic to the slightly organized.
15 Paul Ricoeur, 'Freud and Philosophy: An Essay on Interpretation'.
16 In Greek mythology, Ariadne was the daughter of Minos, King of Crete. When
Theseus came from Athens to slay the Minotaur, Ariadne gave him a ball of
thread, which he unwound as he entered the Labyrinth and followed back to find
his way out.
17 Sangharakshita, 'Going for Refuge', T.V. programme,
BBC East 12.11.92.
18 Sangharakshita, 'Mind - Reactive and Creative',
page 8, pub. FWBO 1971.
19 Alaya (FWBO Order member), telephone conversation
28.3.1994
20 ‘The Sabbath was made for man, and not man for the Sabbath.’ - Mark chapter
2, verse 27 (the incident in the cornfields).
There may be equivalent ‘believer centered’ statements
in other religions. In Buddhism for example, there is the parable of the raft
(Digha-Nikaya, ii, 89; Majjhima-Nikaya, i, 134), in which the raft,
representing the beliefs and practices of Buddhism, is to be cast aside once
the further shore (enlightenment) is reached.
21 J. L. Segundo, 'Evolution and Guilt'.
It should be pointed out that Segundo was not writing
specifically about cults as such, but from the perspective of ‘Liberation
Theology’, which is concerned with the hermeneutics of issues such as land
ownership and education.
22 ‘Falsifiability’ was first put forward as a criterion
for testable scientific truth by Sir Karl Popper, Professor of Logic and
Scientific Method at the London School of Economics from 1949 - 1969.
A statement of the form 'All crows are black' is a
falsifiable statement, because one properly authenticated observation of a
white (albino) crow is sufficient to show that the statement is false, despite
any number of observations of black crows. In other words, the statement is
capable of being disproved through empirical evidence.
Statements of the form: 'Spiritual life begins when
one realizes that one is not as aware as one could be' or 'We always have to be
aware that our . . . um . . . what we think, is not true, until enlightenment’
are non-falsifiable statements. While any number of people within a group may
observe (or say that they believe) that they have become more aware following
the group's spiritual guidance, a person who questions this or who observes (or
believes) that they themselves have not become more aware following the group's
spiritual guidance, cannot establish this either as a valid observation, or as
a reasonable belief. This is because it can easily be argued that this
'negative' observation results from that person's own deficiencies of spiritual
awareness and not from any deficiencies in the group's spiritual guidance, or
in the truth of the group’s doctrines.
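The contrast can be sketched in simple first-order notation. This is an
illustrative addition rather than Popper's own formulation, and the predicate
names (Crow, Black, Guided, Aware, Unreliable) are arbitrary labels:

\[ S_1:\ \forall x\,(\mathrm{Crow}(x) \rightarrow \mathrm{Black}(x)) \qquad
   O_1:\ \mathrm{Crow}(a) \wedge \neg\mathrm{Black}(a) \qquad O_1 \vdash \neg S_1 \]
\[ S_2:\ \forall x\,(\mathrm{Guided}(x) \rightarrow \mathrm{Aware}(x)) \qquad
   O_2:\ \mathrm{Guided}(a) \wedge \neg\mathrm{Aware}(a) \]
\[ \text{auxiliary clause:}\quad \neg\mathrm{Aware}(a) \rightarrow \mathrm{Unreliable}(a) \]

A single authenticated instance of O1 refutes S1, so S1 is falsifiable. A
report of O2 would likewise refute S2, but the auxiliary clause reclassifies
anyone making such a report as an unreliable judge of their own awareness, so
O2 is never admitted as evidence. No observation is allowed to count against
S2, which is what makes it non-falsifiable.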
23 Re: Critics disenfranchised and the non-democratic
nature of cults:
'There's no democracy in the Western Buddhist Order!
.... It's a hierarchy, but a spiritual one.... It is the broad feeling that
there is in someone, or in certain people, something higher and better than
yourself to which you can look up.... It's a good, positive thing to be able to
look up to someone! If you can't, you're in a pretty difficult position. You're
in a sad state.... like a child that hasn't even got a mother and father to
look up to....But this sort of assertion, that you're just as good as anybody
else in the egalitarian sense, is really sick.'
From 'The Endlessly Fascinating Cry.' A seminar by
FWBO leader Sangharakshita on the Bodhicaryavatara, transcribed and published
FWBO, 1977, p.74-5.
'Spiritual hierarchy' appears to be a variety of
'doctrine over person' and 'dispensing of existence' described by Robert J.
Lifton in his 'Eight Criteria of Mind Control'
24 The ‘card
sharp’ simile for a cult is courtesy of Verdex, an ex-FWBO member from Germany,
who maintains the Internet site, www.fwbo-files.com Verdex also likens the FWBO
to a black hole; the attraction increases as a person moves closer to the
group, and there is an event horizon, beyond which communication with the
outside universe is lost.
25 Newsnight report on cults (BBC 2 16th July 1993)
26 From the essay
‘Cults: Religious Totalism and Civil Liberties’, included in the book ‘The
Future of Immortality and Other Essays for a Nuclear Age’, by Robert J. Lifton.
27 Hassan, Steve, 'Releasing The Bonds: Empowering
People To Think For Themselves'.
28 For details of numerous allegations of abuse within
cults, follow relevant links (eg ‘ex-member stories’) on the following
websites:
www.fwbo-files.com
www.csj.org/aff/affindex.htm
www.cultinformation.org.uk
www.catalyst-uk.freeserve.co.uk
www.freedomofmind.com
www.rickross.com
www.ex-cult.org
www.apologeticsindex.org
www.trancenet.org
29 Academic researchers who attack the credibility and
motives of ex-members. Eg:
Dr J. Gordon Melton testified as an expert witness in
a lawsuit, that:
‘When you are investigating groups such as this [The
Local Church], you never rely upon the unverified testimony of ex-members….To
put it bluntly; hostile ex-members invariably shade the truth. They invariably
blow out of proportion minor incidents and turn them into major incidents, and
over a period of time their testimony almost always changes because each time
they tell it they get the feedback of acceptance or rejection from those to
whom they tell it, and hence it will be developed and merged into a different
world view that they are adopting.’
From the expert testimony of Dr J. Gordon Melton in
Lee vs. Duddy et al, a lawsuit involving the Local Church and the Spiritual
Counterfeits Project. Quoted from http://www.hightruth.com/experts/melton.html
And Bryan Wilson, Emeritus Professor at Oxford University, wrote:
‘Informants who are mere contacts and who have no
personal motives for what they tell are to be preferred to those who, for their
own purposes, seek to use the investigator. The disaffected and the apostate
are in particular informants whose evidence has to be used with circumspection.
The apostate is generally in need of self-justification. He seeks to
reconstruct his own past, to excuse his former affiliations, and to blame those
who were formerly his closest associates. Not uncommonly the apostate learns to
rehearse an 'atrocity story' to explain how, by manipulation, trickery,
coercion, or deceit, he was induced to join or to remain within an organization
that he now forswears and condemns. Apostates, sensationalized by the press,
have sometimes sought to make a profit from accounts of their experiences in stories
sold to newspapers or produced as books (sometimes written by 'ghost'
writers).’
Bryan
Wilson, The Social Dimensions of Sectarianism, Oxford: Clarendon Press, 1990,
p.19.
http://www.neuereligion.de/ENG/Wilson/index.htm
Further examples of these sorts of ad hominem
arguments may be found at the cult-apologists FAQ at: http://www.snafu.de/~tilman/faq-you/cult.apologists.txt