Propaganda Techniques

Recognizing Propaganda Techniques and Errors of Faulty Logic

Propaganda Techniques

What are Propaganda Techniques? They are the methods and approaches used to spread ideas that further a cause - a political, commercial, religious, or civil cause.
Why are they used?
To manipulate readers' or viewers' reason and emotions; to persuade you to believe in something or someone, buy an item, or vote a certain way regardless of the truth.  Beware that these methods also work for liars; God is the judge of liars.  These methods do not work for Christians who have gifts of the Holy Spirit and acknowledge the Lord in all their ways, however obvious that advice may sound.  Be wise as serpents, yet as innocent as doves.
What are the most commonly used propaganda techniques?
See which of the most common types of propaganda techniques you already know.

Types:

Black Propaganda:  This technique consists of pretending to be a member of the opposition and acting stupid, bigoted, evil, violent, and generally repugnant.  There have even been people who manipulated their way into positions of prominence in opposition movements in order to make those movements look bad.  This is often done in liberal movies.

Name calling: This technique consists of attaching a negative label to a person or a thing. People engage in this type of behavior when they are trying to avoid supporting their own opinion with facts. Rather than explain what they believe in, they prefer to try to tear their opponent down.

Glittering Generalities: This technique uses important-sounding "glad words" that have little or no real meaning. These words are used in general statements that cannot be proved or disproved. Words like "good," "honest," "fair," and "change" are examples of "glad" words.

Transfer: In this technique, an attempt is made to transfer the prestige of a positive symbol to a person or an idea. For example, using the American flag as a backdrop for a political event implies that the event is patriotic and in the best interest of the U.S.  President Obama refused to say the pledge of allegiance, but he always wanted an American flag as a background.

False Analogy: In this technique, two things that may or may not really be similar are portrayed as being similar. When examining the comparison, you must ask yourself how similar the items are. In most false analogies, there is simply not enough evidence available to support the comparison.

Testimonial: This technique is easy to understand. It is when "big name" personalities are used to endorse a product. Whenever you see someone famous endorsing a product, ask yourself how much that person knows about the product, and what he or she stands to gain by promoting it.  Actors who play doctors are often sought for medical testimonials.

Plain Folks: This technique uses a folksy approach to convince us to support someone or something. These ads depict people with ordinary looks doing ordinary activities.  A related tactic is paying shills to go onto internet forums anonymously and promote or attack ideas according to a party's agenda.

Card Stacking: This term comes from stacking a deck of cards in your favor. Card stacking is used to slant a message. Key words or unfavorable statistics may be omitted in an ad or commercial, leading to a series of half-truths. Keep in mind that an advertiser is under no obligation "to give the truth, the whole truth, and nothing but the truth."  We see this with the Global Warming movement.

Bandwagon: The "bandwagon" approach encourages you to think that because everyone else is doing something, you should do it too, or you'll be left out. The technique embodies a "keeping up with the Joneses" philosophy.

Either/or fallacy: This technique is also called "black-and-white thinking" because only two choices are given. You are either for something or against it; there is no middle ground, no shades of gray, and no alternatives. It is used to polarize issues and to negate all attempts to find common ground.

Faulty Cause and Effect: This technique suggests that because B follows A, A must cause B. Remember, just because two events or two sets of data are related does not necessarily mean that one caused the other to happen. It is important to evaluate data carefully before jumping to a wrong conclusion.
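
To illustrate the point above, here is a minimal sketch in Python using made-up numbers (the temperatures, sales figures, and accident rates are all hypothetical): two quantities rise and fall together because a third factor drives both, not because one causes the other.

```python
# Hypothetical monthly data: hot weather drives both ice-cream sales and
# swimming-related accidents. The two series track each other closely,
# yet neither causes the other.
temperature = [10, 14, 19, 25, 30, 28, 21, 15]   # degrees C (assumed values)
ice_cream_sales = [t * 12 for t in temperature]  # both series are functions
swim_accidents = [t // 5 for t in temperature]   # of temperature alone

for month, (sales, accidents) in enumerate(zip(ice_cream_sales, swim_accidents), 1):
    print(f"month {month}: sales={sales}, accidents={accidents}")
# Sales and accidents move together (B follows A), but banning ice cream
# would not prevent a single accident; the hidden cause is the weather.
```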

Social proof is the tendency to believe what most people believe. If an advocate creates the impression that "everyone knows" that someone is lying and covering up facts, there is a subtle implication that those who disagree are somehow flawed and lacking in credibility. Identifying a few people who believe a proposition, and encouraging them to go public (especially repeatedly) creates the impression that lots of people are experiencing something real. Repeated affirmations create the impression that the assertion is true.

Appeals to authority add weight to these persuasions. If one or more of the people affirming a belief is perceived as authoritative, e.g., a physician or a political leader, more people will be persuaded. It may matter little that the expert is the only one in the universe with that opinion, if he or she is the only one whose opinions we hear. Sometimes politicians are persuaded to join in unfounded but politically advantageous rhetoric. If we like the source of an opinion, we are more likely to believe. So if a popular actor, media figure, politician, or local hero joins the process, more people will endorse the perceived reality.

Vivid examples -- especially dramatic case histories -- often influence judgments more than dull but more accurate quantitative examples. For example, inviting the single child with a birth defect to the town hall meeting may overwhelm the fact that there are fewer birth defects in the neighborhood than in most similar residential areas.

Confusion techniques can create perceptions of toxicity, injury, or disease. For example, illogical but eloquent rhetoric delivered with an air of certainty can create such perceptions if a few clear alarming phrases are woven into the message. If the release of something harmless to humans is announced along with discussions of studies indicating cancer, birth defects, or brain damage in animals, concern or alarm may ensue. A classic technique is to pose an alarming question as the headline of a speech, article, or broadcast, e.g., "Are your children in danger?" We commonly hear announcements that "bad chemicals" or "known carcinogens" are out there, without objective data to clarify whether the type, amount, and location of the substance could actually hurt anyone. When someone questions the plausibility of the alleged toxic exposures, advocates may self-righteously respond that reasonable people have a right to worry, as though people who try to alleviate unnecessary worry are violating the rights of others.

Confusion of inverse probabilities is another classic form of invalid interpretation of facts that arouses unnecessary alarm. For example, suppose an announcement of a release of a toxic chemical is accompanied by news that the chemical can cause upper respiratory symptoms, aches and pains, or other common symptoms. Some people with these symptoms will conclude that the chemical was responsible. And this could be true. However, it may also be true that only 10% of persons exposed develop such symptoms, and only 1% of the population was exposed, so that the probability that a particular person has been poisoned is one in a thousand. These important details can be overlooked in the hue and cry following a dramatic toxic spill.
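
A short worked calculation may make the arithmetic concrete. The sketch below (Python) uses the figures from the example above, plus an assumed 20% background rate of these common symptoms in the general population, to show how small the probability of actual poisoning is for any given person, and smaller still as a fraction of everyone who happens to have symptoms.

```python
# Figures from the example above; the background symptom rate is an assumption
# added only for illustration.
p_exposed = 0.01             # 1% of the population was exposed
p_sym_given_exposed = 0.10   # 10% of exposed people develop symptoms
p_sym_background = 0.20      # assumed: 20% of everyone has these common symptoms anyway

# Chance that a randomly chosen person was both exposed and made symptomatic:
p_poisoned = p_exposed * p_sym_given_exposed
print(f"P(poisoned) = {p_poisoned:.3f}")  # 0.001, i.e. one in a thousand

# The inverse probability people actually care about: given that someone has
# these symptoms, how likely is it that the chemical caused them? (Bayes' rule,
# treating background and chemical-caused symptoms as roughly additive.)
p_symptoms = p_sym_background + p_poisoned
p_poisoned_given_symptoms = p_poisoned / p_symptoms
print(f"P(poisoned | symptoms) = {p_poisoned_given_symptoms:.3f}")  # about 0.005
```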

Social Disapproval: This is a technique by which the propagandist marshals group acceptance and suggests that attitudes or actions contrary to the one outlined will result in social rejection, disapproval, or outright ostracism. The latter, ostracism, is a control practice widely used within peer groups and traditional societies.  Heaping negative and shaming accusations on someone can emotionally wear them out even if they are right; this can be countered by having a fellowship group to support them.  Sometimes name-calling is used to vilify the opposing view.

Errors of Faulty Logic

Contradiction:

Information is presented that is in direct opposition to other information within the same argument.

Example: If someone stated that schools were overstaffed, then later argued for the necessity of more counselors, that person would be guilty of contradiction.

Accident:

Someone fails to recognize (or conceals the fact) that an argument is based on an exception to the rule.

Example: By using selected scholar-athletes as the norm, one could argue that larger sports programs in schools were vital to improving the academic performance of all students.

False Cause:

A temporal order of events is confused with causality; or, someone oversimplifies a complex causal network.

Example: Stating that poor performance in schools is caused by poverty. Poverty certainly contributes to poor academic performance, but it is not the only factor.

Begging the Question:

A person makes a claim then argues for it by advancing grounds whose meaning is simply equivalent to that of the original claim. This is also called "circular reasoning."

Example: Someone argues that schools should continue to have textbooks read from cover to cover because, otherwise, students would not be well-educated. When asked to define what "well-educated" means, the person says, "knowing what is in the textbooks."

Evading the Issue:

Someone sidesteps an issue by changing the topic.

Example: When asked to say whether or not the presence of homosexuals in the army could be a disruptive force, a speaker presents examples of homosexuals winning combat medals for bravery.

Arguing from Ignorance:

Someone argues that a claim is justified simply because its opposite cannot be proven.

Example: A person argues that voucher programs will not harm schools, since no one has ever proven that vouchers have harmed schools.

Composition and Division:

Composition involves asserting that what is true of the parts must also be true of the whole. Division is the opposite: asserting that what is true of the whole must be true of each of its parts.

Example: When a school system holds up its above-average scores and claims that its students are superior, it is committing the fallacy of division. Overall scores may be higher, but that does not prove all students are performing at that level. Likewise, when the military points to the promiscuous behavior of some homosexuals, it is committing the fallacy of composition: the behavior of some cannot serve as proof of the behavior of all homosexuals.
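
A small numeric sketch (with invented test scores and an assumed benchmark of 70) shows why the division fallacy fails: a school system's average can clear a benchmark even though many individual students fall below it.

```python
# Made-up test scores for one school system; the benchmark is assumed to be 70.
scores = [95, 92, 90, 88, 60, 58, 55, 52]
benchmark = 70

average = sum(scores) / len(scores)
below = [s for s in scores if s < benchmark]

print(f"district average: {average:.1f}")                               # 73.8, above the benchmark
print(f"students below the benchmark: {len(below)} of {len(scores)}")   # 4 of 8
# The whole (the average) clears the benchmark, but it does not follow that
# every part (every student) does; that inference is the fallacy of division.
```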

Errors of Attack

Poisoning the Well:

A person is so committed to a position that he/she explains away absolutely everything others offer in opposition.

Example: Almost every proponent and opponent of the ban on gays in the military commits this error.

Ad Hominem:

A person rejects a claim on the basis of derogatory facts (real or alleged) about the person making the claim.

Example: Someone rejects President Clinton's reasons for lifting the ban on gays in the military because of Mr. Clinton's draft record.

Appealing to Force:

Someone uses threats to establish the validity of the claim.

Example: Opponents of year-round school threaten to keep their children out of school during the summer months.

Errors of Weak Reference

Appeal to Authority:

Authority is invoked as the last word on an issue.

Example: Someone uses the Bible as the basis for his arguments against specific school reform issues.

Appeal to the People:

Someone attempts to justify a claim on the basis of popularity.

Example: Opponents of year-round school claim that students would hate it.

Appeal to Emotion:

An emotion-laden "sob" story is used as proof for a claim.

Example: A politician uses a sad story of a child being killed in a drive-by shooting to gain support for a year-round school measure.