Recognizing Propaganda Techniques and Errors of Faulty Logic
What are Propaganda Techniques?
They are the methods and approaches used to spread ideas that further a cause -
a political, commercial, religious, or civil cause.
Black Propaganda: This technique consists of pretending to be a member of the opposition while acting stupid, bigoted, evil, violent, and generally repugnant. Some people have even manipulated their way into positions of prominence in opposition movements so that they could make those movements look bad. This is often done in liberal movies.
Name calling: This technique consists of attaching a negative label to a person or a thing. People engage in this type of behavior when they are trying to avoid supporting their own opinion with facts. Rather than explain what they believe in, they prefer to try to tear their opponent down.
Glittering Generalities: This technique uses important-sounding "glad words" that have little or no real meaning. These words are used in general statements that cannot be proved or disproved. Words like "good," "honest," "fair," and "change" are examples of "glad" words.
Transfer: In this technique, an attempt is made to transfer the prestige of a positive symbol to a person or an idea. For example, using the American flag as a backdrop for a political event implies that the event is patriotic and in the best interests of the U.S.
False Analogy: In this technique, two things that may or may not really be similar are portrayed as being similar. When examining the comparison, you must ask yourself how similar the items are. In most false analogies, there is simply not enough evidence available to support the comparison.
Testimonial: This technique is easy to understand: "big name" personalities are used to endorse a product. Whenever you see someone famous endorsing a product, ask yourself how much that person knows about the product and what he or she stands to gain by promoting it. Even actors who merely play doctors are sought for medical testimonials.
Plain Folks: This technique uses a folksy approach to convince us to support someone or something. These ads depict people with ordinary looks doing ordinary activities. A related tactic is paying shills to go onto internet forums anonymously and promote or attack ideas according to a party's agenda.
Card Stacking: This term comes from stacking a deck of cards in your favor. Card stacking is used to slant a message. Key words or unfavorable statistics may be omitted in an ad or commercial, leading to a series of half-truths. Keep in mind that an advertiser is under no obligation "to give the truth, the whole truth, and nothing but the truth." We see this with the Global Warming movement.
Bandwagon: The "bandwagon" approach encourages you to think that because everyone else is doing something, you should do it too, or you'll be left out. The technique embodies a "keeping up with the Joneses" philosophy.
Either/or fallacy: This technique is also called "black-and-white thinking" because only two choices are given. You are either for something or against it; there is no middle ground, shades of gray, or alternatives. It is used to polarize issues, negating all attempts to find common ground.
Faulty Cause and Effect: This technique (also known as post hoc ergo propter hoc) suggests that because B follows A, A must have caused B. Remember, just because two events or two sets of data are related does not necessarily mean that one caused the other. It is important to evaluate data carefully before jumping to a wrong conclusion.
Social proof is the tendency to believe what most people believe. If an advocate creates the impression that "everyone knows" that someone is lying and covering up facts, there is a subtle implication that those who disagree are somehow flawed and lacking in credibility. Identifying a few people who believe a proposition, and encouraging them to go public (especially repeatedly) creates the impression that lots of people are experiencing something real. Repeated affirmations create the impression that the assertion is true.
Appeals to authority add weight to these persuasions. If one or more of the people affirming a belief is perceived as authoritative, e.g., a physician or a political leader, more people will be persuaded. It may matter little that the expert is the only one in the universe with that opinion if he or she is the only one whose opinions we hear. Sometimes politicians are persuaded to join in unfounded but politically advantageous rhetoric. If we like the source of an opinion, we are more likely to believe it. So if a popular actor, media figure, politician, or local hero joins the process, more people will endorse the perceived reality.
Vivid examples -- especially dramatic case histories -- often influence judgments more than dull but more accurate quantitative examples. For example, inviting the single child with a birth defect to the town hall meeting may overwhelm the fact that there are fewer birth defects in the neighborhood than in most similar residential areas.
Confusion techniques can create perceptions of toxicity, injury, or disease. For example, illogical but eloquent rhetoric delivered with an air of certainty can create such perceptions if a few clear alarming phrases are woven into the message. If the release of something harmless to humans is announced along with discussions of studies indicating cancer, birth defects, or brain damage in animals, concern or alarm may ensue. A classic technique is to pose an alarming question as the headline of a speech, article, or broadcast, e.g., "Are your children in danger?" We commonly hear announcements that "bad chemicals" or "known carcinogens" are out there, without objective data to clarify whether the type, amount, and location of the substance could actually hurt anyone. When someone questions the plausibility of the alleged toxic exposures, advocates may self-righteously respond that reasonable people have a right to worry -- as though people who try to alleviate unnecessary worry are violating the rights of others.
Confusion of inverse probabilities is another classic form of invalid interpretation of facts that arouses unnecessary alarm. For example, suppose an announcement of a release of a toxic chemical is accompanied by news that the chemical can cause upper respiratory symptoms, aches and pains, or other common symptoms. Some people with these symptoms will conclude that the chemical was responsible. And this could be true. However, it may also be true that only 10% of persons exposed develop such symptoms, and only 1% of the population was exposed, so that the probability that a particular person has been poisoned is one in a thousand. These important details can be overlooked in the hue and cry following a dramatic toxic spill.
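The arithmetic above can be checked directly with Bayes' rule. In this sketch the 1% exposure rate and the 10% symptom rate among the exposed come from the scenario in the text; the 20% baseline rate of these common symptoms among unexposed people is an assumed illustrative figure.

```python
# Figures from the scenario: 1% of the population exposed, and 10% of
# the exposed develop the (common) symptoms.
p_exposed = 0.01
p_sym_given_exposed = 0.10
# Assumption for illustration: aches, pains, and respiratory symptoms
# are common, so 20% of UNexposed people report them anyway.
p_sym_given_unexposed = 0.20

# Joint probability of being exposed AND symptomatic -- the
# "one in a thousand" figure from the text.
p_exposed_and_sym = p_exposed * p_sym_given_exposed  # 0.001

# Probability of having the symptoms from any cause.
p_sym = p_exposed_and_sym + (1 - p_exposed) * p_sym_given_unexposed

# The inverse probability: of the people who feel sick, how many were
# actually exposed to the chemical?
p_exposed_given_sym = p_exposed_and_sym / p_sym
print(f"{p_exposed_given_sym:.3%}")  # about half a percent
```

Under these assumptions, fewer than one in a hundred symptomatic people were actually exposed; confusing P(symptoms given exposure) with P(exposure given symptoms) is exactly the inverse-probability error described above.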
Social Disapproval. This is a technique by which the propagandist marshals group acceptance and suggests that attitudes or actions contrary to the one outlined will result in social rejection, disapproval, or outright ostracism. The latter, ostracism, is a control practice widely used within peer groups and traditional societies. Heaping negative and shaming accusations on people can emotionally wear them out even if they are right. This can be countered by having a fellowship group to support them. Sometimes name-calling is used to vilify the opposing view.
Errors of Faulty Logic
Errors of Attack
Errors of Weak Reference