Part The Second

May 08, 2008 00:56

Starts with:


To use logic is simply to examine the adequacy of the proof backing up an assertion. We are all logical thinkers in the main. Were we not, we would be incapable of making informed decisions and would probably be institutionalized. We have, however, all developed bad thinking habits.

First, a few definitions of terms used here. The term argument describes the steps or process used to reach a conclusion. Arguments are either valid or invalid. A conclusion is a statement supported by reasons. Reasons (also called premises, evidence, or assumptions) are facts presented as proof. In a longer argument, a premise may consist of a previously proven conclusion. Statements of reasons or conclusions are either true or false. In an argument, the reasoning must be valid and the supporting facts must be true in order for the case to be acceptable.

Let's look at two sets of simple arguments; the first two thought processes are logical and the second two are flawed:

* Premise A: All birds have feathers;
Premise B: Some fliers are birds;
Valid Conclusion: Therefore, some fliers have feathers.

* Premise A: All mammals have hair;
Premise B: All X's are mammals;
Valid Conclusion: Therefore, all X's have hair.

* Premise A: All birds have feathers;
Premise B: Some fliers have feathers;
Invalid Conclusion: Therefore, some fliers are birds.

* Premise A: All mammals have hair;
Premise B: All X's have hair;
Invalid Conclusion: Therefore, all X's are mammals.

Do you see how the structure of the logic makes the difference between a case that is valid and one that only seems to be valid? In the second two arguments, because we know that the invalid conclusion is probably a true statement, we might be inclined to accept the faulty argument and see it as reasonable. In the last example, the invalid conclusion might seem to work if we substituted "zebra" for X, but doesn't stand if we substitute "woolly worm." An illogical argument does not prove anything, whether it seems to or not.
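
The structural difference can even be checked mechanically. Below is a minimal Python sketch that brute-forces every relevantly distinct "world" for three yes/no properties; for statements built only from "all" and "some," it is enough to test which combinations of properties are inhabited. The property names and helper functions are just illustrative labels for the bird/flier examples above.

```python
from itertools import product

PROPS = ("bird", "feathered", "flier")   # illustrative labels for the examples above

def worlds():
    """Yield every distinct way the eight property-combinations can be inhabited."""
    patterns = list(product([False, True], repeat=len(PROPS)))      # 8 possible kinds of thing
    for keep in product([False, True], repeat=len(patterns)):
        world = [dict(zip(PROPS, p)) for p, k in zip(patterns, keep) if k]
        if world:                                                    # skip the empty world
            yield world

def all_are(world, a, b):  return all(x[b] for x in world if x[a])   # "All A are B"
def some_are(world, a, b): return any(x[a] and x[b] for x in world)  # "Some A are B"

def valid(premises, conclusion):
    """Valid iff no world makes every premise true and the conclusion false."""
    return all(conclusion(w) for w in worlds() if all(p(w) for p in premises))

# First (valid) form: all birds have feathers; some fliers are birds; so some fliers have feathers.
print(valid([lambda w: all_are(w, "bird", "feathered"),
             lambda w: some_are(w, "flier", "bird")],
            lambda w: some_are(w, "flier", "feathered")))            # True

# Third (invalid) form: all birds have feathers; some fliers have feathers; so some fliers are birds.
print(valid([lambda w: all_are(w, "bird", "feathered"),
             lambda w: some_are(w, "flier", "feathered")],
            lambda w: some_are(w, "flier", "bird")))                 # False -- a counterexample exists
```

The invalid form fails because there is a possible world (one containing a single flier that has feathers but is not a bird) in which both premises hold and the conclusion does not.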

It is quite possible for a fallacious argument to use sound logic but rest on one or more false facts. Equally, a fallacy may consist of true premises but use unsound logic. Or both the logic and the facts may be wrong.

If, on examination, a statement seems likely to be exposed as false or without factual support, a speaker may resort to twisting the logic of the argument so that the premise appears to be valid. These traps will be found in the method the speaker uses in drawing conclusions. Fallacious reasoning is integral to many of the oratorical techniques listed in the previous section. Aside from the common deceptions listed below, books and courses on logic can also help you identify contradictions and logical flaws.

To quote Lionel Ruby's The Art of Making Sense, "It is difficult to think well in fields which involve our emotions and self-interest. We often simply forget that we ought to exercise our critical powers. We become dogmatic, and make positive and arrogant assertions without proof. We may become blind fanatics, and stop thinking altogether. We become blind followers of authorities, without ever inquiring as to whether their pronouncements can be justified by the evidence."

Common Logical Errors in Arguments:

Guilt by Association - Citing mere common characteristics does not prove that two cases are identical. E.g., "John believes workers should have a safe work place, be paid a fair living wage, not be exploited by employers, and have job security; therefore, John must support the union's demands, since all union members hold these same ideals."

False Premise - This is an assumption which is introduced to the audience as fact and is either false, not proven, or backed up with unsound reasoning or false statements. The speaker goes on to use this as evidence in support of another premise. E.g., "Cynthia supports a woman's right to choose whether or not to terminate her pregnancy; the Bible states that abortion is murder and a sin; therefore, in God's eyes, Cynthia has sided with those who condone committing murder and sinning."

Circular Argument - When an initial premise is supported by a second, and in turn the second is backed up by a third, and the third is backed up by the initial statement, then the speaker is (overtly or covertly) attempting to prove his claim with itself. This is a presumption that the very conclusion being argued is proven while trying to prove it (see the sketch after the exchange below). E.g.,

* "I am a truly inspired apostle of God."
* "Why are you a true apostle?"
* "Because God said the only true apostles are poor itinerants who go out 2x2 like me."
* "I can't find where God says this, how do I know what you say that God has said is correct?"
* "Because God says you must listen to me."
* "Why is that?"
* "Because I am a truly inspired apostle of God."
* Etc. . . .
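
One way to see why such an exchange proves nothing is to treat "claim A is offered as support for claim B" as an arrow in a little diagram and follow the arrows. The hypothetical sketch below (claim labels invented for the exchange above) simply walks the chain of support and reports when it loops back on itself -- at which point some claim is being used as evidence for itself.

```python
# Hypothetical support graph for the exchange above: each claim maps to the
# claim offered as its support.
supports = {
    "I am a true apostle": "only itinerants like me are true apostles",
    "only itinerants like me are true apostles": "God says you must listen to me",
    "God says you must listen to me": "I am a true apostle",
}

def support_chain(claim, supports):
    """Follow the chain of support; stop when a claim repeats (circularity) or runs out."""
    seen = []
    while claim in supports and claim not in seen:
        seen.append(claim)
        claim = supports[claim]
    return seen + [claim], claim in seen      # chain walked, and whether it looped

chain, circular = support_chain("I am a true apostle", supports)
print(" -> ".join(chain))
print("circular argument:", circular)         # True: the chain returns to its starting claim
```

However long the chain, if it ever closes on itself the whole structure rests on nothing outside itself.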

Accidental Fallacies - If a special case or condition is cited in an argument, and then used as a general rule to support a conclusion, the argument is invalid: e.g., "The boiling point of water at sea level is 100 degrees Centigrade; therefore, water boiling in Yellowstone's geysers must have reached 100 degrees Centigrade." (Actually, the boiling point varies with elevation: it is lower at points above sea level.) Conversely, the argument is also invalid if a general rule is cited to prove a special case: e.g., "Drugs are beneficial to mankind; cocaine is a drug; therefore cocaine is beneficial to cocaine addicts."
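
A quick back-of-the-envelope check of that parenthetical, using the common rule of thumb that water's boiling point falls by roughly 1 degree Centigrade for every 300 metres of elevation gain (an approximation only; the exact figure depends on local air pressure, and the elevation below is just a rough value for Yellowstone's geyser basins):

```python
# Rough estimate only: rule-of-thumb lapse of ~1 degree C per ~300 m of elevation.
ELEVATION_M = 2250                      # approximate elevation of Yellowstone's geyser basins
boiling_c = 100 - ELEVATION_M / 300     # sea-level boiling point minus the elevation correction
print(f"approximate boiling point: {boiling_c:.1f} C")   # about 92-93 C, not 100 C
```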

Presumption - This is a general classification of all fallacies in which a premise either avoids proving the issue at hand or secretly tries to use the conclusion (the thing to be proved) as evidence. This is also referred to as a "material fallacy" as opposed to a "formal fallacy."

Appeal to Awe - This fallacy of relevance cites the opinion of respected "experts" instead of proving the reasoning of the speaker. The fact that "Mr. Big-Wig says thus-and-so" is no reason to assume that an argument is valid, unless proofs are offered to back up Mr. Big-Wig's statements. Three closely related fallacies:

1. quoting conclusions from another source without (or with only vague) attribution -- as this also denies opportunity to examine the reasoning behind the borrowed premise (e.g., "Four out of five doctors said that Zoe's pills ...");

2. citing a tradition as proof (e.g., "All I can tell you is that it was good enough for great-granddaddy Smith ..."); and

3. use of testimonials as evidence -- in order to be suitable for inclusion in an argument, the testimony must first be proved just as would any other assertion. Competence and lack of prejudice on the part of the testifier must also be addressed here. (E.g., "Since I started coming to these meetings, I've been blessed with riches beyond compare. I've never been happier. I know this is right!")

Appeal to Ignorance - This second fallacy of relevance is sometimes referred to as "Negative Proof." It often appears in forms such as: "This is true because no one can (or has) shown otherwise," and, "Since this thing has not been shown satisfactorily to be true, its reverse must therefore be true." An "unknowable" proviso lends no support to any argument. Showing that something is not conclusively established merely shows that it has not been confirmed, and this cannot, of itself, be used to justify anything else. E.g., "We can accept the story about George Washington and the cherry tree, because you haven't shown us that it didn't happen," or, "Since it hasn't been proven beyond doubt that there is a God, we are warranted in the assumption that God does not exist."

Appeal to the People - A third fallacy of relevance holds up popular ideals (such as liberty or fairness) instead of offering logical reasons for a conclusion. E.g., "This hallowed ground, sanctified by the blood of freedom-loving patriots and the tears of bereaved motherhood, must never become a forum in which those critics who challenge our common ideals are allowed to exist and spread their filthy lies."

Appeal to Pity - A fourth fallacy of relevance consists of winning the audience to sympathy for the speaker's case instead (again) of offering reasonable arguments consisting of proven fact: e.g., "Mother Elsa hasn't a cent to her name, she spends hours scrubbing floors in tuberculosis wards, she gives half of her meager rations and income to help support her ailing sister, her back is bent from carrying buckets of cement to complete the new orphanage. Surely this saintly woman, who has suffered so much for others, would never fraudulently solicit legacy bequests to benefit herself. And what will become of her and those who depend on her if she has to go to prison?"

Personal Attack - Instead of providing proof for the position taken, the speaker turns the issue into an attack on the character or conduct of an opponent, e.g., "Senator Smythe would have us increase funding for school lunches: this from a man notorious for chasing skirts all over Washington, yet who supported equal rights for women; a man who was investigated for taking kickbacks from contractors in 1985, yet who voted for the latest ethics bill; a man who opposed waste regulations while owning one of the world's largest landfill operations. My friends, Senator Smythe is simply not credible." This can also be used to intimidate the opposing side, casting an aspersion instead of providing valid counter-arguments, e.g., "Anyone who would ask such a thing should be ashamed!" Sometimes, a speaker will attempt to disqualify an argument from an opponent by referring to circumstances which might incline the opponent to take that position, e.g., "Jane has been pointing to serious error in our church. Jane's points should be dismissed because she left our church when she could not submit to our leadership. She is motivated by bitterness and a bad spirit, so of course she finds fault with us." Note that the latter argument does not address or refute the charges brought forth, but merely attempts to blacken Jane's motives, which, in any case, are not relevant to the issue at hand. Finally, a speaker may attempt to shift the perception of guilt back onto an opponent by charging him/her with a similar offense (sometimes implying hypocrisy). E.g., "I believe that the B-4 bomber base proposed for my district should be kept in the budget. I can't understand why Representative Vorn has a problem with this, since his district has benefited from billions in pork barrel defense projects over the past ten years."

Threat - The last error in relevance dealt with here consists of using an implied or explicit threat in place of reasoned proofs. E.g., a cleric might say: "We're doing things in God's only way. If you don't believe that, you might as well leave us and enter the separation and damnation deserved by those who malign our Truth."

Unfair Questions - Sometimes, more than one pertinent question will be combined to form a single query in such a way that the single answer required is inadequate. E.g., "Do you like vegetables?" (Carrots yes, but not Brussels sprouts.) A question can also be framed in the form of a "loaded question" which leaves a false implication no matter what the answer: e.g., "Did John ever stop cheating on his exams?" -- any answer given sounds like John cheated.

False Alternatives - Sometimes the speaker gives choices which are inadequate and present a contrived dilemma. E.g., "A man can either support my government's foreign policy or he is a traitor." -- the reality might lie somewhere in between, but has not been presented to the listener as a possibility.

Inconsistency - A position supported by premises which are directly or indirectly contradictory or false is invalid unless and until true explanations reconcile or replace the conflicting items. E.g., "We accept Biblical passages which state that God will directly teach all believers, and that there are no longer human mediators between God and men. It is obvious, however, that in the real world God must teach us through our (inspired) ministers, and that it is through them that we have knowledge of Him. Thus, in a way, we see God teaching His people in the way He said He would."

Note also that deductions cannot validly be formed unless the logical approach used in forming the conclusion is consistent.

"Hypocrisy" or a "Double Standard" results from drawing conclusions based on evaluation or judgment of two or more things or groups according to differing, inconsistent standards: e.g., "Television is evil, because it introduces worldly influences into the home." -- yet the same speaker might read newspapers, listen to radio, have a VCR, read steamy novels, subscribe to magazines, occasionally even hire a set or rent a room with a T.V., etc. In the above example, the speaker does not judge television by the same standards used to judge other media. The idea that a person who is able to exercise discrimination in choosing and reading news and magazines could not exercise the same self-control with regard to television has not been shown, and in fact may be contradicted by the actions of the speaker. And it is all too easy to demand that others adhere to higher standards than we require of ourselves or of people with which we identify. If we rationalize reasons for violating any standards ourselves, we must excuse others also: e.g., "If any among you is without sin, let him cast the first stone."

Non Sequitur (literally, "it does not follow") - This term can be used to classify a broad range of fallacious statements; however, it is usually reserved for instances in which there is no connection between the reasons put forth and the conclusion. E.g., "Dave fell out of a large tree last week when a branch broke; the tree was a fast-growing Chinese Elm; thus, Chinese Elms are prone to disease."

Equivocation - Basically, equivocation is the use of the same term in two or more different ways in different parts of the argument: e.g., "Mary is smart [stylish]; Smart [intelligent] people do well in school; therefore, Mary does well in school." Figures of speech (or word pictures) lend themselves handsomely to this fallacy: e.g., "Since he stopped gambling, Joe's been on an even keel -- what's he been doing on a boat all this time?" A related error can easily occur when a speaker uses words in an unorthodox, esoteric manner. If the speaker uses terms to which he/she attaches obscure or novel definitions which the listener cannot be expected to recognize, then the listener cannot accurately examine or argue with such statements. Thus both the speaker's argument and conclusion are rendered fallacious since the speaker has not communicated to, or has been misinterpreted by, the audience.

Composition - This is an inference that the whole of something has the same attributes as one of its parts: e.g., "The grapes are good this year; thus a crop consisting of good grapes must be a good crop." -- not if only five clusters fruited it isn't! Similarly, a too-hasty generalization can be drawn about a whole group from limited or insufficient evidence: e.g., "Students born in Burma delivered the valedictory and salutatory addresses at this year's graduation. All of the Burmese students in our class graduated in the top ten percent. Obviously, all Burmese are highly intelligent."

Division - This is the opposite of the composition fallacy and contains the assumption that because the group as a whole has a quality, each member of the group possesses the same quality: e.g., "The grape crop is very large this year; therefore the crop must consist of very large grapes." Neither should it be presupposed that a general rule can be applied to a specific or exceptional case: e.g., "All bureaucracies are inefficient; Mary is a bureaucrat; therefore, Mary must be inefficient."

False Causes - If the several reasons used in an argument can be used to draw contradictory conclusions, the argument is invalidated, even if it is apparently valid as initially presented. This is easily seen in arguments where the validity of the conclusion rests on the sequence in which the facts were presented. The fallacy of false causes also covers situations where an event is misattributed to an unrelated cause: e.g., "A black cat crossed my path last week, and ever since I've had terrible luck; therefore my bad luck was caused by the black cat crossing my path."

Ambiguity - Generally, this refers to a premise in which what is meant has not been clearly stated. A common use of ambiguity is in propounding a statement which manages to say contradictory things at the same time (sometimes prompted by use of a valid word in a metaphorical or figurative sense): e.g., "He was mad when he lost the game." -- was he insane? This is often used in forecasting, in oracular pronouncements, and where the speaker is uncertain, so that the authority figure seems to be correct whatever the outcome or objection, e.g., "Caesar, the senators will kill." Who is supposed to be killed in this simple example? This type of statement says nothing. A related fallacy occurs when this kind of double-talk is used with certain parts stressed to give the statement apparent meaning: e.g., "I like lemonade *and* sugar." vs. "I like *lemonade and sugar*." -- does the speaker like sugar in the lemonade, lemonade and sugar separately, or both? This is a particularly insidious logical trap when the speaker quotes from another source and, by a change in vocal emphasis, distorts (intentionally or not) the original's true meaning. In any of the above vague cases, the premise is invalidated, since the meaning cannot be nailed down with certainty.

Evidence quoted out of context also falls into this category. The statement may be accurately quoted, but the meaning has been rendered false. E.g., A reviewer's satirical comment "The play had wonderful moments, both of which must have occurred backstage during intermission." might be partially quoted on a billboard to say "The play had wonderful moments" by an unscrupulous producer more interested in peddling his product than being accurate.

Domino Fallacy - A process of reasoning which objects to adopting an action or position by stipulating that this will invariably lead to accepting a less desirable action or position. The second action or position will, in its turn, lead inevitably to a third, etc. -- until some ultimate, unavoidable abomination is reached. This is sometimes seen in arguments which oppose setting a precedent. This type of argument attempts to persuade the listener that one single "wrong" step will set off an unstoppable chain reaction. In reality, chain reactions can usually be stopped at any point, whenever the conditions allowing the reaction to continue are altered.

Misuse of Statistics - This is a vast, often technical subject beyond the scope of this article. However, two commonly encountered errors will illustrate the pitfalls here:

1. The Gambler's Fallacy ("If at first you don't succeed ...") -- i.e., the chance of rolling any particular number (say, a six) with a single die is one in six. The odds against hitting the chosen number do not become more favorable with each subsequent toss. No matter how many times you roll, the odds for each toss remain 1 in 6 (a quick simulation after this list makes the point concrete).

2. Selective Emphasis -- e.g., in a survey, 3 out of 10 spouses said they had cheated on their mates; conclusion: the family is clearly under attack and in decline. The way the statistic is presented de-emphasizes the fact that a large majority (7 out of 10) are remaining faithful, and it may also ignore long-term historical trends which could show infidelity rates to be cyclical, declining, etc., or show that the family values being touted are historically atypical.
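
As promised above, here is a small illustrative simulation of the Gambler's Fallacy: roll a die many times and compare the overall hit rate for a chosen number with the hit rate immediately after a run of misses. If past misses really made a hit "due," the second rate would be higher; it is not.

```python
import random

random.seed(0)
TARGET, ROLLS, STREAK = 6, 200_000, 5
rolls = [random.randint(1, 6) for _ in range(ROLLS)]

after_streak = hits_after_streak = misses = 0
for r in rolls:
    if misses >= STREAK:                       # this roll follows at least STREAK misses in a row
        after_streak += 1
        hits_after_streak += (r == TARGET)
    misses = 0 if r == TARGET else misses + 1  # reset the miss counter on a hit

print(f"overall hit rate:                  {rolls.count(TARGET) / ROLLS:.3f}")
print(f"hit rate after {STREAK} straight misses: {hits_after_streak / after_streak:.3f}")
# Both come out near 0.167 (1 in 6) -- the die has no memory.
```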

Group Dynamics

We'd all like to think of ourselves as independent; however, it is the rare individual who does not automatically submit to the consensus of the group and is not cowed by those who sport the trappings of authority. A classic psychological experiment, first conducted by Stanley Milgram in the 1960s, showed just how little it takes to make people abandon their values -- some might say their humanity. In the experiment, subjects were individually ushered into a room containing a chair, a microphone, a speaker and a machine with a series of dials and buttons. A man in a white coat entered the room and explained that the machine was connected by wires to a person in another room. The subject was given a contrived explanation that the experiment was to see if people's learning ability could be improved through negative stimuli, or some such mumbo-jumbo. The man in the white coat instructed the subject to read a series of questions to the person in the other room. Each time a wrong response was given, the subject was to press a button to administer an electrical shock to the person in the other room. For each subsequent wrong answer, the intensity of the shock was increased one increment, all the way to potentially lethal levels. An actor, pretending to be the person being tortured in the other room, supplied both wrong answers and screams of pain over the loudspeaker.

The researchers were surprised to discover that a substantial majority (about 65 percent) of subjects were willing to go all the way and administer shocks at potentially lethal levels, even though the only pressure they were under to perform came from the other man in the room (who was instructed only to answer any reservations with scripted prompts such as "The experiment requires that you continue"). The subjects perceived the other man as an authority figure simply on the basis of the white coat he wore, and rarely challenged him or expressed reservations. Not only did this experiment provide disturbing insights into how easy it is to rationalize atrocities, but it showed just how little it takes to get people to abandon their convictions and support the policy and rationale of a group or leader (qualified or not).

An interesting sidelight to the situation above was provided by a second experiment in which subjects watched re-enactments of the previous experiment and were asked how they felt about people who reacted in different ways. Subjects tended to dislike most those who objected and reacted explosively, condemning both the authority figure and the torture "experiment" itself. Those who also refused to participate, but did so in a less aggressive, even apologetic manner, were seen as more likable. People tend to be impressed with dissenters who are perceived as being consistent, firm and independent.

From childhood, we are all subjected to peer pressure and authority figures. As we age, it is comforting to associate with others sharing some of the same experiences. Most will derive their self identity, at least in part, from the group or groups they join. This can be seen most clearly in the individuality teenagers sacrifice in order to adopt the precepts of gangs and high school cliques. In adults, the same process expresses itself in blind adherence to political party lines, cults and group prejudices. Usually, people will say they believe in truth, but actually settle for a group's consensus. To our shame, we all know that it doesn't matter if the whole world believes something -- if it isn't true, it isn't true. Yet, most people find it impossibly difficult to stand alone in a position that challenges a group consensus. In adolescents, we call this "caving in to peer pressure," but it is a characteristic just as evident in adults. The desire to belong is strongly ingrained and inculcated, and like most desires, is exploitable.

As with emotion, group associations can be constructive. Streets can be paved, medicine shipped to flood victims, buildings raised. The trouble comes when we become dependent on the group: when we accept falsehood instead of truth because it is popular, when our perceptions of the world are shaped and narrowed by what others say; when the "facts" we learn are rationalized to fit group tenets; or when we act in a manner the group or its authority figure expects of us. If the point comes that who we are, how we think or how we live is based to any degree on the precepts or dictates of others, then it becomes very difficult to break free and independently establish what is true, logical or ethical.

It has been shown that people find it extraordinarily difficult to object or take a stand on their own. Many, however, will stand up for principles if there are at least one or two others who they can count on to support the same position. In controlled group situations, that support is seldom, if ever, allowed to manifest itself. Instead, and even when there is some support, there is the constant implication that the dissident is rejecting the values held by the group, as expressed by the authority figure. When a perception of being outnumbered is present, the psychological pressure on the dissenter to submit is enormous.

The propensity to submit to the group is exploited by the authority figure. The authority figure (speaker, ruler, priest, etc.) is invested by the group with the right to speak collectively for its members. Generally speaking, objections on the part of individual members are stifled, or dispersed by the awesome influence and presumed opposition of the group the authority represents. Many people will never escape the trap of unquestioning obedience to authority figures and group values because these people's very identity and place in the scheme of things has been formed around the precepts held forth by the group.

Distortion of facts and logic in a group situation is a well-known mind-control technique. The group can consist of just the authority figure representing an unseen group and the compliant listener (although in this situation the subject of control usually must be under the physical control of the authority). In cult and political situations, the group can number in the hundreds or millions. What is important is that the group provides the pressure to conform by providing a comforting environment where acceptance of predigested formulas always takes precedence over rational thinking. Leaving this environment becomes something dreaded (whether from fear of losing friends, belief systems, self-worth, etc.). Submission follows. Persons long exposed to this kind of environment eventually censor their own thoughts and actions to conform to the little world they've entered. Authority figures may even exercise control and discipline over such persons by non-verbal means such as a raised eyebrow, a frown or a tone of voice. People thus "brainwashed" become like little children (at least in those areas of life where the group/authority chooses to establish its influence) ruled by authoritarian parents -- only they don't grow or develop as real children do.

Be careful!

Questioning What Is Said

It is one thing to understand how people are misled, and quite another to put that knowledge to use in everyday situations in which you are exposed to manipulation. One way is by making it a habit to examine what you hear. Make a note of each point, so you can test the validity of what you have heard:

* What did the speaker actually say?
* Is the speaker correctly stating the facts?
* Am I being told all the facts, or just those facts which support the speaker's argument?
* Is this statement germane to the subject at hand?
* What did the speaker mean by this statement?
* Is the speaker's interpretation valid?
* Is the speaker qualified to make this interpretation?
* Are there alternative facts that would invalidate the speaker's argument?
* Are there alternative conclusions that could be drawn from the speaker's argument?
* What argument is the speaker advancing by this statement?
* Is this meant to throw me off of the track of something important?
* Am I hearing logical reasoning and verified facts, or just being asked to adopt the speaker's convictions without valid proof?
* How does the speaker want me to think?
* What does the speaker want me to do?

Listener Defenses

If at all possible, the listener must remove him/herself from the influence of the group and the emotional pull of the speaker. These make it very hard to rationally examine what has been said.

It helps to be exposed to a variety of views. Repeated exposure to the same faulty logic and false claims only compounds the influence these will have on you. Even if your object is to attend a monologue delivered by an amusing comedian, a steady diet of this one viewpoint will have an influence on your perceptions. You lose some of your objectivity.

It is sometimes helpful if the speaker furnishes a transcript ahead of time, on which digressions and emphasis can be noted by the listener, and which can be dispassionately examined later. More problematic is the taking of notes, as these are often incomplete and tend to overemphasize the conclusions of the speaker. In a similar vein, taped speeches often preserve the emotional and oratorical devices employed; a written transcript of taped lectures will be far easier to sort out. Note that having a written copy does not guarantee protection from unsound reasoning, prejudice or emotional influences, but it does allow you the time to examine and re-examine. Some types of propaganda have been shown to be more effective when presented in a written rather than oral format. However, any of the above is better than relying on what the listener thinks was said.

Another listening defense has to do with the attitude you take going into lectures or even discussions. Most people go into a lecture situation (class, sermon, speech, etc.) predisposed to agree with, submit to, and/or learn from the point of view to be expressed -- why else would they bother attending? Rather than the natural position (expecting to agree), try taking and maintaining an adversarial position while listening. This will help you separate yourself from audience influence, at least somewhat. If you take stereotypes and diatribes as being directed at you personally, you'll be better able to judge the speaker's fairness. If, when prompted to agree, you instead disagree and construct possible objections, you'll be in a position to partially reconstruct the missing other side of the argument.

And there is nothing wrong with standing up and asking a question or challenging falsehood. Many cultures do not share our passion for appearing to be polite and uncritically tolerant. Do you actually believe that it is polite to mutely allow lies to go unchallenged, to let others be led astray by unsound reasoning, to permit insults and prejudice to pass without objection, or to sit by quietly and thus lend your tacit agreement and support to the promulgation of error? This may require more courage than most of us possess, as we are conditioned to submit to authority figures. But we can flee such situations; at the very least, we can walk out. There is no law, no rule of conduct, that requires us to allow ourselves to be utilized and manipulated by others.

We also too readily attribute a "good" motive to the speaker. However, we must bear in mind that, even if we have judged the speaker rightly, he/she may still be wrong and quite sincerely believe the delusions he/she is trying to inculcate into the minds of the audience. Sincerity is never a measure of the truth of any presentation or its content.

If the subject is going to influence your life at all, it is also vital that you do your own research. The man or woman on the dais or behind the podium is not a god or some divinely inspired being with knowledge unavailable to anyone else. If such material were the province only of speakers, they wouldn't have any business trying to disclose it to the rest of us. Their theories are supposed to be backed up by real evidence. Check out the facts for yourself. Examine the logic they've used in constructing their position. One false support (or lack of valid support) taints the speaker's entire conclusion. A tainted position is not necessarily wrong, but it must be set aside until it can be validly re-proven from the ground up. To accept statements on the basis of someone's word -- because you like and trust the speaker, because you don't want to exercise your brain, because the statement appeals to you, because it is more comfortable to believe than to question -- is to prefer ignorance and delusion to knowledge and truth.

Ideally, all public speaking would be conducted as open discussions rather than monologues. An atmosphere in which questioning and debate are encouraged is far more informative and less prone to abuse.

Without hard facts, you can only test the logic of statements you hear. And, however sound you find the logic, you must also determine whether the facts used are true in order to accept the statement as valid. One does not stand without the other. But what if you, as many do, insist that it isn't so important that everything add up, and that the general drift and how you feel about what was said will do for proof? In that case, your motive seems seriously compromised, and you should determine whether truth is your real goal or whether you are actually searching for something else.

We are targets in other ways. Public relations firms and professional opinion managers regularly fine-tune messages and images through psychological testing to produce the effect they wish. The ads we read, the politicians we hear, the causes we are enjoined to support base their appeals on the fact that most of the intended audience will react in predictable ways. They know that most would rather react than think. They know that even the knowledgeable are more predisposed toward blind acceptance of pre-packaged emotional presentations, myths and wishful thinking than toward analyzing a sober presentation of fact. They know that, even if a person realizes that he or she is being manipulated (especially in a group situation), the chances are very small that he or she will take the time to mount a defense, much less protest. They know that people prefer to believe what is comfortable and attractive -- things that do not require self-examination and which affirm their own and their peers' preconceptions -- no matter what the message may lack in soundness of reasoning or truth.

Finally, we must continually re-examine our own positions and reasoning. A lot of people boast about being "open minded." What they usually mean is that they'll allow others to stick to an opposing opinion, while not giving up or changing their own conclusion. This kind of attitude may be tolerant, but it falls far short of seeking the truth. We should, instead, be always "open minded" to facts and new perspectives. Just because we've grown up with, or based our thinking on certain concepts is no reason to ignore challenges to any premise we hold dear. We should be anxious to know if we're wrong and, if so, why we are wrong -- whether or not it means we have to rethink our whole outlook. This may be extremely difficult for some to do -- it is always easier, more comfortable and pleasant to go about in the same old habits, ignore reality, conform to the peer group and live in a fool's paradise. Error is everywhere, it's easy to believe, it's seductive, and it will become part of our very thinking process if we don't resist it.

Also, have they changed thesaurus.com recently? It does not seem to fill me with glee as it once did. There seems to be a significant drop in available similar words. Have they realized that there is no such thing as a synonym?

procrastination station, edjamacayshun
