Moral or Value Conflicts

Michelle Maiese

Originally published July 2003; Current Implications section added by Heidi Burgess in April 2017.

What Is Moral Conflict?


Protracted conflict sometimes results from a clash between differing world-views. One group's most fundamental and cherished assumptions about the best way to live may differ radically from the values held by another group.[1] Parties may have different standards of rightness and goodness and give fundamentally different answers to serious moral questions.[2] When groups have different ideas about the good life, they often stress the importance of different things, and may develop radically different or incompatible goals. This can lead to conflict.

Because values and morals tend to be quite stable, people are often unwilling to negotiate or compromise with respect to these topics. Indeed, if the basic substantive issues of the conflict are deeply embedded in the participants' moral orders, these issues are likely to be quite intractable.[3]

A group's moral order is related to its practices, its patterns of thinking, and its patterns of language. As they are socialized, group members learn to center their judgments on values and procedures fundamental to their own common culture.[4] Their moral order provides the set of meanings through which they understand their experience and make judgments about what is valuable and important.[5] These patterns of meaning shape the way that individuals understand facts and issues and help them to develop a sense of identity. Social reality also dictates what counts as appropriate action and sets boundaries on what people are able to do.[6] It even affects the way in which emotions are labeled, understood, and acted upon. Thus, an individual's beliefs, sayings, and actions must be understood within the context of a particular social world.

People from the same culture have more or less equivalent realities and mindsets. Their values, assumptions, and procedures become part of "common sense" for them. However, when two parties that do not share norms of communication [customary patterns and rules of communication] and expectations about behavior must interact, they often clash.[7] Each party may believe that its ways of doing and thinking about things are the best, and come to regard other ways of thinking and acting as inferior, strange, or morally wrong.[8]

Moral conflict occurs when disputants are acting within different social worlds, according to different meanings.[9] Indeed, one of the reasons groups in conflict have trouble breaking the pattern of interaction between them is that each is caught in its own moral order. When two groups have radically different ways of making sense of human life, it is likely that actions regarded by one side as good and prudent will be perceived by the other as evil or foolish.[10] This is because an action that one moral order deems perfectly acceptable may be regarded as an abomination by a different moral order.

For example, sometimes people distinguish between moral orders built on rights and those built on virtues.[11] Each is associated with particular forms of society and ways of being human. While a rights-based approach is associated with the Enlightenment and modernity, a virtues-based approach emerges from traditional society. When modernists carry out acts regarded as obligatory or good within their own moral order, "these very acts offend traditionalists."[12] Inter-racial or inter-religious marriages, for example, are seen by many as one outgrowth of inclusivity and tolerance; the freedom to marry anyone is a "right." Traditionalists, however, may see such marriages as evil -- as harming their race or religion. Likewise, some traditional religious and political practices, such as limiting women's dress, freedom of movement, education, and/or public involvement, are seen as abhorrent by modern, Western societies. The freedom to wear what one wants and do what one wants, without limitation, is seen as a woman's right. Yet the freedom that women exhibit in Western societies is abhorrent to some very traditional Muslim cultures, in which women's modesty is seen as a virtue. In short, the two groups have clashing conceptions of moral value.

In many cases, culture has a powerful influence on the moral order. Because systems of meaning and ways of thinking differ from one culture to another, people from different cultures typically develop different ideas about morality and the best way to live. They often have different conceptions of moral authority, truth, and the nature of community.[13] For example, some cultures place great moral emphasis on the family, while others stress the importance of individual autonomy. These cultural differences become even more problematic when groups have radically different expectations about what is virtuous, what is right, and how to deal with moral conflicts.[14] Thus, culture wars are often driven by moral conflict.

In some cases, one group may come to view the beliefs and actions of another group as fundamentally evil and morally intolerable. This often results in hostility and violence and severely damages the relationship between the two groups. For this reason, moral conflicts tend to be quite harmful and intractable.

Features of Moral Conflict

To further understand moral conflict and deal with it effectively, it is helpful to be aware of its common features.


Misunderstandings

The first general feature is the tendency for each side to misunderstand the words and actions of the other. People from incommensurate traditions may have trouble communicating because they rely on different systems of meaning, norms of communication, and behavioral expectations.

One possibility is that the participants use the same vocabulary but define and use key terms differently. For example, the word "honor" might mean martial excellence to one party and economic success to the other.[15] But it is also possible that the groups simply rely on radically different vocabularies that stress the importance of different values. If one party regards the key terms used by the other as unimportant, communication between them will be quite strained. All of this contributes to misunderstanding and makes it very difficult for participants to "articulate the logic of the other side's social world in ways that the other side will accept."[16]

Further misunderstanding and erroneous perceptions may arise because groups often perceive, define, and deal with conflict in different ways.[17] Because of differing cultural frames, many of the words used to describe appropriate behavior during conflict do not reflect the same content from one culture to another. For example, the terms "conflict," "aggression," "peace," "time," and "negotiation" are not value-free. They carry judgments with them and may be used differently in different cultures.[18] Aggression, usually defined as intentionally hurting another person, is a reflection of norms of conduct, and what hurts in one society may not be what hurts in another society. Thus, indicators of aggression may vary.[19] In the Middle East, for example, a direct refusal is considered a hostile gesture. But in other cultures, raising an objection is customary and well accepted. Ideas about fairness and images of justice can also vary among different groups.

The moral positions of anti-abortion and pro-choice activists are sometimes regarded as incommensurable. That is, the parties not only disagree about substantive moral issues, but also approach moral questions in a fundamentally different way. For this reason, the abortion debate is a prime example of a moral conflict. Because parties are unlikely to be willing to compromise their most cherished values, such conflicts are likely to be interminable and intractable.


Mistrust

The second general feature of moral conflict is that group members tend to develop feelings of mistrust and suspicion toward the other group -- even a sense that the other group poses a danger to their very survival. Given the groups' different values and systems of meaning, actions taken by one side to defuse or resolve the conflict may often be perceived as threatening by the other party.[20] The second party is likely to be stunned and offended by the other's action, and to respond in a negative way. This serves to perpetuate and/or intensify the conflict. Thus, the groups' different conceptions of morality lead to misunderstanding, which in turn contributes to conflict escalation.

Strained and Hostile Communication

Another general feature of moral conflicts is the hostility characteristic of the relationship and the communication between the parties. While sophisticated rhetoric consists of exchanging reasons in a quest to form shared beliefs, the patterns of communication in moral conflicts consist primarily in personal attacks, denunciations, and curses.[21] Slogans and chants replace arguments intended to persuade and inform, and the discourse between the two groups involves many statements about what is wrong with the other group. Thus, opportunities for opposing groups to converse intelligibly and reason together are diminished. When one group is denounced, its members are likely to become defensive, which can contribute to more negative emotions and behavior.

Thus, discourse often moves to sweeping generalizations and abstract principles.[22] For example, groups may appeal to abstract ideals of religion, patriotism, liberty, or "what America is all about" to point out why the actions of another group are morally wrong. In many cases, groups rely on rigidly held social or political beliefs, or ideology, to indicate why their position is morally superior. Such ideology is often accompanied by a sense of urgency about the need for pursuing those ideals.[23]

Negative Stereotyping

Discourse often involves sweeping generalizations about members of the other group. People in moral conflicts tend to invidiously categorize and denounce the personalities, intelligence, and social manners of those with whom they disagree.[24] They may form negative stereotypes and attribute moral depravity or other negative characteristics to those who violate their cultural expectations, while ignoring their own vices and foibles and perceiving their own group to be entirely virtuous. This is what social psychologists call the fundamental attribution error.

For example, disputants may attribute the "strange" behavior of foreigners to undesirable character traits, such as moral depravity or lack of intelligence, rather than realizing that their seemingly inappropriate acts are simply a matter of cultural difference.[25] Because parties are typically unable to give rich accounts of the moral order of the opposing group, they are likely to attribute whatever the group does to its stupidity, evil nature, and overall moral depravity. Groups with radically different conceptions of morality may feel stunned and offended by the actions or words of the other group and denounce those actions or the group as a whole.[26]


Non-negotiability

Ideological belief systems pull together fundamental assumptions and global viewpoints that are in general not up for compromise.[27] Strict adherence to ideology can make it particularly difficult for individuals to approach those with differing worldviews with an open mind. They come to see the conflict entirely in win-lose terms. They may even get to the point where the goal of harming the other becomes more important than helping oneself.[28]

Effects of Moral Conflict

Not surprisingly, moral conflict often has harmful effects. Participants in moral conflict often behave immorally, even according to their own standards of behavior, because they believe the actions of their enemies force them to do so.[29] If a group is regarded as morally depraved, its members may come to be regarded as less than human and undeserving of humane treatment. The demonization or dehumanization of one's opponent that often occurs in moral conflict paves the way for hateful action and violence. It often leads to human rights violations or even attempts at genocide, as parties may come to believe that the capitulation or elimination of the other group is the only way to resolve the conflict.[30]

Why Moral Conflict is Intractable

Because of their deep roots, moral conflicts tend to be intractable and long-lasting.[31] Parties to such conflict often have great difficulty in describing the substantive issues in shared terms. Because they are arguing from different moral positions, they disagree about the meaning and significance of the important issues.[32] This makes negotiation or compromise extremely difficult in and of itself.

Resolution becomes even more difficult when parties disagree not only about substantive issues, but also about which forms of conflict resolution are morally right, aesthetically preferred, and politically prudent.[33] Parties may have very different ideas about how to gather information, arrive at a conclusion, make a decision, and deal with uncertainty.[34]

Over the course of conflict, the original issues often become irrelevant and new causes for conflict are generated by actions within the conflict itself. This is because in moral conflict, when groups try to act consistently with what they believe is morally good and just, they "prove" to the other side that they are fools or villains.[35] Thus, the means by which the parties seek resolution often just provoke further conflict. As the conflict continues, substantive issues are largely forgotten and "the other side's means of dealing with the conflict is itself the force that drives the interactions among the various conflicted parties."[36] Thus, moral conflicts are self-sustaining.

Parties involved in moral conflict also tend to have great difficulty in imagining a win-win resolution of the conflict at hand. The substantive issues are often a matter of rigidly held moral beliefs, based in fundamental assumptions that cannot be proved wrong.[37] These fundamental moral, religious, and personal values are not easily changed, and people who adhere to a particular ideology may very well be unwilling to compromise their world-view. Instead, as noted earlier, they may engage in diatribe, a rhetorical strategy that discredits adversaries by characterizing them as evil or morally inferior.[38] Such characterizations often lead to subversion, repression, and violence. Because rational discourse has become useless, each party may try to force the other side into compliance.[39] The conflict is likely to escalate and become more protracted as a result.

Also, those involved in moral conflict may regard perpetuation of the conflict as virtuous or necessary. They may derive part of their identity from being warriors or opponents of their enemy and have a stake in the continuation of the conflict because it provides them with a highly desirable role.[40] In addition, because struggles over values often involve claims to status and power, parties may have a great stake in neutralizing, injuring or eliminating their rivals. They may view any compromise about their most cherished values as a threat to their very identity and a grave evil. Indeed, moral conflicts often stem from a desire to safeguard basic human needs such as security and social recognition of identity. On some occasions, the continuation of a conflict may seem preferable to what would have to be given up if the other party were accommodated.[41]

Unfortunately, those enmeshed in moral conflict may be unable to discern the effects of conflict, even if those effects themselves threaten the basic human needs that were at issue. Because moral conflicts tend to be intractable and have great potential for violence, we must search for new ways to manage them.

Dealing With Moral Conflict

What can be done when parties are faced with moral differences that seem to be intolerable?

Changing the Stories

In some cases, each party can heighten its understanding of the other's world-view through new forms of communication. Some suggest that moral conflict be viewed as a particular form of communication and pattern of interaction. At various points in a moral conflict, people have the ability to handle their conflict differently.[42] One way in which people can change the pattern of conflict is by telling different stories about what they are doing. By using narratives and story-telling to communicate, they can enrich the views that each side has of the other, often revealing commonalities in the midst of all the differences.


Reframing

Third parties can sometimes help the disputants to redefine or reframe their conflict, focusing more on attainable interests and less on non-negotiable positions or negative stereotypes. They can also help parties to seek mutually beneficial outcomes rather than competitive, win-lose outcomes. Even if the moral differences cannot be eliminated, sometimes the parties share interests or needs. All sides, for example, have a need for security, and increasing the feeling of security of one side does not diminish the security of the other side, as is commonly believed. Rather, the opposite is generally true: the more secure one side feels, the less it feels a need to attack the other side; hence the more secure the other side is likely to feel. Therefore, reframing the conflict as a problem (at least in part) of security can sometimes help to get the parties to focus on something they can achieve together rather than on their non-negotiable differences.


Dialogue

Similar to story-telling, dialogue is a process of in-depth communication that allows parties to get to know each other better and to find commonalities with the other side. Although there are many forms and contexts of dialogue, all seek to replace the ubiquitous "diatribe" of moral conflicts with respectful communication, empathic listening, and improved understanding. In some cases, these new forms of communication may help parties to see that their moral disagreements are less deep and fundamental than they previously thought. However, in other cases, the substantive issues will truly be beyond compromise.

Some suggest that in these sorts of cases, parties must strive to develop a space for citizenly public discourse.[43] Even though the parties have radically different world-views and do not agree about the relevant issues, they can nevertheless reach an agreement about how to contend with moral and political differences in a constructive way. In other words, they can come to an agreement about how to disagree. They can thereby find a way to manage their conflict in a way that minimizes the costs to both parties.

Current Implications

The 2016 Presidential election in the United States was a "wake up call" for many people. Many of us were not aware of the depth of the distributional--and moral--divide in this country.  While there are undoubtedly many reasons why the election came out as it did, some observers believe that the past political successes of the left in forcing their moral views on the entire country were at least in part (perhaps in large part) responsible for the backlash that put Donald Trump in power. Fundamentalist Christians chafed at being told that they had to issue marriage licenses for gay couples (and at least one clerk, who made the news, refused to do so). Christian bakers didn't want to bake "gay cakes." And Christian hospitals and businesses didn't want to be forced to provide abortions or birth control pills.

The left, meanwhile, assumed that they were "right" (meaning correct) and that the rest of the country was "coming around."  This election shows, I think, that the country didn't "come around" as much as we thought it did.  Morals, as this article argues, are very strong, very stable.  And when a conflict involves such issues, it tends to become intractable.

As I re-read this article to write this "current implications" note, I was particularly struck with Maiese's list of "Features of Moral Conflict."

1. Misunderstandings

2. Mistrust

3. Strained and Hostile Communication

4. Negative Stereotyping

5. Non-negotiability

All of these are rampant between the right and the left right now.  We don't understand each other's worldviews, nor do we even try to talk to the other side to learn about their views.  We "know" we are right and they are wrong, and we have no interest in compromising or even listening to the other side.

All of this contributes to intractability.  But note! This article lists some positive things that can be done to address such issues...and these suggestions are very valid in this case.  

1.  Change the stories.  People can explain who they are and why they believe what they do in different and sometimes more compelling ways.  When I listened to Trump voters explain why they voted for him, I was surprised and in some sense sympathetic.  I maybe wouldn't have made the same choice had I been in their shoes, but I could understand and empathize with their struggles much more than before, when I hadn't heard those stories.

2.  Reframe.  The more we can reframe the dialogue to be about "all of us" instead of "us-versus-them," the better off we will be.  I too, actually, want to "make America great again."  So let's talk about what that means and how we can do it.  Many of my friends believe it's about going backwards--going back to the 50s and its anti-women, anti-minority attitudes.  That may be part of it, yes, but it is also about fundamental things such as security, jobs, and hope.  We all want those.  So if we can reframe the conversation to be about how we can all get those, we might be able to move away from the intractable moral conflict.

3.  Dialogue.  This is a very effective way to get (willing) people to listen and learn from "the other."  It has been used successfully in many contexts and goes a long way toward ameliorating moral conflicts.  However, it is a "table-oriented process," meaning it is small scale, usually involving 10 to 20 people.  We need to figure out how to "scale dialogue up" so that its benefits can be experienced by thousands or hundreds of thousands of people.  That's a serious challenge!

Heidi Burgess, May 2017.


[1] W. Barnett Pearce and Stephen W. Littlejohn. Moral Conflict: When Social Worlds Collide. (Thousand Oaks, CA: Sage Inc., 1997), 49.

[2] Otomar J. Bartos and Paul Wehr. Using Conflict Theory. (New York: Cambridge University Press, 2002), 41.

[3] Pearce and Littlejohn, 50.

[4] Paul R. Kimmel, "Culture and Conflict," in The Handbook of Conflict Resolution: Theory and Practice, eds. Morton Deutsch and Peter T. Coleman. (San Francisco: Jossey-Bass Publishers, 2000), 456.

[5] Pearce and Littlejohn, 51.

[6] ibid, 54.

[7] Kimmel, 453.

[8] ibid, 457.

[9] Pearce and Littlejohn, 55.

[10] ibid, 50.

[11] ibid, 59.

[12] ibid, 60.

[13] ibid, 70.

[14] ibid, 62.

[15] ibid, 68.

[16] ibid, 68.

[17] Guy Oliver Faure, "Conflict Formation: Going Beyond Culture-Bound Views of Conflict," in Conflict, Cooperation, and Justice, eds. Morton Deutsch, Barbara Bunker, and Jeffrey Rubin. (San Francisco: Jossey-Bass Publishers, 1995), 39.

[18] Faure, 41.

[19] ibid, 42.

[20] Pearce and Littlejohn, 68.

[21] ibid, 75.

[22] ibid, 70.

[23] David P. Barash and Charles P. Webel. Peace and Conflict Studies. (California: Sage Publications, 2002), 233.

[24] Pearce and Littlejohn, 74.

[25] Kimmel, 457.

[26] Pearce and Littlejohn, 73.

[27] Barash and Webel, 234.

[28] Pearce and Littlejohn, 73.

[29] ibid, 73.

[30] ibid, 68.

[31] ibid, 68.

[32] ibid, 71.

[33] ibid, 69.

[34] Kimmel, 459.

[35] Pearce and Littlejohn, 69.

[36] ibid, 69.

[37] Barash and Webel, 234.

[38] Pearce and Littlejohn, 118.

[39] ibid, 119.

[40] ibid, 70.

[41] ibid, 70.

[42] Pearce and Littlejohn, 77.

[43] ibid, 104.

Use the following to cite this article:
Maiese, Michelle. "Moral or Value Conflicts." Beyond Intractability. Eds. Guy Burgess and Heidi Burgess. Conflict Information Consortium, University of Colorado, Boulder. Posted: July 2003 <>.

Additional Resources

This list is not intended to be exhaustive.

This list of common misconceptions corrects erroneous beliefs that are currently widely held about notable topics. Each misconception and the corresponding facts have been discussed in published literature. Note that each entry is formatted as a correction; the misconceptions themselves are implied rather than stated.

Arts and culture

Food and cooking

  • Searing meat may cause it to lose moisture in comparison to an equivalent amount of cooking without searing. Generally, the value in searing meat is that it creates a brown crust with a rich flavor via the Maillard reaction.[1][2]
  • Food items cooked with wine or liquor retain some alcohol. One study found that about 25 percent of the alcohol remains after one hour of baking or simmering, and 10 percent after two hours; in either case, however, the amount consumed while eating a dish prepared with alcohol will rarely if ever contain sufficient alcohol to cause even low levels of intoxication.[3][4]
  • There is no consistent data supporting monosodium glutamate (MSG) as triggering migraine headache exacerbation or other symptoms of so-called Chinese restaurant syndrome. Although there have been reports of an MSG-sensitive subset of the population, this has not been demonstrated in placebo-controlled trials.[5][6]
  • Twinkies have a shelf life of approximately 45 days[7] (25 in their original formulation)—far shorter than the common (and somewhat jocular) myth that Twinkies are edible for decades or longer.[8] They generally remain on a store shelf for only 7 to 10 days.[9]
  • Fortune cookies, despite being associated with Chinese cuisine in the United States, were invented in Japan and introduced to the US by the Japanese.[10] The cookies are extremely rare in China, where they are seen as symbols of American cuisine.[11]
  • A standard cup of brewed coffee has more caffeine than a single shot of espresso. The belief that the reverse is true results from espresso having a higher concentration of caffeine, which is offset by the much larger volume overall of a regular cup of coffee.[12]

Microwave ovens

  • Placing metal inside a microwave oven does not damage the oven's electronics. There are, however, other safety-related issues: electrical arcing may occur on pieces of metal not designed for use in a microwave oven, and metal objects may become hot enough to damage food, skin, or the interior of the oven. Metallic objects designed for microwave use can be used in a microwave with no danger; examples include the metallized surfaces used in browning sleeves and pizza-cooking platforms.[13]
  • Microwave ovens work by dielectric heating rather than by exciting a resonance frequency of water, and can therefore operate at many frequencies. Heat is created by exposing water molecules to intense, non-resonant electromagnetic fields. The 22 GHz resonant frequency of isolated water molecules has a wavelength too short to penetrate common foodstuffs to useful depths. The typical oven frequency of 2.45 GHz was chosen partly for its ability to penetrate a food object of reasonable size, and partly to avoid interference with communication frequencies in use when microwave ovens became commercially available.[14]

Law, crime, and military

  • It is rarely necessary to wait 24 hours before filing a missing person report. In instances where there is evidence of violence or of an unusual absence, law enforcement agencies in the United States often stress the importance of beginning an investigation promptly.[15] The UK government website says explicitly in large type, "You don't have to wait 24 hours before contacting the police".[16]
  • Entrapment law in the United States does not require police officers to identify themselves as police in the case of a sting or other undercover work, and police officers may lie in doing such work.[17] The law is instead specifically concerned with enticing people to commit crimes they would not have considered in the normal course of events.[18]
  • No one ever claimed in court that Twinkies made them commit a crime. In the murder trial of Dan White, the defense attorneys successfully argued diminished capacity as a result of severe depression. While eating Twinkies was given as evidence of depression, it was never claimed to be the cause of the murders. Despite this, people often claim that White's attorneys argued that Twinkies made him commit the murders.[19]
  • The US Armed Forces have generally forbidden deferred adjudication, or military enlistment in lieu of jail, since the 1980s. US Navy protocols discourage the practice, while the other four branches have specific regulations against it.[20][21]





Buddhism

  • The historical Buddha was not obese. The "chubby Buddha" or "laughing Buddha" is a 10th-century Chinese folk hero by the name of Budai. In Chinese Buddhist culture, Budai came to be revered as an incarnation of Maitreya, the Bodhisattva who will become a Buddha to restore Buddhism after the teachings of the historical Buddha, Siddhārtha Gautama, have been forgotten.[25]
  • The Buddha is not a god. In early Buddhism, Siddhārtha Gautama possessed no salvific powers and strongly encouraged "self-reliance, self-discipline and individual striving."[26] However, in later developments of Mahāyāna Buddhism, notably in the Pure Land (Jìngtǔ) school of Chinese Buddhism, the Amitābha Buddha was thought to be a savior. Through faith in the Amitābha Buddha, one could be reborn in the western Pure Land. Although in Pure Land Buddhism the Buddha is considered a savior, he is still not considered a god in the common understanding of the term.[27]

Christianity and Judaism

  • The forbidden fruit mentioned in the Book of Genesis is never identified as an apple,[28] a misconception widely depicted in Western art. The original Hebrew texts mention only tree and fruit. Early Latin translations use the word mali, which can be taken to mean both "evil" and "apple". In early Germanic languages the word "apple" and its cognates usually meant simply "fruit". German and French artists commonly depict the fruit as an apple from the 12th century onwards, and John Milton's Areopagitica from 1644 explicitly mentions the fruit as an apple.[29] Jewish scholars have suggested that the fruit could have been a grape, a fig, wheat, an apricot, or an etrog.[30]
  • There is no evidence that Jesus was born on December 25.[31] The Bible never claims a date of December 25, but may imply a date closer to September.[31] The fixed date is attributed to Pope Julius I, who in 350 CE declared December 25 the official date of celebration.[32][33] The date may have initially been chosen to correspond with the day exactly nine months after Christians believe Jesus to have been conceived,[34][35] the date of the Roman winter solstice,[36] or one of various ancient winter festivals.[34][37]
  • The Bible does not say that exactly three magi came to visit the baby Jesus, nor that they were kings, or rode on camels, or that their names were Casper, Melchior, and Balthazar. Matthew 2 has traditionally been combined with Isaiah 60:1–3.

Arise, shine, for your light has come, and the glory of the Lord has risen upon you. 2For behold, darkness shall cover the earth, and thick darkness the peoples; but the Lord will arise upon you, and his glory will be seen upon you. 3And nations shall come to your light, and kings to the brightness of your rising.

Three magi are supposed because three gifts are described (Matthew 2:11),[40] and artistic depictions of the nativity have almost always shown three magi since the 3rd century.[38] The Bible specifies no interval between the birth and the visit; artistic depictions and the closeness of the traditional dates of December 25 and January 6 encourage the popular assumption that the visit took place in the same season as the birth, but later traditions varied, with the visit taken as occurring up to two years later. This maximum interval explains Herod's command in Matthew 2:16–18 that the Massacre of the Innocents include boys up to two years old. More recent commentators, not tied to the traditional feast days, suggest a variety of intervals.[39]
  • The idea that Mary Magdalene was a prostitute before meeting Jesus is not found in the Bible. In the Gospel of Luke, there is a passage about a woman with a reputation for sinning (which may well mean prostitution) immediately before the story introducing Mary Magdalene for the first time. The Catholic Church, since Pope Gregory I's time in the 6th century if not before, had historically assumed that the two accounts refer to the same woman, meaning that before her encounter with Jesus, Mary Magdalene was a prostitute. But there is no direct evidence in the Bible for such a link; most modern scholars assert that she was most likely not a prostitute, and even the Catholic Church no longer suggests that the two passages from Luke refer to the same person.[41][42][43]
  • Paul the Apostle did not change his name from Saul. He was born a Jew, with Roman citizenship inherited from his father, and thus carried both a Hebrew and a Latin name from birth. Luke indicates the coexistence of the names in Acts 13:9: "...Saul, who also is called Paul...".[44][45]
  • The term "Immaculate Conception" was not coined to refer to the virgin birth of Jesus,[note 1] nor does it reference a supposed belief in the virgin birth of Mary, his mother. Instead, it denotes a Roman Catholic belief that Mary was not in a state of original sin from the moment of her own conception.[46]
  • Roman Catholic dogma does not say that the pope is either sinless or always infallible.[47] Catholic dogma since 1870 does state that a dogmatic teaching contained in divine revelation that is promulgated by the pope (deliberately, and under certain very specific circumstances) is free from error, although official invocation of papal infallibility is rare. While most theologians state that canonizations meet the requisites,[48] aside from that, most recent popes have finished their reigns without a single invocation of infallibility. Otherwise, even when the pope speaks in his official capacity, dogma does not hold that he is free from error.
  • Mormons who are members of The Church of Jesus Christ of Latter-day Saints (LDS Church) no longer practice polygamy, although it was historically practiced in the LDS Church.[49][50][51][52] Currently, the LDS Church excommunicates any members who practice polygamy within the organization.[53] However, some Mormon fundamentalist sects still practice polygamy within their groups.[54][55] For more details on this subject, see Mormonism and polygamy.

Islam

  • A fatwā is a non-binding legal opinion issued by an Islamic scholar under Islamic law; it is therefore commonplace for fatwās from different authors to disagree. The popular misconception[56][57] that the word means a death sentence probably stems from the fatwā issued by Ayatollah Ruhollah Khomeini of Iran in 1989 regarding the author Salman Rushdie, whom he stated had earned a death sentence for blasphemy. This event led to fatwās gaining widespread media attention in the West.[58]
  • The word "jihad" does not always mean "holy war"; literally, the word in Arabic means "struggle". While there is such a thing as "jihad bil saif", or jihad "by the sword",[59] many modern Islamic scholars say that the word primarily implies an effort or struggle of a spiritual kind.[60][61] Scholar Louay Safi asserts that "misconceptions and misunderstandings regarding the nature of war and peace in Islam are widespread in both the Muslim societies and the West", as much following 9/11 as before.[62]
  • The Quran does not promise martyrs 72 virgins in heaven. It does mention companions, houri, for all people in heaven, martyr or not, but no number is specified. The source for the 72 virgins is a hadith in Sunan al-Tirmidhi by Imam Tirmidhi.[63][64] Hadiths are sayings and acts of the prophet Muhammad as reported by others, and as such they are not part of the Quran itself. Muslims are not required to believe all hadiths, particularly those that are weakly sourced, such as this one.[65] Furthermore, the correct translation of this particular hadith is a matter of debate.[63] In the same collection of Sunni hadiths, however, the following is judged strong (hasan sahih): "There are six things with Allah for the martyr. He is forgiven with the first flow of blood (he suffers), he is shown his place in Paradise, he is protected from punishment in the grave, secured from the greatest terror, the crown of dignity is placed upon his head—and its gems are better than the world and what is in it—he is married to seventy two wives among Al-Huril-'Ayn of Paradise, and he may intercede for seventy of his close relatives."[66]

Sports

  • Abner Doubleday did not invent baseball, nor did it originate in Cooperstown, New York. It is believed to have evolved from other bat-and-ball games such as cricket and rounders and first took its modern form in New York City.[67][68] (See Origins of baseball.)
  • The black belt in martial arts does not necessarily indicate expert level or mastery. It was introduced for judo in the 1880s to indicate competency at all of the basic techniques of the sport. Promotion beyond black belt varies among different martial arts. In judo and some other Asian martial arts, holders of higher ranks are awarded belts with alternating red and white panels, and the highest ranks with solid red belts.[69]

Words, phrases and languages

Main articles: List of common false etymologies and Common English usage misconceptions

See also: List of common misconceptions about language learning

  • Non-standard, slang or colloquial terms used by English speakers are sometimes alleged not to be real words, despite appearing in numerous dictionaries. All words in English became accepted by being commonly used for a certain period of time; thus there are many informal words currently regarded as "incorrect" in formal speech or writing, but the idea that they are not words is a misconception.[70][71] Examples of words that are sometimes alleged not to be words include "irregardless",[72][73] "conversate", "funnest",[74] "mentee", "impactful", and "thusly",[75] all of which appear in numerous dictionaries as English words.[76]
  • The word "fuck" did not originate in Christianized Anglo-Saxon England (7th century CE) as an acronym for "Fornication Under Consent of King"; nor did it originate as an acronym for "For Unlawful Carnal Knowledge", either as a sign posted above adulterers in the stocks, or as a criminal charge against members of the British Armed Forces; nor did it originate during the 15th-century Battle of Agincourt as a corruption of "pluck yew" (an idiom falsely attributed to the English for drawing a longbow).[77] Modern English was not spoken until the 16th century, and words such as "fornication" and "consent" did not exist in any form in English until the influence of Anglo-Norman in the late 12th century. The earliest certain recorded use of "fuck" in English comes from c. 1475, in the poem "Flen flyys", where it is spelled fuccant (conjugated as if a Latin verb meaning "they fuck"). It is of Proto-Germanic origin, and is related to Dutch fokken, German ficken, and Norwegian fukka.[78]
  • The word "crap" did not originate as a back-formation of British plumber Thomas Crapper's surname, nor does his name derive from the word "crap", although the surname may have helped popularize the word.[79] The surname "Crapper" is a variant of "Cropper", which originally referred to someone who harvested crops.[80][81] The word "crap" ultimately comes from Medieval Latin crappa, meaning "chaff".[82]
  • The expression "rule of thumb" did not originate from a law allowing a man to beat his wife with a stick no thicker than his thumb, and there is no evidence that such a law ever existed.[83] The true origin of this phrase remains uncertain, but the false etymology has been broadly reported in media including The Washington Post (1989), CNN (1993), and Time magazine (1983).[84]
  • "Golf" did not originate as an acronym of "Gentlemen Only, Ladies Forbidden".[85] The word's true origin is unknown, but it existed in the Middle Scots period.[86][87][88]
  • The word "gringo" did not originate during the Mexican–American War (1846–48), the Venezuelan War of Independence (1811–23), the Mexican Revolution (1910–20), or in the American Old West (c. 1865–99) as a corruption of the lyrics "green grow" in either "Green Grow the Lilacs" or "Green Grow the Rushes, O" sung by US-American soldiers or cowboys;[89] nor did it originate during any of these times as a corruption of "Green go home!", falsely said to have been shouted at green-clad American troops.[90] The word originally simply meant "foreigner", and is probably a corruption of Spanish griego, "Greek".[91]
    (Image: "Xmas," along with a modern Santa Claus, used on a Christmas postcard, 1910.)
  • The anti-Italian slur wop did not originate as an acronym for "without papers" or "without passport", as is widely believed;[92][93][94] it actually derives from the term guappo (roughly meaning "thug"), and its use is attested from 1908,[95][96] predating modern immigration laws.[97]
  • "420" did not originate from the Los Angeles police or penal code for marijuana use.[98] In California, Police Code 420 means "juvenile disturbance",[99] and California Penal Code section 420 prohibits the obstruction of access to public land.[98][100] The use of "420" started in 1971 at San Rafael High School, where it indicated the time, 4:20 pm, when a group of students would go to smoke under the statue of Louis Pasteur.[98]
  • The word "the" was never pronounced or spelled "ye" in Old or Middle English.[101] The confusion derives from the use of the character thorn (þ) in abbreviations of the word "the", which in Middle English script looked similar to a y with a superscript e.[102][103]
  • "Xmas" did not originate as a secular plan to "take the Christ out of Christmas".[104] The X stands for the Greek letter chi, the first letter of Χριστός (Christos), "Christ" in Greek.[105] The use of "Xmas" in English can be traced to the year 1021, when monks in Great Britain used the X as an abbreviation for "Christ" while transcribing classical manuscripts into Old English.[104] The Oxford English Dictionary's first recorded use of "Xmas" for "Christmas" dates back to 1551.[106]
  • The pronunciation of coronal fricatives in Spanish did not arise as an imitation of a lisping king. Only one Spanish king, Peter of Castile, is documented as having a lisp, and the current pronunciation originated two centuries after his death.[107]
  • The Chevrolet Nova sold very well in Latin American markets; General Motors did not need to rename the car. While "no va" does mean "doesn't go" in Spanish, "nova" is understood as "new", and drivers in Mexico and Venezuela, where it was first sold, bought it eagerly. There was no need to change the model name,[108] despite claims to the contrary.[109][110]
  • Sign languages are not the same worldwide. Aside from the pidgin International Sign, each country generally has its own native sign language, and some have more than one (although there are also substantial similarities among all sign languages).[111][112][113]


Ancient to early modern history

  • Vomiting was not a regular part of Roman dining customs.[114] In ancient Rome, the architectural feature called a vomitorium was the entranceway through which crowds entered and exited a stadium, not a special room used for purging food during meals.[115]
  • The Library of Alexandria was not destroyed by the Muslim army during the capture of the city in 641. A popular story alleges that Caliph Umar ordered its destruction with the reasoning, "If those books are in agreement with the Quran, we have no need of them; and if they are opposed to the Quran, destroy them" (or some variation). This story did not appear in writing until hundreds of years after the alleged incident (most famously in the work of Bar Hebraeus in the 13th century), and contemporary accounts of the Arab invasion make no mention of the library's destruction. Modern consensus holds that the library had likely already been destroyed centuries before this incident.[116][117] (It is instead believed that the Library of Caesarea, a key repository of Christian literature, was the library destroyed around this time.)[118]
  • It is true that life expectancy in the Middle Ages and earlier was low; however, one should not infer that people usually died around the age of 30.[119] In fact, earlier low life expectancies were very strongly influenced by high infant mortality, and the life expectancy of people who lived to adulthood was much higher. A 21-year-old man in medieval England, for example, could by one estimate expect to live to the age of 64.[120]
  • There is no evidence that Vikings wore horns on their helmets.[121] In fact, the image of Vikings wearing horned helmets stems from the scenography of an 1876 production of the Der Ring des Nibelungen opera cycle by Richard Wagner.[122]
  • Vikings did not drink out of the skulls of vanquished enemies. This was based on a mistranslation of the skaldic poetic use of ór bjúgviðum hausa (branches of skulls) to refer to drinking horns.[123]
  • King Canute did not command the tide to reverse in a fit of delusional arrogance.[124] His intent that day, if the incident even happened, was most likely to prove a point to members of his privy council that no man is all-powerful, and we all must bend to forces beyond our control, such as the tides.
  • There is no evidence that iron maidens were invented in the Middle Ages or even used for torture. Instead they were pieced together in the 18th century from several artifacts found in museums in order to create spectacular objects intended for (commercial) exhibition.[125]
  • The plate armor of European soldiers did not stop soldiers from moving around or necessitate a crane to get them into a saddle. They would as a matter of course fight on foot and could mount and dismount without help. In fact, soldiers equipped with plate armor were more mobile than those with mail armor (chain armor), as mail was heavier and required stiff padding beneath due to its pliable nature.[126] It is true that armor used in tournaments in the late Middle Ages was significantly heavier than that used in warfare,[127] which may have contributed to this misconception.
  • Whether chastity belts, devices designed to prevent women from having sexual intercourse, were invented in medieval times is disputed by modern historians. Most existing chastity belts are now thought to be deliberate fakes or anti-masturbatory devices from the 19th and early 20th centuries. The latter were made due to the widespread belief that masturbation could lead to insanity, and were mostly bought by parents for their teenage children.[128]
  • Medieval Europeans did not believe Earth was flat; in fact, from the time of the ancient Greek philosophers Plato and Aristotle onward, belief in a spherical Earth remained almost universal among European intellectuals. As a result, Christopher Columbus's efforts to obtain support for his voyages were hampered not by belief in a flat Earth but by valid worries that the East Indies were farther away than he realized.[129] Had the Americas not existed, he would likely have run out of supplies before reaching Asia.
  • Columbus never reached any land that now forms part of the mainland United States of America; most of the landings Columbus made on his four voyages, including the initial October 12, 1492 landing (the anniversary of which forms the basis of Columbus Day), were on Caribbean islands which today are independent countries. Columbus was also not the first European to visit the Americas: at least one explorer, Leif Ericson, preceded him by reaching what is believed to be the island now known as Newfoundland, part of modern Canada, though he never made it to the mainland.[130][131]
  • Marco Polo did not import pasta from China,[132] a misconception which originated with the Macaroni Journal, published by an association of food industries with the goal of promoting the use of pasta in the United States.[133] Marco Polo describes a food similar to "lagana" in his Travels, but he uses a term with which he was already familiar. Durum wheat, and thus pasta as it is known today, was introduced by Arabs from Libya, during their conquest of Sicily in the late 7th century, according to the newsletter of the National Macaroni Manufacturers Association,[134] thus predating Marco Polo's travels to China by about six centuries.
  • Contrary to the popular image of the Pilgrim Fathers, the early settlers of the Plymouth Colony did not wear all black, and their capotains (hats) were shorter and rounder than the widely depicted tall hat with a buckle on it. Instead, their fashion was based on that of the late Elizabethan era: doublets, jerkins and ruffs. Both men and women wore the same style of shoes, stockings, capes, coats and hats in a range of colors including reds, yellows, purples, and greens.[135] According to Plimoth Plantation historian James W. Baker, the traditional image was formed in the 19th century when buckles were a kind of emblem of quaintness.[136][137]
  • The accused at the Salem witch trials were not burned at the stake; about 15 died in prison, 19 were hanged and one was pressed to death.[138][139]
  • Marie Antoinette did not say "let them eat cake" when she heard that the French peasantry were starving due to a shortage of bread. The phrase was first published in Rousseau's Confessions, when Marie was only nine years old; most scholars believe that Rousseau coined it himself, or that it was said by Marie-Thérèse, the wife of Louis XIV. Even then, the exact words were not "cake" but Qu'ils mangent de la brioche, "Let them eat brioche" (a rich type of bread). Because Marie Antoinette was an unpopular ruler, people attributed the phrase to her, in keeping with her reputation as hard-hearted and disconnected from her subjects.[140]
  • George Washington did not have wooden teeth. His dentures were made of gold, hippopotamus ivory, lead, animal teeth (including horse and donkey teeth),[141] and probably human teeth purchased from slaves.[142]
  • The signing of the United States Declaration of Independence did not occur on July 4, 1776. After the Second Continental Congress voted to declare independence on July 2, the final language of the document was approved on July 4, and it was printed and distributed on July 4–5.[143] However, the actual signing occurred on August 2, 1776.[144]
  • Benjamin Franklin did not propose that the wild turkey be used as the symbol for the United States instead of the bald eagle. While he did serve on a commission that tried to design a seal after the Declaration of Independence, his proposal was an image of Moses. His objections to the eagle as a national symbol and preference for the turkey were stated in a 1784 letter to his daughter in response to the Society of the Cincinnati's use of the former; he never expressed that sentiment publicly.[145][146]
  • There was never a bill to make German the official language of the United States that was defeated by one vote in the House of Representatives, nor has one been proposed at the state level. In 1794, a petition from a group of German immigrants asking the government to publish some laws in German was put aside on a procedural vote of 42 to 41. This was the basis of the Muhlenberg legend, named after the Speaker of the House at the time, Frederick Muhlenberg, a speaker of German descent who abstained from the vote.[147][148][149]
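The life-expectancy point earlier in this list is, at bottom, weighted-average arithmetic: a high rate of infant death drags the mean down even when adult survivors live long lives. A toy sketch of that effect (the 30% infant-mortality share and the age figures are illustrative assumptions, not historical data; only the age-64 adult estimate comes from the text above):

```python
def mean_life_expectancy(infant_share: float, infant_age: float, adult_age: float) -> float:
    """Mean age at death for a cohort split into infant deaths and adult deaths."""
    return infant_share * infant_age + (1 - infant_share) * adult_age

# Suppose 30% of a cohort dies around age 1 and the survivors live to ~64
# (the estimate quoted above for an adult man in medieval England).
# The mean "life expectancy at birth" is then far below either group's
# typical age at death:
print(f"{mean_life_expectancy(0.30, 1, 64):.1f} years")  # prints "45.1 years"
```

So a recorded life expectancy near 30 or 40 is compatible with many adults living into their sixties; the low mean reflects infant mortality, not typical adult lifespans.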

Modern history

  • Napoleon Bonaparte was not short. He was actually slightly taller than the average Frenchman of his time.[150][151] After his death in 1821, the French emperor's height was recorded as 5 feet 2 inches in French feet, which in English measurements is 5 feet 7 inches (1.69 m).[152][153] Some believe that he was nicknamed le Petit Caporal (The Little Corporal) as a term of affection.[154] Napoleon was often accompanied by his imperial guard, who were selected for their height[155]—this could have contributed to a perception that he was comparatively short.
  • Cinco de Mayo is not Mexico's Independence Day, but the celebration of the Mexican Army's victory over the French in the Battle of Puebla on May 5, 1862. Mexico's Declaration of Independence from Spain in 1810 is celebrated on September 16.[156][157]
  • Cowboy hats were not initially popular in the Western American frontier, with derby or bowler hats being the typical headgear of choice.[158] Heavy marketing of the Stetson "Boss of the Plains" model in the years following the American Civil War was the primary driving force behind the cowboy hat's popularity, with its characteristic dented top not becoming standard until near the end of the 19th century.[159]
  • The Great Chicago Fire of 1871 was not caused by Mrs. O'Leary's cow kicking over a lantern. A newspaper reporter invented the story to make colorful copy.[160]
  • The claim that Frederic Remington, on assignment to Cuba in 1897, telegraphed William Randolph Hearst that "There will be no war. I wish to return" and that Hearst responded, "Please remain. You furnish the pictures, and I'll furnish the war" is unsubstantiated. This anecdote was originally included in a book by James Creelman, though there is no evidence that the telegraph exchange ever happened, and substantial evidence that it did not.[161][162]
  • Immigrants' last names were not Americanized (voluntarily, mistakenly, or otherwise) upon arrival at Ellis Island. Officials there kept no records other than checking ship manifests created at the point of origin, and there was simply no paperwork that could have created such an effect, let alone any law. At the time in New York, anyone could change the spelling of their name simply by using the new spelling.[163] Names changed this way are often referred to as an "Ellis Island Special".
  • The common image of Santa Claus (Father Christmas) as a jolly old man in red robes was not created by The Coca-Cola Company as an advertising gimmick. Although Santa Claus had historically been depicted with varying characteristics and in robes of various colors, he had already taken his modern form in popular culture and seen extensive use in other companies' advertisements and other mass media by the time Coca-Cola began using his image in the 1930s.[164]
  • Italian dictator Benito Mussolini did not "make the trains run on time". Much of the repair work had been performed before Mussolini and the Fascists came to power in 1922. Accounts from the era also suggest that the Italian railways' legendary adherence to timetables was more propaganda than reality.[165]
  • There was no widespread outbreak of panic across the United States in response to Orson Welles' 1938 radio adaptation of H.G. Wells' The War of the Worlds. Only a very small share of the radio audience was even listening to it, and isolated reports of scattered incidents and increased call volume to emergency services were played up the next day by newspapers, eager to discredit radio as a competitor for advertising. Both Welles and CBS, which had initially reacted apologetically, later came to realize that the myth benefited them and actively embraced it in later years.[166][167]
  • There is no evidence of Polish cavalry mounting a brave but futile charge against German tanks using lances and sabres during the German invasion of Poland in 1939. This story may have originated from German propaganda efforts following the charge at Krojanty, in which a Polish cavalry brigade surprised German infantry in the open, and successfully charged and dispersed them, until driven off by armoured cars. While Polish cavalry still carried the sabre for such opportunities, they were trained to fight as highly mobile, dismounted cavalry (dragoons) and issued with light anti-tank weapons.[168][169]
  • During the occupation of Denmark by the Nazis during World War II, King Christian X of Denmark did not thwart Nazi attempts to identify Jews by wearing a yellow star himself. Jews in Denmark were never forced to wear the Star of David. The Danish resistance did help most Jews flee the country before the end of the war.[170]
  • Albert Einstein did not fail mathematics classes (never "flunked a math exam") in school. Upon seeing a column making this claim, Einstein said "I never failed in mathematics... Before I was fifteen I had mastered differential and integral calculus."[171][172] Einstein did however fail his first entrance exam into the Swiss Federal Polytechnic School (ETH) in 1895, when he was two years younger than his fellow students but scored exceedingly well in the mathematics and science sections, then passed on his second attempt.[173]
  • Actor Ronald Reagan was never seriously considered for the role of Rick Blaine in the 1942 film classic Casablanca, eventually played by Humphrey Bogart. This belief came from an early studio press release announcing the film's production that used his name to generate interest in the film. But by the time it had come out, Warner Bros. knew that Reagan was unavailable for any roles in the foreseeable future since he was no longer able to defer his entry into military service.[174] Studio records show that producer Hal B. Wallis had always wanted Bogart for the part.[175][176]
  • U.S. Senator George Smathers never gave a speech to a rural audience describing his opponent, Claude Pepper, as an "extrovert" whose sister was a "thespian", in the apparent hope they would confuse them with similar-sounding words like "pervert" and "lesbian". Time, which is sometimes cited as the source, described the story of the purported speech as a "yarn" at the time,[177] and no Florida newspaper reported such a speech during the campaign. The leading reporter who covered Smathers said he always gave the same boilerplate speech. Smathers had offered US$10,000 to anyone who could prove he had made the speech; it was never claimed.[178]
  • John F. Kennedy's words "Ich bin ein Berliner" are standard German for "I am a Berliner."[179][180] An urban legend has it that his use of the indefinite article ein made the sentence mean "I am a jelly doughnut", and that the population of Berlin was amused by the supposed mistake. In fact, the word Berliner is not commonly used in Berlin to refer to the Berliner Pfannkuchen; the pastry is usually called simply ein Pfannkuchen.[181]
  • African American intellectual and activist W.E.B. Du Bois did not renounce his U.S. citizenship while living in Ghana shortly before his death,[182] as is often claimed.[183][184][185] In early 1963, due to his membership in the Communist Party and support for the Soviet Union, the U.S. State Department did not renew his passport while he was already in Ghana overseeing the creation of the Encyclopedia Africana. After leaving the embassy, he stated his intention to renounce his citizenship in protest. But while he took Ghanaian citizenship, he never went through the process of renouncing his American citizenship,[186] and may not even have intended to.[182]
  • When bartender Kitty Genovese was murdered outside her Queens apartment in 1964, 37 neighbors did not, as The New York Times initially reported to widespread and long-lasting public outrage, stand idly by and wait until she was dead to call the police.[187] Later reporting established that the police report on which the Times had relied was inaccurate, that Genovese had been attacked twice in different locations, and that while many witnesses heard the attack, they heard only brief portions and did not realize what was occurring; only six or seven reported actually seeing anything. Some called police; one who did not said "I didn't want to get involved", an attitude later attributed to all the residents who saw or heard part of the attack.[188]
  • The Rolling Stones were not performing "Sympathy for the Devil" at the 1969 Altamont Free Concert when Meredith Hunter was stabbed to death by a member of the local Hells Angels chapter that was serving as security. While the incident that culminated in Hunter's death began while the band was performing the song, prompting a brief interruption before the Stones finished it, it concluded several songs later as the band was performing "Under My Thumb".[189][190] The misconception arose from mistaken reporting in Rolling Stone.[191]
  • The Pruitt–Igoe housing project in St. Louis, Missouri, considered to epitomize the failures of urban renewal in American cities after it was demolished in the early 1970s, never won any awards for its design, although one architectural magazine praised it before construction as "the best high apartment of the year".[192] The architectural firm that designed the buildings did win an award for an earlier St. Louis project, which may have been confused with Pruitt–Igoe.[193]
  • Although popularly known as the "red telephone", the Moscow–Washington hotline was never a telephone line, nor were red phones used. The first implementation of the hotline used teletype equipment, which was replaced by facsimile (fax) machines in 1988. Since 2008, the hotline has been a secure computer link over which the two countries exchange emails.[194] Moreover, the hotline links the Kremlin to the Pentagon, not the White House.[195]

Science and technology

See also: Scientific misconceptions, Tornado myths, and List of misconceptions about illegal drugs


  • The Great Wall of China is not, as is often claimed, the only human-made object visible from the Moon or from space. None of the Apollo astronauts reported seeing any specific human-made object from the Moon, and even Earth-orbiting astronauts can barely see it. City lights, however, are easily visible on the night side of Earth from orbit.[196] Shuttle astronaut Jay Apt has been quoted as saying that "the Great Wall is almost invisible from only 180 miles (290 km) up."[197] (See Man-made structures visible from space.) ISS commander Chris Hadfield attempted to find it from space, but said that it was "hard as it's narrow and dun-colored."[198]
  • Black holes have the same gravitational effects as any other equal mass in their place. They will draw objects nearby towards them, just as any other planetary body does, except at very close distances.[199] If, for example, the Sun were replaced by a black hole of equal mass, the orbits of the planets would be essentially unaffected. A black hole can act like a "cosmic vacuum cleaner" and pull a substantial inflow of matter, but only if the star it forms from is already having a similar effect on surrounding matter.[200]
  • Seasons are not caused by the Earth being closer to the Sun in the summer than in the winter. In fact, the Earth is farthest from the Sun when it is summer in the Northern Hemisphere. Seasons are caused by Earth's 23.4-degree axial tilt. In July, the Northern Hemisphere is tilted towards the Sun resulting in longer days and more direct sunlight; in January, it is tilted away. The seasons are reversed in the Southern Hemisphere, which is tilted towards the Sun in January and away from the Sun in July.[201][202]
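The axial-tilt explanation above can be sketched numerically. This is a minimal illustration using a standard sinusoidal approximation for the Sun's declination; the function name and the day-81 (March equinox) offset are conventions of that approximation, not anything from the text:

```python
import math

def solar_declination_deg(day_of_year: int) -> float:
    """Approximate solar declination (degrees) for a given day of the year.

    A common sinusoidal model: the declination swings between roughly
    +23.4 deg (June solstice) and -23.4 deg (December solstice) purely
    because of Earth's 23.4-degree axial tilt, crossing zero near the
    March equinox (around day 81).
    """
    return 23.4 * math.sin(math.radians(360.0 / 365.0 * (day_of_year - 81)))

# Around the June solstice (day ~172) the Northern Hemisphere tilts toward
# the Sun; around the December solstice (day ~355) it tilts away.
print(f"June solstice:     {solar_declination_deg(172):+.1f} deg")
print(f"December solstice: {solar_declination_deg(355):+.1f} deg")
```

The two printed values are equal and opposite: the hemispheres simply trade places, which is why the seasons are reversed south of the equator even though the whole planet is slightly closer to the Sun in January.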
(Image: Marcos Torregrosa wearing a black belt with a red bar. In some martial arts, such as Brazilian Jiu-Jitsu and Judo, red belts indicate a higher rank than black: in some cases a solid red belt is reserved for the founder of the art, and in others higher degrees of black belt are shown by red stripes.)
(Image: Medieval depiction of a spherical Earth.)
(Image: Napoleon on the Bellerophon, a painting of Napoleon I by Charles Lock Eastlake. Napoleon was taller than his nickname, The Little Corporal, suggests.)
(Image: A satellite image of a section of the Great Wall of China, running diagonally from lower left to upper right (not to be confused with the much more prominent river running from upper left to lower right). The region pictured is 12 by 12 kilometres (7.5 mi × 7.5 mi).)
