
Atheism – Simple English Wikipedia, the free encyclopedia


Atheism is the rejection of belief that there is a god.[1][2] It is the opposite of theism, which is the belief that at least one god exists. A person who rejects belief in gods is called an atheist.

Atheism is not the same as agnosticism. Agnostics say that there is no way to know whether gods exist or not.[3] Being an agnostic does not have to mean a person rejects or believes in god. Some agnostics are theists, believing in god. The theologian Kierkegaard is an example. Other agnostics are atheists.

Atheists often give reasons why they do not believe in a god or gods. Three of the reasons that they often give are the problem of evil, the argument from inconsistent revelations, and the argument from nonbelief. Not all atheists think these reasons provide complete proof that gods cannot exist, but they are reasons given to support rejecting belief that gods exist. Some atheists think there is no evidence for any god or gods and goddesses, so believing in any type of theism means believing unproved assumptions. These atheists think a simpler explanation for everything is methodological naturalism, which means that only natural things exist. Occam's razor shows that simple explanations without many unproved guesses are more likely to be true.[4]

The word atheism comes from the Greek language. It can be divided into a- (ἀ-), a Greek prefix meaning "without", and theos (θεός), meaning "god", and recombined to form "without gods"[6] or "godless". In Ancient Greece it also meant "impious".

Starting in about the 5th century BC, the word came to describe people who were "severing relations with the gods" or "denying the gods". Before then, the meaning had been closer to "impious". There is also the abstract noun ἀθεότης (atheotēs), "atheism".

Cicero transliterated the Greek word into the Latin atheos. This word was often used in the debate between early Christians and Hellenists. Each side used it to label the other, in a bad way.[7]

Karen Armstrong writes that "During the sixteenth and seventeenth centuries, the word 'atheist' was still reserved exclusively for polemic … The term 'atheist' was an insult. Nobody would have dreamed of calling himself an atheist."[8] Atheism was first used to describe an openly positive belief in late 18th-century Europe, meaning disbelief in the monotheistic Abrahamic god.[9] The 20th century saw the term expand to refer to disbelief in all deities. However, it is still common in Western society to describe atheism as simply "disbelief in God".[10]

In many places, it is (or was) a crime to make public the idea of atheism. Examples would be to claim the Bible or Qur'an could not be true, or to speak or write that there is no god.[11]

Muslim apostasy, that is, becoming an atheist or believing in a god other than Allah, may be a dangerous act in places with many conservative Muslim people. Many religious courts have punished, and some still punish, this act with the death penalty. Many countries still have laws against atheism.[12][13][14]

Atheism is becoming more common,[15] mainly in South America, North America, Oceania and Europe (measured by the percentage of people who had a religion before and became atheists).

In many countries, mainly in the Western world, there are laws that protect atheists’ right to express their atheistic belief (freedom of speech). This means that atheists have the same rights under the law as everyone else. Freedom of religion in international law and treaties includes the freedom to not have a religion.

Today, about 2.3% of the world’s population describes itself as atheist. About 11.9% is described as nontheist.[16] Between 64% and 65% of Japanese describe themselves as atheists, agnostics, or non-believers,[17][18] and up to 48% in Russia.[17] The percentage of such people in European Union member states ranges between 6% (Italy) and 85% (Sweden).[17]

People disagree about what atheism means. They disagree on when to call certain people atheists or not.

Atheism has sometimes been described as simply not believing in God. This is very general. It includes people who have never heard about God, but would believe in God if they did learn about God.

George H. Smith created the expressions "implicit atheism" and "explicit atheism" to describe two different types of atheism. Implicit atheism is not believing in God because you do not know about God. Explicit atheism is not believing in God after learning about God.

In 1772, Baron d’Holbach said that “All children are born Atheists; they have no idea of God”.[19]

In 1979 George H. Smith said that: “The man who is unacquainted with theism is an atheist because he does not believe in a god. This category would also include the child [who is able to] grasp the issues involved, but who is still unaware of those issues. The fact that this child does not believe in god qualifies him as an atheist”.[20]

These two quotes describe implicit atheism.

Ernest Nagel disagrees with Smith's definition of atheism as an "absence of theism", saying only explicit atheism is true atheism.[21] This means that Nagel believes that to be an atheist, a person needs to know about God and then reject the idea of God.

Philosophers like Antony Flew,[22] Michael Martin,[10] and William L. Rowe[23] have looked at strong (sometimes called positive) atheism against weak (sometimes called negative) atheism. According to this idea, anyone who does not believe in a god or gods is either a weak or a strong atheist.[24]

Strong atheism is the certain belief that no god exists; an older name for strong atheism is "positive atheism". Weak atheism is all other forms of not believing in a god or gods; an older name for weak atheism is "negative atheism". These terms have been used more in philosophical writing[22] and in Catholic beliefs[25] since at least 1813.[26][27] Under this definition of atheism, most agnostics are weak atheists.

Michael Martin says that agnosticism includes weak atheism.[10] Some agnostics, including Anthony Kenny, disagree. They think being an agnostic is different from being an atheist. They think atheism is no different from believing in a god, because both require belief. This overlooks the reality that agnostics also have their own belief or "claim to knowledge".[28]

Agnostics say that it cannot be known if a god or gods exist. In their view, strong atheism requires a leap of faith. The mathematician W. K. Clifford wrote an essay called The Ethics of Belief.[29] In this essay, Clifford gives examples of how people can make themselves believe things that go against what they see or know. One of these examples is a story about a ship captain who transported emigrants, who had to pay to travel on the ship. The ship was old and badly needed repairs. The captain thought about fixing the ship, but then decided not to. He told himself that the ship had safely made many trips and survived many storms before, so it would be fine without repairs and there was nothing to be afraid of. Unfortunately the ship sank and everyone on board died. The captain, who owned the ship, was greedy and took the money the insurance paid for it. According to Clifford, the captain did something wrong: when he made himself believe there were no problems with the ship, he did so because he was greedy. Even if the ship had made its trip safely, the captain would still have done something wrong. According to Clifford, it is always wrong to believe something without enough reasons.[3]

Atheists usually respond by saying that there is no difference between an idea about religion with no proof, and an idea about other things with no proof.[30] The lack of proof about whether a god exists does not mean that there is no god, but it also does not mean that there is a god.[31] Scottish philosopher J. J. C. Smart says that "sometimes a person who is really an atheist may describe herself, even passionately, as an agnostic because of unreasonable generalised philosophical skepticism which would preclude us from saying that we know anything whatever, except perhaps the truths of mathematics and formal logic."[32] So, some popular atheist authors such as Richard Dawkins like to show the difference between theist, agnostic and atheist positions by the probability assigned to the statement "God exists".[33]

In everyday life, many people define natural phenomena without the need of a god or gods. They do not deny the existence of one or more gods, they simply say that this existence is not necessary. Gods do not provide a purpose to life, nor influence it, according to this view.[34] Many scientists practice what they call methodological naturalism. They silently adopt philosophical naturalism and use the scientific method. Their belief in a god does not affect their results.[35]

Practical atheism can take different forms:

Theoretic atheism tries to find arguments against the existence of god, and to disprove the arguments of theism, such as the argument from design or Pascal's Wager. These theoretical reasons have many forms; most of them are ontological or epistemological. Some rely on psychology or sociology.

According to Immanuel Kant, there can be no proof of a supreme being that is made using reason. In his work Critique of Pure Reason, he tries to show that all attempts to either prove or disprove the existence of God end in logical contradictions. Kant says that it is impossible to know whether there are any higher beings. This makes him an agnostic.

Ludwig Feuerbach published The Essence of Christianity in 1841.[37] In his work he postulates the following:

The following phrases sum up Feuerbach’s writing:

Read more:

Atheism – Simple English Wikipedia, the free encyclopedia


First Amendment to the United States Constitution – Simple …


The First Amendment to the United States Constitution is a part of the United States Bill of Rights that protects freedom of speech, freedom of religion, freedom of assembly, freedom of the press, and right to petition.

The Establishment Clause does not allow the government to support one religion more than any other religion. The government also can not say a religion or a god is true. This is often described as “separation of church and state”, where “state” means “the government”. It also does not allow the government to establish a national religion. It allows people to debate religion freely without the federal government of the United States getting involved. The clause did not stop the various states from supporting a particular religion, and several states did.

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Read more here:
First Amendment to the United States Constitution – Simple …


Fourth Amendment – Kids | Laws.com


A Guide to the Fourth Amendment

The Fourth Amendment, or Amendment IV of the United States Constitution, is the section of the Bill of Rights that protects people from being searched or having their things taken away from them without any good reason. If the government or any law enforcement official wants to do that, he or she must have a very good reason and must get permission to perform the search from a judge. The Fourth Amendment was introduced into the Constitution of the United States as a part of the Bill of Rights on September 5, 1789 and was ratified, or voted for, by three fourths of the states on December 15, 1791.

The Text of the Fourth Amendment

The text of the Fourth Amendment which is found in the United States Constitution and the Bill of Rights is the following:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

History of the Fourth Amendment

In Colonial America, laws were written in order to help the English earn money on customs. The justices of the peace would do this by writing general warrants, which allowed general search and seizure to happen. Massachusetts wrote a law in 1756 that banned these warrants, because tax collectors were abusing their powers by searching the colonists' homes for illegal goods. These general warrants allowed any messenger or officer to search a suspected place without any evidence. They also allowed them to seize people without even saying what they did wrong or showing evidence of their wrongdoings. Virginia also banned the use of general warrants later due to other fears. These actions later led to the addition of the Fourth Amendment in the Bill of Rights.

The Fourth Amendment Today

Today, the Fourth Amendment means that in order for a police officer to search and arrest someone, he or she will need to get permission or a warrant to do so from a judge. In order to get a warrant, the police officer must have evidence or probable cause that supports it. The police officer, or whoever has the evidence, must swear that it is true to his or her knowledge.

Facts About the Fourth Amendment

The Fourth Amendment applies to the government, but not to searches done by organizations or people who are not acting for the government.

Some searches can be done without a warrant without breaking the law, like when there is a good reason to think that a crime is happening.


See the original post:
Fourth Amendment – Kids | Laws.com

1st Amendment – Revolutionary War and Beyond!



The 1st Amendment is the most well known to Americans of all the amendments in the Bill of Rights. It contains some of the most familiar phrases in political discussion, such as freedom of religion, freedom of speech and freedom of the press. The 1st Amendment reads like this:

“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

The 1st Amendment protects your right to believe and practice whatever religious principles you choose and your right to say what you believe, even if it is unpopular or against the will of elected officials.

It also protects your right to publish any information you want, join together with whomever you want and ask the government to correct its own errors.

What exactly does the 1st Amendment mean and how does it apply to people today? Does it have relevance to you today? It sure does. In fact, it affects just about everything you do.

The 1st Amendment has seven clauses. This page has a brief description of each clause with links to more detailed information about the history and purpose of each section.

The Opening Phrase of the 1st Amendment says “Congress shall make no law.” This opening phrase immediately tells exactly who this amendment is aimed at… and that entity is Congress. So the 1st Amendment specifically prohibits Congress from making laws interfering with the rights mentioned in the amendment.

It does not, however, prohibit the states from making such laws, nor does it prohibit individuals from restricting these rights to those who may be under their authority, such as a parent and child or an employer and an employee.

For one hundred years the 1st Amendment was understood to apply only to the federal government, but after the Civil War, when the 14th Amendment was added to the Constitution, courts began to forbid the states from interfering with these rights as well, under an idea called "due process of law."

Learn more about the Opening Phrase of the 1st Amendment here.

The Establishment Clause is the part of the 1st Amendment that says Congress shall make no law “respecting an establishment of religion.” This is a very crucial part of the American Constitution. It prohibits the government from establishing a state religion or denomination and from directing people in what they must believe.

Without the Establishment Clause, the government could choose a state religion and force everyone to participate in it. It could also punish anyone who didn’t adhere to its chosen faith.

This clause has been the focus of much debate in the last half century. Some Americans believe that whenever the government is involved, absolutely all religious expression must be forbidden in order to comply with the Establishment Clause.

For example, they might say a public school football team should not pray at a football game because the school is a government funded school.

Other Americans believe the government must make certain allowances for religious expressions in public events and buildings because Americans are a very religious people. They believe a high school football team prayer or a government employee displaying a cross at work does not violate the Establishment Clause because it is simply a personal expression and not an expression endorsed by the state.

Indeed, in the minds of some, banning expressions of religious faith like this is a violation of another clause of the 1st Amendment – the Free Exercise Clause, because it seeks to control the religious expressions of citizens.

Learn more about the history and purpose of the Establishment Clause here.

The Free Exercise Clause is the part of the 1st Amendment that says Congress shall make no law respecting the establishment of religion or “the free exercise thereof.” This phrase deals with the restriction on Congress to regulate anyone’s religious practices.

In general, Congress cannot tell people how they can or cannot express their religious beliefs. Such things as telling people when or how to pray, when they should go to church or to whom they should pray, are off limits to lawmakers.

In general, this is the case, but sometimes, minority religious groups may want to practice something that is not generally accepted or that the state has a very strong interest in regulating. For example, polygamy, ritual sacrifice and drug usage have been banned at times, because there is a compelling public interest in eliminating these behaviors.

In such cases, the Supreme Court has often ruled that the Free Exercise Clause does not apply. In other words, the Free Exercise Clause does not give free license to any behavior that someone says is their religious belief.

You can learn all about the Free Exercise Clause here.

The Freedom of Speech Clause is the part of the 1st Amendment that says, “Congress shall make no law… abridging the freedom of speech.”

British history contained a long string of suppression by those in authority of those with whom they disagreed. Many British subjects had been thrown in prison for voicing their religious and political beliefs. The Americans intended to prevent this from ever happening in their newly formed republic.

This is one of the protections in the Constitution that Americans hold most dear. They value it because it allows them to speak out against government policies they don’t like. It also allows them to express the religious beliefs of their choosing.

Negatively speaking, many people abuse this right by slandering people they disagree with, or using ugly and offensive language, racial epithets or hateful language about people who are different than they are.

Generally, freedom of speech is considered to be not only the words people speak, but any type of expression that is used to convey an idea. Such things as picketing, wearing symbols or burning the flag are considered protected forms of speech because they are expressing the ideas of the people participating in them.

You can learn more about the Freedom of Speech Clause by clicking here.

The Freedom of the Press Clause states that “Congress shall make no law… abridging the freedom… of the press.”

This was a very important principle to the Founding Fathers of America because of the importance the press played during the Revolutionary War.

Without the press, the Founding Fathers would have found it very difficult to distribute their views to people in other parts of the country. The press turned out to be a very important instigator in getting Americans to consolidate their views against England and in spreading the concepts that would justify a break with England.

English history contained no freedoms for the press whatsoever. All publications were subject to governmental review before publication. Criticisms of the government were strictly prosecuted as sedition. All Americans wanted the right to criticize their government freely as well as to discuss other topics whenever they chose.

If you would like to learn more about the Freedom of the Press Clause, please click here.

The Freedom of Assembly Clause is the part of the First Amendment that reads like this: "Congress shall make no law… abridging… the right of the people peaceably to assemble…" This clause is also sometimes referred to as the Freedom of Association Clause. This clause protects the right of all Americans to assemble in peace.

The Freedom of Assembly was very important to early Americans because without the right to assemble, they could not coordinate their opposition to the British government. The Freedom of Assembly was recognized to be of utmost importance if the Americans were to be successful in establishing a government of the people.

The Freedom of Assembly Clause has been relied upon by many groups in American history, such as civil rights groups, women’s suffrage groups and labor unions. Government officials in each case tried to restrict the speech of these groups and prevent them from meeting, organizing and getting their message out. The Freedom of Assembly proved to be an important factor that allowed these groups to prosper and see their visions fulfilled.

You can learn more about the history and importance of the Freedom of Assembly Clause here.

King George III, by Allan Ramsay

The Freedom of Petition Clause of the 1st Amendment reads like this:

“Congress shall make no law… abridging the freedom… of the people… to petition the Government for a redress of grievances.”

The freedom to petition the government was very important to early Americans because of their experience with trying to get King George III and Parliament to respond to their grievances. The colonists were so angry about the Monarchy’s refusal to acknowledge their grievances that they mentioned this fact in the Declaration of Independence.

The freedom to petition the government for redress of grievances has come to include the right to do such things as picketing, protesting, conducting peaceful sit-ins or boycotts and addressing government officials through any media available.

You can read more about the history and meaning of the Freedom of Petition Clause here.


See more here:
1st Amendment – Revolutionary War and Beyond!

Nihilism – Wikipedia, the free encyclopedia


Nihilism (from the Latin nihil, "nothing") is a philosophical doctrine that suggests the lack of belief in one or more reputedly meaningful aspects of life. Most commonly, nihilism is presented in the form of existential nihilism, which argues that life is without objective meaning, purpose, or intrinsic value.[1] Moral nihilists assert that morality does not inherently exist, and that any established moral values are abstractly contrived. Nihilism can also take epistemological or ontological/metaphysical forms, meaning respectively that, in some aspect, knowledge is not possible, or that reality does not actually exist.

The term is sometimes used in association with anomie to explain the general mood of despair at a perceived pointlessness of existence that one may develop upon realising there are no necessary norms, rules, or laws.[2] Movements such as Futurism and deconstruction,[3] among others, have been identified by commentators[who?] as “nihilistic”.

Nihilism is also a characteristic that has been ascribed to time periods: for example, Jean Baudrillard and others have called postmodernity a nihilistic epoch,[4] and some Christian theologians and figures of religious authority have asserted that postmodernity[5] and many aspects of modernity[3] represent a rejection of theism, and that such rejection of their theistic doctrine entails nihilism.

Nihilism has many definitions, and thus can describe philosophical positions that are arguably independent.

Metaphysical nihilism is the philosophical theory that concrete objects and physical constructs might not exist in some possible world, or that even if there exist possible worlds that contain some concrete objects, there is at least one that contains only abstract objects.

An extreme form of metaphysical nihilism is commonly defined as the belief that nothing exists as a correspondent component of the self-efficient world.[6] The American Heritage Medical Dictionary defines one form of nihilism as "an extreme form of skepticism that denies all existence."[7] A similar position can be found in solipsism; however, the solipsist affirms the self, whereas the nihilist would deny it.[8] Both these positions are considered forms of anti-realism.[9]

Epistemological nihilism is a form of skepticism in which all knowledge is accepted as possibly untrue or unable to be known. Additionally, morality is seen as subjective or false.[10]

Mereological nihilism (also called compositional nihilism) is the position that objects with proper parts do not exist (not only objects in space, but also objects existing in time do not have any temporal parts), and only basic building blocks without parts exist, and thus the world we see and experience full of objects with parts is a product of human misperception (i.e., if we could see clearly, we would not perceive compositive objects).

This interpretation of existence must be based on resolution. The resolution with which humans see and perceive the “improper parts” of the world is not an objective fact of reality, but is rather an implicit trait that can only be qualitatively explored and expressed. Therefore, there is no arguable way to surmise or measure the validity of mereological nihilism. Example: An ant can get lost on a large cylindrical object because the circumference of the object is so large with respect to the ant that the ant effectively feels as though the object has no curvature. Thus, the resolution with which the ant views the world it exists “within” is a very important determining factor in how the ant experiences this “within the world” feeling.

Existential nihilism is the belief that life has no intrinsic meaning or value. With respect to the universe, existential nihilism posits that a single human or even the entire human species is insignificant, without purpose and unlikely to change in the totality of existence. The meaninglessness of life is largely explored in the philosophical school of existentialism.

Moral nihilism, also known as ethical nihilism, is the meta-ethical view that morality does not exist as something inherent to objective reality; therefore no action is necessarily preferable to any other. For example, a moral nihilist would say that killing someone, for whatever reason, is not inherently right or wrong.

Other nihilists may argue not that there is no morality at all, but that if it does exist, it is a human construction and thus artificial, wherein any and all meaning is relative for different possible outcomes. As an example, if someone kills someone else, such a nihilist might argue that killing is not inherently a bad thing, or bad independently from our moral beliefs, because of the way morality is constructed as some rudimentary dichotomy. What is said to be a bad thing is given a higher negative weighting than what is called good: as a result, killing the individual was bad because it did not let the individual live, which was arbitrarily given a positive weighting. In this way a moral nihilist believes that all moral claims are void of any truth value. An alternative scholarly perspective is that moral nihilism is a morality in itself. Cooper writes, “In the widest sense of the word ‘morality’, moral nihilism is a morality.”[11]

Political nihilism, a branch of nihilism, follows the characteristic nihilist’s rejection of non-rationalized or non-proven assertions; in this case the necessity of the most fundamental social and political structures, such as government, family, and law. An influential analysis of political nihilism is presented by Leo Strauss.[12]

The Russian Nihilist movement was a Russian trend in the 1860s that rejected all authority.[13] Their name derives from the Latin nihil, meaning "nothing". After the assassination of Tsar Alexander II in 1881, the Nihilists gained a reputation throughout Europe as proponents of the use of violence for political change.[citation needed] The Nihilists expressed anger at what they described as the abusive nature of the Eastern Orthodox Church and of the tsarist monarchy, and at the domination of the Russian economy by the aristocracy. Although the term Nihilist was first popularised by the German theologian Friedrich Heinrich Jacobi (1743–1819), its widespread usage began with the 1862 novel Fathers and Sons by the Russian author Ivan Turgenev. The main character of the novel, Eugene Bazarov, who describes himself as a Nihilist, wants to educate the people. The "go to the people, be the people" campaign reached its height in the 1870s, during which underground groups such as the Circle of Tchaikovsky, the People's Will, and Land and Liberty formed. It became known as the Narodnik movement, whose members believed that the newly freed serfs were merely being sold into wage slavery in the onset of the Industrial Revolution, and that the middle and upper classes had effectively replaced landowners. The Russian state attempted to suppress the movement. In actions described by the Nihilists as propaganda of the deed, many government officials were assassinated. In 1881 Alexander II was killed on the very day he had approved a proposal to call a representative assembly to consider new reforms.

The term nihilism was first used by Friedrich Heinrich Jacobi (1743–1819). Jacobi used the term to characterize rationalism,[14] and in particular Immanuel Kant's "critical" philosophy, in order to carry out a reductio ad absurdum according to which all rationalism (philosophy as criticism) reduces to nihilism, and thus it should be avoided and replaced with a return to some type of faith and revelation. Bret W. Davis writes, for example, "The first philosophical development of the idea of nihilism is generally ascribed to Friedrich Jacobi, who in a famous letter criticized Fichte's idealism as falling into nihilism. According to Jacobi, Fichte's absolutization of the ego (the 'absolute I' that posits the 'not-I') is an inflation of subjectivity that denies the absolute transcendence of God."[15] A related but oppositional concept is fideism, which sees reason as hostile and inferior to faith.

With the popularizing of the word nihilism by Ivan Turgenev, a new Russian political movement called the Nihilist movement adopted the term. They supposedly called themselves nihilists because nothing “that then existed found favor in their eyes”.[16]

Søren Kierkegaard (1813–1855) posited an early form of nihilism, to which he referred as levelling.[17] He saw levelling as the process of suppressing individuality to a point where the individual's uniqueness becomes non-existent and nothing meaningful in his existence can be affirmed:

Levelling at its maximum is like the stillness of death, where one can hear one’s own heartbeat, a stillness like death, into which nothing can penetrate, in which everything sinks, powerless. One person can head a rebellion, but one person cannot head this levelling process, for that would make him a leader and he would avoid being levelled. Each individual can in his little circle participate in this levelling, but it is an abstract process, and levelling is abstraction conquering individuality.

Kierkegaard, an advocate of a philosophy of life, generally argued against levelling and its nihilist consequence, although he believed it would be "genuinely educative to live in the age of levelling [because] people will be forced to face the judgement of [levelling] alone."[18] George Cotkin asserts Kierkegaard was against "the standardization and levelling of belief, both spiritual and political, in the nineteenth century [and he] opposed tendencies in mass culture to reduce the individual to a cipher of conformity and deference to the dominant opinion."[19] In his day, tabloids (like the Danish magazine Corsaren) and apostate Christianity were instruments of levelling and contributed to the "reflective apathetic age" of 19th century Europe.[20] Kierkegaard argues that individuals who can overcome the levelling process are stronger for it and that it represents a step in the right direction towards "becoming a true self."[18][21] As we must overcome levelling,[22] Hubert Dreyfus and Jane Rubin argue that Kierkegaard's interest, "in an increasingly nihilistic age, is in how we can recover the sense that our lives are meaningful".[23]

Note however that Kierkegaard’s meaning of “nihilism” differs from the modern definition in the sense that, for Kierkegaard, levelling led to a life lacking meaning, purpose or value,[20] whereas the modern interpretation of nihilism posits that there was never any meaning, purpose or value to begin with.

Nihilism is often associated with the German philosopher Friedrich Nietzsche, who provided a detailed diagnosis of nihilism as a widespread phenomenon of Western culture. Though the notion appears frequently throughout Nietzsche’s work, he uses the term in a variety of ways, with different meanings and connotations, all negative[citation needed]. Karen Carr describes Nietzsche’s characterization of nihilism “as a condition of tension, as a disproportion between what we want to value (or need) and how the world appears to operate.”[24] When we find out that the world does not possess the objective value or meaning that we want it to have or have long since believed it to have, we find ourselves in a crisis.[25] Nietzsche asserts that with the decline of Christianity and the rise of physiological decadence,[clarification needed] nihilism is in fact characteristic of the modern age,[26] though he implies that the rise of nihilism is still incomplete and that it has yet to be overcome.[27] Though the problem of nihilism becomes especially explicit in Nietzsche’s notebooks (published posthumously), it is mentioned repeatedly in his published works and is closely connected to many of the problems mentioned there.

Nietzsche characterized nihilism as emptying the world and especially human existence of meaning, purpose, comprehensible truth, or essential value. This observation stems in part from Nietzsche’s perspectivism, or his notion that “knowledge” is always by someone of some thing: it is always bound by perspective, and it is never mere fact.[28] Rather, there are interpretations through which we understand the world and give it meaning. Interpreting is something we can not go without; in fact, it is something we need. One way of interpreting the world is through morality, as one of the fundamental ways that people make sense of the world, especially in regard to their own thoughts and actions. Nietzsche distinguishes a morality that is strong or healthy, meaning that the person in question is aware that he constructs it himself, from weak morality, where the interpretation is projected on to something external. Regardless of its strength, morality presents us with meaning, whether this is created or ‘implanted,’ which helps us get through life.[29]

Nietzsche discusses Christianity, one of the major topics in his work, at length in the context of the problem of nihilism in his notebooks, in a chapter entitled “European Nihilism”.[30] Here he states that the Christian moral doctrine provides people with intrinsic value, belief in God (which justifies the evil in the world) and a basis for objective knowledge. In this sense, in constructing a world where objective knowledge is possible, Christianity is an antidote against a primal form of nihilism, against the despair of meaninglessness. However, it is exactly the element of truthfulness in Christian doctrine that is its undoing: in its drive towards truth, Christianity eventually finds itself to be a construct, which leads to its own dissolution. It is therefore that Nietzsche states that we have outgrown Christianity “not because we lived too far from it, rather because we lived too close”.[31] As such, the self-dissolution of Christianity constitutes yet another form of nihilism. Because Christianity was an interpretation that posited itself as the interpretation, Nietzsche states that this dissolution leads beyond skepticism to a distrust of all meaning.[32][33]

Stanley Rosen identifies Nietzsche’s concept of nihilism with a situation of meaninglessness, in which “everything is permitted.” According to him, the loss of higher metaphysical values that exist in contrast to the base reality of the world, or merely human ideas, gives rise to the idea that all human ideas are therefore valueless. Rejecting idealism thus results in nihilism, because only similarly transcendent ideals live up to the previous standards that the nihilist still implicitly holds.[34] The inability for Christianity to serve as a source of valuating the world is reflected in Nietzsche’s famous aphorism of the madman in The Gay Science.[35] The death of God, in particular the statement that “we killed him”, is similar to the self-dissolution of Christian doctrine: due to the advances of the sciences, which for Nietzsche show that man is the product of evolution, that Earth has no special place among the stars and that history is not progressive, the Christian notion of God can no longer serve as a basis for a morality.

One such reaction to the loss of meaning is what Nietzsche calls passive nihilism, which he recognises in the pessimistic philosophy of Schopenhauer. Schopenhauer’s doctrine, which Nietzsche also refers to as Western Buddhism, advocates a separating of oneself from will and desires in order to reduce suffering. Nietzsche characterises this ascetic attitude as a “will to nothingness”, whereby life turns away from itself, as there is nothing of value to be found in the world. This mowing away of all value in the world is characteristic of the nihilist, although in this, the nihilist appears inconsistent:[36]

A nihilist is a man who judges of the world as it is that it ought not to be, and of the world as it ought to be that it does not exist. According to this view, our existence (action, suffering, willing, feeling) has no meaning: the pathos of 'in vain' is the nihilists' pathos – at the same time, as pathos, an inconsistency on the part of the nihilists.

Nietzsche’s relation to the problem of nihilism is a complex one. He approaches the problem of nihilism as deeply personal, stating that this predicament of the modern world is a problem that has “become conscious” in him.[37] Furthermore, he emphasises both the danger of nihilism and the possibilities it offers, as seen in his statement that “I praise, I do not reproach, [nihilism’s] arrival. I believe it is one of the greatest crises, a moment of the deepest self-reflection of humanity. Whether man recovers from it, whether he becomes master of this crisis, is a question of his strength!”[38] According to Nietzsche, it is only when nihilism is overcome that a culture can have a true foundation upon which to thrive. He wished to hasten its coming only so that he could also hasten its ultimate departure.[26]

He states that there is at least the possibility of another type of nihilist in the wake of Christianity's self-dissolution, one that does not stop after the destruction of all value and meaning and succumb to the following nothingness. This alternate, 'active' nihilism on the other hand destroys to level the field for constructing something new. This form of nihilism is characterized by Nietzsche as "a sign of strength,"[39] a wilful destruction of the old values to wipe the slate clean and lay down one's own beliefs and interpretations, contrary to the passive nihilism that resigns itself with the decomposition of the old values. This wilful destruction of values and the overcoming of the condition of nihilism by the constructing of new meaning, this active nihilism, could be related to what Nietzsche elsewhere calls a 'free spirit'[40] or the Übermensch from Thus Spoke Zarathustra and The Antichrist, the model of the strong individual who posits his own values and lives his life as if it were his own work of art. It may be questioned, though, whether "active nihilism" is indeed the correct term for this stance, and some question whether Nietzsche takes the problems nihilism poses seriously enough.[41]

Martin Heidegger's interpretation of Nietzsche influenced many postmodern thinkers who investigated the problem of nihilism as put forward by Nietzsche. Only recently has Heidegger's influence on Nietzschean nihilism research faded.[42] As early as the 1930s, Heidegger was giving lectures on Nietzsche's thought.[43] Given the importance of Nietzsche's contribution to the topic of nihilism, Heidegger's influential interpretation of Nietzsche is important for the historical development of the term nihilism.

Heidegger's method of researching and teaching Nietzsche is explicitly his own. He does not specifically try to present Nietzsche as Nietzsche. He rather tries to incorporate Nietzsche's thoughts into his own philosophical system of Being, Time and Dasein.[44] In his Nihilism as Determined by the History of Being (1944–46),[45] Heidegger tries to understand Nietzsche's nihilism as trying to achieve a victory through the devaluation of the, until then, highest values. The principle of this devaluation is, according to Heidegger, the Will to Power. The Will to Power is also the principle of every earlier valuation of values.[46] How does this devaluation occur and why is this nihilistic? One of Heidegger's main critiques on philosophy is that philosophy, and more specifically metaphysics, has forgotten to discriminate between investigating the notion of a Being (Seiende) and Being (Sein). According to Heidegger, the history of Western thought can be seen as the history of metaphysics. And because metaphysics has forgotten to ask about the notion of Being (what Heidegger calls Seinsvergessenheit), it is a history about the destruction of Being. That is why Heidegger calls metaphysics nihilistic.[47] This makes Nietzsche's metaphysics not a victory over nihilism, but a perfection of it.[48]

Heidegger, in his interpretation of Nietzsche, has been inspired by Ernst Jünger. Many references to Jünger can be found in Heidegger's lectures on Nietzsche. For example, in a letter to the rector of Freiburg University of November 4, 1945, Heidegger, inspired by Jünger, tries to explain the notion of "God is dead" as the reality of the Will to Power. Heidegger also praises Jünger for defending Nietzsche against a too biological or anthropological reading during the Third Reich.[49]

Heidegger's interpretation of Nietzsche influenced a number of important postmodernist thinkers. Gianni Vattimo points at a back-and-forth movement in European thought, between Nietzsche and Heidegger. During the 1960s, a Nietzschean 'renaissance' began, culminating in the work of Mazzino Montinari and Giorgio Colli. They began work on a new and complete edition of Nietzsche's collected works, making Nietzsche more accessible for scholarly research. Vattimo explains that with this new edition of Colli and Montinari, a critical reception of Heidegger's interpretation of Nietzsche began to take shape. Like other contemporary French and Italian philosophers, Vattimo does not want, or only partially wants, to rely on Heidegger for understanding Nietzsche. On the other hand, Vattimo judges Heidegger's intentions authentic enough to keep pursuing them.[50] Philosophers who Vattimo exemplifies as a part of this back and forth movement are French philosophers Deleuze, Foucault and Derrida. Italian philosophers of this same movement are Cacciari, Severino and himself.[51] Habermas, Lyotard and Rorty are also philosophers who are influenced by Heidegger's interpretation of Nietzsche.[52]

Postmodern and poststructuralist thought question the very grounds on which Western cultures have based their ‘truths’: absolute knowledge and meaning, a ‘decentralization’ of authorship, the accumulation of positive knowledge, historical progress, and certain ideals and practices of humanism and the Enlightenment.

Jacques Derrida, whose deconstruction is perhaps most commonly labeled nihilistic, did not himself make the nihilistic move that others have claimed. Derridean deconstructionists argue that this approach rather frees texts, individuals or organizations from a restrictive truth, and that deconstruction opens up the possibility of other ways of being.[53] Gayatri Chakravorty Spivak, for example, uses deconstruction to create an ethics of opening up Western scholarship to the voice of the subaltern and to philosophies outside of the canon of western texts.[54] Derrida himself built a philosophy based upon a 'responsibility to the other'.[55] Deconstruction can thus be seen not as a denial of truth, but as a denial of our ability to know truth (it makes an epistemological claim compared to nihilism's ontological claim).

Lyotard argues that, rather than relying on an objective truth or method to prove their claims, philosophers legitimize their truths by reference to a story about the world that can't be separated from the age and system the stories belong to, referred to by Lyotard as meta-narratives. He then goes on to define the postmodern condition as characterized by a rejection both of these meta-narratives and of the process of legitimation by meta-narratives. "In lieu of meta-narratives we have created new language-games in order to legitimize our claims which rely on changing relationships and mutable truths, none of which is privileged over the other to speak to ultimate truth."[citation needed] This concept of the instability of truth and meaning leads in the direction of nihilism, though Lyotard stops short of embracing the latter.

Postmodern theorist Jean Baudrillard wrote briefly of nihilism from the postmodern viewpoint in Simulacra and Simulation. He stuck mainly to topics of interpretations of the real world over the simulations of which the real world is composed. The use of meaning was an important subject in Baudrillard's discussion of nihilism:

The apocalypse is finished, today it is the precession of the neutral, of forms of the neutral and of indifference… all that remains, is the fascination for desertlike and indifferent forms, for the very operation of the system that annihilates us. Now, fascination (in contrast to seduction, which was attached to appearances, and to dialectical reason, which was attached to meaning) is a nihilistic passion par excellence, it is the passion proper to the mode of disappearance. We are fascinated by all forms of disappearance, of our disappearance. Melancholic and fascinated, such is our general situation in an era of involuntary transparency.

In Nihil Unbound: Extinction and Enlightenment, Ray Brassier maintains that philosophy has avoided the traumatic idea of extinction, instead attempting to find meaning in a world conditioned by the very idea of its own annihilation. Thus Brassier critiques both the phenomenological and hermeneutic strands of Continental philosophy as well as the vitality of thinkers like Gilles Deleuze, who work to ingrain meaning in the world and stave off the threat of nihilism. Instead, drawing on thinkers such as Alain Badiou, François Laruelle, Paul Churchland, and Thomas Metzinger, Brassier defends a view of the world as inherently devoid of meaning. That is, rather than avoiding nihilism, Brassier embraces it as the truth of reality. Brassier concludes from his readings of Badiou and Laruelle that the universe is founded on the nothing,[56] but also that philosophy is the "organon of extinction," that it is only because life is conditioned by its own extinction that there is thought at all.[57] Brassier then defends a radically anti-correlationist philosophy proposing that Thought is conjoined not with Being, but with Non-Being.

The term Dada was first used by Richard Huelsenbeck and Tristan Tzara in 1916.[58] The movement, which lasted from approximately 1916 to 1922, arose during World War I, an event that influenced the artists.[59] The Dada Movement began in the district of Zürich, Switzerland known as the "Niederdorf" or "Niederdörfli", in the Café Voltaire.[60] The Dadaists claimed that Dada was not an art movement, but an anti-art movement, sometimes using found objects in a manner similar to found poetry. The "anti-art" drive is thought to have stemmed from a post-war emptiness. This tendency toward devaluation of art has led many to claim that Dada was an essentially nihilistic movement. Given that Dada created its own means for interpreting its products, it is difficult to classify alongside most other contemporary art expressions. Hence, due to its ambiguity, it is sometimes classified as a nihilistic modus vivendi.[59]

The term “nihilism” was actually popularized by Ivan Turgenev in his novel Fathers and Sons, whose hero, Bazarov, was a nihilist and recruited several followers to the philosophy. He found his nihilistic ways challenged upon falling in love.[61]

Anton Chekhov portrayed nihilism when writing Three Sisters. The phrase “what does it matter” or such variants is often spoken by several characters in response to events; the significance of some of these events suggests a subscription to nihilism by said characters as a type of coping strategy.

Ayn Rand vehemently denounced nihilism as an abdication of rationality and of the pursuit of happiness, which she regarded as life's moral purpose. As such, most of her villains are depicted as moral nihilists, including Ellsworth Monckton Toohey in The Fountainhead, who is a self-aware nihilist, and the corrupt government in Atlas Shrugged, whose members are unconsciously driven by a nihilism that has taken root in the book's depiction of American society, with the fictional slang phrase "Who is John Galt?" being used as a defeatist way of saying "Who knows?" or "What does it matter?" by characters who have essentially given up on life.[citation needed]

The philosophical ideas of the French author, the Marquis de Sade, are often noted as early examples of nihilistic principles.[citation needed]

In Act III of Shostakovich’s opera “Lady Macbeth of the Mtsensk District”, a nihilist is tormented by the Russian police.[citation needed]

A 2007 article in The Guardian noted that “…in the summer of 1977, …punk’s nihilistic swagger was the most thrilling thing in England.”[62] The Sex Pistols’ God Save The Queen, with its chant-like refrain of “no future”, became a slogan for unemployed and disaffected youth during the late 1970s. Their song Pretty Vacant is also a prime example of the band’s nihilistic outlook. Other influential punk rock and proto-punk bands to adopt nihilistic themes include The Velvet Underground, The Stooges, Misfits, Ramones, Johnny Thunders and the Heartbreakers, Richard Hell and the Voidoids, Suicide and Black Flag.[63]

Industrial, black metal, death metal, and doom metal music often emphasize nihilistic themes. Explorers of nihilistic themes in heavy metal include Black Sabbath, Metallica, Marilyn Manson, Slayer, KMFDM, Opeth, Alice in Chains, Godflesh, Celtic Frost, Ministry, Autopsy, Dismember, Motörhead, Nine Inch Nails, Bathory, Darkthrone, Emperor, Tool, Meshuggah, Candlemass, Morbid Saint, Kreator, Morbid Angel, Sepultura, Exodus, Entombed, Death, Mayhem, Nevermore, Dark Angel, Dissection, Nihilist, Weakling, Obituary, Electric Wizard, Eyehategod, Pantera, Sleep, Xasthur and At the Gates. The band Turbonegro have a song called "TNA (The Nihilistic Army)", which refers solely to the principles of nihilism.[64][65][66]

In 2014, the first opera described as a "Nihilist Opera" (Demandolx) was composed, using classical, modern and electronic instruments and following some drastically different rules, both musically and theoretically.

Three of the antagonists in the 1998 movie The Big Lebowski are explicitly described as “nihilists,” but are not shown exhibiting any explicitly nihilistic traits during the film. Regarding the nihilists, the character Walter Sobchak comments “Nihilists! Fuck me. I mean, say what you want about the tenets of National Socialism, Dude, at least it’s an ethos.” [67] The 1999 film The Matrix portrays the character Thomas A. Anderson with a hollowed out copy of Baudrillard’s treatise, Simulacra and Simulation, in which he stores contraband data files under the chapter “On Nihilism.” The main antagonist Agent Smith is also depicted frequently as a nihilist, with him ranting about how all of peace, justice and love were meaningless in The Matrix Revolutions.[68] The 1999 film Fight Club also features concepts relating to Nihilism by exploring the contrasts between the artificial values imposed by consumerism in relation to the more meaningful pursuit of spiritual happiness.

In keeping with his comic book depiction, The Joker is portrayed as a nihilist in The Dark Knight, describing himself as "an Agent of Chaos" and at one point burning a gigantic pile of money, stating that crime is "not about money, it's about sending a message: everything burns." Alfred Pennyworth states, regarding the Joker, "Some men aren't looking for anything logical, like money… they can't be bought, bullied, reasoned or negotiated with… some men just want to watch the world burn."[69]

In Star Wars: Knights of the Old Republic II – The Sith Lords, the dark lord Darth Nihilus is a reference to the ideology of nihilism, as he devoured entire planets and did not care for living things at all.[citation needed]

Although the character Barthandelus from Final Fantasy XIII is not referred to as nihilistic in the game itself, he is referred to as such in the Fighting Fate entry for Theatrhythm Final Fantasy.[70]

See the original post:

Nihilism – Wikipedia, the free encyclopedia

Atheism – Wikipedia, the free encyclopedia


Atheism is, in a broad sense, the rejection of belief in the existence of deities.[1][2] In a narrower sense, atheism is specifically the position that there are no deities.[3][4][5] Most inclusively, atheism is the absence of belief that any deities exist.[4][5][6][7] Atheism is contrasted with theism,[8][9] which, in its most general form, is the belief that at least one deity exists.[9][10][11]

The term "atheism" originated from the Greek ἄθεος (atheos), meaning "without god(s)", used as a pejorative term applied to those thought to reject the gods worshiped by the larger society.[12] With the spread of freethought, skeptical inquiry, and subsequent increase in criticism of religion, application of the term narrowed in scope. The first individuals to identify themselves using the word "atheist" lived in the 18th century during the Age of Enlightenment. The French Revolution, noted for its "unprecedented atheism," witnessed the first major political movement in history to advocate for the supremacy of human reason.[14]

Arguments for atheism range from the philosophical to social and historical approaches. Rationales for not believing in deities include arguments that there is a lack of empirical evidence;[15][16] the problem of evil; the argument from inconsistent revelations; the rejection of concepts that cannot be falsified; and the argument from nonbelief.[15][17] Although some atheists have adopted secular philosophies (e.g., humanism and skepticism),[18][19] there is no one ideology or set of behaviors to which all atheists adhere.[20] Many atheists hold that atheism is a more parsimonious worldview than theism and therefore that the burden of proof lies not on the atheist to disprove the existence of God but on the theist to provide a rationale for theism.[21]

Since conceptions of atheism vary, accurate estimations of current numbers of atheists are difficult.[22] Several comprehensive global polls on the subject have been conducted by Gallup International: their 2015 poll featured over 64,000 respondents and indicated that 11% were “convinced atheists” whereas an earlier 2012 poll found that 13% of respondents were “convinced atheists.”[23][24] An older survey by the BBC, in 2004, recorded atheists as comprising 8% of the world’s population.[25] Other older estimates have indicated that atheists comprise 2% of the world’s population, while the irreligious add a further 12%.[26] According to these polls, Europe and East Asia are the regions with the highest rates of atheism. In 2015, 61% of people in China reported that they were atheists.[27] The figures for a 2010 Eurobarometer survey in the European Union (EU) reported that 20% of the EU population claimed not to believe in “any sort of spirit, God or life force”.[28]

Writers disagree on how best to define and classify atheism,[29] contesting what supernatural entities it applies to, whether it is a philosophic position in its own right or merely the absence of one, and whether it requires a conscious, explicit rejection. Atheism has been regarded as compatible with agnosticism,[30][31][32][33][34][35][36] and has also been contrasted with it.[37][38][39] A variety of categories have been used to distinguish the different forms of atheism.

Some of the ambiguity and controversy involved in defining atheism arises from difficulty in reaching a consensus for the definitions of words like deity and god. The plurality of wildly different conceptions of God and deities leads to differing ideas regarding atheism’s applicability. The ancient Romans accused Christians of being atheists for not worshiping the pagan deities. Gradually, this view fell into disfavor as theism came to be understood as encompassing belief in any divinity.

With respect to the range of phenomena being rejected, atheism may counter anything from the existence of a deity, to the existence of any spiritual, supernatural, or transcendental concepts, such as those of Buddhism, Hinduism, Jainism, and Taoism.[41]

Definitions of atheism also vary in the degree of consideration a person must put to the idea of gods to be considered an atheist. Atheism has sometimes been defined to include the simple absence of belief that any deities exist. This broad definition would include newborns and other people who have not been exposed to theistic ideas. As far back as 1772, Baron d’Holbach said that “All children are born Atheists; they have no idea of God.”[42] Similarly, George H. Smith (1979) suggested that: “The man who is unacquainted with theism is an atheist because he does not believe in a god. This category would also include the child with the conceptual capacity to grasp the issues involved, but who is still unaware of those issues. The fact that this child does not believe in god qualifies him as an atheist.”[43] Smith coined the term implicit atheism to refer to “the absence of theistic belief without a conscious rejection of it” and explicit atheism to refer to the more common definition of conscious disbelief. Ernest Nagel contradicts Smith’s definition of atheism as merely “absence of theism”, acknowledging only explicit atheism as true “atheism”.[44]

Philosophers such as Antony Flew[45] and Michael Martin have contrasted positive (strong/hard) atheism with negative (weak/soft) atheism. Positive atheism is the explicit affirmation that gods do not exist. Negative atheism includes all other forms of non-theism. According to this categorization, anyone who is not a theist is either a negative or a positive atheist. The terms weak and strong are relatively recent, while the terms negative and positive atheism are of older origin, having been used (in slightly different ways) in the philosophical literature[45] and in Catholic apologetics.[46] Under this demarcation of atheism, most agnostics qualify as negative atheists.

While Martin, for example, asserts that agnosticism entails negative atheism,[33] many agnostics see their view as distinct from atheism,[47][48] which they may consider no more justified than theism or requiring an equal conviction.[47] The assertion of unattainability of knowledge for or against the existence of gods is sometimes seen as indication that atheism requires a leap of faith.[49][50] Common atheist responses to this argument include that unproven religious propositions deserve as much disbelief as all other unproven propositions,[51] and that the unprovability of a god’s existence does not imply equal probability of either possibility.[52] Scottish philosopher J. J. C. Smart even argues that “sometimes a person who is really an atheist may describe herself, even passionately, as an agnostic because of unreasonable generalised philosophical skepticism which would preclude us from saying that we know anything whatever, except perhaps the truths of mathematics and formal logic.”[53] Consequently, some atheist authors such as Richard Dawkins prefer distinguishing theist, agnostic and atheist positions along a spectrum of theistic probability: the likelihood that each assigns to the statement “God exists”.

Before the 18th century, the existence of God was so accepted in the western world that even the possibility of true atheism was questioned. This is called theistic innatism: the notion that all people believe in God from birth; within this view was the connotation that atheists are simply in denial.[55]

There is also a position claiming that atheists are quick to believe in God in times of crisis, that atheists make deathbed conversions, or that “there are no atheists in foxholes”.[56] There have however been examples to the contrary, among them examples of literal “atheists in foxholes”.[57]

Some atheists have doubted the very need for the term “atheism”. In his book Letter to a Christian Nation, Sam Harris wrote:

In fact, “atheism” is a term that should not even exist. No one ever needs to identify himself as a “non-astrologer” or a “non-alchemist”. We do not have words for people who doubt that Elvis is still alive or that aliens have traversed the galaxy only to molest ranchers and their cattle. Atheism is nothing more than the noises reasonable people make in the presence of unjustified religious beliefs.

The source of man’s unhappiness is his ignorance of Nature. The pertinacity with which he clings to blind opinions imbibed in his infancy, which interweave themselves with his existence, the consequent prejudice that warps his mind, that prevents its expansion, that renders him the slave of fiction, appears to doom him to continual error.

The broadest demarcation of atheistic rationale is between practical and theoretical atheism.

In practical or pragmatic atheism, also known as apatheism, individuals live as if there are no gods and explain natural phenomena without reference to any deities. The existence of gods is not rejected, but may be designated unnecessary or useless; gods neither provide purpose to life, nor influence everyday life, according to this view.[60] A form of practical atheism with implications for the scientific community is methodological naturalism: the “tacit adoption or assumption of philosophical naturalism within scientific method with or without fully accepting or believing it.”[61]

Practical atheism can take various forms:

Theoretical (or theoric) atheism explicitly posits arguments against the existence of gods, responding to common theistic arguments such as the argument from design or Pascal’s Wager. Theoretical atheism is mainly an ontology; more precisely, a physical ontology.

Epistemological atheism argues that people cannot know a God or determine the existence of a God. The foundation of epistemological atheism is agnosticism, which takes a variety of forms. In the philosophy of immanence, divinity is inseparable from the world itself, including a person’s mind, and each person’s consciousness is locked in the subject. According to this form of agnosticism, this limitation in perspective prevents any objective inference from belief in a god to assertions of its existence. The rationalistic agnosticism of Kant and the Enlightenment only accepts knowledge deduced with human rationality; this form of atheism holds that gods are not discernible as a matter of principle, and therefore cannot be known to exist. Skepticism, based on the ideas of Hume, asserts that certainty about anything is impossible, so one can never know for sure whether or not a god exists. Hume, however, held that such unobservable metaphysical concepts should be rejected as “sophistry and illusion”.[63] The allocation of agnosticism to atheism is disputed; it can also be regarded as an independent, basic worldview.[60]

Other arguments for atheism that can be classified as epistemological or ontological, including logical positivism and ignosticism, assert the meaninglessness or unintelligibility of basic terms such as “God” and statements such as “God is all-powerful.” Theological noncognitivism holds that the statement “God exists” does not express a proposition, but is nonsensical or cognitively meaningless. It has been argued both ways as to whether such individuals can be classified into some form of atheism or agnosticism. Philosophers A. J. Ayer and Theodore M. Drange reject both categories, stating that both camps accept “God exists” as a proposition; they instead place noncognitivism in its own category.[64][65]

One author writes:

“Metaphysical atheism… includes all doctrines that hold to metaphysical monism (the homogeneity of reality). Metaphysical atheism may be either: a) absolute – an explicit denial of God’s existence associated with materialistic monism (all materialistic trends, both in ancient and modern times); b) relative – the implicit denial of God in all philosophies that, while they accept the existence of an absolute, conceive of the absolute as not possessing any of the attributes proper to God: transcendence, a personal character or unity. Relative atheism is associated with idealistic monism (pantheism, panentheism, deism).”[66]

Logical atheism holds that the various conceptions of gods, such as the personal god of Christianity, are ascribed logically inconsistent qualities. Such atheists present deductive arguments against the existence of God, which assert the incompatibility between certain traits, such as perfection, creator-status, immutability, omniscience, omnipresence, omnipotence, omnibenevolence, transcendence, personhood (a personal being), nonphysicality, justice, and mercy.[15]

Theodicean atheists believe that the world as they experience it cannot be reconciled with the qualities commonly ascribed to God and gods by theologians. They argue that an omniscient, omnipotent, and omnibenevolent God is not compatible with a world where there is evil and suffering, and where divine love is hidden from many people.[17] A similar argument is attributed to Siddhartha Gautama, the founder of Buddhism.[68]

Philosopher Ludwig Feuerbach[69] and psychoanalyst Sigmund Freud have argued that God and other religious beliefs are human inventions, created to fulfill various psychological and emotional wants or needs. This is also a view of many Buddhists.[70] Karl Marx and Friedrich Engels, influenced by the work of Feuerbach, argued that belief in God and religion are social functions, used by those in power to oppress the working class. According to Mikhail Bakunin, “the idea of God implies the abdication of human reason and justice; it is the most decisive negation of human liberty, and necessarily ends in the enslavement of mankind, in theory and practice.” He reversed Voltaire’s famous aphorism that if God did not exist, it would be necessary to invent him, writing instead that “if God really existed, it would be necessary to abolish him.”[71]

Atheism is acceptable within some religious and spiritual belief systems, including Hinduism, Jainism, Buddhism, Syntheism, Raëlism,[72] and Neopagan movements[73] such as Wicca.[74] Āstika schools in Hinduism hold atheism to be a valid path to moksha, but extremely difficult, for the atheist cannot expect any help from the divine on their journey.[75] Jainism believes the universe is eternal and has no need for a creator deity; however, Jains revere Tirthankaras, who can transcend space and time[76] and have more power than the god Indra.[77] Secular Buddhism does not advocate belief in gods. Early Buddhism was atheistic, as Gautama Buddha’s path involved no mention of gods. Later conceptions of Buddhism consider Buddha himself a god, suggest adherents can attain godhood, and revere Bodhisattvas[78] and the Eternal Buddha.

Axiological, or constructive, atheism rejects the existence of gods in favor of a “higher absolute”, such as humanity. This form of atheism favors humanity as the absolute source of ethics and values, and permits individuals to resolve moral problems without resorting to God. Marx and Freud used this argument to convey messages of liberation, full-development, and unfettered happiness.[60] One of the most common criticisms of atheism has been to the contrary: that denying the existence of a god leads to moral relativism, leaving one with no moral or ethical foundation,[79] or renders life meaningless and miserable.[80] Blaise Pascal argued this view in his Pensées.[81]

French philosopher Jean-Paul Sartre identified himself as a representative of an “atheist existentialism” concerned less with denying the existence of God than with establishing that “man needs… to find himself again and to understand that nothing can save him from himself, not even a valid proof of the existence of God.” Sartre said a corollary of his atheism was that “if God does not exist, there is at least one being in whom existence precedes essence, a being who exists before he can be defined by any concept, and… this being is man.” The practical consequence of this atheism was described by Sartre as meaning that there are no a priori rules or absolute values that can be invoked to govern human conduct, and that humans are “condemned” to invent these for themselves, making “man” absolutely “responsible for everything he does”.

Sociologist Phil Zuckerman analyzed previous social science research on secularity and non-belief, and concluded that societal well-being is positively correlated with irreligion. He found that there are much lower concentrations of atheism and secularity in poorer, less developed nations (particularly in Africa and South America) than in the richer industrialized democracies.[85][86] His findings relating specifically to atheism in the US were that compared to religious people in the US, “atheists and secular people” are less nationalistic, prejudiced, antisemitic, racist, dogmatic, ethnocentric, closed-minded, and authoritarian, and in US states with the highest percentages of atheists, the murder rate is lower than average. In the most religious states, the murder rate is higher than average.[87][88]

People who self-identify as atheists are often assumed to be irreligious, but some sects within major religions reject the existence of a personal, creator deity.[90] In recent years, certain religious denominations have accumulated a number of openly atheistic followers, such as atheistic or humanistic Judaism[91][92] and Christian atheists.[93][94][95]

The strictest sense of positive atheism does not entail any specific beliefs outside of disbelief in any deity; as such, atheists can hold any number of spiritual beliefs. For the same reason, atheists can hold a wide variety of ethical beliefs, ranging from the moral universalism of humanism, which holds that a moral code should be applied consistently to all humans, to moral nihilism, which holds that morality is meaningless.[96]

Philosophers such as Slavoj Žižek,[97] Alain de Botton,[98] and Alexander Bard and Jan Söderqvist,[99] have all argued that atheists should reclaim religion as an act of defiance against theism, precisely not to leave religion as an unwarranted monopoly to theists.

According to Plato’s Euthyphro dilemma, the role of the gods in determining right from wrong is either unnecessary or arbitrary. The argument that morality must be derived from God, and cannot exist without a wise creator, has been a persistent feature of political if not so much philosophical debate.[100][101][102] Moral precepts such as “murder is wrong” are seen as divine laws, requiring a divine lawmaker and judge. However, many atheists argue that treating morality legalistically involves a false analogy, and that morality does not depend on a lawmaker in the same way that laws do.[103] Friedrich Nietzsche believed in a morality independent of theistic belief, and stated that morality based upon God “has truth only if God is truth – it stands or falls with faith in God.”[104][105][106]

There exist normative ethical systems that do not require principles and rules to be given by a deity. Some include virtue ethics, social contract, Kantian ethics, utilitarianism, and Objectivism. Sam Harris has proposed that moral prescription (ethical rule making) is not just an issue to be explored by philosophy, but that we can meaningfully practice a science of morality. Any such scientific system must, nevertheless, respond to the criticism embodied in the naturalistic fallacy.[107]

Philosophers Susan Neiman[108] and Julian Baggini[109] (among others) assert that behaving ethically only because of divine mandate is not true ethical behavior but merely blind obedience. Baggini argues that atheism is a superior basis for ethics, claiming that a moral basis external to religious imperatives is necessary to evaluate the morality of the imperatives themselves – to be able to discern, for example, that “thou shalt steal” is immoral even if one’s religion instructs it – and that atheists, therefore, have the advantage of being more inclined to make such evaluations.[110] The contemporary British political philosopher Martin Cohen has offered the more historically telling example of Biblical injunctions in favour of torture and slavery as evidence of how religious injunctions follow political and social customs, rather than vice versa, but also noted that the same tendency seems to be true of supposedly dispassionate and objective philosophers.[111] Cohen extends this argument in more detail in Political Philosophy from Plato to Mao, where he argues that the Qur’an played a role in perpetuating social codes from the early 7th century despite changes in secular society.[112]

Some prominent atheists – most recently Christopher Hitchens, Daniel Dennett, Sam Harris and Richard Dawkins, and following such thinkers as Bertrand Russell, Robert G. Ingersoll, Voltaire, and novelist José Saramago – have criticized religions, citing harmful aspects of religious practices and doctrines.[113]

The 19th-century German political theorist and sociologist Karl Marx called religion “the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people”. He goes on to say, “The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up a condition that requires illusions. The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo.”[114] Lenin said that every religious idea and every idea of God “is unutterable vileness… of the most dangerous kind, ‘contagion’ of the most abominable kind. Millions of sins, filthy deeds, acts of violence and physical contagions… are far less dangerous than the subtle, spiritual idea of God decked out in the smartest ideological costumes…”[115]

Sam Harris criticises Western religion’s reliance on divine authority as lending itself to authoritarianism and dogmatism. There is a correlation between religious fundamentalism and extrinsic religion (when religion is held because it serves ulterior interests)[117] and authoritarianism, dogmatism, and prejudice.[118] These arguments – combined with historical events that are argued to demonstrate the dangers of religion, such as the Crusades, inquisitions, witch trials, and terrorist attacks – have been used in response to claims of beneficial effects of belief in religion.[119] Believers counter-argue that some regimes that espouse atheism, such as in Soviet Russia, have also been guilty of mass murder.[120][121] In response to those claims, atheists such as Sam Harris and Richard Dawkins have stated that Stalin’s atrocities were influenced not by atheism but by dogmatic Marxism, and that while Stalin and Mao happened to be atheists, they did not do their deeds in the name of atheism.[123]

In early ancient Greek, the adjective átheos (ἄθεος, from the privative ἀ- + θεός “god”) meant “godless”. It was first used as a term of censure roughly meaning “ungodly” or “impious”. In the 5th century BCE, the word began to indicate more deliberate and active godlessness in the sense of “severing relations with the gods” or “denying the gods”. The term ἀσεβής (asebēs) then came to be applied against those who impiously denied or disrespected the local gods, even if they believed in other gods. Modern translations of classical texts sometimes render átheos as “atheistic”. As an abstract noun, there was also ἀθεότης (atheotēs), “atheism”. Cicero transliterated the Greek word into the Latin atheos. The term found frequent use in the debate between early Christians and Hellenists, with each side attributing it, in the pejorative sense, to the other.[12]

The term atheist (from the French athée), in the sense of “one who… denies the existence of God or gods”,[125] predates atheism in English, being first found as early as 1566,[126] and again in 1571.[127] Atheist as a label of practical godlessness was used at least as early as 1577.[128] The term atheism was derived from the French athéisme,[129] and appears in English about 1587.[130] An earlier work, from about 1534, used the term atheonism.[131][132] Related words emerged later: deist in 1621,[133] theist in 1662,[134] deism in 1675,[135] and theism in 1678.[136] At that time “deist” and “deism” already carried their modern meaning. The term theism came to be contrasted with deism.

Karen Armstrong writes that “During the sixteenth and seventeenth centuries, the word ‘atheist’ was still reserved exclusively for polemic… The term ‘atheist’ was an insult. Nobody would have dreamed of calling himself an atheist.”

Atheism was first used to describe a self-avowed belief in late 18th-century Europe, specifically denoting disbelief in the monotheistic Abrahamic god.[137] In the 20th century, globalization contributed to the expansion of the term to refer to disbelief in all deities, though it remains common in Western society to describe atheism as simply “disbelief in God”.

While the earliest-found usage of the term atheism is in 16th-century France,[129][130] ideas that would be recognized today as atheistic are documented from the Vedic period and classical antiquity.

Atheistic schools are found in early Indian thought and have existed from the times of the historical Vedic religion.[138] Among the six orthodox schools of Hindu philosophy, Samkhya, the oldest philosophical school of thought, does not accept God, and the early Mimamsa also rejected the notion of God.[139] The thoroughly materialistic and anti-theistic philosophical Cārvāka (also called Nāstika or Lokāyata) school that originated in India around the 6th century BCE is probably the most explicitly atheistic school of philosophy in India, similar to the Greek Cyrenaic school. This branch of Indian philosophy is classified as heterodox due to its rejection of the authority of the Vedas and hence is not considered part of the six orthodox schools of Hinduism, but it is noteworthy as evidence of a materialistic movement within Hinduism.[140] Chatterjee and Datta explain that our understanding of Cārvāka philosophy is fragmentary, based largely on criticism of the ideas by other schools, and that it is not a living tradition:

“Though materialism in some form or other has always been present in India, and occasional references are found in the Vedas, the Buddhistic literature, the Epics, as well as in the later philosophical works we do not find any systematic work on materialism, nor any organized school of followers as the other philosophical schools possess. But almost every work of the other schools states, for refutation, the materialistic views. Our knowledge of Indian materialism is chiefly based on these.”[141]

Other Indian philosophies generally regarded as atheistic include Classical Samkhya and Purva Mimamsa. The rejection of a personal creator God is also seen in Jainism and Buddhism in India.[142]

Western atheism has its roots in pre-Socratic Greek philosophy, but did not emerge as a distinct world-view until the late Enlightenment.[143] The 5th-century BCE Greek philosopher Diagoras is known as the “first atheist”,[144] and is cited as such by Cicero in his De Natura Deorum.[145] Atomists such as Democritus attempted to explain the world in a purely materialistic way, without reference to the spiritual or mystical. Critias viewed religion as a human invention used to frighten people into following moral order,[146] and Prodicus also appears to have made clear atheistic statements in his work. Philodemus reports that Prodicus believed that “the gods of popular belief do not exist nor do they know, but primitive man, [out of admiration, deified] the fruits of the earth and virtually everything that contributed to his existence”. Protagoras has sometimes been taken to be an atheist, but he rather espoused agnostic views, commenting that “Concerning the gods I am unable to discover whether they exist or not, or what they are like in form; for there are many hindrances to knowledge, the obscurity of the subject and the brevity of human life.”[147] In the 3rd century BCE, the Greek philosophers Theodorus Cyrenaicus[145][148] and Strato of Lampsacus[149] did not believe gods exist.

Socrates (c. 470–399 BCE) was associated in the Athenian public mind with the trends in pre-Socratic philosophy towards naturalistic inquiry and the rejection of divine explanations for phenomena. Although such an interpretation misrepresents his thought, he was portrayed in such a way in Aristophanes’ comic play Clouds and was later tried and executed for impiety and corrupting the young. At his trial Socrates is reported as vehemently denying that he was an atheist, and contemporary scholarship provides little reason to doubt this claim.[150][151]

Euhemerus (c. 300 BCE) published his view that the gods were only the deified rulers, conquerors and founders of the past, and that their cults and religions were in essence the continuation of vanished kingdoms and earlier political structures.[152] Although not strictly an atheist, Euhemerus was later criticized for having “spread atheism over the whole inhabited earth by obliterating the gods”.[153]

Also important in the history of atheism was Epicurus (c. 300 BCE). Drawing on the ideas of Democritus and the Atomists, he espoused a materialistic philosophy according to which the universe was governed by the laws of chance without the need for divine intervention (see scientific determinism). Although he stated that deities existed, he believed that they were uninterested in human existence. The aim of the Epicureans was to attain peace of mind and one important way of doing this was by exposing fear of divine wrath as irrational. The Epicureans also denied the existence of an afterlife and the need to fear divine punishment after death.[154]

The Roman philosopher Sextus Empiricus held that one should suspend judgment about virtually all beliefs (a form of skepticism known as Pyrrhonism), that nothing was inherently evil, and that ataraxia (“peace of mind”) is attainable by withholding one’s judgment. His relatively large volume of surviving works had a lasting influence on later philosophers.[155]

The meaning of “atheist” changed over the course of classical antiquity. The early Christians were labeled atheists by non-Christians because of their disbelief in pagan gods.[156] During the Roman Empire, Christians were executed for their rejection of the Roman gods in general and Emperor-worship in particular. When Christianity became the state religion of Rome under Theodosius I in 381, heresy became a punishable offense.[157]

During the Early Middle Ages, the Islamic world underwent a Golden Age. With the associated advances in science and philosophy, Arab and Persian lands produced outspoken rationalists and atheists, including Muhammad al-Warraq (fl. 9th century), Ibn al-Rawandi (827–911), Al-Razi (854–925), and Al-Ma’arri (973–1058). Al-Ma’arri wrote and taught that religion itself was a “fable invented by the ancients”[158] and that humans were “of two sorts: those with brains, but no religion, and those with religion, but no brains.”[159] Despite being relatively prolific writers, nearly none of their writing survives to the modern day, most of what little remains being preserved through quotations and excerpts in later works by Muslim apologists attempting to refute them.[160] Other prominent Golden Age scholars have been associated with rationalist thought and atheism as well, although the current intellectual atmosphere in the Islamic world, and the scant evidence that survives from the era, make this point a contentious one today.

In Europe, the espousal of atheistic views was rare during the Early Middle Ages and Middle Ages (see Medieval Inquisition); metaphysics and theology were the dominant interests pertaining to religion.[161] There were, however, movements within this period that furthered heterodox conceptions of the Christian god, including differing views of the nature, transcendence, and knowability of God. Individuals and groups such as Johannes Scotus Eriugena, David of Dinant, Amalric of Bena, and the Brethren of the Free Spirit maintained Christian viewpoints with pantheistic tendencies. Nicholas of Cusa held to a form of fideism he called docta ignorantia (“learned ignorance”), asserting that God is beyond human categorization, and thus our knowledge of him is limited to conjecture. William of Ockham inspired anti-metaphysical tendencies with his nominalistic limitation of human knowledge to singular objects, and asserted that the divine essence could not be intuitively or rationally apprehended by human intellect. Followers of Ockham, such as John of Mirecourt and Nicholas of Autrecourt, furthered this view. The resulting division between faith and reason influenced later radical and reformist theologians such as John Wycliffe, Jan Hus, and Martin Luther.[161]

The Renaissance did much to expand the scope of free thought and skeptical inquiry. Individuals such as Leonardo da Vinci sought experimentation as a means of explanation, and opposed arguments from religious authority. Other critics of religion and the Church during this time included Niccolò Machiavelli, Bonaventure des Périers, Michel de Montaigne, and François Rabelais.[155]

Historian Geoffrey Blainey wrote that the Reformation had paved the way for atheists by attacking the authority of the Catholic Church, which in turn “quietly inspired other thinkers to attack the authority of the new Protestant churches”.[162] Deism gained influence in France, Prussia, and England. The philosopher Baruch Spinoza was “probably the first well known ‘semi-atheist’ to announce himself in a Christian land in the modern era”, according to Blainey. Spinoza believed that natural laws explained the workings of the universe. In 1661 he published his Short Treatise on God.[163]

Criticism of Christianity became increasingly frequent in the 17th and 18th centuries, especially in France and England, where there appears to have been a religious malaise, according to contemporary sources. Some Protestant thinkers, such as Thomas Hobbes, espoused a materialist philosophy and skepticism toward supernatural occurrences, while Spinoza rejected divine providence in favour of a panentheistic naturalism. By the late 17th century, deism came to be openly espoused by intellectuals such as John Toland who coined the term “pantheist”.[164]

The first known explicit atheist was the German critic of religion Matthias Knutzen in his three writings of 1674.[165] He was followed by two other explicit atheist writers, the Polish ex-Jesuit philosopher Kazimierz Łyszczyński and in the 1720s by the French priest Jean Meslier.[166] In the course of the 18th century, other openly atheistic thinkers followed, such as Baron d’Holbach, Jacques-André Naigeon, and other French materialists.[167] John Locke, in contrast, though an advocate of tolerance, urged authorities not to tolerate atheism, believing that the denial of God’s existence would undermine the social order and lead to chaos.[168]

The philosopher David Hume developed a skeptical epistemology grounded in empiricism, and Immanuel Kant’s philosophy strongly questioned the very possibility of metaphysical knowledge. Both philosophers undermined the metaphysical basis of natural theology and criticized classical arguments for the existence of God.

Blainey notes that, although Voltaire is widely considered to have strongly contributed to atheistic thinking during the Revolution, he also considered fear of God to have discouraged further disorder, having said “If God did not exist, it would be necessary to invent him.”[169] In Reflections on the Revolution in France (1790), the philosopher Edmund Burke denounced atheism, writing of a “literary cabal” who had “some years ago formed something like a regular plan for the destruction of the Christian religion. This object they pursued with a degree of zeal which hitherto had been discovered only in the propagators of some system of piety… These atheistical fathers have a bigotry of their own…”. But, Burke asserted, “man is by his constitution a religious animal” and “atheism is against, not only our reason, but our instincts; and… it cannot prevail long”.[170]

Baron d’Holbach was a prominent figure in the French Enlightenment who is best known for his atheism and for his voluminous writings against religion, the most famous of them being The System of Nature (1770) but also Christianity Unveiled. One goal of the French Revolution was a restructuring and subordination of the clergy with respect to the state through the Civil Constitution of the Clergy. Attempts to enforce it led to anti-clerical violence and the expulsion of many clergy from France, lasting until the Thermidorian Reaction. The radical Jacobins seized power in 1793, ushering in the Reign of Terror. The Jacobins were deists and introduced the Cult of the Supreme Being as a new French state religion. Some atheists surrounding Jacques Hbert instead sought to establish a Cult of Reason, a form of atheistic pseudo-religion with a goddess personifying reason. The Napoleonic era further institutionalized the secularization of French society.

In the latter half of the 19th century, atheism rose to prominence under the influence of rationalistic and freethinking philosophers. Many prominent German philosophers of this era denied the existence of deities and were critical of religion, including Ludwig Feuerbach, Arthur Schopenhauer, Max Stirner, Karl Marx, and Friedrich Nietzsche.[171]

G.J. Holyoake was the last person imprisoned in Great Britain (in 1842) for his atheist beliefs.[172] Stephen Law states that Holyoake “first coined the term ‘secularism’”.[173]

Atheism in the 20th century, particularly in the form of practical atheism, advanced in many societies. Atheistic thought found recognition in a wide variety of other, broader philosophies, such as existentialism, objectivism, secular humanism, nihilism, anarchism, logical positivism, Marxism, feminism,[174] and the general scientific and rationalist movement.

In addition, state atheism emerged in Eastern Europe and Asia during that period, particularly in the Soviet Union under Vladimir Lenin and Joseph Stalin, and in Communist China under Mao Zedong. Atheist and anti-religious policies in the Soviet Union included numerous legislative acts, the outlawing of religious instruction in the schools, and the emergence of the League of Militant Atheists.[175][176] After Mao, the Chinese Communist Party remains an atheist organization, and regulates, but does not completely forbid, the practice of religion in mainland China.[177][178][179]

While Geoffrey Blainey has written that “the most ruthless leaders in the Second World War were atheists and secularists who were intensely hostile to both Judaism and Christianity”,[180] Richard Madsen has pointed out that Hitler and Stalin each opened and closed churches as a matter of political expedience, and Stalin softened his opposition to Christianity in order to improve public acceptance of his regime during the war.[181] Blackford and Schüklenk have written that “the Soviet Union was undeniably an atheist state, and the same applies to Maoist China and Pol Pot’s fanatical Khmer Rouge regime in Cambodia in the 1970s. That does not, however, show that the atrocities committed by these totalitarian dictatorships were the result of atheist beliefs, carried out in the name of atheism, or caused primarily by the atheistic aspects of the relevant forms of communism.”[182]

Logical positivism and scientism paved the way for neopositivism, analytical philosophy, structuralism, and naturalism. Neopositivism and analytical philosophy discarded classical rationalism and metaphysics in favor of strict empiricism and epistemological nominalism. Proponents such as Bertrand Russell emphatically rejected belief in God. In his early work, Ludwig Wittgenstein attempted to separate metaphysical and supernatural language from rational discourse. A. J. Ayer asserted the unverifiability and meaninglessness of religious statements, citing his adherence to the empirical sciences. Relatedly, the applied structuralism of Lévi-Strauss sourced religious language to the human subconscious in denying its transcendental meaning. J. N. Findlay and J. J. C. Smart argued that the existence of God is not logically necessary. Naturalists and materialistic monists such as John Dewey considered the natural world to be the basis of everything, denying the existence of God or immortality.[53][183]

Other leaders, such as Periyar E. V. Ramasamy, a prominent atheist in India, fought against Hinduism and the Brahmins for discriminating against and dividing people in the name of caste and religion.[184] This was highlighted in 1956 when he arranged for the erection of a statue depicting a Hindu god in a humble representation and made antitheistic statements.[185]

Atheist Vashti McCollum was the plaintiff in a landmark 1948 Supreme Court case that struck down religious education in US public schools.[186] Madalyn Murray O’Hair was perhaps one of the most influential American atheists; she brought forth the 1963 Supreme Court case Murray v. Curlett, which banned compulsory prayer in public schools.[187] In 1966, Time magazine asked “Is God Dead?”[188] in response to the Death of God theological movement, citing the estimation that nearly half of all people in the world lived under an anti-religious power, and millions more in Africa, Asia, and South America seemed to lack knowledge of the Christian view of theology.[189] The Freedom From Religion Foundation was co-founded by Anne Nicol Gaylor and her daughter, Annie Laurie Gaylor, in 1976 in the United States, and incorporated nationally in 1978. It promotes the separation of church and state.[190][191]

Since the fall of the Berlin Wall, the number of actively anti-religious regimes has reduced considerably. In 2006, Timothy Shah of the Pew Forum noted “a worldwide trend across all major religious groups, in which God-based and faith-based movements in general are experiencing increasing confidence and influence vis-à-vis secular movements and ideologies.”[192] However, Gregory S. Paul and Phil Zuckerman consider this a myth and suggest that the actual situation is much more complex and nuanced.[193]

A 2010 survey found that those identifying themselves as atheists or agnostics are on average more knowledgeable about religion than followers of major faiths. Nonbelievers scored better on questions about tenets central to Protestant and Catholic faiths. Only Mormon and Jewish faithful scored as well as atheists and agnostics.[194]

In 2012, the first “Women in Secularism” conference was held in Arlington, Virginia.[195] Secular Woman was organized in 2012 as a national organization focused on nonreligious women.[196] The atheist feminist movement has also become increasingly focused on fighting sexism and sexual harassment within the atheist movement itself.[197] In August 2012, Jennifer McCreight (the organizer of Boobquake) founded a movement within atheism known as Atheism Plus, or A+, that “applies skepticism to everything, including social issues like sexism, racism, politics, poverty, and crime”.[198][199][200]

In 2013 the first atheist monument on American government property was unveiled at the Bradford County Courthouse in Florida: a 1,500-pound granite bench and plinth inscribed with quotes by Thomas Jefferson, Benjamin Franklin, and Madalyn Murray O’Hair.[201][202]

New Atheism is the name given to a movement among some early-21st-century atheist writers who have advocated the view that “religion should not simply be tolerated but should be countered, criticized, and exposed by rational argument wherever its influence arises.”[203] The movement is commonly associated with Sam Harris, Daniel C. Dennett, Richard Dawkins, Victor J. Stenger, and Christopher Hitchens.[204] Several best-selling books by these authors, published between 2004 and 2007, form the basis for much of the discussion of New Atheism.

These atheists generally seek to disassociate themselves from the mass political atheism that gained ascendancy in various nations in the 20th century. In best-selling books, the religiously motivated terrorist events of 9/11 and the partially successful attempts of the Discovery Institute to change the American science curriculum to include creationist ideas, together with support for those ideas from George W. Bush in 2005, have been cited by authors such as Harris, Dennett, Dawkins, Stenger, and Hitchens as evidence of a need to move society towards atheism.[206]

It is difficult to quantify the number of atheists in the world. Respondents to religious-belief polls may define “atheism” differently or draw different distinctions between atheism, non-religious beliefs, and non-theistic religious and spiritual beliefs.[207] A Hindu atheist, for example, would declare himself or herself a Hindu while also being an atheist.[208] A 2010 survey published in Encyclopædia Britannica found that the non-religious made up about 9.6% of the world’s population, and atheists about 2.0%, with a very large majority based in Asia. This figure did not include those who follow atheistic religions, such as some Buddhists.[209] The average annual change for atheism from 2000 to 2010 was 0.17%.[209] A broad figure estimates the number of atheists and agnostics on Earth at 1.1 billion.[210]

According to global studies done by Gallup International, 13% of respondents were “convinced atheists” in 2012 and 11% were “convinced atheists” in 2015.[24][211] As of 2012, the top ten countries with people who viewed themselves as “convinced atheists” were China (47%), Japan (31%), the Czech Republic (30%), France (29%), South Korea (15%), Germany (15%), Netherlands (14%), Austria (10%), Iceland (10%), Australia (10%), and the Republic of Ireland (10%).[212]

According to the 2010 Eurobarometer Poll, the percentage of those polled who agreed with the statement “you don’t believe there is any sort of spirit, God or life force” varied from: France (40%), Czech Republic (37%), Sweden (34%), Netherlands (30%), and Estonia (29%), down to Poland (5%), Greece (4%), Cyprus (3%), Malta (2%), and Romania (1%), with the European Union as a whole at 20%.[28] In a 2012 Eurobarometer poll on discrimination in the European Union, 16% of those polled considered themselves non believers/agnostics and 7% considered themselves atheists.[214] According to the Australian Bureau of Statistics, 22% of Australians have “no religion”, a category that includes atheists.[215]

According to a Pew Research Center survey in 2012, the religiously unaffiliated (including agnostics and atheists) make up about 18% of Europeans.[216] According to the same survey, the religiously unaffiliated are the majority of the population in only two European countries: the Czech Republic (75%) and Estonia (60%).[216] There are another four countries where the unaffiliated make up a majority of the population: North Korea (71%), Japan (57%), Hong Kong (56%), and China (52%).[216]

In the US, self-reported atheism rose from 1% to 5% between 2005 and 2012, while the share of those who self-identified as “religious” dropped more sharply, by 13 percentage points, from 73% to 60%.[217] According to the World Values Survey, 4.4% of Americans self-identified as atheists in 2014.[218] However, the same survey showed that 11.1% of all respondents stated “no” when asked if they believed in God.[218] In 1984, these same figures were 1.1% and 2.2%, respectively. According to a 2015 report by the Pew Research Center, 3.1% of the US adult population identify as atheist, up from 1.6% in 2007; within the religiously unaffiliated (or “no religion”) demographic, atheists made up 13.6%.[219] According to the 2015 General Social Survey, the number of atheists and agnostics in the US has remained relatively flat over the past 23 years: in 1991 only 2% identified as atheist and 4% as agnostic, while in 2014, 3% identified as atheist and 5% as agnostic.[220]

A study noted positive correlations between levels of education and secularism, including atheism, in America.[87] According to evolutionary psychologist Nigel Barber, atheism blossoms in places where most people feel economically secure, particularly in the social democracies of Europe: extensive social safety nets and better health care mean less uncertainty about the future, resulting in a greater quality of life and higher life expectancy. By contrast, in underdeveloped countries, there are virtually no atheists.[221] In a 2008 study, researchers found intelligence to be negatively related to religious belief in Europe and the United States. In a sample of 137 countries, the correlation between national IQ and disbelief in God was found to be 0.60.[222]


Link:

Atheism – Wikipedia, the free encyclopedia


Atheism – Conservapedia

 Atheism  Comments Off on Atheism – Conservapedia
Jan 20 2016
 

Atheism, as defined by the Stanford Encyclopedia of Philosophy, the Routledge Encyclopedia of Philosophy, and other philosophy reference works, is the denial of the existence of God.[1] Beginning in the latter portion of the 20th century and continuing beyond, many agnostics/atheists have argued that atheism should be defined as a mere lack of belief in God or gods.[2][3][4]

Atheism has been examined by many disciplines in terms of its effects on individuals and society, and these effects will be covered shortly.

As for individuals adopting an atheistic worldview, atheism has a number of causal factors, and these will be elaborated on below.

See also: Schools of atheist thought and Atheist factions

The history of atheism can be dated to as early as the 5th century BC. Diagoras of Melos was a 5th-century BC Greek atheist, poet, and sophist. Since that time, many schools of atheist thought have developed.

Atheists claim there are two main reasons for their denial of the existence of God and/or disbelief in God: the conviction that there is positive evidence or argument that God does not exist (strong atheism, also sometimes called positive atheism), and the claim that theists bear the burden of proof to show that God exists, that they have failed to do so, and that belief is therefore unwarranted (weak atheism).

As alluded to above, theists and others have posited a number of causes of atheism, and this matter will be further addressed in this article.

Charles Bradlaugh, in 1876, proposed that atheism does not assert “there is no God,” and by doing so he endeavored to dilute the traditional definition of atheism.[5][2] As noted above, in the latter portion of the 20th century, the proposition that the definition of atheism be defined as a mere lack of belief in God or gods began to be commonly advanced by agnostics/atheists.[2][6] It is now common for atheists/agnostics and theists to debate the meaning of the word atheism.[2][7]

Critics of the broader definition of atheism as a mere lack of belief indicate that such a definition is contrary to the traditional/historical meaning of the word and that it makes atheism indistinguishable from agnosticism.[2][4][8]

For more information, please see:

Below are three common ways that atheism manifests itself:

1. Militant atheism, which continues to suppress and oppress religious believers today


2. Philosophical atheism – Atheist philosophers assert that God does not exist. (See also: Naturalism)

3. Practical atheism: atheism of life; that is, living as though God does not exist.[9]

See also: Atheist factions and Schools of atheist thought and Atheist cults and Atheism and intolerance

In 2015, Dr. J. Gordon Melton said of the atheist movement (organized atheism) that atheism is not a movement which tends to create community, but that in the last few years there has been some growth of organized atheism.[10]

Jacques Rousseau wrote in the Daily Maverick: “Elevatorgate… has resulted in three weeks of infighting in the secular community. Some might observe that we indulge in these squabbles fairly frequently.”[11] An ex-atheist wrote: “As an Atheist for 40 years, I noticed that there is not just a wide variety of Atheist positions, but there exists an actual battle between certain Atheist factions.”[12]

See also: Atheist movement and Atheism and anger

Blair Scott served on the American Atheists board of directors.[13] Mr. Scott formerly served as a State Director for the American Atheists organization in the state of Alabama. On December 1, 2012, he quit his post as a director of outreach for the American Atheists due to infighting within the American atheist movement.[14]

Mr. Scott wrote:

See also: Antitheism and antisocial behavior

See also: Atheism has a lower retention rate compared to other worldviews and Desecularization and Atheism and apathy

In 2012, a Georgetown University study was published indicating that only about 30 percent of those who grow up in an atheist household remain atheists as adults.[15] Similarly, according to recent research by the Pew Forum on Religion and Public Life, in the United States, a majority of those surveyed who were raised in atheist or agnostic households, or where there was no specific religious attachment, later chose to join a religious faith.[16] See also: Atheism and poor relationships with parents

A 2012 study by the General Social Survey of the social science research organization NORC at the University of Chicago found that belief in God rises with age, even in atheistic nations.[17] See also: Atheism and immaturity.

In addition, in atheistic Communist China, Christianity is experiencing rapid growth (see: Growth of Christianity in China).

See also:

See also: Atheism and loneliness and Atheism and apathy and Internet atheism and American atheists and church attendance

In comparison to many religious groups, which hold meetings in numerous convenient places on any given day or week, atheist meetings are sparse. One of the causes of this situation is the apathy of many atheists (see: Atheism and apathy and Atheism is uninspiring).

Atheist François Tremblay wrote about the difficulty of motivating atheists to engage in activities related to atheism: “One last problem that undermines any propagation of atheism is inspiration. Let’s be honest here, ‘there is no god!’ is not a very motivating call for most people” (see also: Atheism is uninspiring).[19] The atheist Jerry Coyne said about atheist meetings/conferences, “But to me the speakers and talks have often seemed repetitive: the same crew of jet-set skeptics giving the same talks.”[20]

In an essay entitled How the Atheist Movement Failed Me, an atheist woman noted that participation in the atheist community is often expensive due to the cost of attending atheist conferences, and that even local atheist meetings in restaurants and bars challenged her modest budget.[21] As a result of the challenges that atheists commonly have in terms of socializing in person, many atheists turn to the internet to communicate with other atheists.[22] Often, internet communication between atheists turns contentious (see: Atheist factions).

For more information, please see: Atheism and loneliness

See also: Atheists doubting the validity of atheism

Hannah More wrote: “[T]he mind, which knows not where to fly, flies to God. In agony, nature is no Atheist. The soul is drawn to God by a sort of natural impulse; not always, perhaps by an emotion of piety; but from a feeling conviction, that every other refuge is ‘a refuge of lies’.”[23]

See also: Atheism and death and Atheist funerals and Atheism and Hell

Science Daily reported that death anxiety increases atheists’ unconscious belief in God.[25] In a Psychology Today article, Dr. Nathan A. Heflick reported similar results in other studies.[26] Under stress, the brain’s processing works in a way that prefers unconscious thinking.[27]

A United States study and a Taiwanese study indicated that the irreligious fear death more than the very religious.[28]

For additional information, please see the article: Atheism and death

See also: Atheism and Hell

The journalist and ex-atheist Peter Hitchens, who is the brother of the late atheist Christopher Hitchens, said that upon seeing an art exhibit of Michelangelo’s painting The Last Judgment, he came to the realization that he might be judged, which startled him.[29] This started a train of thought within Peter Hitchens that eventually led him to become a Christian.[29]

For more information, please see: Atheism and Hell

See: Atheism and cryonics and Atheist cults

Cryonics is a pseudoscience that tries to extend life or achieve immortality in a non-theistic way after a person is legally dead (cryonic procedures are performed shortly after a person’s death).[30] Atheists Robert Ettinger and Isaac Asimov played a notable role in the founding of the cryonics movement.[31] According to The Cryonics Society, Asimov said of cryonics, “Though no one can quantify the probability of cryonics working, I estimate it is at least 90%…”[32] For more information, please see: Atheism and cryonics

See: Atheism and transhumanism

See also: There are no atheists in foxholes and Atheists doubting the validity of atheism

Reverend William T. Cummings is famous for declaring “There are no atheists in foxholes.”[34] Chaplain F. W. Lawson of the 302d Machine Gun Battalion, who was wounded twice in wartime, stated “I doubt if there is such a thing as an atheist. At least there isn’t in a front line trench.”[35] On the other hand, the news organization NBC featured a story in which atheist veterans claimed that there are atheists in foxholes.[36]

Research indicates that heavy combat is positively correlated with the strength of soldiers’ religious faith during battle, and after the war among those who described their combat experience as negative (for more information please see: There are no atheists in foxholes).

Also, due to research showing that death anxiety increases atheists’ unconscious belief in God, Dr. Nathan Heflick declared in a Psychology Today article, “But, at a less conscious (or pre-conscious) level, this research suggests that there might be less atheism in foxholes than atheists in foxholes report.”[26] Please see: Atheism and death

See also: Denials that atheists exist and Atheists doubting the validity of atheism and Atheism and apathy

It has been asserted by various theists that atheists do not exist and that atheists are actively suppressing their belief and knowledge of God and enigmatically engage in self-deception and in the deception of others (see: Denials that atheists exist and Atheism and deception). In atheistic Japan, researchers found that Japanese children see the world as designed.[37]

See also: Atheism and communism and Militant atheism and Atheism and economics and Atheism and mass murder and Atheist cults and Atheism and Karl Marx

Karl Marx said “[Religion] is the opium of the people”. Marx also stated: “Communism begins from the outset (Owen) with atheism; but atheism is at first far from being communism; indeed, that atheism is still mostly an abstraction.”[38]

Vladimir Lenin similarly wrote regarding atheism and communism: “A Marxist must be a materialist, i.e., an enemy of religion, but a dialectical materialist, i.e., one who treats the struggle against religion not in an abstract way, not on the basis of remote, purely theoretical, never varying preaching, but in a concrete way, on the basis of the class struggle which is going on in practice and is educating the masses more and better than anything else could.”[39]

In 1955, Chinese communist leader Zhou Enlai declared, “We Communists are atheists”.[40] In 2014, the Communist Party of China reaffirmed that members of their party must be atheists.[41]

According to the University of Cambridge, historically, the “most notable spread of atheism was achieved through the success of the 1917 Russian Revolution, which brought the Marxist-Leninists to power.”[42] Vitaly Lazarevich Ginzburg, a Soviet physicist, wrote that the “Bolshevik communists were not merely atheists but, according to Lenin’s terminology, militant atheists.”[43] However, prior to this, the Reign of Terror of the French Revolution established a state which was anti-Roman Catholic/anti-Christian in nature[44] (anti-clerical deism and anti-religious atheism played a significant role in the French Revolution[45]), with the official ideology being the Cult of Reason; during this time thousands of believers were suppressed and executed by the guillotine.[46]

See also: Atheism vs. Christianity

The atheism in communist regimes has been, and continues to be, militant atheism, accompanied by various acts of repression, including the razing of thousands of religious buildings and the killing, imprisoning, and oppression of religious leaders and believers.[47]

The persecution of Christians in the Soviet Union was the result of the violently atheist Soviet government. In the first five years after the October Revolution, 28 bishops and 1,200 priests were murdered, many on the orders of Leon Trotsky. When Joseph Stalin came to power in 1927, he ordered his secret police, under Genrikh Yagoda, to intensify the persecution of Christians. In the next few years, 50,000 clergy were murdered, many of them tortured, some by crucifixion. “Russia turned red with the blood of martyrs”, said Father Gleb Yakunin of the Russian Orthodox Church.[48] According to Orthodox Church sources, as many as fifty million Orthodox believers may have died in the twentieth century, mainly from persecution by Communists.[49]

In addition, in the atheistic and communist Soviet Union, 44 anti-religious museums were opened; the largest was the Museum of the History of Religion and Atheism in Leningrad’s Kazan Cathedral.[50] Despite intense effort by the atheistic leaders of the Soviet Union, their efforts were not effective in converting the masses to atheism.[51]

China is a communist country. In 1999, the publication Christian Century reported that “China has persecuted religious believers by means of harassment, prolonged detention, and incarceration in prison or ‘reform-through-labor’ camps and police closure of places of worship.” In 2003, owners of Bibles in China were sent to prison camps and 125 Chinese churches were closed.[53] China continues to practice religious oppression today.[54]

The efforts of China’s atheist leaders in promoting atheism, however, are increasingly losing their effectiveness, and the number of Christians in China is rapidly growing (see: Growth of Christianity in China). China’s state-sponsored atheism and atheistic indoctrination have been a failure: a 2007 religious survey in China indicated that only 15% of Chinese identified themselves as atheists.[55]

North Korea is a repressive communist state and is officially atheistic.[56] The North Korean government practices brutal repression and atrocities against North Korean Christians.[57]

See also: Atheism and mass murder

It has been estimated that in less than the past 100 years, governments under the banner of communism have caused the death of somewhere between 40,472,000 and 259,432,000 human beings.[58] Dr. R. J. Rummel, professor emeritus of political science at the University of Hawaii, is the scholar who first coined the term democide (death by government). Dr. Rummel’s mid-range estimate is that communism caused the death of approximately 110,286,000 people between 1917 and 1987.[59] Richard Dawkins has attempted to engage in historical revisionism concerning atheist atrocities, and Dawkins was shown to be in gross error. See also: Atheism and historical revisionism

See also: Atheistic communism and torture

The website Victimsofcommunism.org declares concerning atheistic communism and the use of torture:

For more information, please see: Atheistic communism and torture

In atheistic communist regimes forced labor has often played a significant role in their economies and this practice continues to this day (see: Atheism and forced labor).[63]

See also: Communist China and involuntary organ harvesting

Several researchers, for example Canadian human rights lawyer David Matas, former Canadian parliamentarian David Kilgour, and investigative journalist Ethan Gutmann, estimate that tens of thousands of Falun Gong prisoners in communist China have been killed to supply a financially lucrative trade in human organs and cadavers, and that these human rights abuses may be ongoing.[64] For more information, please see: Communist China and involuntary organ harvesting

Christian apologist Gregory Koukl wrote, regarding atheism and mass murder, that “the assertion is that religion has caused most of the killing and bloodshed in the world. There are people who make accusations and assertions that are empirically false. This is one of them.”[65] Koukl details the number of people killed in various events involving theism and compares them to the much higher tens of millions of people killed under regimes which advocated atheism.[65] As noted earlier, Richard Dawkins has attempted to engage in historical revisionism concerning atheist atrocities and was shown to be in gross error.

Koukl summarized by stating:

Nobel Prize winner Aleksandr Solzhenitsyn was asked to account for the great tragedies that occurred under the brutal communist regime he and fellow citizens suffered under.

Aleksandr Solzhenitsyn wrote:

Since then I have spent well-nigh 50 years working on the history of our revolution; in the process I have read hundreds of books, collected hundreds of personal testimonies, and have already contributed eight volumes of my own toward the effort of clearing away the rubble left by that upheaval. But if I were asked today to formulate as concisely as possible the main cause of the ruinous revolution that swallowed up some 60 million of our people, I could not put it more accurately than to repeat: “Men have forgotten God; that’s why all this has happened.”[66]

Theodore Beale notes concerning atheism and mass murder:

The total body count for the ninety years between 1917 and 2007 is approximately 148 million dead at the bloody hands of fifty-two atheists, three times more than all the human beings killed by war, civil war, and individual crime in the entire twentieth century combined.

The historical record of collective atheism is thus 182,716 times worse on an annual basis than Christianity’s worst and most infamous misdeed, the Spanish Inquisition. It is not only Stalin and Mao who were so murderously inclined, they were merely the worst of the whole Hell-bound lot. For every Pol Pot whose infamous name is still spoken with horror today, there was a Mengistu, a Bierut, and a Choibalsan, godless men whose names are now forgotten everywhere but in the lands they once ruled with a red hand.

Is a 58 percent chance that an atheist leader will murder a noticeable percentage of the population over which he rules sufficient evidence that atheism does, in fact, provide a systematic influence to do bad things? If that is not deemed to be conclusive, how about the fact that the average atheist crime against humanity is 18.3 million percent worse than the very worst depredation committed by Christians, even though atheists have had less than one-twentieth the number of opportunities with which to commit them. If one considers the statistically significant size of the historical atheist set and contrasts it with the fact that not one in a thousand religious leaders have committed similarly large-scale atrocities, it is impossible to conclude otherwise, even if we do not yet understand exactly why this should be the case. Once might be an accident, even twice could be coincidence, but fifty-two incidents in ninety years reeks of causation![67]


See also: Irreligion/religion and war/peace

Louise Ridley (assistant news editor at the Huffington Post UK), Vox Day and others point out that academic studies and other research consistently challenge the link between religion and war.[68]

There is historical evidence indicating that Darwinism was a causal factor for WWI and WWII (see: Irreligion/religion and war/peace and World War I and Darwinism).

See also: Religion and education and Atheistic indoctrination and education and Atheism and intelligence and Atheism and academia and Atheism and academic performance

In the United States, religious belief is positively correlated with education; a study published in the academic journal Review of Religious Research demonstrated that increased education is correlated with belief in God and that “education positively affects religious participation, devotional activities, and emphasizing the importance of religion in daily life.”[69]

One of the reasons education is positively correlated with belief in God in the United States is that the demographics of people attending higher education have shifted, with more women and southerners attending higher education (these two groups are more likely to be theists; see: Atheism and women).[70]

Although atheistic indoctrination in school systems can have an effect on individuals (See: Atheist indoctrination), research indicates that social/economic insecurity often has a more significant impact.[71]


See also: Atheism and academia

In 2001, the atheist and philosopher Quentin Smith declared:

In 2004, Professor Alister McGrath, professor of historical theology at Wycliffe Hall, Oxford University declared, “The golden age of atheism is over.”[73]


See also: Atheism and intelligence and Atheism and Gardner’s theory of multiple intelligences and Causes of atheism

Within various countries, comparisons of standardized intelligence test (IQ) scores between atheists/agnostics and theists yield conflicting results.[74][75] Part of the problem is that social scientists use variant definitions of atheism.[76] See also: Atheism, intelligence and the General Social Survey

However, within individuals, families and societies irreligion/religion can have an effect on intelligence – especially over time (See: Atheism and intelligence).

The Flynn effect is the significant and long-sustained increase in intelligence test scores measured in many parts of the world from roughly 1930 to the present.[77] In some secular, economically developed countries, the Flynn effect has ceased and their scores on standardized intelligence tests are falling.[78] However, the Flynn effect is continuing in developing countries, which tend to be more religious (see: Intelligence trends in religious countries and secular countries).

See also: Atheism and the brain and Religiosity and larger frontal lobes

Brain researchers have conducted a number of studies focusing on the differences between atheists and the religious (see: Atheism and the brain and Religiosity and larger frontal lobes).

In many secular countries intelligence is falling, while in many religious countries intelligence is increasing. See: Intelligence trends in religious countries and secular countries

See: Atheism and the theory of multiple intelligences

Visit link:

Atheism – Conservapedia


History of atheism – Wikipedia, the free encyclopedia

 Atheism  Comments Off on History of atheism – Wikipedia, the free encyclopedia
Jan 202016
 

Atheism (derived from the Ancient Greek atheos meaning “without gods; godless; secular; denying or disdaining the gods, especially officially sanctioned gods”[1]) is the absence or rejection of the belief that deities exist. The English term was used at least as early as the sixteenth century and atheistic ideas and their influence have a longer history. Over the centuries, atheists have supported their lack of belief in gods through a variety of avenues, including scientific, philosophical and ideological notions.

Philosophical atheist thought began to appear in Europe and Asia in the sixth or fifth century BCE. Will Durant explains that certain pygmy tribes found in Africa were observed to have no identifiable cults or rites. There were no totems, no deities, and no spirits. Their dead were buried without special ceremonies or accompanying items and received no further attention. They even appeared to lack simple superstitions, according to travelers’ reports.[citation needed] The Veddahs of Ceylon only admitted the possibility that deities might exist, but went no further. Neither prayers nor sacrifices were suggested in any way.[citation needed]

In the East, a contemplative life not centered on the idea of deities began in the sixth century BCE with the rise of Jainism, Buddhism, and certain sects of Hinduism in India, and of Taoism in China. These religions claim to offer a philosophic and salvific path not involving deity worship. Deities are not seen as necessary to the salvific goal of the early Buddhist tradition; their reality is explicitly questioned and refuted, and there is a fundamental incompatibility between the notion of gods and basic Buddhist principles.[2]

Within the astika (“orthodox”) schools of Hindu philosophy, the Samkhya and the early Mimamsa school did not accept a creator-deity in their respective systems.

The principal text of the Samkhya school, the Samkhya Karika, was written by Ishvara Krishna in the fourth century CE, by which time it was already a dominant Hindu school. The origins of the school are much older and are lost in legend. The school was both dualistic and atheistic. It believed in a dual existence of Prakriti (“nature”) and Purusha (“spirit”) and had no place for an Ishvara (“God”) in its system, arguing that the existence of Ishvara cannot be proved and hence cannot be admitted to exist. The school dominated Hindu philosophy in its day, but declined after the tenth century, although commentaries were still being written as late as the sixteenth century.

The foundational text for the Mimamsa school is the Purva Mimamsa Sutras of Jaimini (c. third to first century BCE). The school reached its height c. 700 CE, and for some time in the Early Middle Ages exerted near-dominant influence on learned Hindu thought. The Mimamsa school saw its primary enquiry as being into the nature of dharma, based on close interpretation of the Vedas. Its core tenets were ritualism (orthopraxy), anti-asceticism and anti-mysticism. The early Mimamsakas believed in an adrishta (“unseen”) that is the result of performing karmas (“works”) and saw no need for an Ishvara (“God”) in their system. Mimamsa persists in some subschools of Hinduism today.

Jains see their tradition as eternal. Organized Jainism can be dated back to Parshva, who lived in the ninth century BCE, and, more reliably, to Mahavira, a teacher of the sixth century BCE and a contemporary of the Buddha. Jainism is a dualistic religion in which the universe is made up of matter and souls. The universe, and the matter and souls within it, is eternal and uncreated, and there is no omnipotent creator deity in Jainism. There are, however, “gods” and other spirits who exist within the universe, and Jains believe that the soul can attain “godhood”; however, none of these supernatural beings exercises any sort of creative activity or has the capacity or ability to intervene in answer to prayers.

The thoroughly materialistic and anti-religious philosophical Cārvāka school that originated in India with the Bārhaspatya-sūtras (final centuries BCE) is probably the most explicitly atheist school of philosophy in the region. The school grew out of the generic skepticism of the Mauryan period. Already in the sixth century BCE, Ajita Kesakambalin was quoted in Pali scriptures by the Buddhists with whom he was debating, teaching that “with the break-up of the body, the wise and the foolish alike are annihilated, destroyed. They do not exist after death.”[3] Cārvākan philosophy is now known principally from its Astika and Buddhist opponents. The proper aim of a Cārvākan, according to these sources, was to live a prosperous, happy, productive life in this world. The Tattvopaplavasimha of Jayarashi Bhatta (c. eighth century) is sometimes cited as a surviving Cārvāka text. The school appears to have died out sometime around the fifteenth century.

The non-adherence[4] to the notion of a supreme deity or a prime mover is seen by many as a key distinction between Buddhism and other religions. While Buddhist traditions do not deny the existence of supernatural beings (many are discussed in Buddhist scripture), they do not ascribe powers, in the typical Western sense, of creation, salvation or judgement to the “gods”; however, praying to enlightened deities is sometimes seen as leading to some degree of spiritual merit.

Buddhists accept the existence of beings in higher realms, known as devas, but they, like humans, are said to be suffering in samsara,[5] and are not particularly wiser than we are. In fact the Buddha is often portrayed as a teacher of the deities,[6] and superior to them.[7] Despite this, some enlightened devas are held to be on the path of buddhahood.

In later Mahayana literature, however, the idea of an eternal, all-pervading, all-knowing, immaculate, uncreated, and deathless Ground of Being (the dharmadhatu, inherently linked to the sattvadhatu, the realm of beings), which is the Awakened Mind (bodhicitta) or dharmakaya (“body of Truth”) of the Buddha himself, is attributed to the Buddha in a number of Mahayana sutras, and is found in various tantras as well. In some Mahayana texts, such a principle is occasionally presented as manifesting in a more personalised form as a primordial Buddha, such as Samantabhadra, Vajradhara, Vairochana, Amitabha, and Adi-Buddha, among others.

In western Classical antiquity, theism was the fundamental belief that supported the legitimacy of the state (the polis, later the Roman Empire). Historically, any person who did not believe in any deity supported by the state was fair game for accusations of atheism, a capital crime. For political reasons, Socrates in Athens (399 BCE) was accused of being atheos (“refusing to acknowledge the gods recognized by the state”). Christians in Rome were also considered subversive to the state religion and persecuted as atheists.[8] Thus, charges of atheism, meaning the subversion of religion, were often used similarly to charges of heresy and impiety as a political tool to eliminate enemies.

The roots of Western philosophy began in the Greek world in the sixth century BCE. The first Hellenic philosophers were not atheists, but they attempted to explain the world in terms of the processes of nature instead of by mythological accounts. Thus lightning was the result of “wind breaking out and parting the clouds”,[9] and earthquakes occurred when “the earth is considerably altered by heating and cooling”.[10] The early philosophers often criticised traditional religious notions. Xenophanes (sixth century BCE) famously said that if cows and horses had hands, “then horses would draw the forms of gods like horses, and cows like cows”.[11] Another philosopher, Anaxagoras (fifth century BCE), claimed that the Sun was “a fiery mass, larger than the Peloponnese”; a charge of impiety was brought against him, and he was forced to flee Athens.[12]

The first fully materialistic philosophy was produced by the atomists Leucippus and Democritus (fifth century BCE), who attempted to explain the formation and development of the world in terms of the chance movements of atoms moving in infinite space.

Euripides (480–406 BCE), in his play Bellerophon, had the eponymous main character say:

Doth some one say that there be gods above? There are not; no, there are not. Let no fool, Led by the old false fable, thus deceive you.[13]

Aristophanes (ca. 448–380 BCE), known for his satirical style, wrote in his play The Knights: “Shrines! Shrines! Surely you don’t believe in the gods. What’s your argument? Where’s your proof?”[14]

In the fifth century BCE the Sophists began to question many of the traditional assumptions of Greek culture. Prodicus of Ceos was said to have believed that “it was the things which were serviceable to human life that had been regarded as gods,”[15] and Protagoras stated at the beginning of a book that “With regard to the gods I am unable to say either that they exist or do not exist.”[16]

Diagoras of Melos (fifth century BCE) is known as the “first atheist”. He blasphemed by making public the Eleusinian Mysteries and discouraging people from being initiated.[17] Somewhat later (c. 300 BCE), the Cyrenaic philosopher Theodorus of Cyrene is supposed to have denied that gods exist, and wrote a book On the Gods expounding his views.

Euhemerus (c. 330–260 BCE) published his view that the gods were only the deified rulers, conquerors, and founders of the past, and that their cults and religions were in essence the continuation of vanished kingdoms and earlier political structures.[18] Although Euhemerus was later criticized for having “spread atheism over the whole inhabited earth by obliterating the gods”,[19] his worldview was not atheist in a strict and theoretical sense, because he held that the primordial deities were “eternal and imperishable”.[20] Some historians have argued that he merely aimed at reinventing the old religions in the light of the beginning of the deification of political rulers such as Alexander the Great.[21] Euhemerus’ work was translated into Latin by Ennius, possibly to mythographically pave the way for the planned divinization of Scipio Africanus in Rome.[22]

Also important in the history of atheism was Epicurus (c. 300 BCE). Drawing on the ideas of Democritus and the Atomists, he espoused a materialistic philosophy where the universe was governed by the laws of chance without the need for divine intervention. Although he stated that deities existed, he believed that they were uninterested in human existence. The aim of the Epicureans was to attain peace of mind by exposing fear of divine wrath as irrational.

One of the most eloquent expressions of Epicurean thought is Lucretius’ On the Nature of Things (first century BCE) in which he held that gods exist but argued that religious fear was one of the chief causes of human unhappiness and that the gods did not involve themselves in the world.[23][24]

The Epicureans also denied the existence of an afterlife.[25]

Epicureans were not persecuted, but their teachings were controversial, and were harshly attacked by the mainstream schools of Stoicism and Neoplatonism. The movement remained marginal, and gradually died out at the end of the Roman Empire.

In medieval Islam, Muslim scholars recognized the idea of atheism, and frequently attacked unbelievers, although they were unable to name any atheists.[26] When individuals were accused of atheism, they were usually viewed as heretics rather than proponents of atheism.[27] However, outspoken rationalists and atheists existed, one notable figure being the ninth-century scholar Ibn al-Rawandi, who criticized the notion of religious prophecy, including that of Muhammad, and maintained that religious dogmas were not acceptable to reason and must be rejected.[28] Other critics of religion in the Islamic world include the physician and philosopher Abu Bakr al-Razi (865–925), the poet Al-Ma‘arri (973–1057), and the scholar Abu Isa al-Warraq (fl. ninth century). Al-Ma‘arri, for example, wrote and taught that religion itself was a “fable invented by the ancients”[29] and that humans were “of two sorts: those with brains, but no religion, and those with religion, but no brains.”[30]

In the European Middle Ages, no clear expression of atheism is known. The titular character of the Icelandic saga Hrafnkell, written in the late thirteenth century, says that “I think it is folly to have faith in gods.” After his temple to Freyr is burnt and he is enslaved, he vows never to perform another sacrifice, a position described in the sagas as goðlauss, “godless”. Jacob Grimm in his Teutonic Mythology observes that

It is remarkable that Old Norse legend occasionally mentions certain men who, turning away in utter disgust and doubt from the heathen faith, placed their reliance on their own strength and virtue. Thus in the Sólarljóð 17 we read of Vébogi and Rádey á sik þau trúðu, “in themselves they trusted”,[31]

citing several other examples, including two kings.

In Christian Europe, people were persecuted for heresy, especially in countries where the Inquisition was active. Thomas Aquinas’ five proofs of God’s existence and Anselm’s ontological argument implicitly acknowledged the validity of the question about God’s existence.[original research?] Frederick Copleston, however, explains that Thomas laid out his proofs not to counter atheism, but to address certain early Christian writers such as John of Damascus, who asserted that knowledge of God’s existence was naturally innate in man, based on his natural desire for happiness.[32] Thomas stated that although there is a desire for happiness which forms the basis for a proof of God’s existence in man, further reflection is required to understand that this desire is only fulfilled in God, not for example in wealth or sensual pleasure.[32]

The charge of atheism was used to attack political or religious opponents. Pope Boniface VIII, because he insisted on the political supremacy of the church, was accused by his enemies after his death of holding (unlikely) atheistic positions such as “neither believing in the immortality nor incorruptibility of the soul, nor in a life to come.”[33]

During the time of the Renaissance and the Reformation, criticism of the religious establishment became more frequent in predominantly Christian countries, but did not amount to atheism, per se.

The term athéisme was coined in France in the sixteenth century. The word “atheist” appears in English books at least as early as 1566.[34] The concept of atheism re-emerged initially as a reaction to the intellectual and religious turmoil of the Age of Enlightenment and the Reformation, as a charge used by those who saw the denial of god and godlessness in the controversial positions being put forward by others. During the sixteenth and seventeenth centuries, the word ‘atheist’ was used exclusively as an insult; nobody wanted to be regarded as an atheist.[35] Although one overtly atheistic compendium known as the Theophrastus redivivus was published by an anonymous author in the seventeenth century, atheism was an epithet implying a lack of moral restraint.[36]

According to Geoffrey Blainey, the Reformation in Europe had paved the way for atheists by attacking the authority of the Catholic Church, which in turn “quietly inspired other thinkers to attack the authority of the new Protestant churches”. Deism gained influence in France, Prussia and England, and proffered belief in a non-interventionist deity, but “while some deists were atheists in disguise, most were religious, and by today’s standards would be called true believers”. The scientific and mathematical discoveries of figures such as Copernicus, Newton and Descartes sketched a pattern of natural laws that lent weight to this new outlook.[37] Blainey wrote that the Dutch philosopher Baruch Spinoza was “probably the first well known ‘semi-atheist’ to announce himself in a Christian land in the modern era”. Spinoza had been expelled from his synagogue for his protests against the teachings of its rabbis and for failing to attend Saturday services. He believed that God did not interfere in the running of the world, but rather that natural laws explained the workings of the universe. In 1661 he published his Short Treatise on God, but he was not a popular figure for the first century following his death: “An unbeliever was expected to be a rebel in almost everything and wicked in all his ways”, wrote Blainey, “but here was a virtuous one. He lived the good life and made his living in a useful way… It took courage to be a Spinoza or even one of his supporters. If a handful of scholars agreed with his writings, they did not so say in public.”[38]

How dangerous it was to be accused of being an atheist at this time is illustrated by the examples of Étienne Dolet, who was strangled and burned in 1546, and Giulio Cesare Vanini, who received a similar fate in 1619. In 1689 the Polish nobleman Kazimierz Łyszczyński, who had denied the existence of God in his philosophical treatise De non existentia Dei, was imprisoned unlawfully; despite the Warsaw Confederation tradition and King Sobieski’s intercession, Łyszczyński was condemned to death for atheism and beheaded in Warsaw after his tongue was pulled out with a burning iron and his hands slowly burned. Similarly in 1766, the French nobleman François-Jean de la Barre was tortured, beheaded, and his body burned for alleged vandalism of a crucifix, a case that became a cause célèbre because Voltaire tried unsuccessfully to have the judgment reversed.

The English philosopher Thomas Hobbes (1588–1679) was also accused of atheism, but he denied it. His theism was unusual, in that he held God to be material. Even earlier, the British playwright and poet Christopher Marlowe (1563–1593) was accused of atheism when a tract denying the divinity of Christ was found in his home. Before he could finish defending himself against the charge, Marlowe was murdered.

In early modern times, the first explicit atheist known by name was the German-language Danish critic of religion Matthias Knutzen (1646–after 1674), who published three atheist writings in 1674.[39]

Kazimierz Łyszczyński, a Polish philosopher (executed in 1689, following a hasty and controversial trial pressed by the Catholic Church), demonstrated strong atheism in his work De non existentia Dei:

II – the Man is a creator of God, and God is a concept and creation of a Man. Hence the people are architects and engineers of God and God is not a true being, but a being existing only within mind, being chimaeric by its nature, because a God and a chimaera are the same.[40]

IV – simple folk are cheated by the more cunning with the fabrication of God for their own oppression; whereas the same oppression is shielded by the folk in a way, that if the wise attempted to free them by the truth, they would be quelled by the very people.[41][42]

While not gaining converts from large portions of the population, versions of deism became influential in certain intellectual circles. Jean-Jacques Rousseau challenged the Christian notion that human beings had been tainted by sin since the Garden of Eden, and instead proposed that humans were originally good, only later to be corrupted by civilisation. The influential figure of Voltaire spread deistic notions to a wide audience. “After the French Revolution and its outbursts of atheism, Voltaire was widely condemned as one of the causes”, wrote Blainey. “Nonetheless, his writings did concede that fear of God was an essential policeman in a disorderly world: ‘If God did not exist, it would be necessary to invent him’, wrote Voltaire”.[43]

Arguably the first book in modern times solely dedicated to promoting atheism was written by French Catholic priest Jean Meslier (1664–1729), whose posthumously published lengthy philosophical essay (part of the original title: Thoughts and Feelings of Jean Meslier … Clear and Evident Demonstrations of the Vanity and Falsity of All the Religions of the World[44]) rejects the concept of god (both in the Christian and also in the Deistic sense), the soul, miracles and the discipline of theology.[45] Philosopher Michel Onfray states that Meslier’s work marks the beginning of “the history of true atheism”.[45]

By the 1770s, atheism in some predominantly Christian countries was ceasing to be a dangerous accusation that required denial, and was evolving into a position openly avowed by some. The first open denial of the existence of God and avowal of atheism since classical times may be that of Baron d’Holbach (1723–1789) in his 1770 work, The System of Nature. D’Holbach was a Parisian social figure who conducted a famous salon widely attended by many intellectual notables of the day, including Denis Diderot, Jean-Jacques Rousseau, David Hume, Adam Smith, and Benjamin Franklin. Nevertheless, his book was published under a pseudonym, and was banned and publicly burned by the Executioner.[citation needed] Diderot, one of the Enlightenment’s most prominent philosophes and editor-in-chief of the Encyclopédie, which sought to challenge religious (particularly Catholic) dogma, said: “Reason is to the estimation of the philosophe what grace is to the Christian. Grace determines the Christian’s action; reason the philosophe’s”.[46] Diderot was briefly imprisoned for his writing, some of which was banned and burned.[citation needed]

In Scotland, David Hume produced a six-volume history of England in 1754, which gave little attention to God. He implied that if God existed he was impotent in the face of European upheaval. Hume ridiculed miracles, but walked a careful line so as to avoid being too dismissive of Christianity. With Hume’s presence, Edinburgh gained a reputation as a “haven of atheism”, alarming many ordinary Britons.[47]

The culte de la Raison developed during the uncertain period 1792–94 (Years I and III of the Revolution), following the September Massacres, when Revolutionary France was ripe with fears of internal and foreign enemies. Several Parisian churches were transformed into Temples of Reason, notably the Church of Saint-Paul Saint-Louis in the Marais. The churches were closed in May 1793 and more securely on 24 November 1793, when the Catholic Mass was forbidden.

Blainey wrote that “atheism seized the pedestal in revolutionary France in the 1790s. The secular symbols replaced the cross. In the cathedral of Notre Dame the altar, the holy place, was converted into a monument to Reason…” During the Terror of 1792–93, France’s Christian calendar was abolished, monasteries, convents and church properties were seized and monks and nuns expelled. Historic churches were dismantled.[48] The Cult of Reason was a creed based on atheism devised during the French Revolution by Jacques Hébert, Pierre Gaspard Chaumette, and their supporters. It was stopped by Maximilien Robespierre, a Deist, who instituted the Cult of the Supreme Being.[49] Both cults were the outcome of the “de-Christianization” of French society during the Revolution and part of the Reign of Terror.

The Cult of Reason was celebrated in a carnival atmosphere of parades, ransacking of churches, ceremonious iconoclasm, in which religious and royal images were defaced, and ceremonies which substituted the “martyrs of the Revolution” for Christian martyrs. The earliest public demonstrations took place en province, outside Paris, notably by Hébertists in Lyon, but took a further radical turn with the Fête de la Liberté (“Festival of Liberty”) at Notre Dame de Paris, 10 November (20 Brumaire) 1793, in ceremonies devised and organised by Pierre-Gaspard Chaumette.

The pamphlet Answer to Dr. Priestley’s Letters to a Philosophical Unbeliever (1782) is considered to be the first published declaration of atheism in Britain, and plausibly the first in English (as distinct from covert or cryptically atheist works). The otherwise unknown ‘William Hammon’ (possibly a pseudonym) signed the preface and postscript as editor of the work, and the anonymous main text is attributed to Matthew Turner (d. 1788?), a Liverpool physician who may have known Priestley. Historian of atheism David Berman has argued strongly for Turner’s authorship, but also suggested that there may have been two authors.[50]

The French Revolution of 1789 catapulted atheistic thought into political notability in some Western countries, and opened the way for the nineteenth-century movements of Rationalism, Freethought, and Liberalism. Born in 1792, the Romantic poet Percy Bysshe Shelley, a child of the Age of Enlightenment, was expelled from England’s Oxford University in 1811 for submitting to the Dean an anonymous pamphlet that he wrote entitled The Necessity of Atheism. This pamphlet is considered by scholars to be the first atheistic tract published in the English language. An early atheistic influence in Germany was The Essence of Christianity by Ludwig Feuerbach (1804–1872). He influenced other German nineteenth-century atheistic thinkers like Karl Marx, Max Stirner, Arthur Schopenhauer (1788–1860), and Friedrich Nietzsche (1844–1900).

The freethinker Charles Bradlaugh (1833–1891) was repeatedly elected to the British Parliament, but was not allowed to take his seat after his request to affirm rather than take the religious oath was turned down (he then offered to take the oath, but this too was denied him). After Bradlaugh was re-elected for the fourth time, a new Speaker allowed Bradlaugh to take the oath and permitted no objections.[51] He became the first outspoken atheist to sit in Parliament, where he participated in amending the Oaths Act.[52]

In 1844, Karl Marx (1818–1883), an atheistic political economist, wrote in his Contribution to the Critique of Hegel’s Philosophy of Right: “Religious suffering is, at one and the same time, the expression of real suffering and a protest against real suffering. Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people.” Marx believed that people turn to religion in order to dull the pain caused by the reality of social situations; that is, Marx suggests religion is an attempt at transcending the material state of affairs in a society (the pain of class oppression) by effectively creating a dream world, rendering the religious believer amenable to social control and exploitation in this world while they hope for relief and justice in life after death. In the same essay, Marx states, “…[m]an creates religion, religion does not create man…”[53]

Friedrich Nietzsche, a prominent nineteenth-century philosopher, is well known for coining the aphorism “God is dead” (German: “Gott ist tot”); incidentally, the phrase was not spoken by Nietzsche directly, but was put into the mouths of characters in his works. Nietzsche argued that Christian theism as a belief system had been a moral foundation of the Western world, and that the rejection and collapse of this foundation as a result of modern thinking (the death of God) would naturally cause a rise in nihilism, or the lack of values. While Nietzsche was staunchly atheistic, he was also concerned about the negative effects of nihilism on humanity. As such, he called for a re-evaluation of old values and a creation of new ones, hoping that in doing so humans would achieve a higher state he labeled the Overman.

Atheist feminism also began in the nineteenth century. Atheist feminism is a movement that advocates feminism within atheism.[54] Atheist feminists also oppose religion as a main source of female oppression and inequality, believing that the majority of the religions are sexist and oppressive to women.[55]

Atheism in the twentieth century found recognition in a wide variety of other, broader philosophies in the Western tradition, such as existentialism, Objectivism,[56] secular humanism, nihilism, logical positivism, Marxism, anarchism, feminism,[57] and the general scientific and rationalist movement. Neopositivism and analytical philosophy discarded classical rationalism and metaphysics in favor of strict empiricism and epistemological nominalism. Proponents such as Bertrand Russell emphatically rejected belief in God. In his early work, Ludwig Wittgenstein attempted to separate metaphysical and supernatural language from rational discourse. H. L. Mencken sought to debunk both the idea that science and religion are compatible, and the idea that science is a dogmatic belief system just like any religion.[58]

A. J. Ayer asserted the unverifiability and meaninglessness of religious statements, citing his adherence to the empirical sciences. The structuralism of Lévi-Strauss sourced religious language to the human subconscious, denying its transcendental meaning. J. N. Findlay and J. J. C. Smart argued that the existence of God is not logically necessary. Naturalists and materialists such as John Dewey considered the natural world to be the basis of everything, denying the existence of God or immortality.[59][60]

The historian Geoffrey Blainey wrote that during the twentieth century, atheists in Western societies became more active and even militant, though they often “relied essentially on arguments used by numerous radical Christians since at least the eighteenth century”. They rejected the idea of an interventionist God, and said that Christianity promoted war and violence, though “the most ruthless leaders in the Second World War were atheists and secularists who were intensely hostile to both Judaism and Christianity” and “Later massive atrocities were committed in the East by those ardent atheists, Pol Pot and Mao Zedong”. Some scientists were meanwhile articulating a view that as the world becomes more educated, religion will be superseded.[61]

Often, the state’s opposition to religion took more violent forms; Aleksandr Solzhenitsyn documents widespread persecution, imprisonments and torture of believers in his seminal work The Gulag Archipelago. Consequently, religious organizations, such as the Catholic Church, were among the most stringent opponents of communist regimes. In some cases, the initial strict measures of control and opposition to religious activity were gradually relaxed in communist states. Pope Pius XI followed his encyclicals challenging the new right-wing creeds of Italian Fascism (Non abbiamo bisogno, 1931) and Nazism (Mit brennender Sorge, 1937) with a denunciation of atheist Communism in Divini Redemptoris (1937).[62]

The Russian Orthodox Church, for centuries the strongest of all Orthodox Churches, was suppressed by Russia’s atheists.[63] In 1922, the Soviet regime arrested the Patriarch of the Russian Orthodox Church.[64] The Soviet leaders Vladimir Lenin and Joseph Stalin energetically pursued the persecution of the Church through the 1920s and 1930s. Lenin wrote that every religious idea and every idea of God “is unutterable vileness… of the most dangerous kind, ‘contagion’ of the most abominable kind”.[65] Many priests were killed and imprisoned. Thousands of churches were closed, some turned into hospitals. In 1925 the government founded the League of Militant Atheists to intensify the persecution. The regime only relented in its persecution following the Nazi invasion of the Soviet Union in 1941.[63] Bullock wrote that “A Marxist regime was ‘godless’ by definition, and Stalin had mocked religious belief since his days in the Tiflis seminary”. His assault on the Russian peasantry, wrote Bullock, “had been as much an attack on their traditional religion as on their individual holdings, and the defence of it had played a major part in arousing peasant resistance… ”.[66] In Divini Redemptoris, Pius XI said that atheistic Communism being led by Moscow was aimed at “upsetting the social order and at undermining the very foundations of Christian civilization”:[67]

The central figure in Italian Fascism was the atheist Benito Mussolini.[68] In his early career, Mussolini was a strident opponent of the Church, and the first Fascist programme, written in 1919, had called for the secularization of Church property in Italy.[69] More pragmatic than his German ally Adolf Hitler, Mussolini later moderated his stance, and in office, permitted the teaching of religion in schools and came to terms with the Papacy in the Lateran Treaty.[68] Nevertheless, Non abbiamo bisogno condemned his Fascist movement’s “pagan worship of the State” and “revolution which snatches the young from the Church and from Jesus Christ, and which inculcates in its own young people hatred, violence and irreverence.”[70]

As noted by Steigmann-Gall, in October 1928 Hitler had publicly declared: “We tolerate no one in our ranks who attacks the ideas of Christianity … in fact our movement is Christian.”[71] In contrast to that, Richard J. Evans wrote that “Hitler emphasised again and again his belief that Nazism was a secular ideology founded on modern science. Science, he declared, would easily destroy the last remaining vestiges of superstition […] ‘In the long run’, [Hitler] concluded in July 1941, ‘National Socialism and religion will no longer be able to exist together’ […] The ideal solution would be to leave the religions to devour themselves, without persecutions.’”[72][73] On Steigmann-Gall’s research, Evans says, “Far from being uniformly anti-Christian, Nazism contained a wide variety of religious beliefs, and Steigmann-Gall has performed a valuable service in providing a meticulously documented account of them in all their bizarre variety.”[71]

The majority of Nazis did not leave their churches. Evans wrote that, by 1939, 95% of Germans still called themselves Protestant or Catholic, while 3.5% were gottgläubig and 1.5% atheist. Most in these latter categories were “convinced Nazis who had left their Church at the behest of the Party, which had been trying since the mid 1930s to reduce the influence of Christianity in society”.[74] The majority of the three million Nazi Party members continued to pay their church taxes and register as either Roman Catholic or Evangelical Protestant Christians.[75] “Gottgläubig” (lit. “believers in god”) denoted a non-denominational, Nazified outlook on belief in god, often described as predominantly based on creationist and deistic views.[76] Heinrich Himmler, who himself was fascinated with Germanic paganism,[citation needed] was a strong promoter of the gottgläubig movement and didn’t allow atheists into the SS, arguing that their “refusal to acknowledge higher powers” would be a “potential source of indiscipline”.[77]

Across Eastern Europe following World War Two, the parts of the Nazi empire conquered by the Soviet Red Army, along with Yugoslavia, became one-party Communist states which, like the Soviet Union, were antipathetic to religion. Persecutions of religious leaders followed.[78][79] The Soviet Union ended its truce with the Russian Orthodox Church, and extended its persecutions to the newly Communist Eastern bloc: “In Poland, Hungary, Lithuania and other Eastern European countries, Catholic leaders who were unwilling to be silent were denounced, publicly humiliated or imprisoned by the Communists. Leaders of the national Orthodox Churches in Romania and Bulgaria had to be cautious and submissive”, wrote Blainey.[63] While the churches were generally not treated as severely as they had been in the USSR, nearly all their schools and many of their churches were closed, and they lost their formerly prominent roles in public life. Children were taught atheism, and clergy were imprisoned by the thousands.[80]

Albania under Enver Hoxha became, in 1967, the first (and to date only) formally declared atheist state,[81] going far beyond what most other countries had attempted: completely prohibiting religious observance, and systematically repressing and persecuting adherents. The right to religious practice was restored with the fall of communism in 1991.

Further post-war communist victories in the East saw religion purged by atheist regimes across China, North Korea and much of Indo-China.[80] In 1949, China became a Communist state under the leadership of Mao Zedong’s Communist Party of China. China itself had been a cradle of religious thought since ancient times, being the birthplace of Confucianism and Daoism, with Buddhism having arrived in the first century AD. Under Mao, China became officially atheist, and though some religious practices were permitted to continue under State supervision, religious groups deemed a threat to order have been suppressed – as with Tibetan Buddhism from 1959 and Falun Gong in recent years. Today around two-fifths of the population claim to be nonreligious or atheist.[82] Religious schools and social institutions were closed, foreign missionaries expelled, and local religious practices discouraged.[80] During the Cultural Revolution, Mao instigated “struggles” against the Four Olds: “old ideas, customs, culture, and habits of mind”.[83] In 1999, the Communist Party launched a three-year drive to promote atheism in Tibet, saying intensifying propaganda on atheism is “especially important for Tibet because atheism plays an extremely important role in promoting economic construction, social advancement and socialist spiritual civilization in the region”.[84]

In India, E. V. Ramasami Naicker (Periyar), a prominent atheist leader, fought against Hinduism and the Brahmins for discriminating and dividing people in the name of caste and religion.[85] This was highlighted in 1956 when he made the Hindu god Rama wear a garland made of slippers and made antitheistic statements.[86]

During this period, Christianity in the United States retained its popular appeal, and, wrote Blainey, the country “was the guardian, militarily, of the ‘free world’ and the defender of its religion in the face of militant communism”.[87] During the Cold War, wrote Thomas Aiello, the United States often characterized its opponents as “godless communists”, which tended to reinforce the view that atheists were unreliable and unpatriotic.[88] Against this background, the words “under God” were inserted into the Pledge of Allegiance in 1954,[89] and the national motto was changed from E Pluribus Unum to In God We Trust in 1956. However, there were some prominent atheist activists active at this time. Atheist Vashti McCollum was the plaintiff in a landmark 1948 Supreme Court case (McCollum v. Board of Education) that struck down religious education in U.S. public schools.[90][91] Madalyn Murray O’Hair was perhaps one of the most influential American atheists; she brought forth the 1963 Supreme Court case Murray v. Curlett, which banned compulsory prayer in public schools.[92] Also in 1963 she founded American Atheists, an organization dedicated to defending the civil liberties of atheists and advocating for the complete separation of church and state.[93][94]

The early twenty-first century has continued to see secularism and atheism promoted in the Western world, with the general consensus being that the number of people not affiliated with any particular religion has increased.[95][96] This has been assisted by non-profit organizations such as the Freedom From Religion Foundation in the United States (co-founded by Anne Nicol Gaylor and her daughter, Annie Laurie Gaylor, in 1976 and incorporated nationally in 1978, it promotes the separation of church and state[97][98]), and the Brights movement, which aims to promote public understanding and acknowledgment of the naturalistic worldview.[99] In addition, a large number of accessible antitheist and secularist books, many of which have become bestsellers, have been published by authors such as Sam Harris, Richard Dawkins, Daniel Dennett, Christopher Hitchens, and Victor J. Stenger.[100][101] This period has seen the rise of the New Atheism movement, a label that has been applied, sometimes pejoratively, to outspoken critics of theism.[102] Richard Dawkins also propounds a more visible form of atheist activism which he light-heartedly describes as ‘militant atheism’.[103]

Atheist feminism has also become more prominent in the 2010s. In 2012 the first “Women in Secularism” conference was held.[104] Also, Secular Woman was founded on June 28, 2012 as the first national American organization focused on nonreligious women. The mission of Secular Woman is to amplify the voice, presence, and influence of non-religious women. The atheist feminist movement has also become increasingly focused on fighting sexism and sexual harassment within the atheist movement itself.

In 2013 the first atheist monument on American government property was unveiled at the Bradford County Courthouse in Florida; it is a 1,500-pound granite bench and plinth inscribed with quotes by Thomas Jefferson, Benjamin Franklin, and Madalyn Murray O’Hair.[105][106]

In 2015, Madison, Wisconsin’s common council amended their city’s equal opportunity ordinance, adding atheism as a protected class in the areas of employment, housing, and public accommodations.[107] This makes Madison the first city in America to pass an ordinance protecting atheists.[107]

See more here:

History of atheism – Wikipedia, the free encyclopedia


Rationalism – New World Encyclopedia

 Rationalism  Comments Off on Rationalism – New World Encyclopedia
Jan 202016
 

Rationalism is a broad family of positions in epistemology. Perhaps the best general description of rationalism is the view that there are some distinctive aspects or faculties of the mind that (1) are distinct from passive aspects of the mind such as sense-perceptions and (2) in some way or other constitute a special source (perhaps only a partial source) of knowledge. These distinctive aspects are typically associated or identified with human abilities to engage in mathematics and abstract reasoning, and the knowledge they provide is often seen as of a type that could not have come from other sources. Philosophers who resist rationalism are usually grouped under the heading of empiricists, who are often allied under the claim that all human knowledge comes from experience.

The debate around which the rationalism/empiricism distinction revolves is one of the oldest and most continuous in philosophy. Some of Plato’s most explicit arguments address the topic and it was arguably the central concern of many of the Modern thinkers. Indeed, Kant’s principal works were concerned with “pure” faculties of reason. Contemporary philosophers have advanced and refined the issue, though there are current thinkers who align themselves with either side of the tradition.

It is difficult to identify a major figure in the history of philosophy to whom some rationalist doctrine has not been attributed at some point. One reason for this is that there is no question that humans possess some sort of reasoning ability that allows them to come to know some facts they otherwise wouldn’t (for instance, mathematical facts), and every philosopher has had to acknowledge this fact. Another reason is that the very business of philosophy is to achieve knowledge by using the rational faculties, in contrast to, for instance, mystical approaches to knowledge. Nevertheless, some philosophical figures stand out as attributing even greater significance to reasoning abilities. Three are discussed here: Plato, Descartes, and Kant.

The most famous metaphysical doctrine of the great Greek philosopher Plato is his doctrine of “Forms,” as espoused in The Republic and other dialogues. The Forms are described as being outside of the world as experienced by the senses, but as somehow constituting the metaphysical basis of the world. Exactly how they fulfill this function is generally only gestured at through analogies, though the Timaeus describes the Forms as operating as blueprints for the craftsman of the universe.

The distinctiveness of Plato’s rationalism lies in another aspect of his theory of Forms. Though the common sense position is that the senses are one’s best means of getting in touch with reality, Plato held that human reasoning ability was the one thing that allowed people to approach the Forms, the most fundamental aspects of reality. It is worth pausing to reflect on how radical this idea is: On such a view, philosophical attempts to understand the nature of “good” or “just” are not mere analyses of the concepts people have formed, but rather explorations of eternal things that are responsible for shaping the reality of the sensory world.

The French philosopher René Descartes, whose Meditations on First Philosophy defined the course of much philosophy from then up to the present day, stood near the beginning of the Western European Enlightenment. Impressed by the power of mathematics and the development of the new science, Descartes was confronted with two questions: How was it that people were coming to attain such deep knowledge of the workings of the universe, and how was it that they had spent so long not doing so?

Regarding the latter question, Descartes concluded that people had been misled by putting too much faith in the testimony of their senses. In particular, he thought such a mistake was behind the then-dominant physics of Aristotle. Aristotle and the later Scholastics, in Descartes’ mind, had used their reasoning abilities well enough on the basis of what their senses told them. The problem was that they had chosen the wrong starting point for their inquiries.

By contrast, the advancements in the new science (some of which Descartes could claim for himself) were based in a very different starting point: The “pure light of reason.” In Descartes’ view, God had equipped humans with a faculty that was able to understand the fundamental essence of the two types of substance that made up the world: Intellectual substance (of which minds are instances) and physical substance (matter). Not only did God give people such a faculty, Descartes claimed, but he made them such that, when using the faculty, they are unable to question its deliverances. Not only that, but God left humanity the means to conclude that the faculty was a gift from a non-deceptive omnipotent creator.

In some respects, the German philosopher Immanuel Kant is the paradigm of an anti-rationalist philosopher. A major portion of his central work, the 1781 Critique of Pure Reason, is specifically devoted to attacking rationalist claims to have insight through reason alone into the nature of the soul, the spatiotemporal/causal structure of the universe, and the existence of God. Plato and Descartes are among his most obvious targets.

For instance, in his evaluation of rationalist claims concerning the nature of the soul (the chapter of the Critique entitled “The Paralogisms of Pure Reason”), Kant attempts to diagnose how a philosopher like Descartes could have been tempted into thinking that he could accomplish deep insight into his own nature by thought alone. One of Descartes’ conclusions was that his mind, unlike his body, was utterly simple and so lacked parts. Kant claimed that Descartes mistook a simple experience (the thought, “I think”) for an experience of simplicity. In other words, he saw Descartes as introspecting, being unable to find any divisions within himself, and thereby concluding that he lacked any such divisions and so was simple. But the reason he was unable to find divisions, in Kant’s view, was that by mere thought alone we are unable to find anything.

At the same time, however, Kant was an uncompromising advocate of some key rationalist intuitions. Confronted with the Scottish philosopher David Hume’s claim that the concept of “cause” was merely one of the constant conjunction of resembling entities, Kant insisted that all Hume really accomplished was to prove that the concept of causation could not possibly have its origin in human senses. What the senses cannot provide, Kant claimed, is any notion of necessity, yet a crucial part of our concept of causation is that it is the necessary connection of two entities or events. Kant’s conclusion was that this concept, and others like it, must be a precondition of sensory experience itself.

In his moral philosophy (most famously expounded in his Groundwork for the Metaphysics of Morals), Kant made an even more original claim on behalf of reason. The sensory world, in his view, was merely ideal, in that the spatiotemporal/sensory features of the objects people experience have their being only in humanity’s representations, and so are not features of the objects in themselves. But this means that most everyday concepts are simply inadequate for forming any notion whatsoever of what the world is like apart from our subjective features. By contrast, Kant claimed that there was no parallel reason for thinking that objects in themselves (which include our soul) do not conform to the most basic concepts of our higher faculties. So while those faculties are unable to provide any sort of direct, reliable access to the basic features of reality as envisioned by Plato and Descartes, they and they alone give one the means to at least contemplate what true reality might be like.

In the early part of the twentieth century, a philosophical movement known as Logical Positivism set the ground for a new debate over rationalism. The positivists (whose ranks included Otto Neurath and Rudolf Carnap) claimed that the only meaningful claims were those that could potentially be verified by some set of experiential observations. Their aim was to do away with intellectual traditions that they saw as simply vacuous, including theology and the majority of philosophy, in contrast with science.

As it turned out, the Positivists were unable to explain how all scientific claims were verifiable by experience, thus losing their key motivation (for instance, no set of experiences could verify that all stars are hot, since no set of experiential observations could itself confirm that one had observed all the stars). Nevertheless, their vision retained enough force that later philosophers felt hard-pressed to explain what, if anything, was epistemically distinctive about the non-sensory faculties. One recent defense of rationalism can be found in the work of contemporary philosophers such as Laurence BonJour (the recent developments of the position are, in general, too subtle to be adequately addressed here). Yet the charge was also met by a number of thinkers working in areas related as closely to psychology as to philosophy.

A number of thinkers have argued for something like Kant’s view that people have concepts independently of experience. Indeed, the groundbreaking work of the linguist Noam Chomsky (which he occasionally tied to Descartes) is largely based on the assumption that there is a “universal grammar,” that is, some basic set of linguistic categories and abilities that necessarily underlie all human languages. One task of linguistics, in Chomsky’s view, is to look at a diversity of languages in order to determine what the innate linguistic categories and capacities are.

A similar proposal concerning human beliefs about mentality itself has been advanced by Peter Carruthers. One intuitive view is that each of us comes to attribute mental states to other people only after a long developmental process in which people learn to associate observable phenomena with their own mental states, and thereby come to attribute similar states to others. Yet, Carruthers argues, this view simply cannot account for the speed and complexity of humans’ understanding of others’ psychology at very early ages. The only explanation is that some understanding of mentality is “hard-wired” in the human brain.


New World Encyclopedia writers and editors rewrote and completed the Wikipedia article in accordance with New World Encyclopedia standards. This article abides by the terms of the Creative Commons CC-by-sa 3.0 License (CC-by-sa), which may be used and disseminated with proper attribution. Credit is due under the terms of this license, which can reference both the New World Encyclopedia contributors and the selfless volunteer contributors of the Wikimedia Foundation.


See more here:

Rationalism – New World Encyclopedia

Eugenics – Simple English Wikipedia, the free encyclopedia

 Eugenics  Comments Off on Eugenics – Simple English Wikipedia, the free encyclopedia
Jan 16 2016
 

Eugenics is a social and political philosophy. It tries to influence the way people choose to mate and raise children, with the aim of improving the human species.

Eugenics rests on some basic ideas. The first is that, in genetics, what is true of animals is also true of man. The characteristics of animals are passed on from one generation to the next in heredity, including mental characteristics. For example, the behaviour and mental characteristics of different breeds of dog differ, and all modern breeds are greatly changed from wolves.[1] The breeding and genetics of farm animals show that choosing which animals become the parents of the next generation affects what offspring are born.

Negative eugenics aims to cut out traits that lead to suffering, by preventing people with those traits from reproducing. Positive eugenics aims to produce more healthy and intelligent humans, by persuading people with those traits to have more children.[2]p85 In the past, many ways were proposed for doing this, and even today eugenics means different things to different people. The idea of eugenics is controversial, because in the past it was sometimes used to justify discrimination and injustice against people who were thought to be genetically unhealthy or inferior.

Modern eugenics was first proposed in 1865 by Sir Francis Galton, a British scientist who was the cousin of Charles Darwin.[3] Galton believed that intelligence and talent were hereditary and were passed from parents to their children. Based on this, he thought that people could be bred to be smarter, just as animals were bred to be larger or smaller. Galton thought the best way to do this was to learn more about heredity, and also to tell people that they should only marry people who were smart and strong. Galton chose the name “Eugenics” because it was very similar to the Greek for “well born”.

Galton developed the idea of eugenics throughout his life. He distinguished the two types of eugenics, positive and negative. One problem, which critics brought up, is the difficulty of agreeing on who is a healthy person, genetically speaking, and who is an inferior person. Obviously, opinions might differ.

The rediscovery of the scientific work of Gregor Mendel in 1900 led to modern genetics, and an understanding of how heredity worked. Mendel himself experimented on peas, and found that many characteristics of the pea plants, such as their colour or their height, could be turned on and off through heredity like a switch. For example, his peas could be either yellow or green, one or the other.

When applied to humans, people thought this meant that human characteristics, like being smart or not, could be influenced by heredity.

Another line of thought goes like this. During their evolution, humans were subject to natural selection like any other form of life. On average, healthy and intelligent people had a better chance of reproducing. In modern civilisation, however, it often seems that this process does not apply. Alfred Russel Wallace and Charles Darwin had discussed this very point, with concern.[2]p70 In countries where statistics were collected, those statistics showed that in many cases the poor had more children than the rich. Also, statistics showed that the total population of some great nations was declining.[2]p73 One startling piece of information was produced by research directed by Karl Pearson, the Galton Professor of Eugenics at University College London, and the founder of the Department of Applied Statistics. The finding was that half of each succeeding generation was produced by no more than a quarter of the previous generation, and that quarter was “disproportionately located amongst the dregs of society”.[2]p74

The evolutionary biologist Julian Huxley was also a supporter of eugenics; he argued for it several times by analogy with the selective breeding of farm animals.

The American historian of science Garland Allen commented: “The agricultural analogy appears over and over again as it did in the writings of many American eugenicists”.[4]

Similarly, the American geneticist Charles Davenport was a lifelong promoter of eugenics, and wrote one of its first textbooks.[2][5][6](ch. 3) There is no doubt of the support given to eugenics by professional scientists of solid repute.

In the United States, eugenics became a very popular idea in the early 20th century. People thought it would cure society of all of its problems at the time, like crime and poverty, because they thought that all aspects of human behavior were probably hereditary. Very important scientists and politicians supported eugenics, and most thought it was a very progressive and scientific philosophy.

But some of those who led the eugenics movement used it to justify racism and prejudice. They used eugenics as an excuse to pass laws to restrict immigration from countries that they did not like, saying that the people in them were genetically “unfit”. They also passed laws which said that people of different races could not get married to one another. Most importantly, they passed laws which said that people who were thought to have mental illness or mental disability could be sterilised against their will. Under these laws over 60,000 people were sterilised in the United States between 1907 and the 1970s.

Today we know that interpreting statistics of this type is a complex business, and that many of the studies published early in the 20th century have serious flaws. Nevertheless, what stopped the eugenics movement was not better science. It was the realization, after World War II, of the effects of Nazi policies on race in Germany and other countries occupied during the war.[7] Such war crimes were not, of course, advocated by any eugenicist. All the same, there was a common theme. This theme was the growing interest in the rights of individuals as against the rights of the state.

With the end of the Second World War, forced sterilisation ended in Germany. It continued in the United States until 1974. The main targets were at first those who were ill or who had some physical or mental disabilities. Later on, the focus shifted towards convicted criminals, as well as black people.

A law of the Swiss canton of Vaud, which allowed the forced sterilisation of a certain group of people, was abolished only in 1985. It was replaced by a law at the national level that sets out the circumstances under which people who are unable to consent may be sterilised.

Though there are few people who openly advocate eugenics today, many people wonder what improvements in genetic technologies will mean in the future.

Genetic counselling exists, where parents can get information about their heredity and even prevent the birth of a child if it is at risk of hereditary illness. Some people do not think the issue is so clear, though, and wonder if genetic screening, genetic counselling, and birth control are all just another type of eugenics. Some people wonder if it is bad because it infringes human dignity. Some people oppose eugenics and genetic counselling for religious reasons. The idea of eugenics is controversial today for these reasons.

Much of this concern is misplaced. Genetic counselling is not going to change the genetic composition of the human population to any noticeable extent. More relevant is the developing power to identify, and then to change directly, elements of the human genome (genetic engineering). This does have the potential to change the genetic structure of human populations.

See the article here:
Eugenics – Simple English Wikipedia, the free encyclopedia

Freedom America, Inc.

 Freedom  Comments Off on Freedom America, Inc.
Jan 14 2016
 

Freedom AMERICA Inc. | Call us at 813-600-5314 (Brandon, FL), 863-682-6381 (Lakeland, FL)

We understand how important it is for you to be able to trust your advisor, particularly in the wake of recent highly publicized corporate failures and investment management misdeeds. Regardless of the direction of the stock market and interest rates, it’s important to have a trusted advisor to look after your best interests. That trusted advisor is Jonathan Jackson. He is experienced, responsive and understands your need for integrity and transparency.

RETIREMENT PLANNING: People can no longer rely on Social Security to cover all their retirement needs. Individuals are living longer, health costs are rising, non-traditional retirement plans are being eliminated and the cost of living is constantly increasing. Freedom America Inc. can help you start planning today to safeguard your future retirement needs. At Freedom America Inc., our advisors are thoroughly trained to help our clients avoid unnecessary risks during or before their retirement years. We will help you protect your hard-earned retirement assets in diverse markets and provide you with the lifetime income you will need (while potentially reducing your tax liabilities).

Our goal is to help you stop worrying about your money so that you can fully enjoy your retirement years. To schedule a FREE no-obligation consultation, please call us at: 813-600-5314 (Brandon office) or 863-682-6381 (Lakeland office).


Jonathan Jackson, president of Freedom America Inc., was recently featured on RETIREMENT NEWS TODAY; these informative videos can be viewed on the videos tab.

TAX PLANNING

Protecting What’s Important to You

Contact Us To Attend One Of Our Informative Workshops

For a married couple, the difference between a good Social Security election decision and a poor one is often well over $100,000! What’s At Stake For You?


See the rest here:
Freedom America, Inc.


Problems Associated with Cryonics – Cryonics: Alcor Life …

 Cryonics  Comments Off on Problems Associated with Cryonics – Cryonics: Alcor Life …
Dec 27 2015
 

(and some possible solutions)

When you buy a house, the seller is legally obliged to disclose any known defects. When you review a company’s annual report, it tells you every problem that could affect the corporate share value. Since arrangements for cryopreservation may have a much greater impact on your life than home ownership or stock investments, we feel an ethical obligation to disclose problems that affect cryonics in general and Alcor specifically. We also believe that an organization which admits its problems is more likely to address them than an organization which pretends it has none. Thus full disclosure should encourage, rather than discourage, consumer confidence.

As of 2011, Alcor is nearly 40 years old. Our Patient Care Trust Fund is endowed with more than 7 million dollars and is responsible for the long-term care of over 100 cryopatients. In almost every year since its inception Alcor has enjoyed positive membership growth. We are the largest cryonics organization in the world yet in many respects we are still a startup company. We have fewer than a dozen employees in Scottsdale, Arizona and approximately 20 part-time independent contractors in various locations around the USA, mostly dedicated to emergency standby and rescue efforts. We serve fewer than 1,000 members and the protocols that aid our pursuit of the goal of reversible suspended animation continue to be developed. At the present time the technology required for the realization of our goal far exceeds current technical capabilities. Cryonics will not be comparable with mainstream medicine until our patients can be revived using contemporary technology, and we expect to wait for decades to see this vision fulfilled. Nevertheless, we have made important progress by introducing brain vitrification to improve patient tissue structure preservation.

Alcor shares some of the characteristics of startup companies. The organization is understaffed in some important areas and lacks as much capitalization as would be desired to support maximum growth. Limited resources prevent the organization from hiring as many highly qualified and experienced personnel as desired, and sometimes we have to postpone enhancements to equipment and procedures.

Because Alcor must react quickly to circumstances, it cannot always handle multiple tasks simultaneously. We feel a significant impact if, for example, several members experience legal death in quick succession. A heavy caseload generally means that administrative and even technical development work is postponed while member emergencies take precedence.

On the other hand, Alcor staff believe very strongly in the mission of the organization and are extremely dedicated. Alcor transport team members feel that they are saving lives, and behave accordingly. Most of all, everyone at Alcor is concerned with ensuring the security of the patients who have been cryopreserved for the indefinite future. The organization’s powerful sense of purpose is reinforced by the fact that all Alcor directors and most staff members have made arrangements to be cryopreserved themselves in the future.

Unlike most startups, Alcor is unlikely to fail for financial reasons. Due to the legally independent status of the Patient Care Trust from Alcor, patients can be maintained indefinitely through its portfolio of cash, investments, real estate, and capital equipment. Some wealthy Alcor members have contributed gifts and endowments to help the organization to advance, and in the event of a financial crisis, many of the people who hope ultimately to be cryopreserved would probably provide assistance. In this sense Alcor benefits from its small size, since it maintains an intimate relationship with many members which would be more problematic if our membership was ten times as large.

Inability to Verify Results

When a conventional surgical procedure is successful, usually the patient recovers and is cured. If the same surgical procedure is unsuccessful or a surgeon makes a serious error, the patient may die. These clear outcomes provide prompt feedback for the people involved. A physician may feel deeply satisfied if a life is saved, or may be deeply troubled (and may be sued for malpractice) if errors cause a death that should have been avoidable.

Clear feedback of this type does not exist in cryonics, because the outcome of our procedures will not be known definitively until decades or even a century from now. We have good reason to expect future technologies capable of repairing cellular damage in cryonics patients, but we feel equally certain that if a patient experiences very severe brain damage prior to cryopreservation, repairs may be delayed, may be incomplete, or may be impossible. The dividing line between these positive and negative outcomes cannot be established clearly at this time.

Suppose a patient experiences 30 minutes of warm ischemia (lack of blood flow at near-normal body temperature) after legal death occurs. Will this downtime create damage that is irreversible by any imaginable technology? Probably not. But what if the ischemic interval lasts for an hour or two hours, or a day? We simply don’t know where to draw the line between one patient who is potentially viable, and another who is not.

Of course we can refer to experimental work that has evaluated the injury which occurs when cells are deprived of essential nutrients. These studies provide some guidance regarding the likely damage that a patient may experience, but they still cannot tell us with certainty if future science will be able to reverse that damage.

Another problem afflicting cryonics cases is that many uncontrolled variables prevent us from developing objective criteria to compare one case with another. Consider two examples: one patient receives a rapid initial response and replacement of blood with a chilled preservation solution, but then endures a long transport to Alcor; another suffers hours of warm ischemia after legal death, but is then transported to Alcor quickly.

In the first case, will the long transport time negate the advantage of the rapid initial response and blood replacement? In the second case, will the initial hours of warm ischemia outweigh the advantage of the rapid transport to Alcor? We can make educated guesses, but we cannot answer these questions definitively. We have no certain way of knowing which case will work out better, because we have no evidence and no outcome.

We do have some simple ways to determine if a patient’s circulatory system allows good perfusion with cryoprotectant. Personnel in the operating room will notice if blood clots emerge when perfusion begins. The surface of the brain, visible through burr holes which are created to enable observation, should be pearly white in color. The brain should shrink slightly as water is replaced with cryoprotectant. When perfusion is complete the patient’s features should have acquired a sallow color indicating that cryoprotectant has diffused through the tissues.

These simple observations are helpful, but still the people who work hard to minimize transport time and maximize the rate of cooling can never enjoy the satisfying payoff that a physician receives when one of his patients recovers and returns to a normal, active life. This lack of positive outcome can cause feelings of frustration and futility, sometimes leading to disillusionment and burnout.

Conversely, if a case goes badly, team members will be protected from negative feedback. A team leader can never say to one of the personnel, “Because of your error, the patient has no chance of recovery.”

The lack of a clear outcome also prevents us from refuting people who claim that future science will be able to undo almost any degree of damage. The danger of this extreme positive thinking is that it can lead to laziness. Why bother to make heroic efforts to minimize injury, if nanotechnology will fix everything?

Alcor’s stated policy firmly rejects this attitude. Team members are very highly motivated to minimize injury because we believe that our members should not bet their lives on unknown capabilities of future science. Alcor generally hosts a debriefing after each case, encouraging all participants to share complaints, frustrations, and suggestions for improvement. Ideally, each case should be a learning experience, and participants should welcome criticism as an opportunity to identify weaknesses and overcome them in the future.

Still the lack of a clear outcome remains one of the biggest weaknesses in cryonics, since it encourages complacency and prevents accountability. The antidote to this problem is a better set of objective criteria to evaluate cases, and Alcor is working in consultation with brain ischemia experts to develop such criteria.

Volunteer Help

During the 1960s the first cryonics organizations were run entirely by volunteers. The field was not sufficiently reputable to attract qualified medical staff, and no one could have paid for professional help anyway.

Today cryonics is making a transition to professionalism, but financial limitations are prolonging the process. Some paramedics are associated with Alcor, and we hope for more in the future. We have an MD medical director, access to three contract surgeons, access to a hospice nurse, and assistance from an ischemia research laboratory in California where staff has extensive experience in relevant procedures such as vascular cannulation and perfusion. Alcor also communicates with a cryobiology laboratory that has made the most important advances in organ preservation during the past decade. Still, most transport team members who work remotely from the facility are volunteers who receive a week or two of training and modest payment for their work.

In the future, as Alcor becomes more financially secure and is able to offer higher salaries, the organization will attract more medical professionals. At this time, the transition is incomplete.

Limited Support from Mainstream Science

In the 1960s scientists in mainstream laboratories investigated techniques to cryopreserve whole organs. By the end of the 1970s most of this work had ended, and the field of cryobiology separated itself very emphatically from cryonics. The Society for Cryobiology has discouraged scientists from doing work that could advance cryonics, and has adopted a bylaw that threatens to expel any member who practices or promotes cryonics. Consequently the few scientists who are willing to do cryonics-related research live in fear of being excluded from the scientific specialty that is most relevant to their work.

The rift between cryonics and cryobiology may have been caused initially by fears among mainstream scientists that cryonics had a “tabloid journalism” flavor incompatible with science. In addition many scientists have been dissatisfied with the idea of applying procedures without a complete and full understanding of their outcome. Generally, in medicine, first a technique is studied, validated, and perfected, and then it is applied clinically. Cryonics has, of necessity, done an end-run around this formal approach by rushing to apply a technique based on theoretical arguments rather than validated clinical effectiveness.

During the past decade our knowledge and procedures have advanced far beyond the crude freezing methods imagined by most cryobiologists, and experts in molecular nanotechnology have voiced strong support. As more papers are published describing technical advances, we expect that cryobiologists and other scientists will revise their negative assessment of cryonics. In the future we believe that the arbitrary barrier between cryonics and cryobiology will gradually dissolve, and cryonics research will be recognized as a legitimate specialty of the field. However, for the time being the dim view taken of cryonics by most cryobiologists remains problematic, impairing Alcor’s ability to achieve respectable status among other relevant groups such as prospective members, regulatory officials, and legislators.

Limited Legal and Government Support

Cryonics is not explicitly recognized in the laws of any state in the United States (see The Legal Status of Cryonics Patients). This does not mean that cryonics is illegal or unregulated. In fact, Alcor must comply with state laws controlling the transport and disposition of human remains, and we make arrangements with licensed morticians to insure that these requirements are met. Alcor also complies with federal regulations established by agencies such as OSHA and EPA.

Still, the lack of specific enabling legislation for cryonics can cause problems. In the late 1980s the California Department of Health Services (DHS) asserted that because there was no statutory procedure for becoming a cryonics organization, human remains could not be conveyed to a cryonics organization via the Uniform Anatomical Gift Act (UAGA), and therefore cryonics was illegal. Fortunately, the courts were unimpressed by this argument. In 1992 the legality of cryonics, and the legality of using the UAGA for cryonics, were upheld at the appellate level.

In 1990 the Canadian province of British Columbia enacted a law that specifically banned the sale of cryonics services in that province. In 2002 the Solicitor General (Canadian equivalent of a state Attorney General) issued a written clarification stating that the law only prohibited funeral homes from selling cryonics arrangements. Cryonics could still be performed in the province, even with the paid assistance of funeral homes, provided they were not involved in the direct sale of cryonics. This position is affirmed by the Business Practices and Consumer Protection Authority of British Columbia. Despite these assurances, anxiety about the law remains.

In 2004 a bill was passed by the Arizona House of Representatives to place cryonics and cryonics procedures under the regulation of the state funeral board. In its original form this law would have prevented our use of the UAGA. The bill was ultimately withdrawn, but may be revived at a later date. Very hostile comments were made about cryonics during the floor debate of this bill. We cannot guarantee that any future legislation will be friendly to cryonics or will permit cryonics to continue in Arizona.

Despite these uncertainties, the United States enjoys a strong cultural tradition to honor the wishes of terminal patients. We believe that the freedom to choose cryonics is constitutionally protected, and so far courts have agreed. We are hopeful that we will be able to continue performing cryonics without technical compromise, under state supervision where necessary, for the indefinite future.

Limited Mainstream Medical Support

Cryonics is not an accepted or recognized “therapy” in the general medical community. To the average medical professional, cryonics is at best an unusual anatomical donation. At worst it can be viewed by some physicians as fraud upon their patient. Hospitals have sometimes deliberately delayed pronouncement of legal death, delayed release of patients to Alcor, or forbidden the use of cryonics life support equipment or medications within their facilities. On one occasion in 1988 Alcor had to obtain a court order to compel a hospital to release a patient to Alcor promptly at legal death and permit our stabilization procedures on their premises.

Relations with hospitals and their staff are not always difficult. Usually when nurses and physicians learn that cryonics is a sincere practice that is overseen by other medical professionals, they will be willing to accommodate a patient’s wishes, or at least will not interfere with them. Sometimes medical staff will even assist with cryonics procedures such as administering medications and performing chest compressions if Alcor personnel are not present when legal death occurs.

The lack of formal medical recognition or support for cryonics generally means that cryonics patients remote from Alcor must be moved to a mortuary for blood replacement before transport to Alcor. Ideally these preparatory procedures should be performed within hospitals, not mortuaries. Hospitals presently allow organ procurement personnel to harvest organs from deceased patients (a fairly elaborate procedure) within their walls. We are hopeful that similar privileges will be extended to cryonics more often as the process becomes better understood and accepted, but we cannot predict how quickly this change will occur.

High Incidence of Poor Cases

In more than 50 percent of cryonics cases legal death occurs before Alcor standby personnel can be deployed, and is often followed by hours of warm ischemia. This downtime may cause severe cellular damage.

The threat of autopsy, in which the brain is routinely dissected, is an even greater danger. Any person who suffers legal death under unexpected circumstances, especially involving accidents or foul play, is liable to be autopsied. Alcor strongly urges members living in California, Maryland, New Jersey, New York, and Ohio to sign Religious Objection to Autopsy forms.

Sometimes cryonicists perish under circumstances resulting in complete destruction or disappearance of their remains. Cryonicists have been lost at sea, suffered misadventures abroad, or even disappeared without a trace. Two members of cryonics organizations were lost in the 2001 collapse of the World Trade Center towers. One was a policeman performing rescue operations.

Cryonics is not a panacea or a “cure” for death. The cryonics ideal of immediate cooling and cardiopulmonary support following cardiac arrest cannot be achieved in the majority of cases. We have good reasons to believe that molecular records of memory persist in the brain even after hours of clinical death, but only future physicians using medical technology which we do not yet possess will be able to determine, finally, whether such a person is really still “there.”

What can be done?

If you are in a position to help, then please contact us, or check out our volunteer opportunities.

Read the rest here:
Problems Associated with Cryonics – Cryonics: Alcor Life …


Molecular nanotechnology – Wikipedia, the free encyclopedia

 Nano Technology  Comments Off on Molecular nanotechnology – Wikipedia, the free encyclopedia
Dec 15 2015
 

Molecular nanotechnology (MNT) is a technology based on the ability to build structures to complex, atomic specifications by means of mechanosynthesis.[1] This is distinct from nanoscale materials. Based on Richard Feynman’s vision of miniature factories using nanomachines to build complex products (including additional nanomachines), this advanced form of nanotechnology (or molecular manufacturing[2]) would make use of positionally-controlled mechanosynthesis guided by molecular machine systems. MNT would involve combining physical principles demonstrated by biophysics, chemistry, other nanotechnologies, and the molecular machinery of life with the systems engineering principles found in modern macroscale factories.

While conventional chemistry uses inexact processes obtaining inexact results, and biology exploits inexact processes to obtain definitive results, molecular nanotechnology would employ original definitive processes to obtain definitive results. The desire in molecular nanotechnology would be to balance molecular reactions in positionally-controlled locations and orientations to obtain desired chemical reactions, and then to build systems by further assembling the products of these reactions.

A roadmap for the development of MNT is an objective of a broadly based technology project led by Battelle (the manager of several U.S. National Laboratories) and the Foresight Institute.[3] The roadmap was originally scheduled for completion by late 2006, but was released in January 2008.[4] The Nanofactory Collaboration[5] is a more focused ongoing effort involving 23 researchers from 10 organizations and 4 countries that is developing a practical research agenda[6] specifically aimed at positionally-controlled diamond mechanosynthesis and diamondoid nanofactory development. In August 2005, a task force consisting of 50+ international experts from various fields was organized by the Center for Responsible Nanotechnology to study the societal implications of molecular nanotechnology.[7]

One proposed application of MNT is so-called smart materials. This term refers to any sort of material designed and engineered at the nanometer scale for a specific task. It encompasses a wide variety of possible commercial applications. One example would be materials designed to respond differently to various molecules; such a capability could lead, for example, to artificial drugs which would recognize and render inert specific viruses. Another is the idea of self-healing structures, which would repair small tears in a surface naturally in the same way as self-sealing tires or human skin.

An MNT nanosensor would resemble a smart material, involving a small component within a larger machine that would react to its environment and change in some fundamental, intentional way. A very simple example: a photosensor might passively measure the incident light and discharge its absorbed energy as electricity when the light passes above or below a specified threshold, sending a signal to a larger machine. Such a sensor would supposedly cost less and use less power than a conventional sensor, and yet function usefully in all the same applications; for example, turning on parking lot lights when it gets dark.
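A minimal sketch of that threshold behaviour, written in ordinary Python rather than anything nanoscale: the sensor reports a signal only when the measured light crosses a set level, and a larger controller acts on that signal. The threshold value and names below are illustrative assumptions for the parking-lot example, not part of any MNT proposal.

# Illustrative sketch only: threshold-triggered sensing, as described above.
# DUSK_THRESHOLD_LUX is a hypothetical value chosen for the parking-lot example.
DUSK_THRESHOLD_LUX = 10.0

def sensor_signal(incident_light_lux: float, threshold: float = DUSK_THRESHOLD_LUX) -> bool:
    # Return True when incident light falls below the threshold,
    # i.e. when the larger machine should switch the lights on.
    return incident_light_lux < threshold

# The larger machine polls the sensor and reacts to its signal.
if sensor_signal(4.2):
    print("turn the parking lot lights on")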

While smart materials and nanosensors both exemplify useful applications of MNT, they pale in comparison with the complexity of the technology most popularly associated with the term: the replicating nanorobot.

MNT nanofacturing is popularly linked with the idea of swarms of coordinated nanoscale robots working together, a popularization of an early proposal by K. Eric Drexler in his 1986 discussions of MNT, but superseded in 1992. In this early proposal, sufficiently capable nanorobots would construct more nanorobots in an artificial environment containing special molecular building blocks.

Critics have doubted both the feasibility of self-replicating nanorobots and the feasibility of control if self-replicating nanorobots could be achieved: they cite the possibility of mutations removing any control and favoring reproduction of mutant pathogenic variations. Advocates address the first doubt by pointing out that the first macroscale autonomous machine replicator, made of Lego blocks, was built and operated experimentally in 2002.[8] While there are sensory advantages present at the macroscale compared to the limited sensorium available at the nanoscale, proposals for positionally controlled nanoscale mechanosynthetic fabrication systems employ dead reckoning of tooltips combined with reliable reaction sequence design to ensure reliable results, hence a limited sensorium is no handicap; similar considerations apply to the positional assembly of small nanoparts. Advocates address the second doubt by arguing that bacteria are (of necessity) evolved to evolve, while nanorobot mutation could be actively prevented by common error-correcting techniques. Similar ideas are advocated in the Foresight Guidelines on Molecular Nanotechnology,[9] and a map of the 137-dimensional replicator design space[10] recently published by Freitas and Merkle provides numerous proposed methods by which replicators could, in principle, be safely controlled by good design.
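As a purely illustrative sketch of what "common error-correcting techniques" can mean in this context (the blueprint string, hash choice, and function names below are hypothetical, not drawn from any published replicator design): a replicator that verifies its blueprint against a stored checksum before copying cannot silently propagate a mutated copy.

# Illustrative sketch only: verify a blueprint before replicating it,
# so random copying errors are detected rather than inherited.
import hashlib

def checksum(blueprint: bytes) -> str:
    # Digest of the blueprint; any copying error changes the digest.
    return hashlib.sha256(blueprint).hexdigest()

def replicate(blueprint: bytes, expected: str) -> bytes:
    # Refuse to copy a corrupted blueprint: mutation is blocked, not propagated.
    if checksum(blueprint) != expected:
        raise ValueError("blueprint corrupted; replication aborted")
    return bytes(blueprint)

original = b"ASSEMBLY-INSTRUCTIONS-V1"   # hypothetical blueprint
digest = checksum(original)
copy = replicate(original, digest)       # faithful copy succeeds
mutated = original.replace(b"V1", b"V2")
# replicate(mutated, digest) would raise an error: the mutation is caught.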

However, the concept of suppressing mutation raises the question: How can design evolution occur at the nanoscale without a process of random mutation and deterministic selection? Critics argue that MNT advocates have not provided a substitute for such a process of evolution in this nanoscale arena where conventional sensory-based selection processes are lacking. The limits of the sensorium available at the nanoscale could make it difficult or impossible to winnow successes from failures. Advocates argue that design evolution should occur deterministically and strictly under human control, using the conventional engineering paradigm of modeling, design, prototyping, testing, analysis, and redesign.

In any event, since 1992 technical proposals for MNT do not include self-replicating nanorobots, and recent ethical guidelines put forth by MNT advocates prohibit unconstrained self-replication.[9][11]

One of the most important applications of MNT would be medical nanorobotics or nanomedicine, an area pioneered by Robert Freitas in numerous books[12] and papers.[13] The ability to design, build, and deploy large numbers of medical nanorobots would, at a minimum, make possible the rapid elimination of disease and the reliable and relatively painless recovery from physical trauma. Medical nanorobots might also make possible the convenient correction of genetic defects, and help to ensure a greatly expanded lifespan. More controversially, medical nanorobots might be used to augment natural human capabilities.

Another proposed application of molecular nanotechnology is “utility fog”[14] in which a cloud of networked microscopic robots (simpler than assemblers) would change its shape and properties to form macroscopic objects and tools in accordance with software commands. Rather than modify the current practices of consuming material goods in different forms, utility fog would simply replace many physical objects.

Yet another proposed application of MNT would be phased-array optics (PAO).[15] However, this appears to be a problem addressable by ordinary nanoscale technology. PAO would use the principle of phased-array millimeter technology but at optical wavelengths. This would permit any sort of optical effect to be duplicated virtually. Users could request holograms, sunrises and sunsets, or floating lasers as the mood strikes. PAO systems were described in BC Crandall’s Nanotechnology: Molecular Speculations on Global Abundance in the Brian Wowk article “Phased-Array Optics.”[16]

Nanotechnology (or molecular nanotechnology to refer more specifically to the goals discussed here) will let us continue the historical trends in manufacturing right up to the fundamental limits imposed by physical law. It will let us make remarkably powerful molecular computers. It will let us make materials over fifty times lighter than steel or aluminium alloy but with the same strength. We’ll be able to make jets, rockets, cars or even chairs that, by today’s standards, would be remarkably light, strong, and inexpensive. Molecular surgical tools, guided by molecular computers and injected into the blood stream could find and destroy cancer cells or invading bacteria, unclog arteries, or provide oxygen when the circulation is impaired.

Nanotechnology will replace our entire manufacturing base with a new, radically more precise, radically less expensive, and radically more flexible way of making products. The aim is not simply to replace today’s computer chip making plants, but also to replace the assembly lines for cars, televisions, telephones, books, surgical tools, missiles, bookcases, airplanes, tractors, and all the rest. The objective is a pervasive change in manufacturing, a change that will leave virtually no product untouched. Economic progress and military readiness in the 21st Century will depend fundamentally on maintaining a competitive position in nanotechnology.[17]

Despite the current early developmental status of nanotechnology and molecular nanotechnology, much concern surrounds MNT’s anticipated impact on economics[18][19] and on law. Whatever the exact effects, MNT, if achieved, would tend to reduce the scarcity of manufactured goods and make many more goods (such as food and health aids) manufacturable.

It is generally considered[by whom?] that future citizens of a molecular-nanotechnological society would still need money, in the form of unforgeable digital cash or physical specie[20] (in special circumstances). They might use such money to buy goods and services that are unique, or limited within the solar system. These might include: matter, energy, information, real estate, design services, entertainment services, legal services, fame, political power, or the attention of other people to one’s political/religious/philosophical message. Furthermore, futurists must consider war, even between prosperous states, and non-economic goals.

If MNT were realized, some resources would remain limited, because unique physical objects are limited (a plot of land in the real Jerusalem, mining rights to the larger near-earth asteroids) or because they depend on the goodwill of a particular person (the love of a famous person, a live audience in a musical concert). Demand will always exceed supply for some things, and a political economy may continue to exist in any case. Whether the interest in these limited resources would diminish with the advent of virtual reality, where they could be easily substituted, is yet unclear. One reason why it might not is a hypothetical preference for “the real thing”, although such an opinion could easily be mollified if virtual reality were to develop to a certain level of quality.

MNT should make possible nanomedical capabilities able to cure any medical condition not already cured by advances in other areas. Good health would be common, and poor health of any form would be as rare as smallpox and scurvy are today. Even cryonics would be feasible, as cryopreserved tissue could be fully repaired.

Molecular nanotechnology is one of the technologies that some analysts believe could lead to a Technological Singularity. Some feel that molecular nanotechnology would have daunting risks.[21] It conceivably could enable cheaper and more destructive conventional weapons. Also, molecular nanotechnology might permit weapons of mass destruction that could self-replicate, as viruses and cancer cells do when attacking the human body. Commentators generally agree that, in the event molecular nanotechnology were developed, its self-replication should be permitted only under very controlled or “inherently safe” conditions.

A fear exists that nanomechanical robots, if achieved, and if designed to self-replicate using naturally occurring materials (a difficult task), could consume the entire planet in their hunger for raw materials,[22] or simply crowd out natural life, out-competing it for energy (as happened historically when blue-green algae appeared and outcompeted earlier life forms). Some commentators have referred to this situation as the “grey goo” or “ecophagy” scenario. K. Eric Drexler considers an accidental “grey goo” scenario extremely unlikely and says so in later editions of Engines of Creation.

In light of this perception of potential danger, the Foresight Institute (founded by K. Eric Drexler to prepare for the arrival of future technologies) has drafted a set of guidelines[23] for the ethical development of nanotechnology. These include the banning of free-foraging self-replicating pseudo-organisms on the Earth’s surface, at least, and possibly in other places.

The feasibility of the basic technologies analyzed in Nanosystems has been the subject of a formal scientific review by the U.S. National Academy of Sciences, and has also been the focus of extensive debate on the internet and in the popular press.

In 2006, the U.S. National Academy of Sciences released the report of a study of molecular manufacturing as part of a longer report, A Matter of Size: Triennial Review of the National Nanotechnology Initiative.[24] The study committee reviewed the technical content of Nanosystems, and in its conclusion states that no current theoretical analysis can be considered definitive regarding several questions of potential system performance, and that optimal paths for implementing high-performance systems cannot be predicted with confidence. It recommends experimental research to advance knowledge in this area.

A section heading in Drexler’s Engines of Creation reads[25] “Universal Assemblers”, and the following text speaks of multiple types of assemblers which, collectively, could hypothetically “build almost anything that the laws of nature allow to exist.” Drexler’s colleague Ralph Merkle has noted that, contrary to widespread legend,[26] Drexler never claimed that assembler systems could build absolutely any molecular structure. The endnotes in Drexler’s book explain the qualification “almost”: “For example, a delicate structure might be designed that, like a stone arch, would self-destruct unless all its pieces were already in place. If there were no room in the design for the placement and removal of a scaffolding, then the structure might be impossible to build. Few structures of practical interest seem likely to exhibit such a problem, however.”

In 1992, Drexler published Nanosystems: Molecular Machinery, Manufacturing, and Computation,[27] a detailed proposal for synthesizing stiff covalent structures using a table-top factory. Diamondoid structures and other stiff covalent structures, if achieved, would have a wide range of possible applications, going far beyond current MEMS technology. An outline of a path was put forward in 1992 for building a table-top factory in the absence of an assembler. Other researchers have begun advancing tentative, alternative proposed paths [5] for this in the years since Nanosystems was published.

In 2004 Richard Jones wrote Soft Machines (nanotechnology and life), a book for lay audiences published by Oxford University Press. In this book he describes radical nanotechnology (as advocated by Drexler) as a deterministic/mechanistic idea of nano-engineered machines that does not take into account nanoscale challenges such as wetness, stickiness, Brownian motion, and high viscosity. He also explains what soft nanotechnology, or more appropriately biomimetic nanotechnology, is, and argues that it is the way forward, if not the best way, to design functional nanodevices that can cope with all the problems at the nanoscale. One can think of soft nanotechnology as the development of nanomachines that use the lessons learned from biology about how things work, chemistry to engineer such devices precisely, and stochastic physics to model the system and its natural processes in detail.

Several researchers, including Nobel Prize winner Dr. Richard Smalley (1943-2005),[28] attacked the notion of universal assemblers, leading to a rebuttal from Drexler and colleagues,[29] and eventually to an exchange of letters.[30] Smalley argued that chemistry is extremely complicated, reactions are hard to control, and that a universal assembler is science fiction. Drexler and colleagues, however, noted that Drexler never proposed universal assemblers able to make absolutely anything, but instead proposed more limited assemblers able to make a very wide variety of things. They challenged the relevance of Smalley’s arguments to the more specific proposals advanced in Nanosystems. Also, Smalley argued that nearly all of modern chemistry involves reactions that take place in a solvent (usually water), because the small molecules of a solvent contribute many things, such as lowering binding energies for transition states. Since nearly all known chemistry requires a solvent, Smalley felt that Drexler’s proposal to use a high vacuum environment was not feasible. However, Drexler addresses this in Nanosystems by showing mathematically that well designed catalysts can provide the effects of a solvent and can fundamentally be made even more efficient than a solvent/enzyme reaction could ever be. It is noteworthy that, contrary to Smalley’s opinion that enzymes require water, “Not only do enzymes work vigorously in anhydrous organic media, but in this unnatural milieu they acquire remarkable properties such as greatly enhanced stability, radically altered substrate and enantiomeric specificities, molecular memory, and the ability to catalyse unusual reactions” (“Enzymatic catalysis in anhydrous organic solvents”, April 1989).

For the future, some means have to be found for MNT design evolution at the nanoscale which mimics the process of biological evolution at the molecular scale. Biological evolution proceeds by random variation in ensemble averages of organisms combined with culling of the less-successful variants and reproduction of the more-successful variants, and macroscale engineering design also proceeds by a process of design evolution from simplicity to complexity as set forth somewhat satirically by John Gall: “A complex system that works is invariably found to have evolved from a simple system that worked. . . . A complex system designed from scratch never works and can not be patched up to make it work. You have to start over, beginning with a system that works.” [31] A breakthrough in MNT is needed which proceeds from the simple atomic ensembles which can be built with, e.g., an STM to complex MNT systems via a process of design evolution. A handicap in this process is the difficulty of seeing and manipulating at the nanoscale compared to the macroscale, which makes deterministic selection of successful trials difficult; in contrast, biological evolution proceeds via the action of what Richard Dawkins has called the “blind watchmaker”[32], comprising random molecular variation and deterministic reproduction/extinction.
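The variation-and-selection loop described above can be written down compactly. The sketch below is only an illustration of the general idea in ordinary Python; the bit-string "designs", population size, mutation rate, and fitness function are stand-in assumptions, and the fitness evaluation is precisely the step that becomes hard when nanoscale observation and manipulation are limited.

# Illustrative sketch only: random variation plus deterministic culling,
# with hypothetical parameters standing in for real design evaluation.
import random

DESIGN_BITS = 32      # a "design" is just a bit string in this toy model
POP_SIZE = 20
MUTATION_RATE = 0.02
GENERATIONS = 100

def fitness(design):
    # Stand-in objective (count of set bits); evaluating a real nanoscale design
    # is exactly the step that limited observation makes difficult.
    return sum(design)

def mutate(design):
    # Random variation: each bit flips independently with small probability.
    return [bit ^ (random.random() < MUTATION_RATE) for bit in design]

population = [[random.randint(0, 1) for _ in range(DESIGN_BITS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP_SIZE // 2]   # deterministic culling of the less fit
    offspring = [mutate(random.choice(survivors)) for _ in range(POP_SIZE - len(survivors))]
    population = survivors + offspring

print("best fitness after selection:", fitness(max(population, key=fitness)))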

At present in 2007 the practice of nanotechnology embraces both stochastic approaches (in which, for example, supramolecular chemistry creates waterproof pants) and deterministic approaches wherein single molecules (created by stochastic chemistry) are manipulated on substrate surfaces (created by stochastic deposition methods) by deterministic methods comprising nudging them with STM or AFM probes and causing simple binding or cleavage reactions to occur. The dream of a complex, deterministic molecular nanotechnology remains elusive. Since the mid-1990s, thousands of surface scientists and thin film technocrats have latched on to the nanotechnology bandwagon and redefined their disciplines as nanotechnology. This has caused much confusion in the field and has spawned thousands of “nano”-papers in the peer-reviewed literature. Most of these reports are extensions of the more ordinary research done in the parent fields.

The feasibility of Drexler’s proposals largely depends, therefore, on whether designs like those in Nanosystems could be built in the absence of a universal assembler to build them and would work as described. Supporters of molecular nanotechnology frequently claim that no significant errors have been discovered in Nanosystems since 1992. Even some critics concede[33] that “Drexler has carefully considered a number of physical principles underlying the ‘high level’ aspects of the nanosystems he proposes and, indeed, has thought in some detail” about some issues.

Other critics claim, however, that Nanosystems omits important chemical details about the low-level ‘machine language’ of molecular nanotechnology.[34][35][36][37] They also claim that much of the other low-level chemistry in Nanosystems requires extensive further work, and that Drexler’s higher-level designs therefore rest on speculative foundations. Recent such further work by Freitas and Merkle [38] is aimed at strengthening these foundations by filling the existing gaps in the low-level chemistry.

Drexler argues that we may need to wait until our conventional nanotechnology improves before solving these issues: “Molecular manufacturing will result from a series of advances in molecular machine systems, much as the first Moon landing resulted from a series of advances in liquid-fuel rocket systems. We are now in a position like that of the British Interplanetary Society of the 1930s which described how multistage liquid-fueled rockets could reach the Moon and pointed to early rockets as illustrations of the basic principle.”[39] However, Freitas and Merkle argue [40] that a focused effort to achieve diamond mechanosynthesis (DMS) can begin now, using existing technology, and might achieve success in less than a decade if their “direct-to-DMS approach is pursued rather than a more circuitous development approach that seeks to implement less efficacious nondiamondoid molecular manufacturing technologies before progressing to diamondoid”.

To summarize the arguments against feasibility: First, critics argue that a primary barrier to achieving molecular nanotechnology is the lack of an efficient way to create machines on a molecular/atomic scale, especially in the absence of a well-defined path toward a self-replicating assembler or diamondoid nanofactory. Advocates respond that a preliminary research path leading to a diamondoid nanofactory is being developed.[6]

A second difficulty in reaching molecular nanotechnology is design. Hand design of a gear or bearing at the level of atoms might take a few to several weeks. While Drexler, Merkle and others have created designs of simple parts, no comprehensive design effort for anything approaching the complexity of a Model T Ford has been attempted. Advocates respond that it is difficult to undertake a comprehensive design effort in the absence of significant funding for such efforts, and that despite this handicap much useful design-ahead has nevertheless been accomplished with new software tools that have been developed, e.g., at Nanorex.[41]

In the latest report A Matter of Size: Triennial Review of the National Nanotechnology Initiative[24] put out by the National Academies Press in December 2006 (roughly twenty years after Engines of Creation was published), no clear way forward toward molecular nanotechnology could yet be seen, as per the conclusion on page 108 of that report: “Although theoretical calculations can be made today, the eventually attainable range of chemical reaction cycles, error rates, speed of operation, and thermodynamic efficiencies of such bottom-up manufacturing systems cannot be reliably predicted at this time. Thus, the eventually attainable perfection and complexity of manufactured products, while they can be calculated in theory, cannot be predicted with confidence. Finally, the optimum research paths that might lead to systems which greatly exceed the thermodynamic efficiencies and other capabilities of biological systems cannot be reliably predicted at this time. Research funding that is based on the ability of investigators to produce experimental demonstrations that link to abstract models and guide long-term vision is most appropriate to achieve this goal.” This call for research leading to demonstrations is welcomed by groups such as the Nanofactory Collaboration who are specifically seeking experimental successes in diamond mechanosynthesis.[42] The “Technology Roadmap for Productive Nanosystems”[43] aims to offer additional constructive insights.

It is perhaps interesting to ask whether or not most structures consistent with physical law can in fact be manufactured. Advocates assert that to achieve most of the vision of molecular manufacturing it is not necessary to be able to build "any structure that is compatible with natural law." Rather, it is necessary to be able to build only a sufficient (possibly modest) subset of such structures, as is true, in fact, of any practical manufacturing process used in the world today, and is true even in biology. In any event, as Richard Feynman once said, "It is scientific only to say what's more likely or less likely, and not to be proving all the time what's possible or impossible."[44]

There is a growing body of peer-reviewed theoretical work on synthesizing diamond by mechanically removing/adding hydrogen atoms [45] and depositing carbon atoms [46][47][48][49][50][51] (a process known as mechanosynthesis). This work is slowly permeating the broader nanoscience community and is being critiqued. For instance, Peng et al. (2006)[52] (in the continuing research effort by Freitas, Merkle and their collaborators) reports that the most-studied mechanosynthesis tooltip motif (DCB6Ge) successfully places a C2 carbon dimer on a C(110) diamond surface at both 300 K (room temperature) and 80 K (liquid nitrogen temperature), and that the silicon variant (DCB6Si) also works at 80 K but not at 300 K. Over 100,000 CPU hours were invested in this latest study. The DCB6 tooltip motif, initially described by Merkle and Freitas at a Foresight Conference in 2002, was the first complete tooltip ever proposed for diamond mechanosynthesis and remains the only tooltip motif that has been successfully simulated for its intended function on a full 200-atom diamond surface.

The tooltips modeled in this work are intended to be used only in carefully controlled environments (e.g., vacuum). Maximum acceptable limits for tooltip translational and rotational misplacement errors are reported in Peng et al. (2006) — tooltips must be positioned with great accuracy to avoid bonding the dimer incorrectly. Peng et al. (2006) reports that increasing the handle thickness from 4 support planes of C atoms above the tooltip to 5 planes decreases the resonance frequency of the entire structure from 2.0 THz to 1.8 THz. More importantly, the vibrational footprints of a DCB6Ge tooltip mounted on a 384-atom handle and of the same tooltip mounted on a similarly constrained but much larger 636-atom "crossbar" handle are virtually identical in the non-crossbar directions. Additional computational studies modeling still bigger handle structures are welcome, but the ability to precisely position SPM tips to the requisite atomic accuracy has been repeatedly demonstrated experimentally at low temperature,[53][54] or even at room temperature,[55][56] constituting a basic existence proof for this capability.
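As a rough consistency check (a simplifying assumption of this summary, not a calculation from Peng et al.), the reported frequency drop is about what a naive mass-loaded oscillator model would predict: if the handle stiffness stays roughly fixed while the vibrating mass grows in proportion to the number of support planes, the resonance frequency falls as the inverse square root of the mass.

    import math

    # Illustrative back-of-envelope estimate (an assumption, not from the cited study):
    # treat the handle as a harmonic oscillator, f = (1 / (2*pi)) * sqrt(k / m).
    # With k roughly fixed and m proportional to the number of support planes,
    # f scales as 1 / sqrt(planes).
    f_four_planes_thz = 2.0
    f_five_planes_est = f_four_planes_thz * math.sqrt(4 / 5)
    print(f"{f_five_planes_est:.2f} THz")   # ~1.79 THz, close to the reported 1.8 THz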

Further research[57] to consider additional tooltips will require time-consuming computational chemistry and difficult laboratory work.

A working nanofactory would require a variety of well-designed tips for different reactions, and detailed analyses of placing atoms on more complicated surfaces. Although this appears to be a challenging problem given current resources, many tools will be available to help future researchers: Moore's Law predicts further increases in computer power, semiconductor fabrication techniques continue to approach the nanoscale, and researchers grow ever more skilled at using proteins, ribosomes and DNA to perform novel chemistry.
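To give the computing-power point some scale, the sketch below applies the commonly quoted doubling period of roughly two years (an assumed figure, since the text does not specify one) to show how quickly simulation budgets like the 100,000 CPU hours mentioned above become cheaper to repeat.

    # Illustrative Moore's-Law arithmetic, assuming a ~2-year doubling period.
    doubling_period_years = 2.0
    for years in (2, 10, 20):
        growth = 2 ** (years / doubling_period_years)
        print(f"After {years:2d} years: roughly {growth:,.0f}x the compute per unit cost")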

Go here to read the rest:
Molecular nanotechnology – Wikipedia, the free encyclopedia


Freedom to Tinker Research and expert commentary on …

 Freedom  Comments Off on Freedom to Tinker Research and expert commentary on …
Nov 032015
 

Yesterday I posted some thoughts about Purdue University's decision to destroy a video recording of my keynote address at its Dawn or Doom colloquium. The organizers had gone dark, and a promised public link was not forthcoming. After a couple of weeks of hoping to resolve the matter quietly, I did some digging and decided to write up what I learned. I posted on the web site of the Century Foundation, my main professional home:

It turns out that Purdue has wiped all copies of my video and slides from university servers, on grounds that I displayed classified documents briefly on screen. A breach report was filed with the university's Research Information Assurance Officer, also known as the Site Security Officer, under the terms of Defense Department Operating Manual 5220.22-M. I am told that Purdue briefly considered, among other things, whether to destroy the projector I borrowed, lest contaminants remain.

I was, perhaps, naive, but pretty much all of that came as a real surprise.

Let's rewind. Information Assurance? Site Security?

These are familiar terms elsewhere, but new to me in a university context. I learned that Purdue, like a number of its peers, has a facility security clearance to perform classified U.S. government research. The manual of regulations runs to 141 pages. (Its terms forbid uncleared trustees to ask about the work underway on their campus, but that's a subject for another day.) The pertinent provision here, spelled out at length in a manual called Classified Information Spillage, requires sanitization, physical removal, or destruction of classified information discovered on unauthorized media.

Two things happened in rapid sequence around the time I told Purdue about my post.

First, the university broke a week-long silence and expressed a measure of regret:

UPDATE: Just after posting this item I received an email from Julie Rosa, who heads strategic communications for Purdue. She confirmed that Purdue wiped my video after consulting the Defense Security Service, but the university now believes it went too far.

In an overreaction while attempting to comply with regulations, the video was ordered to be deleted instead of just blocking the piece of information in question. Just FYI: The conference organizers were not even aware that any of this had happened until well after the video was already gone.

I'm told we are attempting to recover the video, but I have not heard yet whether that is going to be possible. When I find out, I will let you know and we will, of course, provide a copy to you.

Then Edward Snowden tweeted the link, and the Century Foundation's web site melted down. It now redirects to Medium, where you can find the full story.

I have not heard back from Purdue today about recovery of the video. It is not clear to me how recovery is even possible, if Purdue followed Pentagon guidelines for secure destruction. Moreover, although the university seems to suggest it could have posted most of the video, it does not promise to do so now. Most importantly, the best that I can hope for here is that my remarks and slides will be made available in redacted form with classified images removed, and some of my central points therefore missing. There would be one version of the talk for the few hundred people who were in the room on Sept. 24, and for however many watched the live stream, and another version left as the only record.

For our purposes here, the most notable questions have to do with academic freedom in the context of national security. How did a university come to sanitize a public lecture it had solicited, on the subject of NSA surveillance, from an author known to possess the Snowden documents? How could it profess to be shocked to find that spillage is going on at such a talk? The beginning of an answer came, I now see, in the question and answer period after my Purdue remarks. A post-doctoral research engineer stood up to ask whether the documents I had put on display were unclassified. No, I replied. They're classified still. Eugene Spafford, a professor of computer science there, later attributed that concern to junior security rangers on the faculty and staff. But the display of Top Secret material, he said, once noted, is something that cannot be unnoted.

Someone reported my answer to Purdue's Research Information Assurance Officer, who reported in turn to Purdue's representative at the Defense Security Service. By the terms of its Pentagon agreement, Purdue decided it was now obliged to wipe the video of my talk in its entirety. I regard this as a rather devout reading of the rules, which allowed Purdue to realistically consider the potential harm that may result from compromise of spilled information. The slides I showed had been viewed already by millions of people online. Even so, federal funding might be at stake for Purdue, and the notoriously vague terms of the Espionage Act hung over the decision. For most lawyers, abundance of caution would be the default choice. Certainly that kind of thinking is commonplace, and sometimes appropriate, in military and intelligence services.

But universities are not secret agencies. They cannot lightly wear the shackles of a National Industrial Security Program, as Purdue agreed to do. The values at their core, in principle and often in practice, are open inquiry and expression.

I do not claim I suffered any great harm when Purdue purged my remarks from its conference proceedings. I do not lack for publishers or public forums. But the next person whose talk is disappeared may have fewer resources.

More importantly, to my mind, Purdue has compromised its own independence and that of its students and faculty. It set an unhappy precedent, even if the people responsible thought they were merely following routine procedures.

One can criticize the university for its choices, and quite a few have since I published my post. What interests me is how nearly the results were foreordained once Purdue made itself eligible for Top Secret work.

Think of it as a classic case of mission creep. Purdue invited the secret-keepers of the Defense Security Service into one cloistered corner of campus (a small but significant fraction of research in certain fields, as the university counsel put it). The trustees accepted what may have seemed a limited burden, confined to the precincts of classified research.

Now the security apparatus claims jurisdiction over the campus (facility) at large. The university finds itself sanitizing a conference that has nothing to do with any government contract.

I am glad to see that Princeton takes the view that [s]ecurity regulations and classification of information are at variance with the basic objectives of a University. It does not permit faculty members to do classified work on campus, which avoids Purdue's facility problem. And even so, at Princeton and elsewhere, there may be an undercurrent of self-censorship and informal restraint against the use of documents derived from unauthorized leaks.

Two of my best students nearly dropped a course I taught a few years back, called Secrecy, Accountability and the National Security State, when they learned the syllabus would include documents from Wikileaks. Both had security clearances, for summer jobs, and feared losing them. I told them I would put the documents on Blackboard, so they need not visit the Wikileaks site itself, but the readings were mandatory. Both, to their credit, stayed in the course. They did so against the advice of some of their mentors, including faculty members. The advice was purely practical. The U.S. government will not give a clear answer when asked whether this sort of exposure to published secrets will harm job prospects or future security clearances. Why take the risk?

Every student and scholar must decide for him- or herself, but I think universities should push back harder, and perhaps in concert. There is a treasure trove of primary documents in the archives made available by Snowden and Chelsea Manning. The government may wish otherwise, but that information is irretrievably in the public domain. Should a faculty member ignore the Snowden documents when designing a course on network security architecture? Should a student write a dissertation on modern U.S.-Saudi relations without consulting the numerous diplomatic cables on Wikileaks? To me, those would be abdications of the basic duty to seek out authoritative sources of knowledge, wherever they reside.

I would be interested to learn how others have grappled with these questions. I expect to write about them in my forthcoming book on surveillance, privacy and secrecy.

See more here:
Freedom to Tinker Research and expert commentary on …


Regenerative Medicine – Transplant Center – Mayo Clinic

 Regenerative Medicine  Comments Off on Regenerative Medicine – Transplant Center – Mayo Clinic
Oct 302015
 

At Mayo Clinic, an integrated team, including stem cell biologists, bioengineers, doctors and scientists, works together to study regenerative medicine. The goal of the team is to treat diseases using novel therapies, such as stem cell therapy and bioengineering. Doctors in transplant medicine and transplant surgery have pioneered the study of regenerative medicine during the past five decades, and doctors continue to study new innovations in transplant medicine and surgery.

In stem cell therapy, or regenerative medicine, researchers study how stem cells may be used to replace, repair, reprogram or renew your diseased cells. Stem cells are able to grow and develop into many different types of cells in your body. Stem cell therapy may use adult cells that have been genetically reprogrammed in the laboratory (induced pluripotent stem cells), your own adult stem cells that have been reprogrammed or cells developed from an embryo (embryonic stem cells).

Researchers also study and test how reprogrammed stem cells may be turned into specialized cells that can repair or regenerate cells in your heart, blood, nerves and other parts of your body. These stem cells have the potential to treat many conditions. Stem cells also may be studied to understand how other conditions occur, to develop and test new medications, and for other research.

Researchers across Mayo Clinic, with coordination through the Center for Regenerative Medicine, are discovering, translating and applying stem cell therapy as a potential treatment for cardiovascular diseases, diabetes, degenerative joint conditions, brain and nervous system (neurological) conditions, such as Parkinson’s disease, and many other conditions. For example, researchers are studying the possibility of using stem cell therapy to repair or regenerate injured heart tissue to treat many types of cardiovascular diseases, from adult acquired disorders to congenital diseases. Read about regenerative medicine research for hypoplastic left heart syndrome.

Cardiovascular diseases, neurological conditions and diabetes have been extensively studied in stem cell therapy research. They’ve been studied because the stem cells affected in these conditions have been the same cell types that have been generated in the laboratory from various types of stem cells. Thus, translating stem cell therapy to a potential treatment for people with these conditions may be a realistic goal for the future of transplant medicine and surgery.

Researchers conduct ongoing studies in stem cell therapy. However, research and development of stem cell therapy is unpredictable and depends on many factors, including regulatory guidelines, funding sources and recent successes in stem cell therapy. Mayo Clinic researchers aim to expand research and development of stem cell therapy in the future, while keeping the safety of patients as their primary concern.

Mayo Clinic offers stem cell transplant (bone marrow transplant) for people who’ve had leukemia, lymphoma or other conditions that have been treated with chemotherapy.

Mayo Clinic currently offers a specialty consult service for regenerative medicine within the Transplant Center, the first consult service established in the United States to provide guidance for patients and families regarding stem cell-based protocols. This consult service provides medical evaluations for people with many conditions who have questions about the potential use of stem cell therapy. The staff provides guidance to determine whether stem cell clinical trials are appropriate for these individuals. Regenerative medicine staff may be consulted if a doctor or patient has asked about the potential use of stem cell therapies for many conditions, including degenerative or congenital diseases of the heart, liver, pancreas or lungs.

People sometimes have misconceptions about the use and applications of stem cell therapies. This consult service provides people with educational guidance and appropriate referrals to research studies and clinical trials in stem cell therapies for the heart, liver, pancreas and other organs. Also, the consult service supports ongoing regenerative medicine research activities within Mayo Clinic, from basic science to clinical protocols.

Read more about stem cells.


Excerpt from:
Regenerative Medicine – Transplant Center – Mayo Clinic


First Amendment – constitution | Laws.com

 Misc  Comments Off on First Amendment – constitution | Laws.com
Oct 282015
 

The First Amendment of the United States Constitution is contained in the Bill of Rights. The First Amendment has proven to be one of the most fundamental and important with respect to the rights attributed to the populace of the United States. Originally, the First Amendment applied solely to Congress. However, by the beginning of the twentieth century, it was upheld that the First Amendment applies to all levels of government, including state and local governments. The Supreme Court reached this result by holding that the Due Process Clause of the Fourteenth Amendment incorporates the First Amendment and makes it binding on the states.

As stated in the United States Constitution, Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble and to petition the Government for a redress of grievances. Though a relatively short and concise assertion, the text provides for quite an encompassing set of rights that protect the citizens of the United States, including some of the most important and basic human rights. The First Amendment has many clauses that relate to each of the concepts that it sets out to protect. Religion is discussed in two clauses, one regarding the establishment of religion, and the other the free exercise of religion.

This proved to be one of the most important rights for the Framers of the Constitution to secure, for many people of European descent had immigrated to the American Colonies to avoid religious persecution and to find a safe haven in which to practice their religion of choice without dire consequences. The First Amendment prohibits the government from establishing a formal or national religion. It also provides that there will be no preference for any particular religion, including the practice of no religion, or nonreligion.

The 1st Amendment guarantees the people of the United States the free exercise of religion, without interference from governmental factions. Infringement of this right by any organization or individual would likewise be deemed unconstitutional.

One of the most commonly cited clauses of the 1st Amendment is the freedom of speech. This clause has proven to be of great importance, particularly in the twentieth century, and continues to hold such regard in our lifetime. The First Amendment addresses many issues regarding freedom of speech, and restrictions exist where such speech may prove harmful to the general public. An example is the concept of sedition, and how such conduct can lead to insurrection against the government.

Other concepts also addressed include commercial speech, political speech, obscenity, libel, slander, and symbolic speech, such as the desecration of the American Flag. Under the First Amendment, there have been important and key court cases that have established a form of precedent for how to apply the Amendment to these kinds of circumstances. Freedom of the press is also included, and is subject to similar restrictions as the freedom of speech.

The rights to petition and assembly often seem to be overlooked, for freedom of religion and speech are most commonly associated with the 1st Amendment. The right to petition is important because it gives citizens the opportunity to address their government on issues that have relevance and importance to the commonwealth. The formation of an assembly, under the First Amendment, can be interpreted as citizens gathering and unifying for the purpose of communicating views or opinions on national issues, and for the relaying of pertinent information. The right to assembly is often related to that of petition, in that citizens may assemble in the process of petitioning the government.


Originally posted here:
First Amendment – constitution | Laws.com

First Amendment – National Constitution Center

 Misc  Comments Off on First Amendment – National Constitution Center
Oct 282015
 

Clauses of the First Amendment

The Establishment Clause

America's early settlers came from a variety of religious backgrounds: Puritans predominated in New England; Anglicans predominated in the South; Quakers and Lutherans flocked especially to Pennsylvania; Roman Catholics settled mostly in Maryland; Presbyterians were most numerous in the middle colonies; and there were Jewish congregations in five cities.

During colonial times, the Church of England was established by law in all of the southern colonies, while localized Puritan (or Congregationalist) establishments held sway in most New England states. In those colonies, clergy were appointed and disciplined by colonial authorities and colonists were required to pay religious taxes and (often) to attend church services. Dissenters were often punished for preaching without a license or refusing to pay taxes to a church they disagreed with. Delaware, New Jersey, Pennsylvania, Rhode Island, and much of New York had no established church.

After Independence, there was widespread agreement that there should be no nationally established church. The Establishment Clause of the First Amendment, principally authored by James Madison, reflects this consensus. The language of the Establishment Clause itself applies only to the federal government ("Congress shall make no law respecting an establishment of religion"). All states disestablished religion by 1833, and in the 1940s the Supreme Court held that disestablishment applies to state governments through the Fourteenth Amendment.

Virtually all jurists agree that it would violate the Establishment Clause for the government to compel attendance or financial support of a religious institution as such; for the government to interfere with a religious organization's selection of clergy or religious doctrine; for religious organizations or figures acting in a religious capacity to exercise governmental power; or for the government to extend benefits to some religious entities and not others without adequate secular justification. Beyond that, the meaning of the Amendment is often hotly contested, and Establishment cases in the Supreme Court often lead to 5-4 splits.

The Lemon Test

In 1971, the Supreme Court surveyed its previous Establishment Clause cases and identified three factors that identify when a government practice violates the Establishment Clause: First, the statute must have a secular legislative purpose; second, its principal or primary effect must be one that neither advances nor inhibits religion; finally, the statute must not foster an excessive entanglement with religion. Lemon v. Kurtzman (1971). In the years since Lemon, the test has been much criticized and the Court often decides Establishment Clause cases without reference to it. Yet the Justices have not overruled the Lemon test, meaning the lower courts remain obliged to use it. In some specific areas of controversy, however, the Court has adopted specific, more targeted tests to replace Lemon.

The vast majority of Establishment Clause cases have fallen in four areas: monetary aid to religious education or other social welfare activities conducted by religious institutions; government-sponsored prayer; accommodation of religious dissenters from generally-applicable laws; and government owned or sponsored religious symbols.

Aid to religious institutions

Scholars have long debated between two opposing interpretations of the Establishment Clause as it applies to government funding: (1) that the government must be neutral between religious and non-religious institutions that provide education or other social services; or (2) that no taxpayer funds should be given to religious institutions if they might be used to communicate religious doctrine. Initially, the Court tended toward the first interpretation, in the 1970s and 1980s the Court shifted to the second interpretation, and more recently the Court has decisively moved back to the first idea.

After two early decisions upholding state statutes allowing students who attend private religious schools to receive transportation, Everson v. Board of Education (1947), and textbook subsidies available to all elementary and secondary students, Board of Education v. Allen (1968), the Court attempted for about fifteen years to draw increasingly sharp lines against the use of tax-funded assistance for the religious aspects of education. At one point the Court even forbade public school teaching specialists from going on the premises of religious schools to provide remedial assistance. Aguilar v. Felton (1985). More recently, the Court has upheld programs that provide aid to educational or social programs on a neutral basis only as a result of the genuine and independent choices of private individuals. Zelman v. Simmons-Harris (2002). Indeed, the Court has held that it is unconstitutional under free speech or free exercise principles to exclude otherwise eligible recipients from government assistance solely because their activity is religious in nature. Rosenberger v. University of Virginia (1995).

Government-sponsored prayer

The Court's best-known Establishment Clause decisions held it unconstitutional for public schools to lead schoolchildren in prayer or Bible reading, even on an ostensibly voluntary basis. Engel v. Vitale (1962); Abington School District v. Schempp (1963). Although these decisions were highly controversial among the public (less so among scholars), the Court has not backed down. Instead it has extended the prohibition to prayers at graduation ceremonies, Lee v. Weisman (1992), and football games, Santa Fe Independent School District v. Doe (2000).

In less coercive settings involving adults, the Court has generally allowed government-sponsored prayer. In Marsh v. Chambers (1983), the Court upheld legislative prayer, specifically because it was steeped in history. More recently, the Court approved an opening prayer or statement at town council meetings, where the Town represented that it would accept any prayers of any faith. Town of Greece v. Galloway (2014).

Accommodation of religion

Hundreds of federal, state, and local laws exempt or accommodate religious believers or institutions from otherwise neutral, generally-applicable laws whose requirements would conflict with religiously motivated conduct. Examples include military draft exemptions, kosher or halal meals for prisoners, medical neglect exemptions for parents who do not believe in medical treatment for their ill children, exemptions from some anti-discrimination laws for religious entities, military headgear requirements, and exemptions for the sacramental use of certain drugs. The Supreme Court has addressed very few of these exemptions. While the Court held that a state sales tax exemption limited to religious publications was unconstitutional in Texas Monthly, Inc. v. Bullock (1989), it unanimously upheld the exemption of religious organizations from prohibitions on employment discrimination for ministers. Hosanna-Tabor Evangelical Lutheran Church and School v. E.E.O.C. (2012).

Two federal laws, the Religious Freedom Restoration Act (RFRA) and the Religious Land Use and Institutionalized Persons Act (RLUIPA), provide broad-based statutory accommodations for religious practice when it conflicts with federal and certain state and local laws. A unanimous Court upheld this approach for prisoners against a claim that granting religious accommodations violates the Establishment Clause, reasoning that RLUIPA alleviates exceptional government-created burdens on private religious exercise in prisons. Cutter v. Wilkinson (2005).

The Court in Cutter left open the question whether such a regime applied to land use is constitutional, and it also left open the possibility that even some applications in prisons may be unconstitutional if they are not even-handed among religions or impose too extreme a burden on non-believers. The Court's recent 5-4 decision in Burwell v. Hobby Lobby Stores, Inc. (2014), holding that RFRA exempts for-profit employers from paying for insurance coverage of contraceptive drugs that they believe are abortion-inducing, has reinvigorated the debate over such laws.

Government-sponsored religious symbols

The cases involving governmental displays of religious symbols (such as Ten Commandment displays in public school classrooms, courthouses, or public parks; nativity scenes in courthouses and shopping districts; or crosses on public land) have generated much debate. The most prominent approach in more recent cases is called the endorsement test; it asks whether a reasonable observer acquainted with the full context would regard the display as the government endorsing religion and, therefore, sending a message of disenfranchisement to other believers and non-believers.

The Court's decisions in this arena are often closely divided. They also illustrate that the Court has declined to take a rigid, absolutist view of the separation of church and state. In Lynch v. Donnelly (1984), the Court allowed display of a nativity scene surrounded by other holiday decorations in the heart of a shopping district, stating that it engenders a friendly community spirit of good will in keeping with the season. But in County of Allegheny v. American Civil Liberties Union (1989), a different majority of Justices held that the display of a nativity scene by itself at the top of the grand stairway in a courthouse violated the Establishment Clause because it was indisputably religious, indeed sectarian. In McCreary County v. American Civil Liberties Union (2005), the Court held that a prominent display of the Ten Commandments at the county courthouse, which was preceded by an official's description of the Ten Commandments as the embodiment of ethics in Christ, was a religious display that was unconstitutional. The same day, it upheld a Ten Commandments monument, which was donated by a secular organization dedicated to reducing juvenile delinquency and surrounded by other monuments on the spacious statehouse grounds. Van Orden v. Perry (2005). Only one Justice was in the majority in both cases.

More broadly, the Establishment Clause provides a legal framework for resolving disagreements about the public role of religion in our increasingly pluralistic republic.

The Establishment Clause: A Check on Religious Tyranny by Marci A. Hamilton

An accurate recounting of history is necessary to appreciate the need for disestablishment and a separation between church and state. The religiosity of the generation that framed the Constitution and the Bill of Rights (of which the First Amendment is the first as a result of historical accident, not the preference for religious liberty over any other right) has been overstated. In reality, many of the Framers and the most influential men of that generation rarely attended church, were often Deist rather than Christian, and had a healthy understanding of the potential for religious tyranny. This latter concern is to be expected as European history was awash with executions of religious heretics: Protestant, Catholic, Jewish, and Muslim. Three of the most influential men in the Framing era provide valuable insights into the mindset at the time: Benjamin Franklin, James Madison, and John Adams. Franklin saw a pattern:

If we look back into history for the character of the present sects in Christianity, we shall find few that have not in their turns been persecutors, and complainers of persecution. The primitive Christians thought persecution extremely wrong in the Pagans, but practiced it on one another. The first Protestants of the Church of England blamed persecution in the Romish Church, but practiced it upon the Puritans. These found it wrong in the Bishops, but fell into the same practice themselves both here [England] and in New England.

Benjamin Franklin, Letter to the London Packet (June 3, 1772).

The father of the Constitution and primary drafter of the First Amendment, James Madison, in his most important document on the topic, Memorial and Remonstrance against Religious Assessments (1785), stated:

During almost fifteen centuries has the legal establishment of Christianity been on trial. What have been its fruits? More or less in all places, pride and indolence in the Clergy, ignorance and servility in the laity, in both, superstition, bigotry and persecution. . . . What influence, in fact, have ecclesiastical establishments had on society? In some instances they have been seen to erect a spiritual tyranny on the ruins of the Civil authority; in many instances they have been seen upholding the thrones of political tyranny; in no instance have they been the guardians of the liberties of the people.

Two years later, John Adams described the states as having been derived from reason, not religious belief:

It will never be pretended that any persons employed in that service had any interviews with the gods, or were in any degree under the influence of Heaven, any more than those at work upon ships or houses, or laboring in merchandise or agriculture; it will forever be acknowledged that these governments were contrived merely by the use of reason and the senses. . . .Thirteen governments [of the original states] thus founded on the natural authority of the people alone, without a pretence of miracle or mystery, which are destined to spread over the northern part of that whole quarter of the globe, are a great point gained in favor of the rights of mankind.

The Works of John Adams, Second President of the United States, Vol. 4, 292-93 (Charles C. Little & James Brown, eds., 1851).

Massachusetts and Pennsylvania are examples of early discord. In Massachusetts, the Congregationalist establishment enforced taxation on all believers and expelled or even put to death dissenters. Baptist clergy became the first in the United States to advocate for a separation of church and state and an absolute right to believe what one chooses. Baptist pastor John Leland was an eloquent and forceful proponent of the freedom of conscience and the separation of church and state. For him, America was not a Christian nation, but rather should recognize the equality of all believers, whether Jews, Turks, Pagans [or] Christians. Government should protect every man in thinking and speaking freely, and see that one does not abuse another. He proposed an amendment to the Massachusetts Constitution in 1794 because of the evils . . . occasioned in the world by religious establishments, and to keep up the proper distinction between religion and politics.

Pennsylvania, dubbed the Holy Experiment by founder William Penn, was politically controlled by Quakers, who advocated tolerance of all believers and the mutual co-existence of differing faiths, but who made their Christianity a prerequisite for public office, only permitted Christians to vote, and forbade work on the Sabbath. Even so, the Quakers set in motion a principle that became a mainstay in religious liberty jurisprudence: the government may not coerce citizens to believe what they are unwilling to believe. If one looks carefully into the history of the United States' religious experiment, one also uncovers a widely-shared view that too much liberty, or licentiousness, is as bad as no liberty. According to historian John Philip Reid, those in the eighteenth century had as great a duty to oppose licentiousness as to defend liberty.

Establishment Clause Doctrine

The Establishment Clause has yielded a wide array of doctrines (legal theories articulated by courts), each of which is largely distinct from the others, some of which are described in Professor McConnell's and my joint contribution on the Establishment Clause. The reason for this proliferation of distinct doctrines is that the Establishment Clause is rooted in a concept of separating the power of church and state. These are the two most authoritative forces of human existence, and drawing a boundary line between them is not easy. The further complication is that the exercise of power is fluid, which leads both state and church to alter their positions to gain power either one over the other or as a union in opposition to the general public or particular minorities.

The separation of church and state does not mean that there is an impermeable wall between the two, but rather that the Framers fundamentally understood that the union of power between church and state would lead inevitably to tyranny. The established churches of Europe were well-known to the Founding era and the Framers and undoubtedly contributed to James Madison's inclusion of the Establishment Clause in the First Amendment, and its ratification. The following are some of the most important principles.

The Government May Not Delegate Governing Authority to Religious Entities

The Court has been sensitive to incipient establishments of religion. A Massachusetts law delegated authority to churches and schools to determine who could receive a liquor license within 500 feet of their buildings. The Supreme Court struck down the law, because it delegated to churches zoning power, which belongs to state and local government, not private entities. Larkin v. Grendel's Den, Inc. (1982). According to the Court: The law substitutes the unilateral and absolute power of a church for the reasoned decision making of a public legislative body . . . on issues with significant economic and political implications. The challenged statute thus enmeshes churches in the processes of government and creates the danger of [p]olitical fragmentation and divisiveness along religious lines.

In another scenario, the Supreme Court rejected an attempt to define political boundaries solely according to religion. In Board of Education of Kiryas Joel Village School District v. Grumet (1994), the state of New York designated the neighborhood boundaries of Satmar Hasidim Orthodox Jews in Kiryas Joel Village as a public school district to itself. Thus, the boundary was determined solely by religious identity, in part because the community did not want their children to be exposed to children outside the faith. The Court invalidated the school district because political boundaries identified solely by reference to religion violate the Establishment Clause.

There Is No Such Thing as Church Autonomy Although There Is a Doctrine that Forbids the Courts from Determining What Religious Organizations Believe

In recent years, religious litigants have asserted a right to church autonomy (that churches should not be subject to governmental regulation) in a wide variety of cases, and in particular in cases involving the sexual abuse of children by clergy. The phrase, however, is misleading. The Supreme Court has never interpreted the First Amendment to confer on religious organizations a right to autonomy from the law. In fact, in the case in which they have most recently demanded such a right, arguing religious ministers should be exempt from laws prohibiting employment discrimination, the Court majority did not embrace the theory, not even using the term once. Hosanna-Tabor Evangelical Lutheran Church and School v. E.E.O.C. (2012).

The courts are forbidden, however, from getting involved in determining what a religious organization believes, how it organizes itself internally, or who it chooses to be ministers of the faith. Therefore, if the dispute brought to a court can only be resolved by a judge or jury settling an intra-church, ecclesiastical dispute, the dispute is beyond judicial consideration. This is a corollary to the absolute right to believe what one chooses; it is not a right to be above the laws that apply to everyone else. There is extraordinary slippage in legal briefs in numerous cases where the entity is arguing for autonomy, but what they really mean is freedom from the law, per se. For the Court and basic common sense, these are arguments for placing religion above the law, and in violation of the Establishment Clause. They are also fundamentally at odds with the common sense of the Framing generation that understood so well the evils of religious tyranny.

The Establishment Clause: Co-Guarantor of Religious Freedom by Michael McConnell

The Establishment Clause of the First Amendment ("Congress shall make no law respecting an establishment of religion") is one of the most misunderstood in the Constitution. Unlike most of the Constitution, it refers to a legal arrangement, the establishment of religion, which has not existed in the United States in almost two centuries. We understand what freedom of speech is, we know what private property is, and we know what searches and seizures are, but most of us have no familiarity with what an establishment of religion would be.

The Church by Law Established in Britain was a church under control of the government. The monarch was (and is) the supreme head of the established church and chooses its leadership; Parliament enacted its Articles of Faith; the state composed or directed the content of its prayers and liturgy; clergy had to take an oath of allegiance to the king or queen; and not surprisingly, the established church was used to inculcate the idea that British subjects had a religious as well as a civic obligation to obey royal authority. The established church was a bit like a government-controlled press: it was a means by which the government could mold public opinion.

British subjects (including Americans in eight of the colonies) were legally required to attend and financially support the established church, ministers were licensed or selected by the government, and the content of church services was partially dictated by the state.

The establishment of religion was bad for liberty and it was bad for religion, too. It was opposed by a coalition of the most fervently evangelical religious sects in America (especially the Baptists), who thought the hand of government was poisonous to genuine religion, joined by the enlightenment and often deist elite (like Thomas Jefferson and Benjamin Franklin), who thought church and state should be separate, and by the leadership of minority religions, who worried that government involvement would disadvantage them. Accordingly, there was virtually no opposition to abolishing establishment of religion at the national level. Establishments survived for a while in a few states, but the last state (Massachusetts) ended its establishment in 1833.

The abolition of establishment of religion entails a number of obvious and uncontroversial elements. Individuals may not be required to contribute to, attend, or participate in religious activities. These must be voluntary. The government may not control the doctrine, liturgy, or personnel of religious organizations. These must be free of state control. Other issues are harder.

For a few decades between the late 1960s and the early 1990s, the Supreme Court attempted to forbid states to provide tax subsidies to schools that teach religious doctrine along with ordinary secular subjects. Most of these schools were Roman Catholic. This effort was largely based on a misinterpretation of history, egged on by residual anti-Catholicism. The Justices said that neutral aid to schools is just like a 1785 effort to force Virginians to contribute to the church of their choice. The analogy, however, made little sense: there is all the difference in the world between funding churches because they inculcate religion and funding schools because they provide education. In fact, the history of the early republic shows that states (and later the federal government, during Reconstruction) funded education by subsidizing all schools on a nondiscriminatory basis, and no one ever suggested this violated the non-establishment principle. By 2002, in Zelman v. Simmons-Harris, the Supreme Court returned to this original idea, allowing the government to fund schools on a neutral basis so long as the choice of religious schools was left to the voluntary choice of families. Not only was this ruling true to history, it also best serves the ideal of religious freedom, making it possible for families to choose the type of education they want for their children.

It is sometimes suggested that laws making special accommodations for people whose religious beliefs are at odds with government policy violate the Establishment Clause, on the theory that these accommodations privilege or advance religion. This is a recently-minted idea, and not a sensible one. In all cases of accommodation, the religion involved is dissenting from prevailing policy, which means, by definition, that the religion is not dominating society. The idea that making exceptions for the benefit of people whose beliefs conflict with the majority somehow establishes religion is a plain distortion of the words. And the Supreme Court has unanimously held that religious accommodations are permissible so long as they lift a governmental obstacle to the exercise of religion, take account of costs to others, and do not favor one faith over another. Nonetheless, when religions take unpopular stances on hot-button issues (for example, regarding abortion-inducing contraceptives or same-sex marriage), critics are quick to assert that it violates the Constitution to accommodate their differences, no matter how little support that position has in history or Supreme Court precedent.

The fundamental error is to think that the Establishment Clause is designed to reduce the role of religion in American life. A better understanding is captured in this statement by Justice William O. Douglas of the Supreme Court: this country sponsor[s] an attitude on the part of government that shows no partiality to any one group and that lets each flourish according to the zeal of its adherents and the appeal of its dogma. Zorach v. Clauson (1952).

The Free Exercise Clause

Many settlers from Europe braved the hardships of immigration to the American colonies to escape religious persecution in their home countries and to secure the freedom to worship according to their own conscience and conviction. Although the colonists often understood freedom of religion more narrowly than we do today, support for protection of some conception of religious freedom was broad and deep. By the time of Independence and the construction of a new Constitution, freedom of religion was among the most widely recognized inalienable rights, protected in some fashion by state bills of rights and judicial decisions. James Madison, for example, the principal author of the First Amendment, eloquently expressed his support for such a provision in Virginia: It is the duty of every man to render to the Creator such homage, and such only, as he believes to be acceptable to him. This duty is precedent both in order of time and degree of obligation, to the claims of Civil Society.

Although the original Constitution contained only a prohibition of religious tests for federal office (Article VI, Clause 3), the Free Exercise Clause was added as part of the First Amendment in 1791. In drafting the Clause, Congress considered several formulations, but ultimately settled on protecting the free exercise of religion. This phrase makes plain the protection of actions as well as beliefs, but only those in some way connected to religion.

From the beginning, courts in the United States have struggled to find a balance between the religious liberty of believers, who often claim the right to be excused or exempted from laws that interfere with their religious practices, and the interests of society reflected in those very laws. Early state court decisions went both ways on this central question.

The Supreme Court first addressed the question in a series of cases involving nineteenth-century laws aimed at suppressing the practice of polygamy by members of the Church of Jesus Christ of Latter-day Saints (LDS), also known as Mormons. The Court unanimously rejected free exercise challenges to these laws, holding that the Free Exercise Clause protects beliefs but not conduct. Laws are made for the government of actions, and while they cannot interfere with mere religious belief and opinions, they may with practices. Reynolds v. United States (1878). What followed was perhaps the most extreme government assault on religious freedom in American history. Hundreds of church leaders were jailed, rank-and-file Mormons were deprived of their right to vote, and Congress dissolved the LDS Church and expropriated most of its property, until the church finally agreed to abandon polygamy.

The belief-action distinction ignored the Free Exercise Clause's obvious protection of religious practice, but spoke to the concern that allowing believers to disobey laws that bind everyone else would undermine the value of a government of laws applied to all. Doing so, Reynolds warned, would be to make the professed doctrines of religious belief superior to the law of the land, and in effect to permit every citizen to become a law unto himself.

Reynolds influenced the meaning of the Free Exercise Clause well into the twentieth century. In 1940, for example, the Court extended the Clause, which by its terms constrains only the federal government, to limit state laws and other state actions that burden religious exercise. Cantwell v. Connecticut (1940). Though it recognized that governments may not unduly infringe religious exercise, the Court reiterated that [c]onduct remains subject to regulation for the protection of society, citing Reynolds as authority. Similarly, the Court held in 1961 that the Free Exercise Clause did not exempt an orthodox Jewish merchant from Sunday closing laws, again citing Reynolds.

In the 1960s and early 1970s, the Court shifted, strengthening protection for religious conduct by construing the Free Exercise Clause to protect a right of religious believers to exemption from generally applicable laws which burden religious exercise. The Court held that the government may not enforce even a religiously-neutral law that applies generally to all or most of society unless the public interest in enforcement is compelling. Wisconsin v. Yoder (1972). Yoder thus held that Amish families could not be punished for refusing to send their children to school beyond the age of 14.

Although the language of this compelling-interest test suggested powerful protections for religion, these were never fully realized. The cases in which the Supreme Court denied exemptions outnumbered those in which it granted them. Aside from Yoder, the Court exempted believers from availability for work requirements, which denied unemployment benefits to workers terminated for prioritizing religious practices over job requirements. But it denied exemptions to believers and religious organizations which found their religious practices burdened by conditions for federal tax exemption, military uniform regulations, federal minimum wage laws, state prison regulations, state sales taxes, federal administration of public lands, and mandatory taxation and other requirements of the Social Security system. In all of these cases the Court found, often controversially, either that the governments interest in enforcement was compelling, or that the law in question did not constitute a legally-recognizable burden on religious practice.

In 1990, the Supreme Court changed course yet again, holding that the Free Exercise Clause does not relieve an individual of the obligation to comply with a "valid and neutral law of general applicability on the ground that the law proscribes (or prescribes) conduct that his religion prescribes (or proscribes)." Employment Division v. Smith (1990). Though it did not return to the belief-action distinction, the Court echoed Reynolds' concern that religious exemptions permit a person, "by virtue of his beliefs, to become a law unto himself," contradicting both constitutional tradition and common sense. Any exceptions to religiously-neutral and generally-applicable laws, therefore, must come from the political process. Smith went on to hold that the Free Exercise Clause does not protect the sacramental use of peyote, a hallucinogenic drug, by members of the Native American Church.

Smith proved to be controversial. In 1993, overwhelming majorities in Congress voted to reinstate the pre-Smith compelling-interest test by statute with the Religious Freedom Restoration Act (RFRA). RFRA authorizes courts to exempt a person from any law that imposes a substantial burden on sincere religious beliefs or actions, unless the government can show that the law is the least restrictive means of furthering a compelling governmental interest. Almost half of the states have passed similar laws, known as state RFRAs, applicable to their own laws. In 1997 the Supreme Court held that Congress had constitutional authority only to apply RFRA to federal laws, and not to state or local laws. Congress then enacted a narrower law, the Religious Land Use and Institutionalized Persons Act (RLUIPA), which applies the compelling-interest test to state laws affecting prisoners and land use. RFRA and RLUIPA have afforded exemptions in a wide range of federal and state contexts, from kosher and halal diets for prisoners, to relief from zoning and landmark regulations on churches and ministries, to exemptions from jury service.

Although some exemption claims brought under these religious freedom statutes have been relatively uncontroversial (the Supreme Court unanimously protected the right of a tiny religious sect to use a hallucinogenic drug prohibited by federal law, and the right of a Muslim prisoner to wear a half-inch beard prohibited by state prison rules), some touch on highly contested moral questions. For example, the Court by a 5-4 vote excused a commercial family-owned corporation from complying with the "contraception mandate," a regulation which required the corporation's health insurance plan to cover what its owners believe are abortion-inducing drugs. Burwell v. Hobby Lobby Stores Inc. (2014). In the wake of Hobby Lobby and the Court's subsequent determination that states may not deny gays and lesbians the right to civil marriage, state RFRAs have become a flashpoint in conflicts over whether commercial vendors with religious objections may refuse their products and services to same-sex weddings.

Besides RFRA and other exemption statutes, the Free Exercise Clause itself, even after Smith, continues to provide protection for believers against burdens on religious exercise from laws that target religious practices, or that disadvantage religion in discretionary, case-by-case decision making. In Church of the Lukumi Babalu Aye, Inc. v. City of Hialeah (1993), for example, the Court unanimously struck down a local ordinance against the unnecessary killing of animals in a ritual or ceremony, a law that was drawn to apply only to a small and unpopular religious sect whose worship includes animal sacrifice.

The Court recently recognized that the Free Exercise Clause (along with the Establishment Clause) required a religious exemption from a neutral and general federal antidiscrimination law that interfered with a church's freedom to select its own ministers. The Court distinguished Smith on the ground that it involved "government regulation of only outward physical acts," while this case "concerns government interference with an internal church decision that affects the faith and mission of the church itself." Hosanna-Tabor Evangelical Lutheran Church & School v. E.E.O.C. (2012).

It remains unclear whether Lukumi and Hosanna-Tabor are narrow exceptions to Smith's general presumption against religious exemptions, or foreshadow yet another shift towards a more exemption-friendly free exercise doctrine.

Religious Liberty Is Equal Liberty by Frederick Gedicks

At the time the United States adopted the First Amendment to the Constitution, other nations routinely imposed disabilities on religious minorities within their borders, depriving them of legal rights, making it difficult or impossible to practice their faith, and often enabling violent persecution. The Free Exercise Clause was thus an exceptional political achievement, imposing a constitutional norm of civic equality by prohibiting the federal government from interfering with all religious exercise, regardless of affiliation.

Only a few years before the First Amendment was ratified, James Madison wrote that all people naturally retain "equal title to the free exercise of Religion according to the dictates of conscience," without the government's "subjecting some to peculiar burdens" or "granting to others peculiar exemptions." A Memorial and Remonstrance against Religious Assessments (1785). As Madison suggested, at the time the Constitution and Bill of Rights were ratified, the guarantee of religious free exercise was understood to protect against government discrimination or abuse on the basis of religion, but not to require favorable government treatment of believers. In particular, there is little evidence that the Founders understood the Free Exercise Clause to mandate religious exemptions that would excuse believers from complying with neutral and general laws that constrain the rest of society.

The Supreme Court has historically left the question of religious exemptions to Congress and the state legislatures. The first judicially-ordered exemptions arose in the 1960s and early 1970s, when the Supreme Court held the Free Exercise Clause required religious exemptions for Amish families who objected to sending their children to high school, and for employees who were denied unemployment benefits when they lost their jobs for refusing to work on their Sabbath. This doctrine of judicially-ordered exemptions, however, was an historical aberration. In Employment Division v. Smith (1990), the Court considered a claim by members of a Native American religion who lost their jobs as drug counselors for using an illegal drug in a religious ritual. The Court abandoned its new doctrine of religious exemptions, ruling that the Free Exercise Clause did not grant believers a right to exemptions from religiously neutral, generally applicable laws, though legislatures were free to grant such exemptions if they wished. This relegation of exemptions to the political process in most circumstances returned the Free Exercise Clause to its historical baseline. Notwithstanding the narrow ministerial exception recognized in Hosanna-Tabor Evangelical Lutheran Church & School v. EEOC (2012), the Court has repeatedly affirmed Smith and the century of precedent cited in that case, and has shown no inclination to overturn its basic principle that neutral and general laws should apply equally to all, regardless of religious belief or unbelief.

The growth of social welfare entitlements and religious diversity in the United States has underscored the wisdom of the Smith rule. Exempting believers from social welfare laws may give them a competitive advantage, and also may harm those whom the law was designed to protect or benefit.

For example, the Court refused to exempt an Amish employer from paying Social Security taxes for his employees, reasoning that doing so would impose the employer's religious faith on the employees by reducing their social security benefits regardless of whether they shared their employer's religious objection to government entitlement programs. United States v. Lee (1982). Similarly, the Court refused to exempt a religious employer from federal minimum wage laws, because doing so would give the employer an advantage over competitors and depress the wages of all employees in local labor markets. Tony & Susan Alamo Foundation v. Secretary of Labor (1985).


The Court seems poised to adopt this third-party burden principle in decisions interpreting the 1993 Religious Freedom Restoration Act (RFRA) as well. Five Justices in Burwell v. Hobby Lobby Stores, Inc. (2014), expressly stated that RFRA exemptions imposing significant costs on others are not allowed. The majority opinion likewise acknowledged that courts must take adequate account of third-party burdens before ordering a RFRA exemption.

The growth of religious diversity makes a religious exemption regime doubly impractical. The vast range of religious beliefs and practices in the United States means that there is a potential religious objector to almost any law the government might enact. If religious objectors were presumptively entitled to exemption from any burdensome law, religious exemptions would threaten to swallow the rule of law, which presupposes its equal application to everyone. As the Court observed in Lee, a religiously diverse social welfare state cannot shield every person . . . from all the burdens incident to exercising every aspect of the right to practice religious beliefs.

Even under the equal-liberty regime contemplated by the Founders and restored by Smith, government remains subject to important constraints that protect religious liberty. Religious gerrymanders, or laws that single out particular religions for burdens not imposed on other religions or on comparable secular conduct, must satisfy strict scrutiny under the Free Exercise Clause. Church of the Lukumi Babalu Aye, Inc. v. City of Hialeah (1993); Sherbert v. Verner (1963). Under RFRA and the related Religious Land Use and Institutionalized Persons Act of 2000 (RLUIPA), the federal government and often the state governments are prohibited from burdening religious exercise without adequate justification. Holt v. Hobbs (2015); Gonzales v. O Centro Espirita Beneficente Uniao do Vegetal (2006). And, like judicially-ordered exemptions, legislative exemptions that impose material costs on others in order to protect believers' free exercise interests may be invalid under the Establishment Clause, which protects believers and unbelievers alike from bearing the burdens of practicing someone else's religion. Estate of Thornton v. Caldor (1985).

If exemptions are to be afforded to those whose religious practices are burdened by neutral and general laws, they should generally not be granted by courts, but by the politically accountable branches of the federal and state governments. These branches are better situated to weigh and balance the competing interests of believers and others in a complex and religiously-diverse society.

Free Exercise: A Vital Protection for Diversity and Freedom by Michael McConnell

One of this nation's deepest commitments is to the full, equal, and free exercise of religion, a right that protects not only believers, but unbelievers as well. The government cannot use its authority to forbid Americans to conduct their lives in accordance with their religious beliefs or to require them to engage in actions contrary to religious conscience, even when the vast majority of their countrymen regard those beliefs as backward, mistaken, or even immoral.

Unfortunately, in the last few years, and especially since the Supreme Court's decision requiring states to recognize same-sex marriage, this consensus in favor of tolerance has been slipping. All too often, we hear demands that religious people and religious institutions such as colleges or adoption agencies must join the state in recognizing same-sex marriages (or performing abortions, or supplying contraceptives, or whatever the issue happens to be), or lose their right to operate.

That has not been the American way. When this country severed its ties with the British Empire, one thing that went with it was the established church. To an unprecedented degree, the young United States not only tolerated but actively welcomed people of all faiths. For example, despite his annoyance with the Quakers for their refusal to support the revolutionary war effort, Washington wrote to a Quaker Society to express his "wish and desire, that the laws may always be as extensively accommodated to them, as a due regard for the protection and essential interests of the nation may justify and permit." Letter to the Annual Meeting of Quakers (1789).

What would it mean to have a regime of free exercise of religion? No one knew; there had been no such thing before. It quickly became clear that it was not enough just to cease persecution or discrimination against religious minorities. Just two years after the ink was dry on the First Amendment, the leader of the Jewish community in Philadelphia went to court and asked, under authority of his state's free exercise clause, to be excused from complying with a subpoena to appear in court on his day of sabbath. He did not ask that the state cease to do official business on Saturday, but he did ask the court to make an exception, an accommodation that would enable him to be faithful to the Jewish law.

This would become the central interpretive question under the Free Exercise Clause: Does it give Americans whose religions conflict with government practices the right to ask for special accommodation, assuming an accommodation can be made without great harm to the public interest or the rights of others?


In the early years, some religious claimants won and some lost. The Mormon Church lost in a big way, in the first such case to reach the United States Supreme Court. Reynolds v. United States (1878). In 1963, the Supreme Court held that the Free Exercise Clause of the First Amendment does require the government to make accommodations for religious exercise, subject as always to limitations based on the public interest and the rights of others. Sherbert v. Verner (1963). In 1990, the Court shifted to the opposite view, in a case involving the sacramental use of peyote by members of the Native American Church. Employment Division v. Smith (1990).

Today we have a patchwork of rules. When the federal government is involved, legislation called the Religious Freedom Restoration Act grants us the right to seek appropriate accommodation when our religious practices conflict with government policy. About half the states have similar rules, and a similar rule protects prisoners like the Muslim prisoner who recently won the right to wear a half-inch beard in accordance with Islamic law, by a 9-0 vote in the Supreme Court. Holt v. Hobbs (2015).

The range of claims has been as diverse as the religious demography of the country. A small Brazilian sect won the right to use a hallucinogenic drug in worship ceremonies; Amish farmers have won exceptions from traffic rules; Muslim soldiers have been given special accommodation when fasting for Ramadan; Orthodox Jewish boys won the right to wear their skullcaps when playing high school basketball; a Jehovah's Witness won the right to unemployment compensation after he quit rather than working to produce tank turrets; a Mormon acting student won the right to refuse roles involving nudity or profanity; and in the most controversial recent case, a family-owned business with religious objections to paying for abortion-inducing drugs persuaded the Supreme Court that the government should make those contraceptives available without forcing them to be involved.

In all these cases, courts or agencies came to the conclusion that religious exercise could be accommodated with little or no harm to the public interest or to others. As Justice Sandra Day O'Connor (joined by liberal lions Brennan, Marshall, and Blackmun) wrote: "courts have been quite capable of applying our free exercise jurisprudence to strike sensible balances between religious liberty and competing state interests." Employment Division v. Smith (1990) (concurring opinion).

At a time when the Supreme Court's same-sex marriage decision has allowed many millions of Americans to live their lives in accordance with their own identity, it would be tragic if we turned our backs on the right to live in accordance with our religious conviction, which is also part of who we are. A robust protection for free exercise of religion is not only part of the American tradition, it is vital to our protection for diversity and freedom.

Freedom of Speech and the Press

"Congress shall make no law . . . abridging the freedom of speech, or of the press." What does this mean today? Generally speaking, it means that the government may not jail, fine, or impose civil liability on people or organizations based on what they say or write, except in exceptional circumstances.

Although the First Amendment says "Congress," the Supreme Court has held that speakers are protected against all government agencies and officials: federal, state, and local, and legislative, executive, or judicial. The First Amendment does not protect speakers, however, against private individuals or organizations, such as private employers, private colleges, or private landowners. The First Amendment restrains only the government.

The Supreme Court has interpreted "speech" and "press" broadly as covering not only talking, writing, and printing, but also broadcasting, using the Internet, and other forms of expression. The freedom of speech also applies to symbolic expression, such as displaying flags, burning flags, wearing armbands, burning crosses, and the like.

The Supreme Court has held that restrictions on speech because of its content (that is, when the government targets the speaker's message) generally violate the First Amendment. Laws that prohibit people from criticizing a war, opposing abortion, or advocating high taxes are examples of unconstitutional content-based restrictions. Such laws are thought to be especially problematic because they distort public debate and contradict a basic principle of self-governance: that the government cannot be trusted to decide what ideas or information the people should be allowed to hear.

There are generally three situations in which the government can constitutionally restrict speech under a less demanding standard.

The rest is here:
First Amendment – National Constitution Center


Annotation 6 – First Amendment – FindLaw

 Misc  Comments Off on Annotation 6 – First Amendment – FindLaw
Oct 282015
 

FREEDOM OF EXPRESSION–SPEECH AND PRESS

Adoption and the Common Law Background

Madison’s version of the speech and press clauses, introduced in the House of Representatives on June 8, 1789, provided: ”The people shall not be deprived or abridged of their right to speak, to write, or to publish their sentiments; and the freedom of the press, as one of the great bulwarks of liberty, shall be inviolable.”1 The special committee rewrote the language to some extent, adding other provisions from Madison’s draft, to make it read: ”The freedom of speech and of the press, and the right of the people peaceably to assemble and consult for their common good, and to apply to the Government for redress of grievances, shall not be infringed.”2 In this form it went to the Senate, which rewrote it to read: ”That Congress shall make no law abridging the freedom of speech, or of the press, or the right of the people peaceably to assemble and consult for their common good, and to petition the government for a redress of grievances.”3 Subsequently, the religion clauses and these clauses were combined by the Senate.4 The final language was agreed upon in conference.

Debate in the House is unenlightening with regard to the meaning the Members ascribed to the speech and press clause and there is no record of debate in the Senate.5 In the course of debate, Madison warned against the dangers which would arise ”from discussing and proposing abstract propositions, of which the judgment may not be convinced. I venture to say, that if we confine ourselves to an enumeration of simple, acknowledged principles, the ratification will meet with but little difficulty.”6 That the ”simple, acknowledged principles” embodied in the First Amendment have occasioned controversy without end both in the courts and out should alert one to the difficulties latent in such spare language. Insofar as there is likely to have been a consensus, it was no doubt the common law view as expressed by Blackstone. ”The liberty of the press is indeed essential to the nature of a free state; but this consists in laying no previous restraints upon publications, and not in freedom from censure for criminal matter when published. Every freeman has an undoubted right to lay what sentiments he pleases before the public; to forbid this, is to destroy the freedom of the press: but if he publishes what is improper, mischievous, or illegal, he must take the consequences of his own temerity. To subject the press to the restrictive power of a licenser, as was formerly done, both before and since the Revolution, is to subject all freedom of sentiment to the prejudices of one man, and make him the arbitrary and infallible judge of all controverted points in learning, religion and government. But to punish as the law does at present any dangerous or offensive writings, which, when published, shall on a fair and impartial trial be adjudged of a pernicious tendency, is necessary for the preservation of peace and good order, of government and religion, the only solid foundations of civil liberty. Thus, the will of individuals is still left free: the abuse only of that free will is the object of legal punishment. Neither is any restraint hereby laid upon freedom of thought or inquiry; liberty of private sentiment is still left; the disseminating, or making public, of bad sentiments, destructive to the ends of society, is the crime which society corrects.”7

Whatever the general unanimity on this proposition at the time of the proposal of and ratification of the First Amendment,8 it appears that there emerged in the course of the Jeffersonian counterattack on the Sedition Act9 and the use by the Adams Administration of the Act to prosecute its political opponents,10 something of a libertarian theory of freedom of speech and press,11 which, however much the Jeffersonians may have departed from it upon assuming power,12 was to blossom into the theory undergirding Supreme Court First Amendment jurisprudence in modern times. Full acceptance of the theory that the Amendment operates not only to bar most prior restraints of expression but subsequent punishment of all but a narrow range of expression, in political discourse and indeed in all fields of expression, dates from a quite recent period, although the Court’s movement toward that position began in its consideration of limitations on speech and press in the period following World War I.13 Thus, in 1907, Justice Holmes could observe that even if the Fourteenth Amendment embodied prohibitions similar to the First Amendment, ”still we should be far from the conclusion that the plaintiff in error would have us reach. In the first place, the main purpose of such constitutional provisions is ‘to prevent all such previous restraints upon publications as had been practiced by other governments,’ and they do not prevent the subsequent punishment of such as may be deemed contrary to the public welfare . . . . The preliminary freedom extends as well to the false as to the true; the subsequent punishment may extend as well to the true as to the false. This was the law of criminal libel apart from statute in most cases, if not in all.”14 But as Justice Holmes also observed, ”[t]here is no constitutional right to have all general propositions of law once adopted remain unchanged.”15

But in Schenck v. United States,16 the first of the post-World War I cases to reach the Court, Justice Holmes, in the opinion of the Court, while upholding convictions for violating the Espionage Act by attempting to cause insubordination in the military service by circulation of leaflets, suggested First Amendment restraints on subsequent punishment as well as prior restraint. ”It well may be that the prohibition of laws abridging the freedom of speech is not confined to previous restraints although to prevent them may have been the main purpose . . . . We admit that in many places and in ordinary times the defendants in saying all that was said in the circular would have been within their constitutional rights. But the character of every act depends upon the circumstances in which it is done. The most stringent protection of free speech would not protect a man in falsely shouting fire in a theater and causing a panic. . . . The question in every case is whether the words used are used in such circumstances and are of such a nature as to create a clear and present danger that they will bring about the substantive evils that Congress has a right to prevent.” Justice Holmes along with Justice Brandeis soon went into dissent in their views that the majority of the Court was misapplying the legal standards thus expressed to uphold suppression of speech which offered no threat of danger to organized institutions.17 But it was with the Court’s assumption that the Fourteenth Amendment restrained the power of the States to suppress speech and press that the doctrines developed.18 At first, Holmes and Brandeis remained in dissent, but in Fiske v. Kansas,19 the Court sustained a First Amendment type of claim in a state case, and in Stromberg v. California,20 a state law was voided on grounds of its interference with free speech.21 State common law was also voided, the Court in an opinion by Justice Black asserting that the First Amendment enlarged protections for speech, press, and religion beyond those enjoyed under English common law.22 Development over the years since has been uneven, but by 1964 the Court could say with unanimity: ”we consider this case against the background of a profound national commitment to the principle that debate on public issues should be uninhibited, robust, and wide-open, and that it may well include vehement, caustic and sometimes unpleasantly sharp attacks on government and public officials.”23 And in 1969, it was said that the cases ”have fashioned the principle that the constitutional guarantees of free speech and free press do not permit a State to forbid or proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action.”24 This development and its myriad applications are elaborated in the following sections. The First Amendment by its terms applies only to laws enacted by Congress, and not to the actions of private persons. Supp.15 This leads to a ”state action” (or ”governmental action”) limitation similar to that applicable to the Fourteenth Amendment. Supp.16 The limitation has seldom been litigated in the First Amendment context, but there is no obvious reason why analysis should differ markedly from Fourteenth Amendment state action analysis.
Both contexts require ”cautious analysis of the quality and degree of Government relationship to the particular acts in question.” Supp.17 In holding that the National Railroad Passenger Corporation (Amtrak) is a governmental entity for purposes of the First Amendment, the Court declared that ”[t]he Constitution constrains governmental action ‘by whatever instruments or in whatever modes that action may be taken.’. . . [a]nd under whatever congressional label.”Supp.18 The relationship of the government to broadcast licensees affords other opportunities to explore the breadth of ”governmental action.”Supp.19

Footnotes

[Footnote 1] 1 Annals of Congress 434 (1789). Madison had also proposed language limiting the power of the States in a number of respects, including a guarantee of freedom of the press, Id. at 435. Although passed by the House, the amendment was defeated by the Senate, supra, p.957.

[Footnote 2] Id. at 731 (August 15, 1789).

[Footnote 3] The Bill of Rights: A Documentary History 1148-49 (B. Schwartz ed. 1971).

[Footnote 4] Id. at 1153.

[Footnote 5] The House debate insofar as it touched upon this amendment was concerned almost exclusively with a motion to strike the right to assemble and an amendment to add a right of the people to instruct their Representatives. 1 Annals of Congress 731-49 (August 15, 1789). There are no records of debates in the States on ratification.

[Footnote 6] Id. at 738.

[Footnote 7] 4 W. Blackstone’s Commentaries on the Laws of England 151-52 (T. Cooley 2d rev. ed. 1872). See 3 J. Story, Commentaries on the Constitution of the United States 1874-86 (Boston: 1833). The most comprehensive effort to assess theory and practice in the period prior to and immediately following adoption of the Amendment is L. Levy, Legacy of Suppression: Freedom of Speech and Press in Early American History (1960), which generally concluded that the Blackstonian view was the prevailing one at the time and probably the understanding of those who drafted, voted for, and ratified the Amendment.

[Footnote 8] It would appear that Madison advanced libertarian views earlier than his Jeffersonian compatriots, as witness his leadership of a move to refuse officially to concur in Washington’s condemnation of ”[c]ertain self-created societies,” by which the President meant political clubs supporting the French Revolution, and his success in deflecting the Federalist intention to censure such societies. I. Brant, James Madison–Father of the Constitution 1787-1800, 416-20 (1950). ”If we advert to the nature of republican government,” Madison told the House, ”we shall find that the censorial power is in the people over the government, and not in the government over the people.” 4 Annals of Congress 934 (1794). On the other hand, the early Madison, while a member of his county’s committee on public safety, had enthusiastically promoted prosecution of Loyalist speakers and the burning of their pamphlets during the Revolutionary period. 1 Papers of James Madison 147, 161-62, 190-92 (W. Hutchinson & W. Rachal eds. 1962). There seems little doubt that Jefferson held to the Blackstonian view. Writing to Madison in 1788, he said: ”A declaration that the federal government will never restrain the presses from printing anything they please, will not take away the liability of the printers for false facts printed.” 13 Papers of Thomas Jefferson 442 (J. Boyd ed. 1955). Commenting a year later to Madison on his proposed amendment, Jefferson suggested that the free speech-free press clause might read something like: ”The people shall not be deprived or abridged of their right to speak, to write or otherwise to publish anything but false facts affecting injuriously the life, liberty, property, or reputation of others or affecting the peace of the confederacy with foreign nations.” 15 Papers, supra, at 367.

[Footnote 9] The Act, Ch. 74, 1 Stat. 596 (1798), punished anyone who would ”write, print, utter or publish . . . any false, scandalous and malicious writing or writings against the government of the United States, or either house of the Congress of the United States, or the President of the United States, with intent to defame the said government, or either house of the said Congress, or the said President, or to bring them, or either of them, into contempt or disrepute.” See J. Smith, Freedom’s Fetters–The Alien and Sedition Laws and American Civil Liberties (1956).

[Footnote 10] Id. at 159 et seq.

[Footnote 11] L. Levy, Legacy of Suppression: Freedom of Speech and Press in Early American History, ch. 6 (Cambridge, 1960); New York Times Co. v. Sullivan, 376 U.S. 254, 273-76 (1964). But compare L. Levy, Emergence of a Free Press (1985), a revised and enlarged edition of Legacy of Suppression, in which Professor Levy modifies his earlier views, arguing that while the intention of the Framers to outlaw the crime of seditious libel, in pursuit of a free speech principle, cannot be established and may not have been the goal, there was a tradition of robust and rowdy expression during the period of the framing that contradicts his prior view that a modern theory of free expression did not begin to emerge until the debate over the Alien and Sedition Acts.

[Footnote 12] L. Levy, Jefferson and Civil Liberties–The Darker Side (Cambridge, 1963). Thus President Jefferson wrote to Governor McKean of Pennsylvania in 1803: ”The federalists having failed in destroying freedom of the press by their gag-law, seem to have attacked it in an opposite direction; that is, by pushing its licentiousness and its lying to such a degree of prostitution as to deprive it of all credit. . . . This is a dangerous state of things, and the press ought to be restored to its credibility if possible. The restraints provided by the laws of the States are sufficient for this if applied. And I have, therefore, long thought that a few prosecutions of the most prominent offenders would have a wholesome effect in restoring the integrity of the presses. Not a general prosecution, for that would look like persecution; but a selected one.” 9 Works of Thomas Jefferson 449 (P. Ford, ed. 1905).

[Footnote 13] New York Times Co. v. Sullivan, 376 U.S. 254 (1964), provides the principal doctrinal justification for the development, although the results had long since been fully applied by the Court. In Sullivan, Justice Brennan discerned in the controversies over the Sedition Act a crystallization of ”a national awareness of the central meaning of the First Amendment,” id. at 273, which is that the ”right of free public discussion of the stewardship of public officials . . . [is] a fundamental principle of the American form of government.” Id. at 275. This ”central meaning” proscribes either civil or criminal punishment for any but the most maliciously, knowingly false criticism of government. ”Although the Sedition Act was never tested in this Court, the attack upon its validity has carried the day in the court of history. . . . [The historical record] reflect[s] a broad consensus that the Act, because of the restraint it imposed upon criticism of government and public officials, was inconsistent with the First Amendment.” Id. at 276. Madison’s Virginia Resolutions of 1798 and his Report in support of them brought together and expressed the theories being developed by the Jeffersonians and represent a solid doctrinal foundation for the point of view that the First Amendment superseded the common law on speech and press, that a free, popular government cannot be libeled, and that the First Amendment absolutely protects speech and press. 6 Writings of James Madison, 341-406 (G. Hunt. ed. 1908).

[Footnote 14] Patterson v. Colorado, 205 U.S. 454, 462 (1907) (emphasis original). Justice Frankfurter had similar views in 1951: ”The historic antecedents of the First Amendment preclude the notion that its purpose was to give unqualified immunity to every expression that touched on matters within the range of political interest. . . . ‘The law is perfectly well settled,’ this Court said over fifty years ago, ‘that the first ten amendments to the Constitution, commonly known as the Bill of Rights, were not intended to lay down any novel principles of government, but simply to embody certain guaranties and immunities which we had inherited from our English ancestors, and which had from time immemorial been subject to certain well-recognized exceptions arising from the necessities of the case. In incorporating these principles into the fundamental law there was no intention of disregarding the exceptions, which continued to be recognized as if they had been formally expressed.’ That this represents the authentic view of the Bill of Rights and the spirit in which it must be construed has been recognized again and again in cases that have come here within the last fifty years.” Dennis v. United States, 341 U.S. 494, 521-522, 524 (1951) (concurring opinion). The internal quotation is from Robertson v. Baldwin, 165 U.S. 275, 281 (1897).

[Footnote 15] Patterson v. Colorado, 205 U.S. 454, 461 (1907).

[Footnote 16] 249 U.S. 47, 51-52 (1919) (citations omitted).

[Footnote 17] Debs v. United States, 249 U.S. 211 (1919); Abrams v. United States, 250 U.S. 616 (1919); Schaefer v. United States, 251 U.S. 466 (1920); Pierce v. United States, 252 U.S. 239 (1920); United States ex rel. Milwaukee Social Democratic Pub. Co. v. Burleson, 255 U.S. 407 (1921). A state statute similar to the federal one was upheld in Gilbert v. Minnesota, 254 U.S. 325 (1920).

[Footnote 18] Gitlow v. New York, 268 U.S. 652 (1925); Whitney v. California, 274 U.S. 357 (1927). The Brandeis and Holmes dissents in both cases were important formulations of speech and press principles.

[Footnote 19] 274 U.S. 380 (1927).

[Footnote 20] 283 U.S. 359 (1931). By contrast, it was not until 1965 that a federal statute was held unconstitutional under the First Amendment. Lamont v. Postmaster General, 381 U.S. 301 (1965). See also United States v. Robel, 389 U.S. 258 (1967).

[Footnote 21] And see Near v. Minnesota ex rel. Olson, 283 U.S. 697 (1931); Herndon v. Lowry, 301 U.S. 242 (1937); De Jonge v. Oregon, 299 U.S. 353 (1937); Lovell v. Griffin, 303 U.S. 444 (1938).

[Footnote 22] Bridges v. California, 314 U.S. 252, 263-68 (1941) (overturning contempt convictions of newspaper editor and others for publishing commentary on pending cases).

[Footnote 23] New York Times Co. v. Sullivan, 376 U.S. 254, 270 (1964).

[Footnote 24] Brandenburg v. Ohio, 395 U.S. 444, 447 (1969).

[Footnote 15 (1996 Supplement)] Through interpretation of the Fourteenth Amendment, the prohibition extends to the States as well. See discussion on incorporation, main text, pp. 957-64.

[Footnote 16 (1996 Supplement)] See discussion on state action, main text, pp.1786-1802.

[Footnote 17 (1996 Supplement)] CBS v. Democratic Nat’l Comm., 412 U.S. 94, 115 (1973) (opinion of Chief Justice Burger).

[Footnote 18 (1996 Supplement)] Lebron v. National R.R. Passenger Corp., 115 S. Ct. 961, 971 (1995) (quoting Ex parte Virginia, 100 U.S. 339, 346-47 (1880)). The Court refused to be bound by the statement in Amtrak’s authorizing statute that the corporation is ”not . . . an agency or establishment of the United States Government.” This assertion can be effective ”only for purposes of matters that are within Congress’ control,” the Court explained. ”It is not for Congress to make the final determination of Amtrak’s status as a governmental entity for purposes of determining the constitutional rights of citizens affected by its actions.” 115 S. Ct. at 971.

[Footnote 19 (1996 Supplement)] In CBS v. Democratic Nat’l Comm., 412 U.S. 94 (1973), the Court held that a broadcast licensee could refuse to carry a paid editorial advertisement. Chief Justice Burger, joined only by Justices Stewart and Rehnquist in that portion of his opinion, reasoned that a licensee’s refusal to accept such an ad did not constitute ”governmental action” for purposes of the First Amendment. ”The First Amendment does not reach acts of private parties in every instance where the Congress or the [Federal Communications] Commission has merely permitted or failed to prohibit such acts.” Id. at 119.

Visit link:
Annotation 6 – First Amendment – FindLaw

First Amendment – Text, Origins, and Meaning

 Misc  Comments Off on First Amendment – Text, Origins, and Meaning
Oct 232015
 

Text of Amendment: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances.”

Origins of the First Amendment

The founding father most concerned–some might say obsessed–with free speech and free religious exercise was Thomas Jefferson, who had already implemented several similar protections in the constitution of his home state of Virginia. It was Jefferson who ultimately persuaded James Madison to propose the Bill of Rights, and the First Amendment was Jefferson’s top priority.

The first clause in the First Amendment–“Congress shall make no law respecting an establishment of religion”–is generally referred to as the establishment clause. It is the establishment clause that grants “separation of church and state,” preventing–for example–a government-funded Church of the United States from coming into being.

The second clause in the First Amendment–“or prohibiting the free exercise thereof”–protects freedom of religion. Religious persecution was for all practical purposes universal during the 18th century, and in the already religiously diverse United States there was immense pressure to guarantee that the U.S. government would not require uniformity of belief.

Congress is also prohibited from passing laws “abridging the freedom of speech.” What free speech means, exactly, has varied from era to era. It is noteworthy that within ten years of the Bill of Rights’ ratification, President John Adams signed into law an act specifically written to restrict the free speech of supporters of Adams’ political opponent, Thomas Jefferson.

During the 18th century, pamphleteers such as Thomas Paine were subject to persecution for publishing unpopular opinions. The freedom of press clause makes it clear that the First Amendment is meant to protect not only freedom to speak, but also freedom to publish and distribute speech.

The “right of the people peaceably to assemble” was frequently violated by the British in the years leading up to the American Revolution, as efforts were made to ensure that radical colonists would not be able to foment a revolutionary movement. The Bill of Rights, written as it was by revolutionaries, was intended to prevent the government from restricting future social movements.

View original post here:
First Amendment – Text, Origins, and Meaning


Second Amendment – Conservapedia

 Second Amendment  Comments Off on Second Amendment – Conservapedia
Oct 192015
 

See also gun control.

The Second Amendment to the United States Constitution states:[1] "A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed."

For several decades, the lower federal courts had interpreted the Second Amendment as protecting merely a collective right of state militias.[2] However, the U.S. Supreme Court has always called it an individual right. The 2008 Supreme Court decision of District of Columbia v. Heller ruled 5-4 that the Second Amendment protects an individual right.

In 1786, the United States existed as a loose national government under the Articles of Confederation. This confederation was perceived to have several weaknesses, among which was the inability to mount a Federal military response to an armed uprising in western Massachusetts known as Shays’ Rebellion.

In 1787, to address these weaknesses, the Constitutional Convention was held with the idea of amending the Articles. When the convention ended with a proposed Constitution, those who debated the ratification of the Constitution divided into two camps: the Federalists (who supported ratification of the Constitution) and the Anti-Federalists (who opposed it).

Among their objections to the Constitution, Anti-Federalists feared a standing army that could eventually endanger democracy and civil liberties. Although the Anti-Federalists were unsuccessful at blocking ratification of the Constitution, through the Massachusetts Compromise they ensured that a Bill of Rights would be made, which would provide constitutional guarantees against taking away certain rights.

One of those rights was the right to bear arms. This was intended to prevent the Federal Government from taking away the ability of the states to raise an army and defend themselves, and arguably to prevent it from taking away from individuals the ability to bear arms.

The meaning of this amendment is controversial with respect to gun control.

The National Rifle Association, which supports gun rights, has a stone plaque in front of its headquarters bearing the words “The right of the people to keep and bear arms shall not be infringed.” The slogan means that individual citizens have the right to own and use guns.

American law has always said that the militia includes ordinary private citizens, and gun rights advocates say that the amendment means individuals have the right to own and use guns. Gun control advocates began in the late 20th century to argue that the amendment protects only some sort of collective or state-controlled right.

Supreme Court opinions have all been consistent with the individual rights interpretation of the Second Amendment, but the lower court opinions are mixed.

As of 2007, people argued about the meaning of the Second Amendment without a definitive answer. The latest ruling at that time was Parker v. District of Columbia, in which the D.C. Circuit Court of Appeals ruled on March 9, 2007 that the D.C. gun ban violated individual rights under the Second Amendment.

The One Comma vs. The Three Comma Debate

"A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed."

Quoted from: http://www.freerepublic.com/forum/a39388c210c1b.htm

Down to the Last Second (Amendment)

Participants in the various debates on firearms, crime, and constitutional law may have noticed that the Second Amendment is often quoted differently by those involved. The two main variations differ in punctuation; specifically, in the number of commas used to separate those twenty-seven words. But which is the correct one? The answer to this question must be found in official records from the early days of the republic. Therefore, a look into the progression of this declaratory and restrictive clause from its inception to its final form is in order.

Before beginning, one must note that common nouns, like "state" and "people," were often capitalized in official and unofficial documents of the era. Also, an obsolete formation of the letter s, used to indicate the long s sound, was in common usage. Because the long ‘s’ is easily confused with the lower case ‘f’, “Congress” sometimes appears as “Congrefs,” as is the case in the parchment copy of the Bill of Rights displayed by the National Archives. The quotations listed here are accurate. With the exception of the omission of quotation marks, versions of what is now known as the Second Amendment in boldface appear with the exact spelling, capitalization, and punctuation as the cited originals.

During ratification debates on the Constitution in the state conventions, several states proposed amendments to that charter. Anti-Federalist opposition to ratification was particularly strong in the key states of New York and Virginia, and one of their main grievances was that the Constitution lacked a declaration of rights. During the ratification process, Federalist James Madison became a champion of such a declaration, and so it fell to him, as a member of the 1st Congress, to write one. On June 8, 1789, Madison introduced his declaration of rights on the floor of the House. One of its articles read:

The right of the people to keep and bear arms shall not be infringed; a well armed and well regulated militia being the best security of a free country: but no person religiously scrupulous of bearing arms shall be compelled to render military service in person.1

On July 21, John Vining of Delaware was appointed to chair a select committee of eleven to review, and make a report on, the subject of amendments to the Constitution. Each committeeman represented one of the eleven states (Rhode Island and North Carolina had not ratified the Constitution at that time), with James Madison representing Virginia. Unfortunately, no record of the committee’s proceedings is known to exist. Seven days later, Vining duly issued the report, one of the amendments reading:

A well regulated militia, composed of the body of the people, being the best security of a free State, the right of the people to keep and bear arms shall not be infringed, but no person religiously scrupulous shall be compelled to bear arms. 2

In debates on the House floor, some congressmen, notably Elbridge Gerry of Massachusetts and Thomas Scott of Pennsylvania, objected to the conscientious objector clause in the fifth article. They expressed concerns that a future Congress might declare the people religiously scrupulous in a bid to disarm them, and that such persons could not be called up for military duty. However, motions to strike the clause were not carried. On August 21, the House enumerated the Amendments as modified, with the fifth article listed as follows:

5. A well regulated militia, composed of the body of the People, being the best security of a free State, the right of the People to keep and bear arms shall not be infringed; but no one religiously scrupulous of bearing arms, shall be compelled to render military service in person. 3

Finally, on August 24, the House of Representatives passed its proposals for amendments to the Constitution and sent them to the Senate for their consideration. The next day, the Senate entered the document into their official journal. The Senate Journal shows Article the Fifth as:

Art. V. A well regulated militia, composed of the body of the people, being the best security of a free state, the right of the people to keep and bear arms, shall not be infringed, but no one religiously scrupulous of bearing arms shall be compelled to render military service in person. 4

On September 4, the Senate debated the amendments proposed by the House, and the conscientious objector clause was quickly stricken. Sadly, these debates were held in secret, so records of them do not exist. The Senators agreed to accept Article the Fifth in this form:

…a well regulated militia, being the best security of a free state, the right of the people to keep and bear arms, shall not be infringed. 5

In further debates on September 9, the Senate agreed to strike the words, “the best,” and replace them with, “necessary to the.” Since the third and fourth articles had been combined, the Senators also agreed to rename the amendment as Article the Fourth. The Senate Journal that day carried the article without the word, “best,” but also without the replacements, “necessary to.” Note that the extraneous commas have been omitted:

A well regulated militia being the security of a free state, the right of the people to keep and bear arms shall not be infringed. 6

With two-thirds of the Senate concurring on the proposed amendments, they were sent back to the House for the Representatives’ perusal. On September 21, the House notified the Senate that it agreed to some of their amendments, but not all of them. However, they agreed to Article the Fourth in its entirety:

Resolved, That this House doth agree to the second, fourth, eighth, twelfth, thirteenth, sixteenth, eighteenth, nineteenth, twenty-fifth, and twenty-sixth amendments… 7

By September 25, the Congress had resolved all differences pertaining to the proposed amendments to the Constitution. On that day, a Clerk of the House, William Lambert, put what is now known as the Bill of Rights to parchment. Three days later, it was signed by the Speaker of the House, Frederick Augustus Muhlenberg, and the President of the Senate, Vice President John Adams. This parchment copy is held by the National Archives and Records Administration, and shows the following version of the fourth article:

Article the Fourth. A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed. 8

The above version is used almost exclusively today, but aside from the parchment copy, the author was unable to find any other official documents from that era which carry the amendment with the extra commas. In fact, in the appendix of the Senate Journal, Article the Fourth is entered as reading:

Art. IV. A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed.9

Also, the Annals of Congress, formally called The Debates and Proceedings in the Congress of the United States, show the proposed amendment as follows:

Article the Fourth. A well regulated militia being necessary to the security of a free State, the right of the People to keep and bear arms shall not be infringed.10

Further, once two-thirds of both chambers of the Congress agreed to the proposed amendments, the House passed a resolve to request that the President send copies of them to the governors of the eleven states in the Union, and to those of Rhode Island and North Carolina. The Senate concurred on September 26, as recorded in their journal:

Resolved by the Senate and House of Representatives of the United States of America in Congress assembled, That the President of the United States be requested to transmit to the executives of the United States, which have ratified the constitution copies of the amendments proposed by Congress, to be added thereto; and like copies to the executives of the states of Rhode Island and North Carolina.11

Fortunately, an original copy of the amendments proposed by the Congress, and sent to the State of Rhode Island and the Providence Plantations, does survive. Certified as a true copy by Assembly Secretary Henry Ward, it reads in part:

Article the Fourth, –A well regulated Militia being neceffary to the Security of a free State, the Right of the People to keep and bear Arms fhall not be infringed. 12

And so, the proposed amendments to the Constitution were sent to the states for ratification. When notifying the President that their legislatures or conventions had ratified some or all of the proposed amendments, some states attached certified copies of them. New York, Maryland, South Carolina, and Rhode Island notified the general government that they had ratified the fourth amendment in this form:

Article the Fourth. A well regulated militia being necessary to the security of a free State, the right of the People to keep and bear arms shall not be infringed. 13

Articles the First and Second were not ratified by the required three-fourths of the states, but by December 15, 1791, the last ten articles were. These, of course, are now known as the Bill of Rights. Renumbering the amendments was required since the first two had not been ratified. The 1796 revision of The Federalist on the New Constitution reflects the change as such:

ARTICLE THE SECOND

A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed.14

This version is carried throughout the 19th Century, in such legal treatises as Joseph Story’s Commentaries on the Constitution of the United States (1833) and Thomas Cooley’s Principles of Constitutional Law (1898). It is also transcribed in this manner in the 1845 Statutes at Large, although the term “state” is capitalized in that text. The latter are the official source for acts of Congress.15, 16, 17

This version still appears today, as is the case with the Senate-sponsored annotated version of the Constitution on the Government Printing Office web site (1992, supplemented in 1996 and 1998). The Second Amendment is shown as reading:

A well regulated Militia being necessary to the security of a free State, the right of the people to keep and bear Arms shall not be infringed. 18

(The Senate-sponsored GPO site does carry a “literal print” of the amendments to the Constitution showing the Second Amendment with the additional commas. The punctuation and capitalization of the amendments transcribed there are the same as those found on the parchment copy displayed in the Rotunda of the National Archives.)19

Thus, the correct rendition of the Second Amendment carries but a single comma, after the word “state.” It was in this form that those twenty-seven words were written, agreed upon, passed, and ratified.

Why the Commas are Important

It is important to quote the Second Amendment in its proper form because, as originally written, it is clear and grammatically sound. The function of the words “a well regulated militia being necessary to the security of a free state” is readily discerned when the proper punctuation is used, whereas the gratuitous addition of commas serves only to render the sentence grammatically incorrect and needlessly ambiguous. These points will be demonstrated later in the Second Amendment Series.

Footnotes to Comment section:

1. Amendments Offered in Congress by James Madison, June 8, 1789. The Constitution Society. http://www.constitution.org/bor/amd_jmad.htm, 16 January 2000.

2. Amendments Reported by the Select Committee. July 28, 1789. The Constitution Society. http://www.constitution.org/bor/amd_scom.htm, 16 January 2000.

3. U.S. House Journal. 1st Cong., 1st sess., 21 August 1789.

4. U.S. Senate Journal. 1st Cong., 1st sess., 25 August 1789.

5. U.S. Senate Journal. 1st Cong., 1st sess., 4 September 1789.

6. U.S. Senate Journal. 1st Cong., 1st sess., 9 September 1789.

7. U.S. House Journal. 1st Cong., 1st sess., 21 September 1789.

8. Bill of Rights. National Archives and Records Administration. http://www.nara.gov/exhall/charters/billrights/bill.jpg, 22 January 2000.

9. U.S. Senate Journal. 1st Cong., 1st sess., Appendix.

10. Annals of Congress. 1st Cong., 1st sess., Appendix.

11. U.S. Senate Journal. 1st Cong., 1st sess., 26 September 1789.

12. A True Bill. The Constitution for the United States, Its Sources and Its Applications. http://www.nidlink.com/~bobhard/billofrt.jpg, 27 January 2000.

13. U.S. House Journal. 1st Cong., 3rd sess., Appendix. Note: Maryland and South Carolina capitalized the “m” in “Militia.”

14. The Federalist on the New Constitution, 1796. The Constitution for the United States, Its Sources and Its Applications. http://www.nidlink.com/~bobhard/f16b1234.jpg, 17 February 2000.

15. Commentaries on the Constitution of the United States. The Constitution Society. http://www.constitution.org/js/js_344.htm, 18 February 2000.

16. Quotes from Constitutional Commentators. Gun Cite. http://www.guncite.com/gc2ndcom.html, 2 February 2000.

17. Statutes at Large 1845, 21.

18. Second Amendment–Bearing Arms. The Constitution of the United States of America. http://www.access.gpo.gov/congress/senate/constitution/amdt2.html, 18 February 2000.

19. Text of the Amendments (Literal Print). The Constitution of the United States of America. http://www.access.gpo.gov/congress/senate/constitution/conamt.html, 18 February 2000.

Liberals have made various efforts to subvert the Second Amendment by enacting unconstitutional gun laws that restrict the ability of individuals to protect themselves against the excesses of government.

See also the list of celebrities against the Second Amendment.


See the article here:
Second Amendment – Conservapedia
