Michael Beasley Reviews Hitchens Book, “God Is Not Great”

Michael John Beasley speaks on atheist Christopher Hitchens’ completely shallow views of God and Christianity from his book God Is Not Great: How Religion Poisons Everything.

I want to deal a bit with Hitchens’s worldview, which drives a book that uses bad thinking to get an emotional response (i.e., propaganda). Here Hitchens hat-tips Karl Marx by saying that this [Marx’s Manifesto] was “…OUR first attempt at philosophy, just as it was OUR first attempt at healthcare, cosmology… astronomy, and so on….”

Here is a wonderful documentary (15 parts; they will load automatically) about the Marxist/Leninist philosophy. Before watching the documentary, consider this by a former leader in the ’60s communist movement here in the States:

…To transform society, you need the power of the state; it is the only way their future can be achieved. That is why they are willing to follow the marching orders of a party that can control the state, and that is why they want to advance its fortunes. The Democrats’ perennial campaign message — Republicans are conducting a war on minorities, women, working Americans, and the poor — rests on the central idea that unites progressives behind the party: We are for equality, they are against it.

The reasoning behind such behavior was revealed by Leon Trotsky when he explained why he would not leave the Bolshevik party even after Stalin — who would eventually murder him — became its absolute leader: “We can only be right with and by the Party,” Trotsky said, “for history has provided no other way of being in the right.” “If the Party adopts a decision which one or other of us thinks unjust, he will say, just or unjust, it is my party, and I shall support the consequences of the decision to the end.”

Non-Bolsheviks may not share Trotsky’s metaphysical certitude, but they will recognize the principle. If the cause is about changing the world and there is only one party that can acquire the means to do it, then even though it may be wrong on this or that matter, its fortunes must be advanced and its power defended. This commitment is magnified when the opposition party is viewed as the enemy of the noble cause. If Republicans are seen as the party of privilege at war with minorities, women, and the poor, then their ideas are not only wrong but evil. As President Obama’s political mentor, Saul Alinsky, put it in Rules for Radicals: “One acts decisively only in the conviction that all of the angels are on one side and the devils are on the other.”

Here is another statement from Rules for Radicals: “We are always moral and our enemies always immoral.” The issue is never the issue. The issue is always the immorality of the opposition, of conservatives and Republicans. If they are perceived as immoral and indecent, their policies and arguments can be dismissed, and even those constituencies that are non-political or “low-information” can be mobilized to do battle against an evil party. In 1996 Senator Bob Dole — a moderate Republican and deal-maker — ran for president against the incumbent, Bill Clinton. At the time, Dick Morris was Clinton’s political adviser. As they were heading into the election campaign, Clinton — a centrist Democrat — told Morris, “You have to understand, Dick, Bob Dole is evil.” That is how even centrist Democrats view the political battle.

Because Democrats and progressives regard politics as a battle of good versus evil, their focus is not on policies that work and ideas that make sense, but on what will make their party win. Demonizing the opposition is one answer; unity is another. If we are divided, we will fail, and that means evil will triumph…

(National Review)

Enjoy this tour of worldviews:

“Paternalistic” Racism of White Liberals (Real White Privilege)

(See also the middle video here about “buying votes.”) The below is a great example of how this “paternalism” has destroyed the black family and left the black community blaming scapegoats. Note, however, that this leftist… sorry… Leftist Progressive ideology IS the white privilege you hear about. But it is masked in such a way that those on the Left bolster it while railing against it. It is akin to the atheist, without a ground for morals, borrowing from the theistic worldview to paint an act as morally wrong… when there is no ontological grounding for him to say such things. Unless God exists, that is.

Dr. Christina Greer, a professor of Political Science at Fordham University said that the US is “a patriarchal, white supremacist country” on Thursday’s Nightly Show With Larry Wilmore on Comedy Central. (Breitbart)

Discover the Networks has this wonderful article, and as much as the above political science professor is a disgrace… the excerpt below is mainly geared toward the “slavery” remarks the host made on his way out of the segment:

The rise of the welfare state in the 1960s contributed greatly to the demise of the black family as a stable institution. The out-of-wedlock birth rate among African Americans today is 73%, three times higher than it was prior to the War on Poverty. Children raised in fatherless homes are far more likely to grow up poor and to eventually engage in criminal behavior, than their peers who are raised in two-parent homes. In 2010, blacks (approximately 13% of the U.S. population) accounted for 48.7% of all arrests for homicide, 31.8% of arrests for forcible rape, 33.5% of arrests for aggravated assault, and 55% of arrests for robbery. Also as of 2010, the black poverty rate was 27.4% (about 3 times higher than the white rate), meaning that 11.5 million blacks in the U.S. were living in poverty.

When President Lyndon Johnson in 1964 launched the so-called War on Poverty, which enacted an unprecedented amount of antipoverty legislation and added many new layers to the American welfare state, he explained that his objective was to reduce dependency, “break the cycle of poverty,” and make “taxpayers out of tax eaters.” Johnson further claimed that his programs would bring to an end the “conditions that breed despair and violence,” those being “ignorance, discrimination, slums, poverty, disease, not enough jobs.” Of particular concern to Johnson was the disproportionately high rate of black poverty. In a famous June 1965 speech, the president suggested that the problems plaguing black Americans could not be solved by self-help: “You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line in a race and then say, ‘you are free to compete with all the others,’” said Johnson.

Thus began an unprecedented commitment of federal funds to a wide range of measures aimed at redistributing wealth in the United States.[1]  From 1965 to 2008, nearly $16 trillion of taxpayer money (in constant 2008 dollars) was spent on means-tested welfare programs for the poor.

The economic milieu in which the War on Poverty arose is noteworthy. As of 1965, the number of Americans living below the official poverty line had been declining continuously since the beginning of the decade and was only about half of what it had been fifteen years earlier. Between 1950 and 1965, the proportion of people whose earnings put them below the poverty level, had decreased by more than 30%. The black poverty rate had been cut nearly in half between 1940 and 1960. In various skilled trades during the period of 1936-59, the incomes of blacks relative to whites had more than doubled. Further, the representation of blacks in professional and other high-level occupations grew more quickly during the five years preceding the launch of the War on Poverty than during the five years thereafter.

Despite these trends, the welfare state expanded dramatically after LBJ’s statement. Between the mid-Sixties and the mid-Seventies, the dollar value of public housing quintupled and the amount spent on food stamps rose more than tenfold. From 1965 to 1969, government-provided benefits increased by a factor of 8; by 1974 such benefits were an astounding 20 times higher than they had been in 1965. Also as of 1974, federal spending on social-welfare programs amounted to 16% of America’s Gross National Product, a far cry from the 8% figure of 1960. By 1977 the number of people receiving public assistance had more than doubled since 1960.

The most devastating by-product of the mushrooming welfare state was the corrosive effect it had (along with powerful cultural phenomena such as the feminist and Black Power movements) on American family life, particularly in the black community. As provisions in welfare laws offered ever-increasing economic incentives for shunning marriage and avoiding the formation of two-parent families, illegitimacy rates rose dramatically.

For the next few decades, means-tested welfare programs such as food stamps, public housing, Medicaid, day care, and Temporary Assistance to Needy Families penalized marriage. A mother generally received far more money from welfare if she was single rather than married. Once she took a husband, her benefits were instantly reduced by roughly 10 to 20 percent. As a Cato Institute study noted, welfare programs for the poor incentivize the very behaviors that are most likely to perpetuate poverty.[2]  Another Cato report observes:

“Of course women do not get pregnant just to get welfare benefits…. But, by removing the economic consequences of out-of-wedlock birth, welfare has removed a major incentive to avoid such pregnancies. A teenager looking around at her friends and neighbors is liable to see several who have given birth out-of-wedlock. When she sees that they have suffered few visible consequences … she is less inclined to modify her own behavior to prevent pregnancy…. Current welfare policies seem to be designed with an appalling lack of concern for their impact on out-of-wedlock births. Indeed, Medicaid programs in 11 states actually provide infertility treatments to single women on welfare.”

The marriage penalties that are embedded in welfare programs can be particularly severe if a woman on public assistance weds a man who is employed in a low-paying job. As a FamilyScholars.org report puts it: “When a couple’s income nears the limits prescribed by Medicaid, a few extra dollars in income cause thousands of dollars in benefits to be lost. What all of this means is that the two most important routes out of poverty—marriage and work—are heavily taxed under the current U.S. system.”[3]

The aforementioned FamilyScholars.org report adds that “such a system encourages surreptitious cohabitation,” where “many low-income parents will cohabit without reporting it to the government so that their benefits won’t be cut.” These couples “avoid marriage because marriage would result in a substantial loss of income for the family.”

A 2011 study conducted jointly by the Institute for American Values’ Center for Marriage and Families and the University of Virginia’s National Marriage Project suggests that “the rise of cohabiting households with children is the largest unrecognized threat to the quality and stability of children’s family lives.” The researchers conclude that cohabiting relationships are highly prone to instability, and that children in such homes are consequently less likely to thrive, more likely to be abused, and more prone to suffering “serious emotional problems.”…

…read it all…


[1] Hoover Institution senior fellow Thomas Sowell writes: “Never had there been such a comprehensive program to tackle poverty at its roots, to offer more opportunities to those starting out in life, to rehabilitate those who had fallen by the wayside, and to make dependent people self-supporting…. The War on Poverty represented the crowning triumph of the liberal vision of society—and of government programs as the solution to social problems.”

[2] For instance, “a 1 percent increase in the welfare-dependent population in a state increases the number of births to single mothers by about 0.5 percent,” and “an increase in AFDC benefits by 1 percent of average income increases the number of births to single mothers by about 2.1 percent.”

[3] The marriage penalties that are embedded in welfare programs can be particularly severe if a woman on public assistance weds a man who is employed in a low-paying job. Consider the hypothetical case, as outlined in May 2006 by Urban Institute senior fellow Eugene Steuerle, of a single mother with two children who earns $15,000 and enjoys an Earned Income Tax Credit (EITC) benefit of approximately $4,100. If she marries a man earning $10,000, thereby boosting the total household income to $25,000, the EITC benefit, which decreases incrementally for every dollar a married couple earns above a certain level, would drop precipitously to $2,200. Similarly, consider the case (also outlined by Eugene Steuerle in May 2006) of a mother of two children who earns $20,000 and thus qualifies for Medicaid. If she marries someone earning just $6,000, resulting in a combined household income of $26,000, her children’s Medicaid benefits are cut off entirely.

Thomas Sowell talks about how the black family was intact, and rising economically, until the advent of the welfare state:

The black family, which had survived centuries of slavery and discrimination, began rapidly disintegrating in the liberal welfare state that subsidized unwed pregnancy and changed welfare from an emergency rescue to a way of life.

Government social programs such as the War on Poverty were considered a way to reduce urban riots. Such programs increased sharply during the 1960s. So did urban riots. Later, during the Reagan administration, which was denounced for not promoting social programs, there were far fewer urban riots.

Neither the media nor most of our educational institutions question the assumptions behind the War on Poverty. Even conservatives often attribute much of the progress that has been made by lower-income people to these programs.

For example, the usually insightful quarterly magazine City Journal says in its current issue: “Beginning in the mid-sixties, the condition of most black Americans improved markedly.”

That is completely false and misleading.

The economic rise of blacks began decades earlier, before any of the legislation and policies that are credited with producing that rise. The continuation of the rise of blacks out of poverty did not — repeat, did not — accelerate during the 1960s.

The poverty rate among black families fell from 87 percent in 1940 to 47 percent in 1960, during an era of virtually no major civil rights legislation or anti-poverty programs. It dropped another 17 percentage points during the decade of the 1960s and one percentage point during the 1970s, but this continuation of the previous trend was neither unprecedented nor something to be arbitrarily attributed to the programs like the War on Poverty.

In various skilled trades, the incomes of blacks relative to whites more than doubled between 1936 and 1959 — that is, before the magic 1960s decade when supposedly all progress began. The rise of blacks in professional and other high-level occupations was greater in the five years preceding the Civil Rights Act of 1964 than in the five years afterwards.

While some good things did come out of the 1960s, as out of many other decades, so did major social disasters that continue to plague us today. Many of those disasters began quite clearly during the 1960s.

Man With The Golden Arm ~ Eliminating Chance Statistically

This is a large excerpt from a book worth reading in full, The Design Inference, by William Dembski. It is a classic of the I.D. literature, whether or not you are a skeptic of Intelligent Design:

1.2 THE MAN WITH THE GOLDEN ARM

[p. 9>] Even if we can’t ascertain the precise causal story underlying an event, we often have probabilistic information that enables us to rule out ways of explaining the event. This ruling out of explanatory options is what the design inference is all about. The design inference does not by itself deliver an intelligent agent. But as a logical apparatus for sifting our explanatory options, the design inference rules out explanations incompatible with intelligent agency (such as chance). The design inference appears widely, and is memorably illustrated in the following example (New York Times, 23 July 1985, p. B1):

TRENTON, July 22 — The New Jersey Supreme Court today caught up with the “man with the golden arm,” Nicholas Caputo, the Essex County Clerk and a Democrat who has conducted drawings for decades that have given Democrats the top ballot line in the county 40 out of 41 times.

[p. 10>] Mary V. Mochary, the Republican Senate candidate, and county Republican officials filed a suit after Mr. Caputo pulled the Democrat’s name again last year.

The election is over — Mrs. Mochary lost — and the point is moot. But the court noted that the chances of picking the same name 40 out of 41 times were less than 1 in 50 billion. It said that “confronted with these odds, few persons of reason will accept the explanation of blind chance.”

And, while the court said it was not accusing Mr. Caputo of anything, it said it believed that election officials have a duty to strengthen public confidence in the election process after such a string of “coincidences.”

The court suggested — but did not order — changes in the way Mr. Caputo conducts the drawings to stem “further loss of public confidence in the integrity of the electoral process.”

Justice Robert L. Clifford, while concurring with the 6-to-0 ruling, said the guidelines should have been ordered instead of suggested.

Nicholas Caputo was brought before the New Jersey Supreme Court because the Republican party filed suit against him, claiming Caputo had consistently rigged the ballot lines in the New Jersey county where he was county clerk. It is common knowledge that first position on a ballot increases one’s chances of winning an election (other things being equal, voters are more likely to vote for the first person on a ballot than the rest). Since in every instance but one Caputo positioned the Democrats first on the ballot line, the Republicans argued that in selecting the order of ballots Caputo had intentionally favored his own Democratic party. In short, the Republicans claimed Caputo cheated.

The question, then, before the New Jersey Supreme Court was, Did Caputo actually rig the order of ballots, or was it without malice and forethought that Caputo assigned the Democrats first place forty out of forty-one times? Since Caputo denied wrongdoing, and since he conducted the drawing of ballots so that witnesses were unable to observe how he actually did draw the ballots (this was brought out in a portion of the article omitted in the preceding quote), determining whether Caputo did in fact rig the order of ballots becomes a matter of evaluating the circumstantial evidence connected with this case. How, then, is this evidence to be evaluated?

In trying to explain the remarkable coincidence of Nicholas Caputo selecting the Democrats forty out of forty-one times to head the ballot line, the court faced three explanatory options: [p. 11>]

Regularity: Unknown to Caputo, he was not employing a reliable random process to determine ballot order. Caputo was like someone who thinks a fair coin is being flipped when in fact it’s a double-headed coin. Just as flipping a double-headed coin is going to yield a long string of heads, so Caputo, using his faulty method for ballot selection, generated a long string of Democrats coming out on top. An unknown regularity controlled Caputo’s ballot line selections.

Chance: In selecting the order of political parties on the state ballot, Caputo employed a reliable random process that did not favor one political party over another. The fact that the Democrats came out on top forty out of forty-one times was simply a fluke. It occurred by chance.

Agency: Caputo, acting as a fully conscious intelligent agent and intending to aid his own political party, purposely rigged the ballot line selections to keep the Democrats coming out on top. In short, Caputo cheated.

The first option — that Caputo chose poorly his procedure for selecting ballot lines, so that instead of genuinely randomizing the ballot order, it just kept putting the Democrats on top — was not taken seriously by the court. The court could dismiss this option outright because Caputo claimed to be using an urn model to select ballot lines. Thus, in a portion of the New York Times article not quoted, Caputo claimed to have placed capsules designating the various political parties running in New Jersey into a container, and then swished them around. Since urn models are among the most reliable randomization techniques available, there was no reason for the court to suspect that Caputo’s randomization procedure was at fault. The key question, therefore, was whether Caputo actually put this procedure into practice when he made the ballot line selections, or whether he purposely circumvented this procedure to keep the Democrats coming out on top. And since Caputo’s actual drawing of the capsules was obscured to witnesses, it was this question the court had to answer.

With the regularity explanation at least for the moment bracketed, the court next decided to dispense with the chance explanation. Having noted that the chance of picking the same political party 40 out of 41 times was less than 1 in 50 billion, the court concluded that [p. 12>] “confronted with these odds, few persons of reason will accept the explanation of blind chance.” Now this certainly seems right. Nevertheless, a bit more needs to be said. As we saw in Section 1.1, exceeding improbability is by itself not enough to preclude an event from happening by chance. Whenever I am dealt a bridge hand, I participate in an exceedingly improbable event. Whenever I play darts, the precise position where the darts land represents an exceedingly improbable configuration. In fact, just about anything that happens is exceedingly improbable once we factor in all the other ways what actually happened might have happened. The problem, then, does not reside simply in an event being improbable.

All the same, in the absence of a causal story detailing what happened, improbability remains a crucial ingredient in eliminating chance. For suppose that Caputo actually was cheating right from the beginning of his career as Essex County clerk. Suppose further that the one exception in Caputo’s career as “the man with the golden arm” — that is, the one case where Caputo placed the Democrats second on the ballot line — did not occur till after his third time selecting ballot lines. Thus, for the first three ballot line selections of Caputo’s career the Democrats all came out on top, and they came out on top precisely because Caputo intended it that way. Simply on the basis of three ballot line selections, and without direct evidence of Caputo’s cheating, an outside observer would be in no position to decide whether Caputo was cheating or selecting the ballots honestly.

With only three ballot line selections, the probabilities are too large to reliably eliminate chance. The probability of randomly selecting the Democrats to come out on top given that their only competition is the Republicans is in this case 1 in 8 (here p equals 0.125; compare this with the p-value computed by the court, which equals 0.00000000002). Because three Democrats in a row could easily happen by chance, we would be acting in bad faith if we did not give Caputo the benefit of the doubt in the face of such large probabilities. Small probabilities are therefore a necessary condition for eliminating chance, even though they are not a sufficient condition.
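(A quick aside from me, not from Dembski’s text: for anyone who wants to check the arithmetic, here is a minimal Python sketch that reproduces both figures cited above, assuming each ballot drawing is modeled as an independent, fair 50/50 pick between the two parties, which is just the chance hypothesis under discussion.)

```python
from math import comb

# Three straight Democrat picks under a fair 50/50 draw: (1/2)^3
p_three_in_a_row = 0.5 ** 3
print(p_three_in_a_row)  # 0.125, i.e., 1 in 8

# Forty or more Democrat picks in 41 fair draws:
# 41 ways to get exactly forty, plus the single all-Democrat sequence, out of 2^41 sequences.
p_forty_of_41 = (comb(41, 40) + comb(41, 41)) / 2 ** 41
print(f"{p_forty_of_41:.1e}")  # ~1.9e-11, roughly the court's "less than 1 in 50 billion"
```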

What, then, besides small probabilities do we need for evidence that Caputo cheated? As we saw in Section 1.1, the event in question needs to conform to a pattern. Not just any pattern will do, however. Some patterns successfully eliminate chance while others do not. [p. 13>] Consider the case of an archer. Suppose an archer stands fifty meters from a large wall with bow and arrow in hand. The wall, let us say, is sufficiently large that the archer cannot help but hit it. Now suppose every time the archer shoots an arrow at the wall, she paints a target around the arrow, so that the arrow is positioned squarely in the bull’s-eye. What can be concluded from this scenario? Absolutely nothing about the archer’s ability as an archer. The fact that the archer is in each instance squarely hitting the bull’s-eye is utterly bogus. Yes, she is matching a pattern; but it is a pattern she fixes only after the arrow has been shot and its position located. The pattern is thus purely ad hoc.

But suppose instead that the archer paints a fixed target on the wall and then shoots at it. Suppose she shoots 100 arrows, and each time hits a perfect bull’s-eye. What can be concluded from this second scenario? In the words of the New Jersey Supreme Court, “confronted with these odds, few persons of reason will accept the explanation of blind chance.” Indeed, confronted with this second scenario we infer that here is a world-class archer.

The difference between the first and the second scenario is that the pattern in the first is purely ad hoc, whereas the pattern in the second is not. Thus, only in the second scenario are we warranted eliminating chance. Let me emphasize that for now I am only spotlighting a distinction without explicating it. I shall in due course explicate the distinction between “good” and “bad” patterns — those that respectively do and don’t permit us to eliminate chance (see Chapter 5). But for now I am simply trying to make the distinction between good and bad patterns appear plausible. In Section 1.1 we called the good patterns specifications and the bad patterns fabrications. Specifications are the non-ad hoc patterns that can legitimately be used to eliminate chance and warrant a design inference. Fabrications are the ad hoc patterns that cannot legitimately be used to eliminate chance.

Thus, when the archer first paints a fixed target and thereafter shoots at it, she specifies hitting a bull’s-eye. When in fact she repeatedly hits the bull’s-eye, we are warranted attributing her success not to beginner’s luck, but to her skill as an archer. On the other hand, when the archer paints a target around the arrow only after each shot, squarely positioning each arrow in the bull’s-eye, she fabricates hitting the bull’s-eye. Thus, even though she repeatedly hits the [p. 14>] bull’s-eye, we are not warranted attributing her “success” in hitting the bull’s-eye to anything other than luck. In the latter scenario, her skill as an archer thus remains an open question.2 (jump)

How do these considerations apply to Nicholas Caputo? By selecting the Democrats to head the ballot forty out of forty-one times, Caputo appears to have participated in an event of probability less than 1 in 50 billion (p = 0.00000000002). Yet as we have noted, events of exceedingly small probability happen all the time. Hence by itself Caputo’s participation in an event of probability less than 1 in 50 billion is no cause for alarm. The crucial question is whether this event is also specified — does this event follow a non-ad hoc pattern so that we can legitimately eliminate chance?

Now there is a very simple way to avoid ad hoc patterns and generate specifications, and that is by designating an event prior to its occurrence — C. S. Peirce (1883 [1955], pp. 207-10) referred to this type of specification as a predesignation. In the archer example, by painting the bull’s-eye before taking aim, the archer specifies in advance where the arrows are to land. Because the pattern is set prior to the event, the objection of ad-hocness or fabrication is effectively blocked.

In the Caputo case, however, the pattern is discovered after the event: only after we witness an extended series of ballot line selections do we notice a suspicious pattern. Though discovered after the fact, this pattern is not a fabrication. Patterns given prior to an event, or Peirce’s predesignations, constitute but a proper subset of the patterns that legitimately eliminate chance. The important thing about a pattern is not when it was identified, but whether in a certain well-defined sense it is independent of an event. We refer to this relation of independence as detachability, and say that a pattern is detachable just in case it satisfies this relation.

[p. 15>] Detachability distinguishes specifications from fabrications. Although a precise account of detachability will have to wait until Chapter 5, the basic intuition underlying detachability is this: Given an event, would we be able to formulate a pattern describing it if we had no knowledge which event occurred? Here is the idea. An event has occurred. A pattern describing the event is given. The event is one from a range of possible events. If all we knew was the range of possible events without any specifics about which event actually occurred, could we still formulate the pattern describing the event? If so, the pattern is detachable from the event.

To illustrate detachability in the Caputo case, consider two possible courses Nicholas Caputo’s career as Essex County clerk might have taken (for simplicity assume no third-party candidates were ever involved, so that all elections were between Democrats and Republicans). In the one case — and for the sake of argument let us suppose this is what actually happened — Caputo chose the Democrats over the Republicans forty out of forty-one times in the following order:

(A)  DDDDDDDDDDDDDDDDDDDDDDRDDDDDDDDDDDDDDDDDD

Thus, the initial twenty-two times Caputo chose the Democrats to head the ballot line, then the twenty-third time he chose the Republicans, after which for the remaining times he chose the Democrats.

In the second possible course Caputo’s career as county clerk might have taken, suppose Caputo once again had forty-one occasions on which to select the order of ballots, but this time that he chose both Democrats and Republicans to head the ballot pretty evenly, let us say in the following order:

(B)   DRRDRDRRDDDRDRDDRDRRDRRDRRRDRRRDRDDDRDRDD

In this instance the Democrats came out on top only twenty times, and the Republicans twenty-one times.

Sequences (A) and (B) are both patterns and describe possible ways Caputo might have selected ballot orders in his years as Essex County clerk. (A) and (B) are therefore patterns describing possible events. Now the question detachability asks is whether (A) and (B) could have been formulated without our knowing which event occurred. For (A) the answer is yes, but for (B) the answer is no. (A) is therefore detachable whereas (B) is not.

[p. 16>] How is this distinction justified? To formulate (B) I just one moment ago flipped a coin forty-one times, recording “D” for Democrat whenever I observed heads and “R” for Republican whenever I observed tails. On the other hand, to formulate (A) I simply recorded “D” forty times and then interspersed a single “R.” Now consider a human subject S confronted with sequences (A) and (B). S comes to these sequences with considerable background knowledge which, we may suppose, includes the following:

(1) Nicholas Caputo is a Democrat.
(2) Nicholas Caputo would like to see the Democrats appear first on the ballot since having the first place on the ballot line significantly boosts one’s chances of winning an election.

(3) Nicholas Caputo, as election commissioner of Essex County, has full control over who appears first on the ballots in Essex County.
(4) Election commissioners in the past have been guilty of all manner of fraud, including unfair assignments of ballot lines.
(5) If Caputo were assigning ballot lines fairly, then both Democrats and Republicans should receive priority roughly the same number of times.

Given this background knowledge S is in a position to formulate various “cheating patterns” by which Caputo might attempt to give the Democrats first place on the ballot. The most blatant cheat is of course to assign the Democrats first place all the time. Next most blatant is to assign the Republicans first place just once (as in (A) — there are 41 ways to assign the Republicans first place just once). Slightly less blatant — though still blatant — is to assign the Republicans first place exactly two times (there are 820 ways to assign the Republicans first place exactly two times). This line of reasoning can be extended by throwing the Republicans a few additional sops. The point is, given S’s background knowledge, S is easily able (possibly with the aid of a personal computer) to formulate ways Caputo could cheat, one of which would surely include (A).
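(Another aside from me, not Dembski: the counts in this paragraph are ordinary binomial coefficients, and a few lines of Python will confirm them under the same fair-draw model.)

```python
from math import comb

n = 41  # ballot-line drawings

# Number of length-41 sequences in which the Republicans head the ballot exactly k times
for k in range(3):
    print(k, comb(n, k))  # 0 -> 1, 1 -> 41, 2 -> 820 (the counts cited above)

print(f"{2 ** n:,}")      # 2,199,023,255,552 total possible sequences, roughly two trillion
```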

Contrast this now with (B). Since (B) was generated by a sequence of coin tosses, (B) represents one of two trillion or so possible ways Caputo might legitimately have chosen ballot orders. True, in this respect probabilities do not distinguish (A) from (B) since all such sequences of Ds and Rs of length 41 have the same small probability of occurring by chance, namely 1 in 2^41, or approximately 1 in two [p. 17>] trillion. But S is a finite agent whose background knowledge enables S to formulate only a tiny fraction of all the possible sequences of Ds and Rs of length 41. Unlike (A), (B) is not among them. Confronted with (B), S will scrutinize it, try to discover a pattern that isn’t ad hoc, and thus seek to uncover evidence that (B) resulted from something other than chance. But given S’s background knowledge, nothing about (B) suggests an explanation other than chance. Indeed, since the relative frequency of Democrats to Republicans actually favors Republicans (twenty-one Rs versus twenty Ds), the Nicholas Caputo responsible for (B) is hardly “the man with the golden arm.” Thus, while (A) is detachable, (B) is not.
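(One last aside from me, not Dembski: a short Python sketch, under the same fair-draw assumption, makes the contrast concrete. Any single sequence is equally improbable, but the specification “Republicans first at most once,” which can be written down without seeing the outcome, is satisfied by (A) and not by (B), and that specification is itself astronomically improbable under chance.)

```python
from math import comb

# Sequences (A) and (B) as described in the text
A = "D" * 22 + "R" + "D" * 18  # Democrats first 40 of 41 times; Republicans only on the 23rd draw
B = "DRRDRDRRDDDRDRDDRDRRDRRDRRRDRRRDRDDDRDRDD"
n = 41

# Under fair 50/50 draws, every particular length-41 sequence has the same tiny probability.
print(f"{1 / 2 ** n:.1e}")  # ~4.5e-13 for (A) and for (B) alike

# The specification "Republicans first at most once" is statable in advance of any outcome.
def at_most_one_R(seq):
    return seq.count("R") <= 1

print(at_most_one_R(A), at_most_one_R(B))  # True False

# Probability that a fair process satisfies the specification: 42 sequences out of 2^41.
print(f"{(comb(n, 0) + comb(n, 1)) / 2 ** n:.1e}")  # ~1.9e-11
```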

But can one be absolutely certain (B) is not detachable? No, one cannot. There is a fundamental asymmetry between detachability and its negation, call it nondetachability. In practice one can decisively demonstrate that a pattern is detachable from an event, but not that a pattern is incapable of being detached from an event. A failure to establish detachability always leaves open the possibility that detachability might still be demonstrated at some later date.

To illustrate this point, suppose I walk down a dirt road and find some stones lying about. The configuration of stones says nothing to me. Given my background knowledge I can discover no pattern in this configuration that I could have formulated on my own without actually seeing the stones lying about as they do. I cannot detach the pattern of stones from the configuration they assume. I therefore have no reason to attribute the configuration to anything other than chance. But suppose next an astronomer travels this same road and looks at the same stones only to find that the configuration precisely matches some highly complex constellation. Given the astronomer’s background knowledge, this pattern now becomes detachable. The astronomer will therefore have grounds for thinking that the stones were intentionally arranged to match the constellation.

Detachability must always be relativized to a subject and a subject’s background knowledge. Whether one can detach a pattern from an event depends on one’s background knowledge coming to the event. Often one’s background knowledge is insufficient to detach a pattern from an event. Consider, for instance, the case of cryptographers trying to break a cryptosystem. Until they break the cryptosystem, the strings of characters they record from listening to their enemy’s communications will seem random, and for all the cryptographers know [p. 18>] might just be gibberish. Only after the cryptographers have broken the cryptosystem and discovered the key for decrypting their enemy’s communications will they discern the detachable pattern present in the communications they have been monitoring (cf. Section 1.6).

Is it, then, strictly because our background knowledge and abilities are limited that some patterns fail to be detachable? Would, for instance, an infinitely powerful computational device be capable of detaching any pattern whatsoever? Regardless whether some super-being possesses an unlimited capacity to detach patterns, as a practical matter we humans find ourselves with plenty of patterns we cannot detach. Whether all patterns are detachable in some grand metaphysical sense, therefore, has no bearing on the practical problem whether a certain pattern is detachable given certain limited background knowledge. Finite rational agents like ourselves can formulate only a very few detachable patterns. For instance, of all the possible ways we might flip a coin a thousand times, we can make explicit only a minuscule proportion. It follows that a human subject will be unable to specify any but a very tiny fraction of these possible coin flips. In general, the patterns we can know to be detachable are quite limited.(jump)

Let us now wrap up the Caputo example. Confronted with Nicholas Caputo assigning the Democrats the top ballot line forty out of forty-one times, the New Jersey Supreme Court first rejected the regularity explanation, and then rejected the chance explanation (“confronted with these odds, few persons of reason will accept the explanation of blind chance”). Left with no other option, the court therefore accepted the agency explanation, inferred Caputo was cheating, and threw him in prison.

Well, not quite. The court did refuse to attribute Caputo’s golden arm to either regularity or chance. Yet when it came to giving a positive explanation of Caputo’s golden arm, the court waffled. To be sure, the court knew something was amiss. For the Democrats to get the top ballot line in Caputo’s county forty out of forty-one times, especially [p. 19>] with Caputo solely responsible for ballot line selections, something had to be fishy. Nevertheless, the New Jersey Supreme Court was unwilling explicitly to charge Caputo with corruption. Of the six judges, Justice Robert L. Clifford was the most suspicious of Caputo, wanting to order Caputo to institute new guidelines for selecting ballot lines. The actual ruling, however, simply suggested that Caputo institute new guidelines in the interest of “public confidence in the integrity of the electoral process.” The court therefore stopped short of charging Caputo with dishonesty.

Did Caputo cheat? Certainly this is the best explanation of Caputo’s golden arm. Nonetheless, the court stopped short of convicting Caputo. Why? The court had no clear mandate for dealing with highly improbable ballot line selections. Such mandates exist in other legal settings, as with discrimination laws that prevent employers from attributing to the luck of the draw their failure to hire sufficiently many women or minorities. But in the absence of such a mandate the court needed an exact causal story of how Caputo cheated if the suit against him was to succeed. And since Caputo managed to obscure how he selected the ballot lines, no such causal story was forthcoming. The court therefore went as far as it could.

Implicit throughout the court’s deliberations was the design inference. The court wanted to determine whether Caputo cheated. Lacking a causal story of how Caputo selected the ballot lines, the court was left with circumstantial evidence. Given this evidence, the court immediately ruled out regularity. What’s more, from the specified improbability of selecting the Democrats forty out of forty-one times, the court also ruled out chance.

These two moves — ruling out regularity, and then ruling out chance — constitute the design inference. The conception of design that emerges from the design inference is therefore eliminative, asserting of an event what it is not, not what it is. To attribute an event to design is to say that regularity and chance have been ruled out. Referring Caputo’s ballot line selections to design is therefore not identical with referring it to agency. To be sure, design renders agency plausible. But as the negation of regularity and chance, design is a mode of explanation logically preliminary to agency. Certainly agency (in this case cheating) best explains Caputo’s ballot line selections. But no one was privy to Caputo’s ballot line selections. In the absence of [p. 20>] an exact causal story, the New Jersey Supreme Court therefore went as far as it could in the Caputo case.(jump)

[….]

1.4 FORENSIC SCIENCE AND DETECTION

[p. 22>] Forensic scientists, detectives, lawyers, and insurance fraud investigators cannot do without the design inference. Something as common as a forensic scientist placing someone at the scene of a crime by matching fingerprints requires a design inference. Indeed, there is no logical or genetic impossibility preventing two individuals from sharing the same fingerprints. Rather, our best understanding of fingerprints and the way they are distributed in the human population is that they are, with very high probability, unique to individuals. And so, whenever the fingerprints of an individual match those found at the scene of a crime, we conclude that the individual was indeed at the scene of the crime.

The forensic scientist’s stock of design inferences is continually increasing. Consider the following headline: “DNA Tests Becoming Elementary in Solving Crimes.” The lead article went on to describe [p. 23>] the type of reasoning employed by forensic scientists in DNA testing. As the following excerpt makes clear, all the key features of the design inference described in Sections 1.1 and 1.2 are present in DNA testing (The Times — Princeton-Metro, N.J., 23 May 1994, p. A 1):

TRENTON — A state police DNA testing program is expected to be ready in the fall, and prosecutors and police are eagerly looking forward to taking full advantage of a technology that has dramatically boosted the success rate of rape prosecutions across the country.

Mercer County Prosecutor Maryann Bielamowicz called the effect of DNA testing on rape cases “definitely a revolution. It’s the most exciting development in my career in our ability to prosecute.”

She remembered a recent case of a young man arrested for a series of three sexual assaults. The suspect had little prior criminal history, but the crimes were brutal knifepoint attacks in which the assailant broke in through a window, then tied up and terrorized his victims.

“Based on a DNA test in one of those assaults he pleaded guilty to all three. He got 60 years. He’ll have to serve 27½ before parole. That’s pretty good evidence,” she said.

All three women identified the young man. But what really intimidated the suspect into agreeing to such a rotten deal were the enormous odds —one in several million — that someone other than he left semen containing the particular genetic markers found in the DNA test. Similar numbers are intimidating many others into foregoing trials, said the prosecutor.6 (jump)

Not just forensic science, but the whole field of detection is inconceivable without the design inference. Indeed, the mystery genre would be dead without it.7 (jump) When in the movie Double Indemnity Edward G. Robinson (“the insurance claims man”) puts it together that Barbara Stanwyck’s husband did not die an accidental death by falling off a train, but instead was murdered by Stanwyck to [p. 24>] collect on a life insurance policy, the design inference is decisive. Why hadn’t Stanwyck’s husband made use of his life insurance policy earlier to pay off on a previously sustained injury, for the policy did have such a provision? Why should he die just two weeks after taking out the policy? Why did he happen to die on a train, thereby requiring the insurance company to pay double the usual indemnity (hence the title of the movie)? How could he have broken his neck falling off a train when at the time of the fall, the train could not have been moving faster than 15 m.p.h.? And who would seriously consider committing suicide by jumping off a train moving only 15 m.p.h.? Too many pieces coalescing too neatly made the explanations of accidental death and suicide insupportable. Thus, at one point Edward G. Robinson exclaims, “The pieces all fit together like a watch!” Suffice it to say, in the movie Barbara Stanwyck and her accomplice/lover Fred MacMurray did indeed kill Stanwyck’s husband.

Whenever there is a mystery, it is the design inference that elicits the crucial insight needed to solve the mystery. The dawning recognition that a trusted companion has all along been deceiving you (cf. Notorious); the suspicion that someone is alive after all, even though the most obvious indicators point to the person having died (cf. The Third Man); and the realization that a string of seemingly accidental deaths were carefully planned (cf. Coma) all derive from design inferences. At the heart of these inferences is a convergence of small probabilities and specifications, a convergence that cannot properly be explained by appealing to chance.


Notes


2) The archer example introduces a tripartite distinction that will be implicit throughout our study of chance elimination arguments: a reference class of possible events (e.g., the arrow hitting the wall at some unspecified place); a pattern that restricts the reference class of possible events (e.g., a target on the wall); and the precise event that has occurred (e.g., the arrow hitting the wall at some precise location). In a chance elimination argument, the reference class, the pattern, and the event are always inseparably linked, with the pattern mediating between the event and the reference class, helping to decide whether the event really is due to chance. Throughout this monograph we shall refer to patterns and events as such, but refer to reference classes by way of the chance hypotheses that characterize them (cf. Section 5.2). (back)

3) This conclusion is consistent with algorithmic information theory, which regards a sequence of numbers as nonrandom to the degree that it is compressible. Since compressibility within algorithmic information theory constitutes but a special case of detachability, and since most sequences are incompressible, the detachable sequences are indeed quite limited. See Kolmogorov (1965), Chaitin (1966), and van Lambalgen (1989). See also Section 1.7. (back)

4) Legal scholars continue to debate the proper application of probabilistic reasoning to legal problems. Larry Tribe (1971), for instance, views the application of Bayes’s theorem within the context of a trial as fundamentally unsound. Michael Finkelstein takes the opposite view (see Finkelstein, 1978, p. 288 ff.). Still, there appears no getting rid of the design inference in the law. Cases of bid-rigging (Finkelstein and Levin, 1990, p. 64), price-fixing (Finkelstein and Levenbach, 1986, pp. 79-106), and collusion often cannot be detected save by means of a design inference. (back)

[….]

6) It’s worth mentioning that at the time of this writing, the accuracy and usefulness of DNA testing is still a matter for debate. As a New York Times (23 August 1994, p. A10) article concerned with the currently ongoing O.J. Simpson case remarks, “there is wide disagreement among scientific experts about the accuracy and usefulness of DNA testing and they emphasize that only those tests performed under the best of circumstances are valuable.” My interest, however, in this matter is not with the ultimate fate of DNA testing, but with the logic that underlies it, a logic that hinges on the design inference. (back)

7) Cf. David Lehman’s (1989, p. 20) notion of “retrospective prophecy” as applied to the detective-fiction genre: “If mind-reading, backward-reasoning investigators of crimes — sleuths like Dupin or Sherlock Holmes — resemble prophets, it’s in the visionary rather than the vatic sense. It’s not that they see into the future; on the contrary, they’re not even looking that way. But reflecting on the clues left behind by the past, they see patterns where the rest of us see only random signs. They reveal and make intelligible what otherwise would be dark.” The design inference is the key that unlocks the patterns that “the rest of us see only as random signs.” (back)

William A. Dembski, The Design Inference: Eliminating Chance Through Small Probabilities (Cambridge, UK: Cambridge University Press, 1998), 9-20, 22-24.

“This is a Christian Holocaust” ~ And Obama Brings Up Crusades?

But… but… we shouldn’t forget the Crusades!

They’re massacring every Christian they see and we face extinction, the oldest Christians in the world. Early today I spoke with White House officials and I told them if they keep not doing anything to protect Christians around the world, they’re sentencing my people to death. And, they’re not acting, so we have to put the pressure on… The American government and the White House are not doing enough… This is a Christian holocaust. …The absence of leadership in the White House is leading to more and more Christians to be persecuted and beheaded. Our churches are being bombed. Nobody’s acting. This is, as we keep saying, a full-blown Christian genocide and Washington isn’t doing anything. This is a Christian holocaust.

(Gateway Pundit)

Do the Rich Pay Their Fair Share? (UCLA Professor Lee Ohanian)

Do the rich pay their fair share of taxes? It’s not a simple question. First of all, what do you mean by rich? And how much is fair? What are the rich, whoever they are, paying now? Is there any tax rate that would be unfair? UCLA Professor of Economics, Lee Ohanian, has some fascinating and unexpected answers.

Videos on Israel: BDS, Apartheid, History, etc. (UPDATED)

Here are almost all the videos and audios I have either posted on my blog or uploaded to my YouTube & Vimeo channels dealing with the Palestinian Conflict or Israel. New videos will be added at the top. UPDATES appear just underneath.

When the state of Israel was founded in 1948, it was done so with the approval of the United Nations. But today, Israel’s enemies routinely challenge the legitimacy of its very existence. So, under international law, who’s right? Israel? Or its enemies?


As Israel is under attack from Hamas in the Gaza strip and BDS — Boycott, Divestment and Sanctions — right here in America, Bill Whittle makes the historical and moral case for Israel, and shows just who, indeed, are the tyrants and aggressors in the Middle East.


In which our host, Andrew Klavan, points out that the leftists behind the current Boycott, Divestment and Sanction, or BDS, efforts against the State of Israel are not at all anti-semitic. They just hate Jews and want to kill them…


Israel is a vibrant democracy with full rights for women and gays, a free press and independent judiciary. You would think that the United Nations would celebrate such a country. Instead, the UN condemns Israel at every turn to the point of obsession. How did this happen? Anne Bayefsky, director of the Touro Institute on Human Rights, explains in five eye-opening minutes.


Are Israeli Settlements the Barrier to Peace?

Is Israel’s policy of building civilian communities in the West Bank the reason there’s no peace agreement with the Palestinians? Or would there still be no peace even if Israel removed all of its settlements and evicted Israeli settlers, as it did in Gaza in 2005? Renowned Harvard professor and legal scholar Alan Dershowitz explains.


Kenneth Meshoe – Black South African Member of Parliament, Talks About Israel and Apartheid


Is Israel an Apartheid State?


Kenneth Meshoe, Member of Parliament in South Africa, Talks Israel


The History of the Middle East Conflict in 11 Minutes:


Why Israel can’t withdraw to its pre-’67 border lines:


Larry Elder On Israel (Plus: “Son of Hamas”)


A Challenge To Medved on Jewish/Israeli History


Medved Makes Short Order of Some Tired Ol’ Mantras


BDS: The Attempt to Strangle Israel ~ Dershowitz


The Boycott, Divestment and Sanctions (BDS) movement says it’s fighting for Palestinian rights, but it’s really just trying to destroy Israel. Jonathan Sacks, author and former Chief Rabbi of the United Kingdom, explains how.


The Middle East Problem ~ Prager


UN vs Israel ~ Dr. Anne Bayefsky


Anti-Israeli Views at the NYT & the U.N. Exemplified ~ Prager


A Pro-Palestinian Call Taken ~ Prager


Caller Asks Dennis Prager About Palestinian History


Israel Palestinian Conflict: The Truth About the West Bank


ISRAEL VS THE WORLD!! ~ Crowder


Palestine Sucks ~ Crowder

Icons of Pluralism Examined: Elephants and Geography Mantras

John Piippo has this great insight from Dinesh D’Souza:

Dinesh D’Souza, in his new book Life After Death: The Evidence, talks about the genetic fallacy as used, he feels, by certain atheists. For example, it is a sociological fact that the statement Religious diversity exists is true. If you were born in India, as D’Souza was, you would most likely be a Hindu rather than a Christian or a Jew (as D’Souza was). While that sociological statement is true, its truth has (watch closely…) no logical relevance as regards statements such as The Hindu worldview is true, or Christian theism is true. D’Souza writes:

“The atheist is simply wrong to assume that religious diversity undermines the truth of religious claims… [T]he fact that you learned your Christianity because you grew up in the Bible Belt [does not] imply anything about whether those beliefs are true or false. The atheist is guilty here of what in logic is called the “genetic fallacy.” The term does not refer to genes; it refers to origins. Think of it this way. If you are raised in New York, you are more likely to believe in Einstein’s theory of relativity than if you are raised in New Guinea. Someone from Oxford, England, is more likely to be an atheist than someone from Oxford, Mississippi. The geographical roots of your beliefs have no bearing on the validity of your beliefs.” (38-39, emphasis mine)

  • [Dinesh D’Souza, Life After Death: The Evidence (Washington, DC: Regnery Publishing, 2009), 38-39.]

…and this from Theo-Sophical Ruminations:

Religious pluralists often claim that religious beliefs are culturally relative: the religion you adopt is determined by where you live, not the rationality/truth of the religion itself.  If you live in India you will probably be a Hindu; if you live in the U.S. you will probably be a Christian.  One’s personal religious beliefs are nothing more than a geographic accident, so we should not believe that our religion is true while others are not.

This argument is a double-edged sword.  If the religious pluralist had been born in Saudi Arabia he would have been a Muslim, and Muslims are religious particularists!  His pluralistic view of religion is dependent on his being born in 20th century Western society!

A more pointed critique of this argument, however, comes from the realm of logic.  The line of reasoning employed by the pluralist commits the genetic fallacy (invalidating a view based on how a person came to hold that view).  The fact of the matter is that the truth of a belief is independent of the influences that brought you to believe in it….

See more at Wintery Knight and Apologetics Index:

(Mainly from Paul Copan’s “True for You, But Not for Me“)

People have used the old parable of the blind men and the elephant to share their view that no one religion is the only route to God (pluralism). Pluralists believe that the road to God is wide. The opposite view is that only one religion is really true (exclusivism).

What could a thoughtful person say in response?

  • Just because there are many different religious answers and systems doesn’t automatically mean pluralism is correct.
  • Simply because there are many political alternatives in the world (monarchy, fascism, communism, democracy, etc.) doesn’t mean that someone growing up in the midst of them is unable to see that some forms of government are better than others.
  • That kind of evaluation isn’t arrogant or presumptuous. The same is true of grappling with religion.
  • The same line of reasoning applies to the pluralist himself. If the pluralist grew up in Madagascar or medieval France, he would not have been a pluralist!
  • If we are culturally conditioned regarding our religious beliefs, then why should the religious pluralist think his view is less arbitrary or conditioned than the exclusivist’s?
  • If Christian faith is true, then the Christian would be in a better position than the pluralist to assess the status of other religions.
  • How does the pluralist know he is correct? Even though he claims others don’t know Ultimate Reality as it really is, he implies that he does. (To say that the Ultimate Reality can’t be known is a statement of knowledge.)
  • If the Christian needs to justify Christianity’s claims, the pluralist’s views need just as much substantiation.

If we can’t know Reality as it really is, why think one exists at all? Why not simply try to explain religions as purely human or cultural manifestations without being anything more?

[….]

If you had been born in another country, is it at all likely that you would be a Christian?

Eric looks back at his family—devoutly Christian for four generations in Europe and America, twelve pastors among his relatives, an inner-city schoolteacher and Christian writer for parents—and readily acknowledges that his environment made it easy for him to become a Christian. Still, his faith was exposed to severe challenges as he rose to the top of his university class and as he lived in Asia as a college student. And he knows it took a conscious series of wrenching decisions in his teens and early adult years for him to choose to remain a Christian. Oddly, one of the biggest influences on his faith came from outside his culture through Chinese Christian friends.

John Hick has asserted that in the vast majority of cases, an individual’s religious beliefs will be the conditioned result of his geographical circumstances.1 Statistically speaking, Hick is correct. But what follows from that scenario? We saw in an earlier chapter that the bare fact that individuals hold different views about a thing doesn’t make relativism the inevitable conclusion. Similarly, the phenomenon of varying religious beliefs hardly entails religious pluralism. Before becoming a religious pluralist, an exclusivist has a few equally reasonable options:

  • One could continue to accept the religion one grew up with because it has the ring of truth.
  • One could reject the view one grew up with and become an adherent to a religion believed to be true.
  • One could opt to embrace a less demanding, more convenient religious view.
  • One could become a religious skeptic, concluding that, because the process of belief-formation is unreliable, no religion appears to really save.

Why should the view of pluralism be chosen instead of these other options?

An analogy from politics is helpful.2 As with the multiple religious alternatives in the world, there are many political alternatives—monarchy, Fascism, Marxism, or democracy. What if we tell a Marxist or a conservative Republican that if he had been raised in Nazi Germany, he would have belonged to the Hitler Youth? He will probably agree but ask what your point is. What is the point of this analogy? Just because a diversity of political options has existed in the history of the world doesn’t obstruct us from evaluating one political system as superior to its rivals. Just because there have been many political systems and we could have grown up in an alternate, inferior political system doesn’t mean we are arrogant for believing one is simply better.3

Furthermore, when a pluralist asks the question about cultural or religious conditioning, the same line of reasoning applies to the pluralist himself. The pluralist has been just as conditioned as his religious exclusivist counterparts have. Alvin Plantinga comments:

Pluralism isn’t and hasn’t been widely popular in the world at large; if the pluralist had been born in Madagascar, or medieval France, he probably wouldn’t have been a pluralist. Does it follow that he shouldn’t be a pluralist or that his pluralistic beliefs are produced in him by an unreliable belief-producing process? I doubt it.4

If all religions are culturally conditioned responses to the Real, can’t we say that someone like Hick himself has been culturally conditioned to hold a pluralistic view rather than that of an exclusivist? If that is the case, why should Hick’s view be any less arbitrary or accidental than another’s? Why should his perspective be taken as having any more authority than the orthodox Christian’s?

There is another problem: The exclusivist likely believes he has better basis for holding to his views than in becoming a religious pluralist; therefore he is not being arbitrary. John Hick holds that the religious exclusivist is arbitrary: “The arbitrariness of [the exclusivist position] is underlined by the consideration that in the vast majority of cases the religion to which a person adheres depends upon the accidents of birth.”5 But the exclusivist believes he is somehow justified in his position—perhaps the internal witness of the Holy Spirit or a conversion experience that has opened his eyes so that now he sees what his dissenters do not—even if he can’t argue against the views of others. Even if the exclusivist is mistaken, he can’t be accused of arbitrariness. Hick wouldn’t think of his own view as arbitrary, and he should not level this charge against the exclusivist.

A third problem emerges: How does the pluralist know that he is correct? Hick says that the Real is impossible to describe with human words; It transcends all language. But how does Hick know this? And what if the Real chose to disclose Itself to human beings in a particular form (i.e., religion) and not another? Why should the claims of that religion not be taken seriously?6 As Christians, who lay claim to the uniqueness of Christ, we are often challenged to justify this claim—and we rightly should. But the pluralist is also making an assertion that stands in just as much need of verification. He makes a claim about God, truth, the nature of reality. We ought to press the pluralist at this very point: “How do you know you are right? Furthermore, how do you know anything at all about the Ultimate Reality, since you think all human attempts to portray It are inadequate?”7

At this point we see cracks in Hick’s edifice.8 Although Hick claims to have drawn his conclusions about religion from the ground up, one wonders how he could arrive at an unknowable Ultimate Reality. In other words, if the Real is truly unknowable and if there is no common thread running through all the world religions so that we could formulate certain positive statements about It (like whether It is a personal being as opposed to an impersonal principle, monotheistic as opposed to polytheistic, or trinitary as opposed to unitary), then why bother positing Its existence at all? If all that the world religions know about God is what they perceive—not what they know of God as he really is—everything can be adequately explained through the human forms of religion. The Ultimate becomes utterly superfluous. And while It could exist, there is no good reason to think that It does. One could even ask Hick what prevents him from going one step further and saying that religion is wholly human.

Furthermore, when Hick begins at the level of human experience, this approach almost inevitably winds up treating all religions alike. The German theologian Wolfhart Pannenberg writes, “If everything comes down to human experiences, then the obvious conclusion is to treat them all on the same level.”9

In contrast to Hick, the Christian affirms that the knowledge of God depends on his gracious initiative to reveal himself.10 We read in Scripture that the natural order of creation (what we see) actually reveals the eternal power and nature of the unseen God. He has not left himself without a witness in the natural realm (Rom. 1:20; also Acts 14:15–18; 17:24–29; Ps. 19:1). God’s existence and an array of his attributes can be known through his effects. His fingerprints are all over the universe. The medieval theologian-philosopher Thomas Aquinas, for instance, argued in this way: “Hence the existence of God, insofar as it is not self-evident to us, can be demonstrated from those of His effects which are known to us.”11 What we know about God and an overarching moral law in light of his creation, in fact, means we are without excuse (Rom. 2:14–15). (We’ll say more about general revelation in Part IV.) So rather than dismissing the observable world as inadequate, why can’t we say that what we see in the world serves as a pointer toward God?

Thus there is a role for Christian apologetics to play in defending the rationality and plausibility of the Christian revelation.12 This role—especially in the face of conflicting worldviews—shouldn’t be underestimated.13 While Christians should be wary of furnishing arguments as “proofs,” which tend to imply a mathematical certainty, a modest and plausible defense of Christianity—carried out in dependence on God’s Spirit—often provides the mental evidence people need to pursue God with heart, soul, and mind.

Deflating “If You Grew Up in India, You’d Be a Hindu.”

The phenomenon of differing religious beliefs doesn’t automatically entail religious pluralism. There are other options.

Simply because there are many political alternatives in the world (monarchy, Fascism, communism, democracy, etc.) doesn’t mean someone growing up in the midst of them is unable to see that some forms of government are better than others. That kind of evaluation isn’t arrogant or presumptuous. The same is true of grappling with religion.

The same line of reasoning applies to the pluralist himself. If the pluralist grew up in Madagascar or medieval France, he would not have been a pluralist!

If we are culturally conditioned regarding our religious beliefs, then why should the religious pluralist think his view is less arbitrary or conditioned than the exclusivist’s?

If Christian faith is true, then the Christian would be in a better position than the pluralist to assess the status of other religions.

How does the pluralist know he is correct? Even though he claims that others don’t know Ultimate Reality as It really is, he implies that he does. (To say that the Ultimate Reality can’t be known is to make at least one statement of knowledge.)

If the Christian needs to justify Christianity’s claims, the pluralist’s views need just as much substantiation.

If we can’t know Reality as It really is, why think one exists at all? Why not simply try to explain religions as purely human or cultural manifestations without being anything more?


NOTES

1. An Interpretation of Religion, 2.

2. Van Inwagen, “Non Est Hick,” 213-214.

3. John Hick’s reply to this analogy is inadequate, thus leaving the traditional Christian view open to the charge of arrogance: “The Church’s claim is not about the relative merits of different political systems, but about the eternal fate of the entire human race” (“The Epistemological Challenge of Religious Pluralism,” Faith and Philosophy 14 [July 1997]: 282). Peter van Inwagen responds by saying that Hick’s accusation is irrelevant to the charge of arrogance. Whether in the political or religious realm, I still must figure out which beliefs to hold among a number of options. So if I adopt a certain set of beliefs, then “I have to believe that I and those who agree with me are right and that the rest of the world is wrong…. What hangs on one’s accepting a certain set of beliefs, or what follows from their truth, doesn’t enter into the question of whether it is arrogant to accept them” (“A Reply to Professor Hick,” Faith and Philosophy 14 [July 1997]: 299-300).

4. “Pluralism,” 23-24.

5. This citation is from a personal letter from John Hick to Alvin Plantinga. See Alvin Plantinga’s article, “Ad Hick,” Faith and Philosophy 14 (July 1997): 295. The critique of Hick in this paragraph is taken from Plantinga’s article in Faith and Philosophy (295-302).

6. D’Costa, “The Impossibility of a Pluralist View of Religions,” 229.

7. Hick has claimed that he does not know but merely presents a “hypothesis” (see his rather unilluminating essay “The Possibility of Religious Pluralism,” Religious Studies 33 [1997]: 161-166). However, his claims that exclusivism is “arbitrary” or has “morally or religiously revolting” consequences (in More Than One Way?, 246) betray his certainty.

8. This and the following paragraphs are based on Paul R. Eddy’s argument in “Religious Pluralism and the Divine,” 470-78.

9. “Religious Pluralism and Conflicting Truth Claims,” in Gavin D’Costa, ed., Christian Uniqueness Reconsidered: The Myth of a Pluralistic Theology of Religions (Maryknoll, N.Y.: Orbis Books, 1990), 102.

10. On this point, I draw much from D. A. Carson, The Gagging of God, 182-189. 

11. Summa Theologiae I.2.3c.

12. Two fine popular-level apologetics books are William Lane Craig, Reasonable Faith (Wheaton, Ill.: Crossway Books, 1994) and J. P. Moreland, Scaling the Secular City (Grand Rapids, Mich.: Baker Book House, 1987). A bit more rigorous but rewarding is Stuart C. Hackett, The Reconstruction of the Christian Revelation Claim (Grand Rapids, Mich.: Baker, 1984). Three other apologetics books worth noting are Peter Kreeft and Ronald K. Tacelli, Handbook of Christian Apologetics (Downers Grove, Ill.: InterVarsity Press, 1994); Norman Geisler, Christian Apologetics (Grand Rapids, Mich.: Baker, 1976); and Winfried Corduan, Reasonable Faith.

13. Some well-meaning Christians have minimized the place of Christian apologetics for a number of reasons. But their reasons, discussed by C. Stephen Evans, tend to be inadequate: (1) “Human reason has been damaged by sin,” but reason is not worthless, only defective. (2) “Trying to use general revelation is presumptuous”: Seeking to persuade a person with arguments from general revelation doesn’t assume unassisted and autonomous reason (after all, reason is a gift from God); any such approach ought to rely upon God–just as presenting the gospel message should. (3) “Natural revelation is unnecessary since special revelation is sufficient”: This argument wrongly assumes that God cannot use the world he created and the reason he gave us to interpret that creation to draw people to himself. (4) “The arguments for God’s existence aren’t very good”: The Christian apologist should recognize that God has made the world in such a way that if a person is looking for loopholes to avoid God’s existence, he may do so, but it is not due to a lack of evidence. It seems that God would permit evidence for his existence to be resistible and discountable so that humans do not look like utter nitwits if they reject God. There is more to belief than mere intellectual reasons; people often have moral reasons for rejecting God. (See Evans’ fine essay, “Apologetics in a New Key,” in Craig and McLeod, The Logic of Rational Theism, 65-75.)

An Example of Why the Genetic Fallacy Fails:

Skepticism about one’s own faith is a natural human tendency and should be encouraged in an environment where one feels safe. Before the larger post of studies below, though, I do wish another fallacy had been dealt with in the above video: the genetic fallacy (http://en.wikipedia.org/wiki/Genetic_fallacy). It was pointed out that some people are born in places where Christianity is the dominant philosophy, and so they are Christian. Others are born in places with a Hindu influence, a Buddhist influence, or, as in many parts of Europe, a secular influence. This, however, says nothing about whether a religious belief is true or false. I will give an example.

In the West we accept Einstein’s theory of relativity as a scientific fact (or close to one), and many other theories built on it have been shown to work with that assumption in mind. Fine: we are born into a culture that accepts this truth. Now, if you were born in Papua New Guinea, the general populace might reject it, since their culture as a whole is not steeped in this belief or in the scientific method. But that says nothing about the truth of Einstein’s theory. This is why the argument is a fallacy and should be rejected.

Evolutionary Assumptions (Carl F.H. Henry and G.A. Kerkut)

  • The FIRST QUOTE is Carl Henry (a Christian) quoting Dr. Kerkut (an evolutionist). The SECOND QUOTE is the raw, longer excerpt from Kerkut’s book itself.

What I am going to do is post a quote from one of Carl F. H. Henry’s books, then follow that quote with a larger quote from the source he uses. Context is king, and I love Dr. Henry’s source A LOT!

The numbers in Dr. Henry’s quote correspond to the same numbers in Kerkut’s concluding chapter, which follows.

[p. 182>] A. Kerkut emphasizes that all seven basic assumptions on which evolutionary theory rests are “by their nature… not capable of experimental verification” (Implications of Evolution, p. 7). (1) The assumption that “non-living things gave rise to living material… is still just an assumption” (ibid., p. 150). (2) The assumption that “biogenesis occurred only once… is a matter of belief rather than proof” (op. cit.). (3) The assumption that “Viruses, Bacteria, Protozoa and the higher animals were all interrelated” biologically as an evolutionary phenomenon lacks definite evidence (ibid., p. 151). (4) The assumption that “the Protozoa gave rise to the Metazoa” has no basis in definite knowledge (ibid., pp. 151 ff.). (5) The assumption that “the various invertebrate phyla are interrelated” depends on “tenuous and circumstantial” evidence and not on evidence that allows “a verdict of definite relationships” (ibid., pp. 152 f.). (6) The assumption that “the invertebrates gave rise to the vertebrates” turns on evidence gained by prior belief (ibid., p. 153). Although he finds “somewhat stronger ground” for assuming that “fish, amphibia, reptiles, birds and mammals are interrelated,” (7) Kerkut concedes that many key fossil transitions are “not well documented and we have as yet to obtain a satisfactory objective method of dating the fossils” (ibid., p. 153). “In effect, much of the evolution of the major groups of animals has to be taken on trust” (ibid., p. 154); “there are many discrete groups of animals and… we do not know how they have evolved nor how they are interrelated” (ibid., p. vii). In short, the theory that “all the living forms in the world have arisen from a single source which itself came from an inorganic form,” says Kerkut, has insufficiently strong evidential supports “to consider it as anything more than a working hypothesis” (ibid., p. 157). He thinks “premature and not satisfactorily supported by present-day evidence,” therefore, “the attempt to explain all living forms in terms of an evolution from a unique source,” that is, from a common ancestor (ibid., pp. vii f.).

[p. 183>] It is therefore understandable why commentators speak more and more of a crisis of evolutionary theory. Establishment science’s long regnant view that gradual development accounts for the solar system, earth, life and all else is in serious dispute. Not in many decades has so much doubt emerged among scientists about the so-called irrefutable evidence that evolution is what accounts for life on planet earth. Although it was still taught long thereafter in high schools, Ernst Haeckel’s “biogenetic law” that “ontogeny recapitulates phylogeny” had collapsed already in the late 1920s. The absence in recent texts of evolutionary charts depicting the common descent even of trees from a single form is noteworthy. Darwin’s insistence that nature makes no leaps, and that natural selection and chance adequately account for change in species, has lost credibility. Paleontologists and biologists are at odds over the significance of the fossil record, while gradualists and episodists disagree over the supposed tempo of evolution or whether the origin of species is consistent with microevolution or only with sudden gaps in the forms of life.

Gould, for example, opts for natural selection and, remarkably, combines it with saltation. He grants that “the fossil record does not support” the belief “in slow evolutionary change preached by most paleontologists” (and projected by Darwin); instead, “mass extinction and abrupt origination reign…. Gradualism is not exclusively valid (in fact, I regard it as rather rare). Natural selection contains no statement about rates. It can encompass rapid (geologically instantaneous) change by speciation in small populations as well as the conventional and immeasurably slow transformation of entire lineages” (Ever Since Darwin, p. 271). Natural selection here becomes an elastic phrase that can accommodate to everything while requiring no significant empirical attestation.

University of Glasgow scientists Chris Darnbrough, John Goddard and William S. Stevely indicate problem areas that beset evolutionary theory: “The experiments demonstrating the formation of a variety of organic molecules from presumptive prebiotic soups,” they write, “fall far short of providing a pathway for chemical evolution. Again, it is self-evident that the fossil record leaves much to be desired and few biologists recognize the dependence of the geological column on radiometric dating methods based on questionable assumptions about initial conditions. The whole history of evolutionary thought is littered with the debris of dubious assumptions and misinterpretations, especially in the area of fossil ‘hominids.’ To come up to date, protein and DNA sequence data, generally viewed as consistent with an evolutionary explanation of diversity, are invariably interpreted using methods which presuppose, but do not demonstrate evolutionary relationships, and which use criteria that are essentially functional and teleological. Finally, there is a collection of isolated fragmentary pieces of evidence which are usually dismissed as anecdotal because they are irreconcilable with the evolutionary model” (“American Creation” [correspondence], by Chris Darnbrough, John Goddard and William S. Stevely, Nature, pp. 95 f.).

From ongoing conflicts and readjustments it is apparent that there never [p. 184>] was nor is there now only one theory of evolution. Many nontheistic scholars, to be sure, insist that evolution is and has always been “a fact.” Laurie R. Godfrey affirms that “there is actually widespread agreement in scientific circles that the evidence overwhelmingly supports evolutionism” and quotes Gould as saying that “none of the current controversy within evolutionary theory should give any comfort, not the slightest iota, to any creationists” (“The Flood of Antievolution,” pp. 5-10, p. 10). If, as Godfrey insists, even the most sweeping revisions and reversals of scientific theory ought to be viewed not as weaknesses in evolutionary claims but rather as reflections of ongoing differences that inhere in “doing science—posing, testing and debating alternative explanations,” then the emphasis is proper only if Godfrey refuses to attach finality and a universal validity-claim to anticreationist evolutionary theses.

The history of evolutionary theory is far from complete and its present status ambiguous. Hampton L. Carson notes the difficulty of integrating the dual lines of study pursued by biological evolutionists when on the one hand they project the course of evolution that is held to produce contemporary organisms, and when on the other they analyze supposed evolutionary causation. Carson notes, moreover, that presentation of new approaches even to student audiences now requires an understanding of sophisticated computer techniques and an awareness of complex and sometimes esoteric theory; he ventures the bold observation that “new mutations and recombinations” of evolutionary theory will themselves “be subject to natural selection” (“Introduction to a Pivotal Subject” [review of Evolution by Theodosius Dobzhansky and others, and of Organismic Evolution by Verne Grant], pp. 1272 f.).

Yet most secular evolutionists continue to assume that evolution is a complex fact and therefore debate only its mechanism. Appealing to consensus rather than to demonstrative data, G. G. Simpson states that “no evolutionist since [Darwin has] seriously questioned that man did originate by evolution”; he insists, moreover, that “the problem [the origin of life] can be attacked scientifically” (“The World into Which Darwin Led Us,” pp. 966-974). Simpson’s advance confidence in naturalistic explanation exudes a strong bias against theistic premises.

But Thomas S. Kuhn considers the physical sciences to be grounded less on empirical facts than on academically defined assumptions about the nature of the universe, assumptions that are unprovable, questionable and reversible (The Structure of Scientific Revolutions). His approach differs somewhat from Michael Polanyi’s assault on the objectivity of human knowledge (Personal Knowledge: Towards a Post-Critical Philosophy), a view that Christian theism disputes on its own ground. Yet both Kuhn’s emphasis and Polanyi’s tend to put a question mark after absolutist evolutionary claims.

Carl F. H. Henry, God, Revelation and Authority, Vol VI: God Who Stands and Stays (Wheaton, IL: Crossway Books, 1983), 182-184.


Here is the extended quote from the source Dr. Henry used,

G.A. Kerkut’s Implications of Evolution (pp. 150-157):


[p. 150>] WHAT conclusions, then, can one come to concerning the validity of the various implications of the theory of evolution? If we go back to our initial assumptions it will be seen that the evidence is still lacking for most of them.

(1) The first assumption was that non-living things gave rise to living material. This is still just an assumption. It is conceivable that living material might have suddenly appeared on this world in some peculiar manner, say from another planet, but this then raises the question, “Where did life originate on that planet?” We could say that life has always existed, but such an explanation is not a very satisfactory one. Instead, the explanation that non-living things could have given rise to complex systems having the properties of living things is generally more acceptable to most scientists. There is, however, little evidence in favour of biogenesis and as yet we have no indication that it can be performed. There are many schemes by which biogenesis could have occurred but these are still suggestive schemes and nothing more. They may indicate experiments that can be performed, but they tell us nothing about what actually happened some 1,000 million years ago. It is therefore a matter of faith on the part of the biologist that biogenesis did occur and he can choose whatever method of biogenesis happens to suit him personally; the evidence for what did happen is not available.

(2) The second assumption was that biogenesis occurred only once. This again is a matter for belief rather than proof. It is convenient to believe that all living systems have the same fundamental chemical processes at work within them, but as has already been mentioned, only a few representatives from the wide range of living forms have so far been examined and even [p. 151>] these have not been exhaustively analysed. From our limited experience it is clear that the biochemical systems within protoplasm are not uniform, i.e. there is no established biochemical unity. Thus we are aware that there are systems other than the Embden-Meyerhof and the tricarboxylic cycles for the systematic degradation of carbohydrates; a total of six alternative methods being currently available. High-energy compounds other than those of phosphorus have been described; the number of vital amino-acids has gone up from twenty to over seventy; all these facts indicate that the biochemical systems may be very variable. The morphological systems in protoplasm, too, show considerable variation. It is possible that some aspects of cell structure such as the mitochondria and the microsomes might have arisen independently on several distinct occasions. It is also probable that two or more independent systems have evolved for the separation of chromosomes during cell division.

It is a convenient assumption that life arose only once and that all present-day living things are derived from this unique experience, but because a theory is convenient or simple it does not mean that it is necessarily correct. If the simplest theory was always correct we should still be with the four basic elements—earth, air, fire and water! The simplest explanation is not always the right one even in biology.

(3) The third assumption was that Viruses, Bacteria, Protozoa and the higher animals were all interrelated. It seems from the available evidence that Viruses and Bacteria are complex groups both of which contain a wide range of morphological and physiological forms. Both groups could have been formed from diverse sources so that the Viruses and Bacteria would then be an assembly of forms that contain both primitive and secondarily simplified units. They would each correspond to a Grade rather than a Subkingdom or Phylum. We have as yet no definite evidence about the way in which the Viruses, Bacteria or Protozoa are interrelated.

(4) The fourth assumption was that the Protozoa gave rise to the Metazoa. This is an interesting assumption and various schemes have been proposed to show just how the change could have taken place. On the other hand equally interesting schemes have been suggested to show the way in which the Metaphyta [p. 152>] could have given rise to both the Protozoa and the Metazoa. Here again nothing definite is known. We can believe that any one of these views is better than any other according to the relative importance that we accord to the various pieces of evidence.

(5) The fifth assumption was that the various invertebrate phyla are interrelated. If biogenesis occurred many times in the past and the Metazoa developed on several finite occasions then we might expect to find various isolated groups of invertebrates. If on the other hand biogenesis was a unique occurrence it should not be too difficult to show some relationship between all the various invertebrate phyla.

It should be remembered, for example, that though there are similarities between the cleavage patterns of the eggs of various invertebrates these might only reflect the action of physical laws acting on a restrained fluid system such as we see in the growth of soap bubbles and not necessarily indicate any fundamental phylogenetic relationship.

As has already been described, it is difficult to tell which are the most primitive from amongst the Porifera, Mesozoa, Coelenterata, Ctenophora or Platyhelminthia and it is not possible to decide the precise interrelationship of these groups. The higher invertebrates are equally difficult to relate. Though the concept of the Protostomia and the Deuterostomia is a useful one, the basic evidence that separates these two groups is not as clear cut as might be desired. Furthermore there are various groups such as the Brachiopoda, Chaetognatha, Ectoprocta and Phoronidea that have properties that lie between the Protostomia and the Deuterostomia. It is worth paying serious attention to the concept that the invertebrates are polyphyletic, there being more than one line coming up to the primitive metazoan condition. It is extremely likely that the Porifera are on one such side line and it is conceivable that there could have been others which have since died away leaving their progeny isolated; in this way one could explain the position of the nematodes. The number of ways of achieving a specific form or habit is limited and resemblances may be due to the course of convergence over the period of many millions of years. The evidence, then, for the affinities of the majority of the invertebrates is tenuous and circumstantial; not [p. 153>] the type of evidence that would allow one to form a verdict of definite relationships.

(6) The sixth assumption, that the invertebrates gave rise to the vertebrates, has not been discussed in this book. There are several good reviews on this subject. Thus Neal and Rand (1939) provide a useful and interesting account of the various views that have been suggested to explain the relationship between the invertebrates and the vertebrates. The vertebrates have been derived from the annelids, arthropods, nemerteans, hemichordates and the urochordates. More recently Berrill (1955) has given a detailed account of the mode of origin of the vertebrates from the urochordates in which the sessile ascidian is considered the basic form. On the other hand, almost as good a case can be made to show that the ascidian tadpole is the basic form and that it gave rise to the sessile ascidian on the one hand and the chordates on the other. Here again it is a matter of belief which way the evidence happens to point. As Berrill states, “in a sense this account is science fiction.”

(7) We are on somewhat stronger ground with the seventh assumption that the fish, amphibia, reptiles, birds and mammals are interrelated. There is the fossil evidence to help us here, though many of the key transitions are not well documented and we have as yet to obtain a satisfactory objective method of dating the fossils. The dating is of the utmost importance, for until we find a reliable method of dating the fossils we shall not be able to tell if the first amphibians arose after the first choanichthian or whether the first reptile arose from the first amphibian. The evidence that we have at present is insufficient to allow us to decide the answer to these problems.

One thing that does seem reasonably clear is that many of the groups such as the Amphibia (Save Soderberg 1934), Reptilia (Goodrich 1916) and Mammalia appear to be polyphyletic grades of organisation. Even within the mammals there is the suggestion that some of the orders might be polyphyletic. Thus Kleinenberg (1959) has suggested that the Cetacea are diphyletic, the Odontoceti and the Mysticeti being derived from separate terrestrial stocks. (Other groups that appear to be polyphyletic are the Viruses, Bacteria, Protozoa, Arthropoda (Tiegs and Manton 1958), and it is possible that close study will show that the Annelida and Protochordata are grades too.)

[p. 154>] In effect, much of the evolution of the major groups of animals has to be taken on trust. There is a certain amount of circumstantial evidence but much of it can be argued either way. Where, then, can we find more definite evidence for evolution? Such evidence will be found in the study of modern living forms. It will be remembered that Darwin called his book The Origin of Species not The Origin of Phyla and it is in the origin and study of the species that we find the most definite evidence for the evolution and changing of form. Thus to take a specific example, the Herring Gull, Larus argentatus, does not interbreed with the Lesser Black-backed Gull, Larus fuscus, in Western Europe, the two being separate species. But if we trace L. argentatus across the northern hemisphere through North America, Eastern Siberia and Western Siberia we find that in Western Siberia there is a form of L. argentatus that will interbreed with L. fuscus. We have here an example of a ring species in which the members at the ends of the ring will not interbreed whilst those in the middle can. The separation of what was possibly one species has been going on for some time (in this case it is suggested since the Ice Age). We have of course to decide that this is a case of one species splitting into two and not of two species merging into one, but this decision is aided by the study of other examples such as those of small mammals isolated on islands, or the development of melanic forms in moths. Details of the various types of speciation can be found in the books by Mayr, Systematics and the Origin of Species (1942), and Dobzhansky, Genetics and the Origin of Species (1951).

It might be suggested that if it is possible to show that the present-day forms are changing and the evolution is occurring at this level, why can’t one extrapolate and say that this in effect has led to the changes we have seen right from the Viruses to the Mammals? Of course one can say that the small observable changes in modern species may be the sort of thing that lead to all the major changes, but what right have we to make such an extrapolation? We may feel that this is the answer to the problem, but is it a satisfactory answer? A blind acceptance of such a view may in fact be the closing of our eyes to as yet undiscovered factors which may remain undiscovered for many years if we believe that the answer has already been found.

[p. 155>] It seems at times as if many of our modern writers on evolution have had their views by some sort of revelation and they base their opinions on the evolution of life, from the simplest form to the complex, entirely on the nature of specific and intra-specific evolution. It is possible that this type of evolution can explain many of the present-day phenomena, but it is possible and indeed probable that many as yet unknown systems remain to be discovered and it is premature, not to say arrogant, on our part if we make any dogmatic assertion as to the mode of evolution of the major branches of the animal kingdom.

Perhaps it is appropriate here to quote a remark made by D’Arcy Thompson in his book On Growth and Form. “If a tiny foraminiferan shell, a Lagena for instance, be found living today, and a shell indistinguishable from it to the eye be found fossil in the Chalk or some still more remote geological formation, the assumption is deemed legitimate that the species has ‘survived’ and has handed down its minute specific character or characters from generation to generation unchanged for untold millions of years. If the ancient forms be like rather than identical with the recent, we still assume an unbroken descent, accompanied by hereditary transmission of common characters and progressive variations. And if two identical forms be discovered at the ends of the earth, still (with slight reservation on the score of possible ‘homoplasy’) we build a hypothesis on this fact of identity, taking it for granted that the two appertain to a common stock, whose dispersal in space must somehow be accounted for, its route traced, its epoch determined and its causes discussed or discovered. In short, the Naturalist admits no exception to the rule that a natural classification can only be a genealogical one, nor ever doubts that ‘the fact that we are able to classify organisms at all in accordance with the structural characteristics which they present is due to their being related by descent.’”

What alternative system can we use if we are not to assume that all animals can be arranged in a genealogical manner? The alternative is to indicate that there are many gaps and failures in our present system and that we must realise their existence. It may be distressing for some readers to discover that so much in zoology is open to doubt, but this in effect indicates the vast amount of work that remains to be done. In many courses the [p. 156>] student is obliged to read, assimilate and remember a vast amount of factual information on the quite false assumption that knowledge is the accumulation of facts. There seems so much to be learnt that the only consolation the student has is that those who come after him will have even more to learn, for more will be known. But this is not really so; much of what we learn today are only half truths or less and the students of tomorrow will not be bothered by many of the phlogistons that now torment our brains.

It is in the interpretation and understanding of the factual information and not the factual information itself that the true interest lies. Information must precede interpretation, and it is often difficult to see the factual data in perspective. If one reads an account of the history of biology such as that presented by Nordenskiold (1920) or Singer (1950) it sometimes appears that our predecessors had a much easier task to discover things than we do today. All that they had to do was realise, say, that oxygen was necessary for respiration, or that bacteria could cause septicaemia or that the pancreas was a ductless gland that secreted insulin. The ideas were simple; they just required the thought and the experimental evidence! Let us have no doubt in our minds that in twenty years or so time we shall look back on many of today’s problems and make similar observations. Everything will seem simple and straightforward once it has been explained. Why then cannot we see some of these solutions now? There are many partial answers to this question. One is that often an incorrect idea or fact is accepted and takes the place of the correct one. An incorrect view can in this way successfully displace the correct view for many years and it requires very careful analysis and much experimental data to overthrow an accepted but incorrect theory. Most students become acquainted with many of the current concepts in biology whilst still at school and at an age when most people are, on the whole, uncritical. Then when they come to study the subject in more detail, they have in their minds several half truths and misconceptions which tend to prevent them from coming to a fresh appraisal of the situation. In addition, with a uniform pattern of education most students tend to have the same sort of educational background and so in conversation and discussion they accept common fallacies and agree on matters based on these fallacies.

[p. 157>] It would seem a good principle to encourage the study of “scientific heresies.” There is always the danger that a reader might be seduced by one of these heresies but the danger is neither as great nor as serious as the danger of having scientists brought up in a type of mental strait-jacket or of taking them so quickly through a subject that they have no time to analyse and digest the material they have “studied.” A careful perusal of the heresies will also indicate the facts in favour of the currently accepted doctrines, and if the evidence against a theory is overwhelming and if there is no other satisfactory theory to take its place we shall just have to say that we do not yet know the answer.

There is a theory which states that many living animals can be observed over the course of time to undergo changes so that new species are formed. This can be called the “Special Theory of Evolution” and can be demonstrated in certain cases by experiments. On the other hand there is the theory that all the living forms in the world have arisen from a single source which itself came from an inorganic form. This theory can be called the “General Theory of Evolution” and the evidence that supports it is not sufficiently strong to allow us to consider it as anything more than a working hypothesis. It is not clear whether the changes that bring about speciation are of the same nature as those that brought about the development of new phyla. The answer will be found by future experimental work and not by dogmatic assertions that the General Theory of Evolution must be correct because there is nothing else that will satisfactorily take its place.

G.A. Kerkut, Implications of Evolution (International series of monographs on pure and applied biology. Division: Zoology) (New York, NY: Pergamon Press, 1960), 150-157.

Science and Intelligent Design Defined

This is a good working definition for Intelligent Design via New World Encyclopedia:

Intelligent design (ID) is the view that it is possible to infer from empirical evidence that “certain features of the universe and of living things are best explained by an intelligent cause, not an undirected process such as natural selection.” [1] Intelligent design cannot be inferred from complexity alone, since complex patterns often happen by chance. ID focuses on just those sorts of complex patterns that in human experience are produced by a mind that conceives and executes a plan. According to adherents, intelligent design can be detected in the natural laws and structure of the cosmos; it also can be detected in at least some features of living things.

Greater clarity on the topic may be gained from a discussion of what ID is not considered to be by its leading theorists. Intelligent design generally is not defined the same as creationism, with proponents maintaining that ID relies on scientific evidence rather than on Scripture or religious doctrines. ID makes no claims about biblical chronology, and technically a person does not have to believe in God to infer intelligent design in nature. As a theory, ID also does not specify the identity or nature of the designer, so it is not the same as natural theology, which reasons from nature to the existence and attributes of God. ID does not claim that all species of living things were created in their present forms, and it does not claim to provide a complete account of the history of the universe or of living things.

ID also is not considered by its theorists to be an “argument from ignorance”; that is, intelligent design is not to be inferred simply on the basis that the cause of something is unknown (any more than a person accused of willful intent can be convicted without evidence). According to various adherents, ID does not claim that design must be optimal; something may be intelligently designed even if it is flawed (as are many objects made by humans).

ID may be considered to consist only of the minimal assertion that it is possible to infer from empirical evidence that some features of the natural world are best explained by an intelligent agent. It conflicts with views claiming that there is no real design in the cosmos (e.g., materialistic philosophy) or in living things (e.g., Darwinian evolution) or that design, though real, is undetectable (e.g., some forms of theistic evolution). Because of such conflicts, ID has generated considerable controversy.

[1] Discovery Institute, Center for Science and Culture, Questions about Intelligent Design: What is the theory of intelligent design? Retrieved March 18, 2007.

Gay Patriot Tackles A Killer in the Gay Community ~ Moral Equivalency

Since marriage is no longer about creating a stable environment for children, and has become (and this mainly the fault of heterosexual liberals [e.g., liberalism]) about personal fulfillment, validation, and access to social benefits, there literally is no constraint on how much more broadly it can be redefined. ~ Gay Patriot

Gay Patriot bravely steps out on this subject and accepts the challenge… as any rational thinking conservatarian would:

The New York Times has noticed that bareback sex is a thing gay people are doing, which is breaking news from about the mid-1990s, when (according to Wikipedia) gay publications like The Advocate first took note of the phenomenon of gay men having unprotected sex and, in some cases, deliberately seeking HIV infection.

Anyway, the Times, perhaps after failing to find a celebrity to comment on the issue, goes to the next best source for information on epidemiology and behavioral psychology… an English professor from SUNY-Buffalo. Who provides this analysis:

What I learned in my research is that gay men are pursuing bareback sex not just for the thrill of it, but also as a way to experience intimacy, vulnerability and connection. Emotional connection may be symbolized in the idea that something tangible is being exchanged. A desire for connection outweighs adherence to the rules of disease prevention.

And some guys are apparently getting intimate, tangible, emotional connections 10-20 times a night in bathhouses.

It also seems that the readers of the NY Times, based on the comments, are in complete denial that this phenomenon exists, and think the author is just making it up to attack the gay community. Liberals choose to blame the recent dramatic increases in HIV infection rates on “the stigma attached to HIV.” Um, excuse me, but don’t stigmas usually make people avoid those things to which stigmas are attached?

In the real world, stigmatizing a behavior results in less of it: Which is why people don’t use the N-word in public any more and smoking has declined as a social activity. When the social stigma is removed … as with HIV infection and teenage pregnancy … you get more of those things.

…read more…

Bravo. I just wish to mention that this area of the body is not made for sex. Many will read the following and think that it is an attack on the humanity of the gay lifestyle/choice. It is not; it is a cry for gay men to become monogamous and to cease having relations with the people they purport to love in that area. The following is pointed out of compassion, not hatred:

Homosexuals also continue to contract and spread other diseases at rates significantly higher than the community at large. These include syphilis, gonorrhea, herpes, hepatitis A and B, a variety of intestinal parasites including amebiasis and giardiasis, and even typhoid fever (David G. Ostrow, Terry Alan Sandholzer, and Yehudi M. Felman, eds., Sexually Transmitted Diseases in Homosexual Men; see also Sevgi O. Aral and King K. Holmes, “Sexually Transmitted Diseases in the AIDS Era,” Scientific American). This is because rectal intercourse or sodomy, typically practiced by homosexuals, is one of the most efficient methods of transmitting disease. Why? Because nature designed the human rectum for a single purpose: expelling waste from the body. It is built of a thin layer of columnar cells, different in structure from the plate cells that line the female reproductive tract. Because the wall of the rectum is so thin, it is easily ruptured during intercourse, allowing semen, blood, feces, and saliva to directly enter the bloodstream. The chances for infection increase further when multiple partners are involved, as is frequently the case: Surveys indicate that American male homosexuals average between 10 and 110 sex partners per year (L. Corey and K. K. Holmes, “Sexual Transmission of Hepatitis A in Homosexual Men,” New England Journal of Medicine; and Paul Cameron et al., “Sexual Orientation and Sexually Transmitted Disease,” Nebraska Medical Journal).

Not surprisingly, these diseases shorten life expectancy. Social psychologist Paul Cameron compared over 6,200 obituaries from homosexual magazines and tabloids to a comparable number of obituaries from major American newspapers. He found that while the median age of death of married American males was 75, for sexually active homosexual American males it was 42. For homosexual males infected with the AIDS virus, it was 39. While 80 percent of married American men lived to 65 or older, less than two percent of the homosexual men covered in the survey lived as long.

…read more…

…these problems don’t remain personal and private. The drive, especially since this issue is associated with the word “gay rights,” is to make sure your worldview reflects theirs. To counter this effort, we must demand that the medical and psychiatric community take off their PC blinders and treat these people responsibly.  If we don’t, the next thing you know, your child will be taking a “tolerance” class explaining how “transexuality” is just another “lifestyle choice”…. After all, it is the only way malignant narcissists will ever feel normal, healthy, and acceptable: by remaking society – children – in their image

Tammy Bruce, The Death of Right and Wrong: Exposing the Left’s Assault on Our Culture and Values (Roseville: Prima, 2003), 92, 206.

In the black community, for example, one of the major factors in the degradation of that subculture is fatherlessness. In order to stop young men from devolving into criminals, the black community would have to step up to the plate, accept responsibility for its own actions, and change behavior… not blame outside forces. Similarly, the gay community will have to battle its own demons to help its subculture. See my Cumulative Case for some ideas of what these demons are.

Many years ago, Tammy Bruce emphasized how dangerous and self-destructive this notion and behavior are:

….What a difference treatment makes! As researchers succeeded in developing ever more effective drugs, AIDS became—like gonorrhea, syphilis, and hepatitis B before it—what many now consider to be a simple “chronic disease.” And many of the gay men who had heeded the initial warning went right back to having promiscuous unprotected sex. There is now even a movement—the “bareback” movement—that encourages sex without condoms. The infamous bathhouses are opening up again; drug use, sex parties, and hundreds of sex partners a year are all once again a feature of the “gay lifestyle.” In fact, “sexual liberation” has simply become a code phrase for the abandonment of personal responsibility, respect, and integrity.

In his column for Salon.com, David Horowitz discussed gay radicals like the writer Edmund White. During the 1960s and beyond, White addressed audiences in the New York gay community on the subject of sexual liberation. He told one such audience that “gay men should wear their sexually transmitted diseases like red badges of courage in a war against a  sex-negative society.” And did they ever. Then, getting gonorrhea was the so-called courageous act. Today, the stakes are much higher. That red badge is now one of AIDS suffering and death, and not just for gay men themselves. In their effort to transform society, the perpetrators are taking women and children and straight men with them.

Even Camille Paglia, a woman whom I do not often praise, astutely commented some years ago, “Everyone who preached  free love in the Sixties is responsible for AIDS. This idea that it was somehow an accident, a microbe that sort of fell from  heaven—absurd. We must face what we did.”

The moral vacuum did rear its ugly head during the 1960s with the blurring of the lines of right and wrong (remember “situational ethics”?),  the sexual revolution, and the consequent emergence of the feminist and gay civil-rights movements. It’s not the original ideas of these movements, mind you, that caused and have perpetuated the problems we’re discussing. It was and remains the few in power who project their destructive sense of themselves onto the innocent landscape, all  the while influencing and conditioning others. Today, not only is the blight not being faced, but in our Looking-Glass world, AIDS is romanticized and sought after….

Tammy Bruce, The Death of Right and Wrong: Exposing the Left’s Assault on Our Culture and Values (Roseville: Prima, 2003), 96-97.

And take note that I discuss the nihilistic culture in the gay community [infected by liberalism] in a more philosophical and religious sense than most places do. The following is from a chapter in my book:


…Foucault looked at truth as an object to be constructed by those who wielded the power to define facts.  “Madness, abnormal sex, and criminality were not objective categories but rather social constructs.”[73] He embraced what mainstream society had rejected: sadomasochism and drug use. In 1984 Foucault died of AIDS.  One should take note that Foucault so enjoyed his hope of dying “of an overdose of pleasure” that he frequented gay bathhouses and sex clubs even after he knew he carried the communicable disease.  Many people were infected because of Foucault and Foucault’s post-modern views.[74]  On a lighter note, Dinesh D’Souza tells of a contest from about the time Foucault was dying.  The story is fitting for those who view hell as a real option:

People were debating whether AIDS victims should be quarantined as syphilis victims had been in the past.  [William F.] Buckley said no. The solution was to have a small tattoo on their rear ends to warn potential partners.  Buckley’s suggestion caused a bit of a public stir, but the folks at National Review were animated by a different question: What should the tattoo say?  A contest was held, and when the entries were reviewed, the winner by unanimous consent was Hart.[75]  He [Hart] suggested the lines emblazoned on the gates to Dante’s Inferno: “Abandon all hope, ye who enter here.”[76]

You see, in order to have one’s alternative lifestyle accepted, one must attack “what truth is” in its absolute (Judeo-Christian) sense.  Truth is whatever the powerful decided it was, or so Foucault proposed.  This is the attack.  “We are subjected to the production of truth through power and we cannot exercise power except through the production of truth.”[77]  Foucault, sadly, never repented from violating God’s natural order and truth.  He was, in his death, a living example of what Paul said naturally follows from rejecting God’s gracious revelation of Himself to humanity.[78] Romans 1:26-32 reads:

Worse followed. Refusing to know God, they soon didn’t know how to be human either—women didn’t know how to be women, men didn’t know how to be men. Sexually confused, they abused and defiled one another, women with women, men with men—all lust, no love. And then they paid for it, oh, how they paid for it—emptied of God and love, godless and loveless wretches.… And it’s not as if they don’t know better. They know perfectly well they’re spitting in God’s face. And they don’t care—worse, they hand out prizes to those who do the worst things best! [79]

Foucault said that “sex was worth dying for,”[80] but is it?…


Notes:
[73] Ibid.
[74] Ibid.
[75] Jeffrey Hart, a professor many years ago at Dartmouth University.
[76] Dinesh D’Souza, Letters to a Young Conservative: The Art of Mentoring (New York: Basic Books, 2002), 20.
[77] Flynn, 235-237.
[78] Walter A. Elwell, Evangelical Commentary on the Bible (Grand Rapids: Baker Books, 1996), Romans 1:21.
[79] Eugene H. Peterson, The Message: The Bible in Contemporary Language (Colorado Springs: NavPress, 2002), Romans 1:26-27, 30-32.
[80] Ibid., 235.


 

CO2 Not The Demon It Is Made Out To Be (UPDATED)

Bottom Line:

  1. The Mean Global Temperature has been stable since 1997, despite a continuous increase in the CO2 content of the air: how could one say that the increase in the CO2 content of the air is the cause of the increase in temperature? (discussion: p. 4)
  2. 57% of the cumulative anthropogenic emissions since the beginning of the Industrial Revolution have been emitted since 1997, yet the temperature has been stable. How can one maintain that anthropogenic CO2 emissions (or cumulative anthropogenic emissions) cause an increase in the Mean Global Temperature? (A worked sketch of this cumulative-share arithmetic follows below.)
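
For readers who want the arithmetic in point 2 spelled out, here is a minimal sketch (in Python) of how a “share of cumulative emissions since 1997” is computed. The annual values are hypothetical placeholders chosen only to illustrate the calculation, not the actual emissions record, so the printed share will not match the quoted 57%:

    # Sketch of the cumulative-share arithmetic behind point 2 above.
    # Annual values are HYPOTHETICAL placeholders, not the real emissions record.
    annual_emissions = {year: 1.0 + 0.1 * (year - 1900) for year in range(1900, 2015)}  # illustrative ramp

    total = sum(annual_emissions.values())
    since_1997 = sum(v for year, v in annual_emissions.items() if year >= 1997)

    print(f"Share of cumulative emissions emitted since 1997: {since_1997 / total:.0%}")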

(more)

This is Part IV of a series from Christmas 2014.

Renowned physicist Freeman Dyson says CO2 does not worry him… in two separate interviews

We need MORE CO2, not less!

Dr. William Happer, currently a professor of physics at Princeton University, was fired by Gore at the Department of Energy in 1993 for disagreeing with the vice president about the effects of ozone on humans and plant life. He also disagrees with Gore’s claim that manmade carbon dioxide (CO2) increases the temperature of the earth and is a threat to mankind. Happer appeared before the U.S. Senate’s Environment and Public Works Committee on Feb. 25 and explained that CO2 is in short supply relative to the history of the planet.

“Many people don’t realize that over geological time, we’re really in a CO2 famine now. Almost never has CO2 levels been as low as it has been in the Holocene [geologic epoch] – 280 [parts per million (ppm)] – that’s unheard of,” Happer said. “Most of the time, it’s at least 1,000 [ppm] and it’s been quite higher than that.”

Happer said that when CO2 levels were higher, much higher than they are now, the laws of nature still managed to function as we understand them today.

“The earth was just fine in those times,” Happer said. “You know, we evolved as a species in those times, when CO2 levels were three or four times what they are now. And, the oceans were fine, plants grew, animals grew fine. So it’s baffling to me that, you know, we’re so frightened of getting nowhere close to where we started.”…


Must-see interview

Here is a quick intro that I combined with a great visual in regards to PPM and how it is beneficial to mankind:

To skip this aside, click HERE


CONSENSUS


He mentioned that most of the experts claim to KNOW how CO2 affects climate; he says he does not, and he doesn’t think they do either. This has nothing to do with the supposed “consensus” of experts — 97% — who “say” it is driven by mankind. This is known as anthropogenic global warming, or AGW. The myth of the 97% started with ONLY 75-out-of-77 climatologists saying they believe man is the primary cause.

Yes, you heard me correctly, seventy-five.
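
To make the arithmetic behind that headline number explicit, the short sketch below (in Python) uses only the figures cited above (75 of the 77 climatologists in the relevant subset) and shows how narrow a base the “97%” rests on:

    # The "97%" is simply the ratio of the two numbers cited above.
    agree, subset = 75, 77
    print(f"{agree} of {subset} = {agree / subset:.1%}")   # prints "75 of 77 = 97.4%"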

Another study had undergrads and non-specialists (bloggers) search through many articles in peer-reviewed journals and note that a large majority supported the AGW position. The problem was that the reviewers were not specialists in the relevant fields of science… AND… they read only the abstracts, not the peer-reviewed papers themselves. Many of the scientists behind the papers “said” to support AGW rejected that characterization. So the specialists THEMSELVES said their papers cannot be read to support the AGW position.

Another study (pictured in the graph above) tries to save an earlier one with tainted information based on abstracts — a very UNSCIENTIFIC way to arrive at a consensus (that is, relying on abstracts). Not only was this study based on abstracts; again, non-specialists categorized them. Yet another study was based merely on search parameters/results. Here is more info (mainly links) for the not-faint-of-heart.

In reality, nearly half of the specialists in the related fields reject man as the primary cause of climate change.

And a good portion of those who do accept it reject the claim that the warming is detrimental to our planet.

Only 13% saw relatively little danger (ratings of 1 to 3 on a 10-point scale); the rest were about evenly split between the 44% who see moderate to high danger (ratings of 4 to 7) and 41% who see very high or grave danger (ratings of 8 to 10). (Forbes)

Here is a list of scientists with varying views on the cause of “Climate Change,” and here is a list of 31,000 who stand against man as the primary cause.


Continuing with the original post


This is meant mainly as a supplement to a Christmas Eve-Eve gathering/discussion I attended. I will make this post a little different from other posts, as it will be “minimalist.” This is the fourth installment of the topics covered, which are polar bears, rising sea levels, CO2, An Inconvenient Truth (the movie), nuclear power, warmest year, electric vehicles (EVs)/hybrid cars, and bullet trains.

Can you imagine the polluted, destroyed world we would have if the left had their way with green energy?

Environazis, like all progressives, care about two things: other people’s money and the power entailed in imposing their ideology. Prominent among the many things they do not care about is the environment, as demonstrated by a monstrosity planned for Loch Ness:

A giant 67 turbine wind farm planned for the mountains overlooking Loch Ness will be an environmental disaster thanks to the sheer quantity of stone which will need to be quarried to construct it, according to the John Muir Trust. In addition, the Trust has warned that the turbines spell ecological disaster for the wet blanket peat-land which covers the area and acts as a huge carbon sink, the Sunday Times has reported.

According to global warming dogma, carbon sinks are crucial in preventing human activity from causing climatic doom.

The planet isn’t the only victim of this ideologically driven enterprise:

Around one million people visit the picturesque Loch Ness, nestled in the highlands of Scotland each year, bringing about £25 million in revenue with them. Most are on the lookout for the infamous monster, but if Scottish and Southern Energy (SSE) get their way the tourists will have something else to look at: the Stronelairg wind farm – 67 turbines, each 443ft high, peppered across the Monadhlaith mountains overlooking the Loch.

….read it all….

Remember what the two top Google scientists in charge of their renewable energy program just said?

We came to the conclusion that even if Google and others had led the way toward a wholesale adoption of renewable energy, that switch would not have resulted in significant reductions of carbon dioxide emissions. Trying to combat climate change exclusively with today’s renewable energy technologies simply won’t work; we need a fundamentally different approach.

[…..]

“Even if one were to electrify all of transport, industry, heating and so on, so much renewable generation and balancing/storage equipment would be needed to power it that astronomical new requirements for steel, concrete, copper, glass, carbon fibre, neodymium, shipping and haulage etc etc would appear. All these things are made using mammoth amounts of energy: far from achieving massive energy savings, which most plans for a renewables future rely on implicitly, we would wind up needing far more energy, which would mean even more vast renewables farms – and even more materials and energy to make and maintain them and so on. The scale of the building would be like nothing ever attempted by the human race.”

But reasoning with someone who has swallowed this story is like beating a dead horse. They will tell me — to my face — that mankind releasing CO2 into the atmosphere is driving weather changes.

I will point out a graph showing that in the past couple of decades man has produced more CO2 than in the previous 100 years combined, overlaid on a temperature record that has stayed flat for over 18 years (in fact, falling a bit since 2005), and even this MAJOR, FOUNDATIONAL belief being shown false doesn’t sway their “belief” toward rethinking their previously held paradigm.

See Also, “Dr. William Happer Speaking To The Benefits Of CO2.”

This comes by way of Gay Patriot, and shows how scientific the party of science is:

Bypassing Congress yet again, Obama today announced a unilateral imposition of carbon dioxide emission limits for electrical power plants.

Even the NYTimes admits the regulations will have no discernible impact on Global CO2 levels. They will, however, cost $50 Billion per year in regulatory costs, raise energy bills an average of $1,200 per family per year, and destroy 224,000 jobs annually through 2030.

The Administration promises none of those outcomes will happen, but then, they also promised “If you like your plan, you can keep your plan,” and “We will be the most transparent administration in history.”

Obama is justifying his dictatorial imposition of carbon dioxide regulations partly on the basis that carbon causes asthma and heart attacks.

You read that right. Carbon. Causes. Asthma.

Party of science my ass.

…read more…


Climate scientist Dr. Murry Salby, Professor and Climate Chair at Macquarie University, Australia, explains in a recent, highly recommended lecture presented at Helmut Schmidt University, Hamburg, Germany, why man-made CO2 is not the driver of atmospheric CO2 or climate change.

Dr. Salby demonstrates:

  • CO2 lags temperature on both short [~1-2 year] and long [~1000 year] time scales
  • The IPCC claim that “All of the increases [in CO2 concentrations since pre-industrial times] are caused by human activity” is impossible
  • “Man-made emissions of CO2 are clearly not the source of atmospheric CO2 levels”
  • Satellite observations show the highest levels of CO2 are present over non-industrialized regions, e.g. the Amazon, not over industrialized regions
  • 96% of CO2 emissions are from natural sources, only 4% is man-made
  • Net global emissions from all sources correlate almost perfectly [R² = 0.93] with short-term temperature changes, rather than with man-made emissions (see the correlation sketch after this list)
  • Methane levels are also controlled by temperature, not man-made emissions
  • Climate model predictions track only a single independent variable – CO2 – and disregard all the other, much more important independent variables including clouds and water vapor.
  • The 1% of the global energy budget controlled by CO2 cannot wag the other 99%
  • Climate models have been falsified by observations over the past 15+ years
  • Climate models have no predictive value
  • Feynman’s quote “It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with the data, it’s wrong” applies to the theory of man-made global warming.
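
To clarify the R² figure in the list above, here is a minimal sketch of how such a squared correlation between two yearly series would be computed, assuming NumPy is available. The series names and values below are hypothetical placeholders for illustration, not Dr. Salby’s data:

    # Minimal sketch: squared correlation (R^2) between two yearly series.
    # The values are HYPOTHETICAL placeholders, not Dr. Salby's data.
    import numpy as np

    temp_change = np.array([0.05, -0.02, 0.10, 0.03, -0.04, 0.08, 0.01])  # short-term temperature changes (°C/yr)
    net_emission = np.array([2.1, 1.6, 2.5, 1.9, 1.5, 2.4, 1.8])          # net global CO2 emission rate (illustrative units)

    r = np.corrcoef(temp_change, net_emission)[0, 1]  # Pearson correlation coefficient
    print(f"R^2 = {r**2:.2f}")  # near 1.0 for these toy numbers; the lecture reports 0.93 for the real series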

See and Read More HERE