Wednesday, February 3, 2010

In Praise of Deference

It is common to hear that you should make up your own mind and not let other people make it up for you. While I wholeheartedly agree with the sentiment and believe it’s silly to be obsequious to arbitrary authority, I nonetheless think intellectual deference is both unavoidable and a virtue. This notion may seem repugnant, an affront to liberal values, and perhaps even unnecessary, so I’ll defend it in some detail. (I want to stress, however, that while I’ll be defending my considered opinion, my argument for it will be sketchy – a full, precise treatment would require far more time and space). My argument rests on (among others) the following two premises: (1) the world is staggeringly complicated and regularly in conflict with our intuitions, and (2) intellectual responsibility requires you to know what you’re talking about before you form an opinion. (The picture, by the way, is a high-resolution map of science, from this PLoS One paper).

It’s a cliché that the more you learn the more you realize how little you know, but it’s true: things are nearly always more complicated than they seem, and thus peeling away layers of ignorance usually reveals yet more complexities, uncertainties and undreamt-of intricacy. Or as Douglas Adams is supposed to have said: "the universe is a lot more complicated than you might think even if you start from a position of thinking that it’s pretty damn complicated to begin with". If you’re at all in doubt that the world really is as complex as I make out, I invite you to examine the work of those whose job it is to understand it. Have a look at the contents of any respected peer-reviewed journal – Science, say, or Nature Neuroscience or Physical Review Letters. Or, if you prefer, examine an advanced-level textbook in any explanatorily successful field such as molecular biology, astrophysics, or neuroscience. (It has to be a field that is at least partially explanatorily successful because incorrect theories can be arbitrarily simple, given that they need not reflect reality). Explanations of empirical phenomena almost invariably require us to employ sophisticated techniques and equipment, set aside ingrained or natural ways of thinking, invoke vast bodies of previously established knowledge, and express or establish our findings with abstruse mathematics or statistics. Given that our models and theories at least partially mirror reality, the intricacy and opacity of the former reflect the complexity and opacity of the latter. (Though there are probably exceptions, explanations tend to get more complicated – in the sense of arcane or difficult to understand – as they become more successful. There thus seems to be little reason to attribute the complexity of our theories to the incompleteness of our knowledge. Also, while successful explanations may be simple in the sense of being elegant or parsimonious, they’re extremely unlikely to be simple in the sense of easy to understand or straightforward).

My second premise is that ‘making up one’s own mind’ responsibly requires, among other things, understanding what the hell is going on before settling on an opinion. It might be news to some people, but really ‘understanding what is going on’ requires much more than ‘I have a strong inner conviction’ or ‘I once read a book on this’ or ‘someone told me so’. It requires, in general and speaking roughly, proficiency in logic and critical thinking, competence in general and domain-specific scholarship and research, knowledge of the relevant facts and how sure we are of them, mastery of the relevant techniques, a familiarity with possible alternative explanations, knowledge of at least a large proportion of the relevant literature, and more. And if (1) is true – if the universe really is extremely complicated – fulfilling these requirements is awfully demanding. Carl Sagan provided an excellent example:
Imagine you seriously want to understand what quantum mechanics is about. There is a mathematical underpinning that you must first acquire, mastery of each mathematical subdiscipline leading you to the threshold of the next. In turn you must learn arithmetic, Euclidian geometry, high school algebra, differential and integral calculus, ordinary and partial differential equations, vector calculus, certain special functions of mathematical physics, matrix algebra, and group theory. For most physics students, this might occupy them from, say, third grade to early graduate school – roughly 15 years. Such a course of study does not actually involve learning any quantum mechanics, but merely establishing the mathematical framework required to approach it deeply. (The Demon-Haunted World, p. 249)
In other words, before you can hope to begin to understand quantum mechanics, you need to master a vast body of often-difficult mathematics, and only then confront masses of mind-bending physics. Sagan goes so far as to say there are no successful popularizations of quantum mechanics because it is so in conflict with our intuitions that dealing with it mathematically is our only option. If you think you understand quantum mechanics without having mastered the mathematics, in short, you’re confused about what the word ‘understand’ means.
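
To give a flavour of what is at stake – this is my illustration, not Sagan’s – consider the most basic dynamical law of the theory, the time-dependent Schrödinger equation for a single non-relativistic particle in one dimension. Even this ‘simple’ starting point already presupposes complex numbers, partial differential equations and operator methods:

i\hbar\,\frac{\partial \psi(x,t)}{\partial t} = \left[-\frac{\hbar^{2}}{2m}\frac{\partial^{2}}{\partial x^{2}} + V(x)\right]\psi(x,t)

And actually solving it for anything beyond a handful of idealized potentials quickly calls on the special functions, matrix algebra and group theory on Sagan’s list.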

But no one person could possibly understand or discover everything about a topic, I hear you protest. Exactly! That’s precisely my point. The universe is chock full of all sorts of preposterously complicated phenomena, and no one person could ever hope to understand all or even a significant proportion of it in full. No one person could single-handedly maintain a modern lifestyle (grow, harvest and process all his own food, construct his own home, build his own computer from scratch, generate his own electricity, etc.); everyone is forced to rely on the division of labor. Similarly, no one person could hope to understand everything about even tiny bits of the universe – the causes of climate change on a particular planet, say – so everyone is forced to rely on the intellectual division of labor. And such a cognitive division of labor means intellectual deference. Even an expert doesn't (and can’t) know everything about her field or sometimes even her own area of specialization. It is not unheard of for scientific papers to be published where none of the authors understands all of the methods and findings. Experts defer to other experts. Experts have to defer to other experts. Deference, then, is a non-optional virtue, the only alternatives to which are agnosticism and intellectual irresponsibility. Or, to be less fancy about it, there are inevitably a large number of topics about which you can either (1) form or express unjustified opinions (i.e. be irresponsible), (2) say "I don't know", or (3) defer to the experts.

But, it may be objected, experts are sometimes wrong! Experts can be bought! Experts disagree! Indeed they are, indeed they can be and indeed they do. Experts, of course, are human beings and are therefore subject to all the familiar human failings: they are as fallible, quarrelsome, susceptible to cognitive biases and illusions, prone to social climbing, self-interested, biased, driven by ideology and whatnot as the rest of us. (Well, maybe this isn’t quite true: people self-select into science and must jump through various hoops like defending a thesis, so perhaps individuals best adapted to the ideals of science are more likely to become scientists. What is clear, though, is that scientists are not immune to these human failings). Expertise – the mastery of the techniques and in-depth knowledge of the scholarship on some subject – is not by itself a huge improvement over “making up your own mind”. (By the way, I don't suggest expertise requires formal education or credentials; non-PhDs who have mastered a subject certainly still count as experts). Given the complexity of the universe and the limitations of the human mind, expertise is (for many subjects) a necessary but not sufficient condition for having justified opinions. (For one thing, it is possible to be a kind of expert in utter bollocks: there is, for example, a huge alchemical literature, complete with rival schools, arcane jargon, different techniques and so on. And don’t get me started on postmodernism). So individual expertise in some cases doesn’t seem particularly reliable (though, ceteris paribus, it’s certainly better than nothing), and deferring to an individual expert thus isn’t necessarily such a good idea. Help is at hand, however.

The scientific method, far from denying human failings like the ones I enumerated above, exists exactly because of them: it is because the human mind is so prone to error and bias that we need this vast, expensive and seemingly inefficient set of institutions, norms and practices we call “science”. Science, roughly and to first approximation, is a collaborative enterprise aimed at a unified description and explanation of natural phenomena where the ultimate arbiter of truth is empirical experimentation, the reliability and quality of which is evaluated by a community of scholars through peer-review and replication. A scientist, then, is a person who attempts to describe and explain the natural world by testing empirical hypotheses in collaboration with a group of other researchers – reviewing their work and producing work that is in turn reviewed. Convincing your peers – who will criticize your ideas harshly and subject them to industrial-strength skepticism – by publishing in peer-reviewed journals, presenting papers at conferences, and, more informally, debating in seminars and pubs, is at the very heart of science. (The mark of a crank is not being embedded in such a system of cooperation, dismissing criticism as some conspiracy or other, and claiming the mantle of Galileo). The point of this collaborative enterprise is to minimize bias: an individual wants her ideas to be true, is limited by peculiar psychological traits and a particular background, knows only some fraction of the relevant facts, and suffers from a whole menagerie of other cognitive biases and illusions. A community dedicated to collaboration (and competition) – whose members aim at rigorous explanation and consensus, and who agree on the primacy of empirical demonstration – can overcome many (though obviously not all) of these biases because, in a sense, one individual’s biases cancel out another’s. Manipulating the world in such a way as to hold certain variables constant while varying others – i.e. doing controlled experiments – is the most powerful technique ever invented to discover nature’s secrets (Daniel Dennett aptly called it the “technology of truth”), and having an entire social system (complete with attendant values) and a supporting set of institutions (universities, granting agencies, journals, professional organizations, etc.) multiplies these powers by minimizing human failings in interpreting and conducting the experiments. Obviously, science is not perfect, but because of how it is organized – and crucially, because there is a way of falsifying hypotheses – it is a self-correcting process: explanations are tested, discarded and repeatedly refined, which slowly ratchets our theories closer to the truth over time. (If you have doubts about the success of science, please stop reading, apply head to desk and repeat until you come to your senses).

So why is this whole story about science relevant in a post about deference? Simple: because when the relevant experts agree that some theory or explanation is correct, you can be reasonably confident that the theory is in fact correct. In other words, given the nature of the scientific method – given that claims are peer-reviewed, subject to intense scrutiny, tested, re-tested, refined, and so on – when there is a consensus among the relevant experts, it is reasonable to believe they are right. Of course and again, experts are people, so you can’t be certain the theory or proposition is true just because there is consensus; you can only be rather confident. At a minimum, individuals who disagree with the consensus have the burden of proof – they must show it is false; the majority does not need to refute the alternatives. (Though they often do). Since showing that some consensus theory in science is false (or incomplete) is going to be extremely difficult, those who wish to disagree damn well better be experts themselves. (Consensus theories are of course sometimes overturned: witness plate tectonics). In general, laypeople are not qualified to have independent opinions about complex topics – they lack the means to come to justified beliefs – and it is especially unreasonable for a non-expert to take a stance contrary to consensus. The upshot is that, firstly, it is reasonable to defer to the consensus opinion of the relevant experts – so I can justifiably say ‘E = mc²’, ‘DNA carries heritable information’, ‘there is a supermassive black hole at the center of the Milky Way’, etc. And, secondly, a layperson who disagrees with expert consensus – denying evolution by natural selection, anthropogenic global warming, that the Earth is about 4.5 billion years old, etc. – is unreasonable in the extreme. Experts get to have opinions on scientifically controversial questions, experts get to disagree with consensus; laypeople get to defer to consensus or reserve judgment. Doing otherwise, I think, shows what Bertrand Russell called (in another context) an “impertinent insolence toward the universe”. Scientists are often accused of arrogance, and maybe I’ll be accused of this vice as well for telling laypeople who disagree to shut up. But I think the opposite is true. It is extraordinarily arrogant to have (independent) opinions on complex questions without being willing to pay your dues first – that is, without studying the question for years, reading the scholarly literature, mastering the relevant techniques and mathematics, and so on. Thinking you are entitled to an opinion without paying your dues is the very epitome of intellectual arrogance. And it is especially arrogant – mind-bogglingly so – for a non-expert to have opinions that contradict the consensus of the tens of thousands of intelligent, diligent and dedicated people who have spent decades studying, debating, doing research on and thinking deeply about their respective disciplines. The bottom line: be an expert, defer, or suspend judgment. (To be clear: I’m making an epistemic and not a political claim. People have a right to free speech and conscience, so they can form and express any opinion they like. But that doesn’t mean they have an intellectual warrant to do so).

To be sure, there are a whole bunch of complications here (how very appropriate, no?). For one, scientists are not the only people worth deferring to: if there is a consensus among plumbers, for example, that the best way to fix problem P is to do X, Y and Z, I’d be inclined to say that’s quite reliable. Nevertheless, while there are other groups (‘communities of practice’, etc.) that can reasonably be deferred to, for my purposes it is simply worth noting that scientists are one such group. Secondly, there are certainly degrees of expertise, and deciding when someone has crossed the threshold to expert status is fraught with difficulty (though beware the false continuum). More problematic is the question of what topics are ‘sufficiently complicated’ that laypeople shouldn’t have independent opinions. Saying, for example, that only psychologists are qualified to determine whether Bob has a crush on Tamba is preposterous. The (partial) solution here, I think, is to invoke Richard Dawkins’ notion of the “Middle World” and to distinguish between explicit and implicit knowledge. Let's start with the former. The human mind, Dawkins convincingly argues, evolved to deal with and understand the everyday world we inhabit: a world of medium-sized objects moving at low velocities, including animals and other people. “Folk biology”, “folk psychology”, and “folk physics”, for example, are regularly wrong in detail (sometimes spectacularly so), but they are often reliable when in our ‘natural environment’. The fact that we can, say, play football (which requires sophisticated ballistics), navigate a cluttered room (which requires sophisticated optics and physics), and cooperate and compete with other humans (which requires a complex theory of mind), and so on, shows that we are far from cognitively incompetent. The human mind is (largely) good at solving the problems we encountered often in our evolutionary past: it is good, in other words, at Middle-World problems. But our ancient ancestors never traveled near the speed of light, never lived in large-scale complex societies, never interacted directly with the quantum world, never needed to understand the nature of stars, and so on. On certain topics, then, laypeople are reliably (though almost always incompletely) competent. There is a fundamental difference, in other words, between the statements “Bob is upset at Mary for cheating on him with John” and “E equals mc²”: the mind evolved to deal with the former, but not the latter. Important, also, is the difference between implicit knowledge (or behavioral competence) and explicit knowledge (i.e. justified true belief, with some modifications). While being a football quarterback, say, requires a brain capable of solving complex physics problems, this does not mean football players explicitly understand the relevant physics. When I move my arm to pick up my cup of coffee, my brain does damn complicated trigonometry, but I don't know that trigonometry explicitly – my brain's calculations are not consciously accessible to me. What this means is that behavioral competence or implicit knowledge in some domain (seeing, interacting with people and non-human animals, walking about) does not imply explicit knowledge of the underlying science. (A slam-dunk argument for this, by the way, is our inability to build robots even remotely as competent as we are).
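
To see just how much mathematics lurks behind something as humdrum as reaching for a coffee cup, here is a deliberately simplified, purely illustrative textbook calculation (my example, not Dawkins’): for an idealized two-link ‘arm’ in a plane, with upper-arm length l_1 and forearm length l_2, placing the hand at a point (x, y) requires joint angles satisfying

\theta_2 = \arccos\!\left(\frac{x^{2}+y^{2}-l_1^{2}-l_2^{2}}{2\,l_1 l_2}\right), \qquad \theta_1 = \operatorname{atan2}(y, x) - \operatorname{atan2}\!\left(l_2 \sin\theta_2,\; l_1 + l_2 \cos\theta_2\right)

A real arm has far more joints, muscles and degrees of freedom, so the brain’s actual control problem is much harder still – yet none of this machinery is open to introspection, which is precisely the gap between implicit competence and explicit knowledge.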

There are several more complications, but I'll only mention one more. It is regularly extremely difficult to determine whether there is a scientific consensus on some topic and, if so, what it actually is – especially so when ideologically committed pseudoscientists muddy the waters. For example, the overwhelming majority of the relevant experts agree that evolution by natural selection – the fact of evolution and the theory of natural selection (etc.) – is established beyond all reasonable doubt. Creationists, however, have tried to argue there is no such consensus and have even compiled lists of scientists who 'disagree with evolution' (cf. Project Steve). Laypeople who do not understand the scientific method might see two sides 'debating' and have real difficulty figuring out who to believe. They might not realize, for example, that scientific consensus is not about lists of people who agree or disagree, that only the relevant experts are important (engineers who 'disagree with evolution' qua engineers tell us nothing), that no paper critical of evolution has appeared in a mainstream peer-reviewed journal (save by fraud) for several decades, and so on. Even a layperson convinced it is important to defer to scientific consensus, then, will sometimes have real trouble determining whether there is a consensus and what the consensus actually is. There are two ways of dealing with this problem, I think. The first is to employ the much underused phrase "I don't know". Agnosticism isn't particularly popular, but I think openly admitting what you do and do not know is one of the most important intellectual virtues. So when you can't figure out what the consensus is (or whether there is one), it doesn't suddenly become reasonable to form opinions in the absence of knowledge; agnosticism is then the reasonable course. The second answer to the above problem is having certain metacognitive skills – an understanding of the scientific method and the academic process, familiarity with cognitive biases, skepticism and an ability to assign onus appropriately, finely honed critical reasoning skills, a basic understanding of statistics, and so on – that are useful for evaluating any claim. While these are not sufficient to understand the details of any area of science, they allow for a 'popular-level' grasp of the field, which in turn enables one to identify what the findings are (and some of the reasons why they're established), and to determine, with some (hard) work, whether there is a consensus on some question.

"The fundamental cause of the trouble," wrote Russell, "is that in the modern world the stupid are cocksure while the intelligent are full of doubt". While dividing the world into 'the stupid' and 'the intelligent' is probably going too far, I think Russell is on to something: it is those who are ignorant of science who are certain they're right - even when they're not. The Dunning-Kruger effect suggests why: the intellectually unskilled lack the intellectual skills needed to recognize that they are unskilled. They are, in other words, unskilled and unaware of it. Dunning and Kruger also showed, however, that people could be trained to become somewhat more competent, which then allows them to recognize the depth of their incompetence. What I have shown in this post, I hope, is that, in a sense, we are all cognitively incompetent relative to the stupendous complexity of the universe. It is science (or, more broadly, the project of secular reason) that holds out a candle in the dark: we have uncovered nature's secrets only because we invented this 'technology of truth' and those who wish to advance our knowledge or understand a particular phenomenon deeply must approach it humbly and pay their dues in long and intensive study. Those of us who have not paid our dues in a particular field can only defer to those who have or remain agnostic. There is no reasonable alternative. 

The last word goes to the great Bertrand Russell:
The demand for certainty is one which is natural to man, but is nevertheless an intellectual vice. So long as men are not trained to withhold judgment in the absence of evidence, they will be led astray by cocksure prophets, and it is likely that their leaders will be either ignorant fanatics or dishonest charlatans. To endure uncertainty is difficult, but so are most of the other virtues.

22 comments:

  1. Thanks Michael, a closely argued and timely piece. One can only hope that those most in need of taking it to heart will do so. The underlying historical prompts for the mistaken idea that anyone can be an expert on anything are briefly examined here.

  2. "folk psychics"? Are those the kind that tell your future to the sound of acoustic guitar?

  3. Thanks defollyant... Yeah, I've seen your piece. Indeed, I've commented on it! :-)

  4. Anon: well, no. Folk physics refers to (1) people's implicit knowledge and abilities to deal with the world (catching a ball is damn complicated to do, but we can do it), and (2) explicit intuitions regarding physical phenomena. (Newton's laws of motion, for example, conflict with our intuitions: we intuitively feel something needs 'ooomphf' to continue moving).

  5. Ooopsie, so you did. I’d forgotten that. *Blush*

  6. Dear Mr Meadon. Please, for my old eyes, use more paragraphs and spacing. It looks too cluttered and therefore too much; therefore uninteresting.

  7. Arrived here via Mike the Mad Biologist's blog. Great article. Bookmarked ya.
    Thanks, Mark

  8. Anon 1: sorry about that. Will keep that in mind and I'll see whether there's anything I can do...

    Anon 2: thanks! :-)

  9. Intellectual deference, as you praise it in this article, is indeed as you stated "both unavoidable and a virtue." As you say, "The universe is chock full of all sorts of preposterously complicated phenomena, and no one person could ever hope to understand all or even a significant proportion of it in full." This is so true, I couldn't agree more.

    There's one thing I see lacking in this article. With deference being so necessary, since our intellects are so small and incapable of understanding everything, what about deferring to God? As Creator of the universe, God is the only being who knows everything about everything, and better yet, He has provided mankind with revelation in the Bible. As you point out, "the human mind is so prone to error and bias", so what better than to turn to an infinite mind that is not prone to error and bias?

  10. The vast majority of the experts agree that the God to which you refer is a Bronze Age invention. There are no good reasons to believe in Gods, fewer to believe in your specific version of God, and fewer still to think the Bible is even close to infallible. (It gets worse. The Bible, I am sure you'll agree, is difficult to interpret. But what use is a difficult to interpret infallible? Since we're fallible, we're bound to interpret it incorrectly...)

  11. Sorry, edit fail. "But what use is a difficult to interpret infallible book?"

  12. Your fallback on the "vast majority of the experts" is not a good reason to reject God. There are many experts that believe in a God, I would say a majority of experts that believe in a God. However, since it's pointless to argue numbers (how many experts support your position or my position), take the following quote from your above article:

    Given the complexity of the universe and the limitations of the human mind, expertise is (for many subjects) a necessary but not sufficient condition for having justified opinions.

    What I understand you to be saying is that your opinion that there is no God is not a justified opinion. Furthermore you conveyed the idea that mankind knows so little about the whole universe. Therefore since you don't have all knowledge, you can't say that there is no God, because you don't know everything.

    You say that the Bible is difficult to interpret, we are bound to interpret it wrong, so why even try. Well I would put this back to you, that because we are infallible, we are bound to interpret science wrong, so why even try.

    In your article you say, "it is because the human mind is so prone to error and bias that we need this vast, expensive and seemingly inefficient set of institutions, norms and practices we call science. Science, roughly and to first approximation, is a collaborative enterprise aimed at a unified description and explanation of natural phenomena where the ultimate arbiter of truth is empirical experimentation, the reliability and quality of which is evaluated by a community of scholars through peer-review and replication. "

    What I find ironical about this statement is that you don't seem to see a problem with allowing the infallible mind of man to interpret science, and establish 'truth' which you consider to be reliable. If you follow the logic through, there's a problem, because we have to interpret something, whether it's the Bible or the evidence. If you incorrectly interpret the evidence, then science and the truths built on it are false.

  13. Robert, it looks like you’ve got “infallible” mixed up with “fallible.” Twice you wrongly credit humanity with infallibility. But whatever. Your god doesn’t answer anything usefully and raises hugely more complicated questions than those s/he pretends to be an answer to. The bible, like all so-called “holy” books, is a collection of (Bronze Age) fables whose sole value is literary. To discard the findings of science on the grounds that we as humans are fallible is to reveal a profound misunderstanding of how science accumulates knowledge. Real and useful knowledge. The methods of science are themselves refined over time to minimise and to correct error, as are the findings of science themselves. The bible has no such mechanism for admitting error. The bible is inaccurate on many physical questions. Why else would there be over 38,000 recognised flavours of Christianity, all at doctrinal odds with one another? I mean they all proceed from the same source material so it is surprising to find such widespread incompatibility.

  14. Rob: I realize 'the experts think God does not exist' is far from a perfect argument against its existence. But I have a degree in philosophy, so I know something about the issues involved. And the experts - philosophers of religion - all pretty much agree there are no good reasons to believe in God. Obviously, some think there are such reasons, but they are in a minority. Now, that doesn't mean they're wrong. But it DOES mean anyone who disagrees with the consensus better know his stuff. That was the point of this whole article. Now, correct me if I'm wrong, but you don't have expert-level knowledge in this field, do you?

    (By the way, I think there is less reason to defer to consensus among philosophers than to consensus among scientists. And, of course, getting consensus among philosophers is rare indeed.)

    Also: what Con-Tester said.

  15. Con-tester:
    Yes, a typo, I did write infallible when I meant fallible. Sorry.

    I never said that I discard science; I only pointed out your inconsistency in attacking the Bible, but completely trusting science. You say that my God does not answer anything usefully -- on the contrary it is only with a starting point in God that there can be absolutes, that there can be truth; and it is only from the Bible that we have the answers to life, Who am I? How can I know? and What do I do? Can you answer these questions, relying only on science and evolution?


    Michael:
    If you reject God, then you have to put your faith in something else, let's say science. That's the difference between you and me, and why we can never come to an agreement. You have made your starting point with man and science. I have made my starting point with God. On the basis of His word, I have a firm basis for truth, absolutes, morals, and my life has meaning and purpose. I see a host of reasons to believe in a God.

    You keep referring to the 'majority' supporting your view. What's interesting about that is that it is very relative. Are you referring to a majority in a particular university, or association, or a particular country? I could easily cite polls that show that a majority of Americans believe in a God. Does that affect the validity of either position? No. Neither does your claim of numbers.

    As for your questioning of my knowledge, it's true that I don't have a degree in science or philosophy; however, I have been taught by experts in those areas as well as theology, so I have a pretty reasonable knowledge. Either for you or for me, the best knowledge of God is the one that's built in. Just as you are self-conscious, deep inside you know there is a God.

  16. Robert Tetzlaff wrote (February 9, 2010 6:27 AM): “I only pointed out your inconsistency in attacking the Bible, but completely trusting science.”
    What inconsistency? The findings of science are always provisional (i.e. subject to later revision if needed) and only credible to the degree that they are supported by physical evidence, that they don’t contradict other well-established science and that they yield fruitful accounts of certain classes of phenomena. Science doesn’t require trust or faith; understanding and a certain humility to acknowledge that one may be wrong are all that are required. All religions disdain doubt about their respective doctrines, and several warn of dire consequences for those who do entertain doubts. Your bible is not reliable because it permits as many interpretations as there are readers of it. Anyone can make just about whatever they want to out of its content.

    Robert Tetzlaff wrote (February 9, 2010 6:27 AM): “[I]t is only with a starting point in God that there can be absolutes, that there can be truth; and it is only from the Bible that we have the answers to life, Who am I? How can I know? and What do I do?”
    Science doesn’t deal in absolutes, and religion only pretends to. Religious precepts have changed over time with changes in social needs and perceptions – for example, Western societies no longer stone to death adulterers and homosexuals. Religion rides piggyback on people’s need for security, order and predictability in our lives, and what can offer greater security than an “absolute” answer? You might find those “answers” meaningful (and long may they give you comfort), but many people see them for the unsatisfying con that they are.

    Robert Tetzlaff wrote (February 9, 2010 6:27 AM): “Can you answer these questions, relying only on science and evolution?”
    No, but neither can religion, otherwise there would be no atheists, most of whom were believers at one time. It is up to the individual to find their own answers to those questions. To have them dictated to you from birth by parents, peers and teachers is to accept them out of habit and because we all are drawn to thoughts of transcendence. Such belief is not the result of introspection, critical scrutiny and consequent thought. BTW, evolution is science.

  17. Rob: I suggest you read my post if you haven't done so. I am referring to a consensus among the relevant experts, i.e., philosophers of religion. (And, no, theologians don't count. Why? For the same reason we don't allow plaintiffs to be judges in their own cases). The average American is not an expert in anything, so they're far from important here. That there is a consensus among the relevant experts that there are no good reasons to believe in God obviously doesn't "prove" she/it/he doesn't exist. It only shows people who want to say otherwise better know what the hell they're talking about. I'm not saying you don't know this area, I'm clarifying what my argument is.

    I am interested to hear what you think these good reasons for the existence of God are...

  18. Con-tester:
    I believe in God, therefore I believe the Bible. I find it to be very reliable.
    You believe in man and science, the Bible doesn't fit your worldview, and you find it to be contradictory to modern interpretations of the evidence.

    You say that science does not require faith. But every person must start with faith in something. You cannot live without a belief in something. I start with faith in God and His message in the Bible. What do you have faith in? Based on your comments, it sounds to me like you have faith in science, but I'm glad to let you answer that question yourself - What do you have faith in? What are your beliefs?

    You criticize religion, but did you realize that you have a religion too? Atheism and Evolution are both religions. Read Webster's definition of religion; one definition is as follows: "a cause, principle, or system of beliefs held to with ardor and faith." Would you not agree that you hold to your principles with ardor?

    You say that the belief in a religion such as Christianity is not the result of introspection, critical scrutiny, and consequent thought. However, I have given my faith critical scrutiny, introspection and thought. I have also given evolution critical scrutiny, introspection, and thought. And the result has been a firmer belief in God and the recognition that it is the only reasonable religion to hold.

  19. Michael:
    I did read your post all the way through. I see that you are only interested in what a particular set of experts believe. As for my belief in God, I could give you some reasons, but first, why don't you take a look at this page: http://www.proofthatgodexists.org/

  20. Rob, frankly, you're rehearsing silly, tired old arguments. "Evolution and science are religions"? Oh come now, that's dumb.

    As for the website. Nothing new, nothing interesting and fallacies abound. (Particularly, false dilemma).

  21. Robert, your assertion, “Atheism and Evolution are both religions” (February 10, 2010 6:57 AM) is a convenient fiction that was cooked up by religionists some time ago. First, it simply isn’t so, and no amount of repetition will magically make it true. Atheism is the absence of belief in a (theistic) god. It doesn’t entail a positive belief in such a god’s non-existence. The latter position, while possible, might be termed “strong atheism.” A common analogy is in the retort, “Atheism is a religion like baldness is a hair colour.” Evolution rests on much strong evidence, which means that faith is not required. Said Dan Barker, former preacher who later co-founded the Freedom From Religion Foundation, “Truth does not demand belief. Scientists do not join hands every Sunday, singing, ‘Yes, gravity is real! I will have faith! I will be strong! I believe in my heart that what goes up, up, up must come down, down, down. Amen!’ If they did, we would think they were pretty insecure about it.

    Second, your assertion is a tu quoque fallacy, one of several in your post of February 10, 2010 6:57 AM. Even if it was 100% correct (which it isn’t), it would provide no validation at all for your own beliefs unless you are also defending a postmodernist conception of “truth,” but that would be in blatant conflict with your earlier ideas about “absolute answers.”
