Showing posts with label belief.

Wednesday, January 28, 2015

Hypothesis testing and arguments

Lately ideas related to hypothesis testing have been showing up a lot on the blogs I read, both in general--what exactly is it, what is the right way to do it, how do we teach it, how should we frame the conclusions, etc.--and in specific contexts. A common caveat is that failing to reject a null hypothesis is not the same as demonstrating that the null is true. Can't argue with that, but when facing this kind of question--what conclusion can we or should we draw if we do or do not reject the null--I think that it is essential to recognize that the statistical test is being employed as a rhetorical device and that it should not be viewed as independent of other elements of the argument. Hypothesis testing, like any bit of statistical inference (or any bit of science, for that matter), can in principle be executed and presented in a value-free manner, but in practice it virtually never is. Once you start talking about what we would like to know, you're imposing values on the process. Any argument about what we should do based on the statistical results will necessarily be value-laden. If there is some decision to be made based on the results of a hypothesis test in which the null was not rejected, one has to decide how failing to reject the null bears upon that decision, and there isn't going to be a single, objective answer to that.
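The caveat is easy to see in a quick simulation. Here is a hypothetical sketch (mine, not from any of the posts I link to), using a one-sample z-test with known variance for simplicity: the null hypothesis that the mean is zero is false, but with a small sample the test has low power, so it fails to reject most of the time. A non-rejection therefore tells us very little about whether the null is true.

```python
import random
import math

random.seed(42)

def z_test_fails_to_reject(n, true_mean, sigma=1.0):
    """One-sample two-sided z-test of H0: mean == 0 at alpha = 0.05.

    Returns True if we FAIL to reject H0.
    """
    sample = [random.gauss(true_mean, sigma) for _ in range(n)]
    z = (sum(sample) / n) / (sigma / math.sqrt(n))
    return abs(z) < 1.96  # critical value for alpha = 0.05, two-sided

# H0 is false (the true mean is 0.2), but with n = 20 the test is underpowered:
trials = 10_000
failures = sum(z_test_fails_to_reject(20, 0.2) for _ in range(trials))
print(f"Failed to reject a false null in {failures / trials:.0%} of trials")
```

With these numbers the test misses the real effect in the large majority of trials, which is exactly why "we did not reject the null" and "the null is true" are different claims.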

Wednesday, May 21, 2014

More about beliefs vs. facts

This article refers to research that makes some progress on the question of how to correct false beliefs. At the same time, reading it renews the concern I discuss here and here. If the article is correct about the reason for the persistence of false beliefs, we should expect that some people hold correct beliefs for the wrong reasons. That is, if beliefs can be based on something other than facts, then some beliefs will be only coincidentally consistent with facts. These coincidental cases can lead to an unjustified confidence in one's whole process of belief formation. I'm concerned that someone reading the article might reason as follows:
  • Global warming is a problem.
  • I believe that it is a problem.
  • Therefore, I am on the side of truth.
  • Therefore, my beliefs about other things are correct also.
Quite the meta-problem.

Wednesday, May 7, 2014

De-politicization of beliefs

An addendum to yesterday's post.

Say there is some issue A, like "climate change is a problem that warrants policy intervention," with which liberals are more likely to agree and conservatives are more likely to disagree (or vice versa). There has been talk of de-politicizing such issues, meaning removing the association of the issue with one ideological group (which may be a matter of detaching the issue from underlying beliefs or values that differ across ideological groups). The presumed benefit of de-politicization is that it would make conservatives more likely to believe A. But at the same time, it would make liberals less likely to believe A. That it is a political issue cuts both ways: it's harder to get some ideological groups on board, easier to get others.

Now, what we really want to accomplish is to get people to believe something because it's actually true. Appealing to reason doesn't seem to be the way to go, its effectiveness being severely limited. Perhaps there is a way to appeal to beliefs or values that are common to many ideological groups--stuff that tends to unite rather than divide people. This raises the question of whether people adopt beliefs because they want to agree with their own group or because they want to oppose some other group (about which I wrote a paper). If you argue an issue in a way that is meant to appeal to one's humanity (rather than one's ideology), will that fail because there is no obvious group to which one can be opposed? Does Haidt's "groupishness" have an inherent us-vs-them component? If so--if people need to find something to disagree about--then maybe the thing to do is change the focus of disagreement to something more innocuous. That is, try to get liberals and conservatives to expend their us-vs-them energies on something that doesn't have such dramatic consequences.

Tuesday, May 6, 2014

A good point about climate change beliefs, and beliefs in general

Dan Kahan's research deals largely with individuals' adoption of beliefs as a means of identifying with some kind of ideological group: e.g., someone denies that global warming is a problem, not as a result of weighing the evidence but because the person wants to identify with fellow conservatives, who tend to hold that belief. I'm not completely convinced that the mechanism for belief adoption is exactly as Kahan says it is, but it does seem perfectly clear that publicly stated opinions are not always about truth and may not even be based on perceived truth.

Arnold Kling comments on Kahan:
Kahan advises climate worriers to try to engage in public discussion in ways that are less “culturally assaultive.” This assumes that climate worriers care more about climate policy than about asserting their moral and intellectual superiority over conservatives. The most charitable I can be is to say that I am willing to wait and see whether that is the case.
I can put it a bit more charitably: even if the "worriers" are correct, some of them are adopting the right belief for the wrong reason. Kahan focuses on those who deny global warming for the wrong reasons (although he may very well address the flip side somewhere). One might infer from this treatment that everyone who believes in global warming does so for the right reasons, or that it doesn't matter why they think what they do as long as they end up at the correct conclusion. In fact, it is essential to recognize that lots of belief in global warming is poorly founded, and that this is part of what makes the discussion "culturally assaultive." It is a mistake to assume that we can detach the way beliefs are defended in some public arena from the way those beliefs were formed in the first place.

Which is not to say that there isn't some actual truth in the matter. In this case, I really have no doubt that global warming is a problem that warrants action, and I don't think that anyone approaching the issue with even a pretense of objectivity can reasonably claim otherwise. But supporting a reasonable conclusion for the wrong reasons doesn't do any good. I see this kind of thing all the time, with respect to lots of issues. For example, I might agree with someone that having a minimum wage is a good idea, but if that person denies that the efficiency effects even exist, I can't take their opinion very seriously. It becomes a real problem when someone who does not support a minimum wage encounters such an opinion. One can then jump to the conclusion that those who support a minimum wage don't even understand its effects, avoiding the reasonable discussion that could be had about how severe the inefficiencies really are and whether they are worth creating.

I don't know whether framing public debate this way makes me feel more or less optimistic about the potential for resolving disagreements constructively.

Wednesday, April 16, 2014

God and null hypotheses

Comedian Tim Minchin performs a song, "Thank You God," prefatory to which he tells a story about meeting a fan of his who is a Christian. When asked why he does not believe in God, Minchin says this is part of a more general policy of only believing things for which he has evidence. What he does not acknowledge is that there also is no evidence that God does not exist. Don't get me wrong: I do enjoy the man's comedy. However, if we're being scientific about this, we start with one of two null hypotheses--God exists, or God does not exist--and we cannot reject either. One might argue that one of those hypotheses is the more reasonable starting point, but it is incorrect to claim that the lack of evidence is an unconditional indication that God does not exist (although, to be fair, I'm not completely sure Minchin was trying to make that claim; maybe he was standing up for agnosticism rather than atheism).

Scientists of all kinds avoid any confusion over the conclusions of research by stating the conclusions carefully. You'll often hear the statement, "There is no evidence that [blank]," which basically means that if we start with the assumption [not blank], we don't have enough evidence to reject that assumption with any reasonable degree of certainty. That could be the case after hundreds of peer-reviewed studies have attempted to demonstrate [blank], or it may be that no one has even tried. Either way, "There is no evidence that [blank]" is not at all the same claim as "There is evidence that [not blank]" and should not be treated as such. Case in point: when my firstborn was an infant, I heard from more than one source that there is no evidence that allowing a baby to "cry it out" causes any lasting psychological damage, but I wasn't about to let my son cry for hours at a stretch.
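The asymmetry shows up clearly in a power calculation. As a hypothetical sketch (my own illustration, assuming a two-sided z-test and a true effect of 0.2 standard deviations), an underpowered study is very likely to report "no evidence that [blank]" even when [blank] is real, so how informative that phrase is depends heavily on how hard anyone has looked:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def power(n, effect, sigma=1.0, crit_z=1.96):
    """Probability a two-sided z-test at alpha = 0.05 detects a true effect."""
    shift = effect * math.sqrt(n) / sigma
    return (1 - normal_cdf(crit_z - shift)) + normal_cdf(-crit_z - shift)

# "No evidence" from n = 10 means almost nothing; from n = 1000 it means a lot.
for n in (10, 50, 200, 1000):
    print(f"n = {n:4d}: power = {power(n, 0.2):.2f}")
```

At n = 10 the test detects this effect less than 15% of the time, so a null result there is nearly uninformative; only at larger sample sizes does "no evidence" begin to resemble "evidence of absence."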

I think it is common for null hypotheses to go unnoticed or unacknowledged. There's always some kind of default belief. Consider the vaccination scare, still active in some quarters. I don't think that childhood vaccinations cause autism, and that conclusion is predicated on the belief that we shouldn't think that vaccines cause autism unless we have some good reason to do so. One could take the opposite belief as the default, once the possibility of a connection is raised: that vaccinations do cause autism, or some other kind of harm, unless we have strong reason to believe otherwise. I'm not sure if that belief is thoroughly unreasonable, but it is certainly subject to criticism. Such as: given all of the potential causes and potential effects in the universe, any pair chosen at random are extremely unlikely to be related, and therefore the reasonable default belief is that there is no causal relationship between any pair of randomly chosen events. And why are we focusing on autism in particular? Why not other disorders or diseases, or even positive effects, for that matter? Why not take the default belief to be that childhood vaccines cause pattern baldness later in life? And so forth. Apart from the reasonableness of the null itself, adopting the wrong null is costly if there is insufficient evidence to reject the null. If you avoid vaccines because you believe they cause autism, you risk disease that vaccines can prevent (and we have very good evidence for that).

Back to Tim Minchin: it hardly makes sense not to believe anything without evidence. A more defensible statement is that one will not deviate from some kind of baseline beliefs without sufficient evidence. Then the origin of these baseline beliefs becomes the issue. These baseline beliefs are often unstated and may even be subconscious. I doubt that the anti-vaccination crowd is thinking things through in the way I describe above. Opposition to vaccinations may arise from a fundamental belief in the ability of the human body to heal itself under some kind of natural conditions, or from a distrust of the medical establishment's ability to improve health in a substantive way. I suspect that anyone considering the vaccination question has some kind of predisposition toward one side of the debate or the other, and that this predisposition often goes unarticulated.

I can imagine all kinds of baseline beliefs that fuel conflict over social and political issues. For example:
  • People are basically good (or bad)
  • Government agencies are generally corrupt (or trustworthy)
  • Life on Earth is generally getting better (or worse)
One can find indications of any of these things but nothing like definitive proof. If we think about beliefs scientifically, the way to get evidence for something is to assume the opposite and then demonstrate that the data are inconsistent with that assumption. There is no further scientific guidance in forming the null hypothesis: it's just the thing you are trying to disprove. Of course there will always be some reason why one chooses to look for evidence of one particular thing, but this cannot be reduced to a purely logical exercise. I would like to see more explicit acknowledgement of all sorts of baseline beliefs: e.g., the anti-vaxxer says, "I oppose vaccinations because I think medical science is full of it," or the research scientist says, "I am trying to prove this result because it would be neat if it were true."

Tim Minchin's disbelief in God is a matter of faith. Nothing wrong with that, and he can try to argue that this belief is somehow better or more reasonable than the opposite belief. But he can't claim that he is on the side of objective truth.