Thursday, March 22, 2012

Introduction to Intentionality #2: Materialist Solutions

In the first post, we looked at the problem of intentionality and why it may pose a serious challenge to materialistic and naturalistic theories of mind. No successful explanation of intentionality in purely physical terms has been given so far, but several attempts have been made. This post will look at a few of them.

I. Conceptual Role

Let's consider the possibility that something represents something else in virtue of its relations to other representations. So, for example, your thought about Mt Everest refers to Mt Everest because of its relationship to neighboring thoughts, such as "mountains are large landforms" and "the Himalayas are a mountain range in China and Nepal", and so forth. A thought represents something by standing in context with other thoughts that also represent things. This web of representations, each member getting its meaning from its relations to the others, is what is supposed to give any single thought its content.

The obvious objection is that, while this nicely explains how a thought has one specific meaning rather than another, it doesn't explain how anything has any meaning at all. All these other thoughts also have meaning, and if they get their meaning from the context of still other thoughts, then the explanation regresses to infinity and never explains how any thought has any meaning in the first place.

The answer offered is that these thoughts rest on a Background of non-intentional capacities for interacting with the world. At the foundational level there is, for example, the behavioral capacity for climbing a mountain, and this forms the foundation for the thoughts and beliefs having the meaning that they do.

The problem with this is that there is a fundamental difference between mere motion and intentional action. A rock rolling down a hill is not intentional action; it's just matter in motion. But an unconscious behavior, such as fleeing from a predator, is intentional action, since it is directed towards a specific end. So the behavior at the bottom level would still have to be intentional, if it is to explain anything, and intentionality is left unreduced.


II. Causal

An improvement on this theory is to recognize that for a thought to represent something else, it must connect somehow with the external world. So if there is a constant conjunction between an external stimulus, such as the sight of Mt Everest, and the internal brain event that processes it, then eventually that thought or brain event will come to represent its referent. The external stimulus is what causes the thought to point to something.

This theory is appealing but has several problems.

The first problem is that it can't account for misrepresentation. You might have a thought about Mt Everest but in fact be mistaken: your thought is really about the neighboring mountain Lhotse, because it is night, conditions are bad, and you have misidentified the peak. So in this case your thought about Mt Everest is caused, not by Mt Everest, but by Lhotse.

This raises yet a further problem. If either Mt Everest or Lhotse-at-night can cause your thought about Mt Everest, then why should the thought be considered a thought about [Mt Everest], rather than a thought about [Mt Everest or Lhotse-at-night]? Why does Everest get to win and be what the thought is about? A possible response is to say that one stimulus, Mt Everest, normally causes the thought, while other stimuli, such as Lhotse-at-night, are parasitic on it. The second thought, caused by Lhotse-at-night, would not exist without the "parent" thought, caused by Mt Everest.

But there are even more problems.

Your thought about Mt Everest might be caused, not by Mt Everest, but by hallucinogenic drugs, or by the Matrix. Perhaps we live in the Matrix and there is no actual Mt Everest, in which case your thought about it was not caused by it at all, because it doesn't even exist. Your thought about Mt Everest was caused instead by computer input fed directly into your brain by the Matrix.

Also, this theory at best explains why a certain thought has one meaning rather than another; it doesn't explain how thoughts can have any meaning at all in the first place.

And finally, there is yet another problem with the causal account. There is a long causal chain running from the stimulus that is the object of thought all the way to the brain processes. Photons travel from the sun, strike the snow of Mt Everest, travel through the air and into the eyes, where they trigger nerve impulses that travel up the optic nerve to the visual cortex, where they cause other brain processes, and so on. This is one long chain of causal processes, with Mt Everest at one end and the thought that represents Mt Everest at the other. Yet if we set aside what we already know to be the two end points of this chain, and look at it in purely materialistic terms, we just have one long causal chain with no objective end points. Suppose the photons leaving the sun are A, their travel through the air is B, Mt Everest itself is E, the travel from the mountain to our eyes is F through M, entering our eyes is N, hitting our retinas is O, the signal travelling up our optic nerve is P, and so on, up to the final brain process, T. Nothing in the chain itself singles out E and T as the end points that matter. So we would have to bring in another mind to say that T represents E, rather than, say, M representing F. And bringing in another mind to assign meaning is once again to leave mind, and representation, unreduced and unexplained.

III. Biological Semantics

So if both these theories leave much to be desired, let's turn instead to biological theories of intentionality. We could think of representations and intentionality as biological functions, designed by natural selection. The need to run away from predators is a biological function, and perhaps over time this function comes to serve as a representation, of predators, say.

This theory avoids the misrepresentation problem. On this view, your thought that represents bears is caused, not by bears themselves, but by the desire to avoid bears. So if you mistakenly believe that an old stump is a bear, this can be explained as a result of your representation being caused by the desire to avoid predators.

One problem with this theory is that it can't account for sophisticated and abstract beliefs and thoughts concerning math, philosophy, or other high-level reasoning. Natural selection would not have been able to program such functions in.

Biological theories of intentionality also can't handle the disjunction problem. If evolution has wired in your desire to avoid both bears and stumps-that-look-like-bears, then your thought that represents bears would actually represent bears-or-stumps-that-look-like-bears, not bears specifically.

Some philosophers respond that this is not a problem with the theory: thoughts in fact do not have exact meanings, and there is no specific fact of the matter about what any particular thought is about. But this seems wrong. When you think about Mt Everest, there is a specific thing you are thinking about: Mt Everest. There is a specific object of your thought. Your thought about Mt Everest isn't a thought about "maybe Mt Everest, maybe Lhotse, maybe chocolate milk, whatever". It's about a specific object, or referent. However, even if we accept that our thoughts are not about specific referents, and accept the biological theory of intentionality, there remains yet another problem with it.

Even if the theory can explain why certain thoughts have certain meanings, to explain why any thought has any meaning at all in the first place it has to appeal to biological functions: behavior directed at specific ends, goals, or aims. But the whole point of Darwinian evolution is that it explains life without having to invoke functions, purposes, teleology, and the like. On a strictly Darwinian picture, the function of the stomach is not to digest food; the stomach causes the digestion of food, but it has no function. So if intentionality, the meaningfulness of thoughts, is to be explained in terms of biological function, then this theory illegitimately smuggles in teleology, which is anathema to materialistic theories.

IV. Instrumentalism

So let's look at a final materialist theory of intentionality.

On this view, associated with Daniel Dennett, we can take several different stances towards things. If we need to know how to move an elephant, we need to know its weight, size, and so on; we take the physical stance towards it. If we need to know what the elephant's heart does, we can take the design stance: we act AS IF the heart were designed for a particular function, pumping blood, even though it wasn't designed. And finally, if we need to understand elephant behavior, we can act AS IF the elephant were consciously aware of its behavior. If the elephant trumpets to warn its herd of a predator, we can act AS IF the elephant were consciously deciding to do so rather than just acting on blind instinct. This is called the "intentional stance." We ACT LIKE the elephant is figuring things out mentally, and has intentionality, but really it doesn't. Or think of a DVR. We can take the intentional stance towards it, speaking about it as if it "knows" our favorite shows and is "smart enough" to record them, even though we know it isn't really.

So this theory says that our own intentionality reduces to just that: we act AS IF we have intentionality, but really we do not.

A few obvious problems pop up right away. For us to take a conscious "stance" towards anything already presupposes intentionality. If our minds were not really directed towards specific targets or goals, we couldn't take any stance towards anything.

A possible reply is that a subprocess in the mind assigns intentionality to thoughts. This is called a "homunculus": a little mind inside your mind. But then we would need to explain the intentionality of the homunculus by postulating another homunculus inside its head, and so on. Normally this would lead to an infinite regress and thus a non-explanation, but the philosopher Daniel Dennett proposes a hierarchy of stupider and stupider homunculi, until we get down to the bottom level of no homunculi at all. But that gap, from no homunculi to the first and stupidest homunculus, is not just a wide gap but an uncrossable one. For to leap from non-intentional and non-teleological matter to a very, very stupid intentional homunculus still requires some minimal intentionality to appear from nothing. In short, there is still an infinite distance between very, very minimal intentionality and ZERO intentionality.



V. Eliminativism

Since all of these materialist theories of intentionality are lacking in many ways, several philosophers have suggested just dumping intentionality altogether. This view is called eliminative materialism. One of the better-known versions holds that beliefs, desires, fears, thoughts, and so forth simply do not exist. There is only the brain and its chemistry, describable in neuroscientific terms, and that's it. Thoughts, beliefs, doubts, and the like, all intentional notions directed towards some target or subject, belong to a kind of folk theory. Primitive peoples ascribed all sorts of intentional notions to nature: the sea is angry about something, the sun desires sacrifices, and so on. The same goes for astrology, magic, and the like. These are all folk theories, which modern science has refuted. So, ascribing beliefs, desires, anger, fear, pain, and so forth to human beings is similar to astrology and folk magic, and a completed future neuroscience, obviously still far off, will likewise eliminate such notions from our account of human beings.

The obvious problem with this is how science and reason can even operate, or how we can have any knowledge at all, since these rely on intentional notions such as beliefs and thoughts. We all know we have doubts and beliefs and so forth, and the person who wants to convince us otherwise faces quite the uphill battle. The eliminativist is in a position similar to the solipsist, who wants to convince us, through philosophical argumentation alone, that the external world does not exist. The solipsist's purely philosophical premises are stacked against our direct, everyday experience of the external world, and clearly the solipsist loses. Similarly, the eliminative materialist stacks his purely philosophical premises against our direct and immediate experience of having beliefs, desires, and so forth, and so he too may be on the losing side.

VI. Conclusion

So, to sum up, no materialist theory of intentionality has been successful so far. Conceptual role theories explain how a belief has a certain meaning in virtue of its neighboring beliefs, but don't explain how a belief can have any meaning at all in the first place. Causal theories say that meaning comes from a causal connection with the outside world, but can't account for mistakes, or for beliefs that are not caused by their objects at all. Biological theories explain our beliefs in terms of biological functions, but functions do not exist in the non-teleological world of Darwinian evolution. Dennett's instrumentalism explains meaning as just a stance we take towards things, but cannot explain how meaning arises in the first place from meaningless matter. And finally, eliminativism, in saying that intentionality simply does not exist, may have the effect of making science and reason impossible, since these are intentionality-laden tools.

So intentionality remains a puzzle for materialist theories of mind, and whether a successful solution will be found remains to be seen.