Lots of mental states are conscious, lots of mental states are intentional, and lots of mental processes are rational, and the question does rather suggest itself how anything that is material could be any of these. -- Jerry Fodor

This quote was mined from the Philosophy ProFeser.
Feser himself goes on to say:
... the difference between conscious systems and unconscious ones seems clearly to be a difference in quality and not merely of quantity [of complexity]. This is the problem of consciousness.
As John Searle has put it, the robot’s symbolic representations – like words, sentences, and symbols in general – have only derived intentionality, while human thought has original or intrinsic intentionality. What can account for the difference, especially if we assume that human beings are no less material than robots? That, in a nutshell, is the problem of intentionality.
But no one has to assign meaning to our mental processes in order for them to count as logical. So, what accounts for the difference? How are we able to go from one thought to another in accordance, not just with physical causal laws, but in accordance with the laws of logic? That is the problem of rationality.

A bit more...
... the problems of consciousness and intentionality are... an artifact of certain historically contingent metaphysical assumptions.... In particular... the “mechanistic” revolution.
If nothing in the material world inherently “points to” or “aims at” anything else – if matter is comprised of nothing more than inherently purposeless, meaningless particles in motion – then, since the brain is made up of these particles no less than any other material object is, it seems to follow that the intentionality of our thoughts, that by virtue of which they inherently “point to,” “aim at,” or mean something beyond themselves, cannot be any sort of material property of the brain. Thus is generated the problem of intentionality.

I love this stuff.