True Believers: The Intentional Strategy and Why It Works
Reference: Dennett, D. C. (1981). True believers: The intentional strategy and why it works. In A. F. Heath (Ed.), Scientific Explanation. Oxford University Press. Reprinted in The Intentional Stance (MIT Press, 1987, ch. 2) and in J. Haugeland (Ed.), Mind Design II (MIT Press, 1997). Source file: dennett_true_believers.pdf.
Summary
Dennett’s canonical defence of the intentional stance: a predictive strategy in which one treats a system as a rational agent with beliefs and desires, works out what a rational agent so situated ought to do, and predicts it will do that. The stance is contrasted with the physical stance (predict from physical laws and microstructure) and the design stance (predict from presumed function). Belief is a perfectly objective phenomenon, but one discernible only from the point of view of an adopter of this strategy; a system is a true believer just in case its behaviour is reliably predictable by intentional ascription.
Dennett argues this vindicates a “mild realism” (or “interpretationism”) about belief against both hard-core realism (beliefs as sentence-like tokens in the head) and eliminativism. The remarkable empirical fact is that the intentional strategy works — on people, animals, chess programs, even thermostats (in a degenerate way) — because evolution and engineering produce systems whose behaviour tracks their interests given accurate beliefs. This pattern of success is what there is to being a believer; no further fact about inner sentential structure is required. The paper grounds the philosophical licence to ascribe mental states to artificial agents and thus underwrites the mentalistic vocabulary of agent theory.
Key Ideas
- Three stances: physical, design, intentional.
- Intentional strategy: assume rationality, compute ought-to-do, predict.
- True believer = system reliably predicted by the intentional strategy.
- Belief is objective but stance-relative (mild realism / interpretationism).
- Natural selection explains why the strategy works on organisms.
- Thermostats as degenerate (but continuous) intentional systems.
- Licence for mentalistic ascription to artificial agents.
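The three-step strategy in the bullets above can be sketched as a toy prediction loop. Everything here is an illustrative assumption for a thermostat (Dennett's own degenerate case), not anything from the paper itself:

```python
# A minimal sketch of the intentional strategy, applied to a thermostat.
# All names and the belief/desire representation are hypothetical.

def attribute_beliefs(situation):
    """Step 1: attribute the beliefs the system ought to have,
    given its perceptual access to its situation."""
    return {"room_temp": situation["temp"]}

def attribute_desires(situation):
    """Step 2: attribute the desires it ought to have,
    given its functional role (keep the room at the setpoint)."""
    return {"target_temp": situation["setpoint"]}

def predict_action(beliefs, desires):
    """Step 3: predict that the system does what a rational agent
    with those beliefs and desires ought to do."""
    if beliefs["room_temp"] < desires["target_temp"]:
        return "turn heating on"
    if beliefs["room_temp"] > desires["target_temp"]:
        return "turn heating off"
    return "do nothing"

# Treating the device as a (degenerate) intentional system:
situation = {"temp": 17, "setpoint": 20}
action = predict_action(attribute_beliefs(situation),
                        attribute_desires(situation))
print(action)  # -> turn heating on
```

The point of the sketch is Dennett's: the prediction succeeds without any claim that the thermostat contains sentence-like belief tokens; reliable predictability under this procedure is all the stance requires.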
Connections
- Intentional Stance
- BDI
- Intelligent Agents Theory and Practice
- Agent-Oriented Programming
- Society of Mind
- Mentalistic Semantics
- Two Faces of Intention
- Modeling Rational Agents within a BDI-Architecture
Conceptual Contribution
- Claim: Beliefs and desires are objective but stance-relative — a system genuinely has them just in case adopting the intentional strategy (assume rationality with appropriate beliefs/desires and predict accordingly) reliably predicts its behaviour.
- Mechanism: Dennett distinguishes physical, design, and intentional stances; defines the intentional strategy operationally (attribute beliefs the system ought to have given its perceptual and epistemic history, attribute desires it ought to have given its biological/functional role, predict the rational action); argues success of the strategy on people, animals, and artefacts is an empirical fact explained by evolutionary/design pressures, and that this success is all there is to being a believer — no further requirement of language-of-thought tokens.
- Concepts introduced/used: Intentional Stance, True Believer, Mild Realism, Design Stance, Physical Stance, Rationality Assumption
- Stance: philosophical theory
- Relates to: Justifies the ascription of BDI mental states to artificial systems in Modeling Rational Agents within a BDI-Architecture, Intelligent Agents Theory and Practice, and Agent-Oriented Programming; stands alongside Two Faces of Intention as a pillar of agent-theoretic philosophy; foreshadows functional-role arguments in Society of Mind.
Tags
#philosophy #intentional-stance #foundational #agents #belief