Citation
Axelrod, R., & Hamilton, W. D. (1981). The Evolution of Cooperation. Science, 211(4489), 1390-1396.
TL;DR
Tit-for-tat (cooperate first, then mirror the opponent's previous move) wins iterated prisoner's dilemma tournaments; cooperation stays stable when interactions repeat, players remember past behavior, and defection gets punished.
Axelrod and Hamilton's formal analysis of the tit-for-tat strategy showed in game-theoretic terms what vampire bats demonstrate biologically: cooperation can be evolutionarily stable when cheating is detected and punished. Their work on the iterated prisoner's dilemma demonstrated that a simple reciprocal strategy outperforms both unconditional cooperation and pure defection.
This research provides the game-theoretic foundation for understanding why reputation systems work: when interactions repeat, players remember past behavior, and the value of future cooperation exceeds the one-time gain from cheating, reciprocal strategies like tit-for-tat win out and remain stable.
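To make the mechanics concrete, here is a minimal Python sketch of the iterated prisoner's dilemma using the standard tournament payoffs (T=5, R=3, P=1, S=0); the strategy and helper names are illustrative, not from the paper. Note that tit-for-tat never beats a defector head-to-head; it wins tournaments by accumulating mutual-cooperation payoffs.

```python
# Minimal iterated prisoner's dilemma sketch (illustrative names, standard payoffs).

PAYOFF = {  # (my move, their move) -> my payoff; "C" = cooperate, "D" = defect
    ("C", "C"): 3,  # R: reward for mutual cooperation
    ("C", "D"): 0,  # S: sucker's payoff
    ("D", "C"): 5,  # T: temptation to defect
    ("D", "D"): 1,  # P: punishment for mutual defection
}

def tit_for_tat(opponent_history):
    """Cooperate on the first move, then mirror the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play_match(strategy_a, strategy_b, rounds=200):
    """Play repeated rounds; each strategy sees only the opponent's past moves."""
    seen_by_a, seen_by_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(seen_by_a)
        move_b = strategy_b(seen_by_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        seen_by_a.append(move_b)
        seen_by_b.append(move_a)
    return score_a, score_b

if __name__ == "__main__":
    print(play_match(tit_for_tat, tit_for_tat))    # mutual cooperation: (600, 600)
    print(play_match(tit_for_tat, always_defect))  # loses one round, then punishes: (199, 204)
```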
Key Findings from Axelrod & Hamilton (1981)
- Tit-for-tat (cooperate first, then mirror opponent's previous move) wins iterated prisoner's dilemma tournaments
- Cooperation becomes stable when interactions repeat and players remember past behavior
- Future benefits must exceed the immediate gain from cheating (see the stability-condition sketch after this list)
- Punishment of defectors is essential for cooperation stability
- Simple strategies (like tit-for-tat) outperform complex ones
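The third and fourth findings correspond to a concrete threshold in Axelrod and Hamilton's stability analysis: with payoffs ordered T > R > P > S and a probability w that the interaction continues for another round, tit-for-tat resists invasion by defectors only when w is large enough. The sketch below encodes that condition; the function name and default values are illustrative assumptions.

```python
# Sketch of the continuation-probability threshold from Axelrod & Hamilton's
# stability analysis: tit-for-tat resists invasion by always-defect and by
# alternating defect/cooperate when w >= max((T - R) / (T - P), (T - R) / (R - S)).

def tit_for_tat_is_stable(T=5, R=3, P=1, S=0, w=0.9):
    """Return True if the continuation probability w is high enough that
    defecting against tit-for-tat pays no better than cooperating with it."""
    threshold = max((T - R) / (T - P), (T - R) / (R - S))
    return w >= threshold

print(tit_for_tat_is_stable(w=0.9))   # True: the future weighs heavily enough
print(tit_for_tat_is_stable(w=0.25))  # False: the one-shot gain from cheating dominates
```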