Shared Truth Is Key to Human Cooperation

Humans are unparalleled among animals in our capacity to cooperate. While other social animals direct their help towards their kin – bees work together to ensure the success of the hive, and meerkats stand guard to protect their pups – humans extend their generosity to complete strangers, transcending the boundaries of genetic relatedness. A new study based on game theory now suggests that reciprocity and a willingness to accept shared facts are key to human cooperation.

It has long been known that human cooperation owes much of its success to reciprocity. We help those who are likely to help us, and we withhold generosity from people we expect to be selfish.

But research in economics and game theory has also shown that reciprocity can easily falter when we disagree on who is worthy of our generosity. We may not always share the same views on which actions are morally acceptable, or on which individuals deserve help. Often, we cannot even agree on the basic facts of what happened in the past.

Sarah Mathew and Robert Boyd, evolutionary anthropologists at Arizona State University’s School of Human Evolution and Social Change, have dedicated their careers to studying how human culture, morals, and norms of reciprocity have allowed our societies to evolve their unique “super-cooperation”. In a new study published this month in Nature Human Behaviour, Mathew and Boyd explain how cultural evolution may preserve reciprocity-based cooperation despite disagreements over the facts of past behavior.

Mathew and Boyd worked within a branch of evolutionary game theory known as direct reciprocity. In their models, pairs of players met in repeated interactions, each following a fixed behavioral rule, or strategy. In every round, a player could choose either to cooperate or to withhold help, depending on what the partner did in their previous encounter.

Previous research has firmly established that the retaliatory “tit-for-tat” principle – cooperate at first, then copy whatever your partner did in the previous round – and similar rules can foster cooperation, but that errors and disagreements allow selfishness to thrive.
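To get a feel for how fragile this is, consider a minimal simulation sketch of two tit-for-tat players who occasionally misread each other’s last move. This is my own illustration, not code from the study; the function name, the error model, and all parameter values are assumptions chosen for clarity.

```python
import random

def simulate_tft(rounds=10_000, error_rate=0.05, seed=1):
    """Two tit-for-tat players; each misreads the partner's last move with probability error_rate."""
    random.seed(seed)
    belief = ["C", "C"]  # each player's private belief about what the partner did last round
    cooperations = 0
    for _ in range(rounds):
        # Tit-for-tat: each player copies what they believe the partner did last round.
        moves = belief.copy()
        cooperations += moves.count("C")
        # Form next round's beliefs; each observation is misread with probability error_rate.
        for i, j in ((0, 1), (1, 0)):
            seen = moves[j]
            if random.random() < error_rate:
                seen = "D" if seen == "C" else "C"
            belief[i] = seen
    return cooperations / (2 * rounds)

for eps in (0.0, 0.01, 0.05, 0.10):
    print(f"perception error {eps:.2f}: cooperation rate {simulate_tft(error_rate=eps):.2f}")
```

With no errors the pair cooperates forever, but even a small chance of misreading a move sets off retaliation spirals that drag long-run cooperation far below full cooperation.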

In the new study, the researchers introduced a previously unexplored behavioral rule, which they call “arbitration tit-for-tat”. If the two partners disagreed about what actions had occurred in the past, they had the option of paying a cost to consult a third-party arbiter. The arbiter then used their own perspective to align the players’ beliefs about what had transpired.
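The published model is richer than anything that fits in a few lines, but a toy variant conveys the intuition. In the sketch below (my own construction, with invented names and parameter values, not the strategy as published), a player who thinks it saw a defection pays to consult the arbiter before retaliating, and then acts on the arbiter’s account rather than on its own private perception.

```python
import random

def noisy(move, err):
    """Return the move as perceived, misread with probability err."""
    if random.random() < err:
        return "D" if move == "C" else "C"
    return move

def simulate_arbitration(rounds=10_000, error_rate=0.05, arbiter_error=0.05, seed=1):
    """Tit-for-tat players who consult an (imperfect) arbiter before retaliating."""
    random.seed(seed)
    belief = ["C", "C"]  # each player's working belief about the partner's last move
    cooperations = 0
    consultations = 0
    for _ in range(rounds):
        # Each player plays tit-for-tat on their own, possibly corrected, belief.
        moves = belief.copy()
        cooperations += moves.count("C")
        for i, j in ((0, 1), (1, 0)):
            perceived = noisy(moves[j], error_rate)
            if perceived == "D":
                # Before retaliating, the player pays for a consultation; the arbiter's
                # own (also imperfect) reading of the partner's actual move replaces
                # the player's private perception.
                consultations += 1
                perceived = noisy(moves[j], arbiter_error)
            belief[i] = perceived
    return cooperations / (2 * rounds), consultations / rounds

coop, consults = simulate_arbitration()
print(f"cooperation rate {coop:.2f}, arbiter consulted {consults:.2f} times per round")
```

In this toy version most perceived defections turn out to be misreadings, so retaliation rarely starts and cooperation stays high even though both the players and the arbiter make mistakes at the same rate; the price is a steady trickle of costly consultations.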

Mathew and Boyd found that “arbitration tit-for-tat” outperformed other common behavioral rules and restored high cooperation rates even when disagreements were frequent. What’s more, this new strategy fared better than the rival rules even when arbitration itself was not objective.

These new findings help explain why cooperation based on reciprocity is found in humans but not in other animals. Because the threat of retaliatory defection alone is not enough to sustain generosity when facts are disputed, partners in human societies often have to resolve their disagreements by raising the issue with friends, elders, or courts.

In small-scale societies around the world, people create informal institutions that resolve similar disputes in day-to-day cooperative interactions. For years, Sarah Mathew has studied cooperation among the Turkana people of northwest Kenya. When a Turkana man is accused of selfish behavior, for instance because he refused to join a cattle raid, other members of the community discuss the issue in informal gatherings. Disputes over moral wrongdoing are resolved through simple community discussion. The Ju/’hoansi Bushmen of Botswana likewise debate disputes over food sharing in “group talking” sessions until a consensus is reached.

Through simple discussions among peers, these communities manage to align conflicting views and ensure that cooperation persists. Larger societies may require formal institutions of arbitration and norm enforcement. But in modern communities that lack effective institutions for establishing shared facts, such as online social networks, cooperation may give way to destructive polarization.

In 2019, researchers at the University of Pennsylvania led by the mathematician Joshua Plotkin reached a similar conclusion. Using a different branch of game theory, the group found that our willingness to altruistically help strangers depends on shared knowledge about moral reputations, and that cooperation easily collapses when people disagree about which actions were morally good or bad.

But their research also revealed that altruistic behavior can be restored if people choose to be more empathetic and consider the moral views of their partners. Empathy, they found, often works as well as formal institutions.

When it comes to human morality, people rarely share the same beliefs. In this day and age, it is no longer surprising that even the basic facts about people’s actions can be disputed, and not always out of selfishness. Online social networks, once predicted to strengthen social ties and foster global cooperation, often propagate disputes over facts without providing tools to resolve them. Evolutionary models based on game theory consistently show that when such disagreements are common, societies often fail to remain cooperative. But the heartening conclusion of the new research is that so long as we can find ways to reach a common truth – even if it does not reflect our innermost beliefs – cooperation can prevail.


Arunas Radzvilavicius, Ph.D.