People who take more time to think over certain decisions — like how much money to contribute to a common pool — tend to give less than those who act more impulsively.
That’s according to the results of a recent “public goods” experiment conducted by behavioral psychology researchers at Harvard University.
The debate over whether humans are, by nature, more selfish or more cooperative has raged for decades, with evidence supporting both conjectures.
Prior laboratory experiments showing a tendency toward selfish actions have been criticized for not drawing their subjects from a representative population pool (most used young college students).
So, in this most recent study, the researchers drew upon users of the website Amazon Mechanical Turk, where people sign up to do small amounts of work for small amounts of money. Its users come from a wider slice of the overall population and so are more representative of society. It is also a favorite on-line laboratory for social scientists.
In the experiment — a decision-making exercise known as a “public goods” game — volunteer participants were separated into groups of four, given 40 cents each, and then asked how much they wanted to contribute to a common pool. They were also told that whatever amount ended up in the pool would be doubled and the total divided equally among the members of the group.
Thus, if everyone contributed their entire allotment, that is, if they acted cooperatively, they would see their share double.
However, as the instructions given to participants made clear, the game also rewards selfishness. If, for example, one player contributed nothing while the other three contributed fully, he or she would still receive a 60-cent share of the pool — ending up with $1.00 — while each full contributor ended up with only 60 cents. And if no one contributed anything, each participant would simply keep the original 40 cents.
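As a back-of-the-envelope check on the payoff arithmetic described above, here is a small Python sketch (the function name and structure are my own illustration, not taken from the study):

```python
def public_goods_payoffs(contributions, endowment=40, multiplier=2):
    """Payoff (in cents) for each player in one round of the game.

    Each player keeps whatever they don't contribute; the pooled
    contributions are multiplied and split equally among all players.
    """
    pool = sum(contributions) * multiplier
    share = pool / len(contributions)
    return [endowment - c + share for c in contributions]

# All four cooperate fully: everyone doubles their 40 cents.
print(public_goods_payoffs([40, 40, 40, 40]))  # [80.0, 80.0, 80.0, 80.0]

# One free-rider among three full contributors: the free-rider
# ends up with $1.00, the cooperators with only 60 cents each.
print(public_goods_payoffs([0, 40, 40, 40]))   # [100.0, 60.0, 60.0, 60.0]
```

Note how the free-rider always comes out ahead of any cooperator in a single round — which is exactly why the game is a useful test of intuitive versus calculated behavior.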
A total of ten experiments were conducted.
In this first experiment, subjects were not told to decide quickly or otherwise. Those who decided quickly contributed 27 cents on average, while those who decided slowly ended up giving only 21 cents on average.
This experiment was followed by a second one in which some participants were instructed to decide within ten seconds, while others were told to wait at least ten seconds before deciding (in order to “think more carefully”). The same pattern emerged: the people who decided quickly contributed larger amounts than those who deliberated longer.
Lead researcher David Rand believes that spending more time deliberating over how much to give may allow a person to realize that he or she can gain more by being selfish. Rand explains:
“If they stop and think about it, they realize, Oh, this is one of those situations where actually I can take advantage of the person and get away with it.” [quote source]
These results were confirmed back in the university lab (using a less representative subject pool); although these lab subjects gave less on average than the on-line subjects, quick deciders still gave more.
According to the researchers (quoting from the paper abstract):
“…we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous…Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses.”
The full results of the ten experiments were published last month in Nature, under the title ‘Spontaneous giving and calculated greed’, by authors David G. Rand, Joshua D. Greene, and Martin A. Nowak.
Author’s Afterthoughts:
It should be noted that even those who deliberated longer still gave some amount of money; “pure” selfishness — in which one or more subjects gave nothing — did not occur.
This fact may be the result of the study design and rules; in reality, such “games” (e.g., an investment pool that doubles each contributor’s share) do not exist (or are exceedingly rare).
Although the authors state that “heuristics are developed in daily life where cooperation is typically advantageous”, our real-world societies may offer more opportunities to act selfishly (such as in the stock market, where short-selling is common). Further, some social values (here in the U.S.) champion “individual competitiveness” and “winner take all” competitive behaviors (a type of which is known as a “zero-sum game” in Game Theory).
However, these latter behaviors may be recently evolved “over-layers” on our basic human (biological) tendency to act cooperatively. In the classic “Prisoner’s Dilemma” game, most participants’ first instinct is to cooperate (i.e., to trust that others will act cooperatively too), while “defection” from the cooperative norm tends to spread through the group only when one, or a few, decide at some point to act non-cooperatively.
In Game Theory, one of the most successful known strategies in the repeated Prisoner’s Dilemma is “tit-for-tat” — copy your opponent’s previous move: if he cooperates, then you cooperate; if he defects, then you defect.
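The copy-your-opponent idea can be sketched in a few lines of Python (a minimal illustration of the tit-for-tat rule, with the function name and move encoding being my own assumptions):

```python
def tit_for_tat(opponent_history):
    """Cooperate on the first move; afterwards, copy the opponent's
    last move. Moves are 'C' (cooperate) or 'D' (defect)."""
    return 'C' if not opponent_history else opponent_history[-1]

# Opponent cooperates twice, then defects; our response lags one move behind.
opponent_moves = ['C', 'C', 'D']
our_moves = [tit_for_tat(opponent_moves[:i]) for i in range(len(opponent_moves) + 1)]
print(our_moves)  # ['C', 'C', 'C', 'D']
```

The strategy’s appeal is its simplicity: it rewards cooperation immediately, punishes defection immediately, and holds no grudges beyond one move.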
Thus, some socially selfish behaviors (those that can be represented in a game, anyway) may be the result of modeling others who act selfishly — an instinctive response to selfish behavior, perhaps, but not reflective of our basic human nature. Further, this defection from a cooperative style of play (in the PD game) can quickly backfire on the original defector, as too many copy-cat defectors will reduce the original defector’s gains over time. And, according to recent research by Stewart and Plotkin (2013), the best long-term strategy (in a social PD game scenario) is a “generous” one — cooperating by default and forgiving occasional defections — as it tends to be copied by nearly all of the other players and thus creates a type of social pressure on any would-be defectors to do likewise.
Finally, there is abundant evidence that we are cooperative — a trait known as “prosociality” — by nature, with some evidence of prosocial behaviors, such as imitation, teaching, and sharing, existing in human infants as young as 8 months. Such findings are strong indicators that prosociality and cooperative behavior are inborn.
Top image: Le Penseur (Auguste Rodin) ; Daniel Stockman