I found this quote from an article in the Sunday New York Times book review section interesting because of the implicit bias it reveals:
Consider an experiment economists call “the ultimatum game”: The experimenter gives one player, the sender, $20 to distribute between himself and another player, the receiver. An egalitarian sender might propose a split of $10 each. A more selfish sender might propose to give the receiver only $1, keeping $19 for himself. If the receiver accepts the deal, the two players collect their shares. If the receiver rejects the deal, both walk away with nothing. Were humans perfectly rational, the receiver would accept whatever is offered: even a dollar is better than nothing, right? Instead, researchers find, receivers will reject an overly lopsided deal, gladly giving up their shares just to punish the stingy senders.

But wait. Why is it more "rational" to accept whatever money is offered to you even if someone else gets more money out of the deal? Why is acceptance of an unequal transaction "rational"? There is an implicit libertarian logic embedded in that statement that would make Rand Paul proud. Without even questioning this assumption, the reviewer takes for granted the notion (also apparently expressed by the author of the book she is reviewing) that it is more "rational" to accept a greater amount of money in an unequal transaction than the zero dollars one would get by rejecting the transaction altogether. Behind that assumption lies a further one--that concern for maximizing whatever one can acquire for one's self is the only truly rational basis for human behavior. But in fact there is simply no reason to assume this. Humans, and other primates as well, are social animals who often adhere to concepts of justice and fairness:
According to research due to be published in the journal Animal Behaviour, fairness is not only essential to the human social contract, it also plays an important role in the lives of nonhuman primates more generally. Sarah F. Brosnan and colleagues conducted a series of behavioral tests with a colony of chimpanzees housed at the University of Texas in order to find out how they would respond when faced with an unfair distribution of resources. A previous study in the journal Nature by Brosnan and Frans de Waal found that capuchin monkeys would refuse a food item when they saw that another member of their group had received a more desired item at the same time (a grape instead of a slice of cucumber). Some individuals not only rejected the food, they even threw it back into the researcher's face. The monkeys seemed to recognize that something was unfair and they responded accordingly. This raised the provocative question: can the basis of the social contract be found in our evolutionary cousins?

The upshot of this is that for us primates, transactions have a social character, and one could just as easily argue that it is perfectly rational for humans to consider how a transaction affects other people besides themselves when deciding whether to participate in it. The answer you give to that question may say more about your own biases than it does about what rationality really means.
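To make the game's payoff structure concrete, here is a minimal sketch of the ultimatum game in Python. The fairness threshold is my own hypothetical parameter for illustration, not something from the article; it simply encodes the finding that real receivers reject offers they judge too lopsided:

```python
def ultimatum(total, offer, min_fair_share):
    """Return (sender_payoff, receiver_payoff) for one round.

    The receiver rejects any offer below min_fair_share (a hypothetical
    fairness threshold), in which case both players walk away with nothing.
    """
    if offer >= min_fair_share:
        return total - offer, offer
    return 0, 0

# A purely self-interested receiver accepts any positive offer:
print(ultimatum(20, 1, min_fair_share=1))   # -> (19, 1)

# A fairness-sensitive receiver gives up $1 to punish a stingy sender:
print(ultimatum(20, 1, min_fair_share=5))   # -> (0, 0)

# An egalitarian split satisfies both:
print(ultimatum(20, 10, min_fair_share=5))  # -> (10, 10)
```

The point of the sketch is that which receiver counts as "rational" depends entirely on the threshold you build in, which is exactly the assumption the reviewer leaves unexamined.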
I would argue that "rational" doesn't have to translate to "whatever benefits me personally the most." It is possible to conceive of a rationality that accounts for how others besides one's self will benefit or be harmed by a transaction, or how equitably it treats all the parties involved. This understanding lies at the basis of many concepts of justice--and I happen to think that justice is actually a perfectly rational concept.