Decisions based on moral, rather than practical, considerations are made more quickly and are often more extreme, research shows. But perhaps more significantly, the findings showed flexibility in what we consider to be moral or non-moral decisions.

“Little work has been done on how attaching morality to a particular judgment or decision may affect that outcome,” explains Jay Van Bavel, an assistant professor in New York University’s Department of Psychology and co-author of the study published in the journal PLOS ONE. “Our findings show that we make and see decisions quite differently if they are made with a morality frame.

“But, despite these differences, there is now evidence that we can shift judgments so they are based on practical, rather than moral, considerations—and vice versa.”

Deciding to frame any issue as moral or not may have important consequences, says co-author Ingrid Haas, an assistant professor of political science at the University of Nebraska-Lincoln. “Once an issue is declared moral, people’s judgments about that issue become more extreme, and they are more likely to apply those judgments to others.”
Posts Tagged ‘Culture’
WHERE do new words come from? On Twitter at least, they often begin life in cities with large African American populations before spreading more widely, according to a study of the language used on the social network. Jacob Eisenstein at the Georgia Institute of Technology in Atlanta and colleagues examined 30 million tweets sent from US locations between December 2009 and May 2011. Several new terms spread during this period, including “bruh”, an alternative spelling of “bro” or “brother”, which first arose in a few south-east cities before eventually hopping to parts of California. Residents of Cleveland, Ohio, were the first to use “ctfu”, an abbreviation of “cracking the fuck up”, usage that has since spread into Pennsylvania (arxiv.org/abs/1210.5268). After collecting the data, the team built a mathematical model that captures the large-scale flow of new words between cities. The model revealed that cities with big African American populations tend to lead the way in linguistic innovation. The team is still working on a more detailed analysis and says it is too early to say which cities are the most influential. (via Twitter shows language evolves in cities – tech – 17 November 2012 – New Scientist)
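The article doesn’t say what the team’s model actually looks like, so here’s a toy contagion-style sketch of the general idea, a word spreading between cities along weighted influence links. The city names, weights, and dynamics are all invented for intuition; this is not the model from arxiv.org/abs/1210.5268.

```python
# Hypothetical illustration only: a minimal contagion-style model of a new
# word spreading between cities. Cities, influence weights, and dynamics
# are invented; they are NOT taken from the paper.

cities = ["Atlanta", "Cleveland", "Philadelphia", "Los Angeles"]

# influence[(src, dst)]: how strongly src's usage nudges dst's adoption
influence = {
    ("Atlanta", "Cleveland"): 0.10,
    ("Atlanta", "Los Angeles"): 0.05,
    ("Cleveland", "Philadelphia"): 0.15,
}

adoption = {c: 0.0 for c in cities}
adoption["Atlanta"] = 0.5  # the word starts in one city

def step(adoption, influence, retention=0.95):
    """One time step: each city keeps most of its current usage and
    picks up a little from the cities that influence it."""
    new = {}
    for city, level in adoption.items():
        inflow = sum(w * adoption[src]
                     for (src, dst), w in influence.items() if dst == city)
        new[city] = min(1.0, retention * level + inflow)
    return new

for _ in range(20):
    adoption = step(adoption, influence)

print({c: round(v, 2) for c, v in adoption.items()})
```

Even this crude version reproduces the qualitative finding: usage leaks outward from the originating city along the influence links, reaching directly connected cities first and second-hop cities later.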
Campaigners in Sweden are trying to force a dictionary to change its definition of “nerd”. But after two decades of “reappropriation” has “nerd” – and its sister word “geek” – now completely lost its derogatory connotations? In the 1984 film Revenge of the Nerds the rousing final speech of one of the protagonists starts with the statement: “I’m a nerd.” Its plot may be cartoonish but the film reveals a certain cultural backdrop – to be a nerd was to be socially awkward, even socially inferior. Jocks, those who were good at sport, or other socially successful groups, usually ended up winning. To turn that on its head could form the basis for comedy. Things have changed. (via BBC News – Are ‘geek’ and ‘nerd’ now positive terms?)
The main questions I’ve been asking myself over the last couple of years are broadly about how culture drove human evolution. Think back to when humans first got the capacity for cumulative cultural evolution—and by this I mean the ability for ideas to accumulate over generations, for a tool to grow increasingly complex starting from something simple. One generation adds a few things to it, the next generation adds a few more, and the next a few more, until it’s so complex that no one in the first generation could have invented it. This was a really important line in human evolution, and we’ve begun to pursue an idea called the cultural brain hypothesis—the idea that the real driver in the expansion of human brains was this growing cumulative body of cultural information, so that what our brains increasingly got good at was the ability to acquire, store, process and retransmit this non-genetic body of information.
Tags: Culture, Evolution, Sociology, stereotypes
Researchers from Scotland suggest that stereotypes form and evolve over time through social transmission of information, similar to the way in which languages evolve. The research team led by Dr. Doug Martin of the Person Perception Lab at the University of Aberdeen used a technique they have used previously to study the evolution of language. They invented a series of aliens and randomly assigned them different colors, shapes and attributes such as selfishness, adventurousness, arrogance or trustfulness. A volunteer was then called in to learn about the aliens and memorize their personality traits and physical attributes. The volunteer then relayed this information to the researchers, who passed it on to the next volunteer, and so on down a communication chain. What they discovered was that stereotypes began to form almost immediately, with particular shapes and colors becoming linked to particular personality traits. As the information passed down the communication chain, it was unintentionally changed and simplified, becoming more structured and thus easier to learn.
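The transmission-chain setup is easy to simulate. Below is a hedged sketch (not the Aberdeen lab’s actual procedure; the colours, traits, recall rate, and fill-in rule are all my invention) where each “volunteer” recalls some colour–trait pairs exactly and reconstructs the rest from the most common trait they saw for that colour, so associations sharpen down the chain.

```python
import random

# A toy transmission chain, loosely inspired by the experiment described
# above. All parameters here are invented for illustration.
random.seed(1)

COLOURS = ["red", "blue", "green"]
TRAITS = ["selfish", "adventurous", "arrogant", "trusting"]

def random_aliens(n=24):
    """The first generation sees aliens with random colour/trait pairings."""
    return [(random.choice(COLOURS), random.choice(TRAITS)) for _ in range(n)]

def modal_traits(aliens):
    """The most frequent trait seen for each colour."""
    counts = {}
    for colour, trait in aliens:
        counts.setdefault(colour, {}).setdefault(trait, 0)
        counts[colour][trait] += 1
    return {c: max(t, key=t.get) for c, t in counts.items()}

def transmit(aliens, recall=0.6):
    """One link in the chain: recall some pairs exactly; fill in the rest
    from the learner's emerging colour -> trait expectation."""
    expected = modal_traits(aliens)
    return [(colour, trait) if random.random() < recall
            else (colour, expected[colour])
            for colour, trait in aliens]

def structure(aliens):
    """Fraction of aliens whose trait matches the modal trait for their colour."""
    modal = modal_traits(aliens)
    return sum(t == modal[c] for c, t in aliens) / len(aliens)

chain = [random_aliens()]
for _ in range(10):
    chain.append(transmit(chain[-1]))

print(f"structure at generation 0:  {structure(chain[0]):.2f}")
print(f"structure at generation 10: {structure(chain[-1]):.2f}")
```

Nothing in the fill-in rule is about stereotypes per se; it’s just imperfect memory plus a weak expectation, yet the colour–trait associations become almost perfectly regular within a few generations, which is the paper’s point about unintended simplification.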
Tags: Culture, Evolution, Language, Mind, Neuroscience
Languages are extremely diverse, but they are not arbitrary. Behind the bewildering, contradictory ways in which different tongues conceptualise the world, we can sometimes discern order. Linguists have traditionally assumed that this reflects the hardwired linguistic aptitude of the human brain. Yet recent scientific studies propose that language “universals” aren’t simply prescribed by genes but that they arise from the interaction between the biology of human perception and the bustle, exchange and negotiation of human culture.
Language has a logical job to do—to convey information—and yet it is riddled with irrationality: irregular verbs, random genders, silent vowels, ambiguous homophones. You’d think languages would evolve towards an optimal state of concision, but instead they accumulate quirks that hinder learning, not only for foreigners but also for native speakers.
Linguists have traditionally explained these peculiarities by reference to the history of the people who speak each language. That’s often fascinating, but it does not yield general principles about how languages have developed—or how they will change in the future. As they evolve, what guides their form?
The experience of losing our ‘net connection becomes more & more like losing a friend
Sparrow, B., Liu, J., and Wegner, D. M. (2011, 5 Aug). Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips. Science, 333(6043): 776-778.
abstract only, tho I have access via LSE.
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can “Google” the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.
Here are my notes & quotes:
Storing information externally is nothing particularly novel, even before the advent of computers. In any long-term relationship, a teamwork environment, or other ongoing group, people typically develop a group or transactive memory, a combination of memory stores held directly by individuals and the memory stores they can access because they know someone who knows that information… The present research explores whether having online access to search engines, databases, and the like, has become a primary transactive memory source in itself.
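The transactive-memory idea in that quote is essentially a two-part data structure: facts you hold directly, plus a directory of who (or what) holds the rest. Here’s a toy sketch of that structure; the class, method names, and example entries are mine, not the paper’s.

```python
# A toy data structure for "transactive memory" as the quote describes it:
# each member stores some facts directly, plus a directory of who knows
# the rest. Names and examples are invented for illustration.
class TransactiveMemory:
    def __init__(self):
        self.own = {}        # facts this member holds directly
        self.directory = {}  # topic -> who (or what) knows it

    def recall(self, topic):
        if topic in self.own:
            return ("knew it", self.own[topic])
        if topic in self.directory:
            return ("knows who knows", self.directory[topic])
        return ("no idea", None)

me = TransactiveMemory()
me.own["capital of France"] = "Paris"
me.directory["tax law"] = "my accountant"
me.directory["trivia"] = "Google"

print(me.recall("capital of France"))  # ('knew it', 'Paris')
print(me.recall("trivia"))             # ('knows who knows', 'Google')
```

The paper’s claim, restated in these terms, is that search engines have become a giant entry in everyone’s `directory`, so more and more topics get routed there instead of into `own`.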
results from experiment 1: when asked difficult trivia questions, do people think about computers more quickly?
Although the concept of knowledge in general seems to prime thoughts of computers, even when answers are known, not knowing the answer to general-knowledge questions primes the need to search for the answer, and subsequently computer interference is particularly acute.
results of experiment 2: will people only remember keywords when they think they’ll have access to a computer to look up information in the future?
Participants apparently did not make the effort to remember when they thought they could later look up the trivia statements they had read. Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.
they were more affected by whether they’d be able to look something up later than by whether they had to remember it at all.
results of experiment 3: do people remember things better when they know if/where info is saved?
…believing that one won’t have access to the information in the future enhances memory for the information itself, whereas believing the information was saved externally enhances memory for the fact that the information could be accessed, at least in general.
having a search function – on the web or on a computer – means you won’t expend cognitive capacity remembering the information itself, but knowing something’s been erased does place “memory demands” on you.
finally, results of experiment 4: do people remember where saved information can be found?
“where” was prioritized in memory, with the advantage going to “where” when “what” was forgotten…This is preliminary evidence that when people expect information to remain continuously available (such as we expect with Internet access), they are more likely to remember where to find it than to remember the details of the item. One could argue that this is an adaptive use of memory—to include the computer and online search engines as an external memory system that can be accessed at will.
and their conclusions:
…processes of human memory are adapting to the advent of new computing and communication technology.
we are learning what the computer “knows” and when we should attend to where we have stored information in our computer based memories. We are becoming symbiotic with our computer tools, growing into interconnected systems that remember less by knowing information than by knowing where the information can be found.
and the kicker:
We have become dependent on [our gadgets] to the same degree we are dependent on all the knowledge we gain from our friends and co-workers—and lose if they are out of touch. The experience of losing our Internet connection becomes more and more like losing a friend.