Brain & Behavior

Michael Scally MD

Several pioneering studies have begun examining the relationship between brain activity and political attitudes, but none have characterized brain structure. Political attitudes are typically captured on a single-item measure in which participants self-report using a five-point scale ranging from "very liberal" to "very conservative." Despite the simplicity of such a scale, it accurately predicts voting behavior and has been used successfully to estimate genetic contributions to political orientation. Psychological differences between conservatives and liberals determined in this way map onto self-regulatory processes associated with conflict monitoring. Moreover, the amplitude of event-related potentials reflecting neural activity associated with conflict monitoring in the anterior cingulate cortex (ACC) is greater for liberals than for conservatives. Thus, stronger liberalism is associated with increased sensitivity to cues for altering a habitual response pattern and with activity in the anterior cingulate cortex.

Here researchers explored this relationship further by examining whether political attitudes correlate not just with brain function but also with anatomical structure in these regions. To test the hypothesis that political liberalism (versus conservatism) is associated with differences in gray matter volume in the anterior cingulate cortex, they recorded structural magnetic resonance imaging (MRI) scans from 90 healthy young adults (61% female) who confidentially self-reported their political attitudes on a five-point scale from "very liberal" to "very conservative." They then used voxel-based morphometry (VBM) to investigate the relationship between these attitudes, expressed as a numeric score from one to five, and gray matter volume. They found that increased gray matter volume in the anterior cingulate cortex was significantly associated with liberalism (r = 0.27, t(88) = 2.63, p = 0.01, corrected). Because age and gender were regressed out as potential confounds, these findings are not attributable to those factors.

Apart from the anterior cingulate cortex, other brain structures may also show patterns of neural activity that reflect political attitudes. Conservatives respond to threatening situations with more aggression than do liberals and are more sensitive to threatening facial expressions. This heightened sensitivity to emotional faces suggests that individuals with a conservative orientation might exhibit differences in brain structures associated with emotional processing, such as the amygdala. Indeed, voting behavior is reflected in amygdala responses across cultures. The researchers therefore further analyzed the structural MRI data to evaluate whether there was any relationship between gray matter volume of the amygdala and political attitudes. They found that increased gray matter volume in the right amygdala was significantly associated with conservatism (r = 0.23, t(88) = 2.22, p = 0.029, corrected). No significant correlation was found in the left amygdala (r = 0.15, t(88) = 1.43, p = 0.15, corrected).
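The quoted effect sizes and test statistics are tied together by the standard identity t = r·sqrt(df) / sqrt(1 - r^2) for a Pearson correlation, so each pair can be cross-checked. A minimal Python sanity check, using only the statistics quoted above (not the study's data):

```python
import math

def t_from_r(r: float, df: int) -> float:
    """t statistic for a Pearson correlation r with df degrees of freedom."""
    return r * math.sqrt(df) / math.sqrt(1.0 - r * r)

# Statistics quoted from Kanai et al. (n = 90 participants, so df = 88)
for label, r in [("ACC / liberalism", 0.27),
                 ("right amygdala / conservatism", 0.23),
                 ("left amygdala (n.s.)", 0.15)]:
    print(f"{label}: r = {r:.2f} -> t(88) = {t_from_r(r, 88):.2f}")
# Prints t values of about 2.63, 2.22, and 1.42, matching the reported
# statistics to rounding.
```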



Kanai R, Feilden T, Firth C, Rees G. Political Orientations Are Correlated with Brain Structure in Young Adults. Current Biology 2011;21(8):677-80.

Substantial differences exist in the cognitive styles of liberals and conservatives on psychological measures [1]. Variability in political attitudes reflects genetic influences and their interaction with environmental factors [2, 3]. Recent work has shown a correlation between liberalism and conflict-related activity measured by event-related potentials originating in the anterior cingulate cortex [4]. Here we show that this functional correlate of political attitudes has a counterpart in brain structure. In a large sample of young adults, we related self-reported political attitudes to gray matter volume using structural MRI. We found that greater liberalism was associated with increased gray matter volume in the anterior cingulate cortex, whereas greater conservatism was associated with increased volume of the right amygdala. These results were replicated in an independent sample of additional participants. Our findings extend previous observations that political attitudes reflect differences in self-regulatory conflict monitoring [4] and recognition of emotional faces [5] by showing that such attitudes are reflected in human brain structure. Although our data do not determine whether these regions play a causal role in the formation of political attitudes, they converge with previous work [4, 6] to suggest a possible link between brain structure and psychological mechanisms that mediate political attitudes.

Highlights:
• Political liberalism and conservatism were correlated with brain structure
• Liberalism was associated with the gray matter volume of anterior cingulate cortex
• Conservatism was associated with increased right amygdala size
• Results offer possible accounts for cognitive styles of liberals and conservatives
 
Signs, signs, everywhere signs: Seeing God in tsunamis and everyday events
http://www.scientificamerican.com/blog/post.cfm?id=signs-signs-everywhere-signs-seeing-2011-03-13

By Jesse Bering
Mar 13, 2011

It’s only a matter of time—in fact, they’ve already started cropping up—before reality-challenged individuals begin pontificating about what God could have possibly been so hot-and-bothered about to trigger last week’s devastating earthquake and tsunami in Japan. (Surely, if we were to ask Westboro Baptist Church members, it must have something to do with the gays.) But from a psychological perspective, what type of mind does it take to see unexpected natural events such as the horrifying scenes still unfolding in Japan as "signs" or "omens" related to human behaviors?


Piazza J, Bering JM, Ingram G. "Princess Alice is watching you": Children's belief in an invisible person inhibits cheating. Journal of Experimental Child Psychology 2011;109(3):311-20. http://www.jessebering.com/publications.php

Two groups of children (5-6 and 8-9 years of age) participated in a challenging rule-following task while they were (a) told that they were in the presence of a watchful invisible person ("Princess Alice"), (b) observed by a real adult, or (c) unsupervised. Children were covertly videotaped performing the task in the experimenter's absence. Older children had an easier time following the rules but engaged in the same levels of purposeful cheating as the younger children. Importantly, children's expressed belief in the invisible person significantly predicted their cheating latency, and this was true even after controlling for individual differences in temperament. When "skeptical" children were omitted from the analysis, the inhibitory effect of being told about Princess Alice was equivalent to having a real adult present. Furthermore, skeptical children cheated only after having first behaviorally disconfirmed the "presence" of Princess Alice. The findings suggest that children's belief in a watchful invisible person tends to deter cheating.


Bering JM, Parker BD. Children's Attributions of Intentions to an Invisible Agent. Developmental Psychology 2006;42(2):253-62. www.jessebering.com/pdf/childrens-attributions-of-intentions.pdf

Children ages 3-9 years were informed that an invisible agent (Princess Alice) would help them play a forced-choice game by "telling them, somehow, when they chose the wrong box," whereas a matched control group of children were not given this supernatural prime. On 2 unexpected event trials, an experimenter triggered a simulated unexpected event (i.e., a light turning on/off; a picture falling), and children's behavioral response to these events (i.e., moving their hand to the opposite box) was coded. Results showed a significant Age Group × Experimental Condition interaction; the only children to reliably alter their behavior in response to the unexpected events were the oldest children (M = 7 years 4 months), who were primed with the invisible agent concept. For children's posttest verbal explanations, also, only these children saw the unexpected events as being referential and declarative (e.g., "Princess Alice did it because I chose the wrong box"). Together, these data suggest that children may not regularly begin to see communicative signs as embedded in unexpected events until they are around 7 years of age.
 
People Know When First Impressions Are Accurate

ScienceDaily (Apr. 17, 2011) — First impressions are important, and they usually contain a healthy dose of both accuracy and misperception. But do people know when their first impressions are correct? They do reasonably well, according to a study in the current Social Psychological and Personality Science (published by SAGE).

Researchers had two separate groups of more than 100 people meet in a "getting-acquainted" session much like speed-dating, in which each person spoke with everyone else in the group for three minutes. At the end of each 3-minute chat, they rated each other's personalities and rated how well they thought their impressions "would agree with someone who knows this person very well." To establish what each person was "really" like, the researchers had participants fill out their own personality reports, which were bolstered with personality ratings from either friends or parents.

A large body of research shows that impressions formed in short interactions can be accurate, and the participants did a reasonably good job of judging each other's personalities. The more accurate participants felt their impressions were, the closer their ratings were to the friends' and parents' ratings (although this correlation was not perfect). Accuracy was highest among people who rated their own impressions as moderately accurate; being highly confident of one's judgment did not bring greater accuracy than moderate confidence.

The research team, led by Jeremy Biesanz of the University of British Columbia, noted that there are two ways to be right about people's personality. We can know how people are different from each other, but a good judge of persons knows that people are mostly alike -- for example, almost everyone would prefer being friendly to being quarrelsome. The more people rated their partner's personality in a way typical of most everyone, the more accurate they felt their perception was. And because most people are like most people, they were indeed being accurate.
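Those two routes to accuracy can be made concrete. In the sketch below (a hypothetical Python illustration; the trait ratings and profiles are invented, not the study's data), normative accuracy is the correlation between a perceiver's ratings and the average person's trait profile, while distinctive accuracy is the correlation between the perceiver's and the target's deviations from that average:

```python
import numpy as np

def accuracy_components(perceiver, target_self, norm_profile):
    """Split first-impression accuracy into normative and distinctive parts."""
    corr = lambda a, b: np.corrcoef(a, b)[0, 1]
    # Normative: does the perceiver rate the target like the average person?
    normative = corr(perceiver, norm_profile)
    # Distinctive: does the perceiver capture how the target deviates from average?
    distinctive = corr(perceiver - norm_profile, target_self - norm_profile)
    return normative, distinctive

# Toy data: six traits rated 1-7 (values invented for illustration)
norm = np.array([5.5, 5.0, 4.0, 3.0, 5.8, 4.5])       # average person: friendly, not quarrelsome
target = np.array([6.5, 4.0, 5.0, 2.0, 6.0, 3.5])     # one target's self-ratings
perceiver = np.array([6.0, 4.5, 4.5, 2.5, 6.2, 4.0])  # a first impression of that target

n, d = accuracy_components(perceiver, target, norm)
print(f"normative r = {n:.2f}, distinctive r = {d:.2f}")
```

Because the average profile is itself a fair description of most targets, a perceiver who rates a partner as typical scores well on the normative component, which is exactly the point made above.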

"Many important decisions are made after very brief encounters -- which job candidate to hire, which person to date, which student to accept," write the authors. "Although our first impressions are generally accurate, it is it critical for us to recognize when they may be lacking."


Biesanz JC, Human LJ, Paquin A-C, et al. Do We Know When Our Impressions of Others Are Valid? Evidence for Realistic Accuracy Awareness in First Impressions of Personality. Social Psychological and Personality Science. http://spp.sagepub.com/content/early/2011/01/19/1948550610397211.full.pdf

Do people have insight into the validity of their first impressions or accuracy awareness? Across two large interactive round-robins, those who reported having formed a more accurate impression of a specific target had (a) a more distinctive realistically accurate impression, accurately perceiving the target’s unique personality characteristics as described by the target’s self-, parent-, and peer-reports, and (b) a more normatively accurate impression, perceiving the target to be similar to what people generally tend to be like. Specifically, if a perceiver reported forming a more valid impression of a specific target, he or she had in fact formed a more realistically accurate impression of that target for all but the highest impression validity levels. In contrast, people who generally reported more valid impressions were not actually more accurate in general. In sum, people are aware of when and for whom their first impressions are more realistically accurate.
 
Refuse to learn from experience? Thank your genes
http://www.scientificamerican.com/blog/post.cfm?id=refuse-to-learn-from-experience-tha-2011-04-19

By Katherine Harmon
Apr 19, 2011

Some people are incurable contrarians or imperturbable logicians. But most of us, whether we like it or not, allow other people's opinions and advice to color our own experiences and opinions. Have you found that restaurant to really be as good as people say it is?

New findings suggest that a person's willingness to coolly consider the facts gleaned from their own experience—apart from others—might be based in large part on genetics.


It has been known and frequently demonstrated that "people will distort what they experience to be perceived as more consistent with what they thought already," Michael Frank, of the Brown Institute for Brain Science at Brown University, and a collaborator in the new research, said in a prepared statement. Even researchers can fall prey to confirmation bias, thinking they have discovered what they actually had expected to find in the noise of data.

So, why do we often struggle to accept our own impressions if they contradict what we've been told to expect? The disconnect occurs in part because these two types of information, the abstract and the experiential, are processed in different parts of the brain. Advice ("go to that Italian restaurant") is filtered, along with other higher-level cognition, in the prefrontal cortex. Experience ("that Italian restaurant is usually mediocre"), on the other hand, is lodged in a more primitive region of the brain, the striatum.

Although perhaps we should be more inclined to stick with what our gut (or tastebuds) has learned from personal experience, most people tend to lean on what their prefrontal cortex—i.e., outside instruction—has to say for longer than they rationally should.

"Maintaining instructions in the prefrontal cortex changes the way that the striatum works," Bradley Doll, a researcher at Brown, said in a prepared statement. "It biases what people learn about the contingencies they are actually experiencing," noted Doll, who coauthored a new paper detailing the results, which published online April 19 in The Journal of Neuroscience.

People's willingness to let advice color their experience hinges at least in part on the neurotransmitter dopamine, which is associated with pleasure, reward and learning. The researchers pinpointed one gene in particular, COMT, that seems to play a role in a person's inclination to learn from his or her own experiences. Individuals in the study with different alleles of this gene had differing propensities to be biased by outside advice in interpreting their own experiences.

Frank, Doll and colleague Kent Hutchison tested more than 70 adults on a computer-based learning program. Subjects had to learn which symbols were most likely to be classified as the "correct" answer. The symbol-outcome mapping was probabilistic rather than deterministic, creating a gray area in which subjects had to weigh their past experiences with each symbol. In some tests, people were given advice about which symbols were correct most often—but this advice sometimes proved to be incorrect.

People with an exceptional ability to spot inaccurate instructions and start making decisions using their own experience tended to have the Val/Val version of the gene, whereas those who needed "greater confidence" that their experience was telling them to jettison earlier advice were more likely to have the Met allele.

Overall, the researchers concluded, "these findings suggest that the striatal learning process is modulated by prior expectations, and that the resulting associative weights cannot be easily 'undone' after the prior is rejected." So that might mean you have to order many bowls of substandard pasta before you finally admit to yourself that a much-lauded Italian restaurant isn't actually all that great.

Of course, it's certainly easier—and less painful—to learn to avoid a hot plate by being told to do so, and we've likely evolved to take this into account, prizing the prefrontal cortex's retained instructions. "This phenomenon of confirmation bias might actually just be a byproduct of a system that tries to be more efficient with the learning process," Frank said.

But the human mind is rarely satisfied with simple instruction, as instruction—and advice—often turn out to be wrong. And what's a few burnt fingertips in the grand scheme of independent thought?


Doll BB, Hutchison KE, Frank MJ. Dopaminergic Genes Predict Individual Differences in Susceptibility to Confirmation Bias. The Journal of Neuroscience 2011;31(16):6188-98.

The striatum is critical for the incremental learning of values associated with behavioral actions. The prefrontal cortex (PFC) represents abstract rules and explicit contingencies to support rapid behavioral adaptation in the absence of cumulative experience. Here we test two alternative models of the interaction between these systems, and individual differences thereof, when human subjects are instructed with prior information about reward contingencies that may or may not be accurate. Behaviorally, subjects are overly influenced by prior instructions, at the expense of learning true reinforcement statistics. Computational analysis found that this pattern of data is best accounted for by a confirmation bias mechanism in which prior beliefs—putatively represented in PFC—influence the learning that occurs in the striatum such that reinforcement statistics are distorted. We assessed genetic variants affecting prefrontal and striatal dopaminergic neurotransmission. A polymorphism in the COMT gene (rs4680), associated with prefrontal dopaminergic function, was predictive of the degree to which participants persisted in responding in accordance with prior instructions even as evidence against their veracity accumulated. Polymorphisms in genes associated with striatal dopamine function (DARPP-32, rs907094, and DRD2, rs6277) were predictive of learning from positive and negative outcomes. Notably, these same variants were predictive of the degree to which such learning was overly inflated or neglected when outcomes are consistent or inconsistent with prior instructions. These findings indicate dissociable neurocomputational and genetic mechanisms by which initial biases are strengthened by experience.
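The confirmation-bias mechanism described here, in which prior beliefs distort striatal learning so that the learned reinforcement statistics cannot easily be "undone," can be sketched as an asymmetric value update: prediction errors that confirm an instructed prior are amplified, while disconfirming ones are discounted. A deliberately simplified Python illustration (the learning rate, bias factors, and reward probability are assumptions for exposition, not the authors' fitted model):

```python
import random

def learn_value(trials=2000, p_reward=0.4, instructed=True,
                alpha=0.1, amplify=1.5, discount=0.5):
    """Estimate the value of one option said (incorrectly) to be 'good'.

    When instructed, positive prediction errors (which confirm the prior)
    are scaled up and negative ones (which disconfirm it) are scaled down.
    Returns the average estimate over the late, stable phase of learning.
    """
    v, history = 0.5, []
    for _ in range(trials):
        reward = 1.0 if random.random() < p_reward else 0.0
        delta = reward - v                      # prediction error
        if instructed:
            delta *= amplify if delta > 0 else discount
        v += alpha * delta
        history.append(v)
    return sum(history[-500:]) / 500

random.seed(1)
print("with instruction bias   :", round(learn_value(instructed=True), 2))
print("without instruction bias:", round(learn_value(instructed=False), 2))
# The biased learner stabilizes near 0.67 instead of the true reward rate
# of 0.4, so the option keeps looking better than it is even after many
# disconfirming trials.
```

At equilibrium the biased update balances at v = p·amplify / (p·amplify + (1 - p)·discount), about 0.67 for these parameters, which is the sense in which the distorted associative weights are not easily unlearned.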
 
The Effects Of BOTOX Injections On Emotions

Botox May Deaden Ability to Empathize, New Study Says
By Marc E. Babej, Forbes

According to a study published in the journal Social Psychological and Personality Science, Botox may not only numb facial muscles, but also – and for the same reason – numb users' perception of other people's emotions.

According to David Neal, a psychology professor at the University of Southern California, “if muscular signals from the face to the brain are dampened, you’re less able to read emotions.”



Neal DT, Chartrand TL. Embodied Emotion Perception: Amplifying and Dampening Facial Feedback Modulates Emotion Perception Accuracy. Social Psychological and Personality Science 2011.

How do we recognize the emotions other people are feeling? One source of information may be facial feedback signals generated when we automatically mimic the expressions displayed on others’ faces. Supporting this “embodied emotion perception,” dampening (Experiment 1) and amplifying (Experiment 2) facial feedback signals, respectively, impaired and improved people’s ability to read others’ facial emotions. In Experiment 1, emotion perception was significantly impaired in people who had received a cosmetic procedure that reduces muscular feedback from the face (Botox) compared to a procedure that does not reduce feedback (a dermal filler). Experiment 2 capitalized on the fact that feedback signals are enhanced when muscle contractions meet resistance. Accordingly, when the skin was made resistant to underlying muscle contractions via a restricting gel, emotion perception improved, and did so only for emotion judgments that theoretically could benefit from facial feedback.


Davis JI, Senghas A, Brandt F, Ochsner KN. The effects of BOTOX injections on emotional experience. Emotion 2010;10(3):433-40. http://dept.psych.columbia.edu/~kochsner/pdf/Davis_etal_Botox_2010.pdf

Although it was proposed over a century ago that feedback from facial expressions influences emotional experience, tests of this hypothesis have been equivocal. Here we directly tested this facial feedback hypothesis (FFH) by comparing the impact on self-reported emotional experience of BOTOX injections (which paralyze muscles of facial expression) and control Restylane injections (a cosmetic filler that does not affect facial muscles). When examined alone, BOTOX participants showed no pre- to posttreatment changes in emotional responses to our most positive and negative video clips. Between-groups comparisons, however, showed that relative to controls, BOTOX participants exhibited an overall significant decrease in the strength of emotional experience. This result was attributable to (a) a pre- versus postdecrease in responses to mildly positive clips in the BOTOX group and (b) an unexpected increase in responses to negative clips in the Restylane control group. These data suggest that feedback from facial expressions is not necessary for emotional experience, but may influence emotional experience in some circumstances. These findings point to specific directions for future work clarifying the expression-experience relationship.


Davis JI, Senghas A, Ochsner KN. How Does Facial Feedback Modulate Emotional Experience? J Res Pers 2009;43(5):822-9.

Contracting muscles involved in facial expressions (e.g. smiling or frowning) can make emotions more intense, even when unaware one is modifying expression (e.g. Strack, Martin, & Stepper, 1988). However, it is unresolved whether and how inhibiting facial expressions might weaken emotional experience. In the present study, 142 participants watched positive and negative video clips while either inhibiting their facial expressions or not. When hypothesis awareness and effects of distraction were experimentally controlled, inhibiting facial expressions weakened some emotional experiences. These findings provide new insight into ways that inhibition of facial expression can affect emotional experience: the link is not dependent on experimental demand, lay theories about connections between expression and experience, or the distraction involved in inhibiting one's expressions.
 
Fruit flies on meth: Study explores whole-body effects of toxic drug
A new study in fruit flies offers a broad view of the potent and sometimes devastating molecular events that occur throughout the body as a result of methamphetamine exposure.

The study, described in the journal PLoS ONE, tracks changes in the expression of genes and proteins in fruit flies (Drosophila melanogaster) exposed to meth.

Unlike most studies of meth, which focus on the brain, the new analysis looked at molecular changes throughout the body, said University of Illinois entomology professor Barry Pittendrigh, who led the research.

"One of the great things about working with fruit flies is that because they're small, we can work with the whole organism and then look at the great diversity of tissues that are being impacted," Pittendrigh said. "This is important because we know that methamphetamine influences cellular processes associated with aging, it affects spermatogenesis, and it impacts the heart. One could almost call meth a perfect storm toxin because it does so much damage to so many different tissues in the body."

By tracking changes in gene expression and protein production of fruit flies exposed to meth, the researchers identified several molecular pathways significantly altered by the drug.

Many of these cascades of chemical reactions within cells are common to many organisms, including humans, and are similar even among very different families of organisms.

The researchers found that meth exposure influenced molecular pathways associated with energy generation, sugar metabolism, sperm cell formation, cell structure, hormones, skeletal muscle and cardiac muscles. The analysis also identified several new molecular players and unusual disruptions of normal cellular events that occur in response to meth, though the authors acknowledge that further work is required to validate the role of these pathways in response to meth.

Illinois crop sciences professor Manfredo Seufferheld, a co-author on the study, saw changes indicating that meth exposure may alter the cell's energy metabolism in a manner that mirrors changes seen in rapidly growing cancer cells. Most types of cancer rely primarily on glycolysis, the rapid, oxygen-independent breakdown of glucose, even when oxygen is available. In contrast, healthy cells tend to use oxidative respiration, a slower but more efficient energy-generating process that occurs in the presence of oxygen. This aberration in the energy metabolism of cancer cells is called the Warburg effect.

"The discovery of the molecular underpinnings of the meth syndrome inDrosophila – based on a systems biology approach validated by mutant analysis – has the potential to be used in advancing our knowledge about malignant cell proliferation by understanding the connections behind the Warburg effect and cell death," Seufferheld said.

Since glycolysis uses glucose to produce energy, the researchers tested the hypothesis that sugar metabolism is involved in the "toxic syndrome" spurred by meth. They found that meth-exposed fruit flies lived longer if they consumed trehalose, a major blood sugar in insects that also is an antioxidant.

Human meth users are known to crave sugary drinks, said lead author Lijie Sun. "And now we have evidence that increased sugar intake has a direct impact on reducing the toxicity of meth, at least in flies."

The researchers found that meth caused changes that may interfere with the critical balance of calcium and iron in cells, and they were the first to identify numerous genes that appear to be involved in the meth-induced dysfunction of sperm formation.

"All in all, this study shows that Drosophila melanogaster is an excellent model organism in which to study the toxic effect of methamphetamine at the molecular level," said Illinois postdoctoral researcher Kent Walters, an author on the study.


Sun L, Li H-M, Seufferheld MJ, et al. Systems-Scale Analysis Reveals Pathways Involved in Cellular Response to Methamphetamine. PLoS ONE 2011;6(4):e18215.

Background - Methamphetamine (METH), an abused illicit drug, disrupts many cellular processes, including energy metabolism, spermatogenesis, and maintenance of oxidative status. However, many components of the molecular underpinnings of METH toxicity have yet to be established. Network analyses of integrated proteomic, transcriptomic and metabolomic data are particularly well suited for identifying cellular responses to toxins, such as METH, which might otherwise be obscured by the numerous and dynamic changes that are induced.

Methodology/Results - We used network analyses of proteomic and transcriptomic data to evaluate pathways in Drosophila melanogaster that are affected by acute METH toxicity. METH exposure caused changes in the expression of genes involved with energy metabolism, suggesting a Warburg-like effect (aerobic glycolysis), which is normally associated with cancerous cells. Therefore, we tested the hypothesis that carbohydrate metabolism plays an important role in METH toxicity. In agreement with our hypothesis, we observed that increased dietary sugars partially alleviated the toxic effects of METH. Our systems analysis also showed that METH impacted genes and proteins known to be associated with muscular homeostasis/contraction, maintenance of oxidative status, oxidative phosphorylation, spermatogenesis, iron and calcium homeostasis. Our results also provide numerous candidate genes for the METH-induced dysfunction of spermatogenesis, which have not been previously characterized at the molecular level.

Conclusion - Our results support our overall hypothesis that METH causes a toxic syndrome characterized by altered carbohydrate metabolism, dysregulation of calcium and iron homeostasis, increased oxidative stress, and disruption of mitochondrial functions.
 
Microsleep: Brain Regions Can Take Short Naps During Wakefulness, Leading to Errors

ScienceDaily (Apr. 28, 2011) — If you've ever lost your keys or stuck the milk in the cupboard and the cereal in the refrigerator, you may have been the victim of a tired brain region that was taking a quick nap.


Vyazovskiy VV, Olcese U, Hanlon EC, Nir Y, Cirelli C, Tononi G. Local sleep in awake rats. Nature 2011;472(7344):443-7.

In an awake state, neurons in the cerebral cortex fire irregularly and electroencephalogram (EEG) recordings display low-amplitude, high-frequency fluctuations. During sleep, neurons oscillate between ‘on’ periods, when they fire as in an awake brain, and ‘off’ periods, when they stop firing altogether and the EEG displays high-amplitude slow waves. However, what happens to neuronal firing after a long period of being awake is not known. Here we show that in freely behaving rats after a long period in an awake state, cortical neurons can go briefly ‘offline’ as in sleep, accompanied by slow waves in the local EEG. Neurons often go offline in one cortical area but not in another, and during these periods of ‘local sleep’, the incidence of which increases with the duration of the awake state, rats are active and display an ‘awake’ EEG. However, they are progressively impaired in a sugar pellet reaching task. Thus, although both the EEG and behaviour indicate wakefulness, local populations of neurons in the cortex may be falling asleep, with negative consequences for performance.
 
Circadian Clocks - Circadian (Daily) Rhythms

Circadian (daily) rhythms in physiology and behavior are phylogenetically ancient and are present in almost all plants and animals. In mammals, these rhythms are generated by a master circadian clock in the suprachiasmatic nucleus (SCN) of the hypothalamus, which in turn synchronizes peripheral oscillators throughout the brain and body in almost all cell types and organ systems. Though circadian rhythms are phylogenetically ancient, modern industrialized society and the ubiquity of electric lighting have fundamentally altered the relationship between an individual's endogenous circadian rhythmicity and the external environment. The ramifications of this desynchronization for mental and physical health are not fully understood, although numerous lines of evidence are emerging that link defects in circadian timing with negative health outcomes. Animal models have shown that chronic circadian disruption can alter mortality rates in tau mutant hamsters and in aged mice. The current obesity epidemic in Western societies has also occurred hand-in-hand with a gradual decrease in sleep time and sleep quality, and although the evidence is largely correlative, a causal link is plausible. Individuals reporting poor or disturbed sleep, including shift workers, show increased incidences of diabetes and risk factors for the development of cardiovascular disease. It is also well documented that sleep deprivation affects cognitive function and emotionality. However, the regulation of sleep is only one aspect of the multitude of circadian rhythms in the body, and thus investigations of how general circadian disruption can affect the brain and body are essential.

Rhythms generated by the SCN have wide-reaching effects throughout the rest of the brain and body. As such, disruption of the circadian clock can have significant downstream effects in multiple cell types and multiple organ systems. In addition to the SCN, peripheral oscillators throughout the brain and body express circadian rhythms, which in many cases can persist for several cycles in vitro. However, most of these rhythms quickly dampen without input from the SCN master clock or an exogenous synchronizer, such as serum shock. Tissues throughout the body show circadian rhythmicity and are synchronized by the SCN in vivo, or by other behaviors regulated by the brain clock, such as feeding. Conceptually, different tissues and organs are kept in synchrony so as to operate most efficiently with each other. When an organism undergoes a phase shift (i.e., experimental jet lag), resynchronization of the circadian clock to a new phase is required, and a transient state of internal desynchronization between the SCN clock and peripheral oscillators occurs. Eventually, a stable phase relationship between these oscillators and the SCN is reestablished over numerous cycles. However, the rate at which different oscillators reentrain following a phase shift varies, and because of this, continuous resynchronization could result in a chronic state of desynchronization.
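The transient desynchronization described here can be caricatured with two coupled phase oscillators: an SCN oscillator pulled strongly by the light cycle, and a peripheral oscillator pulled only weakly by the SCN. The sketch below is a toy Python model, not a physiological one; the coupling strengths, the 24.5-h intrinsic period, and the 6-h shift are illustrative assumptions:

```python
import math

DT = 0.1                        # simulation step, hours
STEPS_PER_DAY = 240             # 24 h at 0.1-h resolution
K_SCN, K_PER = 0.5, 0.1         # light->SCN and SCN->peripheral coupling (assumed)

def step(phase, period, target, k):
    """Advance a phase oscillator one step, nudged toward a target phase."""
    return phase + DT * (2 * math.pi / period + k * math.sin(target - phase))

scn = per = light = 0.0
for t in range(10 * STEPS_PER_DAY):        # simulate 10 days
    if t == 2 * STEPS_PER_DAY:             # 6-h advance of the light cycle on day 2
        light += math.pi / 2               # 6 h = one quarter of a 24-h cycle
    light += DT * 2 * math.pi / 24         # the light cycle has a 24-h period
    scn = step(scn, 24.5, light, K_SCN)    # SCN tracks the light cycle quickly
    per = step(per, 24.5, scn, K_PER)      # peripheral tissue only follows the SCN
    if t % STEPS_PER_DAY == 0:             # report once per simulated day
        gap = math.atan2(math.sin(scn - per), math.cos(scn - per))
        print(f"day {t // STEPS_PER_DAY}: SCN-peripheral phase gap = "
              f"{math.degrees(gap):6.1f} deg")
```

The printed SCN-peripheral gap jumps after the shift and then decays back toward zero over successive days, mirroring the transient internal desynchronization described above; repeating the shift before the gap closes would keep the system chronically desynchronized.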

The molecular basis of the circadian clockworks has been well studied, and rhythms in gene products throughout the body have been well characterized. Over the past decade, the mechanisms driving these diverse rhythms have become clearer, as it has been established that many of the circadian "clock genes" function as transcription factors that can further regulate hundreds of downstream elements. This presents an enlarged pool of factors within cells and tissues whose function could be compromised by circadian disruption. To add to this complexity, physiological rhythms, including those in endocrine function, are well documented. It has been shown that some of these endocrine signals can feed back to modulate circadian behavior and the SCN, whereas others, such as the glucocorticoids, modulate peripheral oscillators and have little effect on the SCN. Thus, the ubiquity of clock genes, clock-controlled molecules, and circadian rhythms in physiological signals places the circadian clock at the center of a complex web of regulation that, if perturbed, could have effects in disparate brain and body systems.

In the present study, researchers asked if normal physiological and behavioral function would be compromised in mice exposed to environmental circadian disruption. Disruption was induced by housing male mice in 20-h light/dark (LD) cycles, whereas controls were maintained in normal 24-h LD cycles. They found that circadian disruption (CD) results in altered body temperature rhythms, increased weight gain, and elevated levels of plasma insulin and leptin. These physiological changes are accompanied by remodeling of neocortical neuronal structure, supporting the hypothesis that CD can affect brain morphology. The neural changes were associated with changes in cognitive function and emotionality, suggesting neurobehavioral ramifications of chronic CD.


Karatsoreos IN, Bhagat S, Bloss EB, Morrison JH, McEwen BS. Disruption of circadian clocks has ramifications for metabolism, brain, and behavior. Proceedings of the National Academy of Sciences 2011;108(4):1657-62.

Circadian (daily) rhythms are present in almost all plants and animals. In mammals, a brain clock located in the hypothalamic suprachiasmatic nucleus maintains synchrony between environmental light/dark cycles and physiology and behavior. Over the past 100 y, especially with the advent of electric lighting, modern society has adopted a round-the-clock lifestyle in which natural connections between rest/activity cycles and environmental light/dark cycles have been degraded or even broken. Instances in which rapid changes to sleep patterns are necessary, such as transmeridian air travel, demonstrate negative effects of acute circadian disruption on physiology and behavior. However, the ramifications of chronic disruption of the circadian clock for mental and physical health are not yet fully understood. By housing mice in 20-h light/dark cycles, incongruous with their endogenous ~24-h circadian period, we were able to model the effects of chronic circadian disruption noninvasively. Housing in these conditions results in accelerated weight gain and obesity, as well as changes in metabolic hormones. In the brain, circadian-disrupted mice exhibit a loss of dendritic length and decreased complexity of neurons in the prelimbic prefrontal cortex, a brain region important in executive function and emotional control. Disrupted animals show decreases in cognitive flexibility and changes in emotionality consistent with the changes seen in neural architecture. How our findings translate to humans living and working in chronic circadian disruption is unknown, but we believe that this model can provide a foundation to understand how environmental disruption of circadian rhythms impacts the brain, behavior, and physiology.
 
[OMG: What if I have the warrior gene and take AAS!!! ROTFLMFAOPIMP]

Code rage: The "warrior gene" makes me mad! (Whether I have it or not)
http://www.scientificamerican.com/blog/post.cfm?id=code-rage-the-warrior-gene-makes-me-2011-04-26

By John Horgan
Apr 26, 2011

Just when you think the blame-it-on-our-genes craze can't get worse, the "warrior gene" goes viral. The latest media outlet to flog it is the Dr. Phil show, which on April 4 broadcast "Born to Rage?". From the promo: "Scientists believe they may know why some people are quicker to anger than others. A new study suggests that inside a rageaholic's DNA, 'a warrior gene' may be pulling the strings. Could today's guests be genetically predisposed to fits of fury?"

Dr. Phil, a psychologist whose real name is Phil McGraw, presented three "rageaholics"—including Lori, a self-described "Tasmanian devil," and Scott, a reality-TV star and "bully"—as well as Rose McDermott, a political scientist at Brown University and warrior gene researcher. McDermott claimed that the warrior gene, which occurs in about 30 percent of the population, makes you more likely to engage in "physical aggression".

Dr. Phil had the rageaholics tested, and guess what? They all had the warrior gene! "This is information to know that you are more susceptible, at risk for, and predisposed—like someone who is fair-skinned and will burn more readily in the sun," Dr. Phil sagely informed his guests. "It doesn't mean they need to go through life sunburned. They take precautions to protect against that." The Tasmanian devil sighed, "It's a relief there's something linked to this anger, and it's not brought on because I want to do it."

Dr. Phil's Web site links to a company called FamilyTreeDNA, "the leading direct-to-consumer DNA testing company in the world." Send a cheek scraping to the company and it will tell you if you have the warrior gene for $69—$99 if you don't go through Dr. Phil's Web site.

This cheesy talk show is hardly alone in hyping the warrior gene. In fact, Dr. Phil borrowed his headline from a recent National Geographic broadcast, "Born to Rage?", which also explores "the disturbing possibility that some people are born to rage." The show follows Henry Rollins, a self-described former punk rocker with a nasty temper, as he interviews "outlaw bikers, mixed–martial arts fighters" and other tough guys and, once again, McDermott. ABC News jumped on the bandwagon last December with an interview with McDermott, who stated: "In many, many studies [the warrior gene] appears implicated in behaviors that look like they're related to physical aggression or some kind of conduct disorder."

The story of the warrior gene dates back to the early 1990s, when several groups reported a link between violent aggression and a gene on the X chromosome that encodes an enzyme called monoamine oxidase A (MAOA), which regulates the function of neurotransmitters such as dopamine and serotonin. The correlation first emerged from studies of a large Dutch family whose male members were mildly retarded and extremely violent. Two were arsonists, one tried to run over an employer with a car, another raped his sister and tried to stab the warden of a mental hospital with a pitchfork. The men all lacked monoamine oxidase A, suggesting that they possessed a defective version of the MAOA gene.

Later, other researchers reported a correlation between violent aggression and an allele of the MAOA gene, MAOA-L, that produces low levels of the MAOA enzyme; the correlation was reportedly stronger if carriers had experienced some sort of trauma as children. The MAOA-L allele occurs in apes and Old World monkeys as well as in humans, leading to speculation that the allele arose 25 million years ago in the common ancestor of these primates and was subsequently favored by natural selection. In a May 4, 2004, article reviewing all this research, Science dubbed MAOA-L "the warrior gene," the oldest reference I have found to the term.

Race, inevitably, reared its head. In 2007 Rod Lea and Geoffrey Chambers, researchers at Victoria University of Wellington in New Zealand, reported that MAOA-L occurs in 56 percent of Maori men. "It is well recognized," the researchers commented in The New Zealand Medical Journal, "that historically Maori were fearless warriors." The researchers' racial profiling was based on a study of 46 men, who needed to have only one Maori parent to be defined as Maori. Lea and Chambers reported that MAOA-L was less common among Caucasians (34 percent) and Hispanics (29 percent) but even more common among Africans (59 percent) and Chinese (77 percent).

In 2009 Kevin Beaver, a criminologist at Florida State University, claimed that males with MAOA-L are more likely to report being gang members (pdf). But his study also showed that the vast majority of MAOA-L carriers are not gang members; moreover, about 40 percent of the gang members were not MAOA-L carriers. Like McDermott, Beaver was featured on the National Geographic show "Born to Rage?"

The 2009 study by McDermott and four colleagues, "Monoamine Oxidase A Gene (MAOA) Predicts Behavioral Aggression Following Provocation," which triggered much of the recent publicity given to the warrior gene, was published in Proceedings of the National Academy of Sciences (PNAS). The article claimed that MAOA-L carriers were more likely than noncarriers to respond with "behavioral aggression" toward someone they thought had cheated them out of money they had earned in a laboratory test. "Behavioral aggression" was defined as making the putative cheater consume hot sauce.

Even disregarding the issue of whether giving someone hot sauce counts as "physical aggression," McDermott's study provides little to no evidence for the warrior gene, because the difference between carriers and noncarriers was minuscule. McDermott et al. examined 70 subjects, half of whom carried the warrior gene. The researchers found that 75 percent of the warrior gene carriers "meted out aggression" when cheated—but so did 62 percent of the noncarriers. Moreover, when subjects were cheated out of smaller amounts of money, "there was no difference" between the two groups.
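How minuscule? A quick two-proportion z-test on the quoted figures makes the point. This is a back-of-the-envelope Python check on rounded percentages, assuming an even 35/35 split of the 70 subjects as stated, not a reanalysis of the study's data:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(0.75, 35, 0.62, 35)   # carriers vs. noncarriers
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided normal p
print(f"z = {z:.2f}, p = {p:.2f}")
# z is about 1.17 and p about 0.24: a 13-point gap in samples this small
# is well within ordinary sampling noise.
```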

Obviously, the warrior gene cannot possibly live up to its name. If it did, the whole world—and China in particular, if the racial statistics cited above are remotely accurate—would be wracked by violence. The warrior gene resembles other pseudo-discoveries to emerge from behavioral genetics, like the gay gene, the God gene, the high-IQ gene, the alcoholism gene, the gambling gene and the liberal gene.

(See my previous columns on the liberal gene - http://www.scientificamerican.com/blog/post.cfm?id=gene-whiz-science-strikes-again-res-2010-10-29 - and gay gene - http://www.scientificamerican.com/blog/post.cfm?id=queer-notions-how-christian-homopho-2010-08-09 -.)

The abysmal record of behavioral genetics stems from two factors. First, the quest for correlations between thousands of genes and thousands of traits and disorders is prone to false positives, especially when traits are as squishy as "aggression" and "childhood trauma" (the variable that helps some researchers link MAOA-L to violent behavior). Second, the media—including respected scientific journals like Science and PNAS as well as shows like Dr. Phil—are prone to hyping "discoveries" that will attract attention.

The media's fascination with the warrior gene recalls the lurid claims made decades ago concerning "XYY syndrome," in which men are born with two Y chromosomes instead of one; the syndrome affects about one in a thousand men. In the 1960s British researchers identified nine men who had an extra Y chromosome and had a record of violent outbursts. This correlation was not surprising, because the men were all incarcerated in a mental hospital for violent patients. Other researchers, also focusing on institutionalized patients and criminals, quickly claimed to have found evidence that XYY men were hyperaggressive "supermales" at risk of becoming violent criminals.

The XYY-supermale claim was propagated by The New York Times and other mainstream media, enshrined in biology and social science textbooks, and even written into plots for films, novels and television shows (as Wikipedia's excellent entry on XYY syndrome documents). Meanwhile, follow-up studies of noninstitutionalized XYY men failed to corroborate the initial claims. In a 1993 report "Understanding and Preventing Violence" the National Academy of Sciences concluded that there is no correlation between the XYY syndrome and violent behavior. In 2007 CSI: Miami nonetheless broadcast a show, titled "Born to Kill," which featured a serial killer with an extra Y chromosome.

Unlike, say, multiverse theories, unsubstantiated claims about human genetics can have real-world consequences. Racists have seized on warrior gene research as evidence that blacks are innately more violent than whites. In 2010 defense attorneys for Bradley Waldroup, a Tennessee man who in a drunken rage hacked and shot a woman to death, urged a jury to show him mercy because he carried the warrior gene. According to National Public Radio, the jury bought this "scientific" argument, convicting Waldroup of manslaughter rather than murder. A prosecutor called the "warrior gene" testimony "smoke and mirrors." He was right. [This case is unbelievable. Can Your Genes Make You Murder? by BARBARA BRADLEY HAGERTY - Can Your Genes Make You Murder? : NPR ]
 
Snyder SH. Mind Molecules. Journal of Biological Chemistry. http://www.jbc.org/content/early/2011/05/04/jbc.X111.258020.full.pdf

Scientific styles vary tremendously. For me, research is largely about the unfettered pursuit of novel ideas and experiments that can test multiple ideas in a day -- not a year -- an approach that I learned from my mentor Julius "Julie" Axelrod. This focus on creative conceptualizations has been my metier since working in the summers during medical school at the National Institutes of Health, during my two years in the Axelrod lab, and throughout my 45 years at Johns Hopkins University School of Medicine. Equally important has been the "high" that emerges from brainstorming with my students. Nothing can compare with the eureka moments when, together, we sense new insights and, better yet, when high-risk, high-payoff experiments succeed. Though I've studied many different questions over the years, a common theme emerges -- simple biochemical approaches to understanding molecular messengers, usually small molecules. Equally important has been identifying, purifying and cloning the messengers' relevant biosynthetic, degradative or target proteins, at all times seeking potential therapeutic relevance in the form of drugs. In the interests of brevity, this review is highly selective, and, with a few exceptions, literature citations are to only findings of our lab that illustrate notable themes.


Solomon H. Snyder (born December 26, 1938) is an American neuroscientist. (From Wikipedia)

Snyder attended Georgetown University 1955-1958 and received his MD from Georgetown University School of Medicine in 1962. After medical internship at the Kaiser Hospital in San Francisco, he served as a research associate 1963-1965 at the NIH, where he studied under Julius Axelrod. Snyder moved to the Johns Hopkins University School of Medicine to complete his residency in psychiatry (1965–1968). He was appointed to the faculty there in 1966 as Assistant Professor of Pharmacology. In 1968 he was promoted to Associate Professor of Pharmacology and Psychiatry and in 1970 to Full Professor in both departments.

His laboratory is noted for the use of receptor binding studies to characterize the actions of neurotransmitters and psychoactive drugs. In 1973, with then-graduate student Candace Pert, he discovered the opioid receptor and later identified the existence of normally occurring opiate-like peptides in the brain. For this work he was awarded the Albert Lasker Award for Basic Medical Research in 1978. He also received the Wolf Prize from the President of Israel in 1983, the Bower Award of the Franklin Institute in 1992, the National Medal of Science in 2003 and the Albany Medical Center Prize in Medicine and Biomedical Research in 2007. He is the recipient of eight honorary doctorates and has been elected to honorific societies including the US National Academy of Sciences, the American Academy of Arts and Science, and the American Philosophical Society.

He is also known for his work identifying receptors for the major neurotransmitters in the brain, in the process explaining the actions of psychoactive drugs, such as the blockade of dopamine receptors by antipsychotic medications. He has described novel neurotransmitters such as the gases nitric oxide and carbon monoxide and the D-isomers of amino acids, notably D-serine.

Presently he is University Distinguished Service Professor of Neuroscience, Pharmacology, and Psychiatry at the Johns Hopkins University School of Medicine. In 1980, he founded the Department of Neuroscience, and served as its first director from 1980 to 2006. In 2006, the department was renamed as the Solomon H. Snyder Department of Neuroscience in his honor.

In 1980, he served as the President of the Society for Neuroscience. He is also Associate Editor, PNAS (Proceedings of the National Academy of Sciences of the United States of America). He helped start the companies Nova Pharmaceuticals and Guilford Pharmaceuticals and has been an active philanthropist.

He is listed by the Institute for Scientific Information (ISI) as one of the 10 most-often cited biologists and he also has the highest h-index of any living biologist.
 
NY Times. 5-15-11

Why Worry? It’s Good for You
By ROBERT H. FRANK

THE late Amos Tversky, a Stanford psychologist and a founding father of behavioral economics, used to say, “My colleagues, they study artificial intelligence; me, I study natural stupidity.”

In recent decades, behavioral economics has been the economics profession’s runaway growth area. Scholars in this field work largely at the intersection of economics and psychology, and much of their attention has focused on systematic biases in people’s judgments and decisions.

They point out, for example, that people are particularly inept at predicting how changes in their life circumstances will affect their happiness. Even when the changes are huge — positive or negative — most people adapt much more quickly and completely than they expected.

Such prediction errors, behavioral economists argue, often lead to faulty decisions. A celebrated example describes an assistant professor at a distinguished university who agonizes for years about whether he will be promoted. Ultimately, his department turns him down. As anticipated, he’s abjectly miserable — but only for a few months. The next year, he’s settled in a new position at a less selective university, and by all available measures is as happy as he’s ever been.

The ostensible lesson is that if this professor had been acquainted with the relevant evidence, he’d have known that it didn’t make sense to fret about his promotion in the first place — that he would have been happier if he hadn’t. But that’s almost surely the wrong lesson, because failing to fret probably would have made him even less likely to get the promotion. And promotions often matter in ways that have little impact on day-to-day levels of happiness.

Paradoxically, our prediction errors often lead us to choices that are wisest in hindsight. In such cases, evolutionary biology often provides a clearer guide than cognitive psychology for thinking about why people behave as they do.

According to Charles Darwin, the motivational structures within the human brain were forged by natural selection over millions of years. In his framework, the brain has evolved not to make us happy, but to motivate actions that help push our DNA into the next round. Much of the time, in fact, the brain accomplishes that by making us unhappy. Anxiety, hunger, fatigue, loneliness, thirst, anger and fear spur action to meet the competitive challenges we face.

As the late economist Tibor Scitovsky said in “The Joyless Economy,” pleasure is an inherently fleeting emotion, one we experience while escaping from emotionally aversive states. In other words, pleasure is the carrot that provokes us to extricate ourselves from such states, but it almost always fades quickly.

The human brain was formed by relentless competition in the natural world, so it should be no surprise that we adapt quickly to changes in circumstances. Much of life, after all, is graded on the curve. Someone who remained permanently elated about her first promotion, for example, might find it hard to muster the drive to compete for her next one.

Emotional pain is fleeting, too. Behavioral economists often note that while people who become physically paralyzed experience the expected emotional devastation immediately after their accidents, they generally bounce back surprisingly quickly. Within six months, many have a daily mix of moods similar to their pre-accident experience.

This finding is often interpreted to mean that becoming physically disabled isn’t as bad as most people imagine it to be. The evidence, however, strongly argues otherwise. Many paraplegics, for instance, say they’d submit to a mobility-restoring operation even if its mortality risk were 50 percent.

The point is that when misfortune befalls us, it’s not helpful to mope around endlessly. It’s far better, of course, to adapt as quickly as possible and to make the best of the new circumstances. And that’s roughly what a brain forged by the ruthless pressures of natural selection urges us to do.

All of this brings us back to our decisions about how hard we should work — choices that have important implications for the lives we are able to lead.

Most people would love to have a job with interesting, capable colleagues, a high level of autonomy and ample opportunities for creative expression. But only a limited number of such jobs are available — and it’s our fretting that can motivate us to get them.

Within limits, worry about success causes students to study harder to gain admission to better universities. It makes assistant professors work harder to earn tenure. It leads film makers to strive harder to create the perfect scene, and songwriters to dig deeper for the most pleasing melody. In every domain, people who work harder are more likely to succeed professionally, more likely to make a difference.

THE anxiety we feel about whether we’ll succeed is evolution’s way of motivating us. And the evidence is clear that most of us don’t look back on our efforts with regret, even if our daily mix of emotions ultimately doesn’t change.

But evolutionary theory also counsels humility about personal good fortune. As Darwin saw clearly, individual and collective interests don’t always coincide. A good job is an inherently relative concept, and while the person who lands one benefits enormously, her lucky break means that some other equally deserving person didn’t get that job.

When people work harder, income grows. But much of the spending that comes from extra income just raises the bar that defines adequate. So, from society’s perspective, some of the anxiety over who gets what jobs may be excessive after all. But that’s very different from saying that people shouldn’t worry about succeeding.

Robert H. Frank is an economics professor at the Johnson Graduate School of Management at Cornell University.
 
Nice.

Most people would love to have a job with interesting, capable colleagues, a high level of autonomy and ample opportunities for creative expression. But only a limited number of such jobs are available — and it’s our fretting that can motivate us to get them. ... A good job is an inherently relative concept, and while the person who lands one benefits enormously, her lucky break means that some other equally deserving person didn’t get that job.

And if you just happen to luck into this, your fretting shifts to not wasting the opportunity. ... Hard to avoid this fretting business.
 
[Note: This is the same guy who blogged on why black women are “rated less physically attractive.” https://thinksteroids.com/community/posts/762151 ]


Why Liberals and Atheists Are More Intelligent

Liberals and Atheists Smarter? Intelligent People Have Values Novel in Human Evolutionary History, Study Finds

ScienceDaily (Feb. 24, 2010) — More intelligent people are statistically significantly more likely to exhibit social values and religious and political preferences that are novel to the human species in evolutionary history. Specifically, liberalism and atheism, and for men (but not women), preference for sexual exclusivity correlate with higher intelligence, a new study finds.

The study, published in the March 2010 issue of the peer-reviewed scientific journal Social Psychology Quarterly, advances a new theory to explain why people form particular preferences and values. The theory suggests that more intelligent people are more likely than less intelligent people to adopt evolutionarily novel preferences and values, but intelligence does not correlate with preferences and values that are old enough to have been shaped by evolution over millions of years.

"Evolutionarily novel" preferences and values are those that humans are not biologically designed to have and our ancestors probably did not possess. In contrast, those that our ancestors had for millions of years are "evolutionarily familiar."

"General intelligence, the ability to think and reason, endowed our ancestors with advantages in solving evolutionarily novel problems for which they did not have innate solutions," says Satoshi Kanazawa, an evolutionary psychologist at the London School of Economics and Political Science. "As a result, more intelligent people are more likely to recognize and understand such novel entities and situations than less intelligent people, and some of these entities and situations are preferences, values, and lifestyles."

An earlier study by Kanazawa found that more intelligent individuals were more nocturnal, waking up and staying up later than less intelligent individuals. Because our ancestors lacked artificial light, they tended to wake up shortly before dawn and go to sleep shortly after dusk. Being nocturnal is evolutionarily novel.

In the current study, Kanazawa argues that humans are evolutionarily designed to be conservative, caring mostly about their family and friends. Being liberal, caring about an indefinite number of genetically unrelated strangers one never meets or interacts with, is evolutionarily novel. So more intelligent children may be more likely to grow up to be liberals.

Data from the National Longitudinal Study of Adolescent Health (Add Health) support Kanazawa's hypothesis. Young adults who subjectively identify themselves as "very liberal" have an average IQ of 106 during adolescence while those who identify themselves as "very conservative" have an average IQ of 95 during adolescence.

Similarly, religion is a byproduct of humans' tendency to perceive agency and intention as causes of events, to see "the hands of God" at work behind otherwise natural phenomena. "Humans are evolutionarily designed to be paranoid, and they believe in God because they are paranoid," says Kanazawa. This innate bias toward paranoia served humans well when self-preservation and protection of their families and clans depended on extreme vigilance to all potential dangers. "So, more intelligent children are more likely to grow up to go against their natural evolutionary tendency to believe in God, and they become atheists."

Young adults who identify themselves as "not at all religious" have an average IQ of 103 during adolescence, while those who identify themselves as "very religious" have an average IQ of 97 during adolescence.

In addition, humans have always been mildly polygynous in evolutionary history. Men in polygynous marriages were not expected to be sexually exclusive to one mate, whereas men in monogamous marriages were. In sharp contrast, whether they are in a monogamous or polygynous marriage, women were always expected to be sexually exclusive to one mate. So being sexually exclusive is evolutionarily novel for men, but not for women. And the theory predicts that more intelligent men are more likely to value sexual exclusivity than less intelligent men, but general intelligence makes no difference for women's value on sexual exclusivity. Kanazawa's analysis of Add Health data supports these sex-specific predictions as well.

One intriguing but theoretically predicted finding of the study is that more intelligent people are neither more nor less likely to value such evolutionarily familiar entities as marriage, family, children, and friends.


Kanazawa S. Why Liberals and Atheists Are More Intelligent. Social Psychology Quarterly 2010;73(1):33-57. http://www.asanet.org/images/journals/docs/pdf/spq/Mar10SPQFeature.pdf

The origin of values and preferences is an unresolved theoretical question in behavioral and social sciences. The Savanna-IQ Interaction Hypothesis, derived from the Savanna Principle and a theory of the evolution of general intelligence, suggests that more intelligent individuals may be more likely to acquire and espouse evolutionarily novel values and preferences (such as liberalism and atheism and, for men, sexual exclusivity) than less intelligent individuals, but that general intelligence may have no effect on the acquisition and espousal of evolutionarily familiar values (for children, marriage, family, and friends). The analyses of the National Longitudinal Study of Adolescent Health (Study 1) and the General Social Surveys (Study 2) show that adolescent and adult intelligence significantly increases adult liberalism, atheism, and men's (but not women's) value on sexual exclusivity.
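The core analysis the abstract describes is, in essence, a regression of adult attitudes on adolescent intelligence with demographic controls. As a rough illustration only, here is a minimal sketch in Python on simulated data; the variable names, sample size, and effect size are assumptions, not the actual Add Health variables or estimates.

```python
# A rough sketch of the kind of regression the abstract describes:
# does adolescent IQ predict adult liberalism, net of basic controls?
# All data here are simulated; names and coefficients are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000
df = pd.DataFrame({
    "iq": rng.normal(100, 15, n),      # adolescent IQ
    "age": rng.integers(18, 29, n),    # adult age at survey
    "female": rng.integers(0, 2, n),
})
# Simulate a small positive IQ effect on a 1-5 liberalism scale
df["liberalism"] = (3 + 0.01 * (df["iq"] - 100)
                    + rng.normal(0, 1, n)).clip(1, 5)

model = smf.ols("liberalism ~ iq + age + female", data=df).fit()
print(model.summary().tables[1])  # the iq coefficient is the quantity of interest
```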
 
Thought for Food: New CMU Research Shows
Imagining Food Consumption Reduces Actual Consumption
Landmark Discovery Reverses Decades-Old Assumption
That Thinking About Food Causes You To Eat More

PITTSBURGH—If you're looking to lose weight, it's okay to think about eating your favorite candy bar. In fact, go ahead and imagine devouring every last bite — all in the name of your diet.

A new study by researchers at Carnegie Mellon University, published in Science, shows that when you imagine eating a certain food, it reduces your actual consumption of that food. This landmark discovery changes the decades-old assumption that thinking about something desirable increases cravings for it and its consumption.

Drawing on research showing that perception and mental imagery engage similar neural machinery and similarly affect emotions, response tendencies and skilled motor behavior, the CMU research team tested the effects of repeatedly imagining the consumption of a food on its actual consumption. They found that simply imagining the consumption of a food decreases one's appetite for it.

"These findings suggest that trying to suppress one's thoughts of desired foods in order to curb cravings for those foods is a fundamentally flawed strategy," said Carey Morewedge, an assistant professor of social and decision sciences and lead author of this study. "Our studies found that instead, people who repeatedly imagined the consumption of a morsel of food — such as an M&M or cube of cheese — subsequently consumed less of that food than did people who imagined consuming the food a few times or performed a different but similarly engaging task. We think these findings will help develop future interventions to reduce cravings for things such as unhealthy food, drugs and cigarettes, and hope they will help us learn how to help people make healthier food choices."

For the study, the research team, which included Young Eun Huh, a Tepper School of Business Ph.D. candidate, and Joachim Vosgerau, assistant professor of marketing, ran a series of five experiments that tested whether mentally simulating the consumption of a food reduces its subsequent actual consumption. In the first experiment, participants imagined performing 33 repetitive actions, one at a time. A control group imagined inserting 33 quarters into a laundry machine (an action similar to eating M&M'S). Another group imagined inserting 30 quarters into a laundry machine and then imagined eating 3 M&M'S, while a third group imagined inserting three quarters into a laundry machine and then imagined eating 30 M&M'S. Next, all participants ate freely from a bowl filled with M&M'S. Participants who imagined eating 30 M&M'S actually ate significantly fewer M&M'S than did participants in the other two groups.
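The logic of that three-group design is easy to see in code. Below is a minimal sketch of how such data might be compared; the group sizes, means, and test choices are fabricated assumptions for illustration, not the authors' data or analysis.

```python
# A minimal sketch (not the authors' analysis) of how the three-group
# M&M'S design above might be compared. All numbers are fabricated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical counts of M&M'S eaten ad libitum by each group (n = 17 each)
imagined_quarters_33 = rng.normal(loc=4.1, scale=1.5, size=17)  # control
imagined_mms_3 = rng.normal(loc=4.0, scale=1.5, size=17)
imagined_mms_30 = rng.normal(loc=2.8, scale=1.5, size=17)

# Omnibus test across the three groups...
f_stat, p_anova = stats.f_oneway(imagined_quarters_33, imagined_mms_3, imagined_mms_30)
# ...then the focal contrast: 30 imagined M&M'S vs. control
t_stat, p_t = stats.ttest_ind(imagined_mms_30, imagined_quarters_33)

print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"30-imagined vs. control: t = {t_stat:.2f}, p = {p_t:.3f}")
```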

To ensure that the results were due to imagined consumption of M&M'S rather than the control task, the next experiment manipulated the experience imagined (inserting quarters or eating M&M'S) and the number of times it was imagined. Again, the participants who imagined eating 30 M&M'S subsequently consumed fewer M&M'S than did the participants in the other groups.

The last three experiments showed that the reduction in actual consumption following imagined consumption was due to habituation — a gradual reduction in motivation to eat more of the food — rather than alternative psychological processes such as priming or a change in the perception of the food's taste. Specifically, the experiments demonstrated that only imagining the consumption of the food reduced actual consumption of the food. Merely thinking about the food repeatedly, or imagining the consumption of a different food, did not significantly influence the actual consumption of the food that participants were given.

"Habituation is one of the fundamental processes that determine how much we consume of a food or a product, when to stop consuming it, and when to switch to consuming another food or product," Vosgerau said. "Our findings show that habituation is not only governed by the sensory inputs of sight, smell, sound and touch, but also by how the consumption experience is mentally represented. To some extent, merely imagining an experience is a substitute for actual experience. The difference between imagining and experiencing may be smaller than previously assumed."

Other implications of this research include the discovery that mental imagery can enact habituation in the absence of pre-ingestive sensory stimulation and that repeatedly simulating an action can trigger its behavioral consequences.

This research was funded by a grant awarded to Morewedge from the Berkman Faculty Development Fund at Carnegie Mellon.
 
"Habituation is one of the fundamental processes ... ", Vosgerau said.
Yeah it is. No further qualification necessary.

"To some extent, merely imagining an experience is a substitute for actual experience. The difference between imagining and experiencing may be smaller than previously assumed."
Now the question is whether or not just imagining eating bonbons all day long while watching soap operas will make me fat. My gut tells me it could. ;)
 
Sex on the brain: Orgasms unlock altered consciousness
Sex on the brain: Orgasms unlock altered consciousness - life - 11 May 2011 - New Scientist

Image: Kayt Sukel's brain at the moment of orgasm. "You can see from the extent of activity that an orgasm is a whole-brain experience. Activation in the prefrontal cortex (A) is clearly visible, as well as activity in the anterior cingulate cortex (B), thought to be involved in the experience of pain."


Our intrepid reporter performs an intimate act in an fMRI scanner to explore the pathways of pleasure and pain

With a click and a whirr, I am pulled into the scanner. My head is strapped down and I have been draped with a blanket so that I may touch my nether regions - my clitoris in particular - with a certain degree of modesty. I am here neither for a medical procedure nor an adult movie. Rather, I am about to stimulate myself to orgasm while an fMRI scanner tracks the blood flow in my brain.

My actions are helping Barry Komisaruk at Rutgers University in Newark, New Jersey, and colleagues to tease apart the mechanisms underlying sexual arousal. In doing so, not only have they discovered that there is more than one route to orgasm, but they may also have revealed a novel type of consciousness - an understanding of which could lead to new treatments for pain.
 
Faking It: Can Ads Create False Memories about Products?
http://www.jcr-admin.org/files/pressreleases/050811130432_Rajagopalrelease.pdf

People who read vivid print advertisements for fictitious products actually come to believe they’ve tried those products, according to a new study in the Journal of Consumer Research.

“Exposing consumers to imagery-evoking advertising increases the likelihood that a consumer mistakenly believes he/she has experienced the advertised product, and subsequently produces attitudes that are as strong as attitudes based on genuine product experience,” write authors Priyali Rajagopal (Southern Methodist University) and Nicole Montgomery (College of William and Mary).

In one study, the researchers showed participants different types of ads for a fictitious product: Orville Redenbacher’s Gourmet Fresh microwave popcorn. Other participants ate what they believed to be Orville Redenbacher’s Gourmet Fresh microwave popcorn, even though it was another Redenbacher product. One week after the study, all the participants were asked to report their attitudes toward the product and how confident they were in their attitudes.

“Students who saw the low imagery ad that described the attributes of the popcorn were unlikely to report having tried the popcorn, and they exhibited less favorable and less confident attitudes toward the popcorn than the other students,” the authors write. People who had seen the high imagery ads were just as likely as participants who actually ate the popcorn to report that they had tried the product. They were also as confident in their memories of trying the product as participants who actually sampled it. “This suggests that viewing the vivid advertisement created a false memory of eating the popcorn, despite the fact that trying the fictitious product would have been impossible,” the authors write.

The authors found that decreasing brand familiarity and shortening the time between viewing the ad and reporting evaluations reduced the false memories in participants. For example, when the fictitious brand was Pop Joy’s Gourmet Fresh instead of the more familiar Orville Redenbacher’s, participants were less likely to report false memories of trying it.

“Consumers need to be vigilant while processing high-imagery advertisements because vivid ads can create false memories of product experience,” the authors conclude.


Rajagopal, Priyali and Montgomery, Nicole Votolato. I Imagine I Experience, I Like: The False Experience Effect (January 13, 2011). Available at SSRN.

False memories refer to the mistaken belief that an event that did not occur, did occur. Much of the research on false memories has focused on the antecedents to and the characteristics of such memories, with little focus on the consequences of false memories. In this research, we propose that exposure to an imagery-evoking ad can result in an erroneous belief that an individual has experienced the advertised brand. We also demonstrate that such false experiential beliefs function akin to genuine product experience beliefs with regard to their outcomes (product attitude valence and attitude strength), a finding we call the false experience effect. We further demonstrate two moderators of this effect: plausibility of past experience and evaluation timing.
 
The Rubik's Cube is a 3-D mechanical puzzle invented in 1974 by Hungarian sculptor and professor of architecture Ernő Rubik. Originally called the "Magic Cube", the puzzle was licensed by Rubik to be sold by Ideal Toy Corp. in 1980 and won the German Game of the Year special award for Best Puzzle that year. As of January 2009, 350 million cubes had been sold worldwide, making it the world's top-selling puzzle game. It is widely considered to be the world's best-selling toy. Rubik's Cube - Wikipedia, the free encyclopedia


10.69 seconds: Robot Ruby breaks Rubik's record (w/ video)

The robot, named Ruby, can solve the scrambled puzzle in just over 10 seconds, including the time taken to scan the initial status of the cube.

It was built from scratch by six students as their final year project for the double degree in Bachelor of Engineering (Robotics and Mechatronics)/Bachelor of Science (Computer Science and Software Engineering).
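The article doesn't say which solving algorithm Ruby uses, but robot solvers of this kind typically scan the cube's facelets and feed the state to a two-phase solver. A minimal sketch of that software pipeline, using the third-party kociemba package as a stand-in solver, might look like this; the scanning and motor steps are hypothetical.

```python
# Sketch of a cube-solving pipeline: scan -> solve -> drive motors.
# Ruby's actual algorithm is not described in the article; Kociemba's
# two-phase solver (pip install kociemba) is used here as a stand-in.
import kociemba

# 54-character facelet string (faces in U, R, F, D, L, B order), as a
# camera-scanning step might produce; this example string comes from
# the kociemba package documentation.
scanned_state = "DRLUUBFBRBLURRLRUBLRDDFDLFUFUFFDBRDUBRUFLLFDDBFLUBLRBD"

solution = kociemba.solve(scanned_state)
print(solution)  # a sequence of moves, e.g. "D2 R' D' F2 B D ..."

# A physical robot would translate each move token into motor commands:
for move in solution.split():
    face, modifier = move[0], move[1:]  # face to turn; "'" = CCW, "2" = half-turn
    # drive_motor(face, modifier)       # hypothetical hardware call
```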
 
No Gender Difference in Risk-Taking Behavior, Study Suggests

ScienceDaily (June 10, 2011) — A new doctoral thesis from the University of Gothenburg shows that young Swedish women are more prone than men to perceive situations as risky. However, there are no gender differences in actual risk-taking behaviour.

In her doctoral thesis Music and Risk in an existential and gendered world, Margareta Bohlin studies risk-taking behaviour among 15- to 20-year-olds. Previous similar studies in several countries have shown that men generally take more risks than women.

However Bohlin's study indicates that this is not the case in Sweden today. 'Girls have been given increased access to the public sphere, so they both want to and are expected to behave like boys, and they certainly do,' says Bohlin. At the same time, however, they tend to perceive risks as more dangerous, which corresponds to traditional gender role patterns. Although girls are expected to take risks to the same extent as boys, there are unwritten rules that apply to girls but not to boys. For example, while they are allowed to drink alcohol and have sex, they may not drink too much or have too many sex partners.

In one of the studies included in the thesis, Margareta Bohlin used interviews and group discussions to assess adolescents' reasoning about risks. 'They talked a lot about how boys have difficulty showing vulnerability. For example, hearing protection and helmets are for wimps, and it's uncool to think that the music is too loud,' says Margareta Bohlin. The introduction of music into this type of risk research is new, and Bohlin says it turned out to be very fruitful. 'It helped me understand the other types of risk behaviour much better.'

The adolescents said that they are aware of the risk of hearing damage at concerts and clubs. Yet at the same time they talked about how much they love music. They discussed how they can feel the music engulfing them and how it takes them to a different existential level. Bohlin concludes that these existential dimensions must be included in research on risk-taking and in preventative work. Risk-taking is more than a matter of risking one's health; it also has an existential meaning. 'Information campaigns focusing on catastrophic death don't work. The kids just turn off. I think that adults must realize and communicate that risk-taking also gives meaning to adolescent life. That may motivate them to balance their risk-taking so that they don't risk their health.'


Bohlin, C. M. (2011). Music and Risk in an existential and gendered world. Department of Psychology, University of Gothenburg, Sweden and Department of Social and Behavioural Studies, University West, Sweden. http://gupea.ub.gu.se/bitstream/2077/25387/1/gupea_2077_25387_1.pdf

Adolescents in Western society often expose themselves to high levels of sound at gyms, rock concerts, discotheques etc. These behaviours are as threatening to young people’s health as more traditional risk behaviours. Testing boundaries and risk taking are fundamental aspects of young people’s lives and the processes of developing their identities. There is, however, a need to balance reasonable risk taking and risks that can damage health.

The aim of Study I was to analyze the relationship between self-exposure to noise, risk behaviours and risk judgements among 310 Swedish adolescents aged 15-20 (167 men/143 women). The adolescents' behaviour in traditional risk situations correlated with their behaviour in noisy environments, and their judgements about traditional risks correlated with their judgements regarding noise exposure. Another finding was that young women judge risk situations as generally more dangerous than young men do, although they behave in the same way as the men. We suggest that this difference is a social and culture-based phenomenon, which underlines the importance of adopting a gender perspective in the analysis of risk factors. Adolescents reporting permanent tinnitus judged loud music as more risky than adolescents with no symptoms, and they did not listen to loud music as often as those with occasional tinnitus.

The aims of Study II were to illuminate the complexity of risk behaviour and the meaning and purpose of adolescent risk-taking, both in a traditional sense (e.g., smoking and drug use) and in noisy environments (e.g., discotheques and rock concerts), in relation to norms and gender roles in contemporary society. In total, 16 adolescents (8 men/8 women, aged 15-19) were interviewed individually and in focus groups. The interviewees' responses revealed social reproduction of gender and class. For both genders, two main themes emerged: the social identity and the existential identity of risk-taking. The descriptive sub-themes, however, which together formed the general structure, were rather diverse for men and women. Incorporating social and existential theories on gender as basic factors in the analysis of attitudes towards risk-taking behaviours is considered to be of utmost importance. Likewise, research on hearing prevention for young people needs to acknowledge and make use of theories on risk behaviour, and similarly, theories on risk behaviour should acknowledge noise as a risk factor.

Study III aims to increase knowledge about young women's and men's risk judgement and behaviour by investigating patterns in adolescent risk activities among 310 adolescents aged 15-20 (143 women; 167 men). The Australian instrument ARQ, developed by Gullone et al., was used with additional questions on hearing risks [1], and a factor analysis was conducted. The main results showed that the factor structure in the judgement and behaviour scales for Swedish adolescents was rather different from the factor structure in the Australian sample. The factor structure also differed from the Australian sample's when split by gender, and factor structures differed between genders among Swedish adolescents. The results are discussed from a gender and existential perspective on risk-taking, and it is emphasized that research on risk behaviour needs to reconceptualize stereotypical ideas about gender and the existential period of adolescence.
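As a rough illustration of what "a factor analysis was conducted" involves, here is a minimal sketch using scikit-learn. The items, sample, and loadings are fabricated stand-ins; the real ARQ items and responses are not reproduced here.

```python
# A minimal sketch of an exploratory factor analysis like the one
# described above. The responses are simulated; the real ARQ data
# are not reproduced here.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Simulated responses: 310 adolescents rating 10 risk items
n_respondents, n_items, n_factors = 310, 10, 2
latent = rng.normal(size=(n_respondents, n_factors))   # latent risk dispositions
loadings_true = rng.normal(size=(n_factors, n_items))
responses = latent @ loadings_true + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=n_factors).fit(responses)

# Estimated loadings: rows are factors, columns are items. Comparing
# these matrices across subsamples (e.g., split by gender or country)
# is how differing factor structures would show up.
print(np.round(fa.components_, 2))
```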

The aim of Study IV was to investigate possible gender differences on psychometric scales measuring risk perception in noisy situations, attitudes towards loud music, perceived susceptibility to noise, and individual norms and ideals related to activities where loud music is played. In addition, the purpose was to analyze whether these variables are associated with protective behaviour, e.g. the use of hearing protection. A questionnaire was administered to a Swedish sample of 543 adolescents aged 16 to 20. The results revealed significant gender differences on all the psychometric scales. Furthermore, all psychometric measures were associated with hearing protection use in musical settings. Contrary to previous studies, gender alone did not explain protective behaviour in the analysis. One conclusion is that although gender alone does not explain protective behaviour, it may affect psychological variables such as risk perception, attitudes and perceived susceptibility, and these variables may in turn inform decision-making and protective behaviour in noisy situations. Although women tend to be more 'careful' psychologically, they nevertheless behave in the same way as men regarding actual noise-related risk-taking.
 