You’re probably familiar with the TV or movie plot device in which a character is conked on the head, loses memory or identity, and then gets conked again, restoring the memory. Classic examples are the 1951 Tom and Jerry cartoon “Nit-Witty Kitty” and the movie “Clean Slate.”
The recent finding that telling lies induces changes in the brain has stimulated a number of misrepresentations that may wreak more harm on our understanding than the lies on which they report. CNN’s headline runs, “Lying May Be Your Brain’s Fault, Honestly,” and PBS reports, “Telling a Lie Makes Way for the Brain to Keep Lying.” These stories are based on a study from University College London using a brain imaging technique called functional MRI. The authors report that as subjects tell lies, activation of the amygdala, an area of the brain associated with emotion and decision making, actually decreases, suggesting that subjects may become desensitized to lying, thereby paving the way for further dishonesty.
We’ve known that bacteria live in our intestines since the 1680s, when Leeuwenhoek first looked through his microscope. Yogurt companies use that information in the sales pitch for their product, claiming it can help keep your gut bacteria happy. The bacteria growing on our skin have also been effectively exploited to sell the underarm deodorants without which we can become, ahem, malodorous. Yet until fairly recently, our various microbes were thought of as freeloaders offering no meaningful benefit to our functioning as healthy human beings.
Most of us considered microbes little more than nasty germs before science recently began turning our view of the microbial world on its head. A “microbe” is a bacterium or any other organism too small to see with the naked eye. After decades of trying to sanitize them out of our lives, the human microbiome – the communities of microbes living on and in us – is now all the rage. And yet, some insist that we can’t really call microbes “good.” That’s nonsense.
Giving feedback is unquestionably one of the most challenging tasks for any leader, as it can be painful to both the giver and receiver. It is nonetheless invaluable: Research has shown that employees recognize the importance of feedback – whether positive or negative – to their career development.
Many even welcome it, provided it’s given well. One study of nearly a thousand employees both in the U.S. and abroad found that 92 percent believed that negative feedback is effective at improving performance – “if delivered appropriately.”
While the Nobel Prizes are 115 years old, rewards for scientific achievement have been around much longer. As early as the 17th century, at the very origins of modern experimental science, promoters of science realized the need for some system of recognition and reward that would provide incentive for advances in the field.
Before the prize, it was the gift that reigned in science. Precursors to modern scientists – the early astronomers, philosophers, physicians, alchemists and engineers – offered wonderful achievements, discoveries, inventions and works of literature or art as gifts to powerful patrons, often royalty. Authors prefaced their publications with extravagant letters of dedication; they might, or they might not, be rewarded with a gift in return. Many of these practitioners worked outside of academe; even those who enjoyed a modest academic salary had no large institutional funders of the kind that exist today, beyond the Catholic Church. Gifts from patrons offered a crucial means of support, yet they came with many strings attached.