Monday, December 31, 2012


Half the Facts You Know Are Probably Wrong

Old truths decay and new ones are born at an astonishing rate

Dinosaurs were cold-blooded. Increased K-12 spending and lower pupil/teacher ratios boost public school student outcomes. Most of the DNA in the human genome is junk. Saccharin causes cancer and a high fiber diet prevents it. Stars cannot be bigger than 150 solar masses.

In the past half-century, all of the foregoing facts have turned out to be wrong. In the modern world, facts change all of the time, according to Samuel Arbesman, author of the new book The Half-Life of Facts: Why Everything We Know Has an Expiration Date (Current).

Fact-making is speeding up, writes Arbesman, a senior scholar at the Kauffman Foundation and an expert in scientometrics, the science of measuring and analyzing science. As facts are made and remade with increasing speed, Arbesman worries that most of us don't keep up to date. That means we're basing decisions on facts dimly remembered from school and university classes, facts that often turn out to be wrong.

In 1947, the physicist and historian of science Derek J. de Solla Price was asked to store a complete set of The Philosophical Transactions of the Royal Society temporarily in his house. Price stacked them in chronological order by decade, and he noticed that the number of volumes doubled about every 15 years; scientific knowledge, in other words, was apparently growing at an exponential rate. Thus the field of scientometrics was born.

Price started to analyze all sorts of other kinds of scientific data, and concluded in 1960 that scientific knowledge had been growing steadily at a rate of 4.7 percent annually for the last three centuries. In 1965, he exuberantly observed, “All crude measures, however arrived at, show to a first approximation that science increases exponentially, at a compound interest of about 7 percent per annum, thus doubling in size every 10–15 years, growing by a factor of 10 every half century, and by something like a factor of a million in the 300 years which separate us from the seventeenth-century invention of the scientific paper when the process began.”
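
A quick back-of-the-envelope check, sketched here in Python (the function name is invented for illustration), shows how Price's compound growth rates translate into his doubling times and his "factor of a million":

```python
import math

def doubling_time(annual_rate):
    """Years for a quantity growing at a compound annual rate to double."""
    return math.log(2) / math.log(1 + annual_rate)

print(doubling_time(0.047))  # ~15.1 years at 4.7 percent per year
print(doubling_time(0.07))   # ~10.2 years at 7 percent per year

# Compound growth at 4.7 percent per year over 300 years:
print((1 + 0.047) ** 300)    # ~960,000, roughly Price's "factor of a million"
```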

A 2010 study in the journal Scientometrics, looking at data between 1907 and 2007, concurred: The “overall growth rate for science still has been at least 4.7 percent per year.”

Since knowledge is still growing at an impressively rapid pace, it should not be surprising that many facts people learned in school have been overturned and are now out of date. But at what rate do former facts disappear? Arbesman applies to the dissolution of facts the concept of half-life: the time required for half the atoms of a given amount of a radioactive substance to disintegrate. The half-life of the radioactive isotope strontium-90, for example, is about 29 years. Applying the concept to facts, Arbesman cites research that looked into the decay in the truth of clinical knowledge about cirrhosis and hepatitis. “The half-life of truth was 45 years,” he reports.

In other words, half of what physicians thought they knew about liver diseases was wrong or obsolete 45 years later. Similarly, ordinary people’s brains are cluttered with outdated lists of things, such as the 10 biggest cities in the United States.
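
As a rough sketch, assuming the decay really is exponential (the function below is invented purely for illustration), the fraction of a body of facts still standing after a given time follows directly from the half-life:

```python
def fraction_surviving(years, half_life):
    """Fraction of a body of facts still considered true after `years`,
    assuming simple exponential decay with the given half-life."""
    return 0.5 ** (years / half_life)

# With the 45-year half-life reported for clinical knowledge of liver disease:
print(fraction_surviving(45, 45))  # 0.5  -> half overturned or obsolete
print(fraction_surviving(20, 45))  # ~0.73 -> roughly a quarter gone in 20 years
```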

Facts are being manufactured all of the time, and, as Arbesman shows, many of them turn out to be wrong. Weeding out the bad ones is how the scientific process is supposed to work: experimental results need to be replicated by other researchers. So how many of the findings in the 845,175 articles published in 2009 and recorded in PubMed, the free online medical database, were actually replicated? Not all that many. In 2011, a disquieting study in Nature reported that over a 10-year period a team of researchers was able to reproduce the results of only six out of 53 landmark papers in preclinical cancer research.

In 2005, the physician and statistician John Ioannidis published “Why Most Published Research Findings Are False” in the journal PLoS Medicine. Ioannidis cataloged the flaws of much biomedical research, pointing out that reported studies are less likely to be true when they are small, the postulated effect is likely to be weak, research designs and endpoints are flexible, financial and nonfinancial conflicts of interest are common, and competition in the field is fierce. Ioannidis concluded that “for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias.” Still, knowledge marches on, spawning new facts and changing old ones.

Another reason that personal knowledge decays is that people cling to selected “facts” as a way to justify their beliefs about how the world works. Arbesman notes, “We persist in only adding facts to our personal store of knowledge that jibe with what we already know, rather than assimilate new facts irrespective of how they fit into our worldview.” All too true; confirmation bias is everywhere.

So is there anything we can do to keep up to date with the changing truth? Arbesman suggests that simply knowing that our factual knowledge bases have a half-life should keep us humble and ready to seek new information. Well, hope springs eternal.

More daringly, Arbesman suggests, “Stop memorizing things and just give up. Our individual memories can be outsourced to the cloud.” Through the Internet, we can “search for any fact we need any time.” Really? The Web is great for finding an up-to-date list of the 10 biggest cities in the United States, but if the scientific literature is littered with wrong facts, then cyberspace is an enticing quagmire of falsehoods, propaganda, and just plain bunkum. There simply is no substitute for skepticism.

Toward the end of his book, Arbesman suggests that “exponential knowledge growth cannot continue forever.” Among the reasons he gives for the coming slowdown is that current growth rates imply that everyone on the planet would one day be a scientist. The 2010 Scientometrics study also mused about the growth rate in the number of scientists, offering the conjecture “that the borderline between science and other endeavors in the modern, global society will become more and more blurred.” Perhaps most people will be scientists after all. Arbesman also notes that “the number of neurons that can be recorded simultaneously has been growing exponentially, with a doubling time of about seven and a half years,” which suggests to him that brain/computer linkages will one day be possible.

SOURCE







Are video games really the villains in our violent age?

The Sandy Hook school massacre has revived concerns about the effects of first-person shooter games, but some of them are actually good for you

The number of aliens you kill may directly contribute to an improvement in your brain. This may not sound like a typical scientific discovery, but it has come from some of the world's finest neuroscience laboratories. In fact, it is the genuine outcome of studies on how action video games can improve your attention, mental control and visual skills. We're talking here about fast-moving titles such as Halo, Call of Duty and Grand Theft Auto, which demand quick reflexes and instant decision-making. They're often portrayed as the most trashy, vapid and empty-headed forms of digital entertainment, but it looks as if they may be particularly good at sharpening your mental skills.

This may come as a surprise if you read much of the popular press, which is often obsessed with technological scare stories. Scientific evidence has been less media-friendly but considerably more convincing. We now have numerous studies on how playing action computer games, as opposed to puzzle or strategy titles such as The Sims or Tetris, leads to an improvement in how well we pay attention, how quickly we react, how sensitive we are to images and how accurately we sort information. Crucially, these studies are not just focused on people who already play a lot of video games, but are testing whether action video game training genuinely leads to improvements.

The studies use randomised controlled trials, a method normally used to test medications but applicable to almost anything. In this case, a group of people are randomly assigned to one of two groups. Half get the "treatment", perhaps blasting away at enemy combatants in Medal of Honor, while the others get the "placebo" – for example, managing a digital family in The Sims 3. Reliably, those assigned to play the fast-moving action games show improvements on neuropsychological tests that measure the ability to process and react quickly to visual information. It's worth saying that these conclusions were thrown into doubt in 2011, when several scientists, led by Walter Boot of Florida State University, suggested that the findings might be due to poor experimental design; subsequent and better-planned studies, however, have continued to find a positive effect.
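
As a minimal sketch of that design (all numbers, group sizes and scores below are invented for illustration, not taken from any actual study), random assignment plus a simple comparison of post-training test scores might look like this:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Forty hypothetical participants, randomly split into two groups.
participants = [f"P{i:02d}" for i in range(40)]
random.shuffle(participants)
treatment, placebo = participants[:20], participants[20:]

# Invented post-training reaction times (ms, lower is better): the
# "treatment" group played an action game, the "placebo" group a
# slow-paced one. The gap between the means here is made up.
treatment_scores = [random.gauss(430, 40) for _ in treatment]
placebo_scores = [random.gauss(460, 40) for _ in placebo]

print(f"action game mean: {statistics.mean(treatment_scores):.0f} ms")
print(f"control mean:     {statistics.mean(placebo_scores):.0f} ms")
```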

Another aspect of the game debate concerns the impact of violent video games. This has become a matter of public anxiety again in light of the tragic Sandy Hook killings, after the gunman was identified as a fan of first-person shooter games such as Call of Duty. It's worth saying that such appalling events are not a good basis for science, simply because the sheer popularity of this form of entertainment makes it difficult to establish any link between its use and the actions of statistically rare individuals. This does not, however, mean that the issue is unimportant or unworthy of study – and it has, in fact, been researched widely.

Also using randomised controlled trials, research has found that violent video games cause a reliable short-term increase in aggression during lab-based tests. However, this seems not to be something specific to computer games. Television and even violence in the news have been found to have a similar impact. The longer-term effects of aggressive gaming are still not well studied, but we would expect them to mirror those of other violent media – again, a small increase in aggressive thoughts and behaviour in the lab.

These, however, are not the same as actual violence. Psychologist Christopher Ferguson, based at Texas A&M International University, has examined what predicts genuine violence committed by young people. It turns out that delinquent peers, depression and an abusive family environment account for actual violent incidents, while exposure to media violence seems to have only a minor and usually insignificant effect. This makes sense even in light of horrifying mass shootings. Several of the killers did play video games, but this doesn't distinguish them from millions of non-violent young men. Most, however, had a previous history of antisocial behaviour and a disturbed background, something known to be much more common among killers.

Perhaps the most telling effect of video games concerns not what they involve but how much time someone spends playing them. A helpful study on the effect of giving games consoles to young people found that, while the gaming had no negative impact on core abilities, school performance declined for those kids who put aside homework for screen entertainment. Similarly, a significant amount of research has found that putting aside exercise for the physical inactivity of video games raises the risk of obesity and general poor health.

And while "addiction" is now the pop psychology label of choice for anything that someone does to excess (sex, video games, shopping), the same behaviour could just as easily, and more parsimoniously, be described as a form of avoidant or unhelpful coping. Rather than dealing with uncomfortable life problems, some people avoid them by absorbing themselves in other activities, leading to an unhelpful cycle where the distractions end up maintaining the problems because they're never confronted. This can apply as easily to books as video games.

The verdict from the now considerable body of scientific research is not that video games are a new and ominous threat to society but that anything in excess will cause us problems. The somewhat prosaic conclusion is that moderation is key – whether you're killing aliens, racing cars or trying to place oddly shaped blocks that fall from the sky.

SOURCE





