Standard Deviations: Flawed Assumptions, Tortured Data, and Other Ways to Lie with Statistics

Submitted by Anonymous on Thu, 07/02/2015 - 08:59
Gary Smith, the Fletcher Jones Professor of Economics at Pomona College, Claremont, California, has written a deeply researched and thought-provoking book on how researchers torture data to produce outcomes that confirm their assumptions, assumptions which may themselves be flawed. Such results take people for a ride, sometimes for prolonged periods, leading them to draw false inferences and make wrong decisions. Even informed people are misled by biased or irrelevant data and by flawed approaches to research. One reason could be the assumption that computers are infallible, that ‘no matter what kind of garbage we put in, computers will spit out gospel.’ In the past, the author notes, data were scarce and computers non-existent; researchers had to work hard to collect good data and spent long hours on the necessary calculations. Today, with data so plentiful, researchers do not take the time to distinguish good data from rubbish, or sound analysis from junk science. Most people assume that because we can handle large amounts of data, nothing can go wrong. All of this leads us to make decisions based on the nonsense that computers spit out.
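The kind of data torture the book warns about is easy to demonstrate. The sketch below (my own illustration, not an example from the book; all function names and parameter values are assumptions) correlates hundreds of pure-noise "predictors" against a pure-noise outcome: even though nothing real is there, roughly five percent of them will clear the conventional p < 0.05 bar by chance alone, handing a determined researcher plenty of "findings" to report.

```python
# Illustration of data dredging: with enough random predictors,
# some will look "statistically significant" purely by chance.
import math
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def count_spurious_hits(n_predictors=500, n_samples=30, seed=42):
    """Correlate many pure-noise predictors with a pure-noise outcome
    and count how many clear the nominal p < 0.05 threshold."""
    rng = random.Random(seed)
    outcome = [rng.gauss(0, 1) for _ in range(n_samples)]
    # Critical |r| for two-tailed p < 0.05 with n = 30 (df = 28) is about 0.361.
    critical_r = 0.361
    hits = 0
    for _ in range(n_predictors):
        predictor = [rng.gauss(0, 1) for _ in range(n_samples)]
        if abs(pearson_r(predictor, outcome)) > critical_r:
            hits += 1
    return hits

# Expect roughly 5% of 500, i.e. on the order of 25 spurious "discoveries".
print(count_spurious_hits())
```

Run it with any seed: the spurious hits never go away, because a 5% false-positive rate per test is baked into the threshold itself. Report only the hits, hide the other 475 tests, and garbage has become gospel.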