Should psychologists sue for mental abuse?
Lord knows, social sciences are troubled and easy to mock.
After all, we don’t know if a physics prof is describing hadrons accurately. But we sense we might very well know more than the psych prof does about the dynamics driving our workplace. Especially if it turns out that he belongs to a small minority of the population holding distinct political views, and helps suppress challenges from within the profession.
Sure, they can get away with that if taxpayers are forced to fund them, but no one can be forced to take it seriously.
So we read, for example, at Vox,
In a recent blog post titled “I was wrong,” he [a psych prof] fesses up to adding a shoddy conclusion to the psychological literature (with the help of colleagues) while he was a graduate student at the University of Missouri. “[W]e ran a study, and the study told us nothing was going on,” he writes. “We shook the data a bit more until something slightly more newsworthy fell out of it.”
This is a bold and honest move — the type that gives me reasons to be optimistic for the future of the science. He’s confessing to a practice called p-hacking, or the cherry-picking of data after an experiment is run in order to find a significant, publishable result. While this has been commonplace in psychology, researchers are now reckoning with the fact that p-hacks greatly increase the chances that their journals are filled with false positives. It’s p-hacks like the one Hilgard and his colleagues used that gave weight to a theory called ego depletion, the very foundation of which is now being called into question. More.
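Why p-hacking inflates false positives is easy to demonstrate with a simulation. Under a true null hypothesis, p-values are uniformly distributed on [0, 1]; if a researcher runs many analyses of the same null data (different subgroups, outcome measures, covariate choices) and reports only the best one, the chance of a "significant" result climbs well above the nominal 5%. A minimal sketch (the constants and function name here are illustrative, not from any of the studies discussed):

```python
import random

random.seed(0)
ALPHA = 0.05       # conventional significance threshold
N_STUDIES = 10_000 # simulated "papers"

def false_positive_rate(analyses_per_study: int) -> float:
    """Fraction of null studies that report p < ALPHA when the
    researcher keeps only the smallest p-value found."""
    hits = 0
    for _ in range(N_STUDIES):
        # Each analysis of null data yields a p-value ~ Uniform(0, 1).
        best_p = min(random.random() for _ in range(analyses_per_study))
        if best_p < ALPHA:
            hits += 1
    return hits / N_STUDIES

print(f"honest, 1 analysis per study:  {false_positive_rate(1):.3f}")
print(f"hacked, 20 analyses per study: {false_positive_rate(20):.3f}")
```

With one analysis the rate hovers near 0.05, as advertised; with twenty tries per dataset it approaches 1 − 0.95²⁰ ≈ 0.64 — most "findings" are noise.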
For more on “ego depletion,” see Slate: Big psych “ego depletion” theory debunked
Plus, we hear from Daniel Engber at Slate,
David Peterson would like to know why psychology has fallen into crisis. The discipline now seems rife with shoddy data. A recent, large-scale effort to reproduce experiments found that more than half of 100 major papers could not be replicated. Even certain bedrock findings—including those that spawned entire subfields of research—now appear to be unstable. What, exactly, led us to this point?
Peterson has some clues. The graduate student in sociology at Northwestern University has spent parts of the past four years conducting an ethnographic study of about a dozen different research labs. The subjects of his research were at times distressing in their honesty. “You want to know how it works?” one graduate student told him. “We have a bunch of half-baked ideas. We run a bunch of experiments. Whatever data we get, we pretend that’s what we were looking for.” More.
But the readiness of pop science mags to dump on social sciences for this sort of thing prompts a question: Are the social sciences a scapegoat mainly because they are easy to make fun of, while serious problems elsewhere are swept under the rug?
One wonders, reading this from the Washington Post:
Scientists with pitchforks, torches and battering rams are threatening to storm the SOS headquarters in the wake of our item Thursday headlined “Many scientific studies can’t be replicated. That’s a problem.”
Their message: It’s just the psychologists! Not us! Our work is super-reproducible! Don’t lump us in with those squishy psycho-babbling crypto-phrenologists! (I’m paraphrasing liberally.) More.
The theories of the month that run through the main hosepipe around here about cosmology, origin of life, human evolution, and the human mind suggest that’s a good bet. So many of them are hot, cool, or trendy, therefore untouchable (for now). For that matter, many are non-replicable in principle, non-falsifiable in principle, or don’t make sense in principle (and proud to say so).
Never mind, it’s still science.
Our webmaster Jack Cole, himself a clinical psychologist, comments,
I don’t think the bias and error are any worse in psychology than in many other fields, but psychologists can be more willing to study this and reflect on the reasons than professionals in other fields. Even in the article linked, they note some theories as to why researchers over-emphasize and “shake” the data until something falls out. Many areas of biological science would benefit from the perspective of psychology’s research into how researchers are biased and make mistakes.
But the same science writers who would rush to dump on psychology spend similar energy defending nonsense against enemies named, unnamed, and imagined.
The serious enemy is approved nonsense, but they’ll be the last to know.