From R. Kelly Garrett at Phys.org:
Where people differ is in how often they do so. A 2016 survey that my colleague Brian Weeks and I conducted found that 50.3 percent of all Americans agreed with the statement “I trust my gut to tell me what’s true and what’s not.” Some of those polled felt quite strongly about it: About one in seven (14.6 percent) strongly agreed, while one in 10 (10.2 percent) strongly disagreed.
Gut feelings tell many of us not to trust anything in the social sciences except the Sokal hoaxes played on their practitioners. But now and then, we learn something that reminds us vaguely of the world we live in:
Another study found that people with the strongest reasoning skills and the highest science literacy also tend to be more biased in their interpretation of new information. Even asking people to “think carefully” can lead to more biased answers.
In this context, our results are surprising. There are many individual qualities that seem like they should promote accuracy, but don’t.
Valuing evidence, however, appears to be an exception. The bigger the role evidence plays in shaping a person’s beliefs, the more accurate that person tends to be. More.
Most of the article sounds like an effort to reassure us that if we believe whatever the establishment tells us, despite all the scandals, we are doing the right thing. It would have been much stronger had it addressed some of the many instances where the establishment has simply been wrong or off base for decades, as with many nutrition issues.
See also: What? Questioning evolution is not science denial?
Steve Fuller: Brexit, the repudiation of experts, and intelligent design
Study: More education leads to more doubt of science “consensus”