© 2014 Peter Free
Citation — to study
Aner Tal and Brian Wansink, Blinded with science: Trivial graphs and formulas increase ad persuasiveness and belief in product efficacy, Public Understanding of Science, DOI: 10.1177/0963662514549688 (online before print, 15 October 2014)
Citation — to press release
Cornell University, Blinded by non-science: Trivial scientific information increases trust in products, ScienceDaily (17 October 2014)
This would be a laughable juxtaposition of claim and inferred lack of proof, had the juxtaposition not been presented so inadvertently
Here is the entirety of the abstract — notice its absence of any methodological data:
The appearance of being scientific can increase persuasiveness. Even trivial cues can create such an appearance of a scientific basis.
In our studies, including simple elements, such as graphs (Studies 1–2) or a chemical formula (Study 3), increased belief in a medication’s efficacy.
This appears to be due to the association of such elements with science, rather than increased comprehensibility, use of visuals, or recall.
Belief in science moderates the persuasive effect of graphs, such that people who have a greater belief in science are more affected by the presence of graphs (Study 2).
Overall, the studies contribute to past research by demonstrating that even trivial elements can increase public persuasion despite their not truly indicating scientific expertise or objective support.
© 2014 Aner Tal and Brian Wansink, Blinded with science: Trivial graphs and formulas increase ad persuasiveness and belief in product efficacy, Public Understanding of Science, DOI: 10.1177/0963662514549688 (online before print, 15 October 2014) (paragraph split)
Here is the entirety of Cornell University’s equally uninformative press release:
Published this week in Public Understanding of Science, the Cornell Food and Brand Lab study found trivial graphs or formulas accompanying medical information can lead consumers to believe products are more effective.
“Your faith in science may actually make you more likely to trust information that appears scientific but really doesn’t tell you much,” said lead author Aner Tal, post-doctoral researcher at the Cornell Food and Brand Lab. “Anything that looks scientific can make information you read a lot more convincing.”
The study showed that when a graph – with no new information – was added to the description of a medication, 96.6 percent of people believed that the medicines were effective in reducing illness versus 67.7 percent of people who were shown the product information without the graph.
“Even those with professed faith in science were more likely to be swayed by trivial scientific looking product information,” said Tal.
“In fact, the more people believed in science, the more they were convinced by the graphs.
“What this means is that when you read claims about new products, whether it’s a medication or a new technology, you should ask yourself,
‘where does this information come from?’,
‘what’s the basis for the claims being made?’
“Don’t let things that look scientific but don’t really tell you much fool you. Sometimes a graph is just a graph!”
Cornell University, Blinded by non-science: Trivial scientific information increases trust in products, ScienceDaily (17 October 2014) (reformatted for clarity)
Thus, the abstract and press release make a claim — in the absence of summarized data to support it
One would think that the authors would have taken advantage of their own discovered phenomenon.
A bit of quantitative and methodological data added to the abstract might have avoided the dismissive conclusion that I came away with.
How many subjects?
Why could the data experimentally attached in the study legitimately be considered trivial?
And so forth.
Even an abbreviated answer to one of these questions might have lent the study summary a hint of (legitimately acquired) credibility.
The moral? — The authors really should have summarized the answers to their own questions before signing off on the abstract
Part of the scientific process is communicating at least the appearance of knowing what Science’s rules are.
In a world of busy people and short attention spans, this has to be done in the abstract:
“Where does this information come from?”
“What’s the [statistical and methodological] basis for the claims being made?”
Well said, indeed, Dr. Tal. No disrespect intended. Sometimes we trip over our own feet by overlooking the need to communicate what we know that our readers do not.
Presumably, the answers to Dr. Tal’s own questions are included in the body of his study’s report.
But the report sits behind the journal’s paywall, so we cannot get to it. That defeats much of the purpose of having an abstract.