The Boggle Threshold: How Open-Minded Are ‘Open-Minded’ Scientists?
“Boggle threshold” is a term coined by writer and historian Renée Haynes (1906–1994). She defined it as “the level above which the mind boggles when faced with some new fact or report or idea.”
Last year, T.M. Luhrmann wrote about Haynes’ own boggle threshold in an article for The New York Times: “Haynes herself was fine … with telepathy; hesitant about reincarnation; but appalled that a woman had flown across the Atlantic to have her torn ‘aura’ repaired by a guru expert in invisible mending.”
Reincarnation researcher Dr. Jim Tucker is fine with children's reported memories of past lives, but memories involving past lives as animals push against his boggle threshold, he wrote in his book "Return to Life."
Scientists who study the paranormal are often able to accept what is beyond the boggle threshold of more mainstream scientists. But as Society for Scientific Exploration (SSE) President William F. Bengston points out in his 2012 Edgescience article "The Boggle Factor": "It is also clearly the case that those of us who are interested in scientific anomalies do not automatically accept the reality of all of the anomalous phenomena we are exposed to."
“Where do we draw the line, and why?” he asked.
He conducted a study to find answers to this question, or at least hints. SSE is an organization that brings together many scientists who study anomalies, so he surveyed SSE members about their boggle thresholds, asking them to rate their acceptance of various anomalous phenomena on a scale of 1 to 10, with 10 being total acceptance that the given phenomenon exists.
The findings of the Princeton Engineering Anomalies Research (PEAR) laboratory, well-known among SSE members, scored 8.01. PEAR research suggests that the human mind can influence a machine known as a random event generator (REG). An REG randomly produces 1s and 0s, like an electronic coin flipper. Operators were asked to direct their intention at the machine to cause it to produce either more 1s or more 0s. The REG displayed a tendency toward the operator's chosen outcome at a rate well above chance.
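The basic logic of such an experiment can be sketched in a few lines of code. PEAR's actual hardware, protocols, and statistics were far more elaborate; this is only a minimal illustration, in which the function names, the trial count, and the size of the simulated bias are all hypothetical. It shows how a deviation from chance in a stream of random bits would be measured as a standard (z) score: a fair generator hovers near zero, while even a small bias becomes detectable once enough trials accumulate.

```python
import random

def simulate_reg(n_trials, p_ones=0.5, seed=42):
    """Simulate a random event generator: n_trials independent bits.

    p_ones is the underlying probability of emitting a 1.
    A fair (uninfluenced) REG would have p_ones = 0.5;
    any other value models a hypothetical operator effect.
    """
    rng = random.Random(seed)
    return [1 if rng.random() < p_ones else 0 for _ in range(n_trials)]

def z_score(bits):
    """Standard score of the observed count of 1s against the chance
    expectation of n/2 (normal approximation to the binomial)."""
    n = len(bits)
    ones = sum(bits)
    mean = n / 2
    sd = (n * 0.25) ** 0.5  # sqrt(n * p * (1 - p)) with p = 0.5
    return (ones - mean) / sd

# A fair stream stays near z = 0; a stream with a small built-in
# bias drifts to a large z as the number of trials grows.
fair = simulate_reg(100_000)
biased = simulate_reg(100_000, p_ones=0.51)
print(f"fair z = {z_score(fair):.2f}, biased z = {z_score(biased):.2f}")
```

The point of the sketch is the scale involved: a per-bit bias of one percent is invisible in a handful of flips but produces an unmistakable z score over tens of thousands of trials, which is why REG studies run to very large trial counts.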
Bengston wrote: “The PEAR lab data has a mean score of 8.01, indicating strong acceptance. But why isn’t the score even higher? Where are the 10s? (Is everyone just being appropriately skeptical?) They [PEAR researchers] have 30 years of data, meticulously gathered and analyzed, in hundreds of papers, books, and technical reports. What’s left? Is there anything lacking that if produced would make the holdouts convert to acceptance? What could that possibly be? Would a 31st year of experimental data make any difference?”
Bengston recalled a time when he had dismissed something beyond his threshold. Upon learning more about it, however, he had changed his mind and reeled it back within his threshold.
He was going to listen to a talk on crop circles at a conference. “I wasn’t particularly looking forward to it. After all, weren’t crop circles basically a group of guys with planks and a few too many beers pulling a prank on the gullible? What’s the point of sitting through this? But what I heard made my jaw drop. Here were carefully gathered data on what turned out to be a richly complex phenomenon that blew away my preconceived notions.”
Could it be simply a matter of not being familiar enough with the data to accept a phenomenon completely?
Luhrmann looks at psychological factors that could also contribute to the establishment of a boggle threshold: “When we draw a line between the plausible and the ridiculous—our boggle line—I think we become more confident about the beliefs on the plausible side of the line. You are, the boggle line tells you, a sensible, reasonable person. You do not believe in that. So a belief in this—well, a sensible person would take that seriously.”
Stanford professor Benoit Monin has found that people who view themselves as morally reasonable, a view reinforced by some moral action such as disagreeing with blatant racism, may give themselves license to do something morally risky, such as expressing a less blatantly racist sentiment. Perhaps the same principle is at work with the boggle threshold: a person rejects one blatantly hard-to-accept theory, giving himself license to accept a less outlandish one.
The sociologist Lester R. Kurtz wrote in 1983: “What people do not believe is often more clearly articulated than what they do believe.”
Follow @TaraMacIsaac on Twitter, visit the Epoch Times Beyond Science page on Facebook, and subscribe to the Beyond Science newsletter to continue exploring ancient mysteries and the new frontiers of science!