Construct beliefs actively

I find the particular interaction of these phenomena troubling:

  1. People often have local social goals that favor regurgitation.
  2. Repetition makes people more likely to believe a claim.

Most conversations aren’t dialectical. We’re a social species, and when we engage with each other it’s difficult to fight off urges to persuade, please, and impress, to name a few. The issue is that these goals make us liable to parrot.

A common context in which this becomes a problem for me is when I’m sharing something new I’ve learned with others. Someone pushes my argumentative buttons, or there’s an awkward silence, or I’m worried that I sound uninformed, and suddenly I’m repeating X that I’ve read or heard but don’t quite believe myself. I don’t not believe X. I just haven’t taken the time to ask myself what I really think, because, well, I’m learning. In the moment none of this seems important because my immediate social goals are staring me in the face. Hashing out my own beliefs can be done later, in the safety and privacy of my own room.

Except I probably won’t do that. And when the same topic comes up in a new social setting, I’ll be ever more inclined to repeat X because now I’ve had practice saying it - maybe even experienced positive reinforcement when it helped me survive that last conversation. Over time, this vicious cycle leads me to unconsciously adopt X as my own belief, selected for its convenience in maintaining fluid social interaction, which is not even close to a reasonable proxy for discovering what is true. (There might be some pressure here toward more true positions, since they’re generally easier to articulate and defend than less true ones, but this pressure is weak relative to the explicit pressure I’d be applying by engaging in critical thinking.)

Related is the illusory truth effect, where people tend to believe frequently repeated misinformation because the constant exposure makes it easier to process. In this case, however, we’re on both the delivering and receiving end of the repetition. Such an effect is obviously bad for epistemic reasons: we risk developing all sorts of incongruous beliefs because they’re optimized for appearances and not grounded in common principles. Possibly the more dangerous consequence of hasty imitation is that it builds passive minds and becomes habitual. It really is much easier to dig around just-in-time for thoughts other people have had than to carefully reach one’s own conclusions, especially when the short-term payoffs are unclear.

We shouldn’t let arbitrary social pressures be the basis upon which we build our belief systems. I attempt to avoid this by refusing to put off belief formation when I learn something new. That is, I actively check the thing I’ve learned against my current worldview and decide whether or not I buy into it. If there’s a crucial part of the picture I think I’m missing, I at least try to log the new information as “considered, but unresolved.” It’s better than leaving my beliefs entirely subject to my future social whims.