Let’s say we want to assess the real effect of some public policy on some other variable. We can map out our estimates of this effect over time. If a rationalist Bayesian thinker is collecting evidence in an unbiased way, their belief might evolve over time like this:

This is the ideal case. We collect evidence, and we slowly converge on the truth. Unless the Devil is poisoning our pool of evidence to mislead us deliberately, everything is fine. This is the sort of thing that happens when doctors study new drugs: there are some bumps in the road that send them in the wrong direction, but they slowly figure out what it really does.
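That convergence can be sketched with a toy simulation. All the numbers here are invented for illustration: assume a normal prior, and assume each piece of evidence is an unbiased but noisy observation of the true effect.

```python
import random

random.seed(0)

TRUE_EFFECT = 1.0  # the "real" policy effect (made up for illustration)
NOISE_SD = 2.0     # how noisy each individual piece of evidence is

# Conjugate normal-normal updating: start from a vague prior and fold in
# each unbiased observation; the posterior mean drifts toward the truth.
prior_mean, prior_var = 0.0, 10.0
obs_var = NOISE_SD ** 2

for _ in range(500):
    evidence = random.gauss(TRUE_EFFECT, NOISE_SD)  # unbiased draw
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    prior_mean = post_var * (prior_mean / prior_var + evidence / obs_var)
    prior_var = post_var

print(round(prior_mean, 2))  # settles near TRUE_EFFECT
```

Early draws can send the estimate in the wrong direction, but with enough unbiased evidence the posterior tightens around the truth, which is the "bumps in the road" picture above.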
We don’t have this luxury in politics. Usually, we have a split: some people have a prior belief that the effect is positive, and others have a prior belief that the effect is negative. And not only do people have different prior beliefs, but they also collect evidence in a biased way. Rather than being exposed to the same evidence, we have two different teams being exposed to different collections of evidence selected[1] to lead them to different (emotionally satisfying) conclusions. That looks something like this:
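A toy model of that split, with made-up numbers: suppose the true effect is zero, but each camp's information diet filters out most of the evidence pointing the "wrong" way before anyone sees it.

```python
import random

random.seed(1)

TRUE_EFFECT = 0.0  # suppose the policy actually does nothing
NOISE_SD = 1.0

def biased_mean(lean, n=2000):
    """Average the evidence a partisan actually sees: draws come from the
    same unbiased pool, but pieces pointing the 'wrong' way are mostly
    filtered out before they reach the audience."""
    seen = []
    while len(seen) < n:
        e = random.gauss(TRUE_EFFECT, NOISE_SD)
        # 90% of agreeable evidence gets through; only 10% of the rest does.
        keep_prob = 0.9 if (e > 0) == (lean > 0) else 0.1
        if random.random() < keep_prob:
            seen.append(e)
    return sum(seen) / n

blue = biased_mean(+1)  # evidence curated to look positive
red = biased_mean(-1)   # evidence curated to look negative
print(round(blue, 2), round(red, 2))  # same pool, opposite conclusions
```

Neither team ever sees a lie, and every individual piece of evidence is honest; the divergence comes entirely from which pieces each side gets shown.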
Not great. Now we introduce the topic of the day: the enlightened centrist. They won’t fall for the narratives of the two sides. Instead of just listening to MSNBC or Fox News, they’re going to listen to MSNBC and Fox News. They’re going to tune in to HasanAbi on Saturdays and dabble in some Newsmax on Sundays. Bill Maher is their favorite socialist, and they love to attend drag queen story hour:
Notice how the red and blue priors are not as extreme as the conclusions they’re drawn toward. This is only a suspicion of mine, but it seems likely that people’s initial beliefs are pushed further toward the extremes by the way information is selectively acquired. Regardless, the balanced approach taken by the centrist means that they’ve come closer to the truth, even though they’re still wrong.
I admire any effort to improve one’s understanding. But while this strategy is probably good, it’s absolutely not the ideal approach, which we saw at the outset. In effect, instead of allowing yourself to be drawn to the truth, you’ve chosen the conclusion!
Sometimes, making this decision will help you, as in the picture above. And it could be necessary because you don’t know how to pick up new evidence in an unbiased way. But there are conditions under which this is a very bad idea:
What if one side knows what the truth is? Then, if you start on that side, becoming an enlightened centrist will only make your estimate less accurate.
What if the two sides agree on a wrong conclusion? You’ll be just as wrong as they are.
What if one side is biased towards a wrong conclusion, but the other side is biased towards an extremely wrong conclusion in the other direction? Balancing these two equally will make you very wrong in the direction of the more extreme side.
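The arithmetic behind these three failure modes, with invented numbers for where each side sits (the truth is 0 in every case):

```python
# Each side's position is a made-up number purely for illustration.
def centrist(side_a, side_b):
    """Split the difference between the two sides' conclusions."""
    return (side_a + side_b) / 2.0

# 1) One side already knows the truth: averaging drags you away from it.
print(centrist(0.0, 4.0))    # 2.0, worse than trusting the accurate side

# 2) Both sides agree on the same wrong value: averaging fixes nothing.
print(centrist(3.0, 3.0))    # 3.0, exactly as wrong as either side

# 3) One side mildly wrong, the other extremely wrong the other way.
print(centrist(2.0, -10.0))  # -4.0, wrong in the extreme side's direction
```

Splitting the difference only helps when the two sides' errors happen to roughly cancel, and nothing guarantees that they do.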
Again, I don’t think this is the worst way of going about it, but if you’re reading something like Ground News or AllSides, this is what you’re doing, and it will often cause you to be wrong. You’re not going to turn into a schizoterrorist or Manson cultist, but it has issues.
With that said, I can also imagine a way for this to work very well. If you consume information prolifically and remember things well (so as to avoid getting drawn towards a conclusion by the same evidence twice), at some point, one side is going to run out of evidence. You’ll notice the evidence on the other side has bigger sample sizes, or maybe it just doesn’t stink of Bad Vibes. Then you’ll be drawn closer to them rather than the exact middle point, and that will bring you even closer to the truth. I’m imagining two very large but finite pools of evidence, where we would like to know everything but can only sample a bit from each pool to try to understand the world. We want to learn from each pool in proportion to the quality and size of their evidence, like in the first picture, but we often can’t. But if each pool is small enough, the well eventually runs dry on one side.
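A sketch of the pools-running-dry idea, with invented pool sizes: the centrist alternates between the two sides until the smaller pool is exhausted, then has no choice but to keep drawing from the larger one.

```python
import random

random.seed(2)

TRUE_EFFECT = 1.0  # made up; the "pro" side happens to be right

# Two finite pools: the side closer to the truth has far more evidence.
pro_pool = [random.gauss(TRUE_EFFECT, 0.5) for _ in range(500)]
anti_pool = [random.gauss(-TRUE_EFFECT, 0.5) for _ in range(40)]

seen = []
while pro_pool or anti_pool:
    # Alternate sides while both wells still have water; once one runs
    # dry, every further draw comes from whichever pool remains.
    if len(seen) % 2 == 1 and anti_pool:
        pool = anti_pool
    else:
        pool = pro_pool or anti_pool
    seen.append(pool.pop())

estimate = sum(seen) / len(seen)
print(round(estimate, 2))  # pulled past the midpoint, toward the bigger pool
```

For the first 80 draws the estimate hovers near the midpoint of the two camps, but a prolific enough consumer drains the small pool and drifts toward the side with more to say, which is the long-run behavior described above.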
But I think that the typical person simply doesn’t consume enough information for this long-run behavior to happen. There are other problems with this strategy, too: what if you don’t actually know what both sides are listening to? Maybe you think that the Wall Street Journal op-ed section is what liberals read, and the American right loves Breitbart, so the truth is some balance between Oren Cass and Adolf Hitler. Alternatively, you might think the American right loves the New York Times, and the left is all about coverage from random blue-haired womxn with pronouns on TikTok. More realistically, I can easily imagine someone who thinks a balanced approach involves avoiding “legacy media” entirely and listening to Bret Weinstein, Sam Harris, and Jordan Peterson, who have substantial disagreements with each other but are really just various flavors of the American cultural right wing.
All of these problems are solved if you rely on elites to find the truth for you by specializing in answering the question “How do we gather information in an unbiased way to converge on the truth?”, but Americans loathe the idea of someone who knows something they don’t, so we’re never getting that.
[1] Such as through algorithms driven by positive engagement on social media. Most media sources behave similarly through story selection, picking and choosing what to run to appeal to their audience. Relevant:
The media makes you smarter
It has become common sense in America that the media is biased and untrustworthy. The truth is that the media, including Fox News, MSNBC, Reuters, the AP, and all the rest, rarely lies. The problem is synthesizing all of these stories while paying no mind to story selection and interpretation.