I was talking about misinformation with a more centrist-y mate than me yesterday, and he was speaking with concern about Zuckerberg's dismantling of fact-checking (and presumably moderation and safety) at Meta.
The difference between me and this friend is probably that he seems to think national governments stand a chance of forcing media publishers and platform operators to regulate misinformation properly, and that the platforms are even capable of doing so.
I have major doubts about that; I just don't see it happening. Misinformation will get far worse from here. A big problem is that it's always been far, far easier to produce bullshit needing to be debunked than to debunk it, and generative AI is making that arbitrarily worse. Fact-checking is one thing an ML system that doesn't "triangulate" with its empirical senses is never likely to do well at.
The thing that's worrying me more than mere misinformation at the moment—and you see this clearly with the followings of Musk and Trump—is that there's a large and growing cohort who not only know perfectly well that they're feeding on these new types of bullshit, but have a real appetite for it.
This cohort enjoys and prefers the lies and their consequences, even while they simultaneously know they are lies, and even dimly know the outlines of something closer to the truth.
I think, as usual, that the issue is power. Lies are a preferable consolation to truth if knowing the truth about things doesn't tend to empower you to change those things.