Ah, the "fake news" problem.

Yes, we laugh now, but before the term was embraced by President Trump, fake news was already a real problem. Not because people were spreading slight errors due to honest (or even dishonest) reporting mistakes. Not even because of tendentious, opinionated takes on the news produced by partisans on all sides. Rather, fake news became a thing because during last year's election there were many, many outlets created online for the sole purpose of spreading actually fake, completely false news stories over the Internet, with the explicit aim of deceiving people.

Fake news has become a bigger issue than it might have otherwise, because some people (wrongly) believe that fake news is what got Trump elected. And now Facebook, the main conduit for fake news, wants to sanitize its reputation by rescuing us from it.

To that end, Facebook founder Mark Zuckerberg has announced his company's new plan to fight fake news. After noting that Facebook's news content would be reduced from 5 percent to 4 percent of its apparently misnamed "News Feed," Zuckerberg writes that the platform will be taking steps to improve the integrity of the news sources it carries through what appears to be a plebiscite system:

The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking.
We decided that having the community determine which sources are broadly trusted would be most objective.
Here's how this will work. As part of our ongoing quality surveys, we will now ask people whether they're familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don't follow them directly. (We eliminate from the sample those who aren't familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)
This update will not change the amount of news you see on Facebook. It will only shift the balance of news you see towards sources that are determined to be trusted by the community.
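Stripped of the survey logistics, the mechanism Zuckerberg describes is just a ratio. Here is a minimal sketch, in Python, of how such a "broadly trusted" score might be computed; the survey data, field names, and function are hypothetical, invented for illustration, and not anything Facebook has published:

```python
from collections import defaultdict

# Hypothetical survey responses: each respondent says whether they are
# familiar with a source and, if so, whether they trust it.
responses = [
    {"source": "Example Gazette", "familiar": True,  "trusts": True},
    {"source": "Example Gazette", "familiar": True,  "trusts": False},
    {"source": "Example Gazette", "familiar": False, "trusts": None},
    {"source": "Partisan Daily",  "familiar": True,  "trusts": True},
    {"source": "Partisan Daily",  "familiar": False, "trusts": None},
]

def trust_scores(responses):
    """Return, per source, the share of familiar respondents who trust it.

    Respondents unfamiliar with a source are dropped from that source's
    sample, as Zuckerberg describes, so the score is
    (respondents who trust it) / (respondents familiar with it).
    """
    familiar = defaultdict(int)
    trusting = defaultdict(int)
    for r in responses:
        if not r["familiar"]:
            continue  # unfamiliar respondents don't count either way
        familiar[r["source"]] += 1
        if r["trusts"]:
            trusting[r["source"]] += 1
    return {src: trusting[src] / familiar[src] for src in familiar}

print(trust_scores(responses))
# {'Example Gazette': 0.5, 'Partisan Daily': 1.0}
```

Note that under this kind of ratio, a niche outlet known only to its devoted readers can score as well as a household name, which is presumably why Zuckerberg stresses sources trusted "even by those who don't follow them directly."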

So, Facebook wants to rescue us from fake news. That's a lot like saying the foxes will take care of preventing further poultry disappearances.

I will grant that this solution avoids certain pitfalls — the bias of self-styled experts, the ignorance of Facebook employees, etc. You could even say that this is the least bad way to weed out garbage news sources, or perhaps that it's the worst way to do it except for all the others, though I'm not really sure about any of that.

Here's the real issue: the false representation of Facebook as a place where it isn't completely, insanely stupid for people to go looking for their news. The real answer to garbage news sources is to stop getting news through Facebook. According to the Pew Research Center, Americans are between two and three times more likely to trust the local news (82 percent) and the national news (76 percent) than they are to trust anything they see on social media (34 percent).

The truly necessary task, which Zuckerberg has no interest in performing or assisting in, is to teach that last 34 percent that they're idiots if they believe what they see on Facebook.

If someone on Facebook sends you lots of news articles, you should unfriend them. Better yet, you should quit Facebook. The purpose of Facebook is to spam people you barely know with baby and wedding pictures, and to let click farms build up the economies of Third World countries.