The Facebook Journalism Project debuted to help save the media and tackle the social network's growing misinformation problem.
The company said the project focuses on helping media outlets find sustainable business models, teaching journalists how to use its technologies, and making sure its users are better informed than ever. The company has made similar claims in the past--that's what Instant Articles, Graph Search, and its News Feed were all designed to do--but this new project could prove more critical than its predecessors because it follows months of controversy.
Facebook was criticized throughout the 2016 presidential election for allowing "fake news" to spread on its platform. Many people get most of their news from the social network, yet the service has struggled to separate misleading blog posts and satirical essays from quality journalism. A link is a link is a link; Facebook users post, discover, read, share, comment on, and react to web pages they find on the network in the same way, regardless of their source.
The Facebook Journalism Project wants to change that by teaching people news literacy. The company explained:
We will work with third-party organizations on how to better understand and to promote news literacy both on and off our platform to help people in our community have the information they need to make decisions about which sources to trust. [...] In the short-term, we are working with the News Literacy Project to produce a series of public service ads (PSAs) to help inform people on Facebook about this important issue. Our longer-term goal is to support news organizations with projects and ideas aimed at improving news literacy, including financial grants where needed.
That's how Facebook wants to use its money to address its fake news problem. It's also going to use its technology: the company recently made it easier for its users to report hoaxes, blocked scammers from its advertising platform, and partnered with "third-party fact checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles" to identify misinformation on its network and take the appropriate action.
This matters to Facebook because it needs the media to give its users something to do besides post memes, look at baby pictures, and decline friend requests from people they barely knew in high school. It matters to the media because Facebook is currently one of the best ways for journalism to find an audience while also bringing in the revenues needed to fund that reporting. And it matters to Facebook users, because they should have access to news.
The question is whether or not it will work. Facebook doesn't have the best track record when it comes to supporting the media--much of the industry has to scramble to keep up with Facebook's whims--or with policing its service. The company has pulled videos of public interest from its platform, only to restore them later; censored images for depicting the female body; and flip-flopped on whether or not it's okay to share newsworthy but violent content.
Partnering with outside groups could be what Facebook needed to improve. "This problem is much bigger than any one platform, and it’s important for all of us to work together to minimize its reach," the company said. "This is just the beginning of our effort on that front — we have much more to do." Every announcement related to the Facebook Journalism Project and its progress will be collected into a single page on Facebook's media website.
Personally, I'm pretty certain FB's version of the truth isn't going to jibe with my own.
If anything, it would be helpful if FB took a more traditional newspaper or site approach and clearly differentiated opinion or blog pieces from actual "news" articles that just report the facts of a situation. That being said, I must be one of the few who don't rely on social media for news. It just seems weird and potentially problematic to me.
"Truth stands, even if there be no public support. It is self-sustained."
-I laughed! But I agree with your intended wish: that people would actually be smart enough. I think there is a huge FB population that is provably NOT smart enough, no matter what FB believes.
When people say "something could be true for someone but not for someone else", they are mixing "truths" with "beliefs". Truth doesn't have to be believed in. It is a truth, plain and simple. Your point of view doesn't matter. It's a truth.
The problem with fake news is that many outlets these days are publishing stories and trying to pass them off as legitimate news, when they are instead either opinion pieces or something based on rumors or information that can't be verified.
I agree that the world is not a binary entity. It is not merely 0s or 1s. There's a whole spectrum of variables to take into account. And I also agree that people SHOULD be able to separate the BS from the real stuff, but we all know that's not the case. People usually tend to "believe" as truth whatever seems to validate their pre-conceived opinions. Even if they're based entirely on non-proven "facts". THIS is the heart of the problem.
An initiative to label "fake news", "opinion pieces", "parody websites" or "unreliable sources" as such isn't censorship. Censorship would be to prevent those from appearing on the platform, which, as far as I know, isn't what this would be about. This would merely be a label to inform people about the content they're about to click on. Recent studies have demonstrated that people in general are more inclined to believe whatever has been shared by their friends on Facebook than stuff coming from serious and dedicated news sources. This is a huge problem.
Let me give you an example. The facts: two different married men catch their wives having an affair and decide to file for divorce. Although the basic facts are the same, one man is happy about it because he was looking for a reason to file for divorce, while the other is sad because he loved his wife. The truth, based on the same facts, can be different for two people based on circumstances, experiences, or even things like culture.
My point is, no one can decide what the truth is for you, but you.
Yes, "fake news" exists outside the networks too. But I see the crack down on "fake news" as being a little too broad in scope... to the point of shutting down "unauthorized" news sources not controlled by big money.
Perhaps it would be better to offer 3 levels of news. 1. Unfiltered, un-fact-checked news from all sites and sources; 2. Filtered news, removing all news from known fake news sites and sources; 3. Filtered and verified news, adding fact checking of all stories before posting.
I'm sure number 3 would be the least popular; most people can't handle the truth (isn't that right, Jack).
People are more inclined to read news with sensational headlines, so news organizations air stories designed to bring in viewers, not convey the truth. Stations that just tell the real news without leaning it one way or the other or exaggerating the facts tend to do poorly in ratings.
Case in point: no major news network is talking about the NAND shortage yet, but you hear about a cat that used to ride a train every day in Japan that died. One can really affect you and your business; the other not at all, unless you live in Japan. But you can see which was reported in the USA.