Murder Video Again Raises Questions About How Facebook Handles Content

A conference worker passes a demo booth at Facebook's annual F8 developer conference, on Tuesday in San Jose, Calif. (Noah Berger/AP)

Video of a murder uploaded to Facebook this week upset many users, especially since it took Facebook two hours to take it down. But the incident illustrates a dilemma for the company as it becomes an open platform for both recorded and livestreamed video.

Facebook CEO Mark Zuckerberg was contrite about the incident when he appeared on stage at the company's F8 developer conference.

"Our hearts go out to the family and friends of Robert Godwin Sr.," said Zuckerberg, referring to the man whose murder was posted on Facebook. "And we have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening."

But doing more may not be so easy for Facebook. On the one hand, its users want to be free to express themselves; on the other, they also want some protection.

"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "

Keller points to an incident last year when Facebook took down a post of an iconic Vietnam War photo of a naked girl running from a napalm attack. The removal upset users.

Keller says Facebook isn't actually under any legal obligation to keep content up or to take down a video of a crime. The company responds because it wants to keep users happy. "They want to take things like this down, and they're working really hard to have a good way to do that," she says.

Keller thinks part of Facebook's dilemma is that society isn't sure yet whether the company should be like the phone company, which isn't responsible for what people say, or if it should be like a traditional broadcaster, subject to strict regulations on what can be put on air.

"And I think Facebook isn't really exactly like either of those two things," says Keller, "and that makes it hard as a society to figure out what it is we do want them to do."

Nearly 2 billion people use Facebook each month, and millions of them are uploading videos every day. Facebook also pays media outlets, including NPR, to upload videos. That volume of content makes Facebook's job a lot harder.

The company has three ways of monitoring content: There are the users — like the ones who flagged the murder videos from Cleveland. Facebook also has human editors who evaluate flagged content. And, there's artificial intelligence, which can monitor enormous amounts of content.
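To make those three channels concrete, here is a rough sketch of how such a triage flow could fit together. Everything in it (the names, the thresholds, the rule for when the AI acts on its own) is illustrative; Facebook has not published how its systems actually work.

```python
# Hypothetical sketch of the three moderation channels the article
# describes: user flags, automated AI scoring, and a human review queue.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    user_flags: int = 0    # channel 1: reports from other users
    ai_risk: float = 0.0   # channel 3: score from an automated classifier

review_queue: list[Post] = []   # channel 2: human editors work this queue

def triage(post: Post, flag_threshold: int = 1, ai_threshold: float = 0.9) -> str:
    """Route a post: the AI acts alone only on high-confidence cases;
    everything else that draws flags waits for a human decision."""
    if post.ai_risk >= ai_threshold:
        return "removed automatically"
    if post.user_flags >= flag_threshold or post.ai_risk >= 0.5:
        review_queue.append(post)
        return "queued for human review"
    return "left up"

print(triage(Post("a", user_flags=3)))   # queued for human review
print(triage(Post("b", ai_risk=0.95)))   # removed automatically
print(triage(Post("c")))                 # left up
```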

But even AI has its limits, says Nick Feamster, a professor of computer science at Princeton University. Take that iconic naked girl photo from Vietnam, he says. "Can we detect a nude image? That's something that an algorithm is pretty good at," he says. "Does the algorithm know context and history? That's a much more difficult problem."
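Feamster's split between the two problems can be shown with a toy example. The function below scores an image by its fraction of skin-toned pixels, a crude stand-in for a real nudity classifier (the rule of thumb and the numbers are arbitrary). Whatever score it returns, nothing in the code can know whether the image is exploitative or a Pulitzer-winning historical photograph; that judgment needs context the algorithm doesn't have.

```python
# Toy illustration of Feamster's point: a pixel-level signal is easy,
# context is not. The skin-tone test is a crude stand-in for a real
# nudity classifier; the rule and sample values are arbitrary.

def looks_like_skin(r: int, g: int, b: int) -> bool:
    # A common, very rough RGB skin-tone rule of thumb.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - g) > 15

def nudity_score(pixels: list[tuple[int, int, int]]) -> float:
    """Fraction of pixels that fall in a skin-tone range."""
    hits = sum(1 for (r, g, b) in pixels if looks_like_skin(r, g, b))
    return hits / len(pixels)

# The algorithm sees only pixels. It returns the same score whether the
# image is exploitative content or the 1972 "Napalm Girl" photograph;
# telling them apart requires history and context it does not have.
sample = [(200, 120, 90)] * 60 + [(30, 60, 120)] * 40
print(nudity_score(sample))  # 0.6 -- a number, with no understanding
```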

Feamster says it's not a problem that's likely to be solved anytime soon. However, he says AI might be able to detect signs of a troublesome account. It's sort of like the way a bank assesses credit ratings.

"Over time you might learn a so-called prior probability that suggests that maybe this user is more likely to be bad or more likely to be posting inappropriate or unwanted content," Feamster says.

So, Facebook would keep a closer eye on that account.
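One way to read Feamster's credit-rating analogy is as a simple Bayesian update: start every account with a prior that assumes good behavior, then let each flagged or clean post shift the estimate. The sketch below is illustrative; the pseudo-counts and the review threshold are invented for the example.

```python
# Illustrative sketch of the "prior probability" idea Feamster describes:
# a Beta-Bernoulli model of how likely an account is to post bad content.
# All numbers, including the review threshold, are hypothetical.

def risk_score(flagged_posts: int, clean_posts: int,
               prior_bad: float = 1.0, prior_good: float = 99.0) -> float:
    """Posterior mean of P(next post is bad) under a Beta prior.

    The prior (1 bad : 99 good pseudo-posts) encodes the assumption that
    most accounts are well-behaved; each observed post shifts the estimate.
    """
    return (prior_bad + flagged_posts) / (
        prior_bad + prior_good + flagged_posts + clean_posts
    )

# A brand-new account starts near the assumed base rate...
print(risk_score(0, 0))       # 0.01
# ...while an account with several flagged uploads drifts upward.
print(risk_score(5, 20))      # 0.048
if risk_score(5, 20) > 0.03:  # hypothetical review threshold
    print("route this account's future uploads to human review first")
```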

Between artificial intelligence and more human monitoring, it might be possible to stop the posting of criminal videos and hate speech.

But Stanford's Keller wonders if that's really what we want.

"Do we want one of our key platforms for communication with each other to have built-in surveillance and monitoring for illegal activity and somebody deciding when what we said is inappropriate and cutting it off?" she asks. "That's kind of a dystopian policy direction as far as I'm concerned."

Keller is willing to make a prediction: very soon, someone else will upload another video that forces the country to ask these same questions again.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Laura Sydell fell in love with the intimate storytelling qualities of radio, which combined her passion for theatre and writing with her addiction to news. Over her career she has covered politics, arts, media, religion, and entrepreneurship. Currently Sydell is the Digital Culture Correspondent for NPR's All Things Considered, Morning Edition, Weekend Edition, and NPR.org.