“There is a kind of nefarious block of the ‘safe harbor’ provision — in the Digital Millennium Copyright Act — that basically says that nobody, no musician, no individual, can sue Facebook or Google, or YouTube, for posting stuff they don’t have permission to post. So this is why there are 55,000 ISIS videos on YouTube. Right? They claim, ‘We have no responsibility . . . it’s First Amendment rights. We don’t know.’ The only pushback they’ve gotten is from advertisers. Procter & Gamble said, ‘Hey, we’re not so comfortable with our advertising being on terrorist videos. Stop it, please.’ Now they say, ‘Oh, there are too many videos being uploaded to YouTube, we can’t control it.’
“But you notice there’s no porn on YouTube. So why is that? That’s because they have A.I. — artificial intelligence algorithms — so that when someone tries to upload porn, it sees a bare breast, stops it, and puts it into a separate queue where a human looks at it and says, ‘Well, is this a National Geographic video? Or is this porn?’ And if it’s porn, it doesn’t go up, and if it’s National Geographic, it does go up. Well, they could do the same thing with ISIS videos. As you well know — any of you who’ve ever used [the music recognition app] Shazam — they can do the same thing with every tune, every movie, that someone doesn’t want up there; in three seconds the audio signature would tell them, ‘This is something we don’t want,’ and stop it.
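The Shazam-style matching the speaker describes rests on content fingerprinting: reduce a clip to a compact signature and compare it against a registry of signatures for content that rights holders have flagged. The sketch below illustrates that idea only, under loose assumptions — real systems hash constellations of spectrogram peaks, not raw samples, and all names here (`fingerprint`, `matches_known_content`, the threshold) are hypothetical, not any platform's actual API.

```python
# Minimal sketch of fingerprint-style content matching, in the spirit of the
# Shazam-like audio identification described above. Hypothetical names; a real
# system would hash spectrogram peak pairs, not raw sample windows.
import hashlib

def fingerprint(samples, window=4, hop=2):
    """Return the set of hashes of overlapping windows of `samples`."""
    hashes = set()
    for start in range(0, max(len(samples) - window + 1, 0), hop):
        chunk = bytes(samples[start:start + window])
        hashes.add(hashlib.sha1(chunk).hexdigest())
    return hashes

def matches_known_content(upload, index, threshold=0.5):
    """Flag an upload whose fingerprint overlaps a registered signature."""
    fp = fingerprint(upload)
    for name, known in index.items():
        if fp and len(fp & known) / len(fp) >= threshold:
            return name  # enough overlap: route to review / block
    return None

# Rights holders register signatures of content they don't want re-uploaded.
index = {"blocked_tune": fingerprint([1, 2, 3, 4, 5, 6, 7, 8])}

print(matches_known_content([1, 2, 3, 4, 5, 6], index))  # → blocked_tune
print(matches_known_content([9, 9, 9, 9, 9, 9], index))  # → None
```

The overlap threshold is the policy knob the quote alludes to: a match need not be exact, only close enough to route the upload into a human-review queue rather than publish it automatically.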