SFGate, San Francisco
In the now-viral video, an activist and an Alameda County sheriff’s deputy enter a tense discussion over a sign at a protest for Steven Taylor, a Black man shot and killed by a San Leandro police officer at a Walmart in 2020.
The protest took place Thursday in front of a hearing for Jason Fletcher, the officer charged with involuntary manslaughter for the shooting.
“You just said [the sign] was a tripping hazard,” says the protester, James Burch with the Anti Police-Terror Project, to the officer.
“You can’t keep twisting this … let’s go back to the beginning,” says the officer, Sgt. D. Shelby. “[The sign] was on the walls, right, and you removed it from the walls.”
As Burch responds to the deputy, Shelby pulls out his phone and taps the screen a few times.
Then, suddenly, a mechanical snare sound begins blaring from the deputy’s cellphone. Taylor Swift’s voice comes in, “Nice to meet you, where you been?” — the officer is playing Swift’s Grammy-nominated smash “Blank Space.”
“Are we having a dance party now?” Burch asks.
“You can record all you want, I just know that it can’t be posted to YouTube,” the deputy later responds.
The clip has received intense scrutiny since the Oakland-based police abolition group Anti Police-Terror Project posted the video to Twitter. KTVU reports that Shelby was reported internally and is being investigated. The practice was also condemned by an Alameda County sheriff’s spokesman.
(Burch did not immediately provide comment to SFGATE.)
The strategy is increasingly being used by police departments and sheriff’s offices nationwide to deter activist groups and civilians alike from sharing recordings of officers and other officials.
An incident went viral in February in which a Beverly Hills police officer blasted “Santeria” by the Southern California band Sublime. Another incident in March showed an Illinois sheriff’s office worker playing a Blake Shelton song to work around being recorded.
And cases like these will likely just become more common, experts told SFGATE.
“Unfortunately, using copyright as a tool of censorship is a longstanding and effective technique,” said Katharine Trendacosta of the San Francisco-based digital rights organization Electronic Frontier Foundation. “It’s even easier to do on platforms like YouTube, which have employed automated filters that preemptively remove material when a match is detected.”
How — and why — are law enforcement officers turning to this? The tactic exploits YouTube, Instagram and other social media platforms’ automatic filters, which take down videos containing copyrighted material, including pop songs.
“This is a crude exploit of automated content filtering systems and shows the real danger of so-called ‘upload filters’ that block content from appearing on websites with no human review or discretion,” said Erik Stallman, an assistant clinical professor of law at UC Berkeley, in an email to SFGATE.
For instance, once YouTube’s system (called Content ID) identifies that a song by Swift or another musician is playing in a video, it can take the video down with little possibility of recourse, mute it, or demonetize it entirely.
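The matching step that makes this tactic work can be sketched in miniature. The code below is a deliberately naive illustration, not YouTube's actual Content ID algorithm: it hashes overlapping windows of an audio signal into "fingerprints" and flags an upload when enough of its fingerprints appear in a copyrighted reference track. The function names, window sizes, and the 50% threshold are all illustrative assumptions.

```python
# Hypothetical sketch of an automated upload filter: fingerprint the
# upload's audio and compare against a reference database. This is an
# assumption-laden toy model, not how Content ID actually works.
import hashlib

def fingerprint(samples, window=4, hop=2):
    """Hash overlapping windows of audio samples into a set of fingerprints."""
    prints = set()
    for i in range(0, len(samples) - window + 1, hop):
        chunk = ",".join(f"{s:.2f}" for s in samples[i:i + window])
        prints.add(hashlib.sha1(chunk.encode()).hexdigest()[:8])
    return prints

def matches_reference(upload, reference, threshold=0.5):
    """Flag the upload if enough of its fingerprints occur in the reference."""
    up, ref = fingerprint(upload), fingerprint(reference)
    if not up:
        return False
    return len(up & ref) / len(up) >= threshold

# A copyrighted "song" embedded in the middle of a protest recording:
song = [0.1, 0.5, -0.3, 0.8, 0.2, -0.6, 0.4, 0.9, -0.1, 0.3]
video_audio = [0.0, 0.0] + song + [0.0, 0.0]

print(matches_reference(video_audio, song))  # True: the filter flags it
```

The key point the sketch captures is that the match is purely mechanical: the filter sees only that audio fingerprints overlap, with no notion of context, newsworthiness, or fair use, which is exactly the gap the officers in these incidents exploit.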
This has stopped YouTube content creators and other influencers from featuring popular music in their videos lest they lose money or viewership. Music is uniquely susceptible to copyright claims, Trendacosta said, since “filters are oversensitive to it.”
“It is so easy to just play a song when you see someone pointing a camera at you,” she added.
And it’s one of several algorithmic flaws on YouTube; in other cases, videos by journalists and activists exposing hate speech have been taken down for violating the site’s own anti-hate-speech policies. YouTube has done little to rectify this, Stallman argues.
“Unfortunately, refinements to upload filters that make them harder to evade are also making them easier to abuse, as happened here,” he said.
And while YouTube does offer an appeals process, it is often time-consuming and not guaranteed to succeed, even in cases such as this, Stallman says.
But this sort of recording — a newsworthy event that arguably qualifies as fair use — should be protected and kept online, both experts tell SFGATE. The algorithmic filters used by digital media companies, however, aren’t sophisticated enough to take fair use into consideration.
“Having them prevent video from being seen is a huge threat to legal expression online,” Trendacosta said.
Notably, as of Friday morning, the video remains live on the Anti Police-Terror Project’s YouTube and Twitter accounts.
Still, copyright law likely never anticipated this consequence: that officials could use copyrighted media, especially music, to keep footage of public conflicts from being shared online.
Both Stallman and Trendacosta argue that rather than addressing these issues, YouTube and other social media platforms are merely intensifying their copyright campaigns.
“I’m especially concerned because we keep seeing calls for more of these filters, not fewer, and for them to be more restrictive, not less,” Trendacosta said. “That would make them even easier to game by anyone trying to prevent well-earned criticism of their actions.”
(c)2021 SFGate, San Francisco
Visit SFGate, San Francisco at www.sfgate.com
Distributed by Tribune Content Agency, LLC.