Spotify’s Joe Rogan Saga Highlights Podcast Moderation Challenges
Feb 22 (Reuters) – At an advertising industry conference in New York this month, one of the main architects of Spotify’s (SPOT.N) podcasting strategy described what she saw as the biggest challenge facing platforms: how to moderate content.
Dawn Ostroff, the chief content and advertising business officer and television veteran who helped bring American podcaster Joe Rogan and other top talent to Spotify, was asked about the backlash over COVID-19 misinformation spread on his podcast as Neil Young and other artists pulled their music in protest. She said companies faced a “moderation versus censorship dilemma” and there was “no silver bullet”.
Content moderation has long been a thorny challenge for online platforms. But while social media companies like Meta’s Facebook (FB.O) and Twitter (TWTR.N) have come under pressure to be more transparent about moderation and to invest more in human and artificial intelligence review systems, podcasting has often flown under the radar.
The backlash to “The Joe Rogan Experience”, which Spotify licensed in an exclusive deal worth more than $100 million in 2020, has heightened scrutiny of Spotify’s overall approach to moderation as it evolves from a music streaming service into a podcast giant and investor in original content, industry professionals and researchers say.
It also highlights the podcast industry’s historically hands-off approach to moderation, in part due to its open and fragmented nature.
Podcasts are hosted on various platforms and distributed via RSS feeds or hosting services to directory apps such as Apple Podcasts or Spotify, which display the catalog to listeners. The sheer volume of material – millions of podcasts, many with hour-long episodes – and the technical challenges of transcribing and analyzing audio make moderation even more difficult.
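To make that distribution model concrete, here is a minimal Python sketch of how a directory app could ingest a single show’s RSS feed and list its episodes. It is illustrative only – not Spotify’s or Apple’s actual ingestion code – uses only the standard library, and the feed URL is a placeholder.

```python
# Minimal sketch of how a podcast directory app might ingest one RSS feed.
# Illustrative only; the feed URL is a placeholder for any public podcast feed.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"  # placeholder feed address

def fetch_episodes(feed_url):
    """Download an RSS feed and return (show_title, list of episode dicts)."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.fromstring(resp.read())
    channel = root.find("channel")
    show_title = channel.findtext("title", default="")
    episodes = []
    for item in channel.findall("item"):
        enclosure = item.find("enclosure")  # the tag that points at the audio file
        episodes.append({
            "title": item.findtext("title", default=""),
            "audio_url": enclosure.get("url") if enclosure is not None else None,
            "published": item.findtext("pubDate", default=""),
        })
    return show_title, episodes

if __name__ == "__main__":
    show, eps = fetch_episodes(FEED_URL)
    print(f"{show}: {len(eps)} episodes")
```

Because any host can publish a feed like this and any app can read it, no single gatekeeper sits between a show and its listeners – the openness that industry figures cite when explaining why moderation has been so fragmented.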
Spotify first added podcasts in 2015 and made a major push into the medium from 2019, buying podcast networks Gimlet and Anchor and spending hundreds of millions on exclusive content deals with celebrities like Kim Kardashian and former US President Barack Obama.
It was only last month, after its podcast library had reached 3.6 million titles and in response to the Rogan controversy, that Spotify published its platform rules in full online. The company said the policies had been actively enforced for years and that more than 20,000 episodes had been removed for COVID-19 misinformation during the pandemic.
Unlike Facebook or Twitter, Spotify does not publish transparency reports that would offer public accountability for content removal. A Spotify spokesperson said it was working towards this goal.
Spotify chief executive Daniel Ek recently told investors he knows his podcasting strategy will “test our teams in new ways.” He said he was “implementing several ground-breaking measures to combat misinformation and ensure greater transparency.”
Audio moderation usually involves converting speech to text and using automated tools to filter content or flag moments for human review, but the process is time-consuming and often inaccurate, experts said. Nuances of tone, slang and terminology that evolve differently from language to language, and the need to weigh context across long discussions all add to the complexity.
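As a rough illustration of that pipeline – and not a description of Spotify’s system – the sketch below transcribes an episode with the open-source Whisper speech-to-text model and flags timestamped segments containing watchlist phrases for human review. The watchlist terms, model choice and file name are assumptions made for the example.

```python
# Illustrative transcribe-then-filter moderation pass, assuming the open-source
# openai-whisper package (pip install openai-whisper). Not Spotify's pipeline.
import whisper

WATCHLIST = {"miracle cure", "vaccine microchip"}  # hypothetical flagged phrases

def flag_segments(audio_path, watchlist=WATCHLIST):
    """Transcribe an episode and return timestamped segments for human review."""
    model = whisper.load_model("base")        # small general-purpose ASR model
    result = model.transcribe(audio_path)     # returns full text plus timed segments
    flagged = []
    for seg in result["segments"]:
        text = seg["text"].lower()
        if any(term in text for term in watchlist):
            flagged.append((seg["start"], seg["end"], seg["text"].strip()))
    return flagged

# Example usage (audio file name is a placeholder):
# for start, end, text in flag_segments("episode_0001.mp3"):
#     print(f"{start:7.1f}s-{end:7.1f}s  {text}")
```

Keyword matching of this kind misses tone, sarcasm and context – the very nuances experts describe – which is why its output typically serves only as a queue for human reviewers rather than an automated takedown decision.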
Audio moderation is “a perfect storm,” said Mark Little, co-founder of Kinzen, a company hired by Spotify to alert it to brewing issues around election integrity, misinformation and hate speech across the platform.
“You’re up against something that’s uniquely complex, having this volume…, having a format that defies the kind of textual analysis we’ve relied on in the past.”
In an interview with Reuters on Feb. 2, Ek called Spotify’s global content moderation team a “really big operation.” But he and a spokesperson declined to quantify its investment in content moderation, the number of employees working on platform security or say what technologies it uses.
Spotify uses third-party reviewers to help identify harmful content. Its content team receives advice from a dozen partners who specialize in hate speech, harassment, child exploitation, extremism and disinformation, the spokesperson said.
These consultants, whom Spotify mostly declined to name, provide its internal team – which makes all content moderation decisions – with information, alerting it to potential dangers and helping it detect abuse.
Spotify added 1.2 million podcasts to its catalog last year alone. As the content available on top platforms increases and new show deals are signed, more robust moderation should be built in, some industry experts have argued.
“I’m really hesitant to fall back on ‘it’s hard,’ because we know it’s hard. Is it as hard as building a multi-billion dollar multinational organization that’s basically…the go-to audio app?” said Owen Grover, former CEO of podcast app Pocket Casts.
HOSTING SITES
The Rogan saga raises questions both about Spotify’s responsibilities when it grants exclusive licenses for shows and about the broader challenge that moderation poses for the podcasting industry.
Podcasts are usually uploaded to hosting platforms and distributed to directory apps like Apple or Google Podcasts or Amazon Music via RSS feeds or the hosting services.
The patchwork of hosting sites and directory apps dilutes accountability and makes enforcement uneven for podcasts a platform does not own, industry experts said. Spotify itself, for example, does not host podcasts, although it owns hosting platforms like Anchor, the home of Rogan’s podcast, and Megaphone.
Podcasters whose shows are not hosted on a Spotify-owned platform submit them to Spotify for review before they appear in the app. But “a lot of people don’t even realize how easy it is to get something on Spotify or Apple,” said Nick Hilton, who runs Podot, a UK-based independent podcast production company, and who said Spotify’s approval process can take only a few minutes.
Several hosting platforms said in interviews that they have neither the ability nor the desire to vet all the content they host. “We don’t act as moderators,” said Blubrry CEO Todd Cochrane, though he said he responds to takedown requests, citing the removal of a white supremacist group’s content as an example.
“When we get wind of something…we just grab a bag of chips and crank it up to 1.5x speed and sit back and listen,” said Mike Kadin, CEO of hosting platform RedCircle, which relies heavily on user reports or signals like racist artwork. “To transcribe every piece of podcast content would be prohibitively expensive.”
The open and accessible nature of podcasting is a defining feature of the medium, industry professionals and researchers said, but greater scrutiny and advances in moderation tools could lead to more investment in content review.
“We will react here to any changes in the market,” said Daniel Adrian, general counsel for podcast platform Acast. “We don’t know where this will end.”
Reporting by Elizabeth Culliford in New York, Dawn Chmielewski in Los Angeles and Supantha Mukherjee in Stockholm; Editing by Kenneth Li and Richard Chang
Our standards: The Thomson Reuters Trust Principles.