
Spotify’s Joe Rogan saga spotlights podcast moderation challenges

By Elizabeth Culliford, Dawn Chmielewski and Supantha Mukherjee

(Reuters) - At an ad industry conference in New York this month, one of the key architects of Spotify’s podcasting strategy outlined what she saw as the biggest challenge facing platforms: how to moderate content.

Chief Content and Advertising Business Officer Dawn Ostroff, the television veteran who had helped bring U.S. podcaster Joe Rogan and other top talent to Spotify, had been asked about the backlash to COVID-19 misinformation spread on his podcast as Neil Young and other artists yanked their music in protest. She said companies faced a “dilemma of moderation versus censorship” and there was “no silver bullet.”

Content moderation has been a thorny challenge for online platforms. While social media companies like Meta’s Facebook and Twitter have faced pressure to be more transparent over moderation and ramp up investment in human and artificial intelligence review systems, podcasting has often flown under the radar.

The backlash over “The Joe Rogan Experience,” which Spotify licensed in a more than $100 million exclusive deal in 2020, heightens scrutiny on Spotify’s overall approach to moderation as it evolves from a music streaming service into a podcast giant and investor in original content, industry professionals and researchers said.

It also turns the spotlight on the podcast industry’s historically hands-off approach to moderation, partly a result of its open and fragmented nature.

Different podcasts are hosted by various platforms and sent through RSS feeds or services to directory apps like Apple Podcasts or Spotify, which catalog shows for listeners. The sheer volume of material – millions of podcasts and hours-long episodes – and the technical challenges of transcribing and analyzing audio make moderating even tougher.

Spotify first added podcasts in 2015 and made a major push into the medium from 2019, buying podcast networks Gimlet and Anchor and spending hundreds of millions on exclusive content deals with celebrities like Kim Kardashian and former U.S. President Barack Obama.

Only last month, as its podcast library swelled to 3.6 million, did Spotify publish its platform rules in full online, in response to the Rogan controversy. The company said the policies had been actively enforced for years and that more than 20,000 episodes had been removed for COVID-19 misinformation during the pandemic.

Unlike Facebook or Twitter, Spotify does not issue transparency reports that would offer a public accounting of content removals. A Spotify spokesman said it was working toward this goal.

Spotify Chief Executive Daniel Ek recently told investors he knew its podcasting strategy would “test our teams in new ways.” He said it was “implementing several first-of-its-kind measures to help combat misinformation and provide greater transparency.”

Moderating audio generally involves converting it to text and using automated tools to filter content or identify moments for human review, but it is time-intensive and inexact, experts said. The nuances of speakers’ tones, evolving terms and slang across languages, and the need to contextualize within longer discussions all contribute to the complexity.
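A minimal sketch of that text-based workflow is below, assuming the audio has already been run through a speech-to-text step; the watch-list terms and helper names are hypothetical illustrations, not any platform’s actual tooling.

```python
# Rough sketch of the pipeline experts describe: take a transcript produced
# by a speech-to-text step, scan it with simple automated rules, and queue
# anything suspicious for human review. The watch-list terms and data shapes
# are illustrative assumptions, not any platform's real system.
import re
from dataclasses import dataclass

WATCH_LIST = ["miracle cure", "vaccine microchip"]  # illustrative terms only


@dataclass
class ReviewItem:
    episode_id: str
    matched_term: str
    excerpt: str  # surrounding text so a reviewer can judge the context


def flag_for_review(episode_id: str, transcript: str) -> list[ReviewItem]:
    """Return transcript excerpts that contain watch-list terms."""
    items = []
    for term in WATCH_LIST:
        for match in re.finditer(re.escape(term), transcript, re.IGNORECASE):
            start = max(0, match.start() - 80)
            excerpt = transcript[start:match.end() + 80]
            items.append(ReviewItem(episode_id, term, excerpt))
    return items  # a human reviewer makes the final call on each flagged item
```

Even with filters like these, the flagged excerpts still require human judgment of tone and context, which is part of why experts describe the process as time-intensive and inexact.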

Audio moderation is “a perfect storm,” said Mark Little, co-founder of Kinzen, a firm contracted by Spotify to alert it to brewing problems concerning election integrity, misinformation and hate speech across platforms.

“You’re faced with something that is uniquely complex, having this volume …, having a format that defies the kind of textual analysis that we’ve relied upon in the past.”

In a Feb. 2 Reuters interview, Ek called Spotify’s global content moderation team a “very big operation.” But he and a spokesman declined to quantify its investment in content moderation, how many employees work on platform safety, or say what technologies it uses.

Spotify uses third-party reviewers to help identify harmful content. Its content team gets advice from a dozen partners with expertise in hate speech, harassment, child exploitation, extremism and misinformation, the spokesman said.

These consultants, most of whom Spotify declined to name, provide its in-house team – which makes all content moderation decisions – with insights, alert it to potential dangers and help it detect abuse.

Spotify added 1.2 million podcasts to its catalog last year alone. As the content available on top platforms swells and new show deals are inked, more robust moderation should be built in, some industry experts argued.

“I’m really hesitant to just fall back on ‘it’s hard,’ because we know it’s hard. Is it as hard as creating a multi-billion-dollar, multinational organization that basically is … the go-to audio app?” said Owen Grover, former CEO of podcast app Pocket Casts.

WEB OF SERVICES

The Rogan saga raises questions both about Spotify’s duties when it exclusively licenses shows and about the broader challenge moderation poses for the podcasting industry.

Podcasts are generally uploaded to hosting platforms and distributed to directory apps like Apple Podcasts, Google Podcasts or Amazon Music via RSS feeds or the hosting services.
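To illustrate that flow, here is a minimal sketch of how a directory app might ingest a show’s public RSS feed, assuming Python’s feedparser library and a hypothetical feed URL; it is not how Apple, Google or Amazon actually implement their catalogs.

```python
# Minimal sketch of how a directory app could ingest a show: poll the
# podcast's public RSS feed and catalog each episode's metadata and audio
# enclosure. The feed URL is a placeholder for illustration.
import feedparser  # third-party: pip install feedparser

FEED_URL = "https://example.com/podcast/feed.xml"  # hypothetical feed


def catalog_show(feed_url: str) -> list[dict]:
    feed = feedparser.parse(feed_url)
    episodes = []
    for entry in feed.entries:
        enclosures = entry.get("enclosures", [])
        episodes.append({
            "show": feed.feed.get("title", ""),
            "episode": entry.get("title", ""),
            "published": entry.get("published", ""),
            # the enclosure points at the audio file on the hosting platform
            "audio_url": enclosures[0].get("href", "") if enclosures else "",
        })
    return episodes
```

Because the feed and the audio files live with the hosting platform, a directory app that drops a show from its catalog does not remove the underlying content, one reason enforcement across the ecosystem can be uneven.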

The patchwork nature of hosting sites and directory apps dilutes responsibility and makes for spotty enforcement on non-exclusive podcasts, industry experts said. Spotify, for example, does not host podcasts, though it owns hosting platforms like Anchor, home of Rogan’s podcast, and Megaphone.

Podcasters whose shows are not hosted on a Spotify-owned platform submit them to Spotify for review before they appear on the app. But “a lot of people don’t even realize how simple it is to get something through on Spotify or Apple,” said Nick Hilton, who runs Podot, a UK-based independent podcast production company. He said the Spotify approval process can take only a few minutes.

Several hosting platforms said in interviews they did not have the ability or desire to vet all the content they host. “We don’t act as moderators,” said Blubrry CEO Todd Cochrane, though he said the company responds to takedown requests, citing as an example the removal of measurement services from a white supremacist group.

“When we get wind of something … we’d just get a bag of potato chips and turn the speed up to 1.5x and sit down and listen,” said Mike Kadin, CEO of hosting platform RedCircle, which largely relies on user reports or signals like racist artwork. “Transcribing every piece of podcast content would be prohibitively expensive.”

Podcasting’s open, accessible nature is a key feature of the medium, industry professionals and researchers said, but greater scrutiny and advancements in moderation tools could lead to more investment in reviewing content.

“We will react to any market changes here,” said Daniel Adrian, general counsel of podcast platform Acast. “We don’t know where this will end up.”

(Reporting by Elizabeth Culliford in New York, Dawn Chmielewski in Los Angeles and Supantha Mukherjee in Stockholm; Editing by Kenneth Li and Richard Chang)
