An Investigation into Self-Generated Child Sexual Abuse Material Networks on Social Media

Stanford’s Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory (SIO) Research Manager Renée DiResta and Chief Technologist David Thiel to discuss a new report on a months-long investigation into the distribution of illicit sexual content by minors online.

Large Networks of Minors Appear to be Selling Illicit Sexual Content Online

The Stanford Internet Observatory (SIO) published a report last week with findings from a months-long investigation into the distribution of illicit sexual content by minors online. The SIO research team identified a large network of accounts claiming to be minors, likely teenagers, who are producing, marketing and selling their own explicit content on social media.

The investigation was informed by a tip from The Wall Street Journal, which provided a list of common terms and hashtags indicating the sale of “self-generated child sexual abuse material” (SG-CSAM). SIO identified a network of more than 500 accounts advertising SG-CSAM with tens of thousands of likely buyers.

Using only public data, this research uncovered and helped resolve basic safety failings in Instagram’s reporting system for accounts suspected of child exploitation and in Twitter’s system for automatically detecting and removing known CSAM.
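
Detection of known CSAM of the kind referenced above typically works by hashing uploaded media and comparing the result against databases of previously identified material (for example, PhotoDNA perceptual hashes or hash lists shared via NCMEC). The sketch below is a minimal, hypothetical illustration of that hash-list check, not a description of any platform’s actual pipeline; it uses exact SHA-256 matching rather than the perceptual hashing production systems rely on, and the known_hashes.txt file and function names are assumptions made for illustration only.

```python
# Minimal, hypothetical sketch of a known-hash matching check (illustration only).
# Production systems such as PhotoDNA use perceptual hashes that tolerate
# re-encoding and cropping; exact SHA-256 matching is shown here only to
# illustrate the general "compare uploads against a known list" flow.
import hashlib
from pathlib import Path


def load_known_hashes(path: str) -> set[str]:
    """Load a newline-delimited list of hex digests of known material."""
    return {
        line.strip().lower()
        for line in Path(path).read_text().splitlines()
        if line.strip()
    }


def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file's bytes, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(upload_path: str, known_hashes: set[str]) -> bool:
    """Return True if the uploaded file's hash appears in the known-hash list."""
    return sha256_of_file(upload_path) in known_hashes


if __name__ == "__main__":
    known = load_known_hashes("known_hashes.txt")  # hypothetical hash list
    if is_known_match("upload.jpg", known):        # hypothetical upload
        print("Match against known-hash list: block and report.")
    else:
        print("No match against known-hash list.")
```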

Most of the work to address CSAM has focused on adult offenders who create the majority of content. These findings highlight the need for new countermeasures developed by industry, law enforcement and policymakers to address sextortion and the sale of illicit content that minors create themselves.

Front-Page Wall Street Journal Coverage

  • A Wall Street Journal article first covered Twitter’s lapse in safety measures to prevent known CSAM from appearing on the site, and the importance of researcher access to public social media data for identifying and helping address such issues. - Alexa Corse/ The Wall Street Journal
  • Instagram was the focus of a larger Wall Street Journal investigation, based in part on SIO’s research findings. The app is currently the most significant platform for these CSAM networks, connecting young sellers with buyers through recommendation features, hashtag searches and direct messaging. - Jeff Horwitz, Katherine Blunt/ The Wall Street Journal

Bipartisan Concern and Calls for Social Media Regulation

The investigation sparked outrage across the aisle in the U.S. and grabbed the attention of the European Commission as the European Union prepares to enforce the Digital Services Act for the largest online platforms later this summer.

  • Thierry Breton, the top EU official for trade and industry regulation, announced that he will meet with Meta CEO Mark Zuckerberg later this month at the company’s Menlo Park headquarters to discuss the report and demand that the company take action.

In Congress, House Energy and Commerce Democrats and GOP Senators were most outspoken about taking action to address the concerning findings.

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!

All episodes

Alex and Evelyn repeat the now-annual tradition of recording the podcast in front of probably their entire active listener base. They are joined by David Thiel, Brian Fishman, and Daphne Keller, to say goodbye to Thierry Breton and RT's accounts on Meta, talk about Zuckerberg's retreat from politics, and all the developments in the land of the First Amendment and platform regulation.…
 
Alex and Evelyn are joined by Carlos Affonso Souza, a Professor of Law at Rio de Janeiro State University and the Director of the Institute for Technology & Society in Rio de Janeiro, to talk about Brazil's ban of X, the local legal and political context, and how this is similar to or different from other showdowns between regulators and American tech platforms.…
 
Alex and Evelyn discuss the arrest and charges against Telegram's CEO, Pavel Durov, in France, what we do and don't know, and what it means for the future of platform regulation, with Frédérick Douzet, Professor at the French Institute of Politics and the director of GEODE, and Daphne Keller, director of the Program on Platform Regulation at Stanford’s Cyber Policy Center.…
 
Alex and Evelyn talk about Trump's return to X and other platforms, Thierry Breton's attempt to make it all about him, the hack and leak of Trump's campaign, the FBI's new rules around communicating with platforms about foreign interference, Apple imposing its 30% commission on Patreon, and a small little sporting event that happened recently.…
 
Evelyn sat down with Professor Genevieve Lakier, of the University of Chicago Law School, to discuss the Supreme Court's decision regarding the Texas and Florida social media laws. Not the worst opinion the Supreme Court issued on July 1, but predictably there's a lot to complain about anyway.
 
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
  • The attention-grabbing 404 Media headline “Has Facebook Stopped Trying?” could be on to something. Alex discusses significant disinvestment in trust and safety at Facebook with lots of junk spreading (such as AI-generated Shrimp Jesus) and a sextortion challenge. - Jason Koebler/ 404 Media
  • YouTube announced it is testing a feature for users to add notes under videos with context or fact checks. - YouTube
  • The U.S. military ran a covert anti-vaccine influence operation on social media intended to discredit China’s COVID vaccine in the Philippines. - Chris Bing, Joel Schectman/ Reuters
  • U.S. Surgeon General Vivek Murthy called for a warning label on social media platforms in a New York Times opinion essay. - Vivek Murthy/ The New York Times, “The Daily”, Erin Burnett/ CNN
  • Not everyone agrees with his recommendation (not to mention the First Amendment or existing evidence). - Clay Calvert/ AEI, Deidre McPhillips/ CNN, J. Nathan Matias, Janet Haven/ Tech Policy Press, Mike Masnick/ The Daily Beast, Caroline Mimbs Nyce/ The Atlantic
  • They reference this report from the National Academies of Sciences, Engineering, and Medicine - National Academies
  • And this one pager
  • New York state lawmakers passed, and Gov. Kathy Hochul signed into law, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, prohibiting social media companies from using “addictive” algorithmic feeds for minors under 18 without parental consent. - Anthony Izaguirre/ Associated Press, Carolyn Thompson/ Associated Press, Mark Wilson/ Fast Company, Kat Tenbarge/ NBC News, Austin Jenkins/ Pluribus News, Anthony Ha/ TechCrunch, Common Sense Media, Governor of New York
  • Negotiations at the end of the state legislative session on June 6 limited a restriction on overnight notifications and removed the right to take private legal action against social media companies for alleged violations.
  • Tech trade associations oppose the legislation, arguing it is unconstitutional, with free speech restrictions that make children less safe with less curation of social media feeds. - Chamber of Progress, NetChoice

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!…
 
Alex and Evelyn sit down with the authors of two recently released books about our online information ecosystem and what to do about it: Annalee Newitz, author of Stories are Weapons: Psychological Warfare and the American Mind, and Renee DiResta, author of Invisible Rulers: The People Who Turn Lies into Reality.…
 
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
  • OpenAI published its first transparency report on covert influence operations using the company’s AI models, finding the tools were used for existing campaigns by Russia, China, Iran and Israel with limited reach. - Ina Fried/ Axios, OpenAI
  • In very related news, Meta announced it removed foreign influence operations using AI-generated content. - Aisha Counts/ Bloomberg News, Margarita Franklin, Lindsay Hundley, Mike Torrey, David Agranovich, Mike Dvilyanski/ Meta
  • Meta claims it is still able to detect influence operations using AI-generated content, but recent Stanford Internet Observatory research found such content is being widely used for spam that generates engagement with surreal or emotional content.
  • Both Meta and OpenAI point fingers at Israeli actors for using generative AI in influence operations, and Meta claimed a victory in stopping the infamous Russian Doppelganger operation.
  • California legislators are considering dozens of bills with AI regulations. One of the most prominent and controversial is SB 1047, the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act. - Jeremy B. White/ Politico
  • What it Would Do: The bill would create sweeping AI safety regulations against “hazardous capabilities” and a Frontier Model Division of the California Department of Technology to set those new rules for the most powerful AI models, including a “kill switch.” The bill also includes CalCompute, a public cloud computing cluster for AI safety research.
  • The Politics: The bill was introduced by State Senator Scott Wiener, an ambitious Democrat seeking to succeed former House Speaker Nancy Pelosi. While state lawmakers have introduced many ambitious AI safety regulations, Governor Gavin Newsom is urging a focus on innovation to protect the state’s tech industry. - Jeremy B. White/ Politico

TikTok Tick-Tock
  • TikTok is funding a lawsuit brought by a diverse group of eight creators against the federal government’s divest-or-ban measure. The new suit was combined with the lawsuit brought by TikTok and parent company ByteDance, with an expedited schedule to hear the case in September. - Josephine Rozzelle/ CNBC, David Shepardson/ Reuters, Julia Shapero/ The Hill, Taylor Lorenz, Drew Harwell/ The Washington Post
  • The creators include a cattle rancher, cookie baker, feminist activist, college football coach and a rapping conservative commentator. Their challenge focuses on First Amendment free speech rights.
  • The tech trade association NetChoice booted TikTok earlier in May following pushback from Congressional offices that warned of an investigation into organizations tied to TikTok. - Daniel Lippman, Brendan Bordelon/ Politico
  • In a possible preview of what to expect in Murthy v. Missouri, the Supreme Court released a unanimous decision in NRA v. Vullo that found a New York state official likely violated the free speech rights of the National Rifle Association by pressuring banks and insurers to cut ties with the organization after the Parkland high school shooting. - Justin Jouvenal/ The Washington Post

Down Under
  • An Australian court rejected an eSafety Commissioner global removal order for X to hide content with video of a stabbing attack at a Sydney church. eSafety has since dropped the case against X. - Jake Evans, Jordyn Butler/ ABC News (Australia), Rod McGuirk/ Associated Press, Tanvi Nair/ Australian Institute of International Affairs, Sumathi Bala/ CNBC, Josh Taylor/ The Guardian
  • The court ruling acknowledged that the order would likely “be ignored or disparaged in other countries.”

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!…
 
Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:

TikTok Tick-Tock
  • A law requiring TikTok parent company ByteDance to divest or face a U.S. ban was passed and signed into law as part of a foreign aid and national security funding package. - Casey Newton/ Platformer, Rebecca Kern/ Politico, Mike Scarcella/ Reuters, John Perrino/ Tech Policy Press, Sapna Maheshwari, David McCabe/ The New York Times, Drew Harwell/ The Washington Post, Cristiano Lima-Strong/ The Washington Post, @TikTokPolicy

Netzwerkdurchsetzungsgesetz (EU Policy Corner)
  • The European Commission opened formal proceedings against Meta on potential DSA violations, including the “deprecation and planned discontinuation of CrowdTangle” happening “without an adequate replacement” ahead of the European elections. - Jon Brodkin/ Ars Technica, Clothilde Goujard/ Politico, Clothilde Goujard, Aoife White/ Politico, Natasha Lomas/ TechCrunch, Lisa O'Carroll/ The Guardian, Adam Satariano/ The New York Times, European Commission, Mathias Vermeulen (@mathver)
  • The probe will also investigate how foreign influence operations are spreading on Facebook and Instagram, how Meta is handling political advertising and content recommendations, and issues with flagging and removing illegal content.
  • Meta’s Threads announced it wouldn’t recommend political commentary earlier this year, and recent research found the pro-Kremlin Doppelganger network is buying Facebook ads ahead of the EU election. - Clothilde Goujard/ Politico, Taylor Lorenz, Naomi Nix/ The Washington Post

  • The REPORT Act was signed into law by President Biden on May 7 after passing both chambers of Congress on April 29. The law makes common sense updates to the nation's online child abuse reporting system and expands reporting requirements to include instances of child grooming and trafficking. - Dave Williams/ Capitol Beat, Julie Tsirkin/ NBC News, Lauren Forristal/ TechCrunch, Amanda Silberling/ TechCrunch, Kate Klonick, Margo Williams/ The Klonickles, The White House, @HouseFloor, Office of Congresswoman Laurel Lee, Office of Senator Jon Ossoff, Office of Senator Marsha Blackburn
  • Georgia lawmakers passed the Protecting Georgia's Children on Social Media Act of 2024, SB 351, requiring age verification and parental consent for teens under 16 to create social media accounts. The bill also updates school education requirements to cover online safety. - Brenna Goth/ Bloomberg Law, FOX 5 Atlanta

Legal Corner
  • SCOTUS denied an application for a stay of the Texas age verification law for adult sites. The cert petition is still pending and they didn’t give a reason, but it’s still kind of amazing given the precedent is so firmly against them and you’d normally expect a stay when First Amendment rights are threatened. - Andrew Chung/ Reuters, Adam Liptak/ The New York Times, Free Speech Coalition

Sports Corner
  • Alex said he is “excited” to root for the New York Knicks in the NBA playoffs with his Sacramento Kings failing to make the playoffs. If only there was more New York sports coverage. - Chris Herring/ ESPN
  • Despite calling New York sports fans “the worst,” his show notes writer says there is still time to be a bandwagon Jalen Brunson fan.

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!…
 
Stanford’s Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory’s Shelby Grossman to discuss SIO’s just-released report on the Strengths and Weaknesses of the Online Child Safety Ecosystem. Read the report here.

SIO is also calling for presentation proposals for its annual Trust and Safety Research Conference. Proposals are due April 30. Details are here: https://io.stanford.edu/conference

Join the conversation and connect with Evelyn and Alex on your favorite social media platform that doesn’t start with “X.”

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!…
 
Stanford’s Evelyn Douek and Alex Stamos are joined by University of Washington professor Kate Starbird to discuss research on election rumors.
  • Kate Starbird is an associate professor at the University of Washington in the Department of Human Centered Design & Engineering, where she is also a co-founder of the Center for an Informed Public. - University of Washington
  • House Judiciary Committee Kate Starbird interview transcript
  • House Judiciary Committee Alex Stamos interview transcript

Sports Corner
  • Noted American sports expert Evelyn Douek discusses the NCAA women’s basketball championship in this slam dunk segment. Dawn Staley’s South Carolina Gamecocks defeated superstar Caitlin Clark’s Iowa Hawkeyes 87-75 on Sunday in what is expected to be the most watched women’s basketball game of all time, with an average ticket price hovering around $500. - Jill Martin/ CNN, Alexa Philippou/ ESPN

Join the conversation and connect with Evelyn and Alex on your favorite social media platform that doesn’t start with “X.”

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!…
 
SHOW NOTES

Stanford’s Evelyn Douek and Alex Stamos weigh in on the latest online trust and safety news and developments:
  • X this week had its lawsuit against the Center for Countering Digital Hate thrown out by a California district court. It’s a good and important win for free speech. - Emma Roth/ The Verge
  • A Kremlin-linked group was spreading divisive stories about Kate Middleton as online rumors swirled about her whereabouts. Why? - Mark Landler and Adam Satariano/ The New York Times
  • In the aftermath of the collapse of Baltimore's Francis Scott Key Bridge, the destruction of X as a platform for useful information about breaking news was all too clear. - A.W. Ohlheiser/ Vox
  • Meta is shutting down its transparency tool, CrowdTangle. Brandon Silverman joins to talk about the tool and what this means for the future of platform transparency. - Vittoria Elliott/ Wired
  • Brandon’s substack is Some Good Trouble
  • A group of civil society organizations and researchers wrote an open letter objecting to Meta’s decision - Mozilla
  • GW’s tracker of Platform Transparency Tools & The Brussels Effect

Join the conversation and connect with Evelyn and Alex on your favorite social media platform that doesn’t start with “X.”

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!…
 