The Fight Against Instagram Porn Continues
It's been less than a year since controversy erupted over charges that Facebook (now Meta) was harming girls through its platform Instagram. Following bombshell revelations from whistleblower Frances Haugen, Mark Zuckerberg promised to introduce measures to protect girls. But after little to no progress was made to reform the picture-sharing app, a series of new lawsuits has been filed by citizens who are demanding change. This article reviews key events from last year and brings you up to date on the latest legal developments in this ongoing controversy.
How “The Facebook Files” Put Instagram under Scrutiny
Instagram came under scrutiny last year when whistleblower Frances Haugen sent a trove of company documents to the Securities and Exchange Commission and the Wall Street Journal. The WSJ went public with the revelations in September 2021, publishing a series of nine reports, known collectively as “The Facebook Files.”
Using data from Haugen, but without revealing her name, the WSJ showed that Facebook’s own researchers had conducted focus groups, online surveys, and diary studies, hoping to discover whether Instagram was toxic for teenage girls. Facebook learned that features unique to Instagram, and not shared by other social media platforms, were making girls feel bad about their bodies, leading to self-harm and even suicide. The WSJ also reported that the company was aware in 2019 that the platform was being used to “promote human trafficking and domestic servitude.”
Because Instagram specializes in salacious content (even using algorithms that prioritize female erotica), and because it incentivizes girls to participate in their own self-commodification, it is hardly surprising that many female users have come to feel anxiety about their bodies.
“Among teens who reported suicidal thoughts,” the WSJ reported, “13% of British users and 6% of American users traced the desire to kill themselves to Instagram, one presentation showed.”
Among Haugen’s bombshell revelations was the fact that when this data was shared with Facebook’s upper management, they refused to do anything about it, and even worked to keep their research secret from both the public and government investigators.
Haugen revealed her identity on October 3 with an explosive 60 Minutes interview. Two days later she testified before the U.S. Senate. In her testimony, she claimed that Meta’s executives, including CEO Mark Zuckerberg, had made false representations about the safety of Instagram.
Haugen’s revelations echo our own findings at Salvo, as well as fieldwork conducted in the Pacific Northwest, which found that social media is being used to fuel sex trafficking. We covered this in the reports below:
- How Sex Traffickers Use Social Media and Modeling: The Dark Secret Instagram Doesn't Want You to Know About
- The Drug That Fuels Human Trafficking: How One City Is Challenging the Porn-Trafficking Axis
Was Anything Done?
Two and a half weeks after Haugen’s Senate testimony, Mark Zuckerberg announced he was changing the name of his company to Meta Platforms and would be working “to help bring the metaverse to life.” There followed a blizzard of advertising about the metaverse, and the controversies surrounding Instagram faded from public attention.
But was anything actually done about Instagram?
The short answer is no. While Zuckerberg’s company released many statements about their commitment to protecting girls, and while they introduced new tools for reporting inappropriate content, the promised reforms never materialized.
This became clear to me last week when I had to go on Instagram to find the hours of a local business I wanted to patronize. As soon as I opened the picture-sharing app, my eyes were assaulted with a range of pornographic images, even though I had never registered interest in this type of content. My experience is not unique. Anyone opening Instagram is immediately confronted with images of semi-naked women, including sexually suggestive images of minors posing in underwear. This has been well documented over the years (for example, see here and here and here and here).
Reporting in The Guardian in April 2022, Shanti Das observed that “Instagram is failing to remove accounts that attract hundreds of sexualised comments for posting pictures of children in swimwear or partial clothing, even after they are flagged to it through the in-app reporting tool.”
Das went on to report that Instagram’s parent company, Meta, ruled various accounts acceptable even after being flagged as suspicious through the in-app reporting tool.
In one case, an account posting photos of children in sexualised poses was reported, using the in-app reporting tool, by a researcher. Instagram provided a same-day response saying that…its ‘technology has found that this account probably doesn’t go against our community guidelines’.
What’s the Big Deal?
“What’s the big deal?” people sometimes ask. “After all, porn has always been around. If you don’t like Instagram, just don’t use it. But quit moralizing about those who do enjoy the platform.”
Well, things are not quite so simple. People who make this argument understand neither how Instagram works nor how it affects the girls who use it.
As we explained in our earlier report, Instagram is essentially a self-branding service that holds out the prospect of stardom and wealth to girls as young as 14. However, it is normally only by posting sexualized selfies that a girl can gain enough social credit to make big money on the platform. Because the potential rewards are so large, girls are strongly incentivized to do so.
The solution to this problem is surprisingly simple. We know that bots can detect anti-vax content on YouTube and other platforms with a high degree of effectiveness. Facebook has even taken out a patent for a system of artificial intelligence that can recognize the “state of undress” of a person in a photo.
So why aren’t these technologies being leveraged to remove sexualized images from Instagram? A likely answer is: follow the money. The more girls use the service to promote sexualized selfies, the more Facebook can monetize their content by partnering with weight-loss apps and cosmetics companies.
Law Firm Takes Meta to Court
The good news is that Meta may finally be held accountable.
Last month, a series of eight lawsuits was filed against Meta Platforms across seven states, accusing the company of, among other things, exploiting children and young people for profit.
In a press release, the plaintiffs’ law firm explained that this collection of lawsuits is being filed “on behalf of adolescents, teens, and young adults who became addicted to social media and suffered detrimental mental health effects, including anxiety, depression, eating disorders, body dysmorphia, ADD/ADHD, self-harm, and suicidal ideation.”
Andy Birchfield, an attorney involved in the case, said, “The defendants knew that their products and related services were dangerous to young and impressionable children and teens, yet they completely disregarded their own information.”
Mr. Birchfield continued: “They implemented sophisticated algorithms designed to encourage frequent access to the platforms and prolonged exposure to harmful content.”
You can read more about this case on the Beasley Allen Law Firm website at the following links:
is the author of Gratitude in Life's Trenches: How to Experience the Good Life Even When Everything Is Going Wrong (Ancient Faith 2020). He has a Master's in history from King’s College, London, and is currently working on a Master’s in Library Science through the University of Oklahoma. He is Blog & Media Managing Editor for the Fellowship of St. James and a frequent contributor to Salvo and Touchstone magazines. He operates a blog at www.robinmarkphillips.com.

Copyright © 2022 Salvo | www.salvomag.com
https://salvomag.com/post/lawsuits-target-meta-for-exploiting-girls