
“She was 12 when he started demanding nude photos.”
The man told her he was her friend and that she was pretty. She felt they had “connected” on Snapchat and believed that all the photos and videos she was sending would eventually disappear, as content appears to do on the app.
The girl, now 16, is leading a class-action lawsuit against Snapchat that claims the designers behind the app have “done almost nothing to prevent the sexual exploitation of girls like her”.
The lawsuit, filed in a California federal court by L.W (who has requested anonymity as a victim of sexual abuse), accuses Snapchat of “negligently failing to design a platform that could protect its users from egregious harm”.
Filed jointly with her mother, the suit seeks at least $5 million in damages along with assurances that the company will invest more in protecting its users.
More importantly, the lawsuit “could send ripple effects through not just Silicon Valley but Washington, by calling out how the failures of federal lawmakers to pass tech regulation have left the industry to police itself,” as The Washington Post pointed out.
Snapchat has been a “mainstay” of teen life across the world, especially in America, and has managed to surpass 300 million active users globally. Closer home, the app announced that it had crossed 100 million users in India in October last year.
The app presents itself as a safe space for youngsters, and ideally, the basic concept of how it works fits that image.
For the non-Snappers: all messages and media shared on the app disappear within 24 hours unless they are manually saved, and if anything is saved, whether a snap, a chat, or a video, the other person can see that on the app. This creates a sense of security and privacy which, on one level, is understood as a “safe space” and, on the other, can be, and evidently is, exploited by potential abusers, like the “active-duty Marine who was convicted last year of charges related to child pornography and sexual abuse in a military court” for saving the teenager’s Snapchat photos and videos and sharing them with others online.
Unlike other social networks that depend on a central feed, Snapchat revolves around a user’s inbox of private snaps and messages, the photos and videos exchanged with friends on the platform, all of which are meant to be ephemeral and to disappear once they have been viewed.
In its early years, Snapchat was called a “sexting app”, and it is not too farfetched to say that it still functions as such for many of its users. At the same time, it remains popular, having solidified itself as “a place for joking, flirting, organising and working through the joys and awkwardness of teenage life”.
The Washington Post, which first reported the story, rightly points out that the teenager’s lawsuit raises “difficult questions about privacy and safety” and throws a harsh spotlight on the tech industry’s biggest giants, arguing that the systems they depend on to root out sexually abusive images of children are fatally flawed.
Snapchat’s parent company, Snap, has “defended its app’s core features of self-deleting messages and instant video chats as helping young people speak openly about important parts of their lives” and has told The Washington Post (WaPo) in a statement that it employs the “latest technologies and develops its own software to help us find and remove content that exploits or abuses minors”.
“While we cannot comment on active litigation, this is tragic, and we are glad the perpetrator has been caught and convicted. Nothing is more important to us than the safety of our community,” Snap spokeswoman Rachel Racusen said.
While Snapchat has been doing significantly well globally, in terms of active users, revenue and new product features (the company recently launched a drone for taking selfies and photos), the lawsuit likens the app to a “defective product” that has “focused more on innovations to capture children’s attention than on effective tools to keep them safe”.
The girl’s lawyers wrote in the lawsuit that Snapchat relies on an “inherently reactive approach that waits until a child is harmed and places the burden on the child to voluntarily report their own abuse”.
And Snapchat is not the only company named in the lawsuit. Apple and Google have also been listed as defendants for hosting the app on their app stores, along with Chitter, the app the Marine used to distribute the girl’s images. According to the WaPo report, Apple and Google said they had removed Chitter from their stores following questions from the publication, but the app was still visible to us on both the App Store and the Play Store (the removal may have been limited to the US).
Apple spokesman Fred Sainz said in a statement that “the app had repeatedly broken Apple’s rules around proper moderation of all user-generated content”, while Google spokesman José Castañeda said that “the company is deeply committed to fighting online child sexual exploitation and has invested in techniques to find and remove abusive content”.
While users are notified when a photo, video or chat is saved on the app, or when a screenshot is taken, third-party workarounds are rampant, and it is possible to capture or save a snap without the sender knowing.
Parents have argued that Snapchat draws in adults looking to prey on kids. Snap has responded with restrictions: users younger than 18 are not allowed to post publicly to features such as Snap Map, and the app limits how often children and teens show up in the “Quick Add” suggestions for other users. Snap also “encourages people to talk with friends they know from real life and only allows someone to communicate with a recipient who has marked them as a friend”. Yet the platform has no age-verification system, despite only allowing users who are 13 or older.
But it is not too hard to bypass any and all of this.
“Snap representatives have argued they’re limited in their abilities when a user meets someone elsewhere and brings that connection to Snapchat. They’ve also cautioned against more aggressively scanning personal messages, saying it could devastate users’ sense of privacy and trust,” the WaPo report pointed out.
Snap has also said that “its servers delete most photos, videos and messages once both sides have viewed them, and all unopened snaps after 30 days” and that “it preserves some account information, including reported content, and shares it with law enforcement when legally requested”.
For the mother and L.W, the last few years have been devastating. “The criminal gets punished, but the platform doesn’t. It doesn’t make sense. They’re making billions of dollars on the backs of their victims, and the burden is all on us,” the mother said.
"“While we cannot comment on active litigation, this is tragic and we are glad the perpetrator has been caught and convicted. Nothing is more important to us than the safety of our community. We employ the latest technologies and develop our own tools to help us find and remove content that exploits or abuses minors. We will continue to do all that we can to protect minors on our platform, which includes reporting all detected child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children," Snapchat told Business Today.