Uploaded and Re-Uploaded

Photos and videos turned into “sexual products” without subjects’ consent (1)

2022.11.18 14:49 Mariko Tsuji

An app for sharing and purchasing photos, used to spread sexual images, is available from Google and Apple and has been downloaded over 100,000 times.

(Illustrations by qnel)

It’s possible that sexual photos or videos of you or someone close to you may have been posted and spread online without your knowledge. Once posted, they can be reposted to various sites and platforms, never to disappear. The victim’s suffering, too, will continue forever.

In fact, this happened to an acquaintance of mine. 

Her photos and videos were bought and sold as a “product” through a smartphone app. Those who post to the app are paid by those who download the photos and videos, thereby encouraging them to post more and more sexual content. 

The app has been downloaded at least 100,000 times, and a huge number of people have had their personal photos and videos posted without their consent. Images are collected in a variety of ways, including through hacking, secret filming without consent, reposting from other sites, and as revenge porn. The app also contains child pornography.

Apps like this are available via Google and Apple app stores. Anyone can access them, and they have users all over the world, so the harm they cause can spread rapidly.

However, Google and Apple have not addressed this issue. Despite my repeated requests for comment, they have not responded.

Japanese authorities have not adequately regulated IT giants such as Google and Apple, and the police have not been able to address internet sex crimes in a meaningful way. Even now, the number of victims continues to increase.

This problem will continue until the structure enabling internet sex crimes is dismantled. My reporting here aims to investigate and change this systemic issue.

Please get in touch if you have any information about internet sex crimes, including information related to victims, those posting content, those running relevant apps, or Google and Apple’s response to this issue. Please refer to this page for how to send information securely, such as via encrypted email; we will keep your information strictly confidential.

It began with DMs

One day in August this year, I noticed a message on my phone from a female friend, A (19). It was almost midnight, but she wanted to discuss something over the phone. As we normally communicate by text, I had a feeling it wasn’t something good.

My hunch was right. 

In our call, A told me that, a few days ago, she had suddenly gained about 20 new followers on her public social media account. Growing suspicious, she checked and found that she had also received dozens of direct messages (DMs) from unknown, anonymous accounts.

They were making fun of the fact that A’s sexual photos and videos had leaked online.

“Your naughty pictures and videos have leaked”

“I saw your video and thought I’d come say hi lol”

One man also sent A a picture of his genitals.

“I was hacked”

A had no idea what was going on, so she asked for information from some of the people who had sent DMs.

It seemed her photos and videos were being purchased through a smartphone app. The folder posted to the app also contained a screenshot that included A’s social media account name. Those who saw it had looked her up and sent the DMs.

A was shocked, but still in disbelief that something like this had happened to her. She quickly searched Apple’s app store for the app the DMers had told her about. At first glance, it looked like a normal app for exchanging photos and videos.

She found that 53 of her photos and 19 of her videos had been posted. They included selfies and pictures with friends, as well as sexual content.

The photos and videos covered a range of four to five years, from recent ones to photos of A as a junior high school student — and they included images she had never shared with anyone.

However, they were saved in the cloud storage that A synchronized with her phone.

“I was hacked,” A intuited. In her cloud account’s login history, she found a record of access from an unfamiliar device.

In the app, a photo of A’s face had been used as a cover image for the folder containing her images, in order to entice users to pay and download the folder. Below the photo, the app showed how many times the folder had been downloaded: by then, 282.

Even when they were deleted…

The app had a button to report posts that constituted a “terms of service violation” to the app operator. A used it to report the folder containing her photos and videos.

After reporting the folder, she still felt uneasy and checked her phone all day. The folder had been deleted by the next morning. She contacted me to say, “I’m at least glad it’s been taken down.”

But 10 days later, the folder was re-uploaded; she couldn’t tell by whom. 

This time, even after reporting it, the folder wasn’t taken down immediately. The number of downloads increased steadily, reaching 862 — more than three times the number when A first discovered it. 

Even more concerning, people who obtain A’s photos and videos can spread them via social media and other means. Those who see them on social media can spread them further. In this way, there is no limit on who might see A’s images.

In the following days, the app administrator continued to delete the folder, only for someone to re-upload it. By A’s count, she has repeated this process seven times. 

One of the anonymous DMs she had received said “That post will last forever,” and she understood well what they had meant.

During this time, A continued to be harassed via DM on her social media.

She tried changing her social media profile and username, but when the folder was re-uploaded, it contained screenshots of the new version.

The male police officer “speaking frankly”

A few days later, A tried calling the police to ask for help. A male police officer spoke with her for about 30 minutes. 

On the phone, the officer often said “Hmm…” or paused for several seconds. A told him the name of the app, but he said he didn’t know anything about it. 

When A told him that her images were being shared with hundreds of people, he said flatly, “Speaking frankly, if it’s being spread like that online, it cannot be completely eliminated. There’s nothing for it but to request the photos and videos be deleted when you find them.”

A also told him she was being harassed on social media and asked, “Isn’t it a crime to access photos uploaded without the subject’s consent and then contact said subject to sexually harass them?”

“It depends on the situation,” the police officer said, stumbling over his words. “I can’t really say one way or another without seeing the messages themselves, and, in the first place… Well, there’s a lot of things like defamation these days.

“The police handle incidents where the offending party can be punished in some way. So if you want to claim compensation for defamation, you can file a civil lawsuit. That’s, well, up to you. For civil matters, ask a lawyer or someone like that.”

However, A’s objective isn’t to punish the other party; first and foremost, she wants her photos and videos removed from the internet. But the police officer just repeated, “Please ask the operator of the app to delete them.”

“If I think about it, I can’t go on with my life”

A graduated from high school this past spring and is currently studying design and other subjects at university. She moved for school and is just beginning a new life.

She has a good head on her shoulders, and even though I was older, I could go to her for advice. She would give her honest assessment and was reliable.

“I have no choice but to stop thinking about what happened to me,” she said to me about her images being shared online. “If I think about it, I can’t go on with my life.” 

There is no reason why she should have to go through something like this. I couldn’t stand to see her life be destroyed in this way.

Originally posted “for his own amusement”

From then on, I started gathering information about this app on the internet and social media. After a while, I learned about the case of B, whose video was posted to the same app as A. I got in touch via her relatives, and she agreed to let her relatives tell me her story.

Several years ago, B, then in her 20s, was filmed having sex by a man she had met on a dating app. She had initially refused to be filmed, but reluctantly agreed a few times after being asked repeatedly.

The man posted the video to an online message board without B’s consent. The message board is free and open to anyone. B said the man told her he posted it just for his own amusement. He explained that he soon became nervous and deleted the post after about an hour.

After a while, B received an anonymous message on her social media saying that a video of her had leaked. As in A’s case, the victim herself couldn’t discover the leak unless someone notified her.

B’s video was posted to the same app as A’s, and it was also circulated via another similar app, as well as on platforms such as Twitter. When B learned what had happened, she searched for the posts and sent requests to the platform operators to delete them, dealing with the matter on her own.

Followed home, sent letter

After discovering that the video had been spread online, B’s life changed completely.

She was afraid to go out, so she took time off work and rarely left her home. When she walked down the street, she was terrified of anyone recognizing her from her video.

Over the course of several months, she returned to normal life little by little.

But on her way home from work, on multiple occasions she felt like someone was following her. Around this time, she often called her relatives, saying “I might have a stalker.” When she went out, she tried to be accompanied by relatives as much as possible.

One day, she received a letter about her leaked video. There was no postmark, so someone must have put it directly into her mailbox.

B immediately moved out. But she still worries that someone might know who she is, no matter where she goes.

Around that time, B attempted suicide. She was taken to hospital and survived.

Even after three years

B contacted the police but has yet to file a complaint, out of fear that her leaked video will become public knowledge and that the situation will further worsen. Telling the police about the situation in detail is also a mental and emotional challenge for B.

She and the man who originally posted the video reached a settlement out of court. However, more than three years have passed since the first post, and B’s video is still being shared online several times a month.

When walking outside, B wears a mask and glasses to hide her face. She also drastically changed her hairstyle from what it was in the video. But despite the alterations, she can still be recognized. She is still frightened by others’ gaze.

Over 300 women’s photos posted

Sharing or publishing sexual images without the subject’s consent is a crime.

Beyond violating the “Act to Prevent Damage due to the Divulgence of Sexual Images in Private Affairs” (commonly known as the Revenge Porn Prevention Act), which came into effect in 2014, a single incident can, depending on the circumstances, constitute multiple offenses: violation of the Child Pornography Prohibition Law, criminal distribution of obscene material, extortion, or violation of the Anti-Stalking Act. The government is also considering adding acts such as filming without the subject’s consent to the list of sex crimes.

According to the Ministry of Justice’s 2021 white paper on crime, there are about 200 to 250 arrests for such crimes each year.

However, is this number of arrests really enough to deal with the scope of the problem?

When I looked up the app where A and B’s images had been shared, I was at a loss for words. In just 10 minutes of searching online, I found over 300 women whose photos and videos had been posted to the app.

To be continued.

(Originally published in Japanese on Nov. 14, 2023.)
