Uploaded and Re-Uploaded

Over 200 videos of child sexual abuse traded on apps (6)

2023.01.23 16:34 Mariko Tsuji, Makoto Watanabe

Google and Apple continue to ignore the serious abuse being perpetrated via apps they provide.

(Illustration by qnel)

The group of perpetrators joined Tansa’s Twitter Space on the night of Dec. 14, 2022. Earlier that day, they had held a strategy meeting to discuss their response to my reporting, saying, “Let’s make Tansa’s article meaningless.”

Two of the perpetrators, using the online monikers “Benzaiten” and “Erosuke,” complained about Tansa’s previous articles. However, they recognized that posting and sharing sexually explicit photos and videos without the subject’s consent is illegal.

Shortly after the exchange with Tansa in our Twitter Space, Benzaiten again opened a Twitter Space for the group of perpetrators.

“I can’t really say anything, because what [Tansa] is saying is right,” he told them.

“We’ve known all along [that what we’re doing is illegal]. I was asked if I was worried about being arrested, but I’ve always said that if you’re worried about that, you should stop.”

Benzaiten deleted his Twitter account a few days later. Erosuke also declared he would no longer be involved in sharing sexually explicit images.

“I will not respond to any tweets about this in the future,” he wrote. 

“If it turns out there is any actual harm caused by my tweets or retweets, I will take it very seriously.”

Some of the Twitter accounts operated by other perpetrators were also deleted. A chat group that had been instructing people on how to make money from Video Share and Album Collection was also shut down.

However, this doesn’t mean the problem is solved.

Some continue to post, using various means to remain hidden from outsiders. Some set their accounts to private and continue to tweet to their tens of thousands of followers, or direct users who have paid an “admission fee” to invitation-only chat rooms where they distribute sexual products. New Twitter accounts continue to appear one after another.

Then, one day, I noticed an even more serious situation.

The message boards filled with code words

I initially thought the perpetrators were primarily using Twitter to promote the sexually explicit photos and videos they were posting to Video Share and Album Collection.

But they had another hidden method. Within the folders of photos and videos, there was also a message board-style chat space.

At first glance, the terms in the messages were inscrutable.

“L” “S” “C” “JK” “Ja”

After looking over a number of posts, I realized these terms were code for child sexual abuse material (also known as child pornography). “L” stood for “Lolita,” indicating a young girl; “S” for elementary school student (shogakusei in Japanese); “C” for junior high school student (chugakusei in Japanese); “JK” for female high school student (joshi-kokosei in Japanese); and “Ja” for a Japanese girl.

On the message board, descriptions of the photos and videos were written cryptically. The following are examples of phrases used to appeal to potential buyers. 

“An assortment of L”

“Locker room hidden camera, super beautiful girl, upper grade in S”

“Innocent S taken into car, then got dirty”

“A record of recent, super-fiendish Ja-S pranks”

The folder names to reach the boards were distributed on Twitter, just like any other folder. There was no need to purchase a key to unlock them; anyone could use the folders by typing their names, a random combination of numbers and letters, into the application. The tweets did not mention that child sexual abuse material was being exchanged on the message boards.

However, one Twitter user advertised the message board as follows.

“Use the message column to share the treasured gems you can’t show to just anyone!”

The creator of a message thread can set a time limit for it. For Album Collection, the time limit can be set from one minute to a maximum of 335 hours and 59 minutes, and for Video Share from five minutes to 119 hours and 59 minutes. After it expires, the thread will be automatically deleted, with no record remaining. Even if a thread disappears, another will be created in its place. This is how perpetrators continue to post and disseminate child sexual abuse material: by changing threads one after another and erasing the evidence.

An exchange that took place on an Album Collection message board.

Girls crying and cowering

I checked the actual videos based on the “advertisements” on the message boards in the application folders. As a result, I found over 200 pieces of child sexual abuse material in Video Share and Album Collection. The victims were minors of almost all ages, from toddlers who appeared as young as three to elementary, junior high, and high school students.

The sexual abuse perpetrated against the children in the videos was wide-ranging.

The most common images were those of naked breasts and bottoms, hidden camera shots of minors changing clothes and using the bathroom in what appeared to be schools or other facilities, and forced intercourse or other sexual acts with adults.

There were also “selfie” videos that appeared to have been taken by the children themselves, and even images of them clothed. All these images were bought and sold as sexual products.

Some of the children smiled, while others appeared stunned or cried and cowered.

The children’s ages are “added value” for the perpetrators who shot the videos and for those who buy and sell them. I found a scene in which a child being filmed was asked to say her age in order to emphasize the fact that she was a minor.

In another video, a person filming themself having sex with a girl asks, “How old are you?” and “What junior high school do you go to?” The girl answered with her age and the name of the junior high school she attended. The description under the folder name read “[doing] a 14-year-old disabled girl outside.”

I also found several videos in which a girl told the camera her full name and age and then filmed herself naked. The videos appeared to have been filmed at home, but I wondered whether she was being instructed what to do by somebody else.

Another folder contained 22 videos, all of which were footage from hidden cameras that had been set up in toilets in public facilities and other places. The videos showed girls ranging from early elementary school age to junior high school, with their faces visible.

The perpetrators keen to avoid police

The perpetrators use all kinds of techniques to target children.

Unlike physical abuse and neglect, the damage caused by sexual abuse to children can be difficult to see from the outside. The individual may not immediately understand what has been done to them, making it difficult for them to seek help. However, childhood sexual abuse can cause lifelong suffering for the victim. This suffering is compounded if the abuse is filmed and distributed.

The police investigate child sexual abuse. It’s not always necessary for victims themselves to be identified in violations of the law against child sexual abuse material — if it is confirmed that the victim in the relevant photo or video is a minor, the perpetrator can be arrested. It’s easier for the police to begin an investigation into child sexual abuse material compared with revenge porn, which requires the victim to report the crime.

Even most of the perpetrators who post sexual photos and videos to the apps appear to understand that child sexual abuse material carries a high risk of arrest. Some even claim that they will not post anything that constitutes child sexual abuse material.

The man who ran the perpetrator chat group, who called himself “Peach,” advised members not to post child sexual abuse material. At the same time, however, he sent the group videos and photos of girls, most of whom appeared to be in their early teens, for “viewing purposes.” He thought it was okay if it was just for their own private enjoyment.

Child sexual abuse material was also exchanged in the perpetrators’ chat group.

And the National Police Agency?

Dozens of posts promoting folders containing child sexual abuse material are made each day on the Video Share and Album Collection message boards alone. How are the police dealing with a situation in which sexually explicit images of children are posted and disseminated every day?

One organization I thought could help address the situation was the National Police Agency’s Special Cybercrime Investigation Team, which was established in April 2022. The team’s main purpose is to respond to large-scale cyber attacks on the government and corporations. However, team head Yoshitaka Sato stated the following at a press conference following his appointment.

“We are committed to ensuring safety and security in cyberspace.”

“[The Special Cybercrime Investigation Team] contains various resources, including investigative and technical personnel and specialized equipment and materials.”

I wondered whether the team would not only protect the government and large corporations from cyber crimes but also investigate sex crimes victimizing children and women. No matter how many sexually explicit and likely illegal videos and photos I find, they are too numerous for me to report all of them. However, this unit has a large investigative capacity thanks to its combined personnel, resources, and equipment.

On Dec. 9, 2022, I sent questions to the National Police Agency’s Special Cybercrime Investigation Team. I wanted to know whether they would open a criminal investigation into the rampant spread of child sexual abuse material and revenge porn using the apps I have been reporting on.

I received the following response on Dec. 23, 2022.

“We cannot answer regarding individual cases, but as for our response to child sexual abuse material etc. in general, we police are strengthening our response to malicious crimes, especially those that target young children for sexual reasons, as well as promoting the protection and support of children, based on the ‘Plan for the Prevention of Sexual Exploitation of Children 2022.’


“With regard to revenge porn, while considering victims’ feelings, we are also promoting measures in cooperation with related organizations, such as thorough control and removal of such images, as well as education and awareness-raising to prevent this kind of harm from occurring in the first place.


“Furthermore, the Special Cybercrime Investigation Team is required by the Police Act to investigate crimes related to serious cybercrime incidents, and this type of case [child sexual abuse material and revenge porn] is mainly handled by prefectural police departments’ community safety divisions.”

I absolutely do not believe that the current system, under the jurisdiction of the prefectural police departments’ community safety divisions, is sufficient to deal with online sex crimes such as child sexual abuse material. These crimes are committed every minute and every second, not only across prefectural borders but also across national borders.

And there are so many victims. Current police officers and former members of the National Police Agency whom I have interviewed up to now say that “the number of arrests is not at all sufficient to address the situation.”

If the situation is out of control, that is all the more reason to change how it is dealt with.

The National Police Agency declined to participate in an in-person interview.

Four months since my first questions

How will Google and Apple, which offer Video Share and Album Collection, respond? I first sent questions to both companies in September 2022. From the beginning, I have pointed out in my correspondence that child sexual abuse material is being bought and sold on both apps.

As reported in this series’ second article, several user reviews on Apple’s App Store have pointed out that illegal videos are being exchanged on these apps.

In the cases reported here, the Video Share and Album Collection message boards contain numerous posts promoting child sexual abuse material. Google and Apple, which provide the apps, have a serious responsibility in this situation.

I have frequently urged both companies to respond, but neither has done so.

The tech giants that allowed this harm to spread continue to avoid the issue.

I will never be able to forget the expressions and voices of the sexually abused children I saw. We must not let children pay the price for adults’ irresponsibility.

Why does this harmful system continue? I will continue my investigation.

To be continued.

(Originally published in Japanese on Jan. 12, 2023.)

Uploaded and Re-Uploaded: All articles