Uploaded and Re-Uploaded

Apple hasn’t prioritized countermeasures against apps used for trading child sexual abuse material (40)

2026.02.12 10:31 Mariko Tsuji, Makoto Watanabe

Japanese police asked Apple to respond to illegal images traded in image-sharing apps 10 years ago. In response to Tansa’s questions, Apple Japan’s public relations department suggested an off-the-record interview.

Apple’s headquarters in Cupertino, California. Photo by the NHK Special “Innovative Investigations” reporting team.

Warning: This article includes descriptions of sexual violence, including toward minors, in order to convey the reality of this issue. Please read at your own discretion.

Apple is too slow in responding to the digital sexual violence facilitated by its platform.

In December 2023, Apple removed the app Album Collection from its App Store. Album Collection had been used to buy and sell sexually explicit images, including child sexual abuse material. Tansa’s investigation prompted the move.

However, Apple continued to offer other, similar apps.

Video Share was one of them.

Beginning around 2019, Video Share was available for anyone to download via Apple’s App Store. Images traded within the app, many of them illegal, included hidden camera footage, revenge porn, and even footage of minors being sexually assaulted.

On Aug. 4, 2025, Tansa sent questions addressed to Apple CEO Tim Cook, asking whether the company would remove Video Share and what measures it would take to prevent further harm.

By Aug. 14, Apple had removed Video Share from its App Store.

However, why had Apple not done so in 2023, when it removed Album Collection? What’s more, in 2016 Japanese police had requested that the company stop providing apps used to trade illegal images.

Even after an app is removed, the images already posted there continue to spread online. The longer Apple waits to take action, the more harm is done.

Apple appears to have little sense of its responsibility as a major platform provider to stop digital sexual violence.

In response to our repeated questions, Apple Japan’s public relations department asked Tansa to meet for an off-the-record discussion.

The company appeared to care more about its image than about the victims. Perhaps it wanted to settle the matter quietly.

Child sexual abuse material posted practically every day

Among the various images traded on Video Share were sexually explicit images of minors. Such images were posted practically every day; it’s difficult to imagine the scale of the harm caused.

In our investigation to date, we found both hidden camera footage and images taken by children themselves, as well as pictures and videos of minors being sexually assaulted.

One video even showed a young girl crying and trying to resist as she was sexually assaulted on the street. The perpetrator had filmed his own crime — truly harrowing footage.

A page on Video Share offering two videos and thirteen photos of minors being sexually assaulted for users to purchase.

On Video Share’s page in the App Store, a review dated June 1, 2022, included the following.

Title: This app…

“This app is criminal. Searching for ‘Video Share’ on Twitter or other social media platforms reveals countless filenames associated with child pornography, etc. Admins, please swiftly punish, including through prosecution, users posting child pornography or other illegal videos.”

The victims’ photos and videos are not just posted once. They are bought, downloaded, and reposted elsewhere. Even if a victim manages to have the images removed in one place, they continue to spread beyond control.

Profiting off of sexually explicit images

Video Share appears to have begun operating around December 2016.

The app allows users to trade large volumes of files, such as photos and videos. The “About” page in the app and its related website touts Video Share as a way to send images to family and friends.

“Share photos and videos filled with fun memories like sports days and field trips with your friends!! Share photos of family trips and birthday celebrations with your family!! No registration required, no login needed.”

However, in reality the vast majority of images traded in the app are sexually explicit and illegal. It is a crime to share or sell sexually explicit photos and videos without the consent of all parties involved, but Video Share is a hotbed of such criminal activity.

Apps are a platform of choice for trading sexually explicit images because they enable users to profit from sharing them.

Users post sexually explicit images to locked folders they create within the app. Other users must pay to unlock the folders.

The more people who pay to access the folder, the more its poster profits, so posters often promote the content of their folders on online message boards and social media. They sometimes attach a photo of the victim’s face to entice potential buyers.

These images can be traded with others who have downloaded Video Share. Using an app (as opposed to a website, for example) makes the criminal activity more difficult for cyber patrols and other online monitoring to detect.

A perpetrator posts to X that they will distribute images obtained via Video Share.

Operated out of Shenyang, China?

Apple made Video Share available in its App Store. Apple functioned as a marketplace; it was Video Share’s operators who developed the app and ran the business conducted there. Both profited: the operators from sales within the app, and Apple from allowing the app to do business in its store.

Video Share was developed by a company called “Whison Group Limited.” Its website’s terms of use and privacy policy also use the name “Video Share Operating Committee.”

However, there were no details about who the operators really were.

Working with white hat hackers, Tansa tried to ascertain the operators’ identities. We utilized open source intelligence (OSINT) methods to gather evidence of crimes and personal information from various publicly available sources.

We were able to find the name, age, and operating location of someone we believe to be the head of Video Share. It was likely that Video Share was being operated out of Shenyang, China.

Although we looked into databases of Chinese companies and address registrations, we weren’t able to find any further information.

The operator of Video Share is aware that their service is being used to trade illegal, sexually explicit images.

In September 2022 and March 2023, we sent questions via email to the operator but never received a response. We also understand that victims have contacted Video Share to request that their images be deleted from the app.

Offered by Google too until three years ago

While the operators of Video Share are despicable, it is Apple that let the app pass its review process and made it readily available for anyone to download.

In September 2022, Tansa sent questions addressed to Apple Japan President Ryo Akima. Since then, we have contacted the company numerous times to describe the situation concretely and explain why it is problematic. However, we have never received a response.

Google also listed Video Share in its own app store, Google Play. As of 2022, the app had been downloaded over 100,000 times. After we contacted them, Google removed Video Share in September 2022 — three years faster than Apple’s response.

However, after that, the app reappeared as “Video Share+” in Google’s app store. Google’s app review process relies on machine learning, so it likely failed to detect that this new app was virtually identical to one it had already removed.

In April 2023, Google also deleted Video Share+, following another message from Tansa.

Both Apple and Google publicly claim to ensure that the apps listed in their stores are safe. However, their actions do not match their words.

Google’s Japan office in Shibuya, Tokyo. Photo taken on Nov. 8, 2024, by You Haga.

The proposal from Apple’s PR department

Just how reluctant are major platform operators to address digital sexual violence?

On Aug. 4, 2025, Tansa sent questions addressed to Apple CEO Tim Cook. The response from the company since then has been illustrative.

In our questions, in addition to asking whether Apple would remove Video Share from its App Store, we asked the following two points.

1. What measures will you take to prevent further harm from digital sexual violence? Will you prevent relevant operators from submitting new apps, or inform law enforcement in Japan and the US?

2. Will you implement effective measures to prevent similar harm from other image-sharing apps, such as checking user reviews, etc.?

After sending the questions, I checked the App Store every day to see whether Apple would remove Video Share. For a while, nothing changed.

On Aug. 10, I received a response by email. It was from a member of Apple Japan’s public relations department named Billie Cole. This was the first time anyone from Apple had responded.

“We understand that you are reporting on a very serious matter,” Cole wrote, adding that he would like to meet after speaking over the phone.

We agreed to have a phone call at 10:30 a.m. on Aug. 12.

However, Cole requested that our conversation be kept off the record, which meant that Tansa would not be able to report on anything discussed. I responded that I don’t do off-the-record interviews as a general rule. I also asked what topics he wanted to keep off the record and why.

To this, Cole never responded. He also failed to call at the agreed upon time.

“Battery replacement” and “screen repair” page given as online sexual abuse countermeasure

On Aug. 14, I received the following email from Cole.

In response to your report, Apple makes the following comment.


We have investigated these reported violations and removed this app from our store. We take complaints of this nature extremely seriously and have strict App Store rules that require developers to detect and respond to such abhorrent content.


Concerned individuals, both public and private, can report instances of illegal activity in apps obtained from the App Store at the following websites and email address:


Report a problem: https://reportaproblem.apple.com/
Contact Apple Support: https://support.apple.com/ja-jp/contact
Contact Apple Legal: https://www.apple.com/jp/legal/
Email for legal proceedings: lawenforcement@apple.com


Thank you.

Video Share was removed from the App Store that same day.

However, I was stunned when I opened the pages that Apple had indicated. None of them were appropriate channels for victims to report digital sexual violence perpetrated through apps.

For example, the “Report a Problem” page is only for reports about apps that the individual making the report has downloaded themselves.

The “Contact Apple Support” page lists topics such as “iPhone screen repair” and “iPhone battery replacement.”

The “Contact Apple Legal” page lists product warranties and license agreement details. Clicking the “Ethics and Compliance” section displays an English-language page, even when accessed from the Japan site.

A message from Cook, the CEO, appears at the top.

We do the right thing, even when it’s not easy.
Tim Cook

My questions had asked about concrete measures for preventing digital sexual violence and helping victims; this wasn’t an answer. Although I tried to ask Cole again over email, he never responded.

Former Apple executive reveals sloppy management

How does Apple view the fact that it hosts apps that enable digital sexual violence — in effect, aiding criminal activity? My interview with a former Apple executive who helped build the foundations of the App Store revealed the company’s true stance.

In April 2024, Tansa and the NHK Special reporting team jointly interviewed former Apple executive Phillip Shoemaker.

From 2009 to 2016, Shoemaker had been the executive overseeing the App Store. He was involved in creating the App Store’s guidelines and had worked alongside Steve Jobs, Apple’s founder.

Shoemaker said Apple ignores reports of wrongdoing, including those raised in user reviews.

“The reviews on the App Store page are largely ignored by Apple, unfortunately,” he told us. “Now, one of the things that Apple could do is that, when I was there, there was a mission for us to be able to look and see if there are any words like ‘It’s crashing’ or ‘The app doesn’t function,’ ‘It doesn’t launch,’ ‘It crashes on launch,’ things like that.”

In the case of Video Share, multiple user reviews had pointed out that illegal activity was taking place in the app.

“So how can people who have been harmed by apps published by Apple tell Apple about it?” I asked.

Shoemaker could only say, “Yeah, it’s… I don’t know.”

Shoemaker, interviewed at his home in Nevada, US, in April 2024.

Kanagawa Prefectural Police: “Any and all measures should be promptly taken”

Apple and Google are also brushing off Japanese investigative authorities.

On Aug. 10, 2016, the Kanagawa Prefectural Police Cybercrime Division visited Apple’s and Google’s Japanese headquarters to formally request, in writing, countermeasures against digital sexual violence. Earlier that year, in February, the police had arrested individuals involved in illegal image trading via an image-sharing app. The case prompted their visit to Apple and Google.

In their request, the police pointed out that “numerous apps with similar functions are still made available, and illegal videos and photos are being published and disseminated,” according to reporting by the Kanagawa Shimbun newspaper on Aug. 11, 2016. After citing six examples of such apps, the police requested that “any and all measures be promptly taken, such as the early detection of illegal videos, warnings that such content will be removed, and suspension of such apps.”

Despite finally being removed from the App Store, Video Share remains usable for those who have already downloaded it. Even now, perpetrators use various online tools to continue their activities, repeating their crimes unchecked.

The harm is spreading right this minute.

(Originally published on June 20, 2025.)
