Why the police aren’t sufficiently cracking down on digital sex crimes (7)
2023.03.16 16:39 Mariko Tsuji
Limited investigative capacity and restrictive legal frameworks allow digital sex crimes to flourish.
(Illustration by qnel)
Tansa has released six articles on the damage caused by the spread of sexually explicit photos and videos through the apps “Video Share” and “Album Collection,” and has explained how these apps operate.
Both apps are still available. The posting, buying, and selling of photos and videos continues as well. Nevertheless, the National Police Agency has not changed its policy of having prefectural police departments’ community safety divisions respond on a case-by-case basis.
The “National Police Agency’s Special Cybercrime Investigation Team,” launched in April 2022, investigates “serious cybercrime incidents” under the Police Act, such as cyber attacks targeting the government and large corporations. In the eyes of the National Police Agency, cases involving women and children are not serious crimes.
The damage will only increase if this situation doesn’t change.
But are current laws and investigative structures sufficient to crack down on repeat offenders, app operators, and platforms such as Google and Apple?
I spoke with Chuo University Faculty of Law Professor Kou Shikata, a former bureaucrat in the National Police Agency.
Shikata served as director of the Kanagawa Prefectural Police Headquarters’ criminal investigation department and as director of the National Police Agency’s community safety bureau’s cybercrime division.
He is currently researching cybercrime and how to prevent repeat offenses. In addition to his legal knowledge, he is well versed in the actual investigative process.
Profile: Kou Shikata
Professor in the Faculty of Law, Chuo University. Shikata’s fields are criminal policy studies, criminology, social security policy theory, cybercrime, and how to prevent repeat offenses. He joined the National Police Agency in 1987 and served as director of the Kanagawa Prefectural Police Headquarters’ criminal investigation department, director of the National Police Agency’s community safety bureau’s cybercrime division, and director of the international affairs division in the National Police Agency’s Commissioner-General’s Secretariat.
What is a “private sexually explicit image record”?
As we have shown in previous articles, spreading or selling sexually explicit photos or videos without the subject’s consent constitutes multiple crimes. For example, the “Act to Prevent Damage due to the Divulgence of Sexual Images in Private Affairs” (commonly known as the Revenge Porn Prevention Act) stipulates penalties for the following acts.
Criminal publication
Providing or openly displaying a private sexually explicit image record (object) to an unspecified number of people, in a manner that allows a third party to identify the subject of the photograph: punishable by imprisonment for up to three years or a fine of up to 500,000 yen (about $3,670).
Offering for criminal publication
Providing a private sexually explicit image record (object) for the purpose of causing an act that amounts to criminal publication. For example, providing such a record to a small number of specified individuals so that it will be spread through the messaging app LINE, etc.: punishable by imprisonment for up to one year or a fine of up to 300,000 yen (about $2,200).
If a sexually explicit photo or video of someone is posted or sold without their consent, first of all, that constitutes a violation of the “Revenge Porn Prevention Act,” right?
If an image is posted without the consent of the individual, we must consider whether the image constitutes a “private sexually explicit image record (object).” The term is defined in the Revenge Porn Prevention Act as covering images that depict the following.
- Sexual intercourse or sexual intercourse-like acts
- Touching or being touched by another person’s sexual parts in a way meant to arouse or stimulate sexual desire
- A person not wearing all or part of their clothing, whose sexual parts are emphasized in a way meant to arouse or stimulate sexual desire
Therefore, even if the individual is embarrassed by the image and does not want it to be public, the law may not apply if the above description is not met.
Does that mean that you can expose others’ personal lives without being charged with a crime?
It’s a violation of image rights under civil law to take or publish photographs that show the face or appearance of someone without their consent. If you use a method such as hacking to acquire someone’s images, it may be considered illegal unauthorized access.
Lost logs: a “practical statute of limitations”
How do the police crack down on Revenge Porn Prevention Act violations?
For a crime to be prosecuted, the victim must first file a complaint. It’s also necessary to identify the person who committed the crime. It’s possible for the police to access the server where the revenge porn, etc., was exchanged and investigate the perpetrator. However, this method isn’t feasible if too much time has passed, because the storage period of communication logs is limited to, say, three months or six months.
How does the investigation proceed then?
If the victim hesitates to consult with the police, or if the police are late in starting an investigation, the logs will disappear. The log retention period becomes a sort of practical statute of limitations. Furthermore, it takes a very long time to disclose information if the server used for the communication in question is located overseas.
Is it impossible to investigate if the victim does not report?
The victim does not need to file a complaint if the act in question is “criminal exposure of obscene images” or “criminal distribution of obscene electromagnetic records.” The act could fall under the crimes of publicly disseminating, or possessing for the purpose of sale, images or data that qualify as “obscene material.” However, this law exists more to uphold social morals than to address the damage caused by these crimes.
Not enough investigative capacity
Tansa’s own reporting found that more than 200 videos of children were being traded. Japan’s Child Pornography Act regulates sexually explicit photos and videos of children under the age of 18.
The Child Pornography Act prohibits the production and sharing of records such as child sexual abuse material for sexual purposes. In June 2014, the law was amended to make possession of child pornography a crime.
Children, such as elementary and junior high school students, are being victimized. Is it necessary to identify the victim in cases of child sexual abuse images?
No. Even if the individual cannot be identified, if it can be determined that the victim is under the age of 18, the perpetrator can be arrested.
How do you determine if the victim is underage?
The police judge by the victim’s physical appearance. They may also request a doctor’s evaluation.
However, it is often difficult to judge. Individuals’ appearances differ, and it may not be possible to conclude that the image in question is a child sexual abuse image if the child is around junior high school age. Even though the law applies to children under the age of 18, in reality the majority of arrests are in cases where children of elementary school age or younger are the victims.
It may be possible to arrest the perpetrator if the victim and the perpetrator are acquaintances, and the perpetrator knows that the victim is a minor.
Although it is relatively easy to make arrests over child sexual abuse images, in reality they still seem rampant.
Police also have limited capacity. After all, there are many crimes out there, and it’s not possible to respond to each one. However, the degree of maliciousness will help determine whether an investigation is prioritized. If a particular app has a considerable number of victims or is judged to be malicious, the police may respond.
Is it possible to crack down on the app operators?
What about the operators of both apps? They are likely aware that their apps are being used for illegal transactions, as they respond to some of the requests from victims to delete images.
However, the operators have not implemented fundamental measures to deal with the issue, such as changing the apps’ functions. Can they be held legally responsible for their negligence?
The two apps have a common mechanism: To access the photos and videos stored in their respective folders, users have to buy keys. One key is required to open one folder, and it can be purchased for 160 yen. Part of the payment goes to the user who posted the folder.
Perpetrators repeat similar posts to earn money. One of the perpetrators who joined Tansa’s Twitter Space admitted that making money was one of the reasons he continued posting.
The two apps I found in my research were used to trade victims’ images. Can the app operators be found guilty of a crime?
The point is how the app works. If there is a mechanism or wording that invites users to post sexually explicit photos or videos, you might have a case. It’s difficult to build a case under criminal law for simply neglecting what has been posted, because the administrator can get away with saying “it was posted without permission.”
Under what circumstances can an app operator be prosecuted? In this case, the app operators also receive a portion of the posters’ sales. Is it a problem for them to make money by trading illegal images?
To give an example, when arresting someone for distributing obscene pictures or distributing obscene electromagnetic records, the focus is on the operator’s “degree of control.” If they select the content posted and decide whether or not to publish it, they will be seen as having a high degree of control. Additionally, if the operator induces users to post and makes money from those posts, that is important circumstantial evidence for recognizing their complicity in the crime.
Barriers to investigating overseas
Furthermore, by checking the domain information for the web versions of both apps and the companies’ registration information, I found that Video Share and Album Collection are likely managed from overseas.
Both apps seem to have their operators’ servers located overseas. How would the police investigate in this case?
If the company is based or has a server outside of Japan, it may be managed by a foreigner or a Japanese person pretending to be a foreigner. Although we have to look at it case by case, domestic law can be applied if the actual operation takes place in Japan. In the case of Video Share and Album Collection, most of the people who post images and whose images are posted are Japanese, so Japanese law may be applicable. However, it’s difficult to grasp the operator’s actual situation.
Why is that?
The police need to request cooperation from the police in that country in order to investigate a company based outside Japan. These requests involve various procedures and usually take three to four months. Perpetrators may route their communications through servers in multiple countries, so it can take several months to ask each country to cooperate with the investigation. Communication logs may disappear during this time, and this is a problem for other countries as well.
Google and Apple’s responsibility
Video Share and Album Collection were available on the Google and Apple app stores. As far as I could tell, Video Share was downloaded over 100,000 times on Google Play, and Video Share and Album Collection were also listed in the App Store’s popularity rankings.
Both apps became widely used because they were available via these platforms.
Can Google and Apple be legally liable because they offered the apps in their stores?
I think it would be difficult to hold them legally responsible for providing problematic apps. I have never heard of a platform operator being arrested in such a case. These corporations often fail to address harm that occurs outside the United States, where their headquarters are located. However, from the standpoint of doing business in each country, they should take responsibility. I think it deserves criticism from society.
Ill intent under the guise of a hands-off approach
According to Shikata, it may be difficult to arrest the operators of Video Share and Album Collection, as they are not explicitly inviting users to share illegal photos and videos.
However, I think we should judge based on the actual situation. Both apps appear “normal,” but in my research, I have never seen them used for anything other than trading sexual content. In my view, the operators’ hands-off approach while letting said business continue is itself a sign of their ill intent.
The same goes for the platforms. Both apps were available through Google and Apple’s app stores and spread to many users. However, in Shikata’s view, even if platforms are the target of social criticism, it is difficult to hold them legally responsible.
If so, we need a law to hold these platforms accountable. Considering the severity of the damage, I think it’s only natural.
To be continued.
(Originally published in Japanese on March 3, 2023.)