Uploaded and Re-Uploaded

Former executive reveals Apple’s app approval process (31)

2024.12.05 16:23 Mariko Tsuji

Each app is reviewed for an average of 13 minutes, and the company largely ignores user reviews, according to the former executive.

Apple’s headquarters in Cupertino, California, U.S. Photo taken by reporters from the NHK Special “Innovative Investigations.”

Album Collection, a hotbed of digital sexual abuse, was a popular app in Apple’s App Store. For a time, it ranked first in the store’s photo-related category, even higher than YouTube and Instagram.

The Hawaii-based Eclipse Incorporated (hereafter “Eclipse”), which operated Album Collection, received over a million dollars per year from Apple and Google, according to an accounting firm employee.

It was clear that Album Collection’s operators were motivated by such high revenue. But why did Apple and Google ignore this business that encouraged crime?

Up to this point in our investigation, Tansa had submitted multiple interview requests and questions to Apple’s Japanese subsidiary. However, we never received a response.

Together with reporters from the NHK Special “Innovative Investigations,” I traveled to the U.S. to interview a former executive familiar with Apple’s inner workings: Phillip Shoemaker. During his time at Apple, Shoemaker had been one of those responsible for the App Store and had helped create its approval guidelines.

The angry phone call from Steve Jobs

In April 2024, we visited Shoemaker’s house in Nevada, U.S. He welcomed us with a smile.

Shoemaker was in charge of the App Store from 2009 to 2016. He laid the store’s foundation by leading the team in charge of the app review and approval process, which determines which apps are available in the store, and by setting the guidelines for that process.

He had also worked with Steve Jobs, Apple’s founder.

“We decided we needed to have a set of guidelines written. And so I worked closely with Steve on getting those guidelines written,” Shoemaker told us.

What kind of person was Jobs?

“He was a brilliant man, a very smart man who was easily frustrated by people around him not seeing the same thing he did,” Shoemaker said.

“It was painful to me at the time: I was on the job for three weeks when an app got approved that shouldn’t have,” he continued. “I got into my office, and the first phone call I received was a phone call from Steve; it said ‘from the office of Steve Jobs.’ I started sweating, and I picked up the phone. Steve got on the phone and said one thing. He said, ‘You’re stupid, and you hire stupid people.’ And he hung up the phone, and I was scared.”

“I didn’t know the app he was talking about, but we changed the process to ensure that things like this would never happen again,” Shoemaker said. “It made me really understand more about him and his leadership techniques.”

“Steve often did a lot of things by emotion, but there was something, always something behind it, some logical reason behind it. And Steve wanted to get his point across in the fastest, most succinct manner possible. When we later worked on the guidelines together, I saw this in his writing and what he wanted to say on the guidelines as a way to make it as clear as possible in as few words as possible.”

The three goals

What had the original App Store guidelines been like?

“Over the years, we defined a lot of guidelines for Apple. In the beginning, there were really only about six things that Steve put on a whiteboard. And there was one of them called ‘unforeseen,’” Shoemaker said.

He explained that “unforeseen” referred to undesirable apps, such as scam apps that steal money from users. To keep them out of the App Store, Shoemaker and his colleagues decided to create guidelines targeted primarily toward app developers.

According to Shoemaker, Apple has three main objectives, and they created the guidelines based on these.

“One is their brand. They don’t want anything to go on the store that’s going to put a bad mark against their name, against the big brand of Apple,” he explained.

“The second thing is they don’t want to hurt their customers. They don’t want to put anything in the App Store that could potentially harm their customers.”

Shoemaker gave examples of apps that could cause harm, such as apps that could only be used while driving and game apps that required users to enter personal information such as their social security number. There were also apps created to look like games, with operators masquerading as users and repeatedly paying fees; this was a money laundering technique that used the App Store.

“And then the third is Apple just wants to make sure they get their cut of the money. If you’re paying for something in the app, they want their 30%. And a majority of the guidelines were built around those three objectives,” Shoemaker said.

(Illustration by qnel)

Apps reviewed by employees in 13 minutes

Based on the guidelines, what kind of criteria are checked in the review and approval process?

“I would say there’s a variety of problems with the App Store,” Shoemaker responded.

The first problem Shoemaker mentioned was that all reviewing is done manually by employees. No AI or machine learning is incorporated into the approval process for apps listed in the App Store.

“What I’m telling you is how we built it. It might not be completely the same today, but the idea is that there are certain teams within Apple that review apps. Now we have an entire app review team. I believe it’s now over 450 people. These are reviewers that are focused on just looking at apps.”

The team reviews approximately 100,000 apps per week. Each app takes about 13 minutes to review. Updates that modify existing apps take about 6 minutes.
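As a rough sanity check of these figures, here is a back-of-envelope calculation. The review counts and per-review times come from the article; the 450-person team size is Shoemaker’s estimate, and the 40-hour work week used for comparison is an assumption of mine, not something the article states.

```python
# Back-of-envelope check of the App Store review workload described above.
# Figures from the article: ~100,000 reviews/week, ~13 min per new app,
# ~6 min per update, ~450 reviewers. A 40-hour work week is assumed here.

REVIEWS_PER_WEEK = 100_000
MINUTES_NEW_APP = 13
MINUTES_UPDATE = 6
REVIEWERS = 450

def weekly_hours_per_reviewer(minutes_per_review: float) -> float:
    """Hours each reviewer would work if every review took this long."""
    total_minutes = REVIEWS_PER_WEEK * minutes_per_review
    return total_minutes / REVIEWERS / 60

print(f"If every review took {MINUTES_NEW_APP} min: "
      f"{weekly_hours_per_reviewer(MINUTES_NEW_APP):.1f} h per reviewer per week")
print(f"If every review took {MINUTES_UPDATE} min: "
      f"{weekly_hours_per_reviewer(MINUTES_UPDATE):.1f} h per reviewer per week")
```

Under these assumptions, if every submission were a full 13-minute review, each reviewer would need roughly 48 hours a week; at 6 minutes per review, about 22 hours. The real workload presumably falls between the two, since most submissions are updates rather than new apps.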

“They pull it down onto their device, and they go through them one by one, and they go through a variety of processes,” Shoemaker said.

“The first one is they look at all of the text that the developer submitted that advertises them in the App Store: the marketing text, the screenshots, how the age rating is, etc. They look at all of these things, and then they launch the app on their device, and then they just walk through the app as though they were any normal customer.”

Can’t check each “coin locker”

The review team assigns its members based on the country where a given app will be used. For example, if an app is being reviewed for use by Japanese users, a staff member who understands Japanese will conduct the review based on Japanese law.

That means Album Collection passed this process too.

In reality, however, child sexual abuse material and hidden camera images were traded on the app. If Apple had seen these, Album Collection would have been immediately removed from the App Store.

Apple may have approved Album Collection because of the app’s mechanism. Album Collection users cannot view illegal images just by downloading the app. They first have to enter passwords to access illegal images other users have posted.

“There are certain apps that come with no content. And for them, they just try to add content, to see if it works. And if all the features work, they move on,” Shoemaker said.

“When an app comes that you can go to different repositories or different locations to look at content, and that content might be under lock and key, Apple would have no ability to review that content,” he added.

It’s like coin lockers for luggage storage. No matter how thoroughly one may check a new coin locker, one won’t find anything illegal. However, once it’s installed, people may use it for illegal purposes.

Indeed, it’s impossible to check each piece of content included in an app, and doing so would breach user privacy. However, this is exactly the loophole being exploited to make money off of digital sexual abuse.

Ignoring user reviews

Even without looking at all the content of an app, there must be other ways to spot problems, including through user reviews posted to the App Store.

Each app’s page in the App Store contains a review section where users can rate and comment on the app, and which people considering downloading it can refer to. Reviews of Album Collection and similar apps indicated that they were being used to trade illegal images.

We asked Shoemaker whether Apple reads these reviews.

“The reviews on the App Store page are largely ignored by Apple, unfortunately,” he responded. “Now, one of the things that Apple could do is that, when I was there, there was a mission for us to be able to look and see if there are any words like ‘It’s crashing’ or ‘The app doesn’t function,’ ‘It doesn’t launch,’ ‘It crashes on launch,’ things like that.”

Apple’s stance was to address functional issues but not to worry about anything else.

“But if there are people saying there’s inappropriate content [related to] children or exploitation or anything — like pornography — objectionable, Apple should absolutely be looking at those. But like I said, they are largely ignored,” Shoemaker continued.

“There are all kinds of things in the comments that could make Apple get in trouble in certain jurisdictions. And to me, it’s mind-boggling that they don’t look at that.”

Why wasn’t Apple looking at important reviews like these?

“If the question is why weren’t they, that’s a very hard question to answer. I don’t have a good answer for that. It was frustrating when I was there that people just didn’t seem to care about the reviews that were appearing there,” Shoemaker said.

However, Shoemaker then began speaking about another important reason.

To be continued.

(Originally published on July 25, 2024.)
