Apr 19, 2014

Apple Removes Photo App Because Nude Photos Were Too Easy to Find

We've written here before about Apple's autocratic control of its app store, which has resulted in many questionable removals. To date, Apple has blocked a dictionary (because it contained profanity), a Project Gutenberg scan of The Kama Sutra (ancient sexytime), any app that connects to Dropbox (because... Dropbox?), an educational game based on the current war in Syria (too topical?), a DUI checkpoint location app (political pressure), an app that allowed a mute 4-year-old girl to communicate with her parents (patents!), as well as many apps that were potentially competitive with its home-grown software.

Now, Apple has pulled apps related to the 500px photography network, citing the "easy" availability of nude photos.

Apple has pulled the apps from photography network 500px from its App Store because, after 16 months of use, their clearly marked nude photo galleries suddenly became intolerable.

In addition to 500px's own app, the third-party app ISO500 (whose parent company 500px acquired because of ISO500's excellent integration) has also received notice that it will be removed from the App Store shortly, for the same reason.

500px was notified by Apple that its software update made it "too easy" to find nude images via the integrated search. Initially, Apple said the app wouldn't be pulled, just reverted to an earlier version. An hour later, it apparently decided that potentially searchable nudes were too much of a threat to its walled ecosystem and yanked the apps entirely. This seems a little harsh, considering the precautions taken by the software.

Just as it always has, the app defaults to a Safe Search mode that excludes nudity, and you have to log in on the desktop version to change that.

Apple has responded to the strange and sudden removal of 500px's software with the following statement:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We've asked the developer to put safeguards in place to prevent pornographic images and material in their app.

Child pornography accusations notwithstanding (more on that in a moment), Apple's stance on nudity (or "pornography," as it prefers to term it) in its "curated" apps is ridiculous. 500px took steps to prevent just anybody from picking up the app and loading up on explicit images. But whether or not 500px made access to explicit images easier ultimately makes no difference. If it's nudity Apple wants to be rid of, it's going to need to shut down a whole lot of software.

But here's the thing: Flipboard integrates completely with 500px as well. Everything you can do on 500px's app, you can do on Flipboard. Is Apple going to pull Flipboard as well? What about Tumblr, Instagram and all browsers - including Apple's own Safari? You can get to nude images with them pretty easily, too.

And the child pornography concerns? 500px claims it was never informed about any alleged child porn -- not by Apple and not by its users.

500px co-founder Evgeny Tchebotarev has responded to us, saying that 500px was not told about the child pornography complaints and that Apple had not mentioned any issues around nudity until a phone call yesterday. "We've never ever, since the beginning of the company, received a single complaint about child pornography. If something like that ever happened, it would be reported right away to enforcement agencies."

Apple is prone to overreaction when apps are criticized or accused of possible moral/legal issues (see opening paragraph). Rather than contact 500px and have them investigate the offending account(s)/images, it simply dumped the app. On top of that, it didn't even bother to tell the involved parties about this accusation. Apparently, developers should just find articles involving their embattled software and hit F5 until Apple's official explanation is appended to the post.

Yes, child pornography is serious and should be dealt with expeditiously, but knocking apps out of the market on unverified claims is ridiculous. As was pointed out, if a user wants to access child porn, he's got plenty of options, all contained within Apple's approved apps, including its own software. It's a shoot-first approach that does a lot of damage to Apple's relationships with its developers -- but the collateral damage is apparently acceptable.

Apple's arbitrary decisions on apps like this and the ones listed above make it harder and harder to respect its position as a self-appointed moral guardian of all things i-related. Its lack of communication and questionable tactics seem to be the unfortunate byproduct of its position as the most desirable market.

Personally, I'm glad I have an Android. As an adult, I prefer to be given the options befitting an adult, rather than have my software choices limited by a corporation's belief that I shouldn't be trusted with anything "mature" on the off chance that a child might access it. The Android marketplace is full of questionable apps, malware and outright sleaze, but at least it assumes I can make my own decisions on what sort of content I want to view or interact with, rather than pre-screen everything like a helicopter parent rummaging through the Halloween "take," looking for anything with loose wrappers or heavily processed sugars.

For more news, visit techdirt.com

