How to deal with users potentially uploading nudity/offensive material to app?

Rather than asking how to "prevent" users from uploading nudity/offensive material to your app where other users can potentially view it, I would like to ask what Apple recommends we do - because you are never going to be able to prevent it 100% unless you have the resources of a huge app company.


I know that on the App Store you must have a way for users to report photos, and that after some threshold is reached the photo/video should be removed, but I was hoping someone could give me more information on this.


Are there any articles or links from Apple that explain in more detail how an app should deal with its users publicly displaying photos/videos in poor taste?


Thanks.

Comes up often enough here, I think. Example: https://forums.developer.apple.com/message/176227#176227


Just note that Apple tends to tell you that you should do it (moderation), not how to do it. Those details are on you, based on budget, skills, type of content, user load, etc.
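For example, the "report + threshold" mechanism the question describes often takes a shape like the minimal sketch below. The names, the in-memory storage, and the threshold of 3 are all hypothetical choices, not anything Apple prescribes:

import Foundation

// Hypothetical model of a user-posted photo with report tracking.
struct ModeratedPhoto {
    let id: UUID
    let ownerID: String
    private(set) var reportCount = 0
    private(set) var isHidden = false

    // Hide the photo from other users once enough reports accumulate,
    // pending review by a human moderator.
    mutating func report(threshold: Int = 3) {
        reportCount += 1
        if reportCount >= threshold {
            isHidden = true
        }
    }
}

var photo = ModeratedPhoto(id: UUID(), ownerID: "user42")
photo.report()
photo.report()
photo.report()           // third report crosses the threshold
print(photo.isHidden)    // true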

Here's what it says in the guidelines:


1.2 User Generated Content

Apps with user-generated content present particular challenges, ranging from intellectual property infringement to anonymous bullying. To prevent abuse, apps with user-generated content or social networking services must include:

  • A method for filtering objectionable material from being posted to the app
  • A mechanism to report offensive content and timely responses to concerns
  • The ability to block abusive users from the service
  • Published contact information so users can easily reach you

Apps with user-generated content or services that end up being used primarily for pornographic content, objectification of real people (e.g. “hot-or-not” voting), making physical threats, or bullying do not belong on the App Store and may be removed without notice. If your app includes user-generated content from a web-based service, it may display incidental mature “NSFW” content, provided that the content is hidden by default and only displayed when the user turns it on via your website.
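If you do want an automated first pass at the "filtering" bullet, one option (my suggestion, not something the guideline names) is Apple's SensitiveContentAnalysis framework on iOS 17 and later, which can flag likely nudity on-device. It requires the com.apple.developer.sensitivecontentanalysis.client entitlement, and the analyzer only runs when the user has Sensitive Content Warning enabled, so treat it as one layer rather than a complete answer. A minimal sketch:

import Foundation
import SensitiveContentAnalysis

// Hypothetical pre-upload check using SensitiveContentAnalysis (iOS 17+).
// If the user hasn't enabled Sensitive Content Warning, the policy is
// .disabled and nothing gets analyzed.
@available(iOS 17.0, *)
func isLikelySensitive(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitiveContentAnalyzer()
    guard analyzer.analysisPolicy != .disabled else { return false }
    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Failing open here; a stricter app might fail closed instead.
        return false
    }
}

An on-device classifier like this can only reduce the volume reaching your moderators, not replace them.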

In practice, Apple has not insisted on a "method for filtering objectionable material from being posted to the app". They have required the other bullet points, and they have required a method whereby the developer can remove objectionable material after it is reported and confirmed as objectionable.
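As a sketch of that report-then-confirm-then-remove flow, including the "block abusive users" bullet (all names hypothetical; the actual backend is out of scope here):

import Foundation

// Hypothetical moderation flow: content stays visible until a report
// is confirmed by the developer, then it is removed and the poster
// can optionally be blocked.
enum ModerationDecision {
    case dismiss                      // report not upheld
    case removeContent
    case removeContentAndBlockUser
}

struct ModerationQueue {
    private(set) var removedPhotoIDs: Set<UUID> = []
    private(set) var blockedUsers: Set<String> = []

    mutating func resolve(photoID: UUID, ownerID: String, decision: ModerationDecision) {
        switch decision {
        case .dismiss:
            return
        case .removeContentAndBlockUser:
            blockedUsers.insert(ownerID)
            removedPhotoIDs.insert(photoID)
        case .removeContent:
            removedPhotoIDs.insert(photoID)
        }
    }
}

var queue = ModerationQueue()
queue.resolve(photoID: UUID(), ownerID: "user123", decision: .removeContentAndBlockUser)
print(queue.blockedUsers.contains("user123"))   // true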


Why can't you just stick an adult label on it, or say no one under 18?


Some of us are adults; not everyone is a prude.

They have no way to verify that a device is in the hands of an adult.
