Hello,
I submitted an AI-powered fashion try-on app to the App Store, but it was rejected under Guideline 1.1 - Safety - Objectionable Content. I would appreciate any advice on how to improve the app for review.
The app creates a virtual model based on the user’s own face photo and body information, then allows the user to simulate outfit try-ons and coordinate styling.
Apple pointed out the following:
“Specifically, simulating outfit try-ons and styling.”
In the app, we clearly warn users to upload only their own photos, and we prohibit photos of other people, celebrities, or minors, as well as any inappropriate images. The app is designed to use only the front-facing camera, and our Terms of Use and Privacy Policy explain how photos are used.
However, it seems that Apple may treat AI-based try-on and styling applied to images of real people as a potential Guideline 1.1 risk in itself, regardless of these safeguards.
My questions are:
- Are AI-powered, person-based virtual try-on features currently reviewed very strictly under App Store Guideline 1.1?
- Are warnings and “own photo only” rules generally not enough?
- Would adding image moderation, a reporting feature, and human review improve the chances of approval?
- For the initial release, would it be more realistic to remove person-based try-on features and limit the app to digital closet management and AI outfit suggestions?
- What kind of explanation should be included in the Review Notes?
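On the image-moderation point above, one client-side option (in addition to server-side moderation and human review) is Apple's SensitiveContentAnalysis framework on iOS 17+. This is only a hedged sketch, not a claim that it satisfies App Review: the `shouldBlockUpload` helper name is hypothetical, the framework requires the `com.apple.developer.sensitivecontentanalysis.client` entitlement, and analysis is unavailable unless the user has enabled Sensitive Content Warning on the device.

```swift
import SensitiveContentAnalysis
import CoreGraphics

// Hypothetical helper: decide whether a user-supplied photo should be
// blocked before it is sent to the try-on pipeline.
// Assumes iOS 17+ and the SensitiveContentAnalysis client entitlement.
@available(iOS 17.0, *)
func shouldBlockUpload(_ image: CGImage) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user has not enabled Sensitive Content Warning, the policy
    // is .disabled and on-device analysis is unavailable; fall back to
    // your server-side moderation in that case.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(image)
        return analysis.isSensitive
    } catch {
        // Fail closed: if analysis errors out, route the image to
        // human review rather than accepting it automatically.
        return true
    }
}
```

Even if this check passes, it only covers nudity detection; it does not verify that the photo shows the account holder, so it would complement (not replace) the "own photo only" rule and a reporting feature.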
This app is not intended for sexual image editing or any inappropriate manipulation of images of people. It is designed solely to help users choose outfits for themselves.
If anyone has experience with App Review for AI image generation apps, virtual try-on apps, or apps that handle person images, I would really appreciate your advice.
Thank you.