Our children's educational app (COPPA-compliant) was rejected under Guidelines 5.1.1(i) and 5.1.2(i) for sharing personal data with a third-party AI service without clear disclosure and user permission.
How our app works:
Our app is an AI-powered learning assistant for kids. Children type questions (e.g., "Why is the sky blue?") and the app sends the question to Google
Gemini's API to generate an age-appropriate answer. This is the core and only purpose of the app — it's an AI chat app, similar to how a search engine
sends queries to its servers.
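Concretely, the round trip on our backend looks roughly like this (a sketch in TypeScript; the endpoint and request shape follow Gemini's public REST API, while the model choice and helper names like `buildGeminiRequest` are illustrative assumptions, not our production code):

```typescript
// Sketch of how a child's question reaches Gemini and comes back.
// Only the question (plus age, for tone) is sent; nothing is stored.

// Gemini's generateContent endpoint takes a JSON body of "contents"
// parts; a system instruction keeps answers age-appropriate.
export function buildGeminiRequest(question: string, childAge: number) {
  return {
    systemInstruction: {
      parts: [{ text: `Answer for a ${childAge}-year-old, in simple, friendly language.` }],
    },
    contents: [{ role: "user", parts: [{ text: question }] }],
  };
}

// Forward the question once and return the generated answer.
export async function askGemini(question: string, childAge: number, apiKey: string): Promise<string> {
  const url =
    "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent" +
    `?key=${apiKey}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeminiRequest(question, childAge)),
  });
  const data: any = await res.json();
  return data.candidates[0].content.parts[0].text;
}
```

The point is that the request is transient: the question goes out, the answer comes back, and no copy is retained on either side.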
Our current setup:
- Google Gemini operates as a data processor (not a data recipient) — zero data retention, no model training on user data
- Our privacy policy already discloses Google Gemini as the AI provider, what data is processed, and that no data is stored
- The app is clearly marketed as an AI-powered assistant — users understand they are interacting with AI
Our questions:
1. Infrastructure vs. data sharing: We use Google Gemini to process queries the same way we use Google Sign-In for authentication, MongoDB Atlas for our database, and Railway for hosting. In all cases, user data passes through a third-party service to provide core functionality. Are AI services expected to require additional consent beyond what other third-party infrastructure services require? If so, what distinguishes them?

2. Minimum consent implementation: If in-app consent is required, what constitutes sufficient "explicit permission"? Specifically:
   - Is a simple alert dialog (similar to the ATT prompt) with "Allow" / "Not Now" before first use sufficient?
   - Or is a more detailed consent screen with checkboxes/toggles required?
   - Since our app's sole purpose is AI-powered Q&A, what should happen if the user taps "Not Now"? The app cannot function without the AI service.

3. Privacy policy disclosure: Our privacy policy already identifies Google Gemini by name, describes what data is sent (the child's questions, name, and age for personalization), and explains Google's zero-retention policy. Is updating the privacy policy alone sufficient, or is a separate in-app consent mechanism always required under 5.1.2(i)?

4. Children's apps specifically: Since parents set up the app (behind a parental gate), should the consent prompt be presented to the parent during setup, or does it need to appear elsewhere?
Any guidance on the minimum compliant implementation would be greatly appreciated. We want to get this right.
Thank you.