Hello all,
I recently submitted my app to the App Store, but it was rejected due to the "design spam" guideline, stating that it duplicates content and functionality in the saturated astrology category.
My app primarily focuses on astrology but includes a unique AI-powered feature that I believe differentiates it from others in the category. Specifically, it leverages AI to provide highly personalized insights and interactive features that go beyond typical horoscope apps. It also has a social component that lets people communicate and discuss astrology.
How can I better communicate my app’s unique value to the App Review team?
Are there specific design or content tweaks that could help it stand out more clearly?
Has anyone successfully appealed a similar rejection? If so, what approach worked for you?
While reading the developer documentation article Adopting SwiftData for a Core Data App, one particular line piqued my interest.
For apps that evolve from a version that doesn’t have any app group container to a version that has one, SwiftData copies the existing store to the app group container.
Given how troublesome it has been to migrate the Core Data persistent store to an app group container, I decided to try this out myself. I created an Xcode project using the default Core Data template. I then added a few Item objects with timestamps. There, I had what we would consider a regular Core Data app.
I then created a widget extension for this app since this is one of the most common uses for adopting an app group in an Xcode project. After that, I linked the main target with the widget extension using an app group. In the widget extension, I tried to fetch the Item objects. I utilized the SwiftData code in the sample project associated with the article above.
import Foundation
import WidgetKit
import SwiftData

struct Provider: TimelineProvider {
    private let modelContainer: ModelContainer

    init() {
        let appGroupContainerID = "group.com.genebogdanovich.CoreDataSwiftDataAppGroup"
        guard let appGroupContainer = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: appGroupContainerID) else {
            fatalError("Shared file container could not be created.")
        }
        let url = appGroupContainer.appendingPathComponent("CoreDataSwiftDataAppGroup.sqlite")
        print("\(url)")

        do {
            modelContainer = try ModelContainer(for: Item.self, configurations: ModelConfiguration(url: url))
        } catch {
            fatalError("Failed to create the model container: \(error)")
        }
    }

    // placeholder(in:) and getSnapshot(in:completion:) omitted for brevity.

    func getTimeline(in context: Context, completion: @escaping (Timeline<Entry>) -> ()) {
        Task { @MainActor in
            let fetchDescriptor = FetchDescriptor<Item>()
            let items: [Item] = try! modelContainer.mainContext.fetch(fetchDescriptor)
            print(items)

            let entry = SimpleEntry(date: .now, emoji: "😀", count: items.count)
            let timeline = Timeline(entries: [entry], policy: .never)
            completion(timeline)
        }
    }
}
The fetch yielded no results. However, as I explored the app group directory in the file system, I found a .sqlite file. That is interesting because SwiftData creates .store files by default. So, I am guessing that SwiftData did copy something. Or the ModelContainer initializer just created another empty SQLite file since the fetch returned zero results.
I would highly appreciate someone elaborating on that quote from the documentation.
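In case it helps, my next experiment is to let SwiftData pick the store location itself instead of passing an explicit URL. This is only a sketch, built on my assumption (not confirmed by the docs) that the copy is triggered when the configuration uses a group container:

import SwiftData

// Main app target: let SwiftData resolve the store location on its own.
// .automatic uses the app group container when the target has one.
let configuration = ModelConfiguration(groupContainer: .automatic)

do {
    // Creating the container is presumably what triggers the documented copy.
    _ = try ModelContainer(for: Item.self, configurations: configuration)
    // If the copy happened, this URL should point inside the app group container.
    print(configuration.url)
} catch {
    fatalError("Failed to create the model container: \(error)")
}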
FYI.
The source code of the FindSurface demo app for Apple Vision Pro (visionOS) is available now.
The Swift package of FindSurface™ library is required to build the source code into the demo app.
https://github.com/CurvSurf/FindSurface-visionOS
After starting the app, the floating panels (below) will appear on your right side, and you will see wireframe meshes that approximately describe your environment. Performing a spatial tap (pinching with your thumb and index finger) while gazing at a location on the meshes will invoke FindSurface, with an indicator (blue disk) appearing on the surface you gazed at.
Voice commands:
“Tap” – Spatial tap (gazing & pinching). Invoke FindSurface.
“Tap plane” – Plane selection.
“Tap sphere” or “Tap ball” – Sphere selection.
“Tap cylinder” – Cylinder selection.
“Tap cone” – Cone selection.
“Tap torus” or “Tap donut” – Torus selection.
“Tap accuracy” or “Tap measurement accuracy” – Accuracy selection.
“Tap mean distance”, “Tap average distance”, or “Tap distance” – Avg. Distance selection.
“Tap touch radius” or “Tap seed radius” – Touch Radius selection.
“Tap Inlier” – “Show inlier points” toggle.
“Tap outline” – “Show geometry outline” toggle.
“Tap clear” – “Clear Scene” click.
I am currently working on implementing the Live Caller ID Extension for my iOS app, and I understand that a backend server is required for this functionality. While I've gone through Apple's documentation, the details on the backend setup are limited and not clear enough for my backend team to implement it effectively.
Could someone provide a more detailed explanation or sample implementation of the backend server required for this extension? Specifically, we are looking for:
A clear understanding of the APIs and endpoints the backend needs to expose.
Any authentication mechanisms required for communication with the extension.
Data format (e.g., JSON structure) for requests and responses.
Example code or additional resources, if available.
Any help or guidance in understanding the exact backend requirements would be greatly appreciated.
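In the meantime, the closest thing to a reference implementation I've found is Apple's example backend project (published on Apple's GitHub as "live-caller-id-lookup-example", alongside their homomorphic-encryption packages); the PIR endpoints, token issuance, and JSON formats are defined there. On the app side, the extension itself only declares where that backend lives. A minimal sketch, going from memory on the exact framework and type names, with placeholder URLs and token:

import IdentityLookup

@main
struct CallLookupExtension: LiveCallerIDLookupExtension {
    var context: LiveCallerIDLookupExtensionContext {
        LiveCallerIDLookupExtensionContext(
            // Base URL of the PIR lookup service your backend exposes (placeholder).
            serviceURL: URL(string: "https://lookup.example.com")!,
            // Token issuer the system uses to authenticate requests (placeholder).
            tokenIssuerURL: URL(string: "https://lookup.example.com/issue")!,
            // Opaque token identifying the user's tier on your service (placeholder).
            userTierToken: Data(base64Encoded: "AAAA")!
        )
    }
}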
It's been more than 4 days since I created my developer account, and my payment was successful, but I still haven't received any confirmation that enrollment is complete. How long does it usually take?
Hi,
I have built an app for my company that lets users purchase our food bars; we then deliver what they purchase to people in conflict-affected or disaster-affected areas. The purchase is done via Stripe and Apple Pay.
The food bars are made of spirulina and are high in energy and protein. Currently this is available via our web platform.
My app was rejected as my company is not a recognized non profit. I would like to know how I can get my app approved.
The only viable option App Review gave me was to move my payment screen to a web platform as donations.
However, if the issue is that the app makes users believe they are donating, would changing the wording, images, and structure of the app to make clear that users are purchasing the food bars (and that we are only delivering them) help the app get approved by App Store review?
Hi. I have reviewed the process of integrating Apple Pay on the web, but I still don’t understand how to implement it. For example: I currently have software A and a payment website that my software provides to restaurants. So, how can I integrate Apple Pay on the restaurants' payment websites?
I read that to integrate, we need to register for a Merchant ID with Apple Pay. So, is it the restaurants or the software provider who should register?
Each restaurant will have a different website domain, so does that mean that when registering the Merchant ID, the registered domain is the payment website of each restaurant?
When Apple Pay provides the verification file, the software provider (i.e., the payment website) must help each restaurant upload that file to its own payment website, right?
Whether it is valid is determined by Apple Pay, right? And if it is valid, the Apple Pay payment button will be displayed, correct?
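From what I understand of the flow (a sketch under my assumptions, not a confirmed implementation): when the browser fires onvalidatemerchant, the software provider's server, not the restaurant, requests a payment session from Apple using its own Merchant ID and the restaurant's verified domain. Roughly, with placeholder names and the mutual-TLS setup (the merchant identity certificate) omitted:

import Foundation

// Payload for requesting an Apple Pay payment session from Apple's servers.
// initiativeContext must be the verified domain the payment page is served
// from, which is why each restaurant's domain has to be registered.
struct PaymentSessionRequest: Codable {
    let merchantIdentifier: String
    let displayName: String
    let initiative: String
    let initiativeContext: String
}

func requestPaymentSession(validationURL: URL) async throws -> Data {
    var request = URLRequest(url: validationURL)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(PaymentSessionRequest(
        merchantIdentifier: "merchant.com.example.pos",   // the provider's Merchant ID (placeholder)
        displayName: "Example Restaurant",
        initiative: "web",
        initiativeContext: "restaurant-one.example.com"   // the restaurant's verified domain (placeholder)
    ))
    // NOTE: in a real server this request must present the Apple Pay merchant
    // identity certificate (mutual TLS), e.g. via a URLSession delegate.
    let (data, _) = try await URLSession.shared.data(for: request)
    return data // the merchant session object to pass back to completeMerchantValidation
}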
I have tried to register service workers, but they only work in the WebView. When I open an iframe from that WebView, they do not function. Below is my implementation. Is this because iOS does not support service workers in iframes? Please help me answer this.
self.addEventListener('install', (event) => {
  // Activate this service worker immediately instead of waiting
  // for existing clients to close.
  self.skipWaiting();
});

const putInCache = async (request, response) => {
  const cache = await caches.open('v1');
  await cache.put(request, response);
};

const customCache = async ({ request, preloadResponsePromise }) => {
  // Serve from the cache first.
  const cached = await caches.match(request);
  if (cached) {
    return cached;
  }
  // Fall back to the navigation preload response (if any) or the network,
  // caching a copy of whatever comes back.
  const preload = await preloadResponsePromise;
  const response = preload ?? (await fetch(request));
  await putInCache(request, response.clone());
  return response;
};

self.addEventListener('fetch', (event) => {
  event.respondWith(
    customCache({
      request: event.request,
      preloadResponsePromise: event.preloadResponse,
    }),
  );
});
Good morning,
I'm new to the Apple Developer Program and I'm working with someone who is not on the same continent as me. If I travel for a long time and have to renew my subscription ($99), is there a risk that my card from the new region will not work? If I subscribe with a card from one region, will I necessarily need a card from that region for renewal?
Thanks…
How can I determine whether a shortcut was triggered by Siri or by the user tapping it?
I found some snapshot APIs in the developer documentation, like the ones below:
RealityKit / Views and attachments / ARView / snapshot(saveToHDR:completion:)
SceneKit / SCNView / snapshot()
Is there a similar API in visionOS? And if not, how can I implement snapshots for RealityView and USDZ content?
I'm developing an app that plays a WAV file through the Lightning headphone adapter. When I connect the adapter, a prompt appears asking whether to select "Headphones" or "Other Device". What does this setting actually do? I've noticed that it affects the maximum amplitude (volume) of the WAV output. Could you explain the precise difference between these two modes?
Starting with iOS 18, the behavior of searchable and searchSuggestions differs from previous versions.
In iOS 17.5, searchSuggestions remained visible even after selecting an item and navigating away. However, in iOS 18, searchSuggestions are dismissed after navigation.
Is there a way to keep searchSuggestions visible after navigation, as in iOS 17.5?
import SwiftUI

struct ContentView: View {
    @State private var query = ""

    var body: some View {
        NavigationStack {
            Color.red
                .searchable(text: $query)
                .searchSuggestions {
                    NavigationLink("Element") {
                        Color.blue
                    }
                }
        }
    }
}
(Attachments show the behavior on iOS 18.1 vs. iOS 17.5.)
Issue faced on: 25 November, 9:23 AM.
When trying to reply to customer reviews through this endpoint: https://api.appstoreconnect.apple.com/v1/customerReviewResponses, we get a 500 UNEXPECTED_ERROR.
API Response:
{
    "status": "500",
    "code": "UNEXPECTED_ERROR",
    "title": "An unexpected error occurred.",
    "detail": "An unexpected error occurred on the server side. If this issue continues, contact us at https://developer.apple.com/contact/."
}
API Requestbody :
{
    "data": {
        "type": "customerReviewResponses",
        "attributes": {
            "responseBody": "Hi, thank you so much for your kind words and for sharing your positive experience! We're thrilled that you love the redeem points and their instant use. Keep exploring Tata Neu! _Meghal"
        },
        "relationships": {
            "review": {
                "data": {
                    "type": "customerReviews",
                    "id": "here is id of comment"
                }
            }
        }
    }
}
API Headers :
Authorization: Bearer {Token}
Content-Type: application/json
API URL :
https://api.appstoreconnect.apple.com/v1/customerReviewResponses
I am currently developing a game that runs on visionOS using RealityKit and Swift.
I have a question regarding particle emitters.
It seems that there is a sorting order (render queue) between particle emitters themselves, but there doesn’t appear to be a render queue between particle emitters and regular model entities.
If such a feature exists, could you please provide a simple example?
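In case it helps frame the question: the closest API I've found is ModelSortGroupComponent, which orders entities within a shared ModelSortGroup. A minimal sketch of how I'd expect it to work; whether it affects particle emitters at all is exactly what I'm unsure about:

import RealityKit

// Entities in the same sort group draw in ascending `order`.
let sortGroup = ModelSortGroup(depthPass: nil)

func applySortOrder(model: ModelEntity, particles: Entity) {
    // Draw the model first...
    model.components.set(ModelSortGroupComponent(group: sortGroup, order: 1))
    // ...then the particle emitter entity on top (assuming particles honor it).
    particles.components.set(ModelSortGroupComponent(group: sortGroup, order: 2))
}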
Thank you!
Hi
I just encountered a reachability detection problem when calling the SCNetworkReachabilityGetFlags function on iOS 16.
What I did:
on device iPhone 12, iOS 16.1.1, turn on Airplane Mode, call SCNetworkReachabilityGetFlags, got flags = kSCNetworkReachabilityFlagsTransientConnection | kSCNetworkReachabilityFlagsReachable
on device iPhone 7, iOS 14.5.1, turn on Airplane Mode, call SCNetworkReachabilityGetFlags, got flags = 0
What I expect:
I expect SCNetworkReachabilityGetFlags on my iOS 16.1 device to behave the same as on my iOS 14.5 device, returning flags = 0. Returning kSCNetworkReachabilityFlagsReachable in this case is inappropriate.
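For reference, this is roughly how I query the flags (the host name is just an example):

import SystemConfiguration

func currentReachabilityFlags() -> SCNetworkReachabilityFlags? {
    // Reachability target; any host name works for this test.
    guard let target = SCNetworkReachabilityCreateWithName(nil, "www.apple.com") else {
        return nil
    }
    var flags = SCNetworkReachabilityFlags()
    guard SCNetworkReachabilityGetFlags(target, &flags) else {
        return nil
    }
    return flags
}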
Thank you!
Hi,
I am trying to upload a Certificate Signing Request, but it's failing with this error:
CSR algorithm/size incorrect. Expected: RSA(2048)
My test was conducted by changing the pink dice to use a 16-bit UUID via the ASKSampleAccessory app. Afterwards, I ran the AccessorySetupKit picker in the ASKSample app, but the pink dice was not found.
We confirmed that the pink dice is discovered without issue in other apps.
Does AccessorySetupKit not support 16-bit UUIDs?
AccessorySetupKit Sample code : https://developer.apple.com/documentation/AccessorySetupKit/authorizing-a-bluetooth-accessory-to-share-a-dice-roll
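For reference, this is roughly how I configure the picker item (the 16-bit UUID string and image name are just examples):

import AccessorySetupKit
import CoreBluetooth
import UIKit

// Discovery descriptor using a 16-bit Bluetooth service UUID.
let descriptor = ASDiscoveryDescriptor()
descriptor.bluetoothServiceUUID = CBUUID(string: "FFF0") // example 16-bit UUID

let pinkDice = ASPickerDisplayItem(
    name: "Pink Dice",
    productImage: UIImage(named: "pink-dice")!,
    descriptor: descriptor
)

let session = ASAccessorySession()
session.activate(on: .main) { event in
    // Handle events such as .activated and .accessoryAdded here.
}
session.showPicker(for: [pinkDice]) { error in
    if let error { print("Picker error: \(error)") }
}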
We are reaching out to discuss an issue we have encountered with our app's activation process while running in the background. Currently, we are employing an iBeacon-based activation scheme, but we have noticed that after the app is activated, it is unable to receive UUID data from the scan-response while in the background.
We are considering the possibility of embedding the UUID data into the advertisement so that the app can receive it once activated by the iBeacon. Additionally, we are preparing to use both Core Bluetooth’s “Performing Long-Term Actions in the Background” feature and the iBeacon scheme simultaneously for app activation. We would like to know if these two methods can coexist without any mutual interference.
Currently, we are utilizing a method of updating the beacon advertisement after connection and disconnection to enhance the app's activation capability. However, in some scenarios where the signal is weak, the app does not detect the vehicle after being activated and will not be reactivated by the same beacon after going into sleep mode. Our current approach is to update the beacon advertisement every 10 seconds to improve this capability.
We have outlined our proposed changes and would appreciate your confirmation on whether they could lead to better optimization:
1. Embedding the UUID from the scan response into the advertisement.
2. Updating the iBeacon advertisement content every 10 seconds.
3. Simultaneously using Core Bluetooth's "Performing Long-Term Actions in the Background" feature along with the iBeacon scheme for app activation.
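For reference, a minimal sketch of how we plan to combine (3), with placeholder UUIDs and identifiers; this is our assumption of how the two mechanisms coexist, not a confirmed pattern:

import CoreLocation
import CoreBluetooth

final class ActivationManager: NSObject, CLLocationManagerDelegate, CBCentralManagerDelegate {
    private let locationManager = CLLocationManager()
    private var central: CBCentralManager!

    override init() {
        super.init()
        locationManager.delegate = self
        // iBeacon region monitoring wakes the app in the background.
        let region = CLBeaconRegion(
            uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!, // placeholder
            identifier: "vehicle-beacon"
        )
        locationManager.requestAlwaysAuthorization()
        locationManager.startMonitoring(for: region)
        // State restoration lets the system relaunch the app to continue
        // long-term Bluetooth actions.
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "vehicle-central"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Background scans must filter on specific service UUIDs (placeholder here).
        central.scanForPeripherals(withServices: [CBUUID(string: "180A")], options: nil)
    }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Called when the system relaunches the app for Bluetooth work.
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Beacon entry woke the app; scanning can start here as well.
    }
}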
Additionally, we would like to know if these changes could potentially cause any other issues.
Thank you for your assistance, and I look forward to your insights on this matter.
Hi, in visionOS 2, I wonder if there is a method to add/remove the referenceObjects of an ObjectTrackingProvider after the ARKitSession has started, without stopping it.
Right now I must stop the session and run it again, which interrupts the current tracking.
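For context, this is the restart I do today (a sketch; session is my ARKitSession and objects are the updated reference objects):

import ARKit

func restartTracking(session: ARKitSession, objects: [ReferenceObject]) async throws {
    // Stopping the session drops all current tracking state.
    session.stop()
    // Re-run with a new provider built from the updated reference objects.
    let provider = ObjectTrackingProvider(referenceObjects: objects)
    try await session.run([provider])
}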
Thanks.