Hi,
we've developed an app for Vision Pro that uses the GroupActivities SDK to provide shared experiences for our users.
Remote Participation works great, but we can't get nearby sharing to work.
The behaviour we're observing:
User 1 opens the share sheet from the volume; the second Vision Pro is visible.
User 1 starts nearby sharing
Session initialisation runs for approx. 30 seconds, then fails
Sometimes, the nearby participant doesn't show up at all after the initialisation has failed once.
As stated in the "Configure your visionOS app for sharing with people nearby" article, no changes to our implementation should be required, so we didn't make any specifically to support nearby sharing.
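For reference, our activation follows the standard GroupActivities pattern; here is a minimal sketch (the activity type and metadata are illustrative placeholders, not our actual implementation):

import GroupActivities

// Illustrative only: a stand-in for our real activity type.
struct SharedVolumeActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Shared Volume"
        meta.type = .generic
        return meta
    }
}

// Activation, triggered from the volume's share control.
func startSharing() async {
    do {
        _ = try await SharedVolumeActivity().activate()
    } catch {
        print("Activation failed: \(error)")
    }
}

// Joining incoming sessions.
func observeSessions() async {
    for await session in SharedVolumeActivity.sessions() {
        session.join()
    }
}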
Any help would be greatly appreciated.
Kind regards,
David
When I use BGContinuedProcessingTask to submit a task, my iPhone 12 immediately shows a notification banner displaying the task’s progress.
However, on my iPhone 15 Pro Max, there’s no response — the progress UI only appears in the Dynamic Island after I background the app.
Why is there a difference in behavior between these two devices?
Is it possible to control the UI so that the progress indicator only appears when the app moves to the background?
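For reference, this is roughly how I submit the task (a sketch written from memory, so the exact initializer labels may differ from my real code; the identifier and strings are placeholders):

import BackgroundTasks

func submitTask() {
    // Assumes the identifier is declared under BGTaskSchedulerPermittedIdentifiers
    // and a launch handler was registered with BGTaskScheduler at app launch.
    let request = BGContinuedProcessingTaskRequest(
        identifier: "com.example.app.export",
        title: "Exporting video",
        subtitle: "Preparing your export"
    )
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Submit failed: \(error)")
    }
}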
Hello,
If you add a ManipulationComponent to a RealityKit entity and then keep adding entities, sooner or later you will encounter a crash with the following error message:
Attempting to move entity “%s” (%p) under “%s” (%p), but the new parent entity is currently being removed. Changing the parent/child entities of an entity in an event handler while that entity is already being reassigned is not supported.
CoreSimulator 1048 – Device: Apple Vision Pro 4K (B87DD32A-E862-4791-8B71-92E50CE6EC06) – Runtime: visionOS 26.0 (23M336) – Device Type: Apple Vision Pro
The problem occurs precisely with this code:
ManipulationComponent.configureEntity(object)
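For context, the surrounding code is essentially the following (a simplified sketch; names are illustrative and the full code is in the linked repo):

import RealityKit

// Simplified sketch of the placement path that eventually crashes.
func place(_ object: Entity, under root: Entity) {
    root.addChild(object)
    // configureEntity adds the components manipulation needs (collision, input target, ...).
    ManipulationComponent.configureEntity(object)
}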
I adapted Apple's ObjectPlacementExample and made the changes available via GitHub.
The desired behavior is that I can add entities with a ManipulationComponent and RealityKit then runs stably and does not crash randomly.
GitHub Repo
Thanks
Andre
Apple review says my app displayed an error when they attempted to purchase subscriptions, and asks me to review the details and resources below and complete the next steps.
Device type: iPad Air (5th generation)
OS version: iPadOS 26.0.1
Next Steps
When validating receipts on your server, your server needs to be able to handle a production-signed app getting its receipts from Apple’s test environment. The recommended approach is for your production server to always validate receipts against the production App Store first. If validation fails with the error code "Sandbox receipt used in production," you should validate against the test environment instead.
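For context, my understanding of the flow described above is roughly the following (a sketch using the legacy verifyReceipt endpoint; status 21007 is the "Sandbox receipt used in production" code):

import Foundation

// Sketch: validate against production first, fall back to sandbox on status 21007.
func verifyReceipt(_ receiptData: Data, sharedSecret: String) async throws -> [String: Any] {
    func post(to url: URL) async throws -> [String: Any] {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.httpBody = try JSONSerialization.data(withJSONObject: [
            "receipt-data": receiptData.base64EncodedString(),
            "password": sharedSecret
        ])
        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONSerialization.jsonObject(with: data) as? [String: Any] ?? [:]
    }

    let production = URL(string: "https://buy.itunes.apple.com/verifyReceipt")!
    let sandbox = URL(string: "https://sandbox.itunes.apple.com/verifyReceipt")!

    let result = try await post(to: production)
    if result["status"] as? Int == 21007 {
        // Production-signed build (e.g. App Review) talking to Apple's test environment.
        return try await post(to: sandbox)
    }
    return result
}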
Question: Is this due to the device the reviewer used, or is it really caused by my code? My code relies on Apple's infrastructure for purchases and everything related.
Initially I had a subscription-reporting API on my server for receipt handling. When I went through it with ChatGPT, it suggested the issue was a half-baked subscription module on my server. So I decided not to send anything subscription-related to the backend; now it's Apple only and handled on the app side.
Is that the correct fix, or do I need to fix the backend even though I no longer use it?
My team tested in the sandbox environment via internal testing and we had no issues at that time. Everything was tested on mobile devices, which is why I still want to be sure whether these errors are due to the devices or not.
The screenshot shared by the Apple team shows they got an error popup saying "Something went wrong: Unable to complete request." I am trying to reproduce it in development but can't.
If anyone has run into the same issue before, any information on how to resolve and test for it would be helpful.
Thanks
Shikhar Sahu
Successfully received submission history.
history
......
--------------------------------------------------
createdDate: 2025-10-19T18:34:47.472Z
id: d3248896-7841-421e-9470-101df9d0da21
name: ...
status: In Progress
--------------------------------------------------
createdDate: 2025-10-19T18:12:45.325Z
id: e5822fa0-5bcf-4610-81fc-9f541e8ad189
name: ...
status: In Progress
I found that third-party apps like ChatGPT and Shadowrocket can use widgets in visionOS 26. How is this achieved? How can I also enable widgets for apps I develop?
My external device broadcasts its own fixed Wi-Fi network. When I connect to this Wi-Fi using my iPhone 17 Pro Max (iOS 26.0.1) and my app tries to establish a connection using the following method, the call returns -1:
int connect(int, const struct sockaddr *, socklen_t) __DARWIN_ALIAS_C(connect);
However, when I use other phones, such as iPhone 12, iPhone 8, iPhone 11, etc., to connect to this external device, the above method always returns successfully, with the parameters passed to the method remaining the same.
I also tried resetting the network settings on the iPhone 17 Pro Max (iOS version 26.0.1), but it still cannot establish a connection.
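For reference, the connection code looks roughly like this (a sketch with placeholder address and port; logging errno right after the failure is how I'm trying to narrow down the cause):

import Darwin

// Placeholder address/port; the real values come from the external device.
var addr = sockaddr_in()
addr.sin_family = sa_family_t(AF_INET)
addr.sin_port = in_port_t(UInt16(8080).bigEndian)
addr.sin_addr.s_addr = inet_addr("192.168.4.1")

let fd = socket(AF_INET, SOCK_STREAM, 0)
let result = withUnsafePointer(to: &addr) { ptr in
    ptr.withMemoryRebound(to: sockaddr.self, capacity: 1) { sa in
        connect(fd, sa, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
}
if result == -1 {
    // errno distinguishes e.g. EHOSTUNREACH, ECONNREFUSED, EPERM, ETIMEDOUT.
    print("connect failed, errno=\(errno): \(String(cString: strerror(errno)))")
}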
Topic:
App & System Services
SubTopic:
Networking
In Reality Composer Pro, why is the Sky Sphere so much larger than the Sky Dome?
By my estimate, the Sky Sphere has a radius of 100 m, while the Sky Dome has a radius of only 12 m.
I hate to be the new guy, but this is our first venture into distributing an app through Apple. Our app will be finished by the end of the year. I've heard you can get pre-approval, but I'm not really sure where to turn to get the process started, either for that or for full approval once it's done. Can anyone help point me in the right direction?
Thank you.
Topic:
App Store Distribution & Marketing
SubTopic:
General
Hi, I've been running into an issue with the Xcode intelligence assistant in the sidebar. I am hitting limits very fast despite having both Claude and ChatGPT Plus. I've tried reauthenticating several times and turning Apple Intelligence off and on; nothing seems to work.
Topic:
Developer Tools & Services
SubTopic:
Xcode
I am building a 360° photo viewer in visionOS 26. It allows the user to choose a 2:1 JPG and then renders it on a sphere mesh entity, and I load the texture with TextureResource(contentsOf: url, options: options).
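The loading path looks roughly like this (a simplified sketch; the sphere setup and names are not my exact code):

import RealityKit

// Sketch: load the equirectangular image and apply it to the sphere.
func loadPanorama(from url: URL, into sphere: ModelEntity) async throws {
    var options = TextureResource.CreateOptions(semantic: .color)
    options.mipmapsMode = .none   // or .allocateAndGenerateAll

    let texture = try await TextureResource(contentsOf: url, options: options)

    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    sphere.model?.materials = [material]
}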
I noticed two different behaviors here depending on the mipmaps option.
When setting "mipmapsMode: .none":
The graphic quality within the "gaze area" looks sharp and clear
The two poles (top and bottom) are perfectly rendered
Massive shimmer around the "gaze area"
When setting "mipmapsMode: .allocateAndGenerateAll":
The graphic looks slightly blurrier than in ".none" within the "gaze area"
The two poles are very blurry and hard to recognize the texture
Much less shimmer around the "gaze area"
My question would be: Is there a way to have the perfect graphic quality in ".none" without the massive shimmer?
Thank you!
Screenshots:
mipmapsMode: .none
mipmapsMode: .allocateAndGenerateAll
I developed an Xcode project using Flutter (v3.35.6).
The application basically has an IntentExtension to handle intent donation and the related business logic. We decided to go with the ShortcutsExtension in place of AppIntents because it fits our app's use case (we basically need to dynamically donate/remove intents).
We have an issue building the project, and it is due to the presence of the IntentExtension .appex file in Build Phases → Embed Foundation Extensions.
If we remove it, the project builds, but the intent handling is not invoked in the Shortcuts app.
Build issue:
Generated error in the console:
Cycle inside Runner; building could produce unreliable results.
Cycle details:
→ Target 'Runner' has copy command from '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/ShortcutsExtension.appex' to '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex'
○ That command depends on command in Target 'Runner': script phase “[CP] Copy Pods Resources”
○ That command depends on command in Target 'Runner': script phase “[CP] Embed Pods Frameworks”
○ That command depends on command in Target 'Runner': script phase “FlutterFire: "flutterfire upload-crashlytics-symbols"”
○ That command depends on command in Target 'Runner': script phase “FlutterFire: "flutterfire bundle-service-file"”
○ That command depends on command in Target 'Runner': script phase “Thin Binary”
○ Target 'Runner' has process command with output '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/Info.plist'
○ Target 'Runner' has copy command from '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/ShortcutsExtension.appex' to '/Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex'
Raw dependency cycle trace:
target: ->
node: ->
command: ->
node: /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Intermediates.noindex/Runner.build/Release-quality-iphoneos/Runner.build/Objects-normal/arm64/ExtractedAppShortcutsMetadata.stringsdata ->
command: P0:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:ExtractAppIntentsMetadata ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase10-copy-files ->
node: <Copy /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex> ->
CYCLE POINT ->
command: P0:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:Copy /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/PlugIns/ShortcutsExtension.appex /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/ShortcutsExtension.appex ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase9--cp--copy-pods-resources ->
node: /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/GoogleMapsResources.bundle ->
command: P2:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:PhaseScriptExecution [CP] Copy Pods Resources /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Intermediates.noindex/Runner.build/Release-quality-iphoneos/Runner.build/Script-B728693F1F2684724A065652.sh ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase8--cp--embed-pods-frameworks ->
node: /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Products/Release-quality-iphoneos/Runner.app/Frameworks/Alamofire.framework ->
command: P2:target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49-:Release-quality:PhaseScriptExecution [CP] Embed Pods Frameworks /Users/federico.gatti/Documents/comfort-mobile-app/apps/comfort/ios/DerivedData/Runner/Build/Intermediates.noindex/Runner.build/Release-quality-iphoneos/Runner.build/Script-1A1449CD6436E619E61D3E0D.sh ->
node: ->
command: P0:::Gate target-Runner-18c1723432283e0cc55f10a6dcfd9e0288a783a885d8b0b3beb2e9f90bde3f49--fused-phase7-flutterfire---flutterfire-upload-crashlytics-symbols- ->
node: <execute-shell-script-18c1723432283e0cc55f10a6dcfd9e024008e7f13be1da4979f78de280354094-target-Runner-18c1723432283e
Hi,
when a CoreMIDI driver controls physical hardware, it is probably quite common to have to limit the amount of MIDI data received from the system.
What comes to mind is simply delaying the return from the MIDIDriverInterface::Send() callback to the calling process. While the application trying to send MIDI does stall until the callback returns, that seems to be only a side effect of a generally stalled CoreMIDI server. Between callbacks the application can send as much MIDI data to CoreMIDI as it wants; its buffering seems to be endless. However, the hardware might not be able to play out all of that data.
It seems there is no way to signal an overflow/full-buffer situation back to the application/CoreMIDI. How is this supposed to work?
Thanks, any hints or pointers are highly appreciated!
Hagen.
Is there sample code on how to use VTFrameProcessorOpticalFlow?
Topic:
Media Technologies
SubTopic:
Video
We have developed an Apple Wallet extension for our app. In-app provisioning for the card is working. However, when we try to add the card from the Wallet extension, it gives an error saying "Your issuer does not yet offer support for this card".
From the Apple documentation we can see the issue is the same as the one described in Scenario 2 at the following link: https://applepaydemo.apple.com/in-app-provisioning#8.4
We are getting eligibilityStatus = 0.
Below is the response from Wallet, captured using a sysdiagnose:
https://crt-pod1-smp-device.apple.com:443/broker/v4/devices/0434320BCB1A90022306073796318273728D0A367FA927F4/cards 200 Time profile: 1.77856 seconds
{
x-conversation-id = ......
Content-Type = "application/json"
x-pod = "crt-pod1"
x-xss-protection = "1; mode=block"
Server = "Apple"
x-pod-region = "paymentpass.com.apple"
regionbrokerurl = "https://crt-pod1-smp-device.apple.com:443/broker"
Date = "Wed, 06 Aug 2025 11:39:30 GMT"
Content-Length = "488"
x-envoy-upstream-service-time = "1400"
Strict-Transport-Security = "max-age=31536000; includeSubdomains"
cross-origin-opener-policy = "same-origin"
x-keystone-correlationid = ......
x-content-type-options = "nosniff"
Vary = "accept-language"
x-frame-options = "SAMEORIGIN"
}
{
applicationIdentifier = ......;
auxiliaryCapabilities = {
};
cardType = 4;
deviceProvisioningDataExpected = 1;
eligibilityStatus = 0;
identifier = ......;
learnMoreURL = "https://www.apple.com/ae/apple-pay/banks/ae/en-ae.html";
nonce = ......;
paymentApplications = (
{
appletTypeIdentifier = Argon;
paymentType = Credit;
}
);
region = "paymentpass.com.apple";
sanitizedPrimaryAccountNumber = 7008;
sanitizedPrimaryAccountPrefix = "";
}
I implemented the AlarmKit framework in my app.
It works fine when the device's screen is off,
but the alarm doesn't fire when the screen is on.
No sound, no Dynamic Island widget. What's wrong? Is this just me?
I tried other apps from the App Store and they worked fine.
Maybe it's because they were built with SwiftUI?
I'm a Flutter developer. Could this happen just because my app isn't native? I'm so frustrated.
Topic:
App & System Services
SubTopic:
General
Would anyone help me understand why my account got terminated? I checked my app, and a simple pricing typo in the text should not be grounds for account termination. Even though I made a typo in the IAP price text, the price of the product on Apple's server was correct, and when a user tries to buy the item, the correct price is loaded from Apple's server. If this was the reason for the decision, I would like the opportunity to correct the issue rather than face such a severe penalty.
"This letter serves as notice of termination of the Apple Developer Program License Agreement (the “ADP Agreement”) and the Apple Developer Agreement (the “Developer Agreement”) between you and Apple effective immediately.
Pursuant to Section 3.2(f) of the ADP Agreement, you agreed that you would not “commit any act intended to interfere with any of the Apple Software or Services, the intent of this Agreement, or Apple’s business practices including, but not limited to, taking actions that may hinder the performance or intended use of the App Store, Custom App Distribution, TestFlight, Xcode Cloud, Ad Hoc distribution, or the Program …” Apple has good reason to believe that you violated this Section due to documented indications of fraudulent conduct associated with your account.
Apple is exercising its right to terminate your status as an Apple developer pursuant to the Apple Developer Agreement and is terminating you under the ADP Agreement for dishonest and fraudulent acts relating to that agreement. We would like to remind you of your obligations with regard to all software and other confidential information that you obtained from Apple as an Apple developer and under the ADP Agreement. You must promptly cease all use of and destroy such materials and comply with all the other termination obligations set forth in Section 11.3 of the ADP Agreement and Section 10 of the Apple Developer Agreement.
If applicable, no further payments will be made to you pursuant to Section 7.1 of the Paid Applications agreement (Schedules 2 and 3 to the ADP Agreement).
This letter is not intended to be a complete statement of the facts regarding this matter, and nothing in this letter should be construed as a waiver of any rights or remedies Apple may have, all of which are hereby reserved. Finally, please note that we will deny your reapplication to the Apple Developer Program for at least a year considering the nature of your acts. If you want to file an official complaint pursuant to an applicable Platform Regulation in your country or region you may Contact Us."
Topic:
Developer Tools & Services
SubTopic:
Apple Developer Program
My app's IPA shows get-task-allow = true even though my provisioning profile and Info.plist have it set to false. The issue started after transferring the app to a new developer account.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Hi everyone,
I am developing a .NET MAUI Mac Catalyst app (sandboxed) that communicates with a custom vendor-specific HID USB device.
Within the Catalyst app, I am using a native iOS library (built with Objective-C and IOKit) and calling into it via P/Invoke from C#.
The HID communication layer relies on IOHIDManager and IOUSBInterface APIs.
The device is correctly detected and opened using IOHIDManager APIs.
However, IOHIDDeviceRegisterInputReportCallback never triggers — I don’t receive any input reports.
To investigate, I also tried using low-level IOKit USB APIs via P/Invoke from my Catalyst app, calling into a native iOS library.
When attempting to open the USB interface using IOUSBInterfaceOpen() or IOUSBInterfaceOpenSeize(), both calls fail with: kIOReturnNotPermitted (0xe00002e2).
— indicating an access denied error, even though the device enumerates and opens successfully.
Interestingly, when I call IOHIDDeviceSetReport(), it returns status = 0, meaning I can successfully send feature reports to the device.
Only input reports (via the InputReportCallback) fail to arrive.
I’ve confirmed this is not a device issue — the same hardware and protocol work perfectly under Windows using the HIDSharp library, where both input and output reports function correctly.
What I’ve verified
•Disabling sandboxing doesn’t change the behavior.
•The device uses a vendor-specific usage page (not a standard HID like keyboard/mouse).
•Enumeration, open, and SetReport all succeed — only reading input reports fails.
•Tried polling via HID queues, but the Input_Misc element failed to be added to the queue.
•Tried getting reports in a loop, but with no success.
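For completeness, the registration our native library performs is essentially the following (shown here as a Swift sketch; device is assumed to be the already-opened IOHIDDevice, and 64 bytes is assumed to be large enough for the device's input reports):

import IOKit.hid

// Sketch: register for input reports on an already-opened IOHIDDevice.
func startListening(on device: IOHIDDevice) {
    let reportLength = 64
    // Intentionally not freed: the buffer must outlive the registration.
    let reportBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: reportLength)

    let inputCallback: IOHIDReportCallback = { _, result, _, _, reportID, report, length in
        let bytes = Array(UnsafeBufferPointer(start: report, count: length))
        print("input report id=\(reportID) result=\(result) bytes=\(bytes)")
    }

    IOHIDDeviceRegisterInputReportCallback(device, reportBuffer, reportLength, inputCallback, nil)
    // Without scheduling on a running run loop (or using IOHIDDeviceSetDispatchQueue),
    // the callback never fires.
    IOHIDDeviceScheduleWithRunLoop(device, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode.rawValue)
}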
https://ss.bscstorage.com/playlet-oss-res/video/2025195/%E6%BC%94%E7%A4%BA3.mp4?AWSAccessKeyId=6jndo5gcx079kvpu3myf&Expires=1760941761&Signature=hBcY7TDq%2BZHOARoFKY6CAthju%2Fc%3D
This is the link to the demonstration video. How is the effect shown in the video achieved?