Search results for

“translate scheme”

6,658 results found

Post

Replies

Boosts

Views

Activity

Reply to What's the non-sandboxed path to Downloads folder on the iPhone
On iOS there’s no supported way to construct paths to items outside of your app’s containers. The only supported way to get such paths is to ask the user for them, using the file selection API from the UI framework you’re using. Furthermore, this is a concern: [quote='781930021, delingren, /thread/781930, /profile/delingren'] I know the URL scheme is shareddocuments [/quote] AFAIK this is not a documented URL scheme. I discuss that topic in more detail in Supported URL Schemes. WARNING Don’t build a product that relies on unsupported implementation details like this. It’s very likely that your product will break in future releases of iOS and, if it does, there’s a good chance that there’ll be no path forward for your app. If you’d like us to add a supported mechanism to achieve this goal, I encourage you to file an enhancement request describing your requirements. Please post your bug number, just for the record. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple
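The supported approach Quinn describes — asking the user for a location — can be sketched with SwiftUI's `fileImporter`; the view and property names here are illustrative, not from the thread:

```swift
import SwiftUI
import UniformTypeIdentifiers

// Minimal sketch: ask the user to pick a folder, the supported way to get a
// URL outside the app's own containers. "FolderPickerView" is a hypothetical name.
struct FolderPickerView: View {
    @State private var isPresented = false
    @State private var pickedURL: URL?

    var body: some View {
        Button("Choose Folder") { isPresented = true }
            .fileImporter(isPresented: $isPresented,
                          allowedContentTypes: [.folder]) { result in
                // On success, the returned URL may be security-scoped; callers
                // typically wrap access in startAccessingSecurityScopedResource().
                if case .success(let url) = result {
                    pickedURL = url
                }
            }
    }
}
```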
Topic: App & System Services SubTopic: Core OS Tags:
Apr ’25
What's the non-sandboxed path to Downloads folder on the iPhone
I'm trying to construct a URL that, when tapped, would launch Files app and open the Downloads folder on the iPhone (not in iCloud Drive). I know the URL scheme is shareddocuments but I can't figure out the path. I have tried a few things including writing a simple iOS app and using Scriptable app. But I always get a sandboxed path such as /private/var/mobile/Containers/Data/Application/87CC2F48-AF1C-4C80-8D75-B6CC1FC642E3/Downloads/. But that wouldn't work across devices. Does anyone happen to know the path or a method to obtain the non-sandboxed path? Thanks. PS I already figured out the Downloads folder in iCloud Drive, which is shareddocuments:///private/var/mobile/Library/Mobile%20Documents/com~apple~CloudDocs/Downloads. But what I need is the one on the iPhone.
1
0
162
Apr ’25
Reply to How to set permanent environment variables?
There’s no general-purpose mechanism for you, as a user, to set environment variables on apps. Such a mechanism doesn’t really make sense on the Mac, because Mac apps aren’t supposed to rely on the Unix environment. Of course, that stance breaks down when you start talking about developer tools. It’s not uncommon for developer tools to rely on environment variables. And because of that, GUI developer tools often provide a mechanism for setting environment variables. Xcode is a good example of this, where you can set environment variables for the app you’re debugging in the scheme editor. It’s possible that your developer tool has similar support. I recommend that you ask that question via the vendor’s support channel. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
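For reference, a variable set in Xcode's scheme editor is read at run time via ProcessInfo; `MY_DEBUG_FLAG` is a hypothetical name used for illustration:

```swift
import Foundation

// Read an environment variable set in Xcode's scheme editor
// (Product > Scheme > Edit Scheme… > Run > Arguments > Environment Variables).
// "MY_DEBUG_FLAG" is a hypothetical variable name.
let value = ProcessInfo.processInfo.environment["MY_DEBUG_FLAG"] ?? "unset"
print("MY_DEBUG_FLAG = \(value)")
```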
Apr ’25
Translate extension behavior
DESCRIPTION OF PROBLEM We need to add an implementation that has the same swipe/scroll behavior as the Apple Translator extension. Here is the code we are currently using:

import SwiftUI
import TranslationUIProvider

@main
class TranslationProviderExtension: TranslationUIProviderExtension {
    required init() {}

    var body: some TranslationUIProviderExtensionScene {
        TranslationUIProviderSelectedTextScene { context in
            VStack {
                TranslationProviderView(context: context)
            }
        }
    }
}

struct TranslationProviderView: View {
    @State var context: TranslationUIProviderContext

    init(context c: TranslationUIProviderContext) {
        context = c
    }

    var body: some View {
        ScrollableSheetView()
    }
}

struct ScrollableSheetView: View {
    var body: some View {
        ScrollView {
            VStack(spacing: 20) {
                ForEach(0..<50) { index in
                    Text("Item \(index)")
                        .padding()
                        .frame(maxWidth: .infinity)
                        .background(Color.blue.opacity(0.1))
                        .cornerRadius(8)
                }
            }
            .padding()
        }
        .padding()
    }
}

Using this code, on the first extension run, swipe up will expand
Topic: UI Frameworks SubTopic: SwiftUI
0
0
73
Apr ’25
Is it possible to view Xcode output from a scheme's archive post-action script
In Xcode I've: selected Product > Scheme > Edit Scheme, tapped Archive on the left-hand side, selected post-actions, and clicked + to add a new script. Then in there I have added a script I want to run on the archive after it's created. I'd like to be able to see the output the script produces as it goes along, but that doesn't seem possible. If I just add something like echo hello to the start of the script, I don't see hello anywhere when I build an archive (via Product > Archive). I'm looking in the build navigator. Is there somewhere else to look, or is it possible to get the logging into the navigator?
1
0
152
Apr ’25
Reply to iOS 18 Issue: When we add arm64 into Excluded Architectures and try to present a screen, the screen gets frozen and we are unable to select images.
Because the project contains a framework that can't be replaced for now, I can only exclude arm64 in order to run on the M-series chip simulator and reproduce this issue. The symptom is the simulator running on the arm64 architecture while the project builds for x86. At present, the image selector can only be used on real devices. (The above is from automatic translation.)
Topic: UI Frameworks SubTopic: UIKit Tags:
Apr ’25
CVPixelBufferCreate EXC_BAD_ACCESS
I am doing something similar to this post. Within an AVCaptureDataOutputSynchronizerDelegate method, I create a pixelBuffer using CVPixelBufferCreate with the following attributes: kCVPixelBufferIOSurfacePropertiesKey as String: true, kCVPixelBufferIOSurfaceOpenGLESTextureCompatibilityKey as String: true. When I copy the data from the vImagePixelBuffer rotatedImageBuffer, I get the following error: Thread 10: EXC_BAD_ACCESS (code=1, address=0x14caa8000). I get the same error with memcpy and data.copyBytes (not running them at the same time, obviously). If I use CVPixelBufferCreateWithBytes, I do not get this error. However, CVPixelBufferCreateWithBytes does not let you include attributes (see the linked post above). I am using vImage because I need the original CVPixelBuffer from the camera output and a rotated version with a different color scheme.

// Copy to pixel buffer
let attributes: NSDictionary = [
    true : kCVPixelBufferIOSurfacePropertiesKey,
    true : kCVPixelBufferIOSurfaceOpenGLESTextureCompatibilityK
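Note that the snippet above has the dictionary reversed (values before keys). A sketch of the conventional key-to-value form, with illustrative dimensions and pixel format, looks like this:

```swift
import CoreVideo

// Attributes map keys to values; an empty dictionary for the IOSurface key
// requests IOSurface backing. Width, height, and format are illustrative.
let attributes: [CFString: Any] = [
    kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
]
var pixelBuffer: CVPixelBuffer?
let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                 1920, 1080,
                                 kCVPixelFormatType_32BGRA,
                                 attributes as CFDictionary,
                                 &pixelBuffer)
// status == kCVReturnSuccess when allocation succeeds.
```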
1
0
323
Apr ’25
Best way to pass a HomeKit or Matter setup code to the Home App Programmatically
Apologies in advance for the long post. I'm new to HomeKit and Matter but not to development. I'm trying to write a SwiftUI app for my smart home to store all of my HomeKit and Matter setup barcodes along with other bits of information. The intention is to scan the QR codes with my app and then save that QR payload in a simple database along with other manually entered device details. Example payloads: X-HM://00GWIN0B5PHPG <-- Eufy V120 HomeKit Camera MT:GE.01-C-03FOPP6B110 <-- Moes GU10 Matter Bulb I have it 99% working; my app is even able to discern the manual pairing code from the above payloads. However, one of the key features of this is that I want to open a device entry in my app, tap the HomeKit or Matter code displayed in my app, and either: a) Ideally pass it off to the Apple Home app to initiate pairing just like the native Camera app can. b) Create a custom flow in my app using the HomeKit or Matter APIs to initiate pairing from within my app. So ideally just like the flow that happens
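For option (a), a hedged sketch using HMAccessorySetupManager (iOS 15.4+); the completion signature should be checked against the HomeKit headers, and Matter MT: payloads may need a different path:

```swift
import HomeKit

// Hand a scanned X-HM:// payload to the system's accessory setup flow.
// "startPairing" is a hypothetical helper; the payload string would come
// from the app's database of scanned codes.
func startPairing(with payloadString: String) {
    guard let url = URL(string: payloadString),
          let payload = HMAccessorySetupPayload(url: url) else { return }
    let request = HMAccessorySetupRequest()
    request.payload = payload
    HMAccessorySetupManager().performAccessorySetup(using: request) { _, error in
        if let error { print("Setup failed: \(error)") }
    }
}
```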
4
0
265
Apr ’25
Reply to Duplicate apps launched when debugging in Xcode?
I have a command-line app experiencing this issue. It uses CVDisplayLink to render directly to a window through Metal. Apple recently deprecated CVDisplayLink and introduced a new recommended API, CAMetalDisplayLink. It has the same restrictions as CADisplayLink and prevents the user from having true low-level control over frame synchronization. The other apps mentioned with this bug rely on OpenGL and other frameworks, which likely delegate directly to CVDisplayLink. This might explain why a strange subset of all apps are affected by the bug. The solution for my use case was to launch the app from swift run on the command line with SwiftPM, which is how it would run on non-Apple platforms as well. But my solution may not be the best for other people. Alternatively, you can add an intentional ≥350 ms delay upon app startup. The workaround sleep(1) worked because it introduced a 1000 ms delay, ~three times what is actually needed. https://github.com/philipturner/molecular-renderer/commit/655e367ef5a33218d7fc5654ded
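The CAMetalDisplayLink API mentioned here follows the CADisplayLink shape; a sketch under the assumption of the macOS 14/iOS 17 delegate API, with hypothetical type names:

```swift
import QuartzCore
import Metal

// Hedged sketch: drive rendering from CAMetalDisplayLink's delegate callback.
// "Renderer" is a hypothetical type; real code would encode commands and
// present the drawable inside the callback.
final class Renderer: NSObject, CAMetalDisplayLinkDelegate {
    func metalDisplayLink(_ link: CAMetalDisplayLink,
                          needsUpdate update: CAMetalDisplayLink.Update) {
        let drawable = update.drawable // render into this, then present
        _ = drawable
    }
}

func makeDisplayLink(for layer: CAMetalLayer, renderer: Renderer) -> CAMetalDisplayLink {
    let link = CAMetalDisplayLink(metalLayer: layer)
    link.delegate = renderer // assumption: renderer is retained by the caller
    link.add(to: .main, forMode: .default)
    return link
}
```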
Apr ’25
Using handleExternalEvents scene modifier to route external events to the correct scene
In an iPadOS SwiftUI app supporting multiple scenes, each Scene responds to a particular way in which the app was launched. If the app was launched by tapping an associated file or a deep link (custom URL), then the URLHandlerScene is invoked. If the app was launched by a Quick Action (long press on the app icon), then another Scene is invoked, etc. Each Scene has a purpose and responds to a particular launch. But after defining the handlesExternalEvents(matching:) scene modifier, the scene was not launched when the user taps the associated file or the app's deep link is invoked.

@main
struct IOSSwiftUIScenesApp: App {
    var body: some Scene {
        DefaultScene()
        URLHandlerScene()
            .handlesExternalEvents(matching: ["file://"]) // Launched by an associated file
            .handlesExternalEvents(matching: ["Companion://"]) // Launched by deep link.
        // Other scenes
    }
}

struct URLHandlerScene: Scene {
    @State private var inputURL: URL // Store the incoming URL

    init() {
        self.inputURL = URL(string: "Temp://")!
    }

    var body: some Scene {
        WindowGroup {
            URLha
1
0
152
Apr ’25
Reply to Debug Failed in Xcode Simulator
I experienced the same issue in my environment (Xcode 16.2, Intel MBP, iOS Simulator). However, after getting Couldn't find the Objective-C runtime library in loaded images at the first breakpoint, manually attaching the debugger allowed subsequent breakpoints to function properly. Therefore, I'm using the following workaround of running without a debugger and attaching manually: Product > Scheme > Edit Scheme... Choose Run from the left pane. In the Info tab of the right pane, uncheck Debug executable. Run the application. Debug > Attach to Process > (choose your Simulator from the top of the list). Since I've just started using this method, I can't guarantee it will always work, but it might be helpful for debugging except for the very initial startup phase.
Apr ’25
Reply to Summon gesture
Hi @cuo001 There are many ways to do this. Here's a snippet to get you started. It uses SpatialEventGesture and a hand AnchorEntity. When a person begins the pinch gesture, the code appends the entity to an anchor entity attached to the index finger, maintaining its world position, then animates the entity to a position near the index finger. When the pinch ends, the code appends the entity to its previous parent and animates the entity to its previous position. Create a custom component to store the state related to the custom interaction.

struct SummonComponent: Component {
    let leftAnchor: AnchorEntity
    let rightAnchor: AnchorEntity
    var isHolding = false
    var parentEntity: Entity?
    var previousTransform: Transform?
}

Implement the interaction.

struct ImmersiveView: View {
    @State var spatialTrackingSession = SpatialTrackingSession()

    var body: some View {
        RealityView { content in
            // Create the entity to summon.
            let box = ModelEntity(mesh: .generateBox(size: 0.15), materials: [SimpleMaterial(color: .red, isMetalli
Topic: Spatial Computing SubTopic: General Tags:
Apr ’25
Simulator 18.4 Webview CORS issues
I have a very specific issue that happens only on iOS Simulator version 18.4. It does NOT happen when I run my app on a real iOS 18.4 device through TestFlight. My app displays a WebView (courtesy of Capacitor, URL scheme capacitor://). Inside that WebView I'm using the Firebase JS API (11.2.0) and calling signInWithEmailAndPassword, which works well in all other contexts, i.e. browser, Android WebView, iOS WebView in all other Simulator versions, and on real devices. Only when running in Simulator 18.4, I get a failed network request: cannot parse response Fetch API cannot load https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?... due to access control checks. Failed to load resource: cannot parse response error: FirebaseError: (auth/network-request-failed) Everything is working correctly for both: Capacitor app WebView installed on a real 18.4 device with TestFlight Safari (non-WebView) in the 18.4 Simulator The issue is severe for us, because we are unable to develop our app and tes
0
0
308
Apr ’25
Reply to What's the non-sandboxed path to Downloads folder on the iPhone
On iOS there’s no supported way to construct paths to items outside of your apps containers. The only supported way to get such paths is to ask the user for them, using the file selection API from the UI framework you’re using. Furthermore, this is a concern: [quote='781930021, delingren, /thread/781930, /profile/delingren'] I know the URL scheme is shareddocuments [/quote] AFAIK this is not a a documented URL scheme. I discuss that topic in more detail in Supported URL Schemes. WARNING Don’t build a product that relies on unsupported implementation details like. It’s very likely that your product will break in future releases of iOS and, if it does, there’s a good chance that there’ll be no path forward for your app. If you’d like us to add a supported mechanism to achieve this goal, I encourage you to file an enhancement request describing your requirements. Please post your bug number, just for the record. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Appl
Topic: App & System Services SubTopic: Core OS Tags:
Replies
Boosts
Views
Activity
Apr ’25
What's the non-sandboxed path to Downloads folder on the iPhone
I'm trying to construct a URL that, when tapped, would launch Files app and open the Downloads folder on the iPhone (not in iCloud Drive). I know the URL scheme is shareddocuments but I can't figure out the path. I have tried a few things including writing a simple iOS app and using Scriptable app. But I always get a sandboxed path such as /private/var/mobile/Containers/Data/Application/87CC2F48-AF1C-4C80-8D75-B6CC1FC642E3/Downloads/. But that wouldn't work across devices. Does anyone happen to know the path or a method to obtain the non-sandboxed path? Thanks. PS I already figured out the Downloads folder in iCloud Drive, which is shareddocuments:///private/var/mobile/Library/Mobile%20Documents/com~apple~CloudDocs/Downloads. But what I need is the one on the iPhone.
Replies
1
Boosts
0
Views
162
Activity
Apr ’25
Reply to How to set permanent environment variables?
There’s no general-purpose mechanism for you, as a user, to set environment variables on apps. Such a mechanism doesn’t really make sense on the Mac, because Mac apps aren’t supposed to rely on the Unix environment. Of course, that stance breaks down when you start talking about developer tools. It’s not uncommon for developer tools to rely on environment variables. And because of that GUI developer tool will often provide a mechanism for setting environment variables. Xcode is a good example of this, where you can set environment variables for the app you’re debugging in the scheme editor. It’s possible that your developer tool actually has similar support. I recommend that you ask that question via the vendor’s support channel. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Replies
Boosts
Views
Activity
Apr ’25
Translate extension bahvior
DESCRIPTION OF PROBLEM We need to add an implementation that will have the same swipe/scroll behavior as the Apple Translator extension, here is the code that we are currently using: import SwiftUI import TranslationUIProvider @main class TranslationProviderExtension: TranslationUIProviderExtension { required init() {} var body: some TranslationUIProviderExtensionScene { TranslationUIProviderSelectedTextScene { context in VStack { TranslationProviderView(context: context) } } } } struct TranslationProviderView: View { @State var context: TranslationUIProviderContext init(context c: TranslationUIProviderContext) { context = c } var body: some View { ScrollableSheetView() } } struct ScrollableSheetView: View { var body: some View { ScrollView { VStack(spacing: 20) { ForEach(0..<50) { index in Text(Item (index)) .padding() .frame(maxWidth: .infinity) .background(Color.blue.opacity(0.1)) .cornerRadius(8) } } .padding() } .padding() } } Using this code, on the first extension run, swipe up will expand
Topic: UI Frameworks SubTopic: SwiftUI
Replies
0
Boosts
0
Views
73
Activity
Apr ’25
Is it possible to view Xcode output from a scheme's archive post actions's script
In Xcode I've: select Product / Scheme / Edit scheme tap on Archive on the left hand side of the select post actions and + to add a new script Then in there I have added a script I want to run on the archive after its created. I'd like to be able to see the output the script churns out as it goes along but doesn't seem possible? If I just add something like echo hello to the start of the script then I don't see hello visible anywhere when I build an archive (via Product/Archive). I'm looking in the build navigator. Is there somewhere else to look or is it possible to get the logging into the navigator?
Replies
1
Boosts
0
Views
152
Activity
Apr ’25
Reply to iOS 18 System Bug Causes URL Scheme Failure
FB17303841 (iOS 18 System Bug Causes URL Scheme Failure)
Apr ’25
Cannot submit App with "Default Translation Extension"
We developed a Default Translation App following the guide: https://developer.apple.com/documentation/translationuiprovider/preparing-your-app-to-be-the-default-translation-app. I have already configured everything that needs to be configured according to the document, but this problem still occurs.
0
0
41
Apr ’25