I'm trying to figure out how to send BTC, ETH, etc., phone to phone. The Square app can turn a phone into a payment terminal for fiat currencies and payment systems. The data-encryption and P2P-sending logic is similar, but can that functionality be enabled for alternative purposes, i.e., sending crypto phone to phone (from a designated wallet on one phone to a designated wallet on another)?
I have the new iOS 26 SpeechTranscriber working in my application. The issue I'm facing is determining whether the device I'm running on supports SpeechTranscriber. I was able to write code that tests whether the device supports transcription, but it takes a while to run, so the result isn't available when the app launches. What I'm looking for is a list of the iOS 26 devices it doesn't run on. I think it's safe to assume any new device will support it, so a list of devices that can run iOS 26 but cannot do transcription would be much faster for the app. I have determined it doesn't work on an SE 2nd Gen; it does work on the iPhone 12, SE 3rd Gen, iPhone 14 Pro, and 15 Pro. Since SpeechTranscriber doesn't work in the simulator, I can't determine it that way. I've checked the docs, and they don't list the devices it doesn't work on.
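Until such a list exists, one fallback is a synchronous deny-list check at launch: read the hardware model identifier and compare it against the models confirmed not to work. This is only a sketch; the deny list below contains just the SE 2nd Gen ("iPhone12,8"), the one device reported above as unsupported, and everything else is assumed supported.

import Foundation

// Read the hardware model identifier (e.g. "iPhone12,8") via uname().
func deviceModelIdentifier() -> String {
    var systemInfo = utsname()
    uname(&systemInfo)
    return withUnsafeBytes(of: &systemInfo.machine) { buffer in
        String(decoding: buffer.prefix(while: { $0 != 0 }), as: UTF8.self)
    }
}

// Assumption: only the iPhone SE (2nd generation) is known-unsupported so far.
let unsupportedModels: Set<String> = ["iPhone12,8"]

let probablySupportsTranscription =
    !unsupportedModels.contains(deviceModelIdentifier())

This is instant at launch, at the cost of having to maintain the list by hand; the slower runtime capability check can still run in the background to correct it.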
Adding both AVCaptureMovieFileOutput and AVCaptureVideoDataOutput to an AVCaptureSession is supported, as seen in the documentation (snippet copied below), but when the AVCaptureDevice is configured with the ProRes422 codec, it fails unless one of the two outputs is removed from the capture session. It is readily reproducible on an iPhone 14 Pro running iOS 26.0.
Prior to iOS 16, you can add an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput to the same session, but only one may have its connection active. If you attempt to enable both connections, the system chooses the movie file output as the active connection and disables the video data output’s connection. For apps that link against iOS 16 or later, this restriction no longer exists.
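For context, a condensed sketch of the configuration being described, under the assumptions that the device is ProRes-capable, camera permission is already granted, and error handling is elided:

import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()

// Back wide camera; force-unwraps are sketch-only shortcuts.
let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                     for: .video, position: .back)!
let input = try! AVCaptureDeviceInput(device: device)
session.addInput(input)

// Both outputs on one session, which is allowed when linking iOS 16+.
let movieOutput = AVCaptureMovieFileOutput()
let dataOutput = AVCaptureVideoDataOutput()
session.addOutput(movieOutput)
session.addOutput(dataOutput)

// Ask the movie file output to record Apple ProRes 422. This is the
// step that reportedly fails while both outputs are attached.
if let connection = movieOutput.connection(with: .video),
   movieOutput.availableVideoCodecTypes.contains(.proRes422) {
    movieOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.proRes422],
                                  for: connection)
}
session.commitConfiguration()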
General:
Forums topic: Family Controls
Forums tag: Family Controls
Configuring Family Controls documentation
Screen Time Technology Frameworks documentation
FamilyControls documentation
What's new in Screen Time API video
Meet the Screen Time API video
Topic:
App & System Services
SubTopic:
General
Tags:
Entitlements
Signing Certificates
Family Controls
Screen Time
Hi everyone,
I recently paid for my Apple Developer Program membership, but my account status has been stuck on “Pending” ever since. It has now been 5+ days, and I still haven’t received any confirmation email or follow-up message from Apple.
I’ve checked my inbox, spam folder, and verified that the payment went through successfully. Everything seems correct on my end, but the activation hasn’t progressed.
Has anyone experienced this delay before, or is there something else I should do while waiting? Any guidance would be appreciated.
Thanks
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi! My app was rejected on Nov 4 under Guideline 4.3 (Design: Spam). After communicating with the App Review team on Nov 4, the decision was still the same, so I submitted an appeal to the App Review Board that day.
They mentioned there are apps providing the same functionality, but I have done my research on the App Store and reached out to potential users in London to gauge interest, and I believe my app and its direction are unique.
I still haven’t received any update, and I totally understand that the appeals process can take some time. It's just that I'm really passionate about getting this app in front of the public, and I have no idea what is wrong or how I can help make it right.
Does anyone know how long it usually takes to get a response from the App Review Board?
Thank you for any help!
Hi, I'm currently using Metal Performance Shaders Graph (MPSGraphExecutable) to run neural-network inference as part of a Metal rendering pipeline.
I also tried to profile Neural Engine usage when running inference with MPSGraphExecutable, but the graph shows no sign of Neural Engine activity. However, when I used the Core ML model inspection tool in Xcode and ran a performance report, it was able to use the ANE.
Does MPSGraphExecutable automatically utilize the Apple Neural Engine (ANE) when running inference operations, or does it only execute on the GPU?
My model (a Core ML package) was converted from a PyTorch model using coremltools with the ML program type, and it supports iOS 17.0+.
Any insights or documentation references would be greatly appreciated!
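One way to cross-check, as a sketch: load the same package through Core ML twice with different computeUnits settings and compare prediction latency. The model path here is a placeholder, and timing differences only hint at ANE usage rather than proving it.

import CoreML

// Placeholder path to the compiled model.
let url = URL(fileURLWithPath: "Model.mlmodelc")

// .all lets Core ML schedule work onto the ANE where possible.
let aneConfig = MLModelConfiguration()
aneConfig.computeUnits = .all

// .cpuAndGPU excludes the ANE entirely.
let gpuConfig = MLModelConfiguration()
gpuConfig.computeUnits = .cpuAndGPU

let aneModel = try! MLModel(contentsOf: url, configuration: aneConfig)
let gpuModel = try! MLModel(contentsOf: url, configuration: gpuConfig)
// Timing predictions with each model hints at whether the ANE is being used.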
When the power button is pressed to turn off the alarm while the screen is locked, stopIntent is not called.
Topic:
Developer Tools & Services
SubTopic:
General
Hi all, very new "developer" trying to build my own app. The app works, just trying to improve it.
I’m implementing a Live Activity in a widget extension (Swift/SwiftUI) for an app built with Flutter as the host app. ActivityKit is functioning correctly—activities start, update, and end normally, and the widget extension receives all state updates.
However, the Live Activity UI renders as a completely black capsule (both compact and expanded Dynamic Island, as well as the Lock Screen presentation). The system shows the Live Activity container, but none of the SwiftUI content displays.
Verified so far:
ActivityAttributes contains at least one stored property (previously empty).
ContentState fully Codable + Hashable.
All Dynamic Island regions return visible test UI (Text/Image).
No .containerBackground() usage.
Added explicit .activityBackgroundTint() + system foreground colors.
All Swift files included in widget extension target.
No runtime errors, no decode failures, no SwiftUI logs.
Widget previews work.
Clean build, app reinstall, device reboot.
Entitlements and Info.plist appear valid.
Problem:
The widget extension returns a completely black UI on-device, despite valid SwiftUI content in ActivityConfiguration. The Live Activity “shell” renders, so the activity is recognized, but the widget’s view hierarchy is visually empty.
Question:
Under what conditions would a widget extension produce a black, empty UI for a Live Activity even when ActivityKit, previews, and the SwiftUI layout are correct?
Are there known cases where:
Widget extension Info.plist misconfiguration,
Incorrect background/tint handling,
Rendering issues in Dynamic Island,
Host app integrations (Flutter),
Or extension isolation issues
cause valid SwiftUI to fail to render in a Live Activity?
Any guidance on deeper debugging steps or known system pitfalls would be appreciated.
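For comparison, here is a minimal, self-contained Live Activity configuration with hard-coded colors. The attribute and widget names are hypothetical; rendering something this simple in place of the real UI can help separate a tint/color-scheme problem from an extension-level rendering failure.

import ActivityKit
import SwiftUI
import WidgetKit

struct TimerAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var remaining: Int
    }
    var title: String
}

struct TimerLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: TimerAttributes.self) { context in
            // Lock Screen / banner UI: force visible colors so a black
            // capsule can be distinguished from a tint problem.
            Text("\(context.attributes.title): \(context.state.remaining)s")
                .foregroundStyle(.white)
                .activityBackgroundTint(.blue)
        } dynamicIsland: { context in
            DynamicIsland {
                DynamicIslandExpandedRegion(.center) {
                    Text("\(context.state.remaining)s")
                        .foregroundStyle(.white)
                }
            } compactLeading: {
                Image(systemName: "timer")
            } compactTrailing: {
                Text("\(context.state.remaining)")
            } minimal: {
                Image(systemName: "timer")
            }
        }
    }
}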
[Submitted as FB21078443]
When using .matchedTransitionSource with .navigationTransition(.zoom), swiping back from the left edge to return from a detail view causes the source item to disappear once the transition finishes. It’s only a visual issue—the item is still there and can be tapped to open again.
This doesn’t happen when using the Back button; only the swipe-back gesture triggers it. Also, it only reproduces on a physical device, not in Simulator.
SYSTEM INFO
Xcode 26.1.1 (17B100)
macOS 26.1 (25B78)
iOS 26.1 (23B85)
iOS 26.2 (23C5044b)
REPRO STEPS
Run the code below on a physical device, tap an image, then swipe from the left edge to dismiss the detail view.
ACTUAL
The image zooms back to its origin, then disappears once the animation settles.
EXPECTED
The image card remains visible.
CODE
import SwiftUI

struct Item: Identifiable, Hashable {
    let id = UUID()
    let imageName: String
    let title: String
}

struct ContentView: View {
    @Namespace private var namespace

    let items = [
        Item(imageName: "SampleImage", title: "Sample Card 1"),
        Item(imageName: "SampleImage2", title: "Sample Card 2")
    ]

    var body: some View {
        NavigationStack {
            ScrollView {
                VStack(spacing: 16) {
                    ForEach(items) { item in
                        NavigationLink(value: item) {
                            CardView(item: item)
                                .matchedTransitionSource(id: item.id, in: namespace)
                        }
                        .buttonStyle(.plain)
                    }
                }
                .padding()
            }
            .navigationTitle("Zoom Transition Issue")
            .navigationSubtitle("Tap image, then swipe back from left edge")
            .navigationDestination(for: Item.self) { item in
                DetailView(item: item, namespace: namespace)
                    .navigationTransition(.zoom(sourceID: item.id, in: namespace))
            }
        }
    }
}

struct CardView: View {
    let item: Item

    var body: some View {
        GeometryReader { geometry in
            ZStack(alignment: .bottom) {
                Image(item.imageName)
                    .resizable()
                    .scaledToFill()
                    .frame(width: geometry.size.width, height: geometry.size.height)
                    .clipped()
            }
        }
        .frame(height: 200)
        .clipShape(RoundedRectangle(cornerRadius: 16))
    }
}

struct DetailView: View {
    let item: Item
    let namespace: Namespace.ID

    var body: some View {
        Image(item.imageName)
            .resizable()
            .scaledToFill()
            .clipped()
    }
}
Topic:
UI Frameworks
SubTopic:
SwiftUI
Hello, I am Asmaa Atine. I would like to suggest an improvement for the Apple Maps app.
My idea is to allow users to draw the general path they would like to follow directly on the map with their finger, and then have the app automatically generate an optimized route that follows the drawn trajectory as closely as possible.
This feature would be very useful in several situations, such as:
• when the user wants to pass through a specific area but the suggested routes don’t match,
• when they want to avoid certain places or include a particular spot,
• or when they simply want a more flexible, intuitive way to customize a route.
The concept would be:
1. the user draws a rough path on the map,
2. Apple Maps interprets the drawing,
3. and then proposes the best possible route based on that drawn line.
I believe this would greatly enhance the flexibility of Apple Maps and provide a more intuitive way to create personalized routes.
Thank you for considering this suggestion, and congratulations on the great work already done on the app.
Topic:
App & System Services
SubTopic:
Maps & Location
Subject: iOS Fails to Fetch AASA File for Internationalized Domain Names (IDN) in Unicode or Punycode Format
Dear Apple Developer Relations Team And Community Members,
We are reporting a critical bug in the iOS Associated Domains feature that prevents Universal Links from working for apps using Internationalized Domain Names (IDN).
Problem Description:
The iOS operating system does not attempt to download the apple-app-site-association (AASA) file for domains containing non-ASCII characters (e.g., diacritics, Cyrillic).
This failure occurs regardless of whether the domain is specified in the app's entitlements in its human-readable Unicode format (e.g., montréal.ca) or its encoded Punycode format (e.g., xn--montral-fya.ca, xn--e1afka0abm4b.xn--p1ai).
Without fetching and validating this file, Universal Links are not activated, and the system fails to establish a connection between the website and our app.
Steps to Reproduce:
Create an app with the Associated Domains entitlement enabled.
Add an IDN to the entitlement. We tested both formats:
Format A (Unicode): applinks://montréal.ca
Format B (Punycode): applinks://xn--montral-fya.ca
Host a valid AASA file on our server at the correct, accessible well-known URLs for both domain representations:
For montréal.ca: https://montréal.ca/.well-known/apple-app-site-association and https://xn--montral-fya.ca/.well-known/apple-app-site-association
Install the app on a device running the latest iOS version.
Monitor network traffic using a tool like Charles Proxy.
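For reference, a sketch of how the entitlement entries described above would appear in the app's .entitlements plist (shown with the standard applinks: prefix; both representations of the same IDN are listed):

<!-- Sketch of the Associated Domains entries tested in this report. -->
<key>com.apple.developer.associated-domains</key>
<array>
    <string>applinks:montréal.ca</string>
    <string>applinks:xn--montral-fya.ca</string>
</array>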
Observed Result:
No HTTP GET request is made to any of the AASA URLs for the domains montréal.ca (in either Unicode or Punycode format) upon app installation. The system does not initiate the domain validation process. In contrast, for a standard ASCII domain (e.g., applinks://example.com), the AASA fetch is triggered immediately and is observed in the network log.
Expected Result:
iOS should correctly resolve the Internationalized Domain Name (whether specified in Unicode or Punycode format in the entitlement) and perform an HTTP GET request to fetch the AASA file from the /.well-known path, identical to its behavior for ASCII domains.
Evidence & Configuration:
Our server is configured correctly: SSL certificates are valid, the AASA file is served with the correct application/json MIME type, and is directly accessible via a browser or curl.
The AASA file's syntax has been validated and is correct.
The issue is reproducible on the latest versions of iOS.
Impact:
This bug blocks a core platform feature for millions of users in regions that use non-Latin scripts (e.g., France, Russia, China, Arab states). It makes it impossible to use Universal Links with our primary domains, severely degrading the user experience and forcing us to seek suboptimal workarounds like registering separate ASCII domains.
Request:
We kindly request that you investigate and log this issue as a bug in iOS and forward it to the appropriate engineering team for a fix in an upcoming update. We are prepared to provide any additional information, demo projects, or server access to assist in diagnostics.
Thank you for your attention to this serious matter.
Archiving fails while executing this command:
xcodebuild -project Unity-VisionOS.xcodeproj -scheme Unity-VisionOS -destination generic/platform=xros archive -archivePath Unity-VisionOS.xcarchive -quiet > logs/visionos_archive.log
The following error occurs:
2025-11-18 15:33:12.161 ibtoold[56062:4395005] [MT] IBPlatformTool: *** Failed to launch tool with description <IBVisionPlatformToolDescription: 0x600003278380> System content for IBCocoaTouchFramework-seventeenAndLater <IBScaleFactorDeviceTypeDescription: 0x600003278400> scaleFactor=2x, renderMode.identifier=(null): Failed to find or create execution context for description '<IBVisionPlatformToolDescription: 0x600003278380> System content for IBCocoaTouchFramework-seventeenAndLater <IBScaleFactorDeviceTypeDescription: 0x600003278400> scaleFactor=2x, renderMode.identifier=(null)'.
Device type: IBSimDeviceTypeiPad2x (com.apple.dt.Xcode.IBSimDeviceType.iPad-2x)
Sim runtime: visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1
Device: (null)
** Please also include the output of `xcrun simctl diagnose` and `xcode-select -p`.: Failed to find a suitable device for the type IBSimDeviceTypeiPad2x (com.apple.dt.Xcode.IBSimDeviceType.iPad-2x) with runtime visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1 (Failure reason: Failed to create new simulator device in set SimDeviceSet : /Users/user/Library/Developer/Xcode/UserData/IB Support/Simulator Devices that matches IBSimDeviceTypeiPad2x (com.apple.dt.Xcode.IBSimDeviceType.iPad-2x) for runtime visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1 (Incompatible device). Available devices: (
"IBSimDeviceTypeiPad3x (D1B76A51-0DB5-439F-B65D-891AB20C2B26, iOS 18.1, Shutdown)",
"IBSimDeviceTypeiPad2x (CA17569D-D3EA-4B11-A20D-3571D4A5E58A, iOS 18.1, Shutdown)",
"IBSimDeviceTypeiPad2x (E3EE1BC6-2F2F-48D3-A378-A819824F7082, iOS 18.1, Shutdown)"
)): Incompatible device
/* com.apple.ibtool.document.warnings */
/Users/xcode_dir_path/xcode/LaunchScreen-iPhone.storyboard:global: warning: Compiling Interface Builder products for visionOS will not be supported in a future version of Xcode. [9]
/* com.apple.ibtool.errors */
/Users/xcode_dir_path/LaunchScreen-iPhone.storyboard: error: Failed to find or create execution context for description '<IBVisionPlatformToolDescription: 0x600003278380> System content for IBCocoaTouchFramework-seventeenAndLater <IBScaleFactorDeviceTypeDescription: 0x600003278400> scaleFactor=2x, renderMode.identifier=(null)'.
Device type: IBSimDeviceTypeiPad2x (com.apple.dt.Xcode.IBSimDeviceType.iPad-2x)
Sim runtime: visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1
Device: (null)
** Please also include the output of `xcrun simctl diagnose` and `xcode-select -p`.
Underlying Errors:
Description: Failed to find a suitable device for the type IBSimDeviceTypeiPad2x (com.apple.dt.Xcode.IBSimDeviceType.iPad-2x) with runtime visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1
Failure Reason: Failed to create new simulator device in set SimDeviceSet : /Users/user/Library/Developer/Xcode/UserData/IB Support/Simulator Devices that matches IBSimDeviceTypeiPad2x (com.apple.dt.Xcode.IBSimDeviceType.iPad-2x) for runtime visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1 (Incompatible device). Available devices: (
"IBSimDeviceTypeiPad3x (D1B76A51-0DB5-439F-B65D-891AB20C2B26, iOS 18.1, Shutdown)",
"IBSimDeviceTypeiPad2x (CA17569D-D3EA-4B11-A20D-3571D4A5E58A, iOS 18.1, Shutdown)",
"IBSimDeviceTypeiPad2x (E3EE1BC6-2F2F-48D3-A378-A819824F7082, iOS 18.1, Shutdown)"
)
Underlying Errors:
Description: Incompatible device
The error appears when the command is executed from TeamCity; when it is launched manually from a console, it works fine.
I have tried reinstalling the SDK, switching between different Xcode versions, removing caches, and clearing the IB Support folder; nothing helps.
The xcrun simctl list command prints the following:
== Device Types ==
iPhone 17 Pro (com.apple.CoreSimulator.SimDeviceType.iPhone-17-Pro)
iPhone 17 Pro Max (com.apple.CoreSimulator.SimDeviceType.iPhone-17-Pro-Max)
iPhone Air (com.apple.CoreSimulator.SimDeviceType.iPhone-Air)
iPhone 17 (com.apple.CoreSimulator.SimDeviceType.iPhone-17)
iPhone 16 Pro (com.apple.CoreSimulator.SimDeviceType.iPhone-16-Pro)
iPhone 16 Pro Max (com.apple.CoreSimulator.SimDeviceType.iPhone-16-Pro-Max)
iPhone 16e (com.apple.CoreSimulator.SimDeviceType.iPhone-16e)
iPhone 16 (com.apple.CoreSimulator.SimDeviceType.iPhone-16)
iPhone 16 Plus (com.apple.CoreSimulator.SimDeviceType.iPhone-16-Plus)
iPhone 15 Pro (com.apple.CoreSimulator.SimDeviceType.iPhone-15-Pro)
iPhone 15 Pro Max (com.apple.CoreSimulator.SimDeviceType.iPhone-15-Pro-Max)
iPhone 15 (com.apple.CoreSimulator.SimDeviceType.iPhone-15)
iPhone 15 Plus (com.apple.CoreSimulator.SimDeviceType.iPhone-15-Plus)
iPhone 14 Pro (com.apple.CoreSimulator.SimDeviceType.iPhone-14-Pro)
...
Apple Vision Pro (com.apple.CoreSimulator.SimDeviceType.Apple-Vision-Pro-4K)
Apple Vision Pro (com.apple.CoreSimulator.SimDeviceType.Apple-Vision-Pro)
iPod touch (7th generation) (com.apple.CoreSimulator.SimDeviceType.iPod-touch--7th-generation-)
== Runtimes ==
iOS 18.1 (18.1 - 22B81) - com.apple.CoreSimulator.SimRuntime.iOS-18-1
iOS 18.6 (18.6 - 22G86) - com.apple.CoreSimulator.SimRuntime.iOS-18-6
visionOS 2.1 (2.1 - 22N580) - com.apple.CoreSimulator.SimRuntime.xrOS-2-1
== Devices ==
-- iOS 18.1 --
iPhone 16 Pro (78683AE0-6238-4EF1-BD47-192FC0DC2559) (Shutdown)
iPhone 16 Pro Max (98BFE915-787D-4673-991A-5D3EC448CC51) (Shutdown)
iPhone 16 (D3E6FED8-9E65-4141-A31C-FC7EA3492111) (Shutdown)
...
iPad mini (A17 Pro) (10F1862E-E27B-42C5-9B02-1FA71D2F4707) (Shutdown)
iPad (10th generation) (E6E0D31F-91E4-4D3E-898D-516D79DDD58E) (Shutdown)
-- visionOS 2.1 --
Apple Vision Pro (E4D22A75-A38E-4135-9152-EF4D64BEC3F2) (Shutdown)
I am requesting assistance with an issue involving my Advanced App Clip Experience, which has remained in the “Received” state for more than a few months, preventing the App Clip from becoming available when invoked via QR code.
App Details
App Name: Yellow Label Verification
App Store Bundle ID: com.acviss.demoindia
App Clip Bundle ID: com.acviss.demoindia.Clip
Team ID: F2RLQ4VV59
App Version (Live): 1.4
Domain: acviss.com
Issue Summary
My Advanced App Clip Experience is stuck in the “Received” status. The “Publish” and “Build Assignment” options never appear, even though:
The updated AASA file is correctly published at:
https://acviss.com/.well-known/apple-app-site-association
It contains the correct appclips → appID and paths entries
It is served with the correct application/json content type
Domain validation in App Store Connect shows Validated
The App Clip build is already approved and live on the App Store
Safari-based App Clip invocation works as expected
Despite this, the Advanced App Clip Experience has not transitioned from “Received” to “Processing” or “Published.”
Because of this, QR-based invocation consistently shows “App Clip Unavailable”, indicating that the App Clip Experience has not yet been activated on Apple’s backend.
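For reference, the appclips entry being described has roughly this shape (the bundle and team IDs come from this post; the applinks details are elided):

{
  "appclips": {
    "apps": ["F2RLQ4VV59.com.acviss.demoindia.Clip"]
  }
}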
Reproduction Steps
Publish correct AASA file with appclips array and paths
Validate domain (shows green “Validated” in App Store Connect)
Open the Advanced App Clip Experience in App Store Connect
Status stays as Received
“Build Assignment” or “Publish” buttons never appear
QR scanning of the App Clip URL continues to show App Clip Unavailable
Request
Could you please check the backend processing of my App Clip Experience and manually trigger the sync or processing?
It appears that the App Clip Experience is not being processed even though all configuration, AASA, and domain validations are correct. I would greatly appreciate your assistance in resolving this so the App Clip can be invoked successfully via QR.
Thank you very much for your help and support.
Hello everyone, quick question. I have an app with subscriptions that limit some functionality until you subscribe.
The app detects whether you have an active subscription; if you do not, choosing functionality that is behind the paywall redirects you to the upgrade screen where the subscriptions are located.
The reviewer replied "We have started the review of your app, but we are not able to continue because we cannot locate the in-app purchases. To help us proceed with the review of your app, please reply to this message providing the steps for locating the in-app purchases in your app."
I think the issue is that the review team previously tested the app with a sandbox account, and now the app no longer displays that upgrade screen because it detects that the sandbox account has an active subscription, so the reviewer can't reach the screen anymore.
Has anyone encountered this type of issue in the past? If you erase the purchase history for that particular sandbox account, then log out and back into it, it works, but I'm not sure whether the reviewers do that.
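For context, the kind of subscription check described above is typically a StoreKit 2 entitlement walk along these lines (a sketch; the app's actual product handling may differ):

import StoreKit

// Walk current entitlements and treat any verified auto-renewable
// transaction as an active subscription.
func hasActiveSubscription() async -> Bool {
    for await result in Transaction.currentEntitlements {
        if case .verified(let transaction) = result,
           transaction.productType == .autoRenewable {
            return true
        }
    }
    return false
}

With a sandbox account that has purchase history, this returns true, which would hide the upgrade screen exactly as described.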
I've created a Snippet for my iOS app which I want to be able to run from the LockScreen via a Shortcuts widget.
All works fine except when I run the shortcut and the App Snippet appears, it doesn't always render the SwiftUI view in the same way.
Sometimes the width boundaries are respected and sometimes not.
I've tested this on iOS 26.1 and iOS 26.2 beta 3
I think this is a bug but it would be great if anyone could see what I might be doing wrong if it's not.
In case it is a bug, I've filed feedback (FB21076429), and I've created a stripped-down sample project showing the issue, with screenshots.
Basic code to reproduce the issue:
// Intent.swift
// SnippetBug

import AppIntents
import Foundation
import SwiftUI

struct SnippetEntryIntent: AppIntent {
    static let title: LocalizedStringResource = "Open Snippet"
    static let description = IntentDescription("Shows a snippet.")

    // Don’t open the app – stay in the snippet surface.
    static let openAppWhenRun: Bool = false

    func perform() async throws -> some ShowsSnippetIntent {
        .result(snippetIntent: TestSnippetIntent())
    }
}

struct TestSnippetIntent: SnippetIntent {
    static let title: LocalizedStringResource = "Snippet Intent"
    static let description = IntentDescription("Action from snippet.")

    @MainActor
    func perform() async throws -> some IntentResult & ShowsSnippetView {
        .result(view: SnippetView(model: SnippetModel.shared))
    }
}

@MainActor
final class SnippetModel {
    static let shared = SnippetModel()

    private init() {
    }
}

struct SnippetView: View {
    let model: SnippetModel

    var body: some View {
        HStack {
            Text("Test Snippet with information")
            Spacer()
            Image(systemName: "heart")
        }
        .font(.headline)
    }
}

struct Shortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SnippetEntryIntent(),
            phrases: [
                "Snippet for \(.applicationName)",
                "Test Snippet \(.applicationName)"
            ],
            shortTitle: "Snippet",
            systemImageName: "barcode"
        )
    }
}
You also need these lines in your main App entry point:
import AppIntents
import SwiftUI

@main
struct SnippetBugApp: App {
    init() {
        let model = SnippetModel.shared
        AppDependencyManager.shared.add(dependency: model)
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
[Screenshot: the correct rendering]
[Screenshot: the incorrect rendering]
Subject: Xcode Cloud not detecting ci_scripts/ci_post_clone.sh for Flutter iOS build
Description: I'm attempting to build a Flutter iOS app using Xcode Cloud, but the build is failing because Xcode Cloud cannot detect my custom build script located at ci_scripts/ci_post_clone.sh in the repository root.
Setup:
Repository: https://github.com/GlamTam2000/King-chi-app
Branch: ios-build-legacy
Xcode Project: flutter_application_1/ios/Runner.xcworkspace
Xcode Version specified: 15.4
Issue: The Xcode Cloud build logs consistently show:
Post-Clone script not found at ci_scripts/ci_post_clone.sh
However, the script file is confirmed to exist in the repository:
The file is committed and pushed to GitHub (commit 9bd3aa1)
Local git verification: git ls-tree HEAD ci_scripts/ shows the file exists
File permissions: 100755 (executable)
File location: Repository root /ci_scripts/ci_post_clone.sh
What I've tried:
Created ci_scripts/ci_post_clone.sh at repository root with executable permissions
Ensured Unix line endings (LF, not CRLF)
Removed macOS extended attributes
Tried both ci_post_clone.sh and ci_pre_xcodebuild.sh scripts
Created empty commits to force Xcode Cloud to fetch latest changes
Verified the file exists locally and in git history
Why I need this script: Flutter requires running flutter build ios --release --no-codesign before Xcode can build, which generates the FlutterGeneratedPluginSwiftPackage that Xcode depends on. Without this script running, the build fails with:
Could not resolve package dependencies: the package at '.../FlutterGeneratedPluginSwiftPackage' cannot be accessed
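For reference, the script itself is a minimal ci_post_clone.sh along these lines (the Flutter install step is an assumption; the project path comes from this repository's layout):

#!/bin/sh
# ci_scripts/ci_post_clone.sh — intended to run after Xcode Cloud clones the repo.
set -e

# Install Flutter (illustrative; pin whatever channel/version you need).
git clone https://github.com/flutter/flutter.git -b stable "$HOME/flutter"
export PATH="$PATH:$HOME/flutter/bin"

# Generate FlutterGeneratedPluginSwiftPackage before xcodebuild runs.
cd "$CI_PRIMARY_REPOSITORY_PATH/flutter_application_1"
flutter pub get
flutter build ios --release --no-codesign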
Question: Why is Xcode Cloud not detecting the ci_scripts/ci_post_clone.sh file even though it exists in the repository root? Is there a specific configuration in App Store Connect or an Xcode Cloud workflow setting that needs to be enabled for custom scripts to run? Additional files in the repository (also not working):
.xcode-version at repository root (specifying 15.4)
.xcodecloud.yml at repository root (with workflow configuration)
Any guidance on how to make Xcode Cloud properly detect and execute custom build scripts would be greatly appreciated.
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
Xcode Server
Swift
Xcode
Hi everyone, I downloaded the source code EditingSpatialAudioWithAnAudioMix.zip from https://developer.apple.com/documentation/Cinematic/editing-spatial-audio-with-an-audio-mix. When I ran one of its command-line actions, named "process", the program crashed.
From the source code, I found that the value of componentType is set to kAudioUnitType_FormatConverter:
// The actual `AudioUnit`.
public var auAudioMix = AVAudioUnitEffect()

init() {
    // Generate a component description for the audio unit.
    let componentDescription = AudioComponentDescription(
        componentType: kAudioUnitType_FormatConverter,
        componentSubType: kAudioUnitSubType_AUAudioMix,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    auAudioMix = AVAudioUnitEffect(audioComponentDescription: componentDescription)
}
But according to the documentation at https://developer.apple.com/documentation/avfaudio/avaudiouniteffect/init(audiocomponentdescription:), it seems that componentType cannot be set to kAudioUnitType_FormatConverter.
Has anyone encountered this problem?
I am an individual developer, and I want to create a demo. Do I need to develop an app for both iOS and Android to accomplish this?
Has Apple provided a simple demo?
Topic:
App & System Services
SubTopic:
iCloud & Data
In the header for workloop.h there is this note:
A dispatch workloop is a "subclass" of dispatch_queue_t which can be passed to all APIs accepting a dispatch queue, except for functions from the dispatch_sync() family. dispatch_async_and_wait() must be used for workloop objects. Functions from the dispatch_sync() family on queues targeting a workloop are still permitted but discouraged for performance reasons.
I have a couple questions related to this. First, I'd like to better understand what the alluded-to 'performance reasons' are that cause this pattern to be discouraged in the 'queues targeting a workloop' scenario. From further interrogation of the headers, I've found these explicit callouts regarding differences in the dispatch_sync and dispatch_async_and_wait API:
dispatch_sync:
Work items submitted to a queue with dispatch_sync() do not observe certain queue attributes of that queue when invoked (such as autorelease frequency and QOS class).
dispatch_async_and_wait:
Work items submitted to a queue with dispatch_async_and_wait() observe all queue attributes of that queue when invoked (inluding [sic] autorelease frequency or QOS class).
Additionally, dispatch_async_and_wait has a section of the headers devoted to 'Differences with dispatch_sync()', though I can't say I entirely follow the distinctions it attempts to draw.
Based on that, my best guess is that the 'performance reasons' are something about either QoS not being properly respected/observed or some thread context switching differences that can degrade performance, but I would appreciate insight from someone with more domain knowledge.
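For what it's worth, the attribute-observation difference the headers describe can be seen from Swift, assuming the Dispatch overlay's asyncAndWait (a sketch; the QoS values actually printed can vary with priority boosting):

import Dispatch
import Foundation

// A serial queue whose work items should run at .utility QoS.
let queue = DispatchQueue(label: "com.example.demo", qos: .utility)

// dispatch_sync: the block typically runs on the caller's thread and does
// not adopt the queue's attributes (QoS class, autorelease frequency).
queue.sync {
    print("sync QoS:         \(qos_class_self().rawValue)")
}

// dispatch_async_and_wait: still blocks the caller, but the work item
// observes the queue's attributes, as the headers describe.
queue.asyncAndWait {
    print("asyncAndWait QoS: \(qos_class_self().rawValue)")
}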
My second question is a bit more general. Taking a step back, why exactly do these two APIs exist? It's not clear to me from the documentation I've found why I would or should prefer dispatch_sync over dispatch_async_and_wait (other than the aforementioned callout noting the former is unsupported on workloops). What is the motivation for preserving both of these APIs versus deprecating dispatch_sync in favor of dispatch_async_and_wait (or functionally subsuming one with the other)?
Credit to Luna for originally posing/inspiring these questions.