Posts

Post not yet marked as solved
0 Replies
398 Views
I recently refactored a project to break common code out into a Swift Package Manager (SPM) package. The code references conditional-compilation flags such as DEBUG that are defined by the host application via a Swift Active Compilation Condition. However, from what I can tell, modules outside the main app are unable to see these flags. For example, if the SPM package does this:

```swift
#if DEBUG
print("debug mode!")
#endif
```

…the print() will never execute. Is there any way for an SPM package to access these definitions? If not, is there a recommended best practice for including "debug only" code in an SPM package? Thanks!
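One approach worth considering (a sketch, not necessarily Apple's recommended practice): rather than inheriting the app's flags, a package can define its own compilation condition in its manifest, scoped to debug builds. The package and target names below are illustrative:

```swift
// swift-tools-version:5.3
// Package.swift for a hypothetical shared-code package.
import PackageDescription

let package = Package(
    name: "MyCommonCode",
    targets: [
        .target(
            name: "MyCommonCode",
            swiftSettings: [
                // Defines DEBUG inside this package whenever the package
                // itself is built in the debug configuration, independent
                // of the host app's Active Compilation Conditions.
                .define("DEBUG", .when(configuration: .debug))
            ]
        )
    ]
)
```

With this in place, `#if DEBUG` blocks inside the package compile in debug builds of the package, though note this tracks the package's own build configuration, not the app's.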
Posted by scalo.
Post not yet marked as solved
8 Replies
2.8k Views
My NEDNSProxyProvider subclass basically does this for each incoming UDP flow:

```
open()
while moreDataComing {
    read()
    filter() // modify EDNS0
    write()
}
closeWrites()
closeReads()
```

What I sometimes see, though, is that after the first read/write sequence, iOS appears to close the UDP flow for me. I see these in the logs:

```
default 16:12:09.186963 -0800 MyProxy (2552644817): Closing reads, not closed by plugin
default 16:12:09.187134 -0800 MyProxy (2552644817): Closing writes, not sending close
```

Then later, when I attempt to close the flow explicitly, I get:

```
default 16:12:09.273912 -0800 MyProxy writeDatagrams finished with error: Optional(Error Domain=NEAppProxyFlowErrorDomain Code=1 "The operation could not be completed because the flow is not connected" UserInfo={NSLocalizedDescription=The operation could not be completed because the flow is not connected})
```

Sometimes I also see:

```
error 17:01:55.396243 -0800 MyProxy (3009813469): flow is closed for writes, cannot write 111 bytes of data
```

I'm not 100% sure whether this actually affects the functioning of the proxy, but I'd like to understand why it's happening. Is there something I'm doing (or not doing) that causes iOS (or CFNetwork, or whatever) to close the socket before I can do so explicitly? My class holds a strong reference to the NEAppProxyUDPFlow object, so it isn't being deallocated early.
Posted by scalo.
Post not yet marked as solved
4 Replies
1.5k Views
I'm trying to verify receipts with Xcode 12 and a StoreKit configuration file, but I keep getting a 21002 error. According to the docs (https://developer.apple.com/documentation/xcode/setting_up_storekit_testing_in_xcode), I need to use a different certificate. I generated the certificate, but it's not clear what to do with it:

```swift
#if DEBUG
let certificate = "StoreKitTestCertificate"
#else
let certificate = "AppleIncRootCertificate"
#endif
```

That's great, but what actually uses certificate?
Posted by scalo.
Post not yet marked as solved
0 Replies
467 Views
If I use AVAssetDownloadTask to download an encrypted audio-only HLS stream, is it possible to get access to the raw audio data, for the purposes of integrating with AVAudioEngine, AVAudioPlayerNode, and associated effects? If so, what's the process like for doing that? Thanks!
Posted by scalo.
Post not yet marked as solved
0 Replies
399 Views
I have an iOS music game (-ish) app that requires some basic synchronization between audio playback and other non-video events. To keep it simple, let's say I have some sound effects in files, and I want to show a dot on the screen, with a high degree of accuracy, at the moment a sound effect is heard. Playing through the built-in speaker is fine, but users will often be using AirPods, and the added latency is too noticeable. I don't need this to be super low latency; I just need to know what the latency is, so that I can show the dot exactly when the user hears the sound. I've mostly been experimenting in the AVAudioEngine/AVPlayerNode space. Those provide a variety of latency/ioBufferDuration properties (see below), but unfortunately they seem pretty inaccurate. Are they supposed to be? Is there a better way to sync non-video things with audio playback?

```swift
func printLatencyInfo() {
    print("audioSession.inputLatency: \(audioSession.inputLatency)")
    print("audioSession.outputLatency: \(audioSession.outputLatency)")
    print("audioSession.ioBufferDuration: \(audioSession.ioBufferDuration)")
    print("engine.mainMixerNode.auAudioUnit.latency: \(engine.mainMixerNode.auAudioUnit.latency)")
    print("engine.inputNode.auAudioUnit.latency: \(engine.inputNode.auAudioUnit.latency)")
    print("engine.outputNode.auAudioUnit.latency: \(engine.outputNode.auAudioUnit.latency)")
    print("engine.mainMixerNode.outputPresentationLatency: \(engine.mainMixerNode.outputPresentationLatency)")
    print("engine.inputNode.outputPresentationLatency: \(engine.inputNode.outputPresentationLatency)")
    print("engine.outputNode.outputPresentationLatency: \(engine.outputNode.outputPresentationLatency)")
    print("playerNode.auAudioUnit.latency: \(playerNode.auAudioUnit.latency)")
}
```
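For reference, the arithmetic these properties are meant to support: convert the scheduled sample position into seconds and add the route's reported output latency to estimate the moment the listener actually hears the sample. This sketch assumes the reported latency is roughly trustworthy for the current route; the function and parameter names are illustrative, not an Apple API:

```swift
import Foundation

// Sketch: given a frame position in the player's timeline, the stream's
// sample rate, and the route's reported output latency (e.g. the value
// of AVAudioSession.outputLatency), estimate when, in seconds from
// playback start, the listener hears that frame.
func perceivedTime(framePosition: Double, sampleRate: Double, outputLatency: TimeInterval) -> TimeInterval {
    framePosition / sampleRate + outputLatency
}

// A sound scheduled at frame 48_000 in a 48 kHz stream, through a route
// reporting ~150 ms of latency, is heard roughly 1.15 s after start.
let heardAt = perceivedTime(framePosition: 48_000, sampleRate: 48_000, outputLatency: 0.15)
print(heardAt)
```

The question then reduces to whether outputLatency is accurate enough on AirPods for the dot to land on the beat, which is exactly what seems doubtful.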
Posted by scalo.
Post not yet marked as solved
2 Replies
305 Views
The CloudKit Dashboard has been down for more than a day, yet Apple's System Status board (https://developer.apple.com/system-status/) shows a green circle: available. Hopefully someone on the CloudKit team is aware of this, but could you a) update the system status page and b) reboot that server? 😀 Thanks
Posted by scalo.
Post marked as solved
4 Replies
502 Views
Our DNSProxyProvider network extension is currently provisioned for customers by a "siteKey" that they include in their MDM config. This works fine: when the app launches, it pulls the siteKey from the providerConfiguration vended by NEDNSProxyManager.

Of course, some of our customers would like the DNS proxy to run without making their users open the app even once. This is proving to be challenging. The proxy itself can't get the providerConfiguration from NEDNSProxyManager because, as we see in the console:

```
NEDNSProxyManager objects cannot be instantiated from NEProvider processes
```

On macOS you can force settings into an app via the mobileconfig plist, which are then accessible via the "com.apple.configuration.managed" standard user defaults key. But as far as I can tell, this doesn't work on iOS. (Or maybe it does, but the format is different? I can't find any documentation on this for iOS.)

Is there any way to get this bit of information at the proxy level without ever launching the main app? Thanks!
Posted by scalo.
Post marked as solved
2 Replies
582 Views
I'm trying to convert an AVAudioPCMBuffer with a 44100 Hz sample rate to one with a 48000 Hz sample rate, but I always get an exception (-50 error) when converting. Here's the code:

```swift
guard let deviceFormat = AVAudioFormat(standardFormatWithSampleRate: 48000.0, channels: 1) else { preconditionFailure() }

// This file is saved as mono 44100
guard let lowToneURL = Bundle.main.url(forResource: "Tone220", withExtension: "wav") else { preconditionFailure() }
guard let audioFile = try? AVAudioFile(forReading: lowToneURL) else { preconditionFailure() }

let tempBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                  frameCapacity: AVAudioFrameCount(audioFile.length))!
tempBuffer.frameLength = tempBuffer.frameCapacity
do { try audioFile.read(into: tempBuffer) }
catch { assertionFailure("*** Caught: \(error)") }

guard let converter = AVAudioConverter(from: audioFile.processingFormat, to: deviceFormat) else { preconditionFailure() }
guard let convertedBuffer = AVAudioPCMBuffer(pcmFormat: deviceFormat,
                                             frameCapacity: AVAudioFrameCount(audioFile.length)) else { preconditionFailure() }
convertedBuffer.frameLength = tempBuffer.frameCapacity
do { try converter.convert(to: convertedBuffer, from: tempBuffer) }
catch { assertionFailure("*** Caught: \(error)") }
```

Any ideas?
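One thing worth noting (an observation, not a confirmed diagnosis): the output buffer above is sized with the input file's frame count, but at 48 kHz the converted audio needs proportionally more frames than it did at 44.1 kHz. The arithmetic, with an illustrative helper function (not an AVFoundation API):

```swift
import Foundation

// Sketch: number of frames the output buffer needs when resampling,
// scaled by the rate ratio and rounded up to be safe.
func outputFrameCapacity(inputFrames: Int, inputRate: Double, outputRate: Double) -> Int {
    Int((Double(inputFrames) * outputRate / inputRate).rounded(.up))
}

// One second of 44.1 kHz audio becomes 48 000 frames at 48 kHz, so a
// destination buffer sized at 44 100 frames is too small for the result.
let needed = outputFrameCapacity(inputFrames: 44_100, inputRate: 44_100, outputRate: 48_000)
print(needed)
```

Sizing `convertedBuffer`'s frameCapacity this way (and not assigning the input's frameCapacity to the output's frameLength) is one plausible thing to try before digging deeper into the -50 error.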
Posted by scalo.
Post marked as solved
3 Replies
1.5k Views
It seems like the topic of where NEDNSProxyProvider is allowed is thinly documented. The Apple docs say:

> You usually do this in the context of managed devices, such as those owned by a school or an enterprise.

But they fall short of saying that you can't use the API in general App Store apps. That's fine; I already know this to be the case. The question is: what if you're distributing your app through a custom B2B App Store? Is it then allowed, assuming the devices are managed through MDM? Thanks
Posted by scalo.
Post not yet marked as solved
1 Reply
590 Views
In iOS 11, CallKit introduced an "incremental" mode, which presumably allows a Call Directory extension to add phone numbers incrementally instead of all at once. Unfortunately, it's not clear how this is supposed to work, and there's little to no documentation. Specifically:

- How does the app indicate that it wants to do an incremental update? CXCallDirectoryExtensionContext's isIncremental is read-only.
- If the intention is that the extension reads the isIncremental property in beginRequest and only pushes an incremental update in that case: a) what if the extension only has a full list at that point? b) Under which conditions will isIncremental be true/false?
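Whatever the answer on the API side, the incremental model implies that the extension keeps a record of the numbers it last pushed and submits only the differences on the next request. A framework-free sketch of that diff step (the function and parameter names are illustrative, not part of CallKit):

```swift
import Foundation

// Sketch: given the previously submitted number list and the new desired
// list (both sorted ascending, matching CallKit's requirement that entries
// be added in sequential order), compute which numbers an incremental
// update would remove and which it would add.
func incrementalChanges(previous: [Int64], current: [Int64]) -> (toRemove: [Int64], toAdd: [Int64]) {
    let old = Set(previous)
    let new = Set(current)
    let toRemove = previous.filter { !new.contains($0) }  // preserves sort order
    let toAdd = current.filter { !old.contains($0) }      // preserves sort order
    return (toRemove, toAdd)
}

let changes = incrementalChanges(previous: [14_085_550_100, 14_085_550_101],
                                 current: [14_085_550_101, 14_085_550_102])
print(changes.toRemove) // numbers to drop from the directory
print(changes.toAdd)    // numbers to add to the directory
```

If isIncremental turns out to be false on a given request, the extension would fall back to submitting `current` in full, which is also the only option when no prior snapshot is available.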
Posted by scalo.