Search results for "Popping Sound" · 19,350 results found

Post · Replies · Boosts · Views · Activity

'Invalid value for purchase intake' error
Hello, I recently saw an error from StoreKit in the Console - 'Invalid value for purchase intake' - while debugging an SKPayment subscription issue (a valid receipt should be verified and restored, but isn't for one user). I haven't been able to find any documentation about this message and wondered whether it's related.

There were two other logs from StoreKit immediately before it:

'Found 3 products in receipt with ID'
'Processing ad attribution purchase intake'

Does anyone know what 'invalid value for purchase intake' is referencing? We don't have AdAttributionKit implemented - it sounds like it might be related to that instead? Thank you
Replies: 0 · Boosts: 0 · Views: 73 · Jul ’25
Terminal Command or AppleScript to Set Audio Balance to Perfect Center?
Hi everyone, I'm looking for a way to programmatically set the left/right audio balance to perfect center (50/50) using either a Terminal command or AppleScript.

Background: The audio balance slider in System Settings > Sound > Output & Input works, but I have difficulty determining when it's positioned at the exact center point. The visual nature of the slider makes it challenging for me to achieve the precision I need, and I end up adjusting it repeatedly trying to get it perfectly centered.

What I'm looking for:

A Terminal command that can set the audio balance to exact center
An AppleScript that accomplishes the same thing
Any other programmatic method to ensure perfect 50/50 balance

I've tried searching through the defaults command documentation and the Core Audio frameworks but haven't found the right approach yet. Has anyone successfully automated this setting before? Any help would be greatly appreciated! Thanks in advance, Dylan
Replies: 0 · Boosts: 0 · Views: 66 · Jul ’25
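The Core Audio route mentioned in the post above can be sketched in Swift. The default-output-device lookup and kAudioDevicePropertyStereoPan are real HAL properties, but whether a given output device actually publishes stereo pan is device-dependent, so treat this as a sketch under those assumptions rather than a guaranteed fix:

```swift
import CoreAudio

// Sketch: set the default output device's stereo pan to dead center.
// kAudioDevicePropertyStereoPan uses 0.0 = left, 1.0 = right, 0.5 = center.
// Not every device supports this property; a robust tool would fall back
// to setting per-channel volume scalars instead.
var device = AudioDeviceID(0)
var size = UInt32(MemoryLayout<AudioDeviceID>.size)
var defaultOut = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDefaultOutputDevice,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMain)
AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                           &defaultOut, 0, nil, &size, &device)

var pan = Float32(0.5)  // perfect center
var panAddr = AudioObjectPropertyAddress(
    mSelector: kAudioDevicePropertyStereoPan,
    mScope: kAudioDevicePropertyScopeOutput,
    mElement: kAudioObjectPropertyElementMain)
if AudioObjectHasProperty(device, &panAddr) {
    AudioObjectSetPropertyData(device, &panAddr, 0, nil,
                               UInt32(MemoryLayout<Float32>.size), &pan)
}
```

Compiled with swiftc (the filename is hypothetical), this would serve as the requested Terminal command; checking AudioObjectHasProperty first avoids erroring out on devices that don't expose pan.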
Reply to App Store code signing show "Beta Profile"
[quote='793484021, TekMun, /thread/793484, /profile/TekMun'] I followed this instruction to manually re-sign my ipa [/quote] To be clear, DTS doesn’t support folks re-signing iOS apps. If you need to re-sign an app, the best path forward is to distribute an Xcode archive (.xcarchive) and then re-sign that using Xcode. However, I don’t think that’s the issue here. Rather, it sounds like you’re trying to run distribution-signed code. That won’t work. See Don’t Run App Store Distribution-Signed Code. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Jul ’25
Reply to Validation incorrectly detects need for web browser engine entitlement, causes "corrupted binaries" error
Are you specifically trying to use the new arm64e enhanced security that we enabled with iOS 26 beta? For more on that, see Enabling enhanced security for your app. The reason I ask is that arm64e on iOS was previously limited to folks creating a third-party browser engine using BrowserKit. But it sounds like you’re not in that business at all. If I’m right, then my advice is to disable arm64e until App Store Connect is set up to receive iOS 26 beta submissions. In the past that’s happened at the Release Candidate stage, although I can’t make concrete statements as to what’ll happen this year. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Jul ’25
Execution breakpoint when trying to play a music library file with AVAudioEngine
Hi all, I'm working on an audio visualizer app that plays files from the user's music library using MediaPlayer and AVAudioEngine. I'm getting the music library functionality working before the visualizer aspect.

After setting up the engine for file playback, my app inexplicably crashes with an EXC_BREAKPOINT with code = 1. Usually this means I'm unwrapping a nil value, but I think I'm handling the optionals correctly with guard statements. I'm not able to pinpoint where it's crashing; I think it's either in the play function or the setupAudioEngine function. I removed the processAudioBuffer function and my code still crashes the same way, so it's not that. The device I'm testing on is running iOS 26 beta 3, although my app is designed for iOS 18 and above. After commenting out code, it seems that the app crashes at the scheduleFile call in the play function, but I'm not fully sure.

Here is the setupAudioEngine function: private func setupAudioEngine() { do { try AVAudioSessi
Replies: 8 · Boosts: 0 · Views: 596 · Jul ’25
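For the crash described above, a guarded setup ordering like the following avoids the most common scheduleFile trap. This is a minimal sketch, not the poster's actual code; the class and method names are illustrative:

```swift
import AVFoundation

// Minimal playback sketch (the file URL is assumed to come from the
// media library). The point is ordering: attach and connect the player
// node, start the engine, and only then schedule. Calling scheduleFile
// or play on a node that isn't attached to a running engine is a common
// source of EXC_BREAKPOINT traps.
final class Player {
    private let engine = AVAudioEngine()
    private let node = AVAudioPlayerNode()

    func play(fileAt url: URL) throws {
        let file = try AVAudioFile(forReading: url)   // throws instead of trapping
        engine.attach(node)
        engine.connect(node, to: engine.mainMixerNode,
                       format: file.processingFormat) // match the file's format
        try engine.start()
        node.scheduleFile(file, at: nil)              // safe once engine is running
        node.play()
    }
}
```

Note also that DRM-protected library items return asset URLs that AVAudioFile cannot open, which surfaces differently (a thrown error) from the trap described here.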
Reply to Installer package is terminated after 600 seconds
Thanks for that. I see the documentation now in the Sequoia man pages. It doesn't seem like I can specify the timeout for a package postinstall script, only when I specify a script for a specific bundle. Would you agree? Does a dictionary in a component plist need to refer to an app bundle? I don't really have a bundle to install for the package choice. To answer your question: there are very large libraries of audio content available for our program, and the install scripts give users various options for installing and using this content. It can definitely take longer than 10 minutes.
Jul ’25
Reply to Seeking Confirmation on Picture-in-Picture Support for Audio Calls
[quote] Hi Kevin, what I’m trying to do is use Picture-in-Picture to display relevant call information to the user—such as the current call status and connection quality—when the app goes into the background. [/quote] My read of that is that this seems like something that would be much better handled through Live Activities and/or some other mechanism like the notification system. Notably:

Live Activities work in broader contexts, like on the lock screen.
Live Activities can be initiated from the background, while starting PiP requires your app to be in the foreground.
Using PiP for this is going to disrupt whatever other PiP activity the user was doing.

Basically, I think using PiP for something like this is going to require significantly more work than Live Activities and will only work in a much narrower context. [quote] The purpose is to keep the user informed during an ongoing call. [/quote] Just to clarify something: you are planning to use CallKit, LiveCommunicationKit, or the PushToTalk framework, correct? My concern here is that you're
Jul ’25
Reply to "Captions" in the Accessibility Nutrition Label for text-based apps
[quote='843657022, DTS Engineer, /thread/788023?answerId=843657022#843657022'] It seems incorrect to indicate the presence of captions when they aren't there in an audio or video asset. [/quote] Perhaps, although I DO have captions for every single video in my app (of which there are zero). The reverse error, implying that some of my content is inaccessible, seems worse to me. I have no such inaccessible content. Clearly, what's needed here is something more than a 2-state response. We need to be able to say "Not applicable - no videos to caption."
Jul ’25
Reply to CallKit does not activate audio session with higher probability after upgrading to iOS 18.4.1
Hi @DTS Engineer I would like to follow up on this issue. From our metrics, it looks like the incoming-call no-audio issue increased after upgrading to iOS 18.4.1; we did not observe an increase for outbound calls. Here is the incoming call process in our app:

An incoming call arrives via push notification; the app calls reportNewIncomingCall.
After the incoming call is reported successfully, the app configures the audio session, posting configureAudioSession to another thread from the reportNewIncomingCall callback. (The app then waits for the user to press the answer button, which may take several seconds.)
The user answers the call and the CXAnswerCallAction callback is delivered; the app finishes its signaling/answering logic and calls action.fulfill().
CallKit should then activate the audio session, and didActivate should be triggered - but it's not.
Our app detects no audio for 5 seconds and tries setActive(YES) as a workaround; sometimes it succeeds, sometimes it doesn't. If the setActive(YES) failed, it will
Topic: App & System Services · SubTopic: General
Jul ’25
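The ordering in the flow above can be sketched with the CXProviderDelegate callbacks involved. This is a minimal sketch, not the poster's code; the class name is illustrative, and the key point is that audio I/O belongs in didActivate rather than in a manual setActive(true) call, which races with CallKit's own session activation:

```swift
import AVFoundation
import CallKit

// Sketch of the activation ordering. Configure the session category
// early, fulfill the answer action promptly, and start audio I/O only
// once CallKit hands you the activated session.
final class CallManager: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {}

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Configure (but do not activate) the session before fulfilling;
        // long async gaps before fulfill() can delay or drop activation.
        try? AVAudioSession.sharedInstance()
            .setCategory(.playAndRecord, mode: .voiceChat)
        action.fulfill()
    }

    func provider(_ provider: CXProvider,
                  didActivate audioSession: AVAudioSession) {
        // CallKit has activated the session: start the audio engine here,
        // not from your own setActive(true) call on another thread.
    }
}
```

If configureAudioSession runs on another thread and itself calls setActive, that alone can explain intermittent missing didActivate callbacks.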
App Sandbox and the loading of libraries written at runtime
We're interested in adopting App Sandbox in an app distributed outside of the Mac App Store. However, we're hitting a bit of a roadblock, and it doesn't seem like either of the techniques described in that post can be used in a reasonable way.

For background, this is a third-party launcher for a cross-platform Java game that, among other things, makes it easier for users to mod the game. Users generally download mods as .jar files and place them in a certain directory. In some cases, these mods contain native dynamic libraries (e.g. a .dylib) as part of their code. In general, the .dylib is extracted from the contents of the .jar to some temporary location, loaded, and then deleted once the game closes (the exact details, like the actual temporary location, depend on the mod).

App Sandbox greatly interests us in this case because it can limit the damage that a compromised mod could do, and in my testing the functionality of most mods still works with it enabled. However, sandboxed apps quarantine every file t
Replies: 5 · Boosts: 0 · Views: 251 · Jul ’25
Reply to restore root file with tmutil
So, there are actually a few different things at work here. First, this error: /usr/local/mnt: No such file or directory ...means exactly what it sounds like. You specified the mount target (the point where the new file system would attach in your file system) as /usr/local/mnt/, and that directory either does not exist or you don't have permission to it. That's what I'd expect, as that's not a standard macOS directory, so it won't exist unless you create it. The solution is to create the directory, change the permissions, or pick a new directory.

[quote] I'm not familiar with how Time Machine on a NAS works [/quote] I believe we still create disk images, which are then mounted and treated as local backup target volumes. That means there are actually two permission systems at work here:

The permissions of the SMB volume the NAS device shares.
The permissions of the Time Machine disk image.

That difference is important, because it leads to the difference in behavior between this case: This was on a USB drive mounted with
Topic: App & System Services · SubTopic: Core OS
Jul ’25
Reply to Seeking Confirmation on Picture-in-Picture Support for Audio Calls
[quote] Thank you for your response, Kevin. From what I can see, the document appears to focus only on video calls, while my use case involves audio-only communication. Could you kindly confirm whether it’s acceptable to display a Picture-in-Picture view for an audio-only call, even if the document primarily focuses on video calls? [/quote] What are you actually trying to do? While the documentation primarily describes camera-to-camera streaming and the idea of two people looking at each other through their cameras, nothing in the API actually requires that. As far as the API is concerned, it has no idea what content it's actually showing and will happily display whatever you tell it to. Similarly, many VoIP apps allow their users to send content other than the image of the person currently talking (screen captures, shared documents, whiteboards, etc.). I'm not aware of us ever having had any issue with that sort of thing, particularly when the user has direct control of what's happening and the value to the
Jul ’25
How can I find the user's "Favorite Songs" playlist?
It sounds simple, but searching for the name Favorite Songs is a non-starter because the playlist is called different names in different countries, even if I specify &l=en_us on the query. So is there another property, relationship, or combination thereof which I can use to tell me when I've found the right playlist? Properties I've looked at so far:

canEdit: always false, so it narrows things down a little
inFavorites: not helpful, as it depends on whether the user has favourited the Favorites playlist, so not relevant
hasCatalog: seems always true, so again it may narrow things down a bit
isPublic: doesn't help

Adding the catalog relationship doesn't seem to show anything immediately useful either. Can anyone help? Ideally I'd like to see this as a kind or type, as it has different properties to other playlists, but frankly I'll take anything at this point.
Replies: 0 · Boosts: 0 · Views: 184 · Jul ’25