We sell pre-recorded attraction audio guides through our app. When we first submitted for review, we were told that because the audio files would be used on the consumer's device, they are classified as a service on the Apple platform, so we cannot direct the payment process to a third party; the payment has to be made through Apple. But recently we found out that an app with very similar functions can sell its audio files through the WeChat payment system. Can anyone tell me why this is? The app can be found through the link below: https://itunes.apple.com/cv/app/yu-yin-dao-you-dao-ting-tu-shuo/id1035532946?mt=8 Their first audio track is always free; all the others are chargeable.
Search results for "Popping Sound": 19,602 results found
Some of our clients who have updated their devices to iOS 11 have reported that they are not receiving any sound or vibration when receiving notifications; after testing myself, I can confirm that this is the case. The notification is constructed on our API with "ping.aiff" as the value for sound, like so:
string appleSound = "ping.aiff";
var appleNotificationMessage = $"{{\"aps\":{{\"alert\":\"{message}\",\"sound\":\"{appleSound}\"}}}}";
I've tried to search around a bit but haven't found anything; any ideas why this may have stopped working? Notifications are coming through fine, they are just silent, and as far as I can tell the notification settings in the app are fine.
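Not an answer to the iOS 11 behavior, but one thing worth ruling out with hand-built payloads like the one above is malformed JSON: if the quotes around keys or values are lost, the notification may arrive silent or not at all. A hedged sketch (Python purely for illustration; the payload shape is the standard APNs one, but the helper function is my own, not part of any SDK) that builds the payload with a serializer instead of string interpolation:

```python
import json

def make_apns_payload(message: str, sound: str = "ping.aiff") -> str:
    # A JSON serializer guarantees properly quoted keys and values,
    # which hand-built format strings can silently lose.
    payload = {"aps": {"alert": message, "sound": sound}}
    return json.dumps(payload)

print(make_apns_payload("Hello"))
# {"aps": {"alert": "Hello", "sound": "ping.aiff"}}
```

The same idea applies in C#: serializing an anonymous object rather than interpolating a string makes the quoting mistakes impossible.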
There has been a bug affecting music streaming from iTunes to Apple TV that started with tvOS 13 and has persisted in each release, including the most recent beta (13.4.5, build 17L5543d). It seems to primarily affect those streaming music from a PC using iTunes for Windows. A detailed description of the behavior can be found below; if anyone has any suggestions, they would be appreciated. When playing music via iTunes to an Apple TV running tvOS 13 or higher, audio on the Apple TV drops out frequently during song playback. Audio drops about 20 seconds into the track and returns with about 10 seconds remaining before the end of the track. Sound continues playing on unaffected AirPlay devices without interruption. It occurs every 1 to 5 songs and seems to be random. It is not track specific (i.e., it can happen to a specific song, but later that same song can complete without an audio drop). After the sound has dropped, deselecting the affected Apple TV as an audio destin…
Whenever I get a notification, it lowers the audio I'm currently listening to. Everywhere I look, the answer I see is just to mute my phone or put it on Do Not Disturb, but I still wish to hear the ringer go off when I get a notification. Is there a way I can stop the audio from lowering without needing to mute the ringer?
I've written an audio unit which I've been testing by loading it in the application from the sample code (https://developer.apple.com/library/content/samplecode/AudioUnitV3Example/Introduction/Intro.html). The good part: all audio processing works as expected. The problem: the audio unit extension's view doesn't show. The extension is registered as having a UI, with the NSExtensionPointIdentifier property set to com.apple.AudioUnit-UI, and the sample code host will open the custom view pane, but the UI is simply blank. The frustration: as said, the audio is being processed, but the UI doesn't appear. And to make matters worse, no errors are reported, so it's difficult to troubleshoot. The question: why? What is wrong, since the audio unit won't show its UI? There seem to be some discrepancies in the documentation as to why this could be happening. One place states that it's all about the plist, opting in or out of the UI: "The name of the main storyboard file for the…"
Hi all, I am developing a digital signal processing application using AudioToolbox to capture audio from an audio loopback application (BlackHole). Environment: macOS Sonoma 14.4.1, Xcode 15.4, QuickTime 10.5 (I also tested with JRiver Media Center), BlackHole 2ch and 16ch. Problem: all audio samples received are zero. Steps to recreate: set the Mac's Settings > Sound audio output to BlackHole 2ch; set the Mac's Settings > Sound audio input to BlackHole 2ch; authorise Xcode to access the microphone; in Audio MIDI Setup, enable "Use this device for sound input" and "Use this device for sound output"; set the volume of both to 1.0; play a 44.1 kHz 16-bit signed integer stereo FLAC file using QuickTime; start the C++ application. Key details of my code below:
AudioStreamBasicDescription asbd = { 0 };
asbd.mFormatID = kAudioFormatLinearPCM;
asbd.mFormatFlags = kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsPacked;
asbd.mSampleRate = 48000;
asbd.mBitsPerChannel = 32;
asb…
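The snippet above is cut off before the remaining ASBD fields, but for packed linear PCM those byte-count fields are pure arithmetic, and an inconsistency among them is a classic cause of zeroed or rejected buffers. A small sketch of that arithmetic (Python for illustration only; the dictionary keys mirror the AudioStreamBasicDescription field names, and the helper itself is hypothetical):

```python
def pcm_derived_fields(channels: int, bits_per_channel: int,
                       frames_per_packet: int = 1) -> dict:
    # For packed, interleaved linear PCM:
    #   mBytesPerFrame  = channels * (bits per sample / 8)
    #   mBytesPerPacket = mBytesPerFrame * mFramesPerPacket
    bytes_per_frame = channels * bits_per_channel // 8
    return {
        "mBytesPerFrame": bytes_per_frame,
        "mFramesPerPacket": frames_per_packet,
        "mBytesPerPacket": bytes_per_frame * frames_per_packet,
    }

# 32-bit float stereo, matching the asbd fields shown above:
print(pcm_derived_fields(channels=2, bits_per_channel=32))
# {'mBytesPerFrame': 8, 'mFramesPerPacket': 1, 'mBytesPerPacket': 8}
```

Also worth checking, without claiming it is the cause here: the asbd requests 48 kHz while the FLAC file is 44.1 kHz, so it depends on where (or whether) a rate conversion is happening in the chain.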
I am using an iPhone 8 Plus, iOS version 15.4.1, and the browser is Safari. I played an audio track in a playlist; when the track ends, the player automatically plays the next track in the playlist. This works fine if the iPhone screen is not locked, but when the screen is locked, the next track does not play. Here is an example link: http://relaxinglive.com/album.php?id=1 Open this link and choose a song. Wait until the end of the track; if the screen is locked, the player will not advance to the next track. Note: the issue only happens on my iPhone; it doesn't happen on my MacBook. The issue appeared when my iPhone updated to 15.4.1 (it didn't happen before). Is it possible to make the browser play the next audio track when the screen is locked?
I am working on an iOS app for mixing an audio file with the user's voice input into a new file, while playing the content of this audio file simultaneously. You can think of this app as a karaoke player which records both the singer's voice and the original soundtrack to a file while playing the original soundtrack for the singer. I use AUGraph to establish the audio processing flow as follows: one mixer AudioUnit (type kAudioUnitType_Mixer, subtype kAudioUnitSubType_MultiChannelMixer) with 2 inputs; one resampler AudioUnit (kAudioUnitType_FormatConverter, kAudioUnitSubType_AUConverter) for the necessary sample rate conversion from the sample rate of the audio file to that of the mic input, so that the formats of the mixer's 2 input buses match. Then I connect these nodes: mic's output (output element of the I/O node's input scope) -> mixer's input 0; resampler's output -> mixer's input 1; mixer's output -> speaker's input (input element of the I/O node's output scope). Set a render…
I'm trying to play a sound after an NSTimer has counted down. So far I know the selector is being called, but for some reason no sound is playing.
let timer = NSTimer(fireDate: task.date, interval: 0, target: self, selector: Selector("update"), userInfo: nil, repeats: false)
NSRunLoop.mainRunLoop().addTimer(timer, forMode: NSRunLoopCommonModes)
func update() {
    backgroundMusic = self.setupAudioPlayerWithFile(music, type: "mp3")
    backgroundMusic.volume = 1
    backgroundMusic.prepareToPlay()
    backgroundMusic.play()
    println("we made it!")
}
func setupAudioPlayerWithFile(file: NSString, type: NSString) -> AVAudioPlayer {
    var path = NSBundle.mainBundle().pathForResource(file as String, ofType: type as String)
    var url = NSURL.fileURLWithPath(path!)
    var error: NSError?
    var audioPlayer: AVAudioPlayer?
    audioPlayer = AVAudioPlayer(contentsOfURL: url, error: &error)
    return audioPlayer!
}
The timer works perfectly; it executes the function update, and I know that because I always see the println in the console, but
My app records audio in background mode. Sometimes the app crashes (I still don't know why), and then the system restarts it by itself. Before it is closed, I pause the audio recording (inside applicationWillTerminate), and after the app reopens I call the play function inside a sync function that runs in my app every minute. On my test device (iPhone 5s, iOS 13.2.3), the app never crashes (to simulate a crash I have to close the app manually), and resuming the audio recording always works in the way I have described. On users' devices (several iPhone models, all newer than the 5c), I see in the log file that the app could not resume audio recording after it crashed. Sorry, I don't have much more information for now, but if someone could help me understand why this is happening, I would be very grateful. Thanks
Is there a way to fade out my audio file once the button is pressed, and only have the audio fade on the first loop? At the moment I have an action where the button plays audio and changes the page, but the audio comes in too early. I'm trying to find a way for the audio to come in only when the page changes, so if there's an alternative way, that would be awesome 🙂
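Hard to say without knowing which authoring tool this is, but a fade-out generally reduces to ramping the player's volume from its current level down to 0.0, applying one step per timer tick. A hypothetical sketch of the ramp itself (Python for illustration; the function name and step count are mine, not from any Apple API):

```python
def fade_out_steps(start_volume: float = 1.0, steps: int = 20) -> list:
    # Linear gain ramp from start_volume down to 0.0; apply one value
    # per timer tick (fade duration / steps seconds apart).
    return [start_volume * (1 - i / steps) for i in range(steps + 1)]

print(fade_out_steps(1.0, steps=4))
# [1.0, 0.75, 0.5, 0.25, 0.0]
```

To fade only on the first loop, one approach is to apply the ramp once and then restart playback at full volume for subsequent loops.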
In an iOS app, it's possible to have an app that has been backgrounded continue to play audio via the Info.plist setting 'Required background modes' including the item 'App plays audio or streams audio/video using AirPlay'. Using this key in tvOS seems to have no effect, and audio stops when I open another application while my AVPlayerViewController is playing audio. Is this a bug in tvOS, or is there a different key?
Hi everyone, I would like to write an app that is able to record system audio (the audio that is output through the iPhone's speakers). I've read conflicting information online, so I am unsure what to expect. Some people say that due to sandboxing this is not possible; others state that using AVFoundation and AVAudioSession it is possible. Thanks, Austin
I'm having difficulties with AudioContext.getOutputTimestamp on Apple devices, which I use for synchronizing animation data to audio. The difference between the AudioContext's current time and getOutputTimestamp's context time is a measure of audio-processing-to-speaker-output latency, which can differ depending on the device. Since getOutputTimestamp was introduced, I have not been able to use the output timestamp in Safari, but I could use it in Chrome on Apple devices. However, since upgrading to iOS 15.1, getOutputTimestamp is giving contextTime values that are approximately 10,000x smaller than the AudioContext's currentTime, when they are supposed to be in the same units as currentTime. See the following modification of an example from the Web Audio examples, which illustrates the problems with AudioContext.getOutputTimestamp on Apple devices. getOutputTimestamp() reference. Any help in understanding why I'm seeing such big differences here on Apple devices wo…
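One generic way to confirm a units mismatch like the "10,000x smaller" contextTime is to sample both clocks twice and compare how fast they advance: two clocks in the same units advance at the same rate regardless of any fixed offset. A hypothetical sketch of that check (plain Python, since Web Audio only runs in a browser; in practice a0/a1 would be two readings of currentTime and b0/b1 two readings of getOutputTimestamp().contextTime):

```python
def rate_ratio(a0: float, a1: float, b0: float, b1: float) -> float:
    # How fast clock B advances relative to clock A. ~1.0 means the
    # two clocks share units; ~1e-4 would match a contextTime that is
    # roughly 10,000x smaller than currentTime.
    return (b1 - b0) / (a1 - a0)

# Example: currentTime advanced 2.0 s while contextTime advanced 0.0002:
print(round(rate_ratio(10.0, 12.0, 0.0010, 0.0012), 6))
# 0.0001
```

A ratio that is stable across intervals points at a unit/scale bug rather than jitter or a stalled clock.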
I am about to start a new project on OS X that will require streaming audio from a USB-attached 16-channel mixer. I've been doing lots of googling and am confused as to what tools I should be using. I downloaded the latest version of Xcode (11.3) and it does not seem to have Core Audio headers for C++. Has the C++ version been deprecated? Further searching brought me to these libraries: AudioToolbox and AVFoundation. Are these the replacements? And are they only available in Objective-C? I'm quite confused, as some of the online Apple documentation also says "This document is no longer being updated." The bottom line: what tools should I be using to access the physical audio mixer streams? Any help greatly appreciated. Sean