Hi, I'm a freshman interested in audio streaming, and I would like to build an app that streams the songs of one band, something like the Bandcamp app: simple, clear audio streaming. Do you have any recommendations on where to start, and on frameworks or online courses for setting up an app that streams songs? Best, Jonas
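A minimal sketch of one way to start, assuming AVFoundation's AVPlayer for progressive streaming; the StreamPlayer class and the example URL are placeholders for illustration, not a recommendation of a specific architecture:

import AVFoundation

// Minimal sketch: stream a remote track with AVPlayer.
final class StreamPlayer {
    private var player: AVPlayer?

    func play(from url: URL) {
        // Configure the session for playback so audio keeps playing with the screen locked
        // (the app also needs the "audio" background mode enabled).
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try? AVAudioSession.sharedInstance().setActive(true)

        // AVPlayer buffers and plays a remote MP3/AAC/HLS URL progressively.
        player = AVPlayer(url: url)
        player?.play()
    }
}

// Hypothetical usage with a placeholder URL:
// let streamer = StreamPlayer()
// streamer.play(from: URL(string: "https://example.com/band/track1.mp3")!)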
Search results for "Popping Sound" (19,600 results found)
Hi, the system automatically shows the Now Playing button in the tab bar and nav bar when your app becomes the now playing app, and hides the button when another app starts playing audio. You are correct that, in order for the Now Playing button to show AND for Now Playing to populate with metadata, your app must start playing audio. In general, if your app is playing audio at the time it's launched on the car screen, it's a best practice for your app to present its root template (e.g. a tab bar template) and then immediately push the Now Playing template on top. This is also done on behalf of your app if the user launches it by way of the system 'Now Playing' icon on the CarPlay home screen.

Also, is there a way to hide the Now Playing button after the queue of content has finished playing? I'm able to pop the Now Playing template, but the Now Playing button is still present, and tapping it will navigate the user to the now blank Now Playing template.
Topic: UI Frameworks · SubTopic: General
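For context on the pattern described in the CarPlay answer above, here is a minimal sketch, assuming the iOS 14+ CarPlay scene API; the empty tab list and the isAudioPlaying check are placeholders for whatever the app actually tracks:

import CarPlay
import UIKit

// Sketch: set the root tab bar template, then push the shared Now Playing
// template on top if the app is already playing audio when CarPlay connects.
final class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    var interfaceController: CPInterfaceController?

    // Placeholder; in a real app this would query the audio player's state.
    var isAudioPlaying: Bool { false }

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController

        let tabBar = CPTabBarTemplate(templates: []) // real tab templates go here
        interfaceController.setRootTemplate(tabBar, animated: false, completion: nil)

        if isAudioPlaying {
            interfaceController.pushTemplate(CPNowPlayingTemplate.shared,
                                             animated: false,
                                             completion: nil)
        }
    }

    func templateApplicationScene(_ templateApplicationScene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) {
        self.interfaceController = nil
    }
}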
I've observed that USB audio seems to fail in the El Capitan Developer Preview 1 using a CEntrance HiFi-M8. This device supports a range of 2-channel 24-bit formats (44.1 kHz, 48.0 kHz, 88.2 kHz, 96.0 kHz, 176.4 kHz, 192.0 kHz, etc.). While it worked just fine in Yosemite, what occurs now is a grainy/garbled sound, a bit as if the wrong format were being played, even when 44.1 kHz is selected in Audio MIDI Setup and a 44.1 kHz file is played in iTunes or in Safari.
Can the NotificationSound be text that is played when a notification comes in (i.e. the sound would be text-to-speech)?
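The question above is terse, so as an assumption about what is being asked: remote notification sounds themselves have to be pre-rendered audio files in the bundle, but while the app is running arbitrary text can be spoken with AVSpeechSynthesizer. A minimal sketch:

import AVFoundation

// Sketch: speak notification text aloud (text-to-speech) while the app is running.
// The synthesizer is kept alive so speech isn't cut off mid-utterance.
let speechSynthesizer = AVSpeechSynthesizer()

func speakNotificationText(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // placeholder voice
    speechSynthesizer.speak(utterance)
}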
I am working on a design that requires connecting an iOS device to two audio output devices, specifically headphones and a speaker. I want the audio driver to switch the output device without user action. Is this manageable via the iOS SDK?
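A hedged note on the question above: as far as I know, the public SDK does not allow routing to an arbitrary external output without user action; the closest API is AVAudioSession's overrideOutputAudioPort, which toggles between the default route (e.g. headphones) and the built-in speaker. A sketch under that assumption:

import AVFoundation

// Sketch: toggle output between the built-in speaker and the default route.
// The speaker override requires the .playAndRecord category.
func routeToBuiltInSpeaker(_ useSpeaker: Bool) {
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playAndRecord, mode: .default, options: [])
    try? session.overrideOutputAudioPort(useSpeaker ? .speaker : .none)
}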
After installing the macOS Mojave Developer Beta 3 I have very poor sound through my AirPods. The sound is comparable to launching Siri on the Mac and then listening to music right after closing Siri. I don't have this problem on my iPhone running iOS 12 Developer Beta 3.
Today I was going to use Siri on my iPhone 5C and found out something: there are no sounds, not even Siri's voice. That's strange, because on my iPad mini 2 all of Siri's sounds are there. Anyone else experiencing this?
Xcode 7.1 adds: Interface Builder supports enabling Peek & Pop for segues, per <https://developer.apple.com/library/prerelease/watchos/releasenotes/DeveloperTools/RN-Xcode/Chapters/xc7_release_notes.html>. I'm seeing the new Peek & Pop / Preview & Commit segues for some segues, but not others. What requirements need to be met for this option to appear in IB?
I have lost sound alerts when sending and receiving emails. How do I fix this?
In my tvOS app I play video with AVPlayerViewController and use the externalMetadata for providing info to the NowPlayingInfo. I also play audio with AVQueuePlayer and my custom UI. For audio, I set the info directly on the MPNowPlayingInfoCenter. When I run my app and play some audio first, the NowPlayingInfo is displayed correctly. Then I play a video and the NowPlayingInfo is also displayed correctly. But after presenting a video with the AVPlayerViewController, playback of audio and setting the info via MPNowPlayingInfoCenter no longer work. I also noticed that in the Remote app the NowPlayingInfo still shows the info from the last video. Prior to tvOS 11 everything worked as expected: I could play audio and video in any order and the NowPlayingInfo always showed correctly. Is this a bug in tvOS 11, or must I do something else? Thanks.
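For reference, a minimal sketch of setting the info directly on MPNowPlayingInfoCenter, as the post above describes for its audio path; the field values are placeholders:

import MediaPlayer

// Sketch: publish now-playing metadata for custom (non-AVPlayerViewController) playback.
func updateNowPlaying(title: String, duration: TimeInterval, elapsed: TimeInterval) {
    var info: [String: Any] = [:]
    info[MPMediaItemPropertyTitle] = title
    info[MPMediaItemPropertyPlaybackDuration] = duration
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
    info[MPNowPlayingInfoPropertyPlaybackRate] = 1.0
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}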
So I am working on an app that requires audio files to be played intermittently, and it should duck audio coming from the background (such as music from the iPod app or Pandora). I understand that you need to activate your audio session and then, upon completion of the audio playback, deactivate the audio session to get the background sound to return to its normal volume. The only problem I am having is that when there is no background audio playing and the audio session is not active, the volume buttons on the side of the phone control the ringer volume and not the app audio volume. This is a problem for me, because I would like the user to be able to use the volume buttons to control the volume of the upcoming audio alerts at all times. When the volume buttons are controlling the ringer, the user has no control over the app audio / alert volume. I know of at least two fitness apps in the App Store that are able to duck
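A minimal sketch of the duck-then-restore flow the post describes, assuming an AVAudioSession configured with .duckOthers and deactivated with .notifyOthersOnDeactivation once the alert finishes; this sketch does not address the ringer-versus-app-volume behaviour of the hardware buttons:

import AVFoundation

// Sketch: activate a ducking session, play the alert, then deactivate so
// background audio returns to full volume.
func playAlertDuckingOthers(play: () -> Void) {
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playback, mode: .default, options: [.duckOthers])
    try? session.setActive(true)
    play() // start the alert; in real code, deactivate in the player's completion handler
}

func alertFinished() {
    try? AVAudioSession.sharedInstance().setActive(false, options: [.notifyOthersOnDeactivation])
}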
I've got a 13 Pro Max, and I wonder if Apple forgot the function of hearing the sound in the headset while recording, not just when playing it back afterwards, which takes time. Plus, doing documentary film, it's not always possible to do a second similar recording if the sound is bad. Does anybody know what to do?
I'm covering my bases here. I have registered my app for remote notifications in my AppDelegate and have included the sound property in the aps object. An example aps is given here: aps = { alert = scrubbed out; content-available = 1; latitude = scrubbed out; longitude = scrubbed out; scrubbed out = 1; sound = BLE_alarmNoti.caf; title = scrubbed out; }. If I have a look at the main bundle, the file BLE_alarmNoti.caf is included. How do you add custom sounds to a remote notification? The guide tbh is very ambiguous on this, as the section for custom sounds only gives instructions for making a local notification. The program has to be able to play this custom sound, as it is an emergency reminder: if someone is lost or has an accident, the notification goes from their device to another device. Once I've established that the code works, I'll go on to the next part, which is that it might be a device issue.
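For reference, a minimal example of how such a payload is typically shaped: the sound value is simply the file name of a .caf bundled in the app (here the BLE_alarmNoti.caf from the post), and Apple documents a 30-second limit for custom notification sounds; the alert fields are placeholders:

{
  "aps": {
    "alert": { "title": "…", "body": "…" },
    "sound": "BLE_alarmNoti.caf",
    "content-available": 1
  }
}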
Is it possible for an app to continuously play background audio while other apps, such as YouTube or Spotify, are also playing music or sound?
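A hedged sketch related to the question above: whether this works depends on the other apps involved, but configuring the session with .mixWithOthers is the mechanism that lets an app's audio play alongside other apps' audio instead of interrupting it:

import AVFoundation

// Sketch: a mixable playback session that does not interrupt other apps' audio.
func configureMixedPlayback() {
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
    try? session.setActive(true)
}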
I'm looking into how to package an Audio Unit consisting of several other audio units. I came up with the following solution, calling the AUAudioUnit render function explicitly from the AUInternalRenderBlock. But the audio unit doesn't seem to run, since mData inside outputData remains uninitialized after the render function has been triggered. Any suggestions?

From myAudioUnit.m, inheriting from AUAudioUnit:

- (AUInternalRenderBlock)internalRenderBlock {
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp *timestamp,
                              AVAudioFrameCount frameCount,
                              NSInteger outputBusNumber,
                              AudioBufferList *outputData,
                              const AURenderEvent *realtimeEventListHead,
                              AURenderPullInputBlock pullInputBlock) {
        [_eqAudioUnit.audioUnit renderBlock](actionFlags, timestamp, frameCount,
                                             outputBusNumber, outputData, pullInputBlock);
        return noErr;
    };
}

_eqAudioUnit is an instance of EQAudioUnit (ensuring that allocateRenderResources is invoked before triggering the renderBlock): publi