Posts

Post not yet marked as solved
1 Reply
1.1k Views
I need to get the microphone output in a certain format that isn't equal to the hardware format. To do this, I'm creating an AVAudioMixerNode that will have that format as its output. However, I never receive any buffers when installing the tap on the mixer node. I thought audio from the inputNode would flow downstream into the mixer? Am I doing something wrong? Note that I'm not using AudioUnits or AudioQueues because I need to do some frequency filtering on the actual audio stream, and I thought this was the easiest way to do it.

Here's the code:

    mixerNode = [[AVAudioMixerNode alloc] init];

    // Attach the node
    [theEngine attachNode:mixerNode];

    // Then connect the inputNode (mic) to the mixer node
    [theEngine connect:theEngine.inputNode to:mixerNode format:[theEngine.inputNode outputFormatForBus:0]];

    [theEngine startAndReturnError:&theError];

    // Now set up the real audio format I want
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate = 8000;
    audioFormat.mChannelsPerFrame = numberOfChannels;
    audioFormat.mFormatID = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger;
    audioFormat.mBitsPerChannel = 16;
    audioFormat.mBytesPerPacket = audioFormat.mBytesPerFrame = (audioFormat.mBitsPerChannel / 8) * audioFormat.mChannelsPerFrame;
    audioFormat.mFramesPerPacket = 1;

    // Now install the tap on the mixer so we get the correct format
    [mixerNode installTapOnBus:0
                    bufferSize:4096
                        format:[[AVAudioFormat alloc] initWithStreamDescription:&audioFormat]
                         block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
        NSLog(@"got buff");
    }];
Posted by jml5qh.
1 Reply
1.3k Views
I am trying to use AudioUnits to take the mic input, convert it to a different sample rate, and filter out any frequencies above 4000 Hz. To do this, I try to set up the following chain:

RemoteIO -> convert to the effect's format -> filter frequencies using kAudioUnitSubType_LowPassFilter -> convert to the final format

I will then take the output and send it as a network stream over RTP. Because of this, I want to use a render callback. My confusion is: what do I set the callback on, and what do I call AudioUnitRender on? From what I can tell, I have to set the callback on the RemoteIO unit. But then, do I call AudioUnitRender on that RemoteIO unit, or on one of the other units? When I call it on the other units, I get errors. When I call it on the RemoteIO unit, I do get audio, but it's not clear to me whether anything is actually getting filtered.

Here is my code:

    // This is the component for the mic input
    AudioComponentDescription desc;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);
    CheckError(AudioComponentInstanceNew(inputComponent, &theAudioUnit), "Instance AU");

    UInt32 flag = 1;
    // Note: the data size must be sizeof(flag), not the literal 1
    CheckError(AudioUnitSetProperty(theAudioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &flag, sizeof(flag)), "EnableIO");

    // This is the final audio format I will use
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate = 8000; // Need a sample rate of 8000
    audioFormat.mChannelsPerFrame = numberOfChannels;
    audioFormat.mFormatID = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger;
    audioFormat.mBitsPerChannel = 16;
    audioFormat.mBytesPerPacket = audioFormat.mBytesPerFrame = (audioFormat.mBitsPerChannel / 8) * audioFormat.mChannelsPerFrame;
    audioFormat.mFramesPerPacket = 1;

    // Here is the effect filter for frequencies above 4000
    AudioComponentDescription lowPass;
    lowPass.componentType = kAudioUnitType_Effect;
    lowPass.componentSubType = kAudioUnitSubType_LowPassFilter;
    lowPass.componentManufacturer = kAudioUnitManufacturer_Apple;
    lowPass.componentFlags = 0;
    lowPass.componentFlagsMask = 0;

    // This is the converter from mic input to the effect format
    AudioComponentDescription converter;
    converter.componentType = kAudioUnitType_FormatConverter;
    converter.componentSubType = kAudioUnitSubType_AUConverter;
    converter.componentManufacturer = kAudioUnitManufacturer_Apple;
    converter.componentFlags = 0;
    converter.componentFlagsMask = 0;
    AudioComponent converterComponent = AudioComponentFindNext(NULL, &converter);
    CheckError(AudioComponentInstanceNew(converterComponent, &converterUnit), "Converter inst");
    AudioComponent effectComponent = AudioComponentFindNext(NULL, &lowPass);
    CheckError(AudioComponentInstanceNew(effectComponent, &lowPassUnit), "Effect inst");

    // Here is the audio format from the mic
    AudioStreamBasicDescription theFormat;
    memset(&theFormat, 0, sizeof(theFormat));
    UInt32 sizeofDesc = sizeof(AudioStreamBasicDescription);
    CheckError(AudioUnitGetProperty(theAudioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &theFormat, &sizeofDesc), "Get IO format");

    // Now I set it as the input for the converter
    CheckError(AudioUnitSetProperty(converterUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &theFormat, sizeof(AudioStreamBasicDescription)), "Set converter in format");

    // Here is the audio format from the effect
    AudioStreamBasicDescription effectFormat;
    memset(&effectFormat, 0, sizeof(effectFormat));
    CheckError(AudioUnitGetProperty(lowPassUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &effectFormat, &sizeofDesc), "Get lowpass format");

    // Now I set that as the output for the converter
    CheckError(AudioUnitSetProperty(converterUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &effectFormat, sizeof(AudioStreamBasicDescription)), "Set converter out format");

    AudioUnitConnection connection;
    connection.sourceAudioUnit = theAudioUnit;
    connection.sourceOutputNumber = 1;
    connection.destInputNumber = 0;

    // Here I connect the mic input to the converter
    CheckError(AudioUnitSetProperty(converterUnit, kAudioUnitProperty_MakeConnection, kAudioUnitScope_Input, 0, &connection, sizeof(AudioUnitConnection)), "Make conn audio to converter");

    AudioUnitConnection lowPassConnection;
    lowPassConnection.sourceAudioUnit = converterUnit;
    lowPassConnection.sourceOutputNumber = 0;
    lowPassConnection.destInputNumber = 0;

    // Here I connect the converter to the effect filter
    CheckError(AudioUnitSetProperty(lowPassUnit, kAudioUnitProperty_MakeConnection, kAudioUnitScope_Input, 0, &lowPassConnection, sizeof(AudioUnitConnection)), "Make conn converter to low pass");

    // Here I set the frequency cutoff
    // Note: the last argument of AudioUnitSetParameter is a buffer offset
    // in frames, not a size, so it should be 0 rather than sizeof(Float32)
    Float32 lowPassFreq = 3999;
    CheckError(AudioUnitSetParameter(lowPassUnit, kLowPassParam_CutoffFrequency, kAudioUnitScope_Global, 0, lowPassFreq, 0), "Set low pass freq");

    // Here is the converter from the filter to the final output format I need
    AudioComponentDescription outConverter;
    outConverter.componentType = kAudioUnitType_FormatConverter;
    outConverter.componentSubType = kAudioUnitSubType_AUConverter;
    outConverter.componentManufacturer = kAudioUnitManufacturer_Apple;
    outConverter.componentFlags = 0;
    outConverter.componentFlagsMask = 0;
    AudioComponent outConverterComponent = AudioComponentFindNext(NULL, &outConverter);
    CheckError(AudioComponentInstanceNew(outConverterComponent, &outConverterUnit), "Out converter inst");

    // Set the input of this to be the effect format
    CheckError(AudioUnitSetProperty(outConverterUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &effectFormat, sizeof(AudioStreamBasicDescription)), "Out converter set input format");

    // Set the output of this to be the final audio format I need
    CheckError(AudioUnitSetProperty(outConverterUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &audioFormat, sizeof(AudioStreamBasicDescription)), "Out converter output format");

    AudioUnitConnection outConnection;
    outConnection.sourceAudioUnit = lowPassUnit;
    outConnection.sourceOutputNumber = 0;
    outConnection.destInputNumber = 0;

    // Set the connection between effect and converter
    CheckError(AudioUnitSetProperty(outConverterUnit, kAudioUnitProperty_MakeConnection, kAudioUnitScope_Input, 0, &outConnection, sizeof(AudioUnitConnection)), "Make connection low pass to out converter");

    // Initialize everything
    AudioUnitInitialize(theAudioUnit);
    AudioUnitInitialize(converterUnit);
    AudioUnitInitialize(lowPassUnit);
    AudioUnitInitialize(outConverterUnit);

    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = recordingCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);

    // I need a render callback, so I set it on the mic input
    CheckError(AudioUnitSetProperty(theAudioUnit, kAudioOutputUnitProperty_SetInputCallback, kAudioUnitScope_Global, 1, &callbackStruct, sizeof(AURenderCallbackStruct)), "Set input callback");

    // And start the mic input
    CheckError(AudioOutputUnitStart(theAudioUnit), "Audio unit start");

And here is the render callback:

    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mData = NULL;
    // Note: the arguments were swapped relative to the format string
    NSLog(@"Number frames: %u, bus number: %u", (unsigned int)inNumberFrames, (unsigned int)inBusNumber);
    // This is where I'm confused. What should the unit be here? The RemoteIO unit? A converter unit?
    OSStatus status = AudioUnitRender(theAudioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, &bufferList);
1 Reply
380 Views
I was working with some Apple Engineers today, but we ran out of time in the session. They asked that I post the feedback number with some information: FB7763992. We were trying to figure out why I was having slow incremental compilation. In our testing, we found 1.4 GB of .d files; the Apple Engineer believed the processing of these files slowed down the Xcode build process.
1 Reply
218 Views
I was working with two Apple Engineers during the labs today, but we ran out of time. They asked that I post this feedback number with some information from our session. They believed there was an Xcode issue where skipped compile jobs were showing up incorrectly in the build output.
1 Reply
297 Views
On all iOS 13 versions, I'm able to execute a "Restricted While Locked" Siri Shortcut even while the device is locked. Is this a known issue in iOS 13? I added a sample project to FB7381961 to demonstrate this issue.
3 Replies
2.6k Views
According to https://developer.apple.com/videos/play/wwdc2019/705/, apps that want to support region monitoring should request Always permission on iOS 13. The user will initially only be prompted to choose "When in Use". The user should then receive a second alert once the device crosses a region, confirming that they want the app to use their location in the background. However, on all iOS 13 betas, I am never able to get that dialog. This means that our app never receives the region crossing.

To reproduce:
1. Request Always location permission on iOS 13.
2. Accept the prompt with "While in Use" and start monitoring a region.
3. Cross the region with the device.
4. Expected behavior: you eventually get a dialog box asking the user to allow this app to use their location in the background.
5. Current behavior: you never receive the dialog box, and the app never receives the region-crossed delegate methods.

I have attached a basic sample project in FB7216407. I am on the latest iOS 13 beta and Xcode 10.2.1.
3 Replies
2.1k Views
Is it recommended to use only one AudioUnit for both input and output? GStreamer makes this really hard, and I am trying to use two AudioUnits with subtype = kAudioUnitSubType_VoiceProcessingIO. However, the volume of the audio output is very quiet, and I am wondering if that has to do with having multiple AudioUnit objects.
0 Replies
747 Views
Is there any easy way to reset the complication transfer budget while I'm testing? I'm working on implementing PushKit complication updates but keep running out of my budget as I'm testing.
2 Replies
692 Views
I have complication images for the Circular image set. According to https://developer.apple.com/design/human-interface-guidelines/watchos/icons-and-images/complication-images/, the Simple size should be 40x40 for the 44mm watch. However, adding a 40x40 image leads to an error in Xcode stating that the image should be 36x36 (the same size as the 40mm and 42mm). Which is correct?
0 Replies
789 Views
Is there any documentation on how a Custom Intent's authentication setting affects executing the intent on HomePod? Right now, if you try to execute an intent marked "Restricted While Locked" on HomePod while your iPhone is locked, the HomePod will just ignore it without any feedback. If the paired iPhone is unlocked, the intent will execute. I would expect HomePod to at least provide feedback on why it can't execute the intent.
0 Replies
1.4k Views
Summary: I am attempting to use self-sizing cells on iOS 12 / Xcode 10. Unfortunately, they are not working as expected.
Steps to Reproduce: Run the attached project on iOS 12. Run the attached project on iOS 11.
Expected Results: The cells should be 1/3 of the screen width.
Actual Results: On iOS 11, the cells are 1/3 of the screen width. On iOS 12, they are the width of their prototype cell.
Version/Build: Xcode 10 beta 6, iOS 12 beta 7.
Base project attached at rdar://43260298
9 Replies
5.7k Views
Summary: I am attempting to use self-sizing cells on iOS 12 / Xcode 10. Unfortunately, they are not working as expected.
Steps to Reproduce: Run the attached project on iOS 12. Run the attached project on iOS 11.
Expected Results: The cells should be 1/3 of the screen width.
Actual Results: On iOS 11, the cells are 1/3 of the screen width. On iOS 12, they are the width of their prototype cell.
Version/Build: Xcode 10 beta 6, iOS 12 beta 7.
Base project attached at rdar://43260298
27 Replies
21k Views
Summary: When turning on DWARF with dSYM File, my Xcode build hangs on the "Generating dSYM" step. If I let the build continue, that step will complete in about 20 minutes, but my system freezes. On Xcode 9, this step took about 30 seconds.
Steps to Reproduce: Turn on DWARF with dSYM File in your Build Settings. Start a build.
Expected Results: The "Generating dSYM" step takes a reasonable amount of time and doesn't freeze your Mac.
Actual Results: The "Generating dSYM" step takes 20 minutes and freezes my Mac.
Version/Build: I only tested on Xcode 10 Beta 4 and Xcode 10 Beta 6, and both experience this issue. I am also using the Legacy Build System.
Files: I have attached a log in rdar://43332192
1 Reply
1.2k Views
With the new build system in Xcode 10, I'm running into an issue. In our project we have our main app, a Today View Extension, and a Notification Service Extension. We previously set our Product Module Name to the same value in all targets. We also use NSKeyedArchiver / NSKeyedUnarchiver to cache Swift objects that can be accessed by all targets using app groups. Since the Product Module Name was the same in all targets, NSKeyedUnarchiver worked, because all Swift classes were prefixed with the same module.

However, on the new build system I receive an error when the Product Module Name is the same for embedded targets:

    -1: duplicate output file '/[Product Module Name].swiftmodule/x86_64.swiftdoc' on task: Ditto [Product Module Name].swiftmodule/x86_64.swiftdoc [Product Module Name].swiftdoc (in target 'Notification Service Extension')
    -1: duplicate output file '[Product Module Name].swiftmodule/x86_64.swiftmodule' on task: Ditto [Product Module Name].swiftmodule/x86_64.swiftmodule [Product Module Name].build/Dev Debug Production-iphonesimulator/Notification Service Extension.build/Objects-normal/x86_64/[Product Module Name].swiftmodule (in target 'Notification Service Extension')

I am able to build by setting a distinct Product Module Name for each target, but that breaks using NSKeyedUnarchiver across app groups. I get the following crash:

    Terminating app due to uncaught exception 'NSInvalidUnarchiveOperationException', reason: '*** -[NSKeyedUnarchiver decodeObjectForKey:]: cannot decode object of class ([Product Module Name].SharedEventHistoryItem) for key (NS.object.0); the class may be defined in source code or a library that is not linked'