In Interface Builder, the placement of a UI element (say, a button) doesn't change whether I make its alignment proportional to the Safe Area or proportional to the Superview.
I have a button which I set its horizontal alignment to be:
I have another button which I set its horizontal alignment to be:
Both buttons end up being aligned horizontally:
I would have expected the button aligned to the Safe Area to be shifted to the right, since the Safe Area's leading edge is inset to the right of the Superview's.
I'm probably missing something but can't quite understand what is going on here.
The problem is that heights and widths proportional to the Safe Area are honored, so the size of a UI element does change depending on whether it is proportional to the Safe Area or to the Superview. So if you lay out elements with Safe Area proportional heights and widths and also use Safe Area proportional horizontal and vertical placement, the elements don't line up on iPhones with a notch. They roughly line up on devices such as iPads and iPhones without a notch, where the Safe Area nearly coincides with the Superview's bounds.
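To make the expected geometry concrete, here is a minimal pure-Swift sketch (not actual Auto Layout code; the metrics are assumed iPhone X landscape values with a 44-point Safe Area inset on each side, used only for illustration):

```swift
// Assumed metrics: superview width 812, Safe Area inset 44 per side.
let superviewWidth = 812.0
let safeAreaLeading = 44.0
let safeAreaWidth = superviewWidth - 2 * safeAreaLeading  // 724

// A "proportional horizontal placement" with multiplier m would put the
// element's center at m × referenceWidth from the reference leading edge.
func center(multiplier m: Double, leading: Double, width: Double) -> Double {
    leading + m * width
}

// At m = 0.5 both references happen to coincide...
let centerInSuperview = center(multiplier: 0.5, leading: 0, width: superviewWidth)            // 406
let centerInSafeArea  = center(multiplier: 0.5, leading: safeAreaLeading, width: safeAreaWidth) // 406

// ...but off-center, the two references should give different positions.
let quarterInSuperview = center(multiplier: 0.25, leading: 0, width: superviewWidth)            // 203
let quarterInSafeArea  = center(multiplier: 0.25, leading: safeAreaLeading, width: safeAreaWidth) // 225
```

This is the behavior the original poster expected; the observed behavior is that the horizontal placement is computed the same way for both references.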
When rendering a scene using environment lighting and the physically based lighting model, I need one object to reflect another object. As I understand it, with this type of rendering, reflections are only based on the environment lighting and nothing else. As a solution I intended to use a light probe placed between the object to be reflected and the reflecting object. My scene is built programmatically, not through an Xcode scene file. From Apple's WWDC 2016 presentation on SceneKit I gathered that light probes can be updated programmatically through the updateProbes method of the SCNRenderer class. I have the following code, where I am trying to initialize a light probe using the updateProbes method:
let sceneView = SCNView(frame: self.view.frame)
self.view.addSubview(sceneView)
let scene = SCNScene()
sceneView.scene = scene
let lightProbeNode = SCNNode()
let lightProbe = SCNLight()
lightProbeNode.light = lightProbe
lightProbe.type = .probe
scene.rootNode.addChildNode(lightProbeNode)
var initLightProbe = true
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
if initLightProbe {
initLightProbe = false
let scnRenderer = SCNRenderer(device: sceneView.device, options: nil)
scnRenderer.scene = scene
scnRenderer.updateProbes([lightProbeNode], atTime: time)
print ("Initializing light probe")
}
}
I don't seem to get any light coming from this light probe. My question is simple: can the updateProbes method be used to initialize a light probe? If not, how can you initialize a light probe programmatically?
I am trying to understand how timestamping works for an AUv3 MIDI plug-in of type "aumi", where the plug-in sends MIDI events to a host. I cache the MIDIOutputEventBlock and the transportStateBlock properties into _outputEventBlock and _transportStateBlock in the allocateRenderResourcesAndReturnError method and use them in the internalRenderBlock method:
- (AUInternalRenderBlock)internalRenderBlock {
    // Capture in locals to avoid Obj-C member lookups. If "self" is captured in render, we're doing it wrong. See sample code.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags, const AudioTimeStamp *timestamp, AVAudioFrameCount frameCount, NSInteger outputBusNumber, AudioBufferList *outputData, const AURenderEvent *realtimeEventListHead, AURenderPullInputBlock pullInputBlock) {
        // Transport state
        if (_transportStateBlock) {
            AUHostTransportStateFlags transportStateFlags;
            _transportStateBlock(&transportStateFlags, nil, nil, nil);

            if (transportStateFlags & AUHostTransportStateMoving) {
                if (!playedOnce) {
                    // Note on!
                    unsigned char dataOn[] = {0x90, 69, 96};
                    _outputEventBlock(timestamp->mSampleTime, 0, 3, dataOn);
                    playedOnce = YES;

                    // Note off, scheduled 96000 samples (2 s at 48 kHz) later
                    unsigned char dataOff[] = {0x80, 69, 0};
                    _outputEventBlock(timestamp->mSampleTime + 96000, 0, 3, dataOff);
                }
            }
            else {
                playedOnce = NO;
            }
        }
        return noErr;
    };
}
What this code is meant to do is play the A4 note on a synthesizer at the host for 2 seconds (the sample rate is 48 kHz). What I get instead is a click. Experimenting a bit, I tried delaying the start of the note-on MIDI event by offsetting the AUEventSampleTime passed to _outputEventBlock, but the click still plays as soon as the play button is pressed on the host.
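For reference, the 96000-sample offset used in the code corresponds to this arithmetic (plain Swift, no AudioUnit APIs involved):

```swift
// Converting a desired note duration into a sample-time offset for
// scheduling a note-off after a note-on.
let sampleRate = 48_000.0        // host sample rate assumed in the post
let noteDurationSeconds = 2.0

// Offset in samples to add to the render cycle's AUEventSampleTime.
let noteOffOffset = Int(noteDurationSeconds * sampleRate)  // 96000

// If the note-on is stamped at the current render timestamp t (here a
// hypothetical 0), the matching note-off is stamped at t + noteOffOffset.
let noteOnTime = 0
let noteOffTime = noteOnTime + noteOffOffset
```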
Now, if I instead generate the note-off MIDI event when the transport state flags indicate the state is "not moving", then the note plays as soon as the play button is pressed and stops when the pause button is pressed, which is the correct behavior. This tells me that my understanding of the AUEventSampleTime parameter of MIDIOutputEventBlock is flawed, and that it cannot be used to schedule MIDI events for the host by adding offsets to it.
I see that there is another property, scheduleMIDIEventBlock; I tried using it instead, but then no sound is played at all.
Any clarification of how this all works would be greatly appreciated.
I am using the AUv3 template that gets created by Xcode to implement a MIDI AUv3 plug-in of type "aumi". For the plug-in to be able to send MIDI to a host, it needs access to the MIDIOutputEventBlock provided by the host. I have done some research and found that this is done by caching the MIDIOutputEventBlock in the allocateRenderResourcesAndReturnError method:
_midiOut = self.MIDIOutputEventBlock;
and then using _midiOut in the internalRenderBlock method.
The first thing is that the template that gets created doesn't have an allocateRenderResourcesAndReturnError method; there is only an allocateRenderResources method. When I put that code in this method I get a compile error that basically says this property is not found on an object of type xxxDSPKernelAdapter. I've seen in other examples (like Gene de Lisa's "Audio Units (AUv3) MIDI extension", a wonderful tutorial, by the way!) that the initial template from a couple of years ago was very different from what I have now, and that MIDIOutputEventBlock is actually defined in the AUAudioUnit.h header file, but in that case self is also a different class.
I am very new at working with Objective-C, C++ and Swift in the same project, so I know my understanding of how this all works is minimal and very shallow. Any insight anybody could provide on this would be greatly appreciated.
I can use NSMutableAttributedString to generate subscripts and superscripts in strings for a UILabel. But what if I want a subscript and a superscript to be vertically stacked at the same horizontal position? Is there a way to do this with NSMutableAttributedString?
I am generating audio with an AVAudioEngine. I then install a tap on the engine's mainMixerNode output, which provides an AVAudioPCMBuffer that is written into an MPEG-4 AAC AVAudioFile. The input audio nodes to the engine are AVAudioUnitSampler nodes. The issue I have is that the audio in the resulting .m4a file plays back slower than what you hear on the device output itself (speakers, headphones). This is the code I am implementing:
// Audio Format
let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)
// Engine
var engine = AVAudioEngine()
// Two AVAudioNodes are hooked up to the AVAudioEngine
engine.connect(myAVAudioNode0, to: engine.mainMixerNode, format: audioFormat)
engine.connect(myAVAudioNode1, to: engine.mainMixerNode, format: audioFormat)
// Function to Write Audio to a File
func writeAudioToFile() {
    // File to write
    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    let audioURL = documentsDirectory.appendingPathComponent("share.m4a")
    // Audio file settings
    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: Int(audioFormat!.sampleRate),
        AVNumberOfChannelsKey: Int(audioFormat!.channelCount),
        AVEncoderAudioQualityKey: AVAudioQuality.max.rawValue
    ]
    // Audio file (AVAudioFile has no empty initializer, so create it inside the do/catch)
    let audioFile: AVAudioFile
    do {
        audioFile = try AVAudioFile(forWriting: audioURL, settings: settings, commonFormat: .pcmFormatFloat32, interleaved: false)
    }
    catch {
        print("Failed to open audio file for writing: \(error.localizedDescription)")
        return
    }
    // Install a tap on the main mixer:
    // write each buffer into the AAC file as it arrives
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 8192, format: nil, block: { (pcmBuffer, when) in
        do {
            try audioFile.write(from: pcmBuffer)
        }
        catch {
            print("Failed to write audio file: \(error.localizedDescription)")
        }
    })
}
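One plausible cause (an assumption worth checking, not a confirmed diagnosis) is a sample-rate mismatch: if the mixer actually renders at the hardware rate (often 48 kHz) while the file was opened with 44.1 kHz settings, writing the buffers verbatim yields audio that plays back slow by the ratio of the two rates. The arithmetic:

```swift
// Why a sample-rate mismatch plays back slow (pure arithmetic).
// Assumption: the mixer renders at the hardware rate (48 kHz here)
// while the file header claims 44.1 kHz.
let hardwareRate = 48_000.0
let fileRate = 44_100.0

// One second of rendered audio contains `hardwareRate` samples; a player
// that believes they are `fileRate` samples/second takes longer to play them.
let renderedSeconds = 1.0
let playbackSeconds = renderedSeconds * hardwareRate / fileRate  // ≈ 1.088

// Equivalently, speed (and pitch) drops by this factor:
let speedFactor = fileRate / hardwareRate  // 0.91875
```

If this is what is happening, matching the file's format to the mixer's actual output format would be the thing to try.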
I am trying to implement MIDI Out for an app. The app will generate a MIDI sequence that can be directed to another app (a synthesizer, or an app like AUM that takes MIDI input). I noticed that certain functions in the CoreMIDI framework, like MIDIReceived and MIDISend, have been deprecated, and I couldn't find any new functions that replace them. The CoreMIDI documentation is very sparse. Does anybody know of replacements for these functions?
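As far as I can tell (worth verifying against the current CoreMIDI headers), the replacements are the event-list APIs added in iOS 14 / macOS 11: MIDISendEventList and MIDIReceivedEventList. These take a MIDIEventList of Universal MIDI Packet (UMP) words, built with MIDIEventListInit and MIDIEventListAdd, instead of a MIDIPacketList. A minimal sketch of packing a MIDI 1.0 note-on into a UMP word (pure bit arithmetic; the CoreMIDI usage is only described in the comments):

```swift
// A MIDI 1.0 channel-voice message as a single 32-bit UMP word:
// [type:4][group:4][status:8][data1:8][data2:8], where type 0x2 means
// "MIDI 1.0 channel voice message".
func ump1NoteOn(group: UInt32, channel: UInt32, note: UInt32, velocity: UInt32) -> UInt32 {
    (0x2 << 28) | (group << 24) | ((0x90 | channel) << 16) | (note << 8) | velocity
}

// A4 (note 69) at velocity 96 on channel 0, group 0.
let word = ump1NoteOn(group: 0, channel: 0, note: 69, velocity: 96)
// word == 0x2090_4560  (status 0x90, note 69 = 0x45, velocity 96 = 0x60)
```

Words like this would then be added to a MIDIEventList via MIDIEventListAdd and passed to MIDISendEventList with an output port and destination endpoint (or to MIDIReceivedEventList for a virtual source).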
I am setting up an app to send MIDI data to another app using the CoreMIDI framework. I will not be using the AudioKit framework in this app.
// Get Destination Endpoint Ref
var destRef = MIDIEndpointRef()
destRef = MIDIGetDestination(destIndex)
// Create Client
var midiClientRef = MIDIClientRef()
MIDIClientCreate("Source App" as CFString, nil, nil, &midiClientRef)
// Create MIDI Source Endpoint Ref
var virtualSrcEndpointRef = MIDIEndpointRef()
MIDISourceCreate(midiClientRef, "Source App Endpoint" as CFString, &virtualSrcEndpointRef)
// Create MIDI Output port
var outputPortRef = MIDIPortRef()
MIDIOutputPortCreate(midiClientRef, "Source App Output Port" as CFString, &outputPortRef)
After that I use the MIDIReceived function to send MIDI packets to the source endpoint. This works, but the issue is that if several destination apps are open, the MIDI gets played in all of them. This makes sense, because there isn't an explicit connection between the client's output port and the destination endpoint. In the opposite direction, when you create a destination endpoint to receive MIDI, there is a function, MIDIPortConnectSource, which establishes a connection from a source to a client's input port. I cannot find an equivalent MIDIPortConnectDestination in the CoreMIDI MIDI Services APIs.
How does one make that direct connection?
Again, I will not be using AudioKit in this app.
I am trying to use AVAudioRecorder to record the output of the mainMixerNode of an AVAudioEngine instance and save it to an MPEG-4 AAC file. From what I have read, the default input to AVAudioRecorder is the microphone. I have everything set up so I can record to a file, but how can I change the AVAudioRecorder input to be the mainMixerNode output?
I have been successful at issuing score challenges in Game Center by using the challengeComposeController method of GKScore. This GKScore object has a context property which holds a specific seed to be used to start a specific game when the challenged player accepts the challenge. My question is, when the challenged player presses the Play Now button in the Challenges screen of the Game Center View Controller, how can the game view controller know that the player accepted the challenge and which challenge he/she accepted when the GKGameCenterViewController is dismissed? I know there is a GKLocalPlayerListener protocol with different methods to manage challenges but it isn't very well documented when these methods fire or should be used.
We are trying to generate an SKTexture (texture) from the rendering of an SKNode (node) in an SKScene presented to an SKView (view):
let origin = CGPoint(x: 0, y: 0)
let rect = CGRect(origin: origin, size: CGSize(width: 100, height: 100))
let texture = view?.texture(from: node, crop: rect)
The resulting texture is always the same regardless of the origin value. Has anybody run into this issue before?
We don't see any information for the sales of our apps from 4/6/19 to 4/8/19 in App Store Connect. The last date with sales information is 4/5/19. Is anybody else seeing this?
We have an app that uses Metal to render. This app works correctly on devices running iOS 11. When using the same app on devices running iOS 12, we started getting glitches and sometimes hangs in the rendering. We also tried recompiling for iOS 12 and get the same bad behavior. On the console we are getting the following messages:

2018-09-22 09:22:29.508576-0500 OurApp [1286:84481] Execution of the command buffer was aborted due to an error during execution. Discarded (victim of GPU error/recovery) (IOAF code 5)
2018-09-22 09:29:55.654426-0500 OurApp [1286:84625] Execution of the command buffer was aborted due to an error during execution. Caused GPU Hang Error (IOAF code 3)
2018-09-22 09:34:37.718054-0500 OurApp [1286:87354] Execution of the command buffer was aborted due to an error during execution. Ignored (for causing prior/excessive GPU errors) (IOAF code 4)

With the first two messages the rendering seems glitchy: a blank screen is presented and then the rendering finally occurs on screen. With the last message the rendering doesn't occur at all and the message keeps being displayed until we move to a different view.

This app uses SceneKit, instantiates an SCNView and uses a default CIContext. It also uses the physically based lighting model, which forces the Metal renderer to be used. The app has a simple SCNNode geometry, a cylinder. Each geometry element of the cylinder gets a normal texture (3 in total). The same diffuse, metalness and roughness values are applied to all the geometry elements of the cylinder.

Has anybody run into this problem? If so, how did you solve it? Thanks.

UPDATE: After a lot of debugging we found that using an image for environment lighting in the scene (for more realistic reflections) is what is causing the GPU errors. Without environment lighting the GPU errors do not occur. This isn't really a workaround, as our app needs realistic reflections.
Our app has an in-app purchase (IAP) which works correctly using sandbox tester accounts. Our app was rejected with the following message:

"We found that your in-app purchase products exhibited one or more bugs when reviewed on iPad running iOS 11.3.1 on Wi-Fi connected to an IPv6 network. Specifically, we are unable to buy the IAP due to the buttons not responding to touch."

Our best guess is that the Apple reviewers are getting an invalid product ID and that is why the buttons don't respond. We believe this because when we change the product ID inside our app we get the same behavior, where the IAP buttons don't do anything (we will now show an alert in that case). Has anybody else encountered this issue before? We have tried our app under many different scenarios and sandbox tester accounts, and IAPs are working correctly.