Tech Talks 2021:

Join us for over 100 live online sessions and connect with Apple experts at office hours to help you create your best apps yet. Tech Talks kicks off on October 25 and runs through December 17.


Posts

Post not yet marked as solved
6 Views

Can't fill real address in Bank account settings

When I fill in the Bank account owner address and press the Save button, it gives me an error saying I entered a wrong address. I think it's because the address is in Russia and Apple doesn't have it in their services. Please help me resolve this problem.
Asked
by fame_app.
Post not yet marked as solved
9 Views

Code

Hi, I can't access the app because it's asking for a code.
Asked
Post not yet marked as solved
6 Views

Not able to see 6.5" screenshot on 12 pro max

I updated the app with the new-size screenshots, but I'm not able to see the screenshots for the app. Has anyone faced issues with app screenshots not showing on iPhone 12 series phones? Thanks for any reply.
Asked
Post not yet marked as solved
8 Views

Want to develop

I want to develop my new App Store app.
Asked
Post not yet marked as solved
7 Views

Broadcast Extension receives broadcastFinished prematurely

I have a Broadcast Extension that's running a system-wide screen capture. I can start the broadcast and receive buffers, and the buffers are successfully getting to our backend and distributed to all the peers (WebRTC). The problem occurs when I navigate from the host app to apps that play video with audio and then try to go landscape: sometimes during the rotation, the system sends a broadcastFinished() message to my RPBroadcastSampleHandler subclass. It doesn't occur all the time, but when it does, the rotation seems to glitch a bit. It's hard to explain, but I've noticed that the finished message only arrives when the rotation glitches. I would love to get a screen grab of it happening, but my capture is the thing that's running. A quick Google/forum search hasn't turned up much. Has anyone heard of such a thing?
Asked
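A minimal sketch for narrowing this down, assuming a standard RPBroadcastSampleHandler setup (the class name and subsystem string below are placeholders, not from the original post): logging every lifecycle callback makes it possible to line up the broadcastFinished timestamp in Console.app with the rotation glitch and any related system messages.

import ReplayKit
import os

// Sample handler that logs every lifecycle event so the teardown during
// rotation can be correlated with other system log output in Console.app.
class LoggingSampleHandler: RPBroadcastSampleHandler {

    // Placeholder subsystem/category; use your own extension's bundle identifier.
    private let logger = Logger(subsystem: "com.example.broadcast", category: "lifecycle")

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        logger.log("broadcastStarted")
    }

    override func broadcastPaused() {
        logger.log("broadcastPaused")
    }

    override func broadcastResumed() {
        logger.log("broadcastResumed")
    }

    override func broadcastFinished() {
        // Fires when the system ends the broadcast; the timestamp here can be
        // matched against the moment the rotation glitch is seen on screen.
        logger.log("broadcastFinished")
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        // Forward buffers to the WebRTC pipeline here.
    }
}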
Post not yet marked as solved
6 Views

Privacy Policy

I have changed my app's privacy policy. How can I replace the previous privacy policy website with the new one?
Asked
Post not yet marked as solved
8 Views

Can't remove in-app purchase in App Store Connect

My developer account has already expired, and I was trying to remove my apps from App Store Connect. I managed to remove some, but one of them cannot be removed because it says it has in-app purchases for sale. When I go to manage the in-app purchases and uncheck the "Cleared for Sale" checkbox, I cannot save because it says I need to update the Agreement. I guess that means I need to renew the account, but I don't want to pay $99 just to remove the apps. Is there a way I can do this?
Asked
by juankpro.
Post not yet marked as solved
15 Views

Dylib with bitcode missing

Hi there, we are currently developing a dynamic library written in C++, targeting arm64 iOS. Things become weird when we try to archive the final project. It says: bitcode bundle could not be generated because 'the_path_to.dylib' was built without full bitcode. A brief outline of the project: we use CMake and this toolchain to build the dylib. The toolchain has an argument named ENABLE_BITCODE, which appends -fembed-bitcode to C_FLAGS & CXX_FLAGS. BTW, we use clang & clang++ as the C and C++ compilers. The dylib has several dependencies, such as libcurl and libffi; we download them from vcpkg and build them with bitcode enabled. Running the app directly on the phone works as we expected, but we can't archive it. Running otool -l the_path_to.dylib | grep bitcode shows nothing. We still want the dylib to support bitcode. Is there anything we missed in order to enable this? Is there anything else to learn about bitcode? Is there an accurate way to find out whether a dylib supports bitcode or not? Thanks in advance.
Asked
Post not yet marked as solved
11 Views

Apple Pay

We are having a problem with Apple Pay: one of our customers from Germany cannot use it. We are using an Apple Pay button provided by Stripe, and on other accounts the button works. He gets a message that the BW-BankCard is not accepted by this webpage. The bank's official webpage shows that Apple Pay is supported, and it is a German bank: https://www.bw-bank.de/de/home/privatkunden/kreditkarte/mobile-payment/apple-pay.html Thanks for your thoughts on this.
Asked
by Chris2022.
Post not yet marked as solved
7 Views

How to Play a Generated Sine Wave in Swift

AVFoundation or AVAudioPlayer? I use this code, but it does not work: it compiles OK, throws no error, and silent (manner) mode is off, yet it does not play a tone.

import Foundation
import AudioUnit
import AVFoundation

final class ToneOutputUnit: NSObject {

    var auAudioUnit: AUAudioUnit! = nil  // placeholder for RemoteIO Audio Unit
    var avActive = false                 // AVAudioSession active flag
    var audioRunning = false             // RemoteIO Audio Unit running flag

    var sampleRate : Double = 44100.0    // typical audio sample rate

    var f0 =   880.0                     // default frequency of tone: 'A' above Concert A
    var v0 = 16383.0                     // default volume of tone: half full scale

    var toneCount : Int32 = 1            // number of samples of tone to play. 0 for silence

    private var phY = 0.0                // save phase of sine wave to prevent clicking
    private var interrupted = false      // for restart from audio interruption notification

    func setFrequency(freq : Double) {   // audio frequencies below 500 Hz may be
        f0 = freq                        // hard to hear from a tiny iPhone speaker.
    }

    func setToneVolume(vol : Double) {   // 0.0 to 1.0
        v0 = vol * 32766.0
    }

    func setToneTime(t : Double) {
        toneCount = Int32(t * sampleRate);
    }

    func enableSpeaker() {

        if audioRunning {
            print("returned")
            return
        }   // return if RemoteIO is already running

        do {    // not running, so start hardware

            let audioComponentDescription = AudioComponentDescription(
                componentType:         kAudioUnitType_Output,
                componentSubType:      kAudioUnitSubType_RemoteIO,
                componentManufacturer: kAudioUnitManufacturer_Apple,
                componentFlags:        0,
                componentFlagsMask:    0 )

            if (auAudioUnit == nil) {

                try auAudioUnit = AUAudioUnit(componentDescription: audioComponentDescription)

                let bus0 = auAudioUnit.inputBusses[0]

                let audioFormat = AVAudioFormat(
                    commonFormat: AVAudioCommonFormat.pcmFormatInt16,  // short int samples
                    sampleRate: Double(sampleRate),
                    channels: 1,
                    interleaved: false )    // interleaved stereo

                try bus0.setFormat(audioFormat ?? AVAudioFormat())     // for speaker bus

                auAudioUnit.outputProvider = { (    // AURenderPullInputBlock?
                    actionFlags,
                    timestamp,
                    frameCount,
                    inputBusNumber,
                    inputDataList ) -> AUAudioUnitStatus in

                    self.fillSpeakerBuffer(inputDataList: inputDataList, frameCount: frameCount)
                    return(0)
                }
            }

            auAudioUnit.isOutputEnabled = true
            toneCount = 0

            try auAudioUnit.allocateRenderResources()  // v2 AudioUnitInitialize()
            try auAudioUnit.startHardware()            // v2 AudioOutputUnitStart()
            audioRunning = true

        } catch /* let error as NSError */ {
            print("error 2 \(error)")
        }
    }

    // helper functions

    private func fillSpeakerBuffer(    // process RemoteIO Buffer for output
        inputDataList : UnsafeMutablePointer<AudioBufferList>,
        frameCount : UInt32 )
    {
        let inputDataPtr = UnsafeMutableAudioBufferListPointer(inputDataList)
        let nBuffers = inputDataPtr.count
        if (nBuffers > 0) {

            let mBuffers : AudioBuffer = inputDataPtr[0]
            let count = Int(frameCount)

            // Speaker Output == play tone at frequency f0
            if (self.v0 > 0) && (self.toneCount > 0) {
                // audioStalled = false

                var v = self.v0 ; if v > 32767 { v = 32767 }
                let sz = Int(mBuffers.mDataByteSize)

                var a = self.phY   // capture from object for use inside block
                let d = 2.0 * Double.pi * self.f0 / self.sampleRate    // phase delta

                let bufferPointer = UnsafeMutableRawPointer(mBuffers.mData)
                if var bptr = bufferPointer {
                    for i in 0..<(count) {
                        let u = sin(a)    // create a sinewave
                        a += d ; if (a > 2.0 * Double.pi) { a -= 2.0 * Double.pi }
                        let x = Int16(v * u + 0.5)    // scale & round

                        if (i < (sz / 2)) {
                            bptr.assumingMemoryBound(to: Int16.self).pointee = x
                            bptr += 2    // increment by 2 bytes for next Int16 item
                            bptr.assumingMemoryBound(to: Int16.self).pointee = x
                            bptr += 2    // stereo, so fill both Left & Right channels
                        }
                    }
                }

                self.phY = a                          // save sinewave phase
                self.toneCount -= Int32(frameCount)   // decrement time remaining
            } else {
                // audioStalled = true
                memset(mBuffers.mData, 0, Int(mBuffers.mDataByteSize))    // silence
            }
        }
    }

    func stop() {
        if (audioRunning) {
            auAudioUnit.stopHardware()
            audioRunning = false
        }
    }
}
Asked
by koyakei.
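A hedged usage sketch, not from the original post (the frequency and duration values are arbitrary): two things worth double-checking are that an AVAudioSession is activated for playback, and that setToneTime(t:) is called after enableSpeaker(), since enableSpeaker() resets toneCount to 0.

import AVFoundation

// Untested sketch: the posted class never activates an AVAudioSession, and with
// the default (.soloAmbient) category output follows the Ring/Silent switch;
// .playback keeps the tone audible regardless.
let toneUnit = ToneOutputUnit()

do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default)
    try session.setActive(true)
} catch {
    print("AVAudioSession error: \(error)")
}

toneUnit.enableSpeaker()            // start the RemoteIO unit first ...
toneUnit.setFrequency(freq: 880.0)
toneUnit.setToneVolume(vol: 0.5)
toneUnit.setToneTime(t: 2.0)        // ... then request 2 seconds of tone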
Post not yet marked as solved
12 Views

After updating to Xcode 13, Color Literal previews not showing

Previews of colors disappeared after the Xcode 13.1 update. The preview works when the color is set as a variable, but the previews directly in the code are no longer visible. What can I do to see color previews again? Version 13.1 (13A1030d).
Asked
by ursan_.
Post not yet marked as solved
24 Views

Inconsistent results when performing inference in CPU vs Metal

Hello, everyone, I have been testing tensorflow-metal on my 2020 MacBook Pro (M1) running macOS 12.0.1 by performing inference with a pre-trained model on a known dataset. To my surprise, TensorFlow produces different (wrong) results when performing the inference using the Metal pluggable device (GPU) vs when performing it on the CPU. I might very well be doing something wrong, but my test program is fairly simple:

#!/usr/bin/env python3

import pathlib

import numpy as np
import tensorflow as tf
from tensorflow import keras


def main(model_path, dataset_path):
    # Print some system info
    print('Tensorflow configuration:')
    print(f'\tVersion: {tf.__version__}')
    print('\tDevices usable by Tensorflow:')
    for device in tf.config.get_visible_devices():
        print(f'\t\t{device}')

    # Load the model & the input data
    model = keras.models.load_model(model_path)
    matrix_data = np.genfromtxt(dataset_path)
    matrix_data = matrix_data.reshape([1, matrix_data.shape[0], matrix_data.shape[1]])

    # Perform inference in CPU
    with tf.device('/CPU:0'):
        prediction = model.predict(matrix_data)[1]
        print('Model Evaluation on CPU')
        print(f'\tPrediction: {prediction[0, 0]}')

    # Perform inference in GPU
    with tf.device('/GPU:0'):
        prediction = model.predict(matrix_data)[1]
        print('Model Evaluation on GPU')
        print(f'\tPrediction: {prediction[0, 0]}')


if __name__ == "__main__":
    main('model/model.h5', 'dataset/01.csv')

The CPU path produces a result of 4.890502452850342, which is coherent with the results I'm seeing on Ubuntu Linux using CPU- and GPU (CUDA)-based inference. The GPU code path results in a prediction of 3.1839447021484375, which is way off. I have set up a GitLab repo with all the resources required for replicating the problem here. This is quite concerning for me, since the big difference in results is something I was not expecting and, if confirmed, makes me not trust the results provided by the Metal backend. Am I doing something wrong? Is there any place where I can report this as a bug?
Asked
by josebagar.
Post not yet marked as solved
15 Views

macOS 12.1 beta 4

I am on beta 3, but Software Update does not show beta 4. Is there a link for a direct download of beta 4? I am in the developer beta program.
Asked
by BAB69.
Post not yet marked as solved
8 Views

Exception reason in Xcode Organizer

Usually, when a crash happens on iOS while the simulator or device is attached to Xcode, I get a crash reason like the following example:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[NSJSONSerialization dataWithJSONObject:options:error:]: value parameter is nil'

When I look at the crashes of my published app in the Xcode Organizer using "Open in Project...", I see on which line of code the crash happened. But is it also possible to get a reason for the crash like the one above? I'm asking about Xcode Version 13.0.
Asked
by l00.
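For illustration only, a sketch rather than an Organizer feature: one common workaround is to install an uncaught-exception handler at launch so the NSException name and reason end up in a log you control and can be correlated with the crash that Organizer shows.

import Foundation

// Sketch: call this early at launch (e.g. in application(_:didFinishLaunchingWithOptions:))
// so the exception reason is recorded independently of what Organizer reports.
func installExceptionLogger() {
    NSSetUncaughtExceptionHandler { exception in
        // NSLog is used because the process is about to terminate and
        // buffered logging may never be flushed.
        NSLog("Uncaught exception: %@, reason: %@\n%@",
              exception.name.rawValue,
              exception.reason ?? "nil",
              exception.callStackSymbols.joined(separator: "\n"))
    }
}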
Post not yet marked as solved
14 Views

Sign in with Apple REST API keeps returning [Request failed with status code 400] error

I am implementing Apple sign in on my website. On my backend (Node.js), I need to request an authentication token using the https://appleid.apple.com/auth/token REST API. I used Axios and wrote the following:

export const createSignWithAppleSecret = () => {
  const token = jwt.sign({}, signWithApplePrivateKey, {
    algorithm: 'ES256',
    expiresIn: '1h',
    audience: APPLE_DOMAIN,
    issuer: APPLE_TEAM_ID,
    subject: APPLE_SERVICE_ID,
    keyid: APPLE_KEY_ID,
  });
  return token;
};

export const getAppleToken = async (code: string) =>
  axios.post(
    'https://appleid.apple.com/auth/token',
    qs.stringify({
      grant_type: 'authorization_code',
      code,
      client_secret: createSignWithAppleSecret(),
      client_id: APPLE_SERVICE_ID,
      redirect_uri: APPLE_REDIRECT_URI,
    }),
    {
      headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
      },
    }
  );

But I am getting "Request failed with status code 400":

Error: Request failed with status code 400
    at createError (/home/ubuntu/sooldamhwa/www/node_modules/axios/lib/core/createError.js:16:15)
    at settle (/home/ubuntu/sooldamhwa/www/node_modules/axios/lib/core/settle.js:17:12)
    at IncomingMessage.handleStreamEnd (/home/ubuntu/sooldamhwa/www/node_modules/axios/lib/adapters/http.js:260:11)
    at IncomingMessage.emit (events.js:327:22)
    at IncomingMessage.EventEmitter.emit (domain.js:485:12)
    at endReadableNT (_stream_readable.js:1201:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)

The API endpoint is correct, and I have configured the header as the documentation instructs (https://developer.apple.com/documentation/sign_in_with_apple/generate_and_validate_tokens). Could someone please let me know what I did wrong?
Asked