Overview

Post · Replies · Boosts · Views · Activity
StoreKit issue
Hey everyone! I’m currently working on a new app called Kept – a simple and elegant journaling app designed to help you capture your thoughts and ideas effortlessly. However, I’ve hit a bit of a snag with the TestFlight distribution of the app. When I test the in-app purchases locally, everything works perfectly. But once the app is pushed to TestFlight, users only see "Loading products..." indefinitely and are unable to make purchases. Here are the details:
- The app works locally with sandbox accounts.
- Product identifiers and configurations have been double-checked.
- All in-app purchases are correctly set up and approved in App Store Connect.
- Using the correct sandbox account settings on the device.
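The post doesn't show how the products are requested, so the following is only an illustrative sketch (assuming StoreKit 2; the product identifiers are placeholders, not Kept's real ones) of a fetch that surfaces the failure instead of leaving the UI stuck on "Loading products...".

    import StoreKit

    // Placeholder identifiers; the app's real product IDs would go here.
    let productIDs = ["com.example.kept.premium.monthly", "com.example.kept.premium.yearly"]

    func loadProducts() async {
        do {
            // Product.products(for:) returns only the identifiers the App Store recognizes,
            // so comparing counts separates a missing-product problem from a thrown error.
            let products = try await Product.products(for: productIDs)
            print("Loaded \(products.count) of \(productIDs.count) requested products")
        } catch {
            // Logging the failure is easier to diagnose than an endless loading state.
            print("Product request failed: \(error)")
        }
    }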
0
0
11
40m
visionOS 2 dev beta: calling up Control Center
Hello, I've just updated my Vision Pro to the newest and greatest 2.0, and I see that the way to call up Control Center has changed to a hand gesture, which I assume is powered by computer vision using the cameras. My use case is watching Apple TV shows at night, when my girlfriend likes to have the lights turned off, and that is when it fails. The new method is great and the responsiveness is crazily good, but I would like this to be a toggle so that we can select our own method: for example, during the day when there is enough light we use the new gesture-recognition way, and in low light we can switch back to looking up. Or, as a fellow programmer who is just learning, I think it would be possible to make the toggle automatic, switching whenever the lighting conditions keep the hand gesture from working. Hope to see that fixed :) cheers
0
0
19
2h
MBP with macOS 15 dev beta: Apache won't start
Hello, I'm a computer engineering student who needs to build apps from time to time, and when I do I use services like XAMPP, which runs Apache. After I updated the OS to the dev beta, Apache was never able to start again; here is the error code from the log. I'm just posting here in case Apple has fixed it. I see that the 15 beta 2 that rolled out today has no mention of it, so I hope to see it fixed so I can host locally on my Mac again without reinstalling my OS or rolling back. Cheers
0
0
20
2h
New iPad Pro M4: unknown System Data storage usage
Hello, I've just got the M4 iPad Pro 13-inch, 256 GB, and I've updated the iPad to iPadOS 18 for the great new features, along with my phone. A few days after the installation I noticed my System Data has been growing like crazy: it was around 120 GB when I first noticed, but has grown to 185.88 GB as of today, leaving my iPad with barely any storage left. The same thing happened on my iPhone 15 Pro Max, but when I checked today the iPhone storage had gone back down, while the iPad is still holding almost 200 GB of unknown stuff. Am I the only one getting this issue?
0
0
28
3h
Are there cases where an error response returned from the App Store Server API should be treated as a subscription cancellation by the user?
According to the following document, when the App Store Server API is called and the account does not exist, the response "errorCode: 4040001" is returned. If this response is returned, is it safe to assume that the account has been deleted and treat the subscription as cancelled? Also, please let us know if there are any other error codes besides this one that can be used to determine that the subscription has been cancelled. https://developer.apple.com/documentation/appstoreserverapi/accountnotfounderror
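For reference only, a minimal sketch of detecting this specific error body on the server side; the struct and helper names are hypothetical, and whether 4040001 should actually be treated as a cancellation is exactly the open question above, not an established rule.

    import Foundation

    // Hypothetical type mirroring the JSON error body the App Store Server API returns,
    // e.g. {"errorCode": 4040001, "errorMessage": "Account not found."}
    struct AppStoreServerErrorBody: Decodable {
        let errorCode: Int
        let errorMessage: String
    }

    // Hypothetical helper: true only for AccountNotFoundError (HTTP 404, errorCode 4040001).
    func isAccountNotFound(statusCode: Int, body: Data) -> Bool {
        guard statusCode == 404,
              let error = try? JSONDecoder().decode(AppStoreServerErrorBody.self, from: body)
        else { return false }
        return error.errorCode == 4040001
    }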
0
0
30
3h
Obscuration of Specific Content in Screenshots on visionOS
Hello, I am currently developing an app using visionOS, and I am looking for a way to obscure specific content in screenshots. My app includes certain content that is exclusive to premium users, and I need to hide these parts when screenshots are shared on social media. In iOS, I was able to achieve this by extending SecureField to hide specific Views or Images, but this method does not work in visionOS. I am seeking advice on the best approach to implement this feature in visionOS. https://github.com/yoxisem544/ScreenshotPreventing-iOS Specifically, I would appreciate guidance on the following points:
- The optimal method for obscuring specific Views or Images in screenshots on visionOS
- Any tips or tricks when using existing components similar to SecureField
- Techniques or approaches used by other developers to implement similar features
Your assistance would be greatly appreciated. Thank you for your help.
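For context, the iOS approach the post alludes to (and the linked repo implements) is commonly built on an undocumented UITextField layer trick rather than on SecureField itself. The sketch below is only an illustration of that widely shared workaround, with a hypothetical helper name; it is not a supported API, and by the post's account it does not carry over to visionOS.

    import UIKit

    // A sketch of the common (undocumented) iOS workaround: reparent a view's layer
    // under a secure UITextField's layer so the system omits it from screenshots.
    // Not a supported API; reportedly not effective on visionOS.
    extension UIView {
        func hideFromScreenshots() {
            let secureField = UITextField()
            secureField.isSecureTextEntry = true
            secureField.isUserInteractionEnabled = false
            // Attach the secure field's layer beside this view, then move this view's
            // layer under the layer that secure text entry marks as protected.
            layer.superlayer?.addSublayer(secureField.layer)
            secureField.layer.sublayers?.first?.addSublayer(layer)
        }
    }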
0
0
21
3h
Trying to Update MacBook, Not Enough Storage
I’ve been trying to update my MacBook Pro for a while now. I’m on version 10.14.6, and I’ve been getting away with not having enough storage for the newer updates, but now it is unavoidable as my laptop is nearly unusable. My issue is that I do not have enough storage. I’ve done everything: deleted old files, emptied my trash, deleted apps, etc. Still, the two things taking up the majority of my storage are applications the laptop will not let me delete and the system itself. It’s become incredibly frustrating, especially since most of the apps I can’t delete I don’t even use. Is there a way to bypass this, or any other possible solution?
0
0
23
3h
*Please bring back OG live wallpapers!*
Please bring back live wallpapers from previous iPhone models. I have an iPhone 8 Plus and I have the beautiful purple smoky live wallpaper. I was so obsessed with how pretty it looked when live, but now it’s just the picture of the purple smokiness with no live movement. ): I would also like to point out that I cannot change my Home Screen wallpaper without having to change my Lock Screen as well! What a bummer, really, because I would at least like to keep the purple formerly live wallpaper as my Lock Screen and be able to change my Home Screen!
0
0
17
4h
FinanceKit requestAuthorization denied
Hello all 👋 I'm getting unexpected behavior when testing FinanceKit for my app and was hoping to get assistance.

Pre-reqs (defined at https://developer.apple.com/documentation/financekit):
- I've been given the FinanceKit entitlement
- I have the com.apple.developer.financekit entitlement set to financial-data
- I also have NSFinancialDataDescription set in Info.plist
- I am targeting iOS 17.4 (a physical device)

When I call FinanceStore.shared.requestAuthorization(), I immediately get a denied status without any alert dialogs. No data about my app is listed in Settings > Privacy & Security > Wallet. Any idea what else is needed here? Thanks so much for the help!

Code:

    import SwiftUI
    import FinanceKit

    @main
    struct myApp: App {
        var body: some Scene {
            WindowGroup {
                ContentView()
            }
        }
    }

    struct ContentView: View {
        @State private var status: AuthorizationStatus?
        @State private var dataAvail: Bool?

        var body: some View {
            VStack {
                Text("Data available \(String(describing: dataAvail))")
                Text("Auth status \(String(describing: status))")
                Button("Get Status") {
                    Task {
                        dataAvail = FinanceStore.isDataAvailable(.financialData)
                        status = try await FinanceStore.shared.authorizationStatus()
                    }
                }.buttonStyle(.borderedProminent)
                Button("Request Auth") {
                    Task {
                        do {
                            status = try await FinanceStore.shared.requestAuthorization()
                        } catch {
                            print(error)
                        }
                    }
                }.buttonStyle(.borderedProminent)
            }
            .padding()
        }
    }

    #Preview {
        ContentView()
    }

app.entitlements:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>com.apple.developer.financekit</key>
        <array>
            <string>financial-data</string>
        </array>
    </dict>
    </plist>

Info.plist:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>NSFinancialDataDescription</key>
        <string>Budget across all your accounts</string>
    </dict>
    </plist>

Video demo:
0
1
38
5h
Data storage for a Matrix struct when working with Accelerate
I have a Matrix structure as defined below for working with 2D numerical data in Accelerate. The underlying numerical data in this Matrix struct is stored as an Array.

    struct Matrix<T> {
        let rows: Int
        let columns: Int
        var data: [T]

        init(rows: Int, columns: Int, fill: T) {
            self.rows = rows
            self.columns = columns
            self.data = Array(repeating: fill, count: rows * columns)
        }

        init(rows: Int, columns: Int, source: (inout UnsafeMutableBufferPointer<T>) -> Void) {
            self.rows = rows
            self.columns = columns
            self.data = Array(unsafeUninitializedCapacity: rows * columns) { buffer, initializedCount in
                source(&buffer)
                initializedCount = rows * columns
            }
        }

        subscript(row: Int, column: Int) -> T {
            get { return self.data[(row * self.columns) + column] }
            set { self.data[(row * self.columns) + column] = newValue }
        }
    }

Multiplication is implemented by the functions shown below.

    import Accelerate

    infix operator .*

    func .* (lhs: Matrix<Double>, rhs: Matrix<Double>) -> Matrix<Double> {
        precondition(lhs.rows == rhs.rows && lhs.columns == rhs.columns, "Matrices must have same dimensions")
        let result = Matrix<Double>(rows: lhs.rows, columns: rhs.columns) { buffer in
            vDSP.multiply(lhs.data, rhs.data, result: &buffer)
        }
        return result
    }

    func * (lhs: Matrix<Double>, rhs: Matrix<Double>) -> Matrix<Double> {
        precondition(lhs.columns == rhs.rows, "Number of columns in left matrix must equal number of rows in right matrix")
        var a = lhs.data
        var b = rhs.data
        let m = lhs.rows     // number of rows in matrices A and C
        let n = rhs.columns  // number of columns in matrices B and C
        let k = lhs.columns  // number of columns in matrix A; number of rows in matrix B
        let alpha = 1.0
        let beta = 0.0
        // matrix multiplication where C ← αAB + βC
        let c = Matrix<Double>(rows: lhs.rows, columns: rhs.columns) { buffer in
            cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans, m, n, k, alpha, &a, k, &b, n, beta, buffer.baseAddress, n)
        }
        return c
    }

I can also define a Matrix structure where the underlying data is an UnsafeMutableBufferPointer. The buffer is handled by the MatrixData class.

    struct Matrix<T> {
        let rows: Int
        let columns: Int
        var data: MatrixData<T>

        init(rows: Int, columns: Int, fill: T) {
            self.rows = rows
            self.columns = columns
            self.data = MatrixData(count: rows * columns, fill: fill)
        }

        init(rows: Int, columns: Int) {
            self.rows = rows
            self.columns = columns
            self.data = MatrixData(count: rows * columns)
        }

        subscript(row: Int, column: Int) -> T {
            get { return self.data.buffer[(row * self.columns) + column] }
            set { self.data.buffer[(row * self.columns) + column] = newValue }
        }
    }

    class MatrixData<T> {
        var buffer: UnsafeMutableBufferPointer<T>

        var baseAddress: UnsafeMutablePointer<T> {
            get { self.buffer.baseAddress! }
        }

        init(count: Int, fill: T) {
            let start = UnsafeMutablePointer<T>.allocate(capacity: count)
            self.buffer = UnsafeMutableBufferPointer(start: start, count: count)
            self.buffer.initialize(repeating: fill)
        }

        init(count: Int) {
            let start = UnsafeMutablePointer<T>.allocate(capacity: count)
            self.buffer = UnsafeMutableBufferPointer(start: start, count: count)
        }

        deinit {
            self.buffer.deinitialize()
            self.buffer.deallocate()
        }
    }

Multiplication for this approach is implemented by the functions shown here.

    import Accelerate

    infix operator .*

    func .* (lhs: Matrix<Double>, rhs: Matrix<Double>) -> Matrix<Double> {
        precondition(lhs.rows == rhs.rows && lhs.columns == rhs.columns, "Matrices must have same dimensions")
        let result = Matrix<Double>(rows: lhs.rows, columns: lhs.columns)
        vDSP.multiply(lhs.data.buffer, rhs.data.buffer, result: &result.data.buffer)
        return result
    }

    func * (lhs: Matrix<Double>, rhs: Matrix<Double>) -> Matrix<Double> {
        precondition(lhs.columns == rhs.rows, "Number of columns in left matrix must equal number of rows in right matrix")
        let a = lhs.data.baseAddress
        let b = rhs.data.baseAddress
        let m = lhs.rows     // number of rows in matrices A and C
        let n = rhs.columns  // number of columns in matrices B and C
        let k = lhs.columns  // number of columns in matrix A; number of rows in matrix B
        let alpha = 1.0
        let beta = 0.0
        // matrix multiplication where C ← αAB + βC
        let c = Matrix<Double>(rows: lhs.rows, columns: rhs.columns)
        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans, m, n, k, alpha, a, k, b, n, beta, c.data.baseAddress, n)
        return c
    }

Both of these approaches give me similar performance. The only difference that I have noticed is the matrix buffer approach allows for reference semantics. For example, the code below uses half the memory with the matrix buffer approach compared to the matrix array approach. This is because b acts as a reference to a using the matrix buffer approach; otherwise, the matrix array approach makes a full copy of a.

    let n = 10_000
    let a = Matrix<Double>(rows: n, columns: n, fill: 0)
    var b = a
    b[0, 0] = 99
    b[0, 1] = 22

Other than reference semantics, are there any reasons to use one of these approaches over the other?
0
0
23
5h
The "Waiting for Review" status has lasted for almost 2 weeks
Dear App Review: We develop image-processing software that goes through regular iterations, with updates and new features every month. At present, our updated version has been waiting in the "Waiting for Review" state for almost 2 weeks. While waiting, we have taken the following actions but are still unable to enter the review process:
- We requested an expedited review and were informed that it was granted, but no progress was made;
- We contacted "App Review Status" via email;
- We were told that everything is normal and to wait for the review;
- We rejected the submission ourselves and resubmitted it for review.
Our users have been anxiously asking us when the new features will be available, and they are very eager to use them. Apple ID of the app: 6447539504. Looking forward to your reply, thank you!
0
0
27
5h
Can TestFlight be used to test a build for macOS Sequoia?
I have an app that I would like to test on macOS Sequoia, but when I build in Xcode Cloud, it seems to fail with "This bundle is invalid. Apple is not currently accepting applications built with this version of the OS." It also complains about the release train being closed, even though I created a new release (set to be released manually). And it is somehow pulling an old CFBundleShortVersionString, yet the other app versions (iOS and visionOS) in the same Xcode Cloud build work fine.
0
0
22
5h
When switching ImmersiveSpace, background music played by AVAudioPlayer is no longer heard
【Procedure】
1. Launch the application.
2. ImmersiveSpace1 is opened and the animation of a 3D object is played.
3. When the animation finishes, ImmersiveSpace1 is dismissed and ImmersiveSpace2 is opened.
【Expected】
When ImmersiveSpace1 is opened, the background music plays, and it continues to play after ImmersiveSpace2 is opened.
【Result】
When ImmersiveSpace1 is opened, the BGM plays; when ImmersiveSpace2 is opened, the BGM stops playing.
【Environment】
- Occurs on an actual device (visionOS 2).
- Does not occur on the simulator.
- Xcode: Version 15.2 (15C500b)
【Log】
Output on the actual device when ImmersiveSpace2 is opened; not output on the simulator.
AVAudioSession_iOS.mm:2223 Server returned an error from destroySession:. Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service with pid 39 named com.apple.audio.AudioSession was invalidated from this process." UserInfo={NSDebugDescription=The connection to service with pid 39 named com.apple.audio.AudioSession was invalidated from this process.}
0
1
32
6h
AirDrop
If you receive a file via AirDrop while the iPhone is unlocked only with Face ID (I mean when you are only looking at the time or a notification), you can't accept the file; when you slide down the notification you can't accept it either. Thanks
0
0
28
6h
Sound problem on iOS 18 beta 2
I've noticed that only my top (earpiece) speaker works and not the main one, whether for ringtones, video, music, or anything else. I tried to fix it by restarting my iPhone, without success, and by connecting my AirPods and then disconnecting them; that made the main speaker work, but at the same volume and quality as the top one, which is to say not much. Am I the only one seeing this? I have an iPhone 15 Pro on iOS 18 beta 2.
1
0
32
7h
Concurrency-friendly version of `ArchiveByteStream` and `ArchiveStream`?
The AppleArchive module is pretty cool, but it relies almost entirely on two stream types, ArchiveByteStream and ArchiveStream, which don't really work well in a Swift Concurrency-based workflow since they all use thread-blocking mechanisms like pthread mutexes for synchronization, and threads in the cooperative pool should not be blocked. They also use their own thread pools for processing, independently of the cooperative thread pool, making it easy to end up with more threads than one has cores. (Perhaps even more so if you try to compress/decompress multiple files at once? The documentation isn't clear on whether separate archive operations share the same thread pool or not, but since it allows you to choose the size of the pool, it seems that these may be separate from each other as well.) Are there any plans for an interface to Apple Archive that would fit better with the structured concurrency model?
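In the meantime, one workaround is to keep the blocking stream work off the cooperative pool and bridge it to async/await with a continuation. The sketch below assumes a synchronous compressFile helper written in the style of Apple's single-file compression sample; the bridging pattern, not the helper, is the point, and nothing here changes how AppleArchive manages its own internal thread pool.

    import AppleArchive
    import Foundation
    import System

    enum CompressionError: Error { case streamSetupFailed }

    // Synchronous, blocking single-file compression, following the pattern in
    // Apple's AppleArchive single-file compression sample.
    func compressFile(from source: FilePath, to destination: FilePath) throws {
        guard let readStream = ArchiveByteStream.fileStream(
                path: source, mode: .readOnly, options: [],
                permissions: FilePermissions(rawValue: 0o644)),
              let writeStream = ArchiveByteStream.fileStream(
                path: destination, mode: .writeOnly, options: [.create],
                permissions: FilePermissions(rawValue: 0o644)),
              let compressStream = ArchiveByteStream.compressionStream(
                using: .lzfse, writingTo: writeStream) else {
            throw CompressionError.streamSetupFailed
        }
        defer {
            try? compressStream.close()
            try? writeStream.close()
            try? readStream.close()
        }
        _ = try ArchiveByteStream.process(readingFrom: readStream, writingTo: compressStream)
    }

    // A minimal sketch of bridging the blocking call into Swift Concurrency:
    // the work runs on a plain utility queue, so the cooperative pool is never blocked.
    func compressFileAsync(from source: FilePath, to destination: FilePath) async throws {
        try await withCheckedThrowingContinuation { (continuation: CheckedContinuation<Void, Error>) in
            DispatchQueue.global(qos: .utility).async {
                do {
                    try compressFile(from: source, to: destination)
                    continuation.resume()
                } catch {
                    continuation.resume(throwing: error)
                }
            }
        }
    }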
1
0
37
7h