Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under General subtopic

iPhone Pro Model - RefreshRate
I want to display the current refresh rate on the screen of my iPhone 14 Pro Max, but the App Store apps I've tried ("Floating Clock", "Floating Clock Premium", and "Screen Renewal Rate - RefreshRate Realtime") can only show two values, 60Hz and 120Hz. I'd like to see finer-grained values as well: 1Hz, 10Hz, other intermediate rates, more than just two or three levels. Does anyone know how to do this?
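One way to observe the live refresh rate from inside an app is to measure frame durations with CADisplayLink. A minimal sketch, assuming iOS 15+ and a ProMotion device (the monitoring approach is my suggestion, not something from the post):

```swift
import UIKit

/// Prints the display's current refresh rate by measuring CADisplayLink frame durations.
final class RefreshRateMonitor {
    private var displayLink: CADisplayLink?

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
        // Ask for the full ProMotion range; the system still chooses the actual rate.
        link.preferredFrameRateRange = CAFrameRateRange(minimum: 10, maximum: 120, preferred: 120)
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func tick(_ link: CADisplayLink) {
        // targetTimestamp - timestamp is the duration of the current frame.
        let hz = 1.0 / (link.targetTimestamp - link.timestamp)
        print(String(format: "Current refresh rate: %.0f Hz", hz))
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}
```

Note that a third-party app only sees the rate its own CADisplayLink is driven at, and on iPhone the CADisableMinimumFrameDurationOnPhone Info.plist key is needed before callbacks above 60 Hz are delivered, so this measures the app's own frame pacing rather than a system-wide value.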
Replies: 1 · Boosts: 0 · Views: 306 · Last activity: Sep ’25
Testing Gamecenter Accounts
Hi all, I'm in the early stages of testing a game that will have real-time device-to-device communication using Game Center. I've read about setting up Game Center "sandbox" accounts to test prior to the game's release, but I can't find anything reliable about how to do so. I've found lots of web pages that purport to describe this task, but what they describe seems to be outdated, as the steps don't appear to be available in modern versions of iOS. I'm testing on iOS 18 and 26. Document search isn't helping; I mostly find information on how to set up StoreKit sandbox accounts, which I assume are a different thing altogether. So if you could point me towards an article that explains how to test a new, unreleased GameKit app between devices, I'd appreciate it. People are developing Game Center apps, so I know it's possible... advTHANKSance
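For reference, a minimal sketch of the authentication entry point that test accounts exercise. Nothing here is sandbox-specific; as far as I know, development-signed builds talk to the Game Center sandbox environment automatically, so the same call is used either way:

```swift
import GameKit
import UIKit

/// Standard Game Center sign-in; GameKit supplies its own UI when needed.
func authenticateLocalPlayer(presentingFrom viewController: UIViewController) {
    GKLocalPlayer.local.authenticateHandler = { authViewController, error in
        if let authViewController {
            // GameKit provides a sign-in view controller when the player isn't authenticated yet.
            viewController.present(authViewController, animated: true)
            return
        }
        if let error {
            print("Game Center authentication failed: \(error.localizedDescription)")
            return
        }
        print("Authenticated as \(GKLocalPlayer.local.displayName)")
    }
}
```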
Replies: 1 · Boosts: 0 · Views: 356 · Last activity: Sep ’25
Inquiry About Game Center Integration Analytics
Dear Apple Developer Support,

I hope this message finds you well. I am a game developer looking to integrate Game Center into our game. Before proceeding, I would like to understand the analytical capabilities available post-integration. Specifically, I am interested in tracking detailed metrics such as:

- The number of new player downloads driven by Game Center features (e.g., friend challenges, leaderboards, or achievements).
- Data on user engagement and conversions originating from Game Center interactions.

I have explored App Store Connect’s App Analytics but would appreciate clarification on whether Game Center-specific data (e.g., referrals from challenges or social features) is available to measure growth and optimize our strategies. If such data is accessible, could you guide me on how to view it? If not, are there alternative methods or plans to incorporate this in the future?

Thank you for your time and assistance. I look forward to your response.

Best regards,
Bella
Replies: 1 · Boosts: 0 · Views: 911 · Last activity: Sep ’25
macOS 26 Games app – Achievement description shows incorrect text before unlocking
Hello, I found an issue with the Games app on macOS 26 (Tahoe) when viewing achievements: In App Store Connect, each achievement has different values set for the pre-earned description and the post-earned description. When testing with GameKit directly (GKAchievementDescription), both values are returned correctly. However, in the macOS Games app, the post-earned description is shown even before the achievement is earned. This seems to be a display issue specific to the Games app on macOS. Could you confirm if this is a known bug in the Games app, or if there is a reason why pre-earned descriptions are not being shown? Thank you.
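For anyone who wants to reproduce the comparison, a minimal sketch of reading both strings straight from GameKit (assuming the local player is already authenticated):

```swift
import GameKit

/// Logs the pre-earned and post-earned text for every achievement,
/// so the values can be compared with what the macOS Games app displays.
func dumpAchievementDescriptions() {
    GKAchievementDescription.loadAchievementDescriptions { descriptions, error in
        if let error {
            print("Failed to load achievement descriptions: \(error.localizedDescription)")
            return
        }
        for description in descriptions ?? [] {
            // unachievedDescription is the pre-earned text, achievedDescription the post-earned text.
            print("\(description.identifier): pre='\(description.unachievedDescription)' post='\(description.achievedDescription)'")
        }
    }
}
```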
Replies: 1 · Boosts: 0 · Views: 310 · Last activity: Sep ’25
ReplayKit Issue on iOS 26
When previewing a gameplay recording, the buttons to exit or save are unclickable because they sit behind the clock and the Wi-Fi/5G status bar at the top of the screen, which means you have to quit the game in order to continue. Tested on multiple devices. Does anyone have a solution to this? At the moment we have disabled the feature altogether for iOS 26 users.
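For context, this is roughly the presentation path involved. Forcing an over-full-screen presentation is an untested workaround idea on my part, not a confirmed fix for the iOS 26 hit-testing issue:

```swift
import ReplayKit
import UIKit

/// Stops the ReplayKit recording and presents the preview sheet.
/// The .overFullScreen style is only a guess at a mitigation for the
/// reported problem of the Save/Cancel buttons sitting under the status bar.
func stopRecordingAndPreview(from presenter: UIViewController) {
    RPScreenRecorder.shared().stopRecording { previewController, error in
        guard let previewController, error == nil else {
            print("Stop recording failed: \(String(describing: error))")
            return
        }
        previewController.modalPresentationStyle = .overFullScreen
        presenter.present(previewController, animated: true)
    }
}
```

The preview controller still needs an RPPreviewViewControllerDelegate to dismiss itself after Save/Cancel; that part is unchanged from the usual flow.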
Replies: 1 · Boosts: 2 · Views: 304 · Last activity: 2w
Can't remove annotations from PdfView
Hi everyone, I'm facing an issue where, on iOS 26, the removeAnnotation method doesn't remove the annotation. This code worked on previous versions (iOS 17, 18) but suddenly stopped working on iOS 26. Has anyone faced this issue?

```swift
guard let document = await pdfView.document else { return }

for pageIndex in 0..<document.pageCount {
    guard let page = document.page(at: pageIndex) else { continue }
    let annotations = page.annotations
    for annotation in annotations {
        page.removeAnnotation(annotation)
    }
}
```
Replies: 1 · Boosts: 0 · Views: 184 · Last activity: Oct ’25
Value of type 'SCRecordingOutput' has no member 'delegate'
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit, and also the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse movement events only) need to be perfectly synced in order to add effects in post editing. I started off by calling await stream?.startCapture() and, after that, starting my mouse-tracking function:

```swift
try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
let captureStartTime = Date()
mouseTracker?.startTracking(with: captureStartTime)
```

But every time I tested, there was a clear inconsistency in sync between the recorded video and the recorded mouse positions. The only thing I want is to know when exactly the recording "actually" started, so that I can start the mouse capture at that same time. I therefore tried using the delegates, but I haven't been able to set them up properly.

```swift
import Foundation
import AVFAudio
import ScreenCaptureKit
import OSLog
import Combine

class CaptureEngine: NSObject, @unchecked Sendable {
    private let logger = Logger()
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    private var recordingOutput: SCRecordingOutput?

    private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
    private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
    private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        // Create the stream output delegate.
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)

            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)

            self.recordingOutput = recordingOutput
            recordingOutput.delegate = self // <- this is the line that fails to compile
            try stream?.addRecordingOutput(recordingOutput)

            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    func stopCapture() async throws {
        do {
            try await stream?.stopCapture()
        } catch {
            logger.error("Failed to stop capture: \(error.localizedDescription)")
            throw error
        }
    }

    func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
        do {
            try await stream?.updateConfiguration(configuration)
            try await stream?.updateContentFilter(filter)
        } catch {
            logger.error("Failed to update the stream session: \(String(describing: error))")
        }
    }

    func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
        try self.stream?.removeRecordingOutput(recordingOutput)
    }
}

// MARK: - SCRecordingOutputDelegate

extension CaptureEngine: SCRecordingOutputDelegate {
    func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
        let startTime = Date()
        logger.info("Recording output did start recording \(startTime)")
    }

    func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
        logger.info("Recording output did finish recording")
    }

    func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
        logger.error("Recording output failed with error: \(error.localizedDescription)")
    }
}

private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate {
    private let logger = Logger()

    override init() {
        super.init()
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) {
        guard sampleBuffer.isValid else { return }
        switch outputType {
        case .screen:
            break
        case .audio:
            break
        case .microphone:
            break
        @unknown default:
            logger.error("Encountered unknown stream output type:")
        }
    }

    func stream(_ stream: SCStream, didStopWithError error: Error) {
        logger.error("Stream stopped with error: \(error.localizedDescription)")
    }
}
```

I am getting the error "Value of type 'SCRecordingOutput' has no member 'delegate'", even though I am targeting macOS 15+ (macOS 26 actually) and macOS only. What is the best way to achieve the desired result? Is there any other / better way to do it?
Replies: 1 · Boosts: 0 · Views: 187 · Last activity: Oct ’25
Request low-latency streaming for iOS/iPadOS
Just found out this entitlement key is available for visionOS: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.developer.low-latency-streaming It seems to keep video streaming from being interrupted by AWDL, and our community needs it badly for self-hosted game streaming (PC to iPhone / iPad). Related apps: Moonlight / VoidLink / SteamLink. Can we expect this on iOS/iPadOS 26, or even iOS/iPadOS 18?
Replies: 1 · Boosts: 3 · Views: 249 · Last activity: 2w
Why does game mode not get triggered for my App?
I think I have really tried everything, and I did everything according to the official documentation to support Game Mode on iOS and iPadOS, but no matter what I do it just doesn't get triggered. Funnily enough, it works during development when I install the app via Xcode, but as soon as it is live on the store and I install it from there, Game Mode doesn't get triggered anymore. What I have at the moment:

I have added (even though it is deprecated):

```xml
<key>GCSupportsGameMode</key>
<true/>
```

I have set (though this one seems to be supported only on macOS):

```xml
<key>LSApplicationCategoryType</key>
<string>public.app-category.games</string>
```

I have added:

```xml
<key>LSSupportsGameMode</key>
<true/>
```

It just doesn't work. Is there anything else that needs to be done? Shouldn't the LSSupportsGameMode flag normally be enough on its own? The reason this is so annoying is that my app is a real-time streaming app, and I want to benefit from minimized background activity for smoother gameplay and more consistent frame rates, as mentioned in the documentation.
Replies: 1 · Boosts: 0 · Views: 408 · Last activity: 1w
Fail to Create CGImageMetadata by key "HDRGainMap:HDRGainMapHeadroom"
I am trying to create empty metadata and set HDRGainMapHeadroom to a given value. However, the returned mutableMetadata contains neither HDRGainMap:HDRGainMapVersion nor HDRGainMap:HDRGainMapHeadroom, while iio:hasXMP does exist. Why? Is the reason that the HDRGainMap namespace is private?

```swift
func createHDRGainMapMetadata(version: Int, headroom: Double) -> CGImageMetadata? {
    // Create a mutable metadata object
    let mutableMetadata = CGImageMetadataCreateMutable()

    // Define the namespace for HDRGainMap
    let namespace = "HDRGainMap"

    let xmpKeyPath = "iio:hasXMP"
    let xmpValue = String(true)

    // Key path and value for the HDRGainMapVersion item
    let versionKeyPath = "\(namespace):HDRGainMapVersion"
    let versionValue = String(version)

    // Set the xmp value
    let xmpSetResult = CGImageMetadataSetValueWithPath(mutableMetadata, nil, xmpKeyPath as CFString, xmpValue as CFString)
    if xmpSetResult == false {
        print("Failed to set xmp")
    }

    // Set the version value
    let versionSetResult = CGImageMetadataSetValueWithPath(mutableMetadata, nil, versionKeyPath as CFString, versionValue as CFString)
    if versionSetResult == false {
        print("Failed to set HDRGainMapVersion")
    }

    // Key path and value for the HDRGainMapHeadroom item
    let headroomKeyPath = "\(namespace):HDRGainMapHeadroom"
    let headroomValue = String(headroom)

    // Set the headroom value
    let headroomSetResult = CGImageMetadataSetValueWithPath(mutableMetadata, nil, headroomKeyPath as CFString, headroomValue as CFString)
    if headroomSetResult == false {
        print("Failed to set HDRGainMapHeadroom")
    }

    return mutableMetadata
}
```
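One thing worth checking, as an assumption on my part: CGImageMetadataSetValueWithPath can fail silently when the prefix is not registered in the metadata's namespace table, and Apple does not publicly document a namespace URI for HDRGainMap. A sketch of registering a prefix first, with a placeholder URI that is purely hypothetical:

```swift
import ImageIO

/// Registers a namespace prefix before setting values under it.
/// The URI below is a placeholder; the real HDRGainMap namespace URI
/// is not publicly documented, which may be the actual blocker here.
func makeMetadataWithRegisteredNamespace(headroom: Double) -> CGImageMetadata? {
    let metadata = CGImageMetadataCreateMutable()
    let prefix = "HDRGainMap" as CFString
    let namespaceURI = "http://example.com/hdrgainmap/1.0/" as CFString // hypothetical

    var error: Unmanaged<CFError>?
    guard CGImageMetadataRegisterNamespaceForPrefix(metadata, namespaceURI, prefix, &error) else {
        print("Namespace registration failed: \(String(describing: error?.takeRetainedValue()))")
        return nil
    }

    let ok = CGImageMetadataSetValueWithPath(
        metadata, nil, "HDRGainMap:HDRGainMapHeadroom" as CFString, String(headroom) as CFString)
    if !ok {
        print("Failed to set HDRGainMap:HDRGainMapHeadroom")
    }
    return metadata
}
```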
Replies: 0 · Boosts: 0 · Views: 487 · Last activity: Dec ’24
Snap to Item with Assistive Touch does not work when building a Game from Unity
Hi all, I have been trying to get AssistiveTouch's Snap to Item to work for a Unity game built using Apple's Core & Accessibility API. Switch Control recognises these buttons; however, eye tracking will not snap to them. The case in which it needs to snap is when an external eye-tracking device is connected and uses AssistiveTouch and its Snap to Item feature. All buttons in the game have an AccessibilityNode with the trait 'Button' on them and an appropriate label, which, following the documentation and comments on the developer forum, should allow them to be recognised by Snap to Item. This is not the case: devices (iPads and iPhones) do not recognise the buttons as snap targets. Does anyone know why this is the case, and whether this is a bug?
Replies: 0 · Boosts: 0 · Views: 577 · Last activity: Dec ’24
AVAssetReaderTrackOutput read HDR frame from a video file.
Hello, I am trying to read video frames using AVAssetReaderTrackOutput. Here is the sample code:

```swift
// prepare assets
let asset = AVURLAsset(url: some_url)
let assetReader = try AVAssetReader(asset: asset)
guard let videoTrack = try await asset.loadTracks(withMediaCharacteristic: .visual).first else {
    throw SomeErrorCode.error
}
var readerSettings: [String: Any] = [
    kCVPixelBufferIOSurfacePropertiesKey as String: [String: String]()
]

// check if HDR video
var isHDRDetected: Bool = false
let hdrTracks = try await asset.loadTracks(withMediaCharacteristic: .containsHDRVideo)
if hdrTracks.count > 0 {
    readerSettings[AVVideoAllowWideColorKey as String] = true
    readerSettings[kCVPixelBufferPixelFormatTypeKey as String] = kCVPixelFormatType_420YpCbCr10BiPlanarFullRange
    isHDRDetected = true
}

// add output to assetReader
let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: readerSettings)
guard assetReader.canAdd(output) else { throw SomeErrorCode.error }
assetReader.add(output)
guard assetReader.startReading() else { throw SomeErrorCode.error }

// add writer output settings
let videoOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
]
let finalPath = "//some URL path"
let assetWriter = try AVAssetWriter(outputURL: finalPath, fileType: AVFileType.mov)
guard assetWriter.canApply(outputSettings: videoOutputSettings, forMediaType: AVMediaType.video) else {
    throw SomeErrorCode.error
}
let assetWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
let sourcePixelAttributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: isHDRDetected
        ? kCVPixelFormatType_420YpCbCr10BiPlanarFullRange
        : kCVPixelFormatType_32ARGB,
    kCVPixelBufferWidthKey as String: 1920,
    kCVPixelBufferHeightKey as String: 1080,
]

// create assetAdaptor
let assetAdaptor = AVAssetWriterInputTaggedPixelBufferGroupAdaptor(
    assetWriterInput: assetWriterInput,
    sourcePixelBufferAttributes: sourcePixelAttributes)
guard assetWriter.canAdd(assetWriterInput) else { throw SomeErrorCode.error }
assetWriter.add(assetWriterInput)
guard assetWriter.startWriting() else { throw SomeErrorCode.error }
assetWriter.startSession(atSourceTime: CMTime.zero)

// prepare transfer session
var session: VTPixelTransferSession? = nil
guard VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault, pixelTransferSessionOut: &session) == noErr,
      let session else {
    throw SomeErrorCode.error
}
guard let pixelBufferPool = assetAdaptor.pixelBufferPool else { throw SomeErrorCode.error }

// read through frames
while let nextSampleBuffer = output.copyNextSampleBuffer() {
    autoreleasepool {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(nextSampleBuffer) else { return }

        // this part copied from (https://developer.apple.com/videos/play/wwdc2023/10181) at 23:58 timestamp
        let attachment = [
            kCVImageBufferYCbCrMatrixKey: kCVImageBufferYCbCrMatrix_ITU_R_2020,
            kCVImageBufferColorPrimariesKey: kCVImageBufferColorPrimaries_ITU_R_2020,
            kCVImageBufferTransferFunctionKey: kCVImageBufferTransferFunction_SMPTE_ST_2084_PQ,
        ]
        CVBufferSetAttachments(imageBuffer, attachment as CFDictionary, .shouldPropagate)

        // now convert to CIImage with HDR data
        let image = CIImage(cvPixelBuffer: imageBuffer)
        let cropped = "" // here perform some actions like cropping, flipping, etc.,
                         // and preserve these changes by converting the result to CGImage first:

        // this part copied from (https://developer.apple.com/videos/play/wwdc2023/10181) at 24:30 timestamp
        guard let cgImage = context.createCGImage(
            cropped,
            from: cropped.extent,
            format: .RGBA16,
            colorSpace: CGColorSpace(name: CGColorSpace.itur_2100_PQ)!)
        else { continue }

        // finally convert it back to CIImage
        let newScaledImage = CIImage(cgImage: cgImage)

        // now write it to a new pixelBuffer
        let pixelBufferAttributes: [String: Any] = [
            kCVPixelBufferCGImageCompatibilityKey as String: true,
            kCVPixelBufferCGBitmapContextCompatibilityKey as String: true,
        ]
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferCreate(
            kCFAllocatorDefault,
            Int(newScaledImage.extent.width),
            Int(newScaledImage.extent.height),
            kCVPixelFormatType_420YpCbCr10BiPlanarFullRange,
            pixelBufferAttributes as CFDictionary,
            &pixelBuffer)
        guard let pixelBuffer else { continue }
        context.render(newScaledImage, to: pixelBuffer) // context is a CIContext reference

        var pixelTransferBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferPool, &pixelTransferBuffer)
        guard let pixelTransferBuffer else { continue }

        // Transfer the image to the pixel buffer.
        guard VTPixelTransferSessionTransferImage(session, from: pixelBuffer, to: pixelTransferBuffer) == noErr else {
            continue
        }

        // finally append to taggedBuffer
    }
}

assetWriterInput.markAsFinished()
await assetWriter.finishWriting()
```

The resulting video does not have the correct color compared with the original; it turns out too bright. If I play around with the attachment values it can be either too dim or too bright, but never exactly like the original video. What am I missing in my setup? I did find that kCVPixelFormatType_4444AYpCbCr16 can produce proper video output, but then I can't convert it to a CIImage, so I can't do the CIImage operations that I need, mainly cropping and resizing.
Replies: 0 · Boosts: 0 · Views: 651 · Last activity: Dec ’24
OCR does not work
Hi, I'm working with a very simple app that tries to read a coordinates card and paste the data into different fields. The card's layout is columns 1-10, rows A-J, and a two-digit number in each cell. In my app, I have a field for each of those cells (A1, A2, ...). I want OCR to read that card and paste the info, but I just can't. I have two problems. First, the camera won't close; it remains open until I press the SAVE button (this is not good, because a user could take 3, 4, 5... pictures of the same card with, maybe, different results, and then which is the good one?). Second, after I press save, I can see that the OCR kind of works (the console prints all the data read), but the info is not pasted at all. Any idea? I know it's hard to know what's wrong, but I've tried ChatGPT and whatever it suggests just doesn't work. This is the code from the scan view:

```swift
import SwiftUI
import Vision
import VisionKit

struct ScanCardView: UIViewControllerRepresentable {
    @Binding var scannedCoordinates: [String: String]
    var useLettersForColumns: Bool
    var numberOfColumns: Int
    var numberOfRows: Int

    @Environment(\.presentationMode) var presentationMode

    func makeUIViewController(context: Context) -> VNDocumentCameraViewController {
        let scannerVC = VNDocumentCameraViewController()
        scannerVC.delegate = context.coordinator
        return scannerVC
    }

    func updateUIViewController(_ uiViewController: VNDocumentCameraViewController, context: Context) {}

    func makeCoordinator() -> Coordinator {
        return Coordinator(self)
    }

    class Coordinator: NSObject, VNDocumentCameraViewControllerDelegate {
        let parent: ScanCardView

        init(_ parent: ScanCardView) {
            self.parent = parent
        }

        func documentCameraViewController(_ controller: VNDocumentCameraViewController, didFinishWith scan: VNDocumentCameraScan) {
            print("Escaneo completado, procesando imagen...")
            guard scan.pageCount > 0, let image = scan.imageOfPage(at: 0).cgImage else {
                print("No se pudo obtener la imagen del escaneo.")
                controller.dismiss(animated: true, completion: nil)
                return
            }
            recognizeText(from: image)
            DispatchQueue.main.async {
                print("Finalizando proceso OCR y cerrando la cámara.")
                controller.dismiss(animated: true, completion: nil)
            }
        }

        func documentCameraViewControllerDidCancel(_ controller: VNDocumentCameraViewController) {
            print("Escaneo cancelado por el usuario.")
            controller.dismiss(animated: true, completion: nil)
        }

        func documentCameraViewController(_ controller: VNDocumentCameraViewController, didFailWithError error: Error) {
            print("Error en el escaneo: \(error.localizedDescription)")
            controller.dismiss(animated: true, completion: nil)
        }

        private func recognizeText(from image: CGImage) {
            let request = VNRecognizeTextRequest { (request, error) in
                guard let observations = request.results as? [VNRecognizedTextObservation], error == nil else {
                    print("Error en el reconocimiento de texto: \(String(describing: error?.localizedDescription))")
                    DispatchQueue.main.async {
                        self.parent.presentationMode.wrappedValue.dismiss()
                    }
                    return
                }
                let recognizedStrings = observations.compactMap { observation in
                    observation.topCandidates(1).first?.string
                }
                print("Texto reconocido: \(recognizedStrings)")
                let filteredCoordinates = self.filterValidCoordinates(from: recognizedStrings)
                DispatchQueue.main.async {
                    print("Coordenadas detectadas después de filtrar: \(filteredCoordinates)")
                    self.parent.scannedCoordinates = filteredCoordinates
                }
            }
            request.recognitionLevel = .accurate

            let handler = VNImageRequestHandler(cgImage: image, options: [:])
            DispatchQueue.global(qos: .userInitiated).async {
                do {
                    try handler.perform([request])
                    print("OCR completado y datos procesados.")
                } catch {
                    print("Error al realizar la solicitud de OCR: \(error.localizedDescription)")
                }
            }
        }

        private func filterValidCoordinates(from strings: [String]) -> [String: String] {
            var result: [String: String] = [:]
            print("Texto antes de filtrar: \(strings)")
            for string in strings {
                let trimmedString = string.replacingOccurrences(of: " ", with: "")
                if parent.useLettersForColumns {
                    let pattern = "^[A-J]\\d{1,2}$" // Letters A-J followed by 1 or 2 digits
                    if trimmedString.range(of: pattern, options: .regularExpression) != nil {
                        print("Coordenada válida detectada (letras): \(trimmedString)")
                        result[trimmedString] = "Valor" // test assignment
                    }
                } else {
                    let pattern = "^[1-9]\\d{0,1}$" // numbers only, 1 to 99
                    if trimmedString.range(of: pattern, options: .regularExpression) != nil {
                        print("Coordenada válida detectada (números): \(trimmedString)")
                        result[trimmedString] = "Valor"
                    }
                }
            }
            print("Coordenadas finales después de filtrar: \(result)")
            return result
        }
    }
}
```
Replies: 0 · Boosts: 0 · Views: 545 · Last activity: Jan ’25
Core Image recipe for QR code icon image
Create the QR code:

```objc
CIFilter<CIQRCodeGenerator> *f = CIFilter.QRCodeGenerator;
f.message = [@"Message" dataUsingEncoding:NSASCIIStringEncoding];
f.correctionLevel = @"Q"; // increase the correction level so the icon can cover part of the code
CIImage *qrcode = f.outputImage;
```

Overlay the icon:

```objc
CIImage *icon = [CIImage imageWithURL:url];
CGAffineTransform t = CGAffineTransformMakeTranslation(
    (qrcode.extent.width  - icon.extent.width)  / 2.0,
    (qrcode.extent.height - icon.extent.height) / 2.0);
icon = [icon imageByApplyingTransform:t];
qrcode = [icon imageByCompositingOver:qrcode];
```

Round off the corners with a warp kernel:

```objc
static dispatch_once_t onceToken;
static CIWarpKernel *k;
dispatch_once(&onceToken, ^{
    k = [CIWarpKernel kernelWithFunctionName:@"bend_corners"
                        fromMetalLibraryData:metalLibData()
                                       error:nil];
});
qrcode = [k applyWithExtent:qrcode.extent
                roiCallback:^CGRect(int i, CGRect r) { return CGRectInset(r, -radius, -radius); }
                 inputImage:qrcode
                  arguments:@[[CIVector vectorWithCGRect:qrcode.extent], @(radius)]];
```

…and this code for the kernel should go in a separate .ci.metal source file:

```metal
float2 bend_corners(float4 extent, float s, destination dest)
{
    float2 p, dc = dest.coord();
    float ratio = 1.0;

    // Round lower left corner
    p = float2(extent.x + s, extent.y + s);
    if (dc.x < p.x && dc.y < p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    // Round lower right corner
    p = float2(extent.x + extent.z - s, extent.y + s);
    if (dc.x > p.x && dc.y < p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    // Round upper left corner
    p = float2(extent.x + s, extent.y + extent.w - s);
    if (dc.x < p.x && dc.y > p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    // Round upper right corner
    p = float2(extent.x + extent.z - s, extent.y + extent.w - s);
    if (dc.x > p.x && dc.y > p.y) {
        float2 d = abs(dc - p);
        ratio = min(d.x, d.y) / max(d.x, d.y);
        ratio = sqrt(1.0 + ratio * ratio);
        return (dc - p) * ratio + p;
    }

    return dc;
}
```
Replies: 0 · Boosts: 0 · Views: 85 · Last activity: Mar ’25
Score range of ImageAestheticsScoresObservation in Vision framework
Hi everyone, I'm using the Vision framework’s ImageAestheticsScoresObservation class (https://developer.apple.com/documentation/vision/imageaestheticsscoresobservation). I noticed that the overallScore returned sometimes gives negative values. Could someone confirm whether the expected range of the score is from -1.0 to 1.0? The documentation doesn’t explicitly state the possible score range, so I’d appreciate any clarification or insights. Thanks in advance!
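For anyone reproducing this, a minimal sketch using the Swift-only Vision request that produces this observation (iOS 18/macOS 15; the names are taken from memory of the current SDK, so verify the exact signatures in Xcode):

```swift
import Vision
import Foundation

/// Computes the aesthetics score for an image file and prints the raw values.
func aestheticsScore(for imageURL: URL) async throws -> Float {
    let request = CalculateImageAestheticsScoresRequest()
    let observation = try await request.perform(on: imageURL)
    // overallScore is a Float that can go negative for low-quality images;
    // the documented bounds are exactly what the post above is asking about.
    print("overallScore: \(observation.overallScore), isUtility: \(observation.isUtility)")
    return observation.overallScore
}
```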
Replies: 0 · Boosts: 0 · Views: 78 · Last activity: Apr ’25
Apple Unity plugin issue
I use Unity 2020.3.48f1 to develop a game, and to implement Apple services integration I use the Apple Unity plugins (https://github.com/apple/unityplugins). With the latest version of the plugins I get an error in the Unity project after importing the plugin; it says "Not allowed platform VisionOS". When I tried an older version of the plugins, I get a runtime error when calling var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems(); on line 42: it drops an EXC_BAD_ACCESS (code=257, address=0x0000...) error. I tried different commits from the official repository and even custom branches of the Apple Unity plugins, such as https://github.com/muZZkat/unityplugins/tree/muzzkat/fix-fetch-items, but it did not help. Here is the whole script that tries to use the Apple Unity plugins:

```csharp
using System.Threading.Tasks;
using UnityEngine;
using System.Collections;
using System;
using Apple.GameKit;
using UnityEngine.UI;

public class TheScript : MonoBehaviour
{
    [SerializeField] InputField otp;

    string Signature;
    string TeamPlayerID;
    string Salt;
    string PublicKeyUrl;
    string Timestamp;

    void Start()
    {
        StartCoroutine(Call());
    }

    private IEnumerator Call()
    {
        yield return new WaitForSeconds(5);
        Login();
    }

    public async Task Login()
    {
        otp.text += $"Logging in... ";
        if (!Apple.GameKit.GKLocalPlayer.Local.IsAuthenticated)
        {
            try
            {
                var player = await GKLocalPlayer.Authenticate();
                var localPlayer = GKLocalPlayer.Local;
                TeamPlayerID = localPlayer.TeamPlayerId;

                var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems();
                Signature = Convert.ToBase64String(fetchItemsResponse.GetSignature());
                PublicKeyUrl = fetchItemsResponse.PublicKeyUrl;

                otp.text += $"Team Player ID: {TeamPlayerID} ";
                otp.text += $"PublicKeyUrl: {PublicKeyUrl} ";
            }
            catch (Exception e)
            {
                otp.text += $"Error: " + e.Message;
            }
        }
        else
        {
            Debug.Log("AppleGameCenter player already logged in.");
        }
    }

    async Task SignInWithAppleGameCenterAsync(string signature, string teamPlayerId, string publicKeyURL, string salt, ulong timestamp)
    {
    }
}
```
Replies: 0 · Boosts: 1 · Views: 147 · Last activity: May ’25