Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and share resources for game developers.

Posts under General subtopic


Game Rejected as Spam
My app is being rejected, and all I'm being told is that it is spam. I've tried improving various aspects of the game, but I just receive the same copy-and-paste rejection message each time. I have no idea if I'm moving in the right direction or what part of my game needs to be changed or improved. Is there a game-quality benchmark document or some other resource I can use to better understand why my game is being rejected and how to bring it to a level that meets Apple's standards?
1 reply · 0 boosts · 85 views · Apr ’25
How to Enable Game Mode
What is Game Mode?
Game Mode optimizes your gaming experience by giving your game the highest priority access to your CPU and GPU, lowering usage for background tasks. And it doubles the Bluetooth sampling rate, which reduces input latency and audio latency for wireless accessories like game controllers and AirPods.
See Use Game Mode on Mac
See Port advanced games to Apple platforms

How can I enable Game Mode in my game?
- Add the Supports Game Mode property (GCSupportsGameMode) to your game's Info.plist and set it to true
- Correctly identify your game's Application Category with LSApplicationCategoryType (also in Info.plist)
Note: Enabling Game Mode makes your game eligible but is not a guarantee; the OS decides at runtime whether to enable Game Mode. An app that enables Game Mode but isn't a game will be rejected by App Review.

How can I disable Game Mode?
Set GCSupportsGameMode to false.
Note: On Mac, Game Mode is automatically disabled if the game isn't running full screen.
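For illustration, here is what the two Info.plist entries described above might look like (an XML sketch; public.app-category.games is just an example value and should match your game's actual category):

<key>GCSupportsGameMode</key>
<true/>
<key>LSApplicationCategoryType</key>
<string>public.app-category.games</string>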
1 reply · 0 boosts · 308 views · Jul ’25
GCController.shouldMonitorBackgroundEvents = true broken?
I suspect that setting GCController.shouldMonitorBackgroundEvents = true does not actually make the game controller's inputs accessible to the app when it is in the background. About this value the official documentation says: "A Boolean value that indicates whether the app needs to respond to controller events when it isn't the frontmost app."

The behavior is that when the app is in focus the user's inputs are correctly recognized, but as soon as the app enters the background no inputs are recognized. The controller is not reported as disconnecting and still works, for example, in Launchpad. I am sure that about two months ago, when I first used this, it worked as one would expect. I have also seen that an app which lets users execute certain actions using their controller has stopped working recently, adding to my suspicion that the feature is broken. Here is a minimal reproducible example:

import SwiftUI
import GameController

@main
struct TestingControllerConnectionApp: App {
    @NSApplicationDelegateAdaptor(AppDelegate.self) var appDelegate

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

class AppDelegate: NSObject, NSApplicationDelegate {
    var statusItem: NSStatusItem?
    var controller: GCController?

    func applicationDidFinishLaunching(_ notification: Notification) {
        setupMenuBar()
        GCController.shouldMonitorBackgroundEvents = true

        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
    }

    @objc private func setupMenuBar() {
        let menu = NSMenu()
        menu.addItem(NSMenuItem(title: "Quit", action: #selector(quitApp), keyEquivalent: "q"))
        statusItem = NSStatusBar.system.statusItem(withLength: NSStatusItem.variableLength)
        statusItem?.button?.image = NSImage(resource: .controllerBar)
        statusItem?.menu = menu
    }

    @objc private func quitApp() {
        NSApp.terminate(nil)
    }

    @objc private func controllerDidConnect(_ notification: Notification) {
        if let controller = notification.object as? GCController {
            print("Controller connected")
            self.controller = controller
            if let gamepad = controller.extendedGamepad {
                gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
                    print("Button A pressed: \(pressed)")
                }
            }
        }
    }

    @objc private func controllerDidDisconnect(_ notification: Notification) {
        print("Controller disconnected")
    }
}

This is created in a completely fresh Xcode project, and NSHumanInterfaceDeviceUsageDescription has been added. I am using a PS5 controller and a Mac running macOS 15.4.1, which has been restarted; only Xcode and the app have been opened. I have tested this with a multitude of different entitlements and capabilities, including:
- NSHumanInterfaceDeviceUsageDescription
- Supports Controller User Interaction
- Required background modes -> App communicates with an accessory
- com.apple.security.device.bluetooth
- com.apple.security.device.hid
- com.apple.security.device.usb

I have also set this value at different points in the code with no change in effect. Does anybody see a fault in my code or in my understanding of the effect of 'shouldMonitorBackgroundEvents'? Or is this functionality actually broken on Apple's part?
1 reply · 0 boosts · 112 views · Apr ’25
How to obtain frame rate for iOS ProMotion devices
With the release of ProMotion devices, the system may switch frame rates in certain scenarios, so data collected through CADisplayLink callbacks on the assumption of a fixed 60Hz frame rate loses its reference value. We cannot distinguish whether a slow CADisplayLink callback is due to a stutter or to the system switching frame rates. I know about the hitch time ratio, but I can't use that scheme for some reasons. How can I distinguish between a stutter and a frame-rate gear shift in the CADisplayLink callback? In iOS 15, CADisplayLink.preferredFrameRateRange.preferred always returns 0, while minimum and maximum do change. Can I use these minimum and maximum range values as criteria to distinguish between frame-rate switching and stuttering?
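One sketch of an approach (not from the original post; the 1.5 threshold is an arbitrary illustration): instead of assuming a fixed 60Hz, compare each callback's actual interval with the interval the system itself scheduled (targetTimestamp - timestamp). A growing scheduled interval suggests a frame-rate switch; an actual interval well beyond the scheduled one suggests a hitch.

import UIKit

final class FrameMonitor {
    private var link: CADisplayLink?
    private var lastTimestamp: CFTimeInterval = 0
    private var lastScheduledInterval: CFTimeInterval = 0

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
        link.add(to: .main, forMode: .common)
        self.link = link
    }

    @objc private func tick(_ link: CADisplayLink) {
        if lastTimestamp > 0, lastScheduledInterval > 0 {
            let actual = link.timestamp - lastTimestamp
            // A rate switch shows up as a change in the *scheduled* interval
            // (targetTimestamp - timestamp); a hitch shows up as an actual
            // interval well beyond what was scheduled.
            if actual > lastScheduledInterval * 1.5 {
                print("possible hitch: \(actual * 1000) ms vs scheduled \(lastScheduledInterval * 1000) ms")
            }
        }
        lastTimestamp = link.timestamp
        lastScheduledInterval = link.targetTimestamp - link.timestamp
    }
}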
1 reply · 0 boosts · 111 views · May ’25
Embedded links not clickable in PDFs for iOS devices
I have an SPFx React application where I am printing the HTML page content using the default JavaScript window.print() functionality. Once I save the page as a PDF from the print preview window and open it using Adobe Acrobat, the links (e.g., Google) within the content are not clickable and appear as plain text. I have tried printing random pages after searching for keywords in Google and saved the files as PDFs, but unfortunately the links are still not clickable there either. To check whether it is an Adobe Acrobat issue, I performed the same print functionality from Android devices and shared the PDF file across iOS devices; in that case, when opened using Adobe Acrobat, the links are clickable. I am wondering whether this is related to how the default print functionality works on iPadOS and iOS devices. Any insights on this would be really helpful. Thanks!!! Note: The links are clickable on macOS as well as on Windows. #ios #ipados #javascript #spfx #react
2 replies · 0 boosts · 106 views · May ’25
Trouble with MDLMesh.newBox()
I'm trying to build an MDLMesh, then add normals:

let mdlMesh = MDLMesh.newBox(withDimensions: SIMD3<Float>(1, 1, 1),
                             segments: SIMD3<UInt32>(2, 2, 2),
                             geometryType: MDLGeometryType.triangles,
                             inwardNormals: false,
                             allocator: allocator)
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0)

When I render the mesh, some normals are (0,0,0). I don't know if the problem is in the mesh or in the conversion to MTKMesh. Is there a way to examine an MDLMesh with the geometry viewer? When I look at the variable values for my mdlMesh in the debugger, the output is not too useful. I don't know how to track down the normals. What's the best way to find out where the normals are getting broken?
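Not from the original post, but one way to track the normals down on the CPU is to read the normal attribute back out of the MDLMesh before converting to MTKMesh; a sketch:

import ModelIO

// Prints every normal stored in the mesh so zero vectors can be spotted.
func dumpNormals(of mesh: MDLMesh) {
    guard let attr = mesh.vertexAttributeData(forAttributeNamed: MDLVertexAttributeNormal,
                                              as: .float3) else {
        print("no normal attribute")
        return
    }
    var ptr = attr.dataStart
    for i in 0..<mesh.vertexCount {
        let n = ptr.assumingMemoryBound(to: Float.self)
        print("normal[\(i)] = (\(n[0]), \(n[1]), \(n[2]))")
        ptr += attr.stride
    }
}

Calling dumpNormals(of: mdlMesh) right after addNormals should show whether the zeros are already present in the MDLMesh or only appear after the MTKMesh conversion.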
1 reply · 0 boosts · 96 views · May ’25
Does OLED burn-in affect iPad Pro displays, and should I implement a screen saver in my always-on app?
Hi everyone, I’m developing an iPad app that will be running continuously with the screen always on — similar to a restaurant ordering system. I understand that some of the newer iPad Pro models are equipped with OLED displays. I'm concerned about the potential risk of screen burn-in due to static UI elements being displayed for extended periods. Does burn-in occur on the OLED iPad Pro models under such usage? Would it be advisable to implement a screen saver or periodically animate/change parts of the UI to prevent this? Any insights or best practices would be greatly appreciated. Thank you!
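For what it's worth, the "periodically animate/change parts of the UI" idea is often implemented in kiosk-style apps as a small periodic pixel shift of static content. A minimal sketch (names and intervals are illustrative, not Apple guidance):

import UIKit

// Nudges a container view by a couple of points every few minutes so
// static pixels don't stay lit at one constant level.
final class PixelShifter {
    private var timer: Timer?

    func start(shifting view: UIView, every interval: TimeInterval = 180) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            let dx = CGFloat(Int.random(in: -2...2))
            let dy = CGFloat(Int.random(in: -2...2))
            UIView.animate(withDuration: 0.5) {
                view.transform = CGAffineTransform(translationX: dx, y: dy)
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}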
1 reply · 0 boosts · 83 views · Jun ’25
CGContext PDF/A intents
let dic: [AnyHashable: Any] = [
    kCGPDFXRegistryName: "http://www.color.org" as CFString,
    kCGPDFXOutputConditionIdentifier: "FOGRA43" as CFString,
    kCGPDFContextOutputIntent: "GTS_PDFX" as CFString,
    kCGPDFXOutputIntentSubtype: "GTS_PDFX" as CFString,
    kCGPDFContextCreateLinearizedPDF: "" as CFString,
    kCGPDFContextCreatePDFA: "" as CFString,
    kCGPDFContextAuthor: "Placeholder" as CFString,
    kCGPDFContextCreator: "Placeholder" as CFString
]

Hello, I would now like to export my PDFs as PDF/A. As far as I can tell, Core Graphics has the right option for this. Unfortunately, the documentation does not show what value is required for 'kCGPDFContextCreatePDFA' or 'kCGPDFContextCreateLinearizedPDF'. What I have already tried as string values: GTS_PDFA1, PDF/A-1, true as CFString. (In my CFDictionary above, keys like ...Author are working perfectly.) In the Finder you can see these two options, which I would also like to implement in my app. Thank you in advance!
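For comparison, a sketch of what may be the intended usage, assuming (based on the CGPDFContext header comments) that both keys take boolean values rather than strings; this is an assumption to verify, not confirmed documentation:

import CoreGraphics
import Foundation

let options: [CFString: Any] = [
    kCGPDFContextCreatePDFA: true,          // assumption: CFBoolean, not a string
    kCGPDFContextCreateLinearizedPDF: true, // assumption: CFBoolean, not a string
    kCGPDFContextAuthor: "Placeholder" as CFString,
    kCGPDFContextCreator: "Placeholder" as CFString
]

var mediaBox = CGRect(x: 0, y: 0, width: 612, height: 792)
let data = NSMutableData()
if let consumer = CGDataConsumer(data: data as CFMutableData),
   let pdf = CGContext(consumer: consumer, mediaBox: &mediaBox, options as CFDictionary) {
    pdf.beginPDFPage(nil)
    // ... draw page content here ...
    pdf.endPDFPage()
    pdf.closePDF()
}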
1 reply · 0 boosts · 123 views · Jun ’25
What good is NSBitmapFormatAlphaNonpremultiplied?
If I create a bitmap image and then try to get ready to draw into it, like so:

NSBitmapImageRep* newRep = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes: nullptr
                  pixelsWide: 128
                  pixelsHigh: 128
               bitsPerSample: 8
             samplesPerPixel: 4
                    hasAlpha: YES
                    isPlanar: NO
              colorSpaceName: NSDeviceRGBColorSpace
                bitmapFormat: NSBitmapFormatAlphaNonpremultiplied | NSBitmapFormatThirtyTwoBitBigEndian
                 bytesPerRow: 4 * 128
                bitsPerPixel: 32];
[NSGraphicsContext setCurrentContext:
    [NSGraphicsContext graphicsContextWithBitmapImageRep: newRep]];

then the log shows this error:

CGBitmapContextCreate: unsupported parameter combination:
  RGB 8 bits/component, integer 512 bytes/row
  kCGImageAlphaLast kCGImageByteOrderDefault kCGImagePixelFormatPacked
Valid parameters for RGB color space model are:
  16 bits per pixel, 5 bits per component, kCGImageAlphaNoneSkipFirst
  32 bits per pixel, 8 bits per component, kCGImageAlphaNoneSkipFirst
  32 bits per pixel, 8 bits per component, kCGImageAlphaNoneSkipLast
  32 bits per pixel, 8 bits per component, kCGImageAlphaPremultipliedFirst
  32 bits per pixel, 8 bits per component, kCGImageAlphaPremultipliedLast
  32 bits per pixel, 10 bits per component, kCGImageAlphaNone|kCGImagePixelFormatRGBCIF10|kCGImageByteOrder16Little
  64 bits per pixel, 16 bits per component, kCGImageAlphaPremultipliedLast
  64 bits per pixel, 16 bits per component, kCGImageAlphaNoneSkipLast
  64 bits per pixel, 16 bits per component, kCGImageAlphaPremultipliedLast|kCGBitmapFloatComponents|kCGImageByteOrder16Little
  64 bits per pixel, 16 bits per component, kCGImageAlphaNoneSkipLast|kCGBitmapFloatComponents|kCGImageByteOrder16Little
  128 bits per pixel, 32 bits per component, kCGImageAlphaPremultipliedLast|kCGBitmapFloatComponents
  128 bits per pixel, 32 bits per component, kCGImageAlphaNoneSkipLast|kCGBitmapFloatComponents
See Quartz 2D Programming Guide (available online) for more information.

If I don't use NSBitmapFormatAlphaNonpremultiplied as part of the format, I don't get the error message. My question is: why does the constant NSBitmapFormatAlphaNonpremultiplied exist if you can't use it like this? If you're wondering why I wanted to do this: I want to extract the RGBA pixel data from an image, which might have non-premultiplied alpha. Elsewhere online I saw advice that if you want to look at the pixels of an image, you should draw it into a bitmap whose format you know and look at those pixels. And I don't want the process of drawing to premultiply my alpha.
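Since the goal is extracting non-premultiplied RGBA, here is one common workaround as a Swift sketch: draw into a premultiplied context (which Core Graphics does support, per the list above) and divide by alpha afterwards. Note this round trip is lossy at low alpha values, so it's an approximation, not a substitute for true non-premultiplied drawing:

import CoreGraphics

// Draws a CGImage into a premultiplied RGBA8 context, then un-premultiplies.
func nonPremultipliedRGBA(from image: CGImage) -> [UInt8]? {
    let w = image.width
    let h = image.height
    guard let ctx = CGContext(data: nil,
                              width: w, height: h,
                              bitsPerComponent: 8,
                              bytesPerRow: w * 4,
                              space: CGColorSpaceCreateDeviceRGB(),
                              bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    ctx.draw(image, in: CGRect(x: 0, y: 0, width: w, height: h))
    guard let base = ctx.data else { return nil }
    let src = base.assumingMemoryBound(to: UInt8.self)

    var out = [UInt8](repeating: 0, count: w * h * 4)
    for i in stride(from: 0, to: out.count, by: 4) {
        let a = Int(src[i + 3])
        out[i + 3] = src[i + 3]
        guard a > 0 else { continue }
        // Undo premultiplication channel by channel.
        out[i]     = UInt8(min(255, Int(src[i])     * 255 / a))
        out[i + 1] = UInt8(min(255, Int(src[i + 1]) * 255 / a))
        out[i + 2] = UInt8(min(255, Int(src[i + 2]) * 255 / a))
    }
    return out
}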
3 replies · 0 boosts · 106 views · Jun ’25
Firebase not initializing in iOS Unreal Engine 5.6 app (using MetaHuman + FirebaseFeatures)
Hi everyone, I'm building a native iOS app using Unreal Engine 5.6 with Firebase for authentication and Firestore. The app uses a MetaHuman avatar and is meant to run as a standalone UE app on iPhone. I'm using this Firebase wrapper: 👉 https://pandoa.github.io/FirebaseFeatures/

I've followed all the steps, including:
- Adding GoogleService-Info.plist to the Xcode project and ensuring it's in the correct target
- Calling FIRApp.configure() in AppDelegate
- Verifying the plist is bundled correctly

However, the app crashes on launch, and Firebase does not initialize properly. Crash log shows:
[FirebaseCore][I-COR000005] No app has been configured yet.

Setup details:
- Unreal Engine: 5.6 (source build, macOS)
- iOS deployment target: 17.5
- MetaHuman character packaged correctly; the app launches fine without Firebase

Has anyone here managed to get Firebase working inside a native Unreal Engine iOS app with this setup? I'd love to hear if there's something I'm missing — maybe something with initialization timing or module loading? Thanks so much in advance 🙏
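For reference, the [FirebaseCore][I-COR000005] message generally appears when some Firebase API runs before configuration. As a plain-Swift illustration of the required ordering (hypothetical; in an Unreal source build the equivalent call has to be made from the engine's iOS app-delegate hooks before any plugin touches Auth or Firestore):

import UIKit
import FirebaseCore

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Must run before any other Firebase call anywhere in the app.
        FirebaseApp.configure()
        return true
    }
}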
1 reply · 0 boosts · 721 views · Jul ’25
Xcode compile gets stuck (stops) when adding new source files or functions to existing files
Hello, we are working on an iOS game project; as it progresses, the project grows larger and larger. Because we use other game dependencies and libraries, "larger and larger" here refers to the whole project; the source files integrated and compiled by Xcode are not many. Now we seem to have hit a bottleneck: when I add new files, or add functions to existing files, to implement a new feature, the Xcode compile gets stuck (stops) at "Indexing | Initializing datastore" forever and cannot produce a final build. macOS 15.1, Xcode 16.2. Can you provide any solutions to this problem? Also submitted Feedback ID #FB18432749.
4 replies · 0 boosts · 641 views · Aug ’25
CIBumpDistortion filter not working on my view
I'm trying to apply a CIBumpDistortion Core Image filter to a view that contains a UILabel (my storyLabel). The goal is to create a visual bump/magnifying-glass effect over the text. However, despite my attempts, the filter doesn't seem to render at all: the view and the label appear as normal, with no distortion effect. I've tried adjusting the filter parameters and reviewing the view hierarchy, but without success. I also haven't been able to find clear documentation or examples for applying this filter to a UIView's layer.

//
// TVView.swift
// Mistery
//
// Created by Joje on 31/07/25.
//

import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit
import AVFoundation

final class TVView: UIView {
    // text animation properties
    private var textAnimationTimer: Timer?
    private var fullTextToAnimate: String = ""
    private var currentCharIndex: Int = 0

    // static-video properties
    private var player: AVQueuePlayer?
    private var playerLayer: AVPlayerLayer?
    private var playerLooper: AVPlayerLooper?

    var onNextButtonTap: () -> Void = {}

    // MARK: - Subviews

    // TV image
    private(set) lazy var tvImageView: UIImageView = {
        let imageView = UIImageView()
        imageView.translatesAutoresizingMaskIntoConstraints = false
        imageView.image = UIImage(named: "tvFinal")
        imageView.contentMode = .scaleAspectFit
        return imageView
    }()

    // text that scrolls inside the TV
    private(set) lazy var storyLabel: UILabel = {
        let label = UILabel()
        label.translatesAutoresizingMaskIntoConstraints = false
        //label.backgroundColor = .gray
        label.textColor = .red
        label.font = UIFont(name: "MeltedMonster", size: 30)
        label.textAlignment = .left
        label.numberOfLines = 0
        label.text = ""
        return label
    }()

    private(set) lazy var nextButton: UIButton = {
        let button = UIButton(type: .system)
        button.translatesAutoresizingMaskIntoConstraints = false
        //button.backgroundColor = .darkGray
        button.addTarget(self, action: #selector(didPressNextButton), for: .touchUpInside)
        return button
    }()

    // MARK: - Lifecycle

    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .black
        setupVideoPlayer()
        addSubviews()
        setupConstraints()
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer?.frame = tvImageView.frame.insetBy(dx: tvImageView.frame.width * 0.05,
                                                       dy: tvImageView.frame.height * 0.18)
        setupFisheyeEffect()
    }

    private func setupFisheyeEffect() {
        // create the filter
        guard let filter = CIFilter(name: "CIBumpDistortion") else { return print("error") }
        storyLabel.layer.shouldRasterize = true
        storyLabel.layer.rasterizationScale = UIScreen.main.scale
        // set the parameters
        filter.setDefaults()
        // center of the effect
        let center = CIVector(x: storyLabel.bounds.midX, y: storyLabel.bounds.midY)
        filter.setValue(center, forKey: kCIInputCenterKey)
        // distortion radius
        filter.setValue(storyLabel.bounds.width, forKey: kCIInputRadiusKey)
        // distortion intensity
        filter.setValue(7, forKey: kCIInputScaleKey)
        storyLabel.layer.filters = [filter]
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // MARK: - Button actions

    @objc private func didPressNextButton() {
        onNextButtonTap()
    }

    @objc private func animateNextCharacter() {
        guard currentCharIndex < fullTextToAnimate.count else {
            textAnimationTimer?.invalidate()
            return
        }
        let currentTextIndex = fullTextToAnimate.index(fullTextToAnimate.startIndex, offsetBy: currentCharIndex)
        let partialText = String(fullTextToAnimate[...currentTextIndex])
        storyLabel.text = partialText
        currentCharIndex += 1
    }

    public func updateStoryText(with text: String) {
        textAnimationTimer?.invalidate()
        storyLabel.text = ""
        fullTextToAnimate = text
        currentCharIndex = 0
        textAnimationTimer = Timer.scheduledTimer(timeInterval: 0.12,
                                                  target: self,
                                                  selector: #selector(animateNextCharacter),
                                                  userInfo: nil,
                                                  repeats: true)
    }

    // MARK: - Setup methods

    private func setupVideoPlayer() {
        guard let videoURL = Bundle.main.url(forResource: "static-video", withExtension: "mov") else {
            print("Error: could not find the video file static-video.mov")
            return
        }
        let playerItem = AVPlayerItem(url: videoURL)
        player = AVQueuePlayer(playerItem: playerItem)
        // LINE WITH POSSIBLE ERROR
        playerLooper = AVPlayerLooper(player: player!, templateItem: playerItem)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer?.videoGravity = .resizeAspectFill
        if let layer = playerLayer {
            self.layer.addSublayer(layer)
        }
        player?.play()
    }

    private func addSubviews() {
        self.addSubview(storyLabel)
        self.addSubview(tvImageView)
        self.addSubview(nextButton)
    }

    private func setupConstraints() {
        NSLayoutConstraint.activate([
            // TV image
            tvImageView.centerXAnchor.constraint(equalTo: centerXAnchor),
            tvImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            tvImageView.widthAnchor.constraint(equalTo: widthAnchor),
            // TV text
            storyLabel.centerXAnchor.constraint(equalTo: tvImageView.centerXAnchor, constant: -50),
            storyLabel.centerYAnchor.constraint(equalTo: tvImageView.centerYAnchor, constant: -25),
            storyLabel.widthAnchor.constraint(equalTo: tvImageView.widthAnchor, multiplier: 0.35),
            storyLabel.heightAnchor.constraint(equalTo: tvImageView.heightAnchor, multiplier: 0.42),
            // TV button
            nextButton.topAnchor.constraint(equalTo: tvImageView.centerYAnchor, constant: -25),
            nextButton.centerXAnchor.constraint(equalTo: self.centerXAnchor, constant: 190),
            nextButton.widthAnchor.constraint(equalToConstant: 100),
            nextButton.heightAnchor.constraint(equalToConstant: 160)
        ])
    }
}

#Preview {
    ViewController()
}
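As far as I know, CALayer.filters is documented as unsupported on iOS, which would explain why nothing renders. A hedged alternative sketch (not from the original post): snapshot the label into an image, run it through CIBumpDistortion with a CIContext, and display the result in an overlay UIImageView:

import UIKit
import CoreImage
import CoreImage.CIFilterBuiltins

// Renders the label to an image and applies the bump distortion in Core Image.
func bumpDistortedImage(of label: UILabel) -> UIImage? {
    // 1. Snapshot the label.
    let renderer = UIGraphicsImageRenderer(bounds: label.bounds)
    let snapshot = renderer.image { ctx in
        label.layer.render(in: ctx.cgContext)
    }
    guard let input = CIImage(image: snapshot) else { return nil }

    // 2. Apply the distortion (parameter values are illustrative).
    let filter = CIFilter.bumpDistortion()
    filter.inputImage = input
    filter.center = CGPoint(x: input.extent.midX, y: input.extent.midY)
    filter.radius = Float(input.extent.width / 2)
    filter.scale = 0.5
    guard let output = filter.outputImage else { return nil }

    // 3. Render back to UIKit, cropped to the original extent.
    let context = CIContext()
    guard let cg = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cg)
}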
3 replies · 0 boosts · 113 views · Sep ’25
Low Power Mode on MacOS 26 Tahoe + Vsync fullscreen limits application to 30 fps
I'm experiencing a specific issue: on any of the macOS 26 Tahoe betas, with Low Power Mode enabled and Vsync in fullscreen, my application's frame rate is limited to a hard 30 fps. I have not experienced this on any older OS; for example, Low Power Mode on 13.6 Ventura with Vsync fullscreen lets my application run at the full 60 fps without issues. Is this a bug or a change in the behavior of Low Power Mode on Tahoe? My application is 3D, runs at 60 fps, and is sensitive to tearing, so I need Vsync, and it is mostly used in fullscreen. And Low Power Mode is the default on many Macs, so the default experience on Tahoe is currently a halved 30 fps. There also seem to be inconsistencies in which machines this happens on, but older OSes are always fine.
1 reply · 0 boosts · 200 views · Aug ’25
Will OpenGL API and Drivers be removed after appleOS 26?
Hi, I am a multimedia and graphics researcher, and I am wondering if the OpenGL API and drivers will be removed after appleOS 26: macOS 26, iOS 26, iPadOS 26, visionOS 26. I am asking this because most of the libraries I use depend on OpenGL, like CGAL, libigl, immediate-mode UI, nanovg, nanogui, and Bullet Physics. Transitioning to Vulkan or Metal while using and learning those libraries is just not viable. I am the sole developer, and I just want to ask that. Regards.
1 reply · 0 boosts · 180 views · Aug ’25
Is there any future for screensavers on macOS?
I haven't been looking at screensavers for a long time because of Apple's lack of will (or resources?) to provide a public version of the private modern SDK that Apple itself has used for a very long time now. I'm now looking at the Screen Saver pane in System Settings (the What-If version of System Preferences in an alternate universe where all screens are in portrait mode). In macOS Sequoia, it seems like 3rd-party screensavers are not welcome, considering that they are relegated to the "Other" section at the bottom of the list and you have to click Show All to start seeing them. I also had a quick look at macOS Tahoe Beta 3, and it looks like all the real screensavers are gone (3rd-party ones and the ones from Apple: Hello, Message, Flurry, etc.), or at least it would take a Nobel Prize to find them (and the Search field is not useful). I tried to install a 3rd-party screen saver on macOS Tahoe Beta 3; it doesn't show up in the list. To summarize:
- No public access to modern APIs, AFAIK.
- A UI that is hostile to 3rd-party screen savers on macOS Sequoia.
- Apparently only screensavers that are slideshows or movies curated by Apple in macOS Tahoe b3.
Hence the question: Is there any future for screen savers on macOS? Because if there's none, I won't waste my time trying to update some old screen savers.
3 replies · 0 boosts · 478 views · Aug ’25
Request: API to temporarily defer Reachability for fullscreen game swipes (bottom edge)
In swipe-driven games, a first downward swipe starting near the home indicator can trigger Reachability, even when using preferredScreenEdgesDeferringSystemGestures = .bottom and prefersHomeIndicatorAutoHidden = true. This causes the app to slide down to half screen and breaks gameplay. Please consider an API to temporarily defer Reachability while a custom gesture is active (similar to existing system gesture deferral), without disabling accessibility globally. Environment: Devices: iPhones with Home Indicator (Face ID) Why this matters: Bottom-origin swipes are core in many games (flick shots, slingshot, physics toss, bottom sheets). Current workarounds degrade UX and discoverability, and players still accidentally trigger Reachability. Feedback Assistant Post
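For context, a minimal sketch of the existing deferral API mentioned above (it defers the system gesture at the bottom edge but, per the post, does not stop Reachability):

import UIKit

final class GameViewController: UIViewController {
    // Give the game's touches priority over system gestures at the bottom edge.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge { .bottom }
    // Hide the home indicator during gameplay.
    override var prefersHomeIndicatorAutoHidden: Bool { true }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Ask UIKit to re-read the overrides above.
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
        setNeedsUpdateOfHomeIndicatorAutoHidden()
    }
}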
2 replies · 0 boosts · 626 views · Aug ’25
Unity PolySpatial
Hello, We are experiencing an issue with apparent lag or latency when interacting with objects in Unity using the XR Interaction Toolkit. Even when interactables are configured with the Movement Type set to Instantaneous, objects exhibit a noticeable delay when following hand movement. This issue is particularly apparent when the hand moves at higher speeds. We would like to know: is any additional configuration required, or is there something else we need to do to resolve this? Thank you very much for your assistance.
2 replies · 0 boosts · 420 views · Aug ’25
iPhone Pro Model - RefreshRate
I want to display the refresh rate on the screen of my iPhone (14 Pro Max), but each of the App Store apps "Floating Clock", "Floating Clock Premium", and "Screen Renewal Rate - RefreshRate Realtime" can only display two values, 60Hz and 120Hz. I wonder if it is possible to display values of 1Hz, 10Hz, tens of Hz, or more than three levels. Does anyone know how to do it?
1 reply · 0 boosts · 288 views · Sep ’25
Testing Game Center Accounts
Hi all, I'm in the early stages of testing a game that will have real-time device-to-device communication using Game Center. I've read something about setting up Game Center "sandbox" accounts to test prior to the game's release, but I can't find anything reliable about how to do so. I've found lots of web pages which purport to describe this task, but what they describe seems to be outdated, as the steps don't appear to be available in modern versions of iOS. I'm testing on iOS 18 and 26. Document search isn't helping; I mostly find information on how to set up StoreKit sandbox accounts, which I assume are different things altogether. So if you could point me toward an article that allows me to test a new, unreleased GameKit app between devices, I'd appreciate it. People are developing Game Center apps, so I know it's possible... advTHANKSance
1 reply · 0 boosts · 322 views · Sep ’25