Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Posts under Graphics & Games topic

Scene with over 12,000 duplicate entities
I am creating a RealityKit scene that will contain over 12,000 duplicate cubes arranged in a circle (see image below). This is for some high-energy physics simulations I am doing. I build this scene by creating a single cube and cloning it many times, so there is a single MeshResource and Material even though there are a lot of entities; I have confirmed this with Swift's === operator. Even so, the program is unworkably slow. Any suggestions or tricks that could help with this type of scene? Using a single geometry was the trick to getting SceneKit to perform well with scenes like this. I've been updating my software to RealityKit because I far prefer its structure over SceneKit's.
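A minimal sketch of one direction worth trying, assuming the cubes do not need to move independently after creation: bake every cube into a single MeshResource via MeshDescriptor so RealityKit draws one entity rather than 12,000 clones, mirroring the single-geometry trick from SceneKit. The helper name makeMergedCubeRing is illustrative (not an Apple API), and normals and materials are omitted for brevity.

import Foundation
import RealityKit
import simd

// Build one MeshResource containing `count` cubes of size `cubeSize` arranged on a
// circle of radius `radius`, so they can be rendered by a single ModelEntity.
func makeMergedCubeRing(count: Int, radius: Float, cubeSize: Float) throws -> MeshResource {
    let half = cubeSize / 2
    // Eight corners of a cube centered at the origin.
    let corners: [SIMD3<Float>] = [
        [-half, -half, -half], [half, -half, -half], [half, half, -half], [-half, half, -half],
        [-half, -half,  half], [half, -half,  half], [half, half,  half], [-half, half,  half]
    ]
    // Twelve triangles (two per face) indexing into the eight corners.
    let cubeIndices: [UInt32] = [
        0, 1, 2, 0, 2, 3,  4, 6, 5, 4, 7, 6,
        0, 4, 5, 0, 5, 1,  3, 2, 6, 3, 6, 7,
        0, 3, 7, 0, 7, 4,  1, 5, 6, 1, 6, 2
    ]

    var positions: [SIMD3<Float>] = []
    var indices: [UInt32] = []
    positions.reserveCapacity(count * corners.count)
    indices.reserveCapacity(count * cubeIndices.count)

    for i in 0..<count {
        // Place this cube's center on the circle.
        let angle = Float(i) / Float(count) * 2 * .pi
        let center = SIMD3<Float>(radius * cos(angle), 0, radius * sin(angle))
        let base = UInt32(positions.count)
        positions.append(contentsOf: corners.map { $0 + center })
        indices.append(contentsOf: cubeIndices.map { base + $0 })
    }

    var descriptor = MeshDescriptor(name: "cubeRing")
    descriptor.positions = MeshBuffers.Positions(positions)
    descriptor.primitives = .triangles(indices)
    return try MeshResource.generate(from: [descriptor])
}

The trade-off is that individual cubes can no longer be transformed or removed without regenerating the merged mesh.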
Replies: 4 · Boosts: 0 · Views: 1.3k · Activity: Jan ’25
Feature Request: Support .reality File Export in Reality Composer Pro for Mac
I am an AR developer working on Apple Silicon Macs. Currently, Reality Composer Pro does not allow exporting .reality files, and Reality Composer (classic) is not available for Apple Silicon. This creates a gap in the workflow for ARKit/RealityKit developers who need interactive .reality files for use in Xcode projects. Having the ability to export .reality files directly from Reality Composer Pro on Mac would greatly streamline development and enable a fully native workflow on modern Macs. Alternatively, bringing Reality Composer (classic) to Apple Silicon would also resolve this issue. I have submitted this as a feature request via Feedback Assistant (FB17900386). I encourage others with similar needs to reply or submit feedback as well. Thank you!
Replies: 4 · Boosts: 1 · Views: 158 · Activity: Jul ’25
Background GPU Access availability
I would love to use Background GPU Access to do some video processing in the background. However, the documentation of BGContinuedProcessingTaskRequest.Resources.gpu clearly states: "Not all devices support background GPU use. For more information, see Performing long-running tasks on iOS and iPadOS." Is there a list available of currently released devices that do (or don't) support background GPU usage? That would help us understand what part of our user base can use this feature (and what hardware we need to test on as developers). For example, it seems that it isn't supported on an iPad Pro M1 with the current iOS 26 beta. The simulators also don't seem to support the background GPU resource. So it would be great to understand what hardware is capable of using this feature!
Replies: 4 · Boosts: 0 · Views: 831 · Activity: Jul ’25
Xcode build gets stuck when adding new source files or adding functions to existing files
Hello, we are working on an iOS game project that grows larger and larger as it progresses. Because we use other game dependencies and libraries, "larger and larger" refers to the whole project; the source files that Xcode itself compiles are not many. Now we seem to have hit a bottleneck: when I add new files, or add functions to existing files to implement a new feature, the Xcode build gets stuck showing "Indexing | Initializing datastore" forever and never produces a final build. macOS 15.1, Xcode 16.2. Can you provide any solutions to this problem? Also submitted Feedback ID #FB18432749.
Replies: 4 · Boosts: 0 · Views: 641 · Activity: Aug ’25
Sample code for WWDC25 Session
Hi! I watched the WWDC25 session "Bring your SceneKit project to RealityKit" which seemed like a great resource for those of us transitioning from the now-deprecated SceneKit framework. The session mentioned that the full sample code for the project would be available to download, but I haven't been able to find it in the Code section of the video page or in the Sample Code Library. Has the sample code been released yet? Having the project code would make it much easier to follow along with the RealityKit changes shown in the video. Thanks again for the great session.
Replies: 4 · Boosts: 4 · Views: 247 · Activity: Jun ’25
Metal runtime shader library compilation and linking issue
In my project I need to do the following:
1. At runtime, create a Metal dynamic library from source.
2. At runtime, create a Metal executable library from source and link it with the dynamic library created in step 1.
3. Create a compute pipeline using the two libraries created above.

But I get the following error at the third step:

Error Domain=AGXMetalG15X_M1 Code=2 "Undefined symbols: _Z5noisev, referenced from: OnTheFlyKernel" UserInfo={NSLocalizedDescription=Undefined symbols: _Z5noisev, referenced from: OnTheFlyKernel}

import Foundation
import Metal

class MetalShaderCompiler {
    let device = MTLCreateSystemDefaultDevice()!
    var pipeline: MTLComputePipelineState!

    func compileDylib() -> MTLDynamicLibrary {
        let source = """
        #include <metal_stdlib>
        using namespace metal;

        half3 noise() {
            return half3(1, 0, 1);
        }
        """

        let option = MTLCompileOptions()
        option.libraryType = .dynamic
        option.installName = "@executable_path/libFoundation.metallib"

        let library = try! device.makeLibrary(source: source, options: option)
        let dylib = try! device.makeDynamicLibrary(library: library)
        return dylib
    }

    func compileExlib(dylib: MTLDynamicLibrary) -> MTLLibrary {
        let source = """
        #include <metal_stdlib>
        using namespace metal;

        extern half3 noise();

        kernel void OnTheFlyKernel(texture2d<half, access::read> src [[texture(0)]],
                                   texture2d<half, access::write> dst [[texture(1)]],
                                   ushort2 gid [[thread_position_in_grid]]) {
            half4 rgba = src.read(gid);
            rgba.rgb += noise();
            dst.write(rgba, gid);
        }
        """

        let option = MTLCompileOptions()
        option.libraryType = .executable
        option.libraries = [dylib]

        let library = try! self.device.makeLibrary(source: source, options: option)
        return library
    }

    func runtime() {
        let dylib = self.compileDylib()
        let exlib = self.compileExlib(dylib: dylib)

        let pipelineDescriptor = MTLComputePipelineDescriptor()
        pipelineDescriptor.computeFunction = exlib.makeFunction(name: "OnTheFlyKernel")
        pipelineDescriptor.preloadedLibraries = [dylib]

        pipeline = try! device.makeComputePipelineState(descriptor: pipelineDescriptor, options: .bindingInfo, reflection: nil)
    }
}
Replies: 4 · Boosts: 0 · Views: 849 · Activity: Feb ’25
Performance regression: in iOS 18.2, 3D rotation in volume rendering is not as smooth as in iOS 18.1
We are a team of engineers working on an app that visualizes medical images. The app is used in situations that involve time-critical decision making for acute clinical conditions, so stability and performance are of utmost importance and directly support timely treatment. The app uses multiple libraries and tools such as vtk, WebGL, OpenGL, WebKit, gl-matrix, etc. The problem: when a 3D volume is rendered in the app and we try to rotate it, the rotation is slow, unresponsive, and laggy. Specifically, we have noticed that on iOS 18.1 the volume rotation is much smoother than on the latest iOS 18.2. Earlier we faced a somewhat similar issue with iOS 17, but it improved in iOS 18.1. This performance regression is affecting the user experience in our healthcare application. We have taken reference from the cornerstone.js code, and you can reproduce the issue using the following example: https://www.cornerstonejs.org/live-examples/volumeviewport3d

Steps to reproduce:
1. Load the above test example in Safari on an iPhone running iOS 18.2.
2. Perform volume rendering using the provided dataset.
3. Measure the time taken for each rotate or drag action on the volume.
4. Repeat the same steps on an iPhone running iOS 18.1 for comparison.

Additional information:
Device models tested: iPhone 12, iPhone 13, iPhone 14
iOS versions with issue: 18.2, 18.3 (beta)

I would appreciate any insights or suggestions on how to address this performance regression. If additional information is needed, please let me know. Thank you.
Replies: 4 · Boosts: 6 · Views: 939 · Activity: Feb ’25
Apple's Choice: USDZ over Other 3D File Formats like GLTF
Hello Dev Community, I've been thinking over Apple's preference for USDZ for AR and 3D content, especially when GLTF is so widely used. I'm keen to discuss and hear your insights on this choice. USDZ, backed by Apple, has seen a surge in the AR community. It boasts advantages like compactness, animation support, and ARKit compatibility. In contrast, GLTF is also a popular format with its own merits, like being an open standard and offering flexibility.

Here are some of my questions about the use of USDZ:
- Why did Apple choose USDZ over other 3D file formats like GLTF?
- What benefits does USDZ bring to Apple's AR and 3D content ecosystem?
- Are there any limitations of USDZ compared to other file formats?
- Could factors like compatibility, security, or ease of integration have influenced Apple's decision?

I would love to hear your thoughts on this. Feel free to share any experiences with USDZ or other 3D file formats within Apple's ecosystem!
Replies: 4 · Boosts: 1 · Views: 6k · Activity: Oct ’24
GKMatch.chooseBestHostingPlayer(_:) always returns nil player
I'm building a game with a client-server architecture. Using GKMatch.chooseBestHostingPlayer(_:) rarely works. When I started testing it today, it worked once at the very beginning, and since then it always succeeds on one client and returns nil on the other client. I'm testing with a Mac and an iPhone. Sometimes it fails on the Mac, sometimes on the iPhone. On the device that it succeeds on, the provided host can be the device itself or the other one. I created FB9583628 in August 2021, but after the Feedback Assistant team replied that they are not able to reproduce it, the feedback never went forward.

import SceneKit
import GameKit

#if os(macOS)
typealias ViewController = NSViewController
#else
typealias ViewController = UIViewController
#endif

class GameViewController: ViewController, GKMatchmakerViewControllerDelegate, GKMatchDelegate {
    var match: GKMatch?
    var matchStarted = false

    override func viewDidLoad() {
        super.viewDidLoad()
        GKLocalPlayer.local.authenticateHandler = authenticate
    }

    private func authenticate(_ viewController: ViewController?, _ error: Error?) {
        #if os(macOS)
        if let viewController = viewController {
            presentAsSheet(viewController)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            GKDialogController.shared().present(viewController)
        }
        #else
        if let viewController = viewController {
            present(viewController, animated: true)
        } else if let error = error {
            print(error)
        } else {
            print("authenticated as \(GKLocalPlayer.local.gamePlayerID)")
            let viewController = GKMatchmakerViewController(matchRequest: defaultMatchRequest())!
            viewController.matchmakerDelegate = self
            present(viewController, animated: true)
        }
        #endif
    }

    private func defaultMatchRequest() -> GKMatchRequest {
        let request = GKMatchRequest()
        request.minPlayers = 2
        request.maxPlayers = 2
        request.defaultNumberOfPlayers = 2
        request.inviteMessage = "Ciao!"
        return request
    }

    func matchmakerViewControllerWasCancelled(_ viewController: GKMatchmakerViewController) {
        print("cancelled")
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFailWithError error: Error) {
        print(error)
    }

    func matchmakerViewController(_ viewController: GKMatchmakerViewController, didFind match: GKMatch) {
        self.match = match
        match.delegate = self
        startMatch()
    }

    func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
        print("\(player.gamePlayerID) changed state to \(String(describing: state))")
        startMatch()
    }

    func startMatch() {
        let match = match!
        if matchStarted || match.expectedPlayerCount > 0 {
            return
        }
        print("starting match with local player \(GKLocalPlayer.local.gamePlayerID) and remote players \(match.players.map({ $0.gamePlayerID }))")
        match.chooseBestHostingPlayer { host in
            print("host is \(String(describing: host?.gamePlayerID))")
        }
    }
}
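A hedged workaround sketch rather than a fix for chooseBestHostingPlayer(_:) itself: since every peer sees the same player set once expectedPlayerCount reaches 0, each device can derive the same host deterministically, for example by picking the smallest gamePlayerID. The function name deterministicHost(for:) is illustrative; unlike chooseBestHostingPlayer, it does not weigh network quality.

import GameKit

// Every connected peer computes the same host with no extra round trips, because
// match.players plus the local player is the same set on all devices.
func deterministicHost(for match: GKMatch) -> GKPlayer? {
    let everyone: [GKPlayer] = match.players + [GKLocalPlayer.local]
    return everyone.min { $0.gamePlayerID < $1.gamePlayerID }
}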
Replies: 4 · Boosts: 0 · Views: 320 · Activity: Apr ’25
Transferring Apps with iCloud KVS
Hi All! I'm being asked to migrate an app which utilizes iCloud KVS (Key Value Storage). This ability is a new-ish feature, and the documentation about this is sparse [1]. Honestly, the entire documentation about the new iCloud transfer functionality seems to be missing. Same with Game Center / GameKit. While the docs say that it should work, I'd like to understand the process in more detail. Has anyone migrated an iCloud KVS app? What happens after the transfer goes through, but before the first release? Do I need to do anything special? I see that the Entitlements file has the TeamID in the Key Value store - is that fine?

<key>com.apple.developer.ubiquity-kvstore-identifier</key>
<string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>

Can someone please share their experience? Thank you!

[1] https://developer.apple.com/help/app-store-connect/transfer-an-app/overview-of-app-transfer
Replies: 4 · Boosts: 0 · Views: 2.1k · Activity: Feb ’25
What is the earliest supported macOS version that the Unity plugins will work on?
Hi. The earliest version of macOS that Unity supports is 10.13. However, it seems that running a game using the Unity plugins on 10.13 causes DLL loading exceptions whenever you try to access part of the GameKit API. The errors look like this:

Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/GameKitWrapper
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.so
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.bundle
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
  at (wrapper managed-to-native) Apple.GameKit.DefaultNSErrorHandler+Interop.DefaultNSErrorHandler_Set(Apple.Core.Runtime.NSExceptionCallback)
  at Apple.GameKit.DefaultNSErrorHandler.Init () [0x00001] in ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs:35
(Filename: ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs Line: 35)

Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/GameKitWrapper
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.so
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.bundle
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
  at (wrapper managed-to-native) Apple.GameKit.DefaultNSExceptionHandler+Interop.DefaultNSExceptionHandler_Set(Apple.Core.Runtime.NSExceptionCallback)
  at Apple.GameKit.DefaultNSExceptionHandler.Init () [0x00001] in ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs:14

These errors do not appear on 10.15 or later, which is why I am assuming it's a problem with this particular version of macOS. I have not been able to test 10.14, so I'm not sure how it behaves there. So, here is my question: what is the earliest version of macOS that the Apple Unity plugins support? It's not documented anywhere on the GitHub page. // Love
Replies: 4 · Boosts: 0 · Views: 815 · Activity: Dec ’24
Game Center challenges and activities issues for released game
We have a released game in the App Store with Game Center challenges. The challenges were working well in testing of the final version before the Game Center challenges went live. After release, the activity that should take the player to the challenge in-game results in the printout below, and we no longer get informed by GKGameActivityListener of the activity in our code, as we did before the challenges went live:

"Invalid game activity definition. Failed to kick activity notification to GameKit. Error: Error Domain=GKErrorDomain Code=17 "(null)""

Also, Game Center reports that there are no challenges in GKChallengeDefinition.all even though there are ongoing challenges. The leaderboard results do get reported to Game Center, though, and show up as challenge entries in the Games app. At the moment we are trying to figure out whether this is a problem on our end or a Game Center server issue.
Replies: 4 · Boosts: 1 · Views: 517 · Activity: 1w
GKLocalPlayer save and fetch data to iCloud issue
Hi all. I have two mysterious issues with saving and fetching data to and from iCloud. Both reproduce only after the first launch of the app.

1. [GKLocalPlayer fetchSavedGamesWithCompletionHandler:] — on the first attempt I see 0 saved games (although I know there is at least one) and no error. If I fetch one more time (without any additional actions), even immediately after the first attempt, I receive the saved games correctly (not 0).

2. [GKLocalPlayer saveGameData:withName:completionHandler:] — on the first attempt I get the error "The requested operation could not be completed because local player has not been authenticated." If I save one more time (without any additional actions), even immediately after the first attempt, the data saves successfully without any error.

I found the same issue on Stack Overflow, but there are no fixes...
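A minimal workaround sketch based on the behavior described above, not a confirmed fix: gate the first call on GKLocalPlayer.local.isAuthenticated and retry once when the initial fetch comes back empty, mirroring the observation that the second attempt succeeds. The helper name fetchSavedGamesWithRetry is illustrative.

import GameKit

func fetchSavedGamesWithRetry(completion: @escaping ([GKSavedGame]) -> Void) {
    // Calling the saved-games API before authentication has fully completed is one
    // plausible cause of the first-attempt failures described above.
    guard GKLocalPlayer.local.isAuthenticated else {
        completion([])
        return
    }
    GKLocalPlayer.local.fetchSavedGames { games, error in
        if let games = games, !games.isEmpty, error == nil {
            completion(games)
        } else {
            // The first attempt after launch sometimes reports 0 games; try once more.
            GKLocalPlayer.local.fetchSavedGames { games, _ in
                completion(games ?? [])
            }
        }
    }
}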
Replies: 4 · Boosts: 1 · Views: 1.4k · Activity: Feb ’25
Roblox freezing on macOS 26.1 beta
After updating to macOS 26.1, I encountered an issue where Roblox tends to freeze quite often, for 10 to 60 seconds at a time. This is really annoying since I play the game a lot. My theory is that it's something like a driver issue with Metal. I have reinstalled macOS, reinstalled the game, and lowered the performance settings manually, but nothing is working. Wondering if you could help, when it will be fixed, and whether others are having the same issue. Many thanks, William.
Replies: 4 · Boosts: 2 · Views: 503 · Activity: 1d
CIBumpDistortion filter not working on my view
I'm trying to apply a CIBumpDistortion Core Image filter to a view that contains a UILabel (my storyLabel). The goal is to create a visual bump/magnifying glass effect over the text. However, despite my attempts, the filter doesn't seem to render at all. The view and the label appear as normal, with no distortion effect. I've tried adjusting the filter parameters and reviewing the view hierarchy, but without success. I also haven't been able to find clear documentation or examples for applying this filter to a UIView's layer.

//
//  TVView.swift
//  Mistery
//
//  Created by Joje on 31/07/25.
//

import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit
import AVFoundation

final class TVView: UIView {

    // text animation properties
    private var textAnimationTimer: Timer?
    private var fullTextToAnimate: String = ""
    private var currentCharIndex: Int = 0

    // static video properties
    private var player: AVQueuePlayer?
    private var playerLayer: AVPlayerLayer?
    private var playerLooper: AVPlayerLooper?

    var onNextButtonTap: () -> Void = {}

    // MARK: - Subviews

    // TV image
    private(set) lazy var tvImageView: UIImageView = {
        let imageView = UIImageView()
        imageView.translatesAutoresizingMaskIntoConstraints = false
        imageView.image = UIImage(named: "tvFinal")
        imageView.contentMode = .scaleAspectFit
        return imageView
    }()

    // text shown inside the TV
    private(set) lazy var storyLabel: UILabel = {
        let label = UILabel()
        label.translatesAutoresizingMaskIntoConstraints = false
        //label.backgroundColor = .gray
        label.textColor = .red
        label.font = UIFont(name: "MeltedMonster", size: 30)
        label.textAlignment = .left
        label.numberOfLines = 0
        label.text = ""
        return label
    }()

    private(set) lazy var nextButton: UIButton = {
        let button = UIButton(type: .system)
        button.translatesAutoresizingMaskIntoConstraints = false
        //button.backgroundColor = .darkGray
        button.addTarget(self, action: #selector(didPressNextButton), for: .touchUpInside)
        return button
    }()

    // MARK: - Lifecycle

    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .black
        setupVideoPlayer()
        addSubviews()
        setupConstraints()
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer?.frame = tvImageView.frame.insetBy(dx: tvImageView.frame.width * 0.05, dy: tvImageView.frame.height * 0.18)
        setupFisheyeEffect()
    }

    private func setupFisheyeEffect() {
        // create the filter
        guard let filter = CIFilter(name: "CIBumpDistortion") else { return print("erro") }

        storyLabel.layer.shouldRasterize = true
        storyLabel.layer.rasterizationScale = UIScreen.main.scale

        // set the parameters
        filter.setDefaults()

        // effect center
        let center = CIVector(x: storyLabel.bounds.midX, y: storyLabel.bounds.midY)
        filter.setValue(center, forKey: kCIInputCenterKey)

        // distortion radius
        filter.setValue(storyLabel.bounds.width, forKey: kCIInputRadiusKey)

        // distortion intensity
        filter.setValue(7, forKey: kCIInputScaleKey)

        storyLabel.layer.filters = [filter]
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // MARK: - Button actions

    @objc private func didPressNextButton() {
        onNextButtonTap()
    }

    @objc private func animateNextCharacter() {
        guard currentCharIndex < fullTextToAnimate.count else {
            textAnimationTimer?.invalidate()
            return
        }
        let currentTextIndex = fullTextToAnimate.index(fullTextToAnimate.startIndex, offsetBy: currentCharIndex)
        let partialText = String(fullTextToAnimate[...currentTextIndex])
        storyLabel.text = partialText
        currentCharIndex += 1
    }

    public func updateStoryText(with text: String) {
        textAnimationTimer?.invalidate()
        storyLabel.text = ""
        fullTextToAnimate = text
        currentCharIndex = 0
        textAnimationTimer = Timer.scheduledTimer(timeInterval: 0.12, target: self, selector: #selector(animateNextCharacter), userInfo: nil, repeats: true)
    }

    // MARK: - Setup methods

    private func setupVideoPlayer() {
        guard let videoURL = Bundle.main.url(forResource: "static-video", withExtension: "mov") else {
            print("Erro: Não foi possível encontrar o arquivo de vídeo static-video.mov")
            return
        }
        let playerItem = AVPlayerItem(url: videoURL)
        player = AVQueuePlayer(playerItem: playerItem)

        // LINE WITH POSSIBLE ERROR
        playerLooper = AVPlayerLooper(player: player!, templateItem: playerItem)

        playerLayer = AVPlayerLayer(player: player)
        playerLayer?.videoGravity = .resizeAspectFill
        if let layer = playerLayer {
            self.layer.addSublayer(layer)
        }
        player?.play()
    }

    private func addSubviews() {
        self.addSubview(storyLabel)
        self.addSubview(tvImageView)
        self.addSubview(nextButton)
    }

    private func setupConstraints() {
        NSLayoutConstraint.activate([
            // TV Image
            tvImageView.centerXAnchor.constraint(equalTo: centerXAnchor),
            tvImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            tvImageView.widthAnchor.constraint(equalTo: widthAnchor),

            // TV Text
            storyLabel.centerXAnchor.constraint(equalTo: tvImageView.centerXAnchor, constant: -50),
            storyLabel.centerYAnchor.constraint(equalTo: tvImageView.centerYAnchor, constant: -25),
            storyLabel.widthAnchor.constraint(equalTo: tvImageView.widthAnchor, multiplier: 0.35),
            storyLabel.heightAnchor.constraint(equalTo: tvImageView.heightAnchor, multiplier: 0.42),

            // TV Button
            nextButton.topAnchor.constraint(equalTo: tvImageView.centerYAnchor, constant: -25),
            nextButton.centerXAnchor.constraint(equalTo: self.centerXAnchor, constant: 190),
            nextButton.widthAnchor.constraint(equalToConstant: 100),
            nextButton.heightAnchor.constraint(equalToConstant: 160)
        ])
    }
}

#Preview {
    ViewController()
}
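For context, CALayer.filters has historically been documented as unsupported on iOS, which would explain why the label renders with no distortion. A hedged alternative sketch (not part of the original view): snapshot the label into an image, run CIBumpDistortion on it with a CIContext, and display the result, for example in a UIImageView layered over the TV image. The helper name applyBumpDistortion(to:) and its parameter values are illustrative.

import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

func applyBumpDistortion(to label: UILabel) -> UIImage? {
    // Snapshot the label's current contents into a bitmap.
    let renderer = UIGraphicsImageRenderer(bounds: label.bounds)
    let snapshot = renderer.image { context in
        label.layer.render(in: context.cgContext)
    }
    guard let input = CIImage(image: snapshot) else { return nil }

    // Configure the bump distortion centered on the label.
    let filter = CIFilter.bumpDistortion()
    filter.inputImage = input
    filter.center = CGPoint(x: input.extent.midX, y: input.extent.midY)
    filter.radius = Float(input.extent.width / 2)
    filter.scale = 0.7

    // Render the filtered image back to a UIImage for display (e.g. in a UIImageView).
    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: snapshot.scale, orientation: .up)
}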
Replies: 3 · Boosts: 0 · Views: 114 · Activity: Sep ’25
Unable to play .aivu with VideoPlayerComponent
I'm trying to play an Apple Immersive Video in the .aivu format using VideoPlayerComponent, following the official documentation here: https://developer.apple.com/documentation/RealityKit/VideoPlayerComponent

Here is a simplified version of the code I'm running in another application:

import SwiftUI
import RealityKit
import AVFoundation

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            let player = AVPlayer(url: Bundle.main.url(forResource: "Apple_Immersive_Video_Beach", withExtension: "aivu")!)
            let videoEntity = Entity()
            var videoPlayerComponent = VideoPlayerComponent(avPlayer: player)
            videoPlayerComponent.desiredImmersiveViewingMode = .full
            videoPlayerComponent.desiredViewingMode = .stereo
            player.play()
            videoEntity.components.set(videoPlayerComponent)
            content.add(videoEntity)
        }
    }
}

Full code is here: https://github.com/tomkrikorian/AIVU-VideoPlayerComponentIssueSample

But the video does not play in my project even though the file is correct (it can be played in Apple Immersive Video Utility), and I'm getting this error when the app crashes:

App VideoPlayer+Component Caption: onComponentDidUpdate Media Type is invalid
Domain=SpatialAudioServicesErrorDomain Code=2020631397 "xpc error" UserInfo={NSLocalizedDescription=xpc error}
CA_UISoundClient.cpp:436 Got error -4 attempting to SetIntendedSpatialAudioExperience
[0x101257490|InputElement #0|Initialize] Number of channels = 0 in AudioChannelLayout does not match number of channels = 2 in stream format.

The video I'm using is the official sample that can be found here, but I have tried several different files shot by my clients and the same errors are displayed, so the issue is definitely not the files but on the RealityKit side of things: https://developer.apple.com/documentation/immersivemediasupport/authoring-apple-immersive-video

Steps to reproduce the issue:
- Open the AIVUPlayerSample project and run it. Look at the logs.
- All code can be found in ImmersiveView.swift.
- The sample file is included in the project.

Expected results: if I followed the documentation and samples provided, I should see my video played in full immersive mode inside my ImmersiveSpace.

Am I doing something wrong in the code? I'm basically following the documentation here. Feedback ticket: FB19971306
Replies: 3 · Boosts: 0 · Views: 734 · Activity: Aug ’25
Moving from SceneKit - fog missing
I am rewriting an unfinished SceneKit project in RealityKit (non-AR). As far as I can see, RealityKit is missing basic fog functionality. Fog was simple and easy to implement in SceneKit (fogStartDistance / fogEndDistance / fogDensityExponent / fogColor). Are there any plans to implement something like this in RealityKit? Are there any simple workarounds?
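While waiting for a built-in equivalent, here is a minimal sketch of the math behind SceneKit's fog settings, which a custom shader or post-process pass in RealityKit would need to reproduce. This is plain arithmetic, not a RealityKit API, and the helper name fogBlend is illustrative.

import Foundation
import simd

// Linear fog with an optional density exponent, matching the roles of
// fogStartDistance, fogEndDistance, fogDensityExponent, and fogColor in SceneKit.
func fogBlend(fragmentColor: SIMD3<Float>, fogColor: SIMD3<Float>,
              distance: Float, start: Float, end: Float, exponent: Float = 1) -> SIMD3<Float> {
    // 0 before the fog starts, 1 once it is fully saturated.
    let t = min(max((distance - start) / (end - start), 0), 1)
    let amount = pow(t, exponent)
    return fragmentColor + (fogColor - fragmentColor) * amount
}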
Replies: 3 · Boosts: 1 · Views: 551 · Activity: Aug ’25
MDLAsset loads usdz texture with the wrong color space
I have a very basic usdz file from this repo. I call loadTextures() after loading the usdz via MDLAsset. Inspecting the MDLTexture object, I can tell it is assigned a color space of linear RGB instead of sRGB, although the image file in the usdz is sRGB. This causes the textures to ultimately render as oversaturated. In the code I later convert the MDLTexture to an MTLTexture via MTKTextureLoader, but if I set the SRGB loader option it seems to be ignored. This significantly impacts the usefulness of Model I/O if it can't load a simple usdz texture correctly. Am I missing something? Thanks!
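A hedged workaround sketch rather than a Model I/O fix: if MTKTextureLoader hands back a linear rgba8Unorm texture for image data that is really sRGB, the same storage can be reinterpreted through an sRGB texture view so sampling applies the sRGB-to-linear conversion. This assumes the loaded texture really is rgba8Unorm; other formats would need their own _srgb counterparts, and depending on how the texture was created it may need the .pixelFormatView usage flag.

import Metal

// Reinterpret a linear RGBA8 texture as sRGB without copying the pixel data.
func srgbView(of texture: MTLTexture) -> MTLTexture? {
    guard texture.pixelFormat == .rgba8Unorm else { return texture }
    return texture.makeTextureView(pixelFormat: .rgba8Unorm_srgb)
}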
Replies: 3 · Boosts: 3 · Views: 810 · Activity: Dec ’24
MetalFX
Recently, I adopted MetalFX for its upscaling feature. However, I have encountered a persistent build failure for the iOS Simulator with the error message "MetalFX is not available when building for iOS Simulator." To address this, I changed the MetalFX.framework status to "Optional" under Build Phases > Link Binary With Libraries, adding the linker option (-weak_framework). Despite this adjustment, the build process continues to fail. Furthermore, I observed that the MetalFX sample application provided by Apple, specifically the one found at https://developer.apple.com/documentation/metalfx/applying-temporal-antialiasing-and-upscaling-using-metalfx, also fails to build for the iOS Simulator target. Has anyone encountered this issue?
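A hedged sketch of one common way to keep a simulator target building when a framework is unavailable there: compile the MetalFX path out with canImport/targetEnvironment checks and fall back to a plain bilinear pass (not shown). The spatial-scaler setup mirrors the MTLFXSpatialScalerDescriptor API; treat the chosen formats and processing mode as illustrative.

#if canImport(MetalFX) && !targetEnvironment(simulator)
import Metal
import MetalFX

// Only built for device targets; the simulator falls through to whatever
// non-MetalFX upscale path the app already has.
func makeSpatialScaler(device: MTLDevice,
                       inputWidth: Int, inputHeight: Int,
                       outputWidth: Int, outputHeight: Int) -> MTLFXSpatialScaler? {
    let descriptor = MTLFXSpatialScalerDescriptor()
    descriptor.inputWidth = inputWidth
    descriptor.inputHeight = inputHeight
    descriptor.outputWidth = outputWidth
    descriptor.outputHeight = outputHeight
    descriptor.colorTextureFormat = .rgba16Float
    descriptor.outputTextureFormat = .rgba16Float
    descriptor.colorProcessingMode = .perceptual
    return descriptor.makeSpatialScaler(device: device)
}
#endif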
Replies: 3 · Boosts: 0 · Views: 686 · Activity: Mar ’25