Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics

Post · Replies · Boosts · Views · Activity
Xcode memory meter goes too high while programmatically converting a USDZ file to an SCN file
We have a requirement to add a USDZ file's content to a UIView, display it, archive the data, and save it to a file. When the user opens the file, we need to unarchive that USDZ content, bind it to the UIView, and show it to the user. Initially, we created the SCNScene object from the USDZ file URL like this:

```swift
do {
    usdzScene = try SCNScene(url: usdzUrl)
} catch let error as NSError {
    print(error)
}
```

Since SCNScene conforms to the NSSecureCoding protocol, we archive that object directly, save it to a file, and later load it back from the file and recreate the SCNScene object using NSKeyedUnarchiver. But for some files, we noticed high memory consumption while archiving the SCNScene object with the line below:

```swift
func encode(with coder: NSCoder) {
    coder.encode(self.scnScene, forKey: "scnScene")
}
```

File reference link: toy_drummer_idle.usdz

The Apple documentation (see the Discussion section) says the .scn file extension is the fastest format for SceneKit to process, faster than USDZ. So we used SCNScene's write(to:) feature to create an .scn file from the given USDZ file. After that, archiving an SCNScene created from the .scn file URL is much faster and no longer consumes a lot of memory; it is really much faster than the previous case. But unfortunately, the SCNScene write(to:) method takes a lot of time for this conversion, the memory meter climbs, and it can even cause the app to crash. I checked the output file size as well: the given USDZ file is 18 MB, while the generated .scn file is 483 MB. But archiving that SCNScene is fast. Please analyse this case and provide some guidance on how we can optimise this behaviour. I really appreciate your feedback.

Full code:

```swift
import UIKit
import SceneKit

class ViewController: UIViewController {
    var scnView: SCNView?
    var usdzScene: SCNScene?
    var scnScene: SCNScene?

    lazy var exportButton: UIButton = {
        let btn = UIButton(type: UIButton.ButtonType.system)
        btn.tag = 1
        btn.backgroundColor = UIColor.blue
        btn.addTarget(self, action: #selector(buttonPressed(_:)), for: .touchUpInside)
        btn.setTitle("USDZ to SCN", for: .normal)
        btn.setTitleColor(.white, for: .normal)
        btn.layer.borderColor = UIColor.gray.cgColor
        btn.titleLabel?.font = .systemFont(ofSize: 20)
        btn.translatesAutoresizingMaskIntoConstraints = false
        return btn
    }()

    func deleteTempDirectory(directoryName: String) {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let tempDirectory = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if FileManager.default.fileExists(atPath: URL(string: tempDirectory.absoluteString)!.path) {
            do {
                try FileManager.default.removeItem(at: tempDirectory)
            } catch let error as NSError {
                print(error)
            }
        }
    }

    func createTempDirectory(directoryName: String) -> URL? {
        let tempDirectoryUrl = URL(fileURLWithPath: NSTemporaryDirectory())
        let toBeCreatedDirectoryUrl = tempDirectoryUrl.appendingPathComponent(directoryName, isDirectory: true)
        if !FileManager.default.fileExists(atPath: URL(string: toBeCreatedDirectoryUrl.absoluteString)!.path) {
            do {
                try FileManager.default.createDirectory(at: toBeCreatedDirectoryUrl, withIntermediateDirectories: true, attributes: nil)
            } catch let error as NSError {
                print(error)
                return nil
            }
        }
        return toBeCreatedDirectoryUrl
    }

    @IBAction func buttonPressed(_ sender: UIButton) {
        let scnFolderName = "SCN"
        let scnFileName = "3D"
        deleteTempDirectory(directoryName: scnFolderName)
        guard let scnDirectoryUrl = createTempDirectory(directoryName: scnFolderName) else { return }
        let scnFileUrl = scnDirectoryUrl.appendingPathComponent(scnFileName).appendingPathExtension("scn")
        guard let usdzScene else { return }
        let result = usdzScene.write(to: scnFileUrl, options: nil, delegate: nil, progressHandler: nil)
        if result {
            print("exporting process is success.")
        } else {
            print("exporting process is failed.")
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let usdzUrl: URL? = Bundle.main.url(forResource: "toy_drummer_idle", withExtension: "usdz")
        guard let usdzUrl else { return }
        do {
            usdzScene = try SCNScene(url: usdzUrl)
        } catch let error as NSError {
            print(error)
        }
        guard let usdzScene else { return }
        scnView = SCNView(frame: .zero)
        guard let scnView else { return }
        scnView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(scnView)
        self.view.addSubview(exportButton)
        NSLayoutConstraint.activate([
            scnView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
            scnView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor),
            scnView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            scnView.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -30),
            exportButton.widthAnchor.constraint(equalToConstant: 200),
            exportButton.heightAnchor.constraint(equalToConstant: 40),
            exportButton.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
            exportButton.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
        ])
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.01) { [weak self] in
            guard let self else { return }
            loadModel(scene: usdzScene)
        }
    }

    func loadModel(scene: SCNScene) {
        guard let scnView else { return }
        scnView.autoenablesDefaultLighting = true
        scnView.scene = scene
        scnView.allowsCameraControl = true
    }
}
```

(Attachment: Screenshot 2024-06-20 at 12.23.40.png, https://developer.apple.com/forums/content/attachment/5ff0197b-7e0b-4efe-b295-cc205c73dc02)
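Editor's note, a minimal sketch rather than an official recommendation: one direction worth measuring is skipping the intermediate .scn export entirely and archiving the USDZ-backed SCNScene straight to Data with NSKeyedArchiver, doing the work off the main thread inside an autoreleasepool so transient buffers are released promptly, and optionally compressing the archive before writing to offset the size blow-up. The helper name and queue choice below are made up for illustration.

```swift
import Foundation
import SceneKit

/// Hypothetical helper: archive an SCNScene on a background queue so the main thread stays responsive.
func archiveScene(_ scene: SCNScene, to url: URL, completion: @escaping (Error?) -> Void) {
    DispatchQueue.global(qos: .utility).async {
        autoreleasepool {
            do {
                // SCNScene conforms to NSSecureCoding, so it can be archived directly to Data.
                let data = try NSKeyedArchiver.archivedData(withRootObject: scene,
                                                            requiringSecureCoding: true)
                // Optional: compress before writing; LZFSE is a reasonable default on Apple platforms.
                let compressed = try (data as NSData).compressed(using: .lzfse) as Data
                try compressed.write(to: url, options: .atomic)
                DispatchQueue.main.async { completion(nil) }
            } catch {
                DispatchQueue.main.async { completion(error) }
            }
        }
    }
}
```

Whether this avoids the memory spike for the problematic assets would need to be verified in Instruments against the same toy_drummer_idle.usdz file.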
Replies: 0 · Boosts: 0 · Views: 128 · Activity: 1w
USDZ with vertex color
Hello, I have a USDC file with vertex color (WITHOUT textures), and it displays perfectly in Preview. If I package it in a zip (without compression) and rename the resulting file to .usdz, I can view it without any issues on Apple Vision Pro and on the Mac. However, if I send it to an iPhone, the vertex color does not display. Is there anything else I need to do besides packaging the USDC without compression in a ZIP? Thank you very much.
Replies: 2 · Boosts: 0 · Views: 147 · Activity: 2w
Game Porting Toolkit 1.0.3 to 1.1 update error
It shows the following:

```
==> Upgrading apple/apple/game-porting-toolkit 1.0.3 -> 1.1
==> Staging /Users/mtahin07/Library/Caches/Homebrew/downloads/7baed2a6fd34b4a641
==> Patching
==> /private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/configure --pre
==> make
Last 15 lines from /Users/mtahin07/Library/Logs/Homebrew/game-porting-toolkit/02.make:
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:641:23: error: use of undeclared identifier 'noErr'
    if (status == noErr)
                  ^
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:647:31: warning: this function declaration is not a prototype [-Wstrict-prototypes]
    if ((status = SecItemExport(cert, kSecFormatX509Cert, 0, NULL, &certData)) == noErr)
                  ^
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:647:51: error: use of undeclared identifier 'kSecFormatX509Cert'
    if ((status = SecItemExport(cert, kSecFormatX509Cert, 0, NULL, &certData)) == noErr)
                                      ^
/private/tmp/game-porting-toolkit-20240619-16666-si8pyg/wine/dlls/crypt32/unixlib.c:647:95: error: use of undeclared identifier 'noErr'
    if ((status = SecItemExport(cert, kSecFormatX509Cert, 0, NULL, &certData)) == noErr)
                                                                                  ^
2 warnings and 7 errors generated.
make: *** [dlls/crypt32/unixlib.o] Error 1
make: *** Waiting for unfinished jobs....
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core): apple/apple
```

I can't work out what error is occurring.
Replies: 1 · Boosts: 0 · Views: 209 · Activity: 1w
Elapsed scene time used in custom shader uniform
If you create a custom shader you get access to a collection of uniform values; one is the uniforms::time() parameter, which is defined as "the number of seconds that have elapsed since RealityKit began rendering the current scene" in this doc: https://developer.apple.com/metal/Metal-RealityKit-APIs.pdf Is there some way to get this value from Swift code? I want to animate a value in my shader based on the time, so I need the starting time value in order to interpolate the animation offset from that point. If I create a System, its update() function gives me a SceneUpdateContext instance, which has a deltaTime property but not an elapsedTime property that I would expect to map to the shader's time() value.
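Editor's note, a sketch rather than a confirmed mapping to the shader's clock: one workaround is to accumulate deltaTime yourself in a custom System, so Swift code has its own elapsed-time value to interpolate against. The component and system names below are made up, and both types must be registered (ShaderClockComponent.registerComponent(), ShaderClockSystem.registerSystem()) before the scene starts.

```swift
import RealityKit
import Foundation

// Hypothetical component that stores the accumulated scene time for an entity.
struct ShaderClockComponent: Component {
    var elapsed: TimeInterval = 0
}

// Hypothetical system that advances the clock every frame using context.deltaTime.
struct ShaderClockSystem: System {
    static let query = EntityQuery(where: .has(ShaderClockComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard var clock = entity.components[ShaderClockComponent.self] else { continue }
            clock.elapsed += context.deltaTime
            entity.components.set(clock)
            // clock.elapsed can now be fed into a custom material parameter or used
            // to compute an animation offset relative to the shader's time() value.
        }
    }
}
```

Note that this CPU-side clock starts when the system first updates, not necessarily at the exact instant RealityKit began rendering, so a small constant offset against uniforms::time() may remain.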
Replies: 0 · Boosts: 0 · Views: 179 · Activity: 1w
App using MetalKit creates many IOSurfaces in rapid succession, causing MTKView to freeze and app to hang
I've got an iOS app that is using MetalKit to display raw video frames coming in from a network source. I read the pixel data in the packets into a single MTLTexture rows at a time, which is drawn into an MTKView each time a frame has been completely sent over the network. The app works, but only for several seconds (a seemingly random duration), before the MTKView seemingly freezes (while packets are still being received). Watching the debugger while my app was running revealed that the freezing of the display happened when there was a large spike in memory. Seeing the memory profile in Instruments revealed that the spike was related to a rapid creation of many IOSurfaces and IOAccelerators. Profiling CPU usage shows that CAMetalLayerPrivateNextDrawableLocked is what happens during this rapid creation of surfaces. What does this function do? Being a complete newbie to iOS programming as a whole, I wonder if this issue comes from a misuse of the MetalKit library. Below is the code that I'm using to render the video frames themselves:

```swift
class MTKViewController: UIViewController, MTKViewDelegate {
    /// Metal texture to be drawn whenever the view controller is asked to render its view.
    private var metalView: MTKView!
    private var device = MTLCreateSystemDefaultDevice()
    private var commandQueue: MTLCommandQueue?
    private var renderPipelineState: MTLRenderPipelineState?
    private var texture: MTLTexture?
    private var networkListener: NetworkListener!
    private var textureGenerator: TextureGenerator!

    override public func loadView() {
        super.loadView()
        assert(device != nil, "Failed creating a default system Metal device. Please, make sure Metal is available on your hardware.")
        initializeMetalView()
        initializeRenderPipelineState()
        networkListener = NetworkListener()
        textureGenerator = TextureGenerator(width: streamWidth, height: streamHeight, bytesPerPixel: 4, rowsPerPacket: 8, device: device!)
        networkListener.start(port: NWEndpoint.Port(8080))
        networkListener.dataRecievedCallback = { data in
            self.textureGenerator.process(data: data)
        }
        textureGenerator.onTextureBuiltCallback = { texture in
            self.texture = texture
            self.draw(in: self.metalView)
        }
        commandQueue = device?.makeCommandQueue()
    }

    public func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        /// need implement?
    }

    public func draw(in view: MTKView) {
        guard let texture = texture, let _ = device else { return }
        let commandBuffer = commandQueue!.makeCommandBuffer()!
        guard let currentRenderPassDescriptor = metalView.currentRenderPassDescriptor,
              let currentDrawable = metalView.currentDrawable,
              let renderPipelineState = renderPipelineState else { return }
        currentRenderPassDescriptor.renderTargetWidth = streamWidth
        currentRenderPassDescriptor.renderTargetHeight = streamHeight
        let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: currentRenderPassDescriptor)!
        encoder.pushDebugGroup("RenderFrame")
        encoder.setRenderPipelineState(renderPipelineState)
        encoder.setFragmentTexture(texture, index: 0)
        encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4, instanceCount: 1)
        encoder.popDebugGroup()
        encoder.endEncoding()
        commandBuffer.present(currentDrawable)
        commandBuffer.commit()
    }

    private func initializeMetalView() {
        metalView = MTKView(frame: CGRect(x: 0, y: 0, width: streamWidth, height: streamWidth), device: device)
        metalView.delegate = self
        metalView.framebufferOnly = true
        metalView.colorPixelFormat = .bgra8Unorm
        metalView.contentScaleFactor = UIScreen.main.scale
        metalView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.insertSubview(metalView, at: 0)
    }

    /// initializes render pipeline state with a default vertex function mapping texture to the view's frame and a simple fragment function returning texture pixel's value.
    private func initializeRenderPipelineState() {
        guard let device = device, let library = device.makeDefaultLibrary() else { return }
        let pipelineDescriptor = MTLRenderPipelineDescriptor()
        pipelineDescriptor.rasterSampleCount = 1
        pipelineDescriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
        pipelineDescriptor.depthAttachmentPixelFormat = .invalid
        /// Vertex function to map the texture to the view controller's view
        pipelineDescriptor.vertexFunction = library.makeFunction(name: "mapTexture")
        /// Fragment function to display texture's pixels in the area bounded by vertices of `mapTexture` shader
        pipelineDescriptor.fragmentFunction = library.makeFunction(name: "displayTexture")
        do {
            renderPipelineState = try device.makeRenderPipelineState(descriptor: pipelineDescriptor)
        } catch {
            assertionFailure("Failed creating a render state pipeline. Can't render the texture without one.")
            return
        }
    }
}
```

My question is simply: what gives?
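Editor's note, a hedged guess at the cause with a minimal sketch: calling draw(in:) directly from the network callback requests a new drawable for every completed frame, off the main thread and potentially faster than the display can present them, which is one way drawable/IOSurface allocations can pile up behind CAMetalLayerPrivateNextDrawableLocked. One alternative, an assumption rather than a confirmed fix, is to let MTKView drive the draw loop and only publish the latest texture. The extension below assumes it lives in the same file as MTKViewController so it can reach the private properties.

```swift
import MetalKit

extension MTKViewController {
    /// Sketch: let MTKView's own display-link-driven loop pull the most recent texture
    /// instead of triggering draw(in:) from the network thread.
    func configureDrawLoop() {
        metalView.isPaused = false              // MTKView redraws on its own cadence
        metalView.enableSetNeedsDisplay = false // driven by the internal display link
        metalView.preferredFramesPerSecond = 30 // cap the rate at which drawables are requested

        textureGenerator.onTextureBuiltCallback = { [weak self] texture in
            // Only hand the texture over; MTKView will call draw(in:) on the main thread.
            DispatchQueue.main.async { self?.texture = texture }
        }
    }
}
```

With this arrangement, draw(in:) only ever runs once per display refresh and each pass acquires at most one drawable, which should keep the number of in-flight IOSurfaces bounded.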
Replies: 0 · Boosts: 0 · Views: 176 · Activity: 1w
Multiplayer testing with the new TabletopKit
Hey, I just watched and started investigating the new TabletopKit framework, which looks fantastic. I'm looking at how multiplayer can be tested. I found info about testing on multiple real devices here: https://developer.apple.com/documentation/tabletopkit/tabletopkitsample#Start-a-multiplayer-game-on-devices I wanted to ask about options for testing multiplayer on simulators, maybe with a simulator plus a single real device. Unfortunately, having multiple Vision Pros for indie development is unrealistic, so I hope there are ways to do it.
Replies: 2 · Boosts: 0 · Views: 325 · Activity: 2w
Triangle count and texture size budget for RealityKit on visionOS
In the past, Apple recommended restricting USDZ models to a maximum of 100,000 triangles and a texture size of 2048x2048 for Apple QuickLook (and I think for RealityKit on iOS in general). Does Apple have any recommended maximum polygon counts for visionOS? Are they the same for models running in a volumetric window in the shared space and in an ImmersiveSpace? What is the recommended texture size for visionOS? (I seem to recall 8192x8192, but I can't find it now.)
Replies: 2 · Boosts: 0 · Views: 408 · Activity: May ’24
Error: apple/apple/game-porting-toolkit 1.1 did not build
I was trying to update GPTK, so I removed the old one (installed with Xcode 15.0) and reinstalled it, but ran into an issue:

```
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/02.make
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core): apple/apple
Error: Your Xcode (15.0) is outdated. Please update to Xcode 15.4 (or delete it). Xcode can be updated from the App Store.
```

So I upgraded Xcode to 15.4 via the App Store and still encountered:

```
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/02.make
/Users/CASEM/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core): apple/apple
```

I've read some posts that say to use Xcode 15.1, but I still get the same error. What should I do next? Any help would be appreciated.
Replies: 0 · Boosts: 0 · Views: 263 · Activity: 1w
Couldn't link games to Game Center
My Game Center account is linked on both my iPad and my iPhone, but the activity on Game Center doesn't update when I play games on my iPad; it is only linked to games on my phone. So none of the games I've downloaded on my iPad are linked to Game Center. Is there any solution so that the games I've downloaded on my iPad can be linked to Game Center? I have tried turning it on and off on both my iPad and my iPhone, but it is still only linked to the phone.
Replies: 0 · Boosts: 0 · Views: 105 · Activity: 1w
SceneKit or RealityKit for Non-AR Game Development
Hi everyone, I'm choosing a framework for developing a game that doesn't involve augmented reality (AR) and I'm unsure whether to use SceneKit or RealityKit. I would like to hear from Apple engineers on this matter. Which of these frameworks is better suited for creating non-AR games? Additionally, I'd like to know if it's possible to disable AR in RealityKit using the updated RealityView? Thanks in advance for your insights and recommendations!
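Editor's note, not an answer from Apple, but for context: RealityKit can already run without AR on iOS through ARView's non-AR camera mode. A minimal sketch of such a setup (assuming a UIKit host; the controller and entity names are illustrative) follows; whether RealityView offers an equivalent switch is exactly the open question above.

```swift
import UIKit
import RealityKit

// Minimal non-AR RealityKit scene: no ARSession, a virtual camera, one placeholder model.
final class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // .nonAR renders against a plain background with a user-supplied camera.
        let arView = ARView(frame: view.bounds,
                            cameraMode: .nonAR,
                            automaticallyConfigureSession: false)
        arView.environment.background = .color(.black)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        // A perspective camera entity stands in for the device camera.
        let cameraAnchor = AnchorEntity(world: .zero)
        let camera = PerspectiveCamera()
        camera.position = [0, 1.5, 3]
        cameraAnchor.addChild(camera)

        // A placeholder model so the scene renders something visible.
        let box = ModelEntity(mesh: .generateBox(size: 0.5),
                              materials: [SimpleMaterial(color: .red, isMetallic: false)])
        cameraAnchor.addChild(box)

        arView.scene.addAnchor(cameraAnchor)
    }
}
```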
Replies: 2 · Boosts: 0 · Views: 346 · Activity: 1w
Reality Composer Pro inline documentation comments
The WWDC24 video "Build a spatial drawing app with RealityKit" https://developer.apple.com/wwdc24/10104 at 12:04 includes a slide showing a Reality Composer Pro shader graph that features wonderful inline documentation comment boxes. Are shader graph inline comments a new feature that Reality Composer Pro supports? This would be extraordinarily useful, as complex shader graphs can be challenging to decipher. If so, how are inline shader graph comments created in Reality Composer Pro?
Replies: 1 · Boosts: 0 · Views: 205 · Activity: 1w
How to create screen-space meshes selectively in RealityKit AR mode using the new OrthographicCameraComponent?
I'd like to create meshes in RealityKit (AR mode on iPad) in screen space, i.e. for UI. I noticed a lot of useful new functionality in RealityKit for the next OS versions, including the OrthographicCameraComponent here: https://developer.apple.com/documentation/realitykit/orthographiccameracomponent?changes=_3 I think this would help, but I need AR world tracking as well as a regular perspective camera to work with the 3D elements. Firstly, can I have a camera attached selectively to a few entities, just for those entities? This could be the orthographic camera. Secondly, can I make it so those entities are always rendered in front, in screen space? (They'd need to follow the camera.) If I can't have multiple cameras, what can be done in that case? Is it actually better to use a completely different view / API for layering on top of RealityKit? I would much rather keep everything in RealityKit, however, for simplicity.
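Editor's note, a sketch of the "separate layer" alternative mentioned at the end of the question, not a statement about what OrthographicCameraComponent will support: world-tracked content stays in RealityKit while screen-space UI is drawn as a SwiftUI overlay on top of the ARView. The view names are illustrative.

```swift
import SwiftUI
import RealityKit
import ARKit

// World-tracked RealityKit content hosted in SwiftUI.
struct ARContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.run(ARWorldTrackingConfiguration())
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}

struct ScreenSpaceUIView: View {
    var body: some View {
        ARContainer()
            .ignoresSafeArea()
            .overlay(alignment: .bottom) {
                // Screen-space "UI mesh" stand-in: ordinary SwiftUI, always rendered in front.
                Capsule()
                    .fill(.ultraThinMaterial)
                    .frame(width: 220, height: 44)
                    .overlay(Text("Screen-space UI"))
                    .padding(.bottom, 24)
            }
    }
}
```

This sidesteps the multiple-camera question entirely, at the cost of the UI elements being 2D views rather than RealityKit meshes.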
Replies: 0 · Boosts: 0 · Views: 217 · Activity: 1w
Using a scene from Reality Composer Pro in an iOS app?
I am trying to establish a workflow for using Reality Composer Pro to make scenes; I am grey-boxing a scene using primitives at the moment. I have set up a cube with a texture material and a simple animation to spin. I am confused as to what I should be loading. I have created what I think is a scene asset in the package for the Reality Composer Pro project. Here is a code snippet:

```swift
struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                let scene = try await ModelEntity(named: "HOF")
                content.add(scene)
            } catch {
                print("Error loading scene: \(error.localizedDescription)")
            }
        }
    }
}
```

Here is the project layout in Reality Composer Pro:
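Editor's note, a hedged pointer rather than a confirmed answer, and assuming the Reality Composer Pro project is included as the Xcode-template Swift package named RealityKitContent (yours may be named differently): the scene usually has to be loaded from that package's generated bundle rather than the main bundle, and loading it as a plain Entity avoids requiring the root to be a model.

```swift
import SwiftUI
import RealityKit
import RealityKitContent   // assumed package name from the template; adjust to your project

struct ContentView: View {
    var body: some View {
        RealityView { content in
            do {
                // "HOF" is the scene name from the original post; realityKitContentBundle
                // is the bundle accessor generated by the RealityKitContent package.
                let scene = try await Entity(named: "HOF", in: realityKitContentBundle)
                content.add(scene)
            } catch {
                print("Error loading scene: \(error.localizedDescription)")
            }
        }
    }
}
```

Note that RealityView itself requires a recent OS (visionOS, or iOS 18 and later), so an older iOS deployment target would need ARView or Model3D instead.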
Replies: 1 · Boosts: 0 · Views: 208 · Activity: 1w
copyFromBuffer offset and size working even when not multiple of 4
Hi, reading the copyFromBuffer documentation, it states that on macOS, sourceOffset, destinationOffset, and size "needs to be a multiple of 4, but can be any value in iOS and tvOS". However, I have noticed that, at least on my M2 Max, this limitation does not seem to exist: there are no warnings and the copy works correctly regardless of the offset value. I'm curious to know if this is something that should still be avoided. Is the multiple-of-4 limitation reserved for non-Apple-silicon devices, so that note can be ignored on Apple silicon? I ask because I am a contributor to Metal.jl, and recently noticed that our tests pass even when copying using copyWithBuffer with offsets and sizes that are not multiples of 4. If that could cause issues/correctness problems, we would need to fix that. Thank you. Christian
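Editor's note, for reference only: a minimal Swift sketch of the call being discussed, with deliberately non-multiple-of-4 offsets and size. The documentation says these should be multiples of 4 on macOS; whether ignoring that on Apple silicon is actually safe is exactly the open question.

```swift
import Metal

// Copy 10 bytes between two buffers at odd offsets with a blit encoder.
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let source = device.makeBuffer(length: 64, options: .storageModeShared)!
let destination = device.makeBuffer(length: 64, options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let blit = commandBuffer.makeBlitCommandEncoder()!
blit.copy(from: source, sourceOffset: 3,        // not a multiple of 4
          to: destination, destinationOffset: 5, // not a multiple of 4
          size: 10)                              // not a multiple of 4
blit.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```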
Replies: 0 · Boosts: 0 · Views: 136 · Activity: 1w
Metal and Swift Concurrency
Hi, introducing Swift Concurrency to my Metal app has been a bit challenging, as Swift Concurrency is limited by the cooperative thread pool. GPU work is obviously not CPU-bound and can block forward progress, especially when using waitUntilCompleted on the command buffer. For concurrent render work this has the potential of underutilizing the CPU and even creating deadlocks. My question is: what is the Metal team's general recommendation when it comes to concurrency? It seems to me that Dispatch or OperationQueues are still the preferred way for Metal-bound tasks in order to gain maximum performance. To integrate with Swift Concurrency, my idea is to use continuations that kick off render jobs via Dispatch or queues. Would this be the best solution to bridge async tasks with Metal work? Thanks!
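Editor's note, a minimal sketch of the continuation idea described above, not an official recommendation: instead of blocking a cooperative thread with waitUntilCompleted(), register a completion handler on the command buffer and suspend the task until the GPU signals it. The extension name is made up for illustration.

```swift
import Metal

extension MTLCommandBuffer {
    /// Hypothetical helper: commit the command buffer and suspend until the GPU finishes,
    /// without tying up a thread from the cooperative pool.
    func commitAndWaitAsync() async {
        await withCheckedContinuation { (continuation: CheckedContinuation<Void, Never>) in
            // Register the handler before committing so completion is never missed.
            addCompletedHandler { _ in
                continuation.resume()
            }
            commit()
        }
    }
}

// Usage sketch:
//   let commandBuffer = queue.makeCommandBuffer()!
//   /* encode render or compute work here */
//   await commandBuffer.commitAndWaitAsync()
```

The encoding itself can still be dispatched to a regular queue or actor; only the wait is bridged into async/await here.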
Replies: 4 · Boosts: 0 · Views: 217 · Activity: 2w
Video does not play properly when the ARView session's ARConfiguration has providesAudioData = true and a ModelEntity with a VideoMaterial(avPlayer:) (video with audio) is added to the ARView?
```swift
import SwiftUI
import RealityKit
import ARKit
import AVFoundation

struct ContentView: View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.delegate = context.coordinator
        let worldConfig = ARWorldTrackingConfiguration()
        worldConfig.planeDetection = .horizontal
        // worldConfig.providesAudioData = true // open here -----> Error:
        arView.session.run(worldConfig)
        addTestEntity(arView: arView)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator() }

    class Coordinator: NSObject, ARSessionDelegate, ARSessionObserver {
        func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
        }
    }
}

func addTestEntity(arView: ARView) {
    let mesh = MeshResource.generatePlane(width: 0.5, depth: 0.35)
    guard let url = Bundle.main.url(forResource: "videoplayback", withExtension: "mp4") else { return }
    let player = AVPlayer(url: url)
    let videoMaterial = VideoMaterial(avPlayer: player)
    let model = ModelEntity(mesh: mesh, materials: [videoMaterial])
    model.transform.translation.y = 0.05
    let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
    anchor.children.append(model)
    player.play()
    arView.scene.anchors.append(anchor)
}
```

Error:

```
failed to update STS state: Error Domain=com.apple.STS-N Code=1396929899 "Error: failed to signal change" UserInfo={NSLocalizedDescription=Error: failed to signal change}
failed to update STS state: Error Domain=com.apple.STS-N Code=1396929899 "Error: failed to signal change" UserInfo={NSLocalizedDescription=Error: failed to signal change}
......
ARSession <0x125d88040>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x302922dc0 {Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}
```

iOS 17.5.1, Xcode 15.4
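Editor's note, purely a guess at a workaround and not a confirmed fix: providesAudioData makes ARKit capture microphone audio, so this may be a playback-versus-recording conflict on the shared audio session (and the error text also points at privacy settings, i.e. an NSMicrophoneUsageDescription entry). One thing worth trying is configuring the audio session for simultaneous playback and recording before running the session.

```swift
import AVFoundation

// Hypothetical workaround: allow playback and recording at the same time so the
// AVPlayer-backed VideoMaterial and ARKit's audio capture do not fight over the session.
func configureAudioSessionForARWithVideoPlayback() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: [.mixWithOthers, .defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("Audio session configuration failed: \(error)")
    }
}
```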
Replies: 2 · Boosts: 0 · Views: 242 · Activity: 2w
Incorrect Attachment bounds VisionOS 1.1
We have been using attachment.bounds.extents to determine the size of a RealityView attachment at run time. It had been working fine until the visionOS 1.1 update. I wonder if we are doing something wrong, as the release notes suggest a visual-bounds calculation issue was fixed in the latest release. The funny thing is we did not have an issue before. Below is how we access the height value: let height = attachmentEntity.attachment.bounds.extents.y Previously it returned the correct value; now it returns 0. I wonder if anyone else is having the same issue.
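Editor's note, not a confirmed fix, but one workaround worth comparing against: measuring the attachment entity's visual bounds directly, which are computed from the rendered content rather than the stored attachment bounds. The entity name is taken from the post.

```swift
import RealityKit

// Sketch: derive the attachment's height from its visual bounds as a cross-check
// against attachment.bounds.extents (which started returning 0 in this report).
func attachmentHeight(of attachmentEntity: Entity) -> Float {
    let bounds = attachmentEntity.visualBounds(relativeTo: nil)
    return bounds.extents.y
}
```

Note that visual bounds are only meaningful once the attachment has been laid out and added to the scene, so the value may still be 0 if it is read too early in the update cycle.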
Replies: 3 · Boosts: 1 · Views: 411 · Activity: Mar ’24