Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.


Crash after splash screen - latest-gen iPads only
Hi, I've got a well-established game built in GameMaker, which seems to crash on latest-generation iPads. In particular, I've been able to reproduce it on the 11-inch iPad (M2) in the simulator. The game launches, shows the splash screen, and appears to crash as it leaves the splash screen but before the game itself starts, so the log output is rather limited. I'm attaching what I can see from the crash, and I hope someone has a suggestion about what might be different in this iPad model to cause this sort of crash. Any leads would be welcome. Thank you in advance.
Replies: 2 · Boosts: 0 · Views: 340 · May ’24
Using the same texture for both input & output of a fragment shader
Hello, this exact question was already asked in this forum (8 years ago), but I can't find a definitive answer: does Metal allow using the same color texture as both an input and an output (color attachment) of a fragment shader? Is the behavior defined anywhere? I believe this results in undefined behavior under both DirectX and OpenGL, so I'd assume the same for Metal, but then why doesn't Metal warn me about this as it does for so many other "misconfigurations"? It also seems to work correctly in my case, as I found out by accident. I'd love to get a clarification. Thanks ahead!
Replies: 1 · Boosts: 1 · Views: 439 · May ’24
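
For readers hitting the same question, here is a minimal sketch of the configuration the post asks about: the same MTLTexture bound as the pass's color attachment and as a fragment shader input. The names (commandBuffer, pipeline, sharedTexture) are illustrative assumptions, not taken from the post, and whether Metal defines the result is exactly what is being asked.

import Metal

// Hedged sketch, not a recommendation: the same texture is written as the
// color attachment and read as a fragment texture in one render pass.
func encodeFeedbackPass(commandBuffer: MTLCommandBuffer,
                        pipeline: MTLRenderPipelineState,
                        sharedTexture: MTLTexture) {
    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = sharedTexture        // output of the pass
    pass.colorAttachments[0].loadAction = .load
    pass.colorAttachments[0].storeAction = .store

    guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) else { return }
    encoder.setRenderPipelineState(pipeline)
    encoder.setFragmentTexture(sharedTexture, index: 0)     // also read in the fragment shader
    encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
    encoder.endEncoding()
}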
RealityView and Anchors
Hello, in the documentation for ARView there is a diagram showing that all entities are connected to the Scene via AnchorEntity instances: https://developer.apple.com/documentation/realitykit/arview

What happens when we are using a RealityView? Here the documentation suggests we add entities directly: https://developer.apple.com/documentation/realitykit/realityview/

Three questions:
Do we need to add entities to an AnchorEntity first and then add that anchor via content.add(...)?
Is an entity ignored by the physics engine if it is attached via an anchor?
If both the AnchorEntity and an attached entity are added via content.add(...), is the anchor's position ignored?
Replies: 1 · Boosts: 0 · Views: 410 · May ’24
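
For reference, a minimal sketch of the two patterns the post contrasts, with illustrative names that are not from the post: parenting an entity to an AnchorEntity before adding the anchor to the RealityView content, versus adding an entity directly.

import SwiftUI
import RealityKit

struct AnchoredContentView: View {
    var body: some View {
        RealityView { content in
            // Pattern 1: parent the entity to an AnchorEntity, then add the anchor.
            let anchor = AnchorEntity(world: [0, 1, -1])
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            anchor.addChild(sphere)
            content.add(anchor)

            // Pattern 2: add an entity directly; its transform is then expressed
            // in the RealityView's own coordinate space.
            let box = ModelEntity(mesh: .generateBox(size: 0.1))
            box.position = [0, 1.5, -1]
            content.add(box)
        }
    }
}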
Coordinate system in a SharePlay + spatial Persona experience
I cannot figure out how SharePlay + spatial Personas place the origin of RealityKit's coordinate system. I have a visionOS app with an immersiveSpace(.mixed) scene. In the scene I am using ARKit to track my hand, creating a virtual object that follows the movement of my palm. Every frame I query positions from the HandAnchor to update the position of my object, using originFromAnchorTransform to correctly place the object in the scene. However, when I try to adopt this in a SharePlay experience with spatial Personas, the virtual object's position updates become a mess. With different templates (.sideBySide or .conversational), the origin in my space appears with no discernible pattern, and the virtual object never follows my hand; it ends up in a seemingly random place. It seems there is a difference/transform between the HandAnchor's origin and the immersive space's origin under spatial Personas + SharePlay. Is that right? Or is there something I can try with convert(displacement.inverse.rotation, from: .immersiveSpace, to: .scene), as mentioned in https://developer.apple.com/documentation/realitykit/realitycoordinatespaceconverting and https://developer.apple.com/documentation/swiftui/environmentvalues/immersivespacedisplacement, to compute the transform and apply it to my virtual object? That is not working yet either. Can someone tell me how to do this correctly?
Replies: 1 · Boosts: 0 · Views: 394 · May ’24
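
For context, here is a rough sketch of the single-user, per-frame update the post describes; handTracking and virtualObject are assumed names, and the open question remains how this transform relates to the shared origin under SharePlay.

import ARKit
import RealityKit

// Hedged sketch of the non-SharePlay flow described above.
func followPalm(handTracking: HandTrackingProvider, virtualObject: Entity) async {
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        // originFromAnchorTransform is relative to the immersive space origin,
        // which is what appears to be remapped under spatial Personas.
        virtualObject.setTransformMatrix(anchor.originFromAnchorTransform, relativeTo: nil)
    }
}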
Anchor a Reality Composer Pro scene on an image anchor
Developing a prototype Vision Pro app, I would like to render a 3D scene made in Reality Composer Pro on an image anchor in a RealityView. But I have had no luck so far making it work and need some guidance to move on. The image file is stored in the assets as shown below, and here is the source code:

import SwiftUI
import RealityKit
import RealityKitContent

struct AnchorView: View {
    @State var imageEntity: Entity = {
        let anchorEntity = AnchorEntity(.image(group: "AR Resources", name: "reanchor"))
        return anchorEntity
    }()

    var body: some View {
        RealityView { content in
            do {
                // Add the initial RealityKit content
                if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                    imageEntity.addChild(scene)
                    content.add(imageEntity)
                }
            } catch {
                print("Error occurs when adding reality view content: \(error)")
            }
        }
    }
}
Replies: 2 · Boosts: 0 · Views: 438 · May ’24
How to exclude RealityKitContent from Swift package for iOS?
I've created an app for visionOS that uses a custom package which includes RealityKitContent as a sub-package. I now want to turn this app into a multi-platform app that also supports iOS. When I try to compile the app for that platform, I get this error message:

Building for 'iphoneos', but realitytool only supports [xros, xrsimulator]

Thus, I want to exclude the RealityKitContent from my package for iOS, but I don't really know how. The Apple docs are pretty complicated, and ChatGPT only gave me solutions that did not work at all. I also tried posting this on the Swift forum, but no one could help me there either, so I am trying my luck here. Here is my Package.swift file:

// swift-tools-version: 5.10
import PackageDescription

let package = Package(
    name: "Overlays",
    platforms: [
        .iOS(.v17), .visionOS(.v1)
    ],
    products: [
        .library(
            name: "Overlays",
            targets: ["Overlays"]),
    ],
    dependencies: [
        .package(path: "../BackendServices"),
        .package(path: "../MeteorDDP"),
        .package(path: "Packages/OverlaysRealityKitContent"),
    ],
    targets: [
        .target(
            name: "Overlays",
            dependencies: ["BackendServices", "MeteorDDP", "OverlaysRealityKitContent"]
        ),
        .testTarget(
            name: "OverlaysTests",
            dependencies: ["Overlays"]),
    ]
)

Based on a recommendation in the Swift forum, I also tried this:

dependencies: [
    ...
    .package(
        name: "OverlaysRealityKitContent",
        path: "Packages/OverlaysRealityKitContent"
    ),
],
targets: [
    .target(
        name: "Overlays",
        dependencies: [
            "BackendServices", "MeteorDDP",
            .product(name: "OverlaysRealityKitContent", package: "OverlaysRealityKitContent", condition: .when(platforms: [.visionOS]))
        ]
    ),
    ...
]

but this won't work either. The problem seems to be that the package is listed under dependencies, which makes realitytool kick in. Is there a way to avoid this? I definitely need the RealityKitContent package to be part of the Overlays package, since the latter depends on its content (on visionOS). And I would rather not split the package into two parts (one for iOS and one for visionOS), if possible.
Replies: 1 · Boosts: 0 · Views: 396 · May ’24
How to make an entity move in a RealityView so that collisions can be detected
I'm trying to detect when two entities collide. The following code shows a very basic set-up. How do I get the upperSphere to move? If I set its physicsBody.mode to .dynamic it moves with gravity (and the collision is reported), but when in kinematic mode it doesn't respond to the impulse:

struct CollisionView: View {
    @State var subscriptions: [EventSubscription] = []

    var body: some View {
        RealityView { content in
            let upperSphere = ModelEntity(mesh: .generateSphere(radius: 0.04))
            let lowerSphere = ModelEntity(mesh: .generateSphere(radius: 0.04))

            upperSphere.position = [0, 2, -2]
            upperSphere.physicsBody = .init()
            upperSphere.physicsBody?.mode = .kinematic
            upperSphere.physicsMotion = PhysicsMotionComponent()
            upperSphere.generateCollisionShapes(recursive: false)

            lowerSphere.position = [0, 1, -2]
            lowerSphere.physicsBody = .init()
            lowerSphere.physicsBody?.mode = .static
            lowerSphere.generateCollisionShapes(recursive: false)

            let sub = content.subscribe(to: CollisionEvents.Began.self, on: nil) { _ in
                print("Collision!")
            }
            subscriptions.append(sub)

            content.add(upperSphere)
            content.add(lowerSphere)

            Task {
                try? await Task.sleep(for: .seconds(2))
                print("Impulse applied")
                upperSphere.applyLinearImpulse([0, -1, 0], relativeTo: nil)
            }
        }
    }
}
Replies: 3 · Boosts: 0 · Views: 409 · May ’24
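
One hedged suggestion, not taken from the thread: kinematic bodies are typically driven by velocities or explicit transforms rather than impulses, so setting a velocity on the PhysicsMotionComponent is one possible way to make the sphere move while keeping collision reporting. A minimal sketch under that assumption, with an illustrative helper name:

import RealityKit

// Assumption: a kinematic ModelEntity follows the velocity set on its
// PhysicsMotionComponent instead of reacting to applied impulses.
func pushKinematic(_ entity: ModelEntity, velocity: SIMD3<Float> = [0, -1, 0]) {
    entity.physicsMotion = PhysicsMotionComponent(linearVelocity: velocity)
}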
GroundingShadowComponent tanks performance even when set to false
Working on a visionOS app, I've noticed that even when castsShadow is false, performance goes down the drain when there are more than a few dozen entities that have a GroundingShadowComponent. I managed to hard-crash the Vision Pro with about 200 entities that each had two ModelEntities with a GroundingShadowComponent attached but set to castsShadow = false. My solution is to add and remove the GroundingShadowComponent from entities as needed, but I thought someone at Apple might want to look into this. I don't expect great performance with that many entities casting shadows, but I'd think turning the shadow off would effectively disable the component and not incur a performance penalty.
Replies: 0 · Boosts: 0 · Views: 406 · May ’24
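
A minimal sketch of the workaround the post describes (the helper name is illustrative, not from the post): attach the GroundingShadowComponent only while a shadow is actually needed, and remove it entirely otherwise.

import RealityKit

// Assumption based on the post: removing the component avoids the cost that
// keeping it attached with castsShadow = false appears to incur.
func setGroundingShadow(_ entity: Entity, enabled: Bool) {
    if enabled {
        entity.components.set(GroundingShadowComponent(castsShadow: true))
    } else {
        entity.components.remove(GroundingShadowComponent.self)
    }
}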
iOS issues... Guideline 4.3(a) - Design - Spam
Hey everyone, my game got rejected due to Guideline 4.3. Here's the gameplay video: https://www.youtube.com/watch?v=Vb6uBUsOSDg Does anyone have experience dealing with Guideline 4.3 who can share some insights? I already appealed, explaining that my game has online multiplayer battles, but there's no response yet. I'm feeling frustrated after spending six months on this game. Any advice or suggestions would be appreciated. Thanks!
Replies: 0 · Boosts: 0 · Views: 471 · May ’24
IOSurface objects aren't released in ScreenCaptureKit
I use the ScreenCaptureKit, CoreVideo, CoreImage, and CoreMedia frameworks to capture screenshots on macOS 14.0 and higher. Example of creating a CGImageRef:

CVImageBufferRef cvImageBufferRef = ..;
CIImage *temporaryImage = [CIImage imageWithCVPixelBuffer:cvImageBufferRef];
CIContext *temporaryContext = [CIContext context];
CGImageRef imageRef = [temporaryContext createCGImage:temporaryImage
                                             fromRect:CGRectMake(0, 0,
                                                                 CVPixelBufferGetWidth(cvImageBufferRef),
                                                                 CVPixelBufferGetHeight(cvImageBufferRef))];

Profiling with Xcode Instruments (Memory Leaks & Allocations) shows constantly increasing memory usage, but no memory leaks are detected, and there are many calls that create IOSurface objects which are never released. Most of the memory is All Anonymous VM - VM: IOSurface. The heaviest stack trace:

[RPIOSurfaceObject initWithCoder:]
[IOSurface initWithMachPort:]
IOSurfaceClientLookupFromMachPort

I don't create any IOSurface objects myself; these are low-level calls. In the allocation list I can see many allocations of IOSurface objects, but no information about them being released. Given this, how can I release them to avoid permanently increasing memory consumption?
Replies: 2 · Boosts: 0 · Views: 452 · May ’24
Failed to install Game Porting Toolkit
When I install Game Porting Toolkit 1.1, I get the error below:

Error: Failure while executing; patch -g 0 -f -p0 exited with 2
==> Installing game-porting-toolkit from apple/apple
==> Staging /Users/***/Library/Caches/Homebrew/downloads/bf215bb93a68251243c69e19a3ba0d670aa82daad63a2bfe24d7ff0a11c2c969--crossover-sources-24.0.2.tar.gz in /private/tmp/game-porting-toolkit-20240520-35139-asrgp4
==> Patching
Error: Failure while executing; patch -g 0 -f -p0 exited with 2. Here's the output:
patching file 'wine/include/distversion.h'
patching file 'wine/configure

Does anyone know how to fix this?
Replies: 0 · Boosts: 0 · Views: 267 · May ’24
Failed to perform GPU performance counter
Hi all, I am trying to measure the performance of my video game on iOS. After launching the debugger and capturing a GPU workload (with "frame" selected as the scope), I cannot do performance profiling. The error message is: "Failed to enable shader profiler. (516)". And if I export the GPU trace and reopen it, Xcode cannot find any compatible devices connected, even though the same device used for capturing is connected. Device information:

Mac: MacBook Air M2, 2022
macOS version: Sonoma 14.4
Xcode version: 15.2 (15C500b)
Mobile device: iPhone 13 Pro Max
iOS version: 15.0

GPU performance counters can be captured with the same mobile device on my colleague's Mac, so I think there might be something wrong with my Xcode.
Replies: 2 · Boosts: 0 · Views: 389 · May ’24
Camera pointOfView vs. Error
Dear all, I have several scenes, each with its own camera at a different position. The scenes are loaded with transitions. If I set the pointOfView in every scene to that scene's camera, the transitions don't work properly: the active scene view jumps to the position of the camera of the scene that is fading in. If I comment the pointOfView out, the transitions work fine, but the following error message appears:

Error: camera node already has an authoring node - skip

Does someone have an idea how to fix this? Many thanks, Ray
Replies: 0 · Boosts: 0 · Views: 258 · May ’24
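
One hedged idea, not from the thread: instead of setting pointOfView on each scene up front, SceneKit's present(_:with:incomingPointOfView:completionHandler:) lets you hand the incoming scene's camera to the transition itself. A minimal sketch, assuming each scene contains a camera node named "camera":

import SceneKit
import SpriteKit

func transition(to nextScene: SCNScene, in view: SCNView) {
    // Look up the incoming scene's camera node (the node name is an assumption).
    let camera = nextScene.rootNode.childNode(withName: "camera", recursively: true)
    view.present(nextScene,
                 with: SKTransition.fade(withDuration: 1.0),
                 incomingPointOfView: camera,
                 completionHandler: nil)
}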
Store USDZ with SwiftData
I am trying to store USDZ files with SwiftData for now. I am converting the USDZ to Data, then storing it with SwiftData.

My model:

import Foundation
import SwiftData
import SwiftUI

@Model
class Item {
    var name: String
    @Attribute(.externalStorage) var usdz: Data? = nil
    var id: String

    init(name: String, usdz: Data? = nil) {
        self.id = UUID().uuidString
        self.name = name
        self.usdz = usdz
    }
}

My function to convert USDZ to Data. I am currently using a local USDZ just to test whether it works:

func usdzData() -> Data? {
    do {
        guard let usdzURL = Bundle.main.url(forResource: "tv_retro", withExtension: "usdz") else {
            fatalError("Unable to find USDZ file in the bundle.")
        }
        let usdzData = try Data(contentsOf: usdzURL)
        return usdzData
    } catch {
        print("Error loading USDZ file: \(error)")
    }
    return nil
}

Loading the items:

@Query private var items: [Item]
...
var body: some View {
    ...
    ForEach(items) { item in
        HStack {
            Model3D(?????) { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
        }
    }
    ...
}

How can I load the Model3D? I have tried:

Model3D(data: item.usdz)

which gives me these errors:

Cannot convert value of type '[Item]' to expected argument type 'Binding<C>'
Generic parameter 'C' could not be inferred

Both errors are reported in the ForEach. I am able to print the content inside item:

ForEach(items) { item in
    HStack {
        Text("\(item.name)")
        Text("\(item.usdz)")
    }
}

This works fine for me; item.usdz prints something like Optional(10954341 bytes).

I would like to know two things:
1. Is this the correct way to save USDZ files with SwiftData, or should I use FileManager? If so, how should I do that?
2. How can I get the USDZ out of storage (SwiftData) into my code and use it with Model3D?
Replies: 3 · Boosts: 0 · Views: 388 · May ’24
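
One hedged option for the second question, not a confirmed pattern: Model3D loads from a URL, so the stored Data could be written to a temporary file and that URL handed to a URL-based Model3D initializer. A minimal sketch, with an illustrative helper name:

import Foundation

// Writes the stored USDZ bytes to a temporary file so a URL-based loader
// can read it. Assumes `name` is unique enough for the session.
func temporaryUSDZURL(for data: Data, named name: String) throws -> URL {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(name)
        .appendingPathExtension("usdz")
    try data.write(to: url, options: .atomic)
    return url
}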
EXC_BAD_ACCESS crash with .replace(withDrawables:) on MaterialParameters.Value.textureResource
I get a crash every time I try to swap this texture for a drawable queue. I have a DrawableQueue leftQueue created on the main thread, and I invoke this block on the main thread. Scene.usda contains a reference to a model cinema. It crashes on the line with the replace():

if let shaderMaterial = try? await ShaderGraphMaterial(named: "/Root/cinema/_materials/Screen", from: "Scene.usda", in: theaterBundle) {
    if let imageParam = shaderMaterial.getParameter(name: "image"),
       case let .textureResource(imageTexture) = imageParam {
        imageTexture.replace(withDrawables: leftQueue)
    }
}

One weird thing: imageParam has an invalid value when it crashes.

imageParam RealityFoundation.MaterialParameters.Value <invalid> (0x0)

Here is the stack trace:

* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x9)
    frame #0: 0x0000000191569734 libobjc.A.dylib`objc_release + 8
    frame #1: 0x00000001cb9e5134 CoreRE`re::SharedPtr<re::internal::AssetReference>::reset(re::internal::AssetReference*) + 64
    frame #2: 0x00000001cba77cf0 CoreRE`re::AssetHandle::operator=(re::AssetHandle const&) + 36
    frame #3: 0x00000001ccc84d14 CoreRE`RETextureAssetReplaceDrawableQueue + 228
    frame #4: 0x00000001acf9bbcc RealityFoundation`RealityKit.TextureResource.replace(withDrawables: RealityKit.TextureResource.DrawableQueue) -> () + 164
  * frame #5: 0x00000001006d361c Screenlit`TheaterView.setupMaterial(self=Screenlit.TheaterView @ 0x000000011e2b7a30) at TheaterView.swift:245:74
    frame #6: 0x00000001006e0848 Screenlit`closure #1 in closure #1 in closure #1 in closure #1 in TheaterView.body.getter(self=Screenlit.TheaterView @ 0x000000011e2b7a30) at TheaterView.swift:487
    frame #7: 0x00000001006f1658 Screenlit`partial apply for closure #1 in closure #1 in closure #1 in closure #1 in TheaterView.body.getter at <compiler-generated>:0
    frame #8: 0x00000001004fb7d8 Screenlit`thunk for @escaping @callee_guaranteed @Sendable @async () -> (@out τ_0_0) at <compiler-generated>:0
    frame #9: 0x0000000100500bd4 Screenlit`partial apply for thunk for @escaping @callee_guaranteed @Sendable @async () -> (@out τ_0_0) at <compiler-generated>:0
Replies: 2 · Boosts: 0 · Views: 337 · May ’24
Error When Building Game Porting Toolkit 1.1
Does anyone know what this might be?

/private/tmp/game-porting-toolkit-20240516-2293-dagdsr/wine/dlls/wow64cpu/cpu.c:356:18: error: assignment to 'DWORD64' {aka 'long long unsigned int'} from 'PVOID' {aka 'void *'} makes integer from pointer without a cast [-Wint-conversion]
  356 |     context->Rsp = NtCurrentTeb()->TlsSlots[2]; /* WOW64_TLS_WINEHYBRID_RESERVED_R14 */
      |                  ^
make: *** [dlls/wow64cpu/cpu.cross.o] Error 1
make: *** Waiting for unfinished jobs....

Path: /usr/local/Homebrew/Library/Taps/apple/homebrew-apple/Formula/game-porting-toolkit.rb

==> Configuration
HOMEBREW_VERSION: 4.3.0
ORIGIN: https://github.com/Homebrew/brew
HEAD: 8378cc825d83acffd125fb0fec041793df378a57
Last commit: 3 days ago
Core tap JSON: 17 May 05:28 UTC
Core cask tap JSON: 17 May 05:28 UTC
HOMEBREW_PREFIX: /usr/local
HOMEBREW_CASK_OPTS: []
HOMEBREW_MAKE_JOBS: 10
Homebrew Ruby: 3.1.4 => /usr/local/Homebrew/Library/Homebrew/vendor/portable-ruby/3.1.4/bin/ruby
CPU: 10-core 64-bit westmere
Clang: 15.0.0 build 1500
Git: 2.39.3 => /Library/Developer/CommandLineTools/usr/bin/git
Curl: 8.4.0 => /usr/bin/curl
macOS: 14.4.1-x86_64
CLT: 15.1.0.0.1.1700200546
Xcode: 15.1 => /Users/raiderpig/Downloads/Xcode.app/Contents/Developer
Rosetta 2: true
Replies: 4 · Boosts: 3 · Views: 928 · May ’24