Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Blit color & depth texture to current MTKView
I have an FBO generated by the last frame, which contains a color texture and a depth texture. I want to blit that FBO to the current MTKView, but it fails. Thanks a lot for any suggestions. Here is the code.

Last frame:

```objc
id<MTLTexture> src_color;
id<MTLTexture> src_depth;
drawTo(src_color, src_depth);
```

Current frame — first, create the depth texture if it doesn't exist yet:

```objc
id<MTLTexture> depth_texture_;
// Note: the post originally had `if (depth_texture_)`, which only allocates
// when the texture already exists; the check needs to be negated.
if (!depth_texture_) {
    depth_texture_ = [device_ newTextureWithDescriptor:desc];
}
```

Second, configure the render pass descriptor for the encoder:

```objc
current_desc = metal_view_.currentRenderPassDescriptor;
current_desc.depthAttachment.texture = depth_texture_;
current_desc.depthAttachment.loadAction = MTLLoadActionClear;
current_desc.depthAttachment.clearDepth = 1.0;

current_desc.stencilAttachment.texture = depth_texture_;
current_desc.stencilAttachment.loadAction = MTLLoadActionClear;
current_desc.stencilAttachment.clearStencil = 0;

current_desc.colorAttachments[0].loadAction = MTLLoadActionClear;
current_desc.colorAttachments[0].clearColor = MTLClearColorMake(1.0, 1.0, 1.0, 1.0);
```

Third, blit to the current MTKView:

```objc
auto dst_color = current_desc.colorAttachments[0].texture;
auto dst_depth = current_desc.depthAttachment.texture;

id<MTLCommandBuffer> commandBuffer = [command_queue_ commandBuffer];
commandBuffer.label = @"blit";

id<MTLBlitCommandEncoder> blitEncoder = [commandBuffer blitCommandEncoder];
[blitEncoder copyFromTexture:src_color
                 sourceSlice:0
                 sourceLevel:0
                sourceOrigin:MTLOriginMake(srcRect.x, srcRect.y, 0)
                  sourceSize:MTLSizeMake(srcRect.width, srcRect.height, 1)
                   toTexture:dst_color
            destinationSlice:0
            destinationLevel:0
           destinationOrigin:MTLOriginMake(dstRect.x, dstRect.y, 0)];
[blitEncoder copyFromTexture:src_depth
                 sourceSlice:0
                 sourceLevel:0
                sourceOrigin:MTLOriginMake(srcRect.x, srcRect.y, 0)
                  sourceSize:MTLSizeMake(srcRect.width, srcRect.height, 1)
                   toTexture:dst_depth
            destinationSlice:0
            destinationLevel:0
           destinationOrigin:MTLOriginMake(dstRect.x, dstRect.y, 0)];
[blitEncoder endEncoding];
[commandBuffer commit];
[commandBuffer waitUntilCompleted];
```
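A hedged suggestion rather than a confirmed fix: by default an MTKView's drawable is framebuffer-only, which disallows using it as a blit destination, and a blit copy additionally requires the source and destination pixel formats to match. A minimal Swift sketch of the view-side settings this implies (`metalView` is illustrative):

```swift
import MetalKit

// The drawable is only a legal blit destination when framebufferOnly is false.
metalView.framebufferOnly = false
// Blit copies require identical formats; this must equal src_color's format.
metalView.colorPixelFormat = .bgra8Unorm
```

The same format-match rule applies to the depth copy: the depth texture created for the current frame must use the same pixel format as src_depth. Capturing the frame in Xcode's Metal debugger, or checking the command buffer's `error` property after `waitUntilCompleted`, should surface the exact validation failure.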
1 reply · 0 boosts · 255 views · Apr ’24
Metal-cpp doesn't show window content on startup
I'm trying to follow the metal-cpp tutorials I've found at https://developer.apple.com/metal/sample-code/?q=learn. The program seems to launch correctly (I can see the menu bar and interact with it), but nothing is rendered inside the window. I suppose the culprit is somewhere in the following function (I see it binds the device, the view, and the window to the object in charge of drawing into the view):

```cpp
void core::Application::applicationDidFinishLaunching(NS::Notification *pNotification)
{
    CGRect frame = (CGRect){{100.0, 100.0}, {512.0, 512.0}};

    m_Window->init(frame,
                   NS::WindowStyleMaskClosable | NS::WindowStyleMaskTitled,
                   NS::BackingStoreBuffered,
                   false);

    m_Device = MTL::CreateSystemDefaultDevice();

    m_View = MTK::View::alloc()->init(frame, m_Device);
    m_View->setColorPixelFormat(MTL::PixelFormat::PixelFormatBGRA8Unorm);
    m_View->setClearColor(MTL::ClearColor::Make(1.0, 0.0, 0.0, 1.0));

    m_ViewDelegate = new graphics::ViewDelegate(m_Device);
    m_View->setDelegate(m_ViewDelegate);

    m_Window->setContentView(m_View);
    m_Window->setTitle(NS::String::string("Template 1", NS::StringEncoding::UTF8StringEncoding));
    m_Window->makeKeyAndOrderFront(nullptr);

    NS::Application* nsApp = reinterpret_cast<NS::Application*>(pNotification->object());
    nsApp->activateIgnoringOtherApps(true);
}
```

but, as you can infer from the fact that I'm failing at the very first tutorial of the bunch, I'm quite lost. I've tried debugging the app with the Xcode debugger and saw that it never enters this function:

```cpp
void ViewDelegate::drawInMTKView(MTK::View *pView)
{
    m_Renderer->Draw(pView);
}
```

Could that be a symptom of some call missing from my code? Thank you in advance for your help.
0 replies · 0 boosts · 361 views · Apr ’24
Adding geometry asynchronously [do work in background thread]
Hey, I'm wondering what the proper way is to add RealityView content asynchronously while doing the heavy lifting on a background thread. My use case is that I am generating procedural geometry, which takes a few seconds to complete. Meanwhile I would like the UI to show other geometry / UI elements and the main thread to stay responsive. In pseudocode, what I would like to do is:

```swift
runInBackgroundThread {
    let geometry = generateGeometry()     // CPU intensive, takes 1-2 s
    let entity = createEntity(geometry)   // CPU intensive, takes ~1 s
    let material = try! await ShaderGraphMaterial(..)
    entity.model!.materials = [material]
    runInMainThread {
        addToRealityViewContent(entity)
    }
}
```

With this I run into many issues, especially with the material, which apparently cannot be constructed on a non-main thread and cannot be passed across thread boundaries.
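A hedged sketch of one way to structure this with Swift concurrency: do only the mesh generation in a detached task, then build the material and entity back in the main-actor context, so nothing main-actor-bound ever crosses a thread boundary. `generateMesh()`, the material path, and the Reality Composer Pro bundle are illustrative assumptions:

```swift
import SwiftUI
import RealityKit
import RealityKitContent   // assumption: a Reality Composer Pro package

struct ProceduralGeometryView: View {
    var body: some View {
        RealityView { content in
            let root = Entity()
            content.add(root)   // lightweight content shows immediately

            Task {
                // Heavy lifting off the main actor (assumes the result is Sendable).
                let mesh = await Task.detached(priority: .userInitiated) {
                    generateMesh()   // hypothetical CPU-heavy generator -> MeshResource
                }.value

                // Back in the main-actor context: build the material and
                // entity here instead of handing them across threads.
                guard let material = try? await ShaderGraphMaterial(
                    named: "/Root/MyMaterial",   // illustrative path
                    from: "Scene.usda",
                    in: realityKitContentBundle) else { return }

                root.addChild(ModelEntity(mesh: mesh, materials: [material]))
            }
        }
    }
}
```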
0 replies · 0 boosts · 296 views · Mar ’24
Knowing GPU architecture for better compute programs.
Hi, I have a CUDA program that I want to convert to Metal compute so that we can support Apple hardware. When I wrote the CUDA version, I was able to write efficient code because I first learned about the CUDA core architecture. How the cores access memory, for instance, is very important information for writing code that accesses memory efficiently. Now I want to do the same for the Metal compute version, but I cannot find any information about the low-level architecture, especially the things you should know to write efficient code. Am I missing something? Is there a guide giving hints on the most efficient way to access memory, for instance?
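For what it's worth, Metal does expose one key architecture parameter at runtime: the pipeline's SIMD-group width, the rough analogue of a CUDA warp. A hedged Swift sketch of using it to size threadgroups; `device`, `kernel`, `encoder`, and `dataCount` are assumed to exist:

```swift
import Metal

// Hedged sketch: threadExecutionWidth is the SIMD-group width (the rough
// analogue of CUDA's warp size). Sizing threadgroups as a multiple of it
// keeps SIMD-groups fully occupied, much like sizing CUDA blocks by warp size.
func dispatch1D(kernel: MTLFunction, device: MTLDevice,
                encoder: MTLComputeCommandEncoder, dataCount: Int) throws {
    let pipeline = try device.makeComputePipelineState(function: kernel)
    let simdWidth = pipeline.threadExecutionWidth
    let perGroup = (pipeline.maxTotalThreadsPerThreadgroup / simdWidth) * simdWidth

    encoder.setComputePipelineState(pipeline)
    // dispatchThreads handles the ragged tail; Apple GPUs support
    // non-uniform threadgroup sizes.
    encoder.dispatchThreads(MTLSize(width: dataCount, height: 1, depth: 1),
                            threadsPerThreadgroup: MTLSize(width: perGroup, height: 1, depth: 1))
}
```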
0 replies · 0 boosts · 290 views · Mar ’24
New Game Center Sign In Screen: How to know when it is presented, covering my app?
Recently (I'm not sure exactly when), the "Sign in to Game Center" banner started appearing at the top of my app when the app sets the GC authentication handler and there is no Game Center player currently signed in on the device. So far so good. But if the banner is tapped, a full "Sign In to Game Center" modal view automatically appears and covers the app without notification. This is not the sign-in view controller that the GC authentication handler normally passes when a player is not signed in, which previously gave my app control over when to present it. My app is unaware that it is covered by this new sign-in screen. Is there any way for my app to know when the user taps the "Sign In" banner and causes this new automatic sign-in screen to appear? I need to pause my game while it is covered up. In general, is there a way for my view controller to be notified when it has been covered by a modal view controller that's outside my app's control?
0 replies · 0 boosts · 402 views · Mar ’24
Compatibility Inquiry: 8BitDo Ultimate 2.4G Wireless Controller on macOS
Hi there, I'm reaching out to inquire about compatibility restrictions regarding the 8BitDo Ultimate 2.4G Wireless Controller on macOS systems, particularly its functionality with third-party applications such as Construct 3, a browser-based game development software. As an indie game developer, I recently acquired the 8BitDo Ultimate controller for its touted compatibility and functionality. While I've primarily used it in Bluetooth mode on my Mac without encountering any significant issues, I've stumbled upon a peculiar problem when using the controller while developing projects in Construct 3, specifically within the preview mode of the game engine. In attempting to test my projects using Construct 3's preview feature, the controller fails to detect any of my inputs. This is particularly confounding as the controller operates flawlessly when I export a proper macOS build of the game from the engine and play it externally. To further investigate, I experimented with Nintendo Switch Joy-Cons, which surprisingly worked seamlessly without encountering any issues during testing within Construct 3's preview mode. Upon reaching out to both Construct 3 and 8BitDo customer support, I was informed that the compatibility restriction with third-party controllers on macOS is due to limitations imposed by Apple, which are beyond their control. Consequently, I sought assistance from Apple Developer Program Support, who directed me to contact the Code-level Technical Support team or inquire on the developer forum. The lack of compatibility with crucial development tools like Construct 3 significantly hampers my workflow as an indie developer, and I'm eager to find a resolution to this issue. Thank you for your time, and I'm hopeful that someone within the community might be able to provide guidance on this matter.
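Not an answer, but a hedged way to narrow down where detection breaks: a small native test with the GameController framework shows what macOS itself sees, independent of Construct 3. The snippet below is a generic sketch, not tied to any particular project:

```swift
import GameController

// List what the system currently exposes, and log new connections.
for controller in GCController.controllers() {
    print("Already connected: \(controller.vendorName ?? "unknown")")
}
NotificationCenter.default.addObserver(forName: .GCControllerDidConnect,
                                       object: nil, queue: .main) { note in
    guard let controller = note.object as? GCController else { return }
    print("Connected: \(controller.vendorName ?? "unknown"), "
          + "extendedGamepad: \(controller.extendedGamepad != nil)")
}
```

If the controller shows up here with a non-nil extendedGamepad but still not in Construct 3's preview, the limitation is more likely in the browser runtime's Gamepad API mapping than in macOS itself.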
4 replies · 0 boosts · 706 views · Mar ’24
How can I make a disco ball shader that shines light on the scene mesh in Vision Pro?
I am trying to make a shader for a disco ball lighting effect for my app. I want the light to reflect onto the scene mesh, and the effect should rotate the dots as the ball spins. I was curious whether anyone has pointers on how to do this in Shader Graph in Reality Composer Pro, or by writing a surface shader. This is the effect in Apple Clips that applies the effect to the scene mesh.
2 replies · 0 boosts · 504 views · Mar ’24
VisionOS RealityKit
I have a plane that is stereoscopic, so it presents depth to the user that extends beyond the plane. I would like the option either to render that per-pixel depth into the depth buffer, or to not write any depth information for the plane at all. I cannot see any option in Shader Graph Material to affect the depth buffer during rendering, and I also cannot see any way in RealityKit to stop an entity from rendering to the depth buffer. I'm open to any suggestions.
1 reply · 0 boosts · 480 views · Mar ’24
Adding Player as an attachment
I'm adding an AVPlayer as an attachment on the side using RealityKit. The video in it, though, is not aligned. Any thoughts on what could be going wrong?

```swift
// Note: `entity`, `player`, and the various model objects are defined
// elsewhere in the original project.
RealityView { content, attachments in
    let url = self.video.resolvedURL
    let asset = AVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)

    var videoPlayerComponent = VideoPlayerComponent(avPlayer: player)
    videoPlayerComponent.isPassthroughTintingEnabled = true
    // entity.components[VideoPlayerComponent.self] = videoPlayerComponent

    entity.position = [0, 0, 0]
    entity.scale *= 0.50

    player.replaceCurrentItem(with: playerItem)
    player.play()

    content.add(entity)
} update: { content, attachments in
    // if content.entities.count < 2 {
    if showAnotherPlayer {
        if let attachment = attachments.entity(for: "Attachment") {
            playerModel.loadVideo(library.selectedVideo!, presentation: .fullWindow)

            // 4. Position the attachment and add it to the RealityViewContent.
            attachment.position = [1.0, 0, 0]
            attachment.scale *= 1.0
            // let radians = -45.0 * Float.pi / 180.0
            // attachment.transform.rotation += simd_quatf(angle: radians, axis: SIMD3<Float>(0, 1, 0))
            let entity = content.entities.first
            attachment.setParent(entity)
            content.add(attachment)
        }
    }
    if showLibrary {
        if let attachment = attachments.entity(for: "Featured") {
            // 4. Position the attachment and add it to the RealityViewContent.
            attachment.position = [0.0, -0.3, 0]
            attachment.scale *= 0.7
            // let radians = -45.0 * Float.pi / 180.0
            // attachment.transform.rotation += simd_quatf(angle: radians, axis: SIMD3<Float>(0, 1, 0))
            let entity = content.entities.first
            attachment.setParent(entity)
            viewModel.attachment = attachment
            content.add(attachment)
        }
    } else {
        if let scene = content.entities.first?.scene {
            let _ = print("found scene")
        }
        if let featuredEntity = content.entities.first?.scene?.findEntity(named: "Featured") {
            let _ = print("featured entity found")
        }
        if let attachment = viewModel.attachment {
            let _ = print("-- removing attachment")
            if let anchor = attachment.anchor {
                let _ = print("-- removing anchor")
                anchor.removeFromParent()
            }
            attachment.removeFromParent()
            content.remove(attachment)
        } else {
            let _ = print("the attachment is missing")
        }
    }
    // }
} attachments: {
    Attachment(id: "Attachment") {
        PlayerView()
            .frame(width: 2048, height: 1024)
            .environment(library)
            .environment(playerModel)
            .onAppear {
                DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
                    playerModel.play()
                }
            }
            .onDisappear {
            }
    }
    if showLibrary {
        Attachment(id: "Featured") {
            VideoListView(title: "Featured",
                          videos: library.videos,
                          cardStyle: .full,
                          cardSpacing: 20) { video in
                library.selectedVideo = video
                showAnotherPlayer = true
            }
            .frame(width: 2048, height: 1024)
        }
    }
}
```

PlayerView
0 replies · 1 boost · 324 views · Mar ’24
Dynamically Loading USDZ Objects from an Array into a Fully Immersive Scene using RealityKit
Hello, I am currently working on a project where I am creating a bookstore visualization with racks and shelves (full immersive view). I have an array of names, each representing a USDZ object present in my working directory. Here's the enum I am trying to iterate over:

```swift
enum AssetName: String, Codable, Hashable, CaseIterable {
    case book1 = "B1"
    case book2 = "B2"
    case book3 = "B3"
    case book4 = "B4"
}
```

And the code I wrote for adding objects:

```swift
import SwiftUI
import RealityKit

struct LocalAssetRealityView: View {
    let assetName: AssetName

    var body: some View {
        RealityView { content in
            if let asset = try? await ModelEntity(named: assetName.rawValue) {
                content.add(asset)
            }
        }
    }
}
```

Now I get this error when I try to add multiple objects on a button click:

Unable to present another Immersive Space when one is already requested or connected

Please suggest any solutions. Also, is there anything I can do to set positions for the objects programmatically?
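The error suggests each LocalAssetRealityView tries to open its own immersive space. A hedged sketch of an alternative, assuming all four USDZ files load fine individually: iterate over AssetName.allCases inside a single RealityView and set positions programmatically (the spacing values are illustrative):

```swift
import SwiftUI
import RealityKit

struct BookshelfRealityView: View {
    var body: some View {
        RealityView { content in
            for (index, asset) in AssetName.allCases.enumerated() {
                if let model = try? await ModelEntity(named: asset.rawValue) {
                    // Space the books 0.3 m apart on x, 1 m up, 2 m in front.
                    model.position = SIMD3<Float>(Float(index) * 0.3, 1.0, -2.0)
                    content.add(model)
                }
            }
        }
    }
}
```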
1 reply · 0 boosts · 458 views · Mar ’24
How do I properly set tagged color data in MTKView and CIContext?
I have provided a test UIKit app which displays three different images side by side, each inside a separate MTKView. Each image is tagged with a different color profile:

- Display P3
- uRGB
- Test RGB (from an image supplied in Apple's ImageApp sample)

I set up default values for all color spaces and formats. I then check whether the image is tagged and, if so, override those values with state from the tagged color space. The variables I am setting:

- workingColorSpace in the Metal CIContext, default = sRGB
- workingFormat in the Metal CIContext, default = RGBAf
- outputColorSpace in the Metal CIContext, default = displayP3
- colorPixelFormat in the MTKView, default = bgra8Unorm
- colorSpace in a CIRenderDestination that I use in the MTKView delegate draw method, default = CGColorSpaceCreateDeviceRGB()

I also set pixelFormat in the CIRenderDestination from MTKView.colorPixelFormat. If the image is tagged, I override the following values with the tagged color space:

- CIContext.workingColorSpace
- CIContext.outputColorSpace
- CIRenderDestination.colorSpace

If the tagged colorSpace.isWideGamutRGB == true, then I set the CIRenderDestination.colorSpace to extendedSRGB (ignoring the color space in the tagged wide-gamut color space) and set colorPixelFormat = bgr10_xr.

Results: the above scenario properly renders the Display P3 image and the uRGB image, but the "Test RGB" image fails. If I do not override CIRenderDestination.colorSpace with a value from the tagged image, then the "Test RGB" image succeeds, but the uRGB image fails to render properly.

Question: do I have everything hooked up correctly and, if so, why does one image fail and the other succeed?

Link to sample project: https://www.dropbox.com/scl/fi/57u2fcrgdvys7jtzykzxt/ColorSpaceTest.zip?rlkey=unjeeiu7mi0wx9wfpylt78nwd&dl=0
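For concreteness, a hedged sketch of the wiring the post describes; `device`, `mtkView`, `image`, `drawableTexture`, and `commandBuffer` are stand-ins rather than the sample project's actual names:

```swift
import CoreImage
import MetalKit

// Defaults, as described above.
let context = CIContext(mtlDevice: device, options: [
    .workingColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!,
    .workingFormat: NSNumber(value: CIFormat.RGBAf.rawValue),
    .outputColorSpace: CGColorSpace(name: CGColorSpace.displayP3)!
])

// Per-image override, as described above.
var destColorSpace = CGColorSpaceCreateDeviceRGB()    // default
if let tagged = image.colorSpace {
    if tagged.isWideGamutRGB {
        mtkView.colorPixelFormat = .bgr10_xr          // extended-range drawable
        destColorSpace = CGColorSpace(name: CGColorSpace.extendedSRGB)!
    } else {
        destColorSpace = tagged
    }
}

let destination = CIRenderDestination(mtlTexture: drawableTexture,
                                      commandBuffer: commandBuffer)
destination.colorSpace = destColorSpace
```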
2 replies · 0 boosts · 489 views · Mar ’24
Error while using JAX
```
Platform 'METAL' is experimental and not all JAX functionality may be correctly supported!
2024-03-23 22:04:38.947506: W pjrt_plugin/src/mps_client.cc:563] WARNING: JAX Apple GPU support is experimental and not all JAX functionality is correctly supported!
Metal device set to: Apple M1 Pro

systemMemory: 16.00 GB
maxCacheSize: 5.33 GB

loc("-":0:0): error: current mps dialect version is 1.0.0, can't parse version 1.1.0
/AppleInternal/Library/BuildRoots/495c257e-668e-11ee-93ce-926038f30c31/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShadersGraph/mpsgraph/MetalPerformanceShadersGraph/Core/Files/MPSGraphExecutable.mm:1097: failed assertion `Error importing MLIR bytecode.'
zsh: abort      python -c 'import jax; print(jax.numpy.arange(10))'
```
2 replies · 1 boost · 478 views · Mar ’24
Can't see video when playing video using VideoPlayerComponent
Hi, I am implementing a player using RealityKit's VideoPlayerComponent and AVPlayer. When the app enters the immersive space, playback begins, but only the audio plays; I can't see the video. Do I need to specify the entity's position and size?

```swift
struct MyApp: App {
    @State private var playerImmersionStyle: ImmersionStyle = .full

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .defaultSize(width: 800, height: 200)

        ImmersiveSpace(id: "playerImmersionStyle") {
            ImmersiveSpaceView()
        }
        .immersionStyle(selection: $playerImmersionStyle, in: playerImmersionStyle)
    }

    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        return UISceneConfiguration(name: "My Scene Configuration",
                                    sessionRole: connectingSceneSession.role)
    }
}

struct PlayerViewEx: View {
    let entity = Entity()

    var body: some View {
        RealityView { content in
            let entity = makeVideoEntity()
            content.add(entity)
        }
    }

    public func makeVideoEntity() -> Entity {
        let url = Bundle.main.url(forResource: "football", withExtension: "mov")!
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()

        var videoPlayerComponent = VideoPlayerComponent(avPlayer: player)
        videoPlayerComponent.isPassthroughTintingEnabled = true
        entity.components[VideoPlayerComponent.self] = videoPlayerComponent
        entity.scale *= 0.4

        player.replaceCurrentItem(with: playerItem)
        player.play()

        return entity
    }
}

#Preview {
    PlayerViewEx()
}
```
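A hedged guess, since VideoPlayerComponent sizes its mesh automatically: in a .full immersive space the scene origin sits at the viewer's feet, so an entity left at the default transform may simply be out of view. Giving it an explicit position before returning it is worth trying (values below are illustrative):

```swift
// Hypothetical placement: 1.5 m up and 2 m in front of the viewer.
entity.position = SIMD3<Float>(0, 1.5, -2)
```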
0 replies · 0 boosts · 255 views · Mar ’24
I deleted the Game Porting Toolkit and wanted to redownload it, then this happened
I had installed the GPTK before, but it popped up some problems, so I want to reinstall it. I deleted Homebrew and redownloaded all the materials I needed, but when it comes to installing the game-porting-toolkit formula, I can't download it. Has anyone had the same problem?

```
jimmy@Jimmymbp14 ~ % brew -v install apple/apple/game-porting-toolkit
Error: Formulae found in multiple taps:
 * apple/apple/game-porting-toolkit-compiler
 * gcenx/apple/game-porting-toolkit-compiler

Please use the fully-qualified name (e.g. apple/apple/game-porting-toolkit-compiler) to refer to a specific formula.
jimmy@Jimmymbp14 ~ %
```
0 replies · 0 boosts · 530 views · Mar ’24
mesh shader and vertex amplification
I tried to render to two layers using vertex amplification in my mesh shader program, but on Vision Pro only the left eye has content, and it contains the image for both eyes. When I change mapping0.renderTargetArrayIndexOffset in the encoder, it does not move the image to the right eye. Can vertex amplification be used to render both eyes?
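For reference, a hedged sketch of the two-view amplification setup I'd expect on the encoder, assuming a layered render target (a texture array with two slices); `renderEncoder` is illustrative, and the offsets are what route each amplified view to its slice:

```swift
import Metal

// One mapping per amplified view: route view 0 to slice 0, view 1 to slice 1.
let viewMappings = [
    MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 0,
                                      renderTargetArrayIndexOffset: 0),
    MTLVertexAmplificationViewMapping(viewportArrayIndexOffset: 0,
                                      renderTargetArrayIndexOffset: 1),
]
renderEncoder.setVertexAmplificationCount(2, viewMappings: viewMappings)
```

In the shader, the per-view index comes from the amplification ID, and the mesh stage must write `[[render_target_array_index]]` for the offsets to take effect.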
1 reply · 0 boosts · 338 views · Mar ’24
Cannot locate Release or Debug native libraries for Apple Unity plugins for iOS.
I am in Unity 2022.3.21f1 using the Apple plugins for Unity with the following versions:

- Apple.Core - 3.1.0
- Apple.Accessibility - 1.1.0
- Apple.GameController - 1.2.0
- Apple.GameKit - 2.2.0

I am on macOS 14.4 (Apple Silicon) and Xcode 15.3. I started working in a project that I hadn't worked on in a while and found that I was getting errors in Unity with the Apple.Accessibility plugin, so I updated the Apple plugins and the errors stopped. However, when I went to build my project (which is just for iOS), I now get the following error for each of the four plugins I have installed:

```
Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Release on iOS.
UnityEngine.Debug:LogError (object)
Apple.Core.AppleNativeLibraryUtility:ProcessWrapperLibrary (string,UnityEditor.BuildTarget,string,UnityEditor.iOS.Xcode.PBXProject) (at ./Library/PackageCache/com.apple.unityplugin.core@ba71bdbec187/Editor/ApplePlugInEnvironment.cs:604)
Apple.GameController.Editor.AppleGameControllerBuildStep:OnProcessFrameworks (Apple.Core.AppleBuildProfile,UnityEditor.BuildTarget,string,UnityEditor.iOS.Xcode.PBXProject) (at ./Library/PackageCache/com.apple.unityplugin.gamecontroller@4ec66225948e/Editor/AppleGameControllerBuildStep.cs:61)
Apple.Core.AppleBuild:OnPostProcessBuild (UnityEditor.BuildTarget,string) (at ./Library/PackageCache/com.apple.unityplugin.core@ba71bdbec187/Editor/AppleBuild.cs:195)
UnityEditor.EditorApplication:Internal_CallGlobalEventHandler () (at /Users/bokken/build/output/unity/unity/Editor/Mono/EditorApplication.cs:493)
```

I downloaded these plugins from GitHub and built them with the build.py script, with no errors in doing so. I've tried rebuilding multiple times, even specifying the platform as Release (although the default is all, so it should have built Release anyway). I've also tried rolling back to previous versions of the plugins with no luck so far; I don't remember which exact versions I used to be on, but I've had no luck with the approximate ones. Does anyone know how I can point Unity to the NativeRelease folders? I've checked that the frameworks for my libraries are there (i.e., at ../Library/PackageCache/com.apple.unityplugin.core@287366a1eaa5/NativeLibraries~/Release/iOS/AppleCoreNative.framework).
2 replies · 0 boosts · 547 views · Mar ’24
Excluding Apple TV 5,3 (Apple TV HD 2015) from the supported device list for our game?
We have built the game on Unreal Engine 4 and optimized it to run on tvOS devices newer than 2017 (viz. Apple TV 4K and above). We could not bring it down to support the Apple TV HD (2015) due to its visual and memory requirements. Is there a way to exclude the Apple TV HD from the supported device list? We couldn't find any required device capability to add to Info.plist (e.g., iphone-ipad-minimum-performance-a12; we tried it, but it does not work for tvOS builds).
0 replies · 1 boost · 419 views · Mar ’24