Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

Post · Replies · Boosts · Views · Activity

Transferring Apps with iCloud KVS
Hi All! I'm being asked to migrate an app which utilizes iCloud KVS (Key-Value Storage). This ability is a new-ish feature, and the documentation about it is sparse [1]. Honestly, documentation about the new iCloud transfer functionality seems to be missing entirely. Same with Game Center / GameKit. While the docs say that it should work, I'd like to understand the process in more detail.

Has anyone migrated an iCloud KVS app? What happens after the transfer goes through, but before the first release? Do I need to do anything special? I see that the Entitlements file has the Team ID in the key-value store identifier - is that fine?

    <key>com.apple.developer.ubiquity-kvstore-identifier</key>
    <string>$(TeamIdentifierPrefix)$(CFBundleIdentifier)</string>

Can someone please share their experience? Thank you!

[1] https://developer.apple.com/help/app-store-connect/transfer-an-app/overview-of-app-transfer
2 replies · 0 boosts · 1.1k views · May ’23
AI-controlled i-devices dev help
Hey, if I wanted to create an app that takes screenshots on an Apple device (and of any app within it) to give context to an AI so the AI can respond, and the app then parses the response and executes commands on behalf of the AI/user, how would I do that given the rule that "screenshots/captures are not allowed within other apps"? I want to stay within the bounds of the rules in place. Possibilities: AI assistant, AI pals, passive automation.
5 replies · 0 boosts · 1.5k views · May ’23
Game Center leaderboard privacy
I have implemented a standard GKLeaderboard in my app. The leaderboard includes the player's avatar, display name, and score. I only use functionality provided by GameKit, without any custom server functionality. I don't even have my own server. Still, my app got rejected with the following notice:

We noticed that your app does not obtain the user's consent prior to uploading users' scores to a global leaderboard. To collect personal data with your app, you must make it clear to the user that their personal data will be uploaded to your server.

What should I do here? Do I really have to obtain the user's consent before uploading their score to Game Center?
2 replies · 0 boosts · 835 views · May ’23
Unity Game Crashes After GameCenter Login Attempt
We have the Apple Unity Plugins imported into our project. We only include Apple.Core and Apple.GameKit. We tried importing the Empty 3D URP Sample Project, and that project worked on iOS, but when we build our own project we get this error. In the empty project we just import the Apple plugins, nothing about authentication or login to Apple Game Center. But in our main project, we log in to Apple Game Center. When we disabled the Game Center login code, the game opened perfectly, no errors. The main problem is that the app crashes when we attempt to log in to Apple Game Center. Do you have any suggestions?

Unity: 2022.3.1f1 LTS (current latest LTS)
OS: macOS Ventura 13.4 (latest stable)
Xcode: 14.3.1 (latest stable)
Apple Plugins: Apple.Core 1.0.3 - Apple.GameKit 1.0.4 (latest)
4 replies · 1 boost · 1.3k views · Jun ’23
Jax-Metal - error: failed to legalize operation 'mhlo.cholesky'
After building jaxlib as per the instructions and installing jax-metal, when testing an existing model which works fine on CPU (and on GPU on Linux), I get the following error:

jax._src.traceback_util.UnfilteredStackTrace: jaxlib.xla_extension.XlaRuntimeError: UNKNOWN: /Users/adam/Developer/Pycharm Projects/gpy_flow_test/sparse_gps.py:66:0: error: failed to legalize operation 'mhlo.cholesky'
/Users/adam/Developer/Pycharm Projects/gpy_flow_test/sparse_gps.py:66:0: note: called from
/Users/adam/Developer/Pycharm Projects/gpy_flow_test/sparse_gps.py:66:0: note: see current operation: %406 = "mhlo.cholesky"(%405) {lower = true} : (tensor<50x50xf32>) -> tensor<50x50xf32>

I have tried to reproduce this with the following minimal example, but it works fine:

from jax import jit
import jax.numpy as jnp
import jax.random as jnr
import jax.scipy as jsp

key = jnr.PRNGKey(0)
A = jnr.normal(key, (100, 100))

def calc_cholesky_decomp(test_matrix):
    psd_test_matrix = test_matrix @ test_matrix.T
    col_decomp = jsp.linalg.cholesky(psd_test_matrix, lower=True)
    return col_decomp

calc_cholesky_decomp(A)

jitted_calc_cholesky_decomp = jit(calc_cholesky_decomp)
jitted_calc_cholesky_decomp(A)

I am unable to attach the full error message as it exceeds the restrictions placed on uploads attached to a post. I am more than happy to try a more complex model if you have any suggestions.
7 replies · 3 boosts · 1k views · Jun ’23
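An aside on the op named in the error above: 'mhlo.cholesky' is a standard Cholesky factorization, i.e. for a symmetric positive-definite matrix A it computes a lower-triangular L with L · Lᵀ = A. A minimal pure-Python sketch of that computation, independent of JAX and jax-metal (the function name here is illustrative, not part of any API):

```python
import math

def cholesky(a):
    """Cholesky factorization of a symmetric positive-definite matrix.

    Returns lower-triangular L (as nested lists) such that L @ L.T
    reconstructs `a` - the same factorization the 'mhlo.cholesky'
    op performs on-device.
    """
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            # Subtract the contribution of already-computed columns.
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

# Factor a small PSD matrix; L should be [[2, 0], [1, sqrt(2)]].
A = [[4.0, 2.0], [2.0, 3.0]]
L = cholesky(A)
```

If the op fails to legalize on the Metal backend, one possible workaround (untested here) is to force that part of the computation onto the CPU device, e.g. by wrapping the call in `with jax.default_device(jax.devices("cpu")[0]):`.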
iOS 17 AR QuickLook: Support for multiple UV channels
Is there support for using multiple UV channels in AR QuickLook in iOS 17? One important use case would be to put a tiling texture in an overlapping tiling UV set while mapping ambient occlusion to a separate unwrapped, non-overlapping UV set. This is very important for authoring 3D content that combines high-resolution surface detail and high-quality ambient occlusion data while keeping file size to a minimum.
2 replies · 0 boosts · 1.3k views · Jun ’23
Debug symbols in metallib
Hello, I’ve started testing the Metal Shader Converter to convert my HLSL shaders to metallib directly, and I was wondering if the option '-frecord-sources' is supported in any way. Usually I compile my shaders as follows (from Metal):

xcrun -sdk macosx metal -c -frecord-sources shaders/shaders.metal -o shaders/shaders.air
xcrun -sdk macosx metallib shaders/shaders.air -o shaders/shaders.metallib

The -frecord-sources option allows me to see the source when debugging and profiling a Metal frame. With DXC we have a similar option; I can compile a typical HLSL shader with embedded debug symbols with:

dxc -T vs_6_0 -E VSMain shaders/triangle.hlsl -Fo shaders/triangle.dxil -Zi -O0 -Qembed_debug

The important options here are '-Zi' and '-Qembed_debug', as they make sure debug symbols are embedded in the DXIL. It seems that right now Metal Shader Converter doesn't pass through the DXIL debug information, and I was wondering if that is possible. I've looked at all the options in the utility and haven't seen anything that looked like it. Right now, debug symbols in my shaders are a must-have, so I'll explore other routes to convert my HLSL shaders to Metal (I've been testing SPIRV-Cross to do the conversion; I haven't actually tested the debug symbols yet, I'll report back later). Thank you for your time!
4 replies · 0 boosts · 1.2k views · Jun ’23
Buzzing when activate Voice Chat feature from GameKit
I got the sample project from Apple's official documentation (https://developer.apple.com/documentation/gamekit/creating_real-time_games/). That sample project is a simple real-time game where two players are immediately aware of the actions each other takes. I found that one of the players' phones keeps buzzing when voice chat is turned on. Suppose we have player A and player B. When voice chat is activated, player A can clearly hear player B's voice, but player B cannot hear anything except their own voice, which player A can still hear. Occasionally, this situation switches sides. When a player is unable to hear anything, they only hear a constant beeping sound, even though their voice can still be heard by others. What is happening? Has anyone figured out how to solve this?
1 reply · 0 boosts · 555 views · Jun ’23
Sample Code for visionOS Metal?
There is a project tutorial for visionOS Metal rendering in immersive mode here (https://developer.apple.com/documentation/compositorservices/drawing_fully_immersive_content_using_metal?language=objc), but there is no downloadable sample project. Would Apple please provide sample code? The set-up is non-trivial.
4 replies · 1 boost · 2.1k views · Jun ’23
RealityViewContent update
I am working on a project where changes in a window are reflected in a volumetric view which includes a RealityView. I have a shared data model between the window and the volumetric view, but it is unclear to me how I can programmatically refresh the RealityViewContent. Initially I tried holding the RealityViewContent passed from the RealityView closure in the data model, and I also tried embedding a .sink into the closure, but because the RealityViewContent is inout, neither of those works. And changes to the window's contents do not cause the RealityView's update closure to fire. Is there a way to notify the RealityViewContent to update?
4 replies · 0 boosts · 828 views · Jun ’23
How do I resize a new image to an existing Sprite?
I have multiple images that, at various times, I need to use to replace the target image of an SKSpriteNode. Each of these images has a different size. The target SKSpriteNode has a fixed frame that I want to stay fixed. This target is created via:

myTarget = SKSpriteNode(imageNamed: "target")
myTarget.size = CGSize(…)
myTarget.physicsBody = SKPhysicsBody(rectangleOf: myTarget.size)

How do I resize each of the images so that each fills up the target frame (expanding or contracting as needed)? Pretend the target is a shoebox and each image is a balloon that expands or contracts to fill the shoebox. I have tried the following, which fails; that is, it changes the size of the target to fit the new image. In short, it does the exact opposite of what I want.

let newTexture = SKTexture(imageNamed: newImage)
let changeImgAction = SKAction.setTexture(newTexture, resize: true)
myTarget.run(changeImgAction)

Again: keep the frame of myTarget fixed and change the size of newTexture to fit that frame.
1 reply · 0 boosts · 669 views · Jun ’23
RealityKit visionOS anchor to POV
Hi, is there a way in visionOS to anchor an entity to the POV via RealityKit? I need an entity which is always fixed to the 'camera'. I'm aware that this is discouraged from a design perspective as it can be visually distracting. In my case, though, I want to use it to attach a fixed collider entity, so that the camera can collide with objects in the scene.

Edit: ARView on iOS has a lot of very useful helper properties and functions, like cameraTransform (https://developer.apple.com/documentation/realitykit/arview/cameratransform). How would I get this information on visionOS? RealityView's content does not seem to offer anything comparable. An example use case: I would like to add an entity to the scene at my user's eye level, which depends on their height. I found https://developer.apple.com/documentation/realitykit/realityrenderer, which has an activeCamera property, but so far it's unclear to me in which context RealityRenderer is used and how I could access it. I appreciate any hints, thanks!
7 replies · 6 boosts · 2.9k views · Jun ’23
AVFoundation with lidar and this year's RealityKit Object Capture.
With AVFoundation's builtInLiDARDepthCamera, if I save photo.fileDataRepresentation to HEIC, it only has Exif and TIFF metadata. But RealityKit Object Capture's HEIC image has not only Exif and TIFF, but also HEIC metadata, including camera calibration data. What should I do so that AVFoundation's exported image has the same metadata?
2 replies · 0 boosts · 1.1k views · Jun ’23
iOS 17 SceneKit normalmap & morphtarget causes lighting/shading issue
After the iOS 17 update, objects rendered in SceneKit that have both a normal map and morph targets do not render correctly. The shading and lighting appear dark and without reflections. Using a normal map without morph targets, or having morph targets on an object without a normal map, works fine. However, the combination of the two breaks the rendering. (Screenshots: one using diffuse, normal map, and a morpher; one using diffuse and normal with no morpher.)
5 replies · 1 boost · 1.7k views · Jun ’23
Game porting toolkit build error
I followed the instructions from this page https://www.applegamingwiki.com/wiki/Game_Porting_Toolkit#Game_compatibility_list to install the toolkit. Everything worked fine until I ran the command brew -v install apple/apple/game-porting-toolkit. It threw the following error:

Error: apple/apple/game-porting-toolkit 1.0 did not build
Logs:
/Users/user_name/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/user_name/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
/Users/user_name/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/user_name/Library/Logs/Homebrew/game-porting-toolkit/02.make.cc
/Users/user_name/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/user_name/Library/Logs/Homebrew/game-porting-toolkit/02.make
Do not report this issue to Homebrew/brew or Homebrew/homebrew-core!
Error: You are using macOS 14. We do not provide support for this pre-release version. It is expected behaviour that some formulae will fail to build in this pre-release version. It is expected behaviour that Homebrew will be buggy and slow. Do not create any issues about this on Homebrew's GitHub repositories. Do not create any issues even if you think this message is unrelated. Any opened issues will be immediately closed without response. Do not ask for help from Homebrew or its maintainers on social media. You may ask for help in Homebrew's discussions but are unlikely to receive a response. Try to figure out the problem yourself and submit a fix as a pull request. We will review it but may or may not accept it.

Did anyone have the same issue and find a fix for it?
27 replies · 3 boosts · 25k views · Jun ’23
Original Reality Composer (non pro) in Vision OS
When I create a USDZ file from the original Reality Composer (non-Pro) and view it in the visionOS simulator, the transforms and rotations don't look the same. For example, a simple Tap and Flip behaviour does not rotate the same way in visionOS. Should we regard Reality Composer as discontinued software and only work with Reality Composer Pro? Hopefully Apple will combine the features from the original RC into the new RC Pro!
1 reply · 1 boost · 734 views · Jul ’23
MTKView fullscreen stutter
Hello, when I do Metal drawing into an MTKView in full screen, there seems to be an issue with frame scheduling. There is visible stutter, and the Metal HUD shows the frame rate jittering about. This happens in Ventura and Sonoma b2 on my MacBook Pro. Here's a really minimal example, not even actively drawing anything, just presenting the drawable:

#import "ViewController.h"
#import <Metal/Metal.h>
#import <MetalKit/MetalKit.h>

@interface ViewController () <MTKViewDelegate>
@property (nonatomic, weak) IBOutlet MTKView *mtkView;
@property (nonatomic) id<MTLDevice> device;
@property (nonatomic) id<MTLCommandQueue> commandQueue;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    _device = MTLCreateSystemDefaultDevice();
    _mtkView.device = _device;
    _mtkView.delegate = self;
    _commandQueue = [_device newCommandQueue];
}

- (void)drawInMTKView:(MTKView *)view {
    MTLRenderPassDescriptor *viewRPD = view.currentRenderPassDescriptor;
    if (viewRPD) {
        id<MTLCommandBuffer> commandBuffer = [_commandQueue commandBuffer];
        [commandBuffer presentDrawable:view.currentDrawable];
        [commandBuffer commit];
    }
}

- (void)mtkView:(MTKView *)view drawableSizeWillChange:(CGSize)size {
    NSLog(@"%@", NSStringFromSize(size));
}

@end

Looks like there's some collision between the display and render timers, or something. What gives? I would like to be able to render stutter-free on this very nice machine. How would I go about that?
12 replies · 0 boosts · 1.4k views · Jul ’23