Delve into the world of graphics and game development. Discuss creating stunning visuals, optimizing game mechanics, and sharing resources for game developers.


Post | Replies | Boosts | Views | Activity

GKLocalPlayer.Local.FetchItems() Task Error on Unity
We are using the Apple Unity plug-in (GameKit) to authorize the player using a Game Center account. To get the player info we run a task from the plug-in: var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems(); But when this code runs, there is an error in the Xcode application on the Mac. The error is the following: Thread 1: EXC_BAD_ACCESS (code=257, address=0x2)
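For reference, the plug-in call wraps GameKit's identity-verification API. On the native side, the local player is typically authenticated before fetching the signature items; below is a minimal Swift sketch of that order of operations, hedged since the crash may have another cause entirely:

import GameKit

func fetchIdentityItems() {
    // Authenticate first; fetching items for an unauthenticated player is a common failure mode.
    GKLocalPlayer.local.authenticateHandler = { viewController, error in
        guard error == nil, GKLocalPlayer.local.isAuthenticated else { return }
        Task {
            do {
                // Returns the public key URL, the signature, the salt, and a timestamp.
                let (publicKeyURL, signature, salt, timestamp) =
                    try await GKLocalPlayer.local.fetchItemsForIdentityVerificationSignature()
                print(publicKeyURL, signature.count, salt.count, timestamp)
            } catch {
                print("fetchItems failed: \(error)")
            }
        }
    }
}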
15
0
3.3k
Dec ’22
RealityKit sometimes renders using only the green channel
Hi, we're working on an app that uses RealityKit to present products in the customer's home. We have a bug where, every once in a while (somewhere around 1 in 100 runs), all entities are rendered using the green channel only (see image below). It seems to occur in all entities in the ARView, regardless of model or material type. Due to the flaky nature of this bug I have found it really hard to debug, and I can't rule out an internal issue in RealityKit. Did anyone run into similar issues, or have any hints about where to look for the culprit?
6
0
1.4k
Jan ’23
SKAttributes do not work as expected with SKShapeNode instance
The documentation suggests that it should be possible to use a single shader with multiple instances of an SKNode, such that each instance uses the unique SKAttributes passed to it. Let's try that with an SKShapeNode. This is the fragment shader testFill.fsh, which simply colors based on the value of a_test:

void main() {
    gl_FragColor = vec4(vec3(a_test), 1.0);
}

And here we make two nodes, testNode0 and testNode1, each using the same shader but with a different value for a_test:

class GameScene: SKScene {
    override func didMove(to view: SKView) {
        let testShader = shaderWithFilename("testFill", fileExtension: "fsh", uniforms: [])
        testShader.attributes = [
            SKAttribute(name: "a_test", type: .float)
        ]

        let testNode0 = SKShapeNode(rect: CGRect(x: 0.0, y: 0.0, width: 100.0, height: 100.0))
        testNode0.fillShader = testShader
        testNode0.position = CGPoint(x: -100, y: 300)
        testNode0.setValue(SKAttributeValue(float: 0.2), forAttribute: "a_test")
        addChild(testNode0)

        let testNode1 = SKShapeNode(rect: CGRect(x: 0.0, y: 0.0, width: 100.0, height: 100.0))
        testNode1.fillShader = testShader
        testNode1.position = CGPoint(x: 100, y: 300)
        testNode1.setValue(SKAttributeValue(float: 0.8), forAttribute: "a_test")
        addChild(testNode1)
    }
}

Here is the result: the squares are the same color, specifically the result of passing the second value, 0.8, for a_test. Now, let's try the same thing with SKSpriteNode:

class GameScene: SKScene {
    override func didMove(to view: SKView) {
        let testShader = shaderWithFilename("testFill", fileExtension: "fsh", uniforms: [])
        testShader.attributes = [
            SKAttribute(name: "a_test", type: .float)
        ]

        let testNode0 = SKSpriteNode()
        testNode0.size = CGSize(width: 100.0, height: 100.0)
        testNode0.shader = testShader
        testNode0.position = CGPoint(x: -100, y: 300)
        testNode0.setValue(SKAttributeValue(float: 0.2), forAttribute: "a_test")
        addChild(testNode0)

        let testNode1 = SKSpriteNode()
        testNode1.size = CGSize(width: 100.0, height: 100.0)
        testNode1.shader = testShader
        testNode1.position = CGPoint(x: 100, y: 300)
        testNode1.setValue(SKAttributeValue(float: 0.8), forAttribute: "a_test")
        addChild(testNode1)
    }
}

And it works! Why does the documentation not say that this is not possible with an SKShapeNode? Why does an SKShapeNode have a setValue(_:forAttribute:) method if it does not function as expected? Is this a bug, or something that is expected to be obvious? I am not sure, but I am sharing this in case somebody else ends up stuck on this issue as I was when otherwise trying to do something relatively straightforward. The solution (if your shape is a simple rect, as it is in my case) is to initialize an empty SKSpriteNode and size it accordingly, after which SKAttributes should work as expected. Apple, please either fix this or update the documentation.
2
2
1.4k
Jan ’23
How do you incorporate WKInterfaceSKScene into a watchOS Starter Template App?
I'm trying to create an Apple Watch game using Xcode 14.2 and watchOS 9. Getting started creating a watchOS app seems pretty straightforward, and creating a game project via the starter template seems easy enough. Putting the two together, though, doesn't seem to work (or is not straightforward). The documentation specifies limitations on what libraries can be used with watchOS, noting WKInterfaceSKScene, but doesn't give any specific examples of how to start a watchOS project using it. Additionally, nearly every online tutorial I'm able to find uses storyboards to create a watchOS game, which does not seem to be supported in the latest version of watchOS or Xcode. Can anyone provide example starter code, using the watchOS App project starter, that loads with a small colored square on the screen that moves from left to right using WKInterfaceSKScene? I've tried the Apple documentation, asking ChatGPT for a sample or reference links, and various tutorials on YouTube and elsewhere.
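In case it helps: on a SwiftUI-lifecycle watchOS target, SpriteKit content is typically hosted through SpriteView (watchOS 7+) rather than WKInterfaceSKScene, which belongs to the storyboard-based WatchKit world. A minimal sketch, assuming a SwiftUI watchOS App target; the type names and sizes are illustrative:

import SwiftUI
import SpriteKit

// A scene that slides a small colored square from left to right, forever.
class SquareScene: SKScene {
    override func didMove(to view: SKView) {
        backgroundColor = .black
        let square = SKSpriteNode(color: .red, size: CGSize(width: 20, height: 20))
        square.position = CGPoint(x: 0, y: size.height / 2)
        addChild(square)
        let slide = SKAction.moveTo(x: size.width, duration: 2.0)
        let reset = SKAction.moveTo(x: 0, duration: 0)
        square.run(.repeatForever(.sequence([slide, reset])))
    }
}

@main
struct SquareGameApp: App {
    var body: some Scene {
        WindowGroup {
            SpriteView(scene: {
                let scene = SquareScene()
                scene.size = CGSize(width: 180, height: 180)
                scene.scaleMode = .aspectFit
                return scene
            }())
        }
    }
}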
2
1
975
Feb ’23
GKLocalPlayer.FetchItems BAD ACCESS
We are using the Apple Unity plug-in (GameKit) to authorize players using a Game Center account. To get the player info we run a task from the plug-in: var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems(); But when this code runs, there is an error in Xcode. The error is the following: Thread 1: EXC_BAD_ACCESS (code=257, address=0x2)

Environment:
macOS 12.5 Monterey
Unity 2021.3.4f1
Xcode 14.2
AppleCore Unity package 1.0.2
AppleGameKit Unity package 1.0.3

Crashes when calling FetchItems in Unity, installed on an iPhone XR (iOS 15.2.1).
3
0
1.3k
Feb ’23
OpenGL on future iPhones and Macs?
Hello everyone! After some time to think about how I'll proceed with a graphics API, I figured OpenGL will be my first, since I'm completely new to graphics programming. As you may find in my last post, I was talking about MoltenVK and might just use Metal instead, along with the demos I found using Metal. So for now, and I know this has been said MANY TIMES, Apple has deprecated OpenGL, but I wish to use it because I'm new to graphics programming and want to develop an app (a rendering engine, really) for the iPhone 14 Pro Max and macOS Ventura 13.2 (I think this is the latest). So what do you think? Can I still use OpenGL ES on the 14 Pro Max, along with OpenGL 4+ on the latest macOS, even though they are deprecated?
15
0
5.5k
Feb ’23
Reality Composer Exporting USDZ files - slow animation
It seems that something must have changed in Reality Composer's export-to-USDZ feature. Importing any animated .usdz file into Reality Composer and then exporting it reduces the playback frame rate to about 30%. The same file imported and then exported as a .reality file plays back just fine. Is anyone else experiencing this issue? It's happening for every USDZ file imported, and across 2 different Apple laptops running the software.
2
1
1.5k
Feb ’23
Core Haptics Unity Errors In Xcode
I am trying to get the Apple Core Haptics plug-in to work with Unity but am having issues when I try to build the project. I created a completely blank project with just the Core and CoreHaptics plug-ins installed. I then put in a single script (below), which is a replication of the script shared in the WWDC22 video on this subject. When I try to build and run that project I get a series of "Undefined symbol:" errors in Xcode. These affect the CoreHaptics and UIFeedbackGenerator frameworks. If I remove the script and just build an empty project with Core and CoreHaptics installed, the build runs successfully. What am I doing wrong or missing that is causing these errors?

using Apple.CoreHaptics;
using System.Collections;
using UnityEngine;

public class Haptics : MonoBehaviour
{
    private CHHapticEngine _hapticEngine;
    private CHHapticPatternPlayer _hapticPlayer;
    [SerializeField] private AHAPAsset _hapticAsset;

    private void PrepareHaptics()
    {
        _hapticEngine = new CHHapticEngine();
        _hapticEngine.Start();
        _hapticPlayer = _hapticEngine.MakePlayer(_hapticAsset.GetPattern());
    }

    private void Play()
    {
        _hapticPlayer.Start();
    }
}
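If it helps to isolate whether the pattern or the Unity build is at fault, here is a minimal sketch of the native CoreHaptics calls the plug-in wraps, in Swift; the AHAP file name is a placeholder:

import CoreHaptics
import Foundation

final class HapticsPlayer {
    private var engine: CHHapticEngine?

    func prepareAndPlay() {
        // Haptics are unavailable on some devices (and in the simulator).
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        do {
            let engine = try CHHapticEngine()
            try engine.start()
            self.engine = engine
            // "Pattern.ahap" is a placeholder asset name.
            if let url = Bundle.main.url(forResource: "Pattern", withExtension: "ahap") {
                try engine.playPattern(from: url)
            }
        } catch {
            print("Haptics failed: \(error)")
        }
    }
}

That said, "Undefined symbol" errors at link time usually point at a missing framework reference in the generated Xcode project rather than at the script itself.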
3
0
1.2k
Mar ’23
Metal Shader Library - invalid UUID
Hi, I am generating a Metal library that I build using the command line tools on macOS for iphoneos, following the instructions here. I then serialise it to a binary blob that I load at runtime, which seems to work: everything renders as expected. But when I take a frame capture and open a shader function, it tries to load the symbols and fails. I tried pointing it to the directory (and the file) containing the symbols file, but it never resolves them. In the bottom half of the Import External Sources dialogue there is one entry in the Library | Debug Info section: the library name is Library 0x21816b5dc0, and below Debug Info it says Invalid UUID. The validation layer doesn't flag any invalid behaviour, so I am a bit lost and not sure what to try next.
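A possible direction, offered as an assumption since the exact build commands aren't shown: for the frame capture to resolve shader symbols, the library needs its sources recorded at compile time and a companion symbol file whose UUID matches the shipped metallib. With recent command line tools the documented flow is roughly the following; file names are illustrative:

# Compile the library with source info recorded (illustrative paths)
xcrun -sdk iphoneos metal -frecord-sources Shaders.metal -o Shaders.metallib

# Extract a flat .metallibsym companion file and strip the sources from the shipping library
xcrun -sdk iphoneos metal-dsymutil -flat -remove-source Shaders.metallib

If the metallib is rebuilt after the .metallibsym was generated, the UUIDs no longer match, which would plausibly show up as Invalid UUID in the capture UI; regenerating both from the same build may be worth trying.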
1
0
855
Mar ’23
Modify the ProRAW pixel buffer to write a modified DNG
Hello, in one of my apps I'm trying to modify the pixel buffer from a ProRAW capture and then write the modified DNG. This is what I try to do: after capturing a ProRAW photo, I work in the delegate function:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) { ... }

In here I can access photo.pixelBuffer and get its base address:

guard let buffer = photo.pixelBuffer else { return }
CVPixelBufferLockBaseAddress(buffer, [])
let pixelFormat = CVPixelBufferGetPixelFormatType(buffer)

// I check that the pixel format corresponds with ProRAW. This is successful; the code enters the if block
if pixelFormat == kCVPixelFormatType_64RGBALE {
    guard let pointer = CVPixelBufferGetBaseAddress(buffer) else { return }

    // We have 16 bits per component, 4 components
    let count = CVPixelBufferGetWidth(buffer) * CVPixelBufferGetHeight(buffer) * 4
    let mutable = pointer.bindMemory(to: UInt16.self, capacity: count)

    // As a test, I want to replace all pixels with 65000 to get a white image
    let finalBufferArray: [Float] = Array(repeating: 65000, count: count)
    vDSP_vfixu16(finalBufferArray, 1, mutable, 1, vDSP_Length(finalBufferArray.count))

    // I create a vImage pixel buffer. Note that I'm referencing photo.pixelBuffer to be sure
    // that I modified the underlying pixelBuffer of the AVCapturePhoto object
    let imageBuffer = vImage.PixelBuffer<vImage.Interleaved16Ux4>(referencing: photo.pixelBuffer!, planeIndex: 0)

    // Inspect the CGImage
    let cgImageFormat = vImage_CGImageFormat(bitsPerComponent: 16,
                                             bitsPerPixel: 64,
                                             colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!,
                                             bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue | CGBitmapInfo.byteOrder16Little.rawValue))!
    let cgImage = imageBuffer.makeCGImage(cgImageFormat: cgImageFormat)!

    // I send the CGImage to the main view controller. This is successful: I can see a white image
    // when rendering the CGImage into a UIImage. This lets me think that I successfully modified photo.pixelBuffer
    firingFrameDelegate?.didSendCGImage(image: cgImage)
}

// Now I try to write the data. Unfortunately, this does not work: photo.fileDataRepresentation()
// writes the data corresponding to the original, unmodified pixelBuffer
if let photoData = photo.fileDataRepresentation() {
    // Sending the data to the view controller and rendering it in the UIImage displays
    // the original photo, not the modified pixelBuffer
    firingFrameDelegate?.didSendData(data: photoData)
    thisPhotoData = photoData
}

CVPixelBufferUnlockBaseAddress(buffer, [])

The same happens if I try to write the data to disk: the DNG file displays the original photo, not the data corresponding to the modified photo.pixelBuffer. Do you know why this code does not work? Do you have any ideas on how I can modify the ProRAW pixel buffer so that I can write the modified buffer into a DNG file? My goal is to write a modified file, so I'm not sure I can use CoreImage or vImage to output a ProRAW file.
1
0
1.1k
Mar ’23
I will quit Apple development: no profit from my developer account, unfortunately, while the other companies I work with do bring me profit!
Hello community, this post is to express my complete dissatisfaction with Apple, especially around developing games for Apple's platforms. There is a lack of support for it, for example for some new gaming technologies, and still there is no profit or value from all the work and money invested in developing for it. I will close my journey with Apple very unsatisfied. I'm going to give my business opportunities to other platforms that are really worth it and that support all the new technologies in gaming. And yes, Apple hurt other game makers with its new services like Arcade, and there seems to be no future for gaming on Apple's platforms. I quit. Goodbye and good luck to everyone.
3
0
1.4k
Mar ’23
Game Center issue on iOS 16.4
An issue appeared on iOS 16.4 when presenting GKMatchmakerViewController with matchmakingMode set to inviteOnly. The view controller appears with the invite option as expected, but when I tap it, the GKMatchmakerViewController disappears immediately. There was no problem on previous iOS versions. It also works when matchmakingMode is not set at all.
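For reference, a minimal sketch of the presentation path described above, assuming a plain two-player match request; the function and parameter names are illustrative:

import GameKit
import UIKit

func presentInviteOnlyMatchmaker(from presenter: UIViewController & GKMatchmakerViewControllerDelegate) {
    let request = GKMatchRequest()
    request.minPlayers = 2
    request.maxPlayers = 2

    guard let matchmaker = GKMatchmakerViewController(matchRequest: request) else { return }
    matchmaker.matchmakingMode = .inviteOnly // the mode reported to dismiss on tap in iOS 16.4
    matchmaker.matchmakerDelegate = presenter
    presenter.present(matchmaker, animated: true)
}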
7
2
1.6k
Mar ’23
Can I run CatBoost/XGBoost on my GPU(s) on my Mac?
I'm interested in using CatBoost and XGBoost for some machine learning projects on my Mac, and I was wondering if it's possible to run these algorithms on my GPU(s) to speed up training. I have a Mac with AMD Radeon Pro 5600M and Intel UHD Graphics 630 GPUs, and I'm running macOS Ventura 13.2.1. I've read that both CatBoost and XGBoost support GPU acceleration, but I'm not sure whether that's possible on my system. Can anyone point me in the right direction for getting started with GPU-accelerated CatBoost/XGBoost on macOS? Are there any specific drivers or tools I need to install, or any other considerations I should be aware of? Thank you.
1
0
2.1k
Apr ’23
MPSNDArrayConvolutionA14.mm:3967: failed assertion `destination kernel width and filter kernel width mismatch'
Hi, I am training an adversarial autoencoder using PyTorch 2.0.0 on an Apple M2 (Ventura 13.1), with conda 23.1.0 as the environment manager. I encountered this error:

/AppleInternal/Library/BuildRoots/5b8a32f9-5db2-11ed-8aeb-7ef33c48bc85/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShaders/MPSNDArray/Kernels/MPSNDArrayConvolutionA14.mm:3967: failed assertion `destination kernel width and filter kernel width mismatch'
/Users/vk/miniconda3/envs/betavae/lib/python3.10/multiprocessing/resource_tracker.py:224: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown

To my knowledge, the code breaks when running self.manual_backward(loss["g_loss"]) in this block:

g_opt.zero_grad()
self.manual_backward(loss["g_loss"])
g_opt.step()

The same code runs without problems on a Linux distribution. Any thoughts on how to fix it are highly appreciated!
2
0
1.3k
Apr ’23
How To Resize An Image and Retain Wide Color Gamut
I'm trying to resize NSImages on macOS. I'm doing so with an extension like this:

extension NSImage {

    // MARK: Resizing

    /// Resize the image to the given size.
    ///
    /// - Parameter targetSize: The size to resize the image to.
    /// - Returns: The resized image.
    func resized(toSize targetSize: NSSize) -> NSImage? {
        let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
        guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
            return nil
        }
        let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
            return representation.draw(in: frame)
        })
        return image
    }
}

The problem is that, as far as I can tell, the image that comes out of the drawing handler has lost the color profile of the original image rep. I'm testing it with a wide color gamut image, attached; this becomes pure red when examining the result. If this were UIKit, I guess I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range, so I suspect I need to use NSGraphicsContext here to do the rendering, but I can't see what to set on it to make it use wide color, or how I'd use it.
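One direction that may help, sketched under the assumption that the source image exposes a CGImage: render through a CGContext created with the source's own color space and 16 bits per component, instead of the default drawing-handler path. Untested against the attached image; the method name is illustrative:

import AppKit

extension NSImage {
    /// Resize while drawing into a bitmap context that keeps the source color space.
    func resizedPreservingColorSpace(to targetSize: NSSize) -> NSImage? {
        guard let cgImage = self.cgImage(forProposedRect: nil, context: nil, hints: nil) else { return nil }
        // Fall back to Display P3 if the source has no color space attached (assumption).
        let colorSpace = cgImage.colorSpace ?? CGColorSpace(name: CGColorSpace.displayP3)!
        guard let context = CGContext(data: nil,
                                      width: Int(targetSize.width),
                                      height: Int(targetSize.height),
                                      bitsPerComponent: 16, // keep more than 8 bits for wide gamut
                                      bytesPerRow: 0,
                                      space: colorSpace,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return nil }
        context.interpolationQuality = .high
        context.draw(cgImage, in: CGRect(origin: .zero, size: targetSize))
        guard let scaled = context.makeImage() else { return nil }
        return NSImage(cgImage: scaled, size: targetSize)
    }
}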
2
0
827
Apr ’23
Frames out of order using AVAssetWriter
We are using AVAssetWriter to write videos using both HEVC and H.264 encoding. Occasionally, we get reports of choppy footage in which frames appear out of order when played back on a Mac (QuickTime) or iOS device (stock Photos app). This occurs extremely unpredictably, often not starting until 20+ minutes of filming, but occasionally happening as soon as filming starts. Interestingly, users have reported that the issue goes away while editing or viewing on a different platform (e.g. Linux) or in the built-in Google Drive player, but comes back as soon as the video is exported or downloaded again. When this occurs in an HEVC file, converting to H.264 seems to resolve it. I haven't found a similar fix for H.264 files. I suspect an AVAssetWriter encoding issue but haven't been able to uncover the source.

Running a stream analyzer on HEVC files with this issue reveals the following error:

Short-term reference picture with POC = [some number] seems to have been removed or not correctly decoded.

However, running a stream analyzer on H.264 files with the same playback issue shows nothing wrong.

At a high level, our video pipeline looks something like this:

1. Grab a sample buffer in captureOutput(_ captureOutput: AVCaptureOutput!, didOutputVideoSampleBuffer sampleBuffer: CMSampleBuffer!)
2. Perform some Metal rendering on that buffer
3. Pass the resulting CVPixelBuffer to the AVAssetWriterInputPixelBufferAdaptor associated with our AVAssetWriter (see the sketch below)

Example files can be found here: https://drive.google.com/drive/folders/1OjDZ3XaC-ubD5hyDiNvMQGl2NVqZbWnR?usp=sharing This includes a video file suffering this issue, the same file fixed after converting to mp4, and a screen recording of the distorted playback in QuickTime. Can anyone help point me in the right direction to solving this issue? I can provide more details as necessary.
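Since out-of-order playback often traces back to presentation timestamps rather than to the encoder itself, one thing worth checking in step 3 is that every pixel buffer is appended with the timestamp of the sample buffer it was rendered from. A minimal sketch of that append path, with hypothetical plumbing names:

import AVFoundation
import CoreMedia

func append(rendered pixelBuffer: CVPixelBuffer,
            from sampleBuffer: CMSampleBuffer,
            via adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    // Carry the capture timestamp through the Metal pass; reusing or reordering
    // timestamps is enough to make frames play back out of order.
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    guard adaptor.assetWriterInput.isReadyForMoreMediaData else { return } // drop rather than block
    adaptor.append(pixelBuffer, withPresentationTime: pts)
}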
4
2
1.2k
May ’23
Did iOS 16's SceneKit change something about handling pixel format?
Hi, my app displays a video as a texture on a SceneKit SCNNode. I've just found that some videos look different in iOS 16.4 than in previous versions: the videos look paler than they should. I looked up some documents and set SCNDisableLinearSpaceRendering to true in Info.plist, and those videos then look exactly as they should, but the problem is that other videos, which already looked fine, now turn out different. In any case, it seems related to linear versus gamma color spaces, per this answer (https://developer.apple.com/forums/thread/710643). The problematic videos clearly have some different color space setting. I am not really an expert in this field; where should I start digging? Or how can I just make iOS 16.4 behave the same as previous versions? Everything worked well for all the videos then. What was actually updated?
1
0
725
May ’23