Hi all,
I have two mysterious issues with saving data to and fetching data from iCloud. Both reproduce only on the first launch of the app.
1. [GKLocalPlayer fetchSavedGamesWithCompletionHandler:]
On the first attempt I see 0 saved games (but I know there is at least one saved game) and there is no error. If I fetch one more time (without any additional actions), even immediately after the first attempt, I receive the saved games correctly (not 0).
2. [GKLocalPlayer saveGameData: withName: completionHandler:]
On the first attempt I get the error "The requested operation could not be completed because the local player has not been authenticated." If I save one more time (without any additional actions), even immediately after the first attempt, the data saves successfully with no error.
I found the same issue on Stack Overflow, but there are no fixes...
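For reference, this is the gating pattern I would expect to avoid the race (a sketch, untested on first launch; presenting the sign-in view controller is left out):

import GameKit

// Only touch saved games once the authenticate handler has fired
// with an authenticated player; calling earlier appears to race
// the first-launch authentication.
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    if let viewController {
        // Present viewController from your root view controller first.
        return
    }
    guard GKLocalPlayer.local.isAuthenticated else { return }

    GKLocalPlayer.local.fetchSavedGames { savedGames, error in
        print("saved games:", savedGames?.count ?? 0, error ?? "no error")
    }
}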
Hello,
I'm writing an EntityAction that animates a material base tint between two different colours. However, the colour actually being set differs in RGB values from the one requested.
For example, setting an end target of R0.5, G0.5, B0.5 results in a value of R0.735357, G0.735357, B0.735357. I can also see during the animation cycle that the intermediate tint values differ from those being set.
My understanding is that the material base colour values are passed as a SIMD4<Float>, so I have a couple of helper extensions to convert a UIColor into this format and to mix between two colours. Note, however, that I don't think the issue is with these functions: even if their outputs were wrong, the final value of the base tint should still match the value being set.
I wondered if this was a colour space issue?
import simd
import RealityKit
import UIKit

typealias Float4 = SIMD4<Float>

extension Float4 {
    // Component-wise linear interpolation between two SIMD4 values.
    func mixedWith(_ value: Float4, by mix: Float) -> Float4 {
        Float4(
            simd_mix(x, value.x, mix),
            simd_mix(y, value.y, mix),
            simd_mix(z, value.z, mix),
            simd_mix(w, value.w, mix)
        )
    }
}

extension UIColor {
    // Extracts the RGBA components into a SIMD4<Float>.
    var float4: Float4 {
        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 0.0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        return Float4(Float(r), Float(g), Float(b), Float(a))
    }
}
struct ColourAction: EntityAction {
    let startColour: SIMD4<Float>
    let targetColour: SIMD4<Float>
    let endedAction: (Entity) -> Void

    var animatedValueType: (any AnimatableData.Type)? { SIMD4<Float>.self }

    init(startColour: UIColor, targetColour: UIColor, endedAction: @escaping (Entity) -> Void = { _ in }) {
        self.startColour = startColour.float4
        self.targetColour = targetColour.float4
        self.endedAction = endedAction
    }

    static func registerEntityAction() {
        ColourAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            // Interpolate between the two endpoints by the normalized time.
            let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(interpolatedColour)
        }
    }
}
extension Entity {
    func updateColour(from currentColour: UIColor, to targetColour: UIColor, duration: Double, endAction: @escaping (Entity) -> Void = { _ in }) {
        let colourAction = ColourAction(startColour: currentColour, targetColour: targetColour, endedAction: endAction)
        if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction, duration: duration, bindTarget: .material(0).baseColorTint) {
            playAnimation(colourAnimation)
        }
    }
}
The EntityAction can only be applied to an entity with a ModelComponent (because of the material), so it can be called like so:
guard
    let modelComponent = entity.components[ModelComponent.self],
    let material = modelComponent.materials.first as? PhysicallyBasedMaterial
else {
    return
}

let currentColour = material.baseColor.tint
let targetColour = UIColor(red: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
entity.updateColour(from: currentColour, to: targetColour, duration: 2)
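One thing I'm experimenting with (building on the Float4 typealias above, and assuming the tint is interpreted in linear space while UIColor components are sRGB-encoded): converting each channel to linear before packing. Notably, sRGB-encoding 0.5 gives roughly 0.7354, which matches the value I'm seeing.

extension UIColor {
    // Hypothesis: the shader-side tint is linear, while getRed(...) returns
    // sRGB-encoded components, so convert each channel before packing.
    var linearFloat4: Float4 {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        func toLinear(_ value: CGFloat) -> Float {
            let c = Float(value)
            return c <= 0.04045 ? c / 12.92 : powf((c + 0.055) / 1.055, 2.4)
        }
        return Float4(toLinear(r), toLinear(g), toLinear(b), Float(a))
    }
}

If the hypothesis holds, using startColour.linearFloat4 / targetColour.linearFloat4 in the ColourAction initialiser should make the final tint read back as the requested values.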
Issue Summary:
In our Flutter application, we utilize Tencent's TRTC API for voice and video communication. While the broadcast functionality operates correctly on Android, it fails to respond on iOS devices. Attempting to initiate a broadcast results in no action, and long-pressing the broadcast button does not reveal the broadcast extension.
Steps to Reproduce:
1. Add Broadcast Upload Extension:
In Xcode, navigate to File > New > Target.
Select Broadcast Upload Extension and add it to the project.
2. Build the Project:
Attempt to build the project.
Encounter the error: "Cycle inside Runner; building could produce unreliable results."
3. Resolve Build Cycle Error:
Go to the project’s Build Phases.
Locate the Embed App Extensions phase.
Move Embed App Extensions just below Copy Bundle Resources.
Ensure the Copy only when installing option is selected.
Rebuild the project; the cycle error is resolved.
4. Test Broadcast Functionality:
Install the app on an iOS device.
Tap the broadcast button; observe no response.
Long-press the screen-recording button in Control Center (the menu pulled down from the top right); the broadcast extension is not listed.
5. Isolate the Issue:
Create a new Flutter project.
Repeat the above steps to add the broadcast upload extension.
The issue persists: broadcast functionality remains unresponsive on iOS.
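For context, on iOS the broadcast has to be started through ReplayKit's system picker; a native sketch of what I believe the plugin should be doing under the hood (the extension bundle ID below is a placeholder for ours):

import ReplayKit
import UIKit

// If preferredExtension is missing or doesn't match the broadcast
// extension's bundle identifier, tapping the picker appears to do nothing.
func makeBroadcastButton() -> RPSystemBroadcastPickerView {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    picker.preferredExtension = "com.example.app.BroadcastExtension" // placeholder
    picker.showsMicrophoneButton = true
    return picker
}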
I searched the Metal Shading Language Specification, Version 3.0, but I cannot find any function for inverting a matrix. Is there really no function in Metal for inverting a matrix?
I often need to do this in linear equations and have so far resorted to writing the necessary function each time, most of the time just copy-and-pasting code.
inverse exists in simd and GLSL, so why not in Metal? It seems so unexpected that this function does not exist that I am almost certain I have just overlooked something obvious. I even tried 1 / M, to no avail.
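Assuming my spec search is right and there is no built-in, the workaround I keep coming back to is inverting on the CPU with simd and passing the result to the shader as a uniform (sketch):

import simd

// Invert on the CPU; simd provides .inverse (simd_inverse in C).
let m = float4x4(
    SIMD4<Float>(1, 0, 0, 0),
    SIMD4<Float>(0, 2, 0, 0),
    SIMD4<Float>(0, 0, 4, 0),
    SIMD4<Float>(3, 0, 0, 1)
)
var mInv = m.inverse
// Then hand it to the kernel, e.g.:
// encoder.setBytes(&mInv, length: MemoryLayout<float4x4>.stride, index: 1)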
I have a scene built in Reality Composer Pro, in which I've added a ParticleEmitter with isEmitting set to false and Loop set to true.
In my app, when I toggle isEmitting to true there can be a delay of a few seconds before the ParticleEmitter starts.
However, if I programmatically add the emitter in code at that point, it starts immediately.
To be clear, I'm seeing this on the visionOS simulator; I don't have access to a device at this time.
Am I misunderstanding how to control the ParticleEmitter when I need precise control over when it starts?
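For reference, this is how I'm toggling it (a sketch; the component is a value type, so it has to be written back):

import RealityKit

if var emitter = entity.components[ParticleEmitterComponent.self] {
    emitter.isEmitting = true
    entity.components.set(emitter)  // reassign, or the change is lost
}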
I am working on a custom resolve tile shader for a client. I see a big difference in performance depending on where we write to:
1- the resolve texture of the color attachment
2- a read-write tile shader texture set via [renderEncoder setTileTexture: myResolvedTexture]
Option 2 is more than twice as slow as option 1.
Our compute shader writes to 4 UAVs, so just using the resolve texture entry is not possible.
Why such a difference when no more data is being written? Can option 2 be as fast as option 1?
I can demonstrate the issue in a modified version of the Multisample code sample.
Given a graph with added obstacles, I want to make a copy of it.
When I make the copy (currentGraph has 20 obstacles added):
var newGraph = currentGraph.copy() as? GKObstacleGraph
newGraph!.removeObstacles([newGraph!.obstacles.first!])
This returns a BAD ACCESS.
I don't understand what's going on or what the problem is.
If I do the same thing with the original graph, there is no problem:
currentGraph.removeObstacles([currentGraph.obstacles.first!])
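A self-contained version of what I'm doing, in case it helps (a sketch with a single square obstacle instead of my 20; untested outside my project):

import GameplayKit

let square: [SIMD2<Float>] = [[-1, -1], [1, -1], [1, 1], [-1, 1]]
let obstacle = GKPolygonObstacle(points: square)
let graph = GKObstacleGraph(obstacles: [obstacle], bufferRadius: 0.5)

// Removing from the original graph is fine:
// graph.removeObstacles([graph.obstacles.first!])

// Removing the same obstacle from a copy crashes:
if let copy = graph.copy() as? GKObstacleGraph {
    copy.removeObstacles([copy.obstacles.first!])  // EXC_BAD_ACCESS here
}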
Thanks for the help
Hi, is there any way to use the front camera to do motion capture?
I want to recognize whether the user raised their hands up with the front camera on iPhone.
I was able to do it with the back camera, but not the front.
Also, if there is any sample code or documentation, I would be super happy.
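In case it clarifies what I'm after, this is the kind of check I'd like to run on front-camera frames (a sketch using Vision, which unlike ARKit body tracking isn't tied to the rear camera; untested):

import Vision

func handsRaised(in pixelBuffer: CVPixelBuffer) throws -> Bool {
    let request = VNDetectHumanBodyPoseRequest()
    try VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up).perform([request])
    guard let body = request.results?.first else { return false }
    let points = try body.recognizedPoints(.all)
    guard let wrist = points[.leftWrist], let shoulder = points[.leftShoulder],
          wrist.confidence > 0.3, shoulder.confidence > 0.3 else { return false }
    // Vision coordinates are normalized with y increasing upward.
    return wrist.location.y > shoulder.location.y
}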
Waiting for your reply!!
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[NSBundle allFrameworks]: unrecognized selector sent to instance'
NS::Bundle* Bundle = NS::Bundle::mainBundle();
Bundle->allFrameworks();
Calls to allFrameworks() and allBundles() throw this exception, but other functions work fine.
I am currently working on a game that involves earning achievements, which I display using the Apple Unity Plug-Ins. I have found that when opening the Game Center dashboard, the last achievement earned is occasionally not shown until the game is closed and reopened. I am using GKAccessPoint.Shared.Trigger to display the achievements screen, which occasionally seems to open a cached version of the dashboard. It seems to happen consistently when earning multiple achievements within one minute, but not always. Does anybody have experience with something like this?
After following the instructions here:
https://developer.apple.com/metal/cpp/
I attempted to build my project, and Xcode presented several errors. In essence, it's complaining about redeclarations in the metal-cpp headers.
NSBundle.hpp and NSError.hpp are included in the metal-cpp/foundation directory from the metal-cpp download.
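In case it's relevant: my understanding is that the *_PRIVATE_IMPLEMENTATION macros must be defined in exactly one translation unit; defining them in more than one .cpp (or in a header) produces exactly this kind of duplicate/redeclaration error. A sketch of that single file (guessing that this is the cause in my setup):

// MetalCppImpl.cpp - the ONLY file that defines these macros.
#define NS_PRIVATE_IMPLEMENTATION
#define CA_PRIVATE_IMPLEMENTATION
#define MTL_PRIVATE_IMPLEMENTATION

#include <Foundation/Foundation.hpp>
#include <QuartzCore/QuartzCore.hpp>
#include <Metal/Metal.hpp>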
Any help in getting these issues resolved is appreciated.
Thanks!
I have a 3D model with morphing animation that works correctly in Blender.
I exported this model as a USDZ file and tried to display it in an Xcode-developed visionOS app, but the morphing animation does not play.
What I Have Tried:
Morphing animation works correctly in Blender.
After exporting to USDZ, the morphing animation does not play in the Xcode app.
Linear motion animations (such as object movement) work fine.
Behavior in Reality Converter:
GLB files do not display.
USDZ files load, but morphing animations do not play.
What I Want to Know:
Does RealityKit support morphing (blend shape) animations?
Is there a way to play morphing animations in an Xcode-developed app?
If RealityKit does not support them, what alternative methods can be used to play them?
I am looking for a way to use the existing animations without recreating them.
Additional Information:
I have both the Blender file (where animations work) and the USDZ file (where animations do not play).
I am developing a visionOS app using Xcode.
Any advice or solutions would be greatly appreciated.
Thank you in advance!
How many 32-bit variables can I use concurrently in a single thread of a Metal compute kernel without worrying about the variables getting spilled into the device memory? Alternatively: how many 32-bit registers does a single thread have available for itself?
Let's say that each thread of my compute kernel needs to store and work with its own array of N float variables, where N can be 128, 256, 512 or more. To achieve maximum possible performance, I do not want the local thread variables to get spilled into the slow device memory. I want all N variables to be stored "on-chip", in the thread memory space.
To make my question more concrete, let's say there is an array thread float localArray[N]. Assuming an unrealistic hypothetical scenario where localArray is the only variable in the whole kernel, what is the maximum value of N for which no portion of localArray would get spilled into the device memory?
I searched in the Metal feature set tables, but I could not find any details.
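The closest indirect signal I've found (not a register count, but how I probe pressure): as per-thread register usage grows, the runtime reduces the pipeline's maxTotalThreadsPerThreadgroup, so comparing it across values of N hints at where spilling starts. A sketch, where myKernel is a placeholder name:

import Metal

let device = MTLCreateSystemDefaultDevice()!
let library = device.makeDefaultLibrary()!
let function = library.makeFunction(name: "myKernel")!  // placeholder kernel
let pipeline = try device.makeComputePipelineState(function: function)

// Lower values suggest higher per-thread register pressure.
print(pipeline.maxTotalThreadsPerThreadgroup, pipeline.threadExecutionWidth)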
Hey folks,
I have a legacy game running OpenGL ES, and it no longer works on the simulators that run on Apple Silicon, i.e. iPhone 15 Pro or the 13" iPads. And yes, I'm also running on Apple Silicon (M1 Max).
The apps work fine on actual devices, but the simulator crashes on any glDrawElements with a stack that looks like the following:
I have not yet seen an announcement about this not working, but I've seen mention in other projects of dropping GL support (https://github.com/maplibre/maplibre-native/issues/2351).
Can anyone shed some light? I'm obviously going to try to fix it, or find a recent sample app to start from and see what might be up. Or move to Metal, but I hadn't bargained for that level of effort atm ;)
Any suggestions appreciated!
Hey all! I got my hands on a refurbished Mac mini M1 and am already diving into Metal. I'm currently studying graphics programming with OpenGL and have got to the point where I can almost create a 3D cube. However, I noticed there aren't many tutorials for metal-cpp, mostly just demos. One thing I love about graphics programming is skinning/skeletal animation. At the moment, I can't find any sources or tutorials on loading skeletal animations with metal-cpp. So, if I create my character in Blender, with all its animations loaded into an .FBX or maybe a .DAE file, and load this into the Metal API with metal-cpp, how do I go about making that work?
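From what I've gathered so far, the core of it is building a joint-matrix palette on the CPU each frame and letting the vertex stage blend with it; a sketch of the CPU side (the loader, e.g. assimp for FBX/DAE, is out of scope, and all names here are my own guesses):

#include <simd/simd.h>
#include <vector>

struct SkinnedVertex {
    simd::float3  position;
    simd::ushort4 jointIndices;  // up to 4 influencing joints
    simd::float4  jointWeights;  // weights summing to 1
};

// Per frame: animated world transform of each joint times its inverse
// bind matrix; upload the palette to an MTL::Buffer. The vertex shader
// then computes:
//   skinned = sum_i weights[i] * palette[jointIndices[i]] * float4(position, 1)
std::vector<simd::float4x4> buildPalette(
    const std::vector<simd::float4x4>& jointWorldTransforms,
    const std::vector<simd::float4x4>& inverseBindMatrices)
{
    std::vector<simd::float4x4> palette(jointWorldTransforms.size());
    for (size_t i = 0; i < palette.size(); ++i)
        palette[i] = simd_mul(jointWorldTransforms[i], inverseBindMatrices[i]);
    return palette;
}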
I have two apps released -- ReefScan and ReefBuild -- that are based on the WWDC21 sample photogrammetry apps for iOS and macOS. Those run fine without LiDAR and are used mostly for underwater models, where LiDAR does not work at all. It now appears that the updated photogrammetry session requires LiDAR data, and building my app with the current Xcode results in a non-working app. Has the "old" version of the photogrammetry session been broken by this update? It worked very well previously, so I would hate to see this regression to needing LiDAR. Most of my users do not have it.
How can I create a beautiful fire animation using Swift?
Which API is best to use?
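For context, the kind of thing I'm considering is a SpriteKit particle emitter (Xcode also ships a SpriteKit Particle File template with a Fire preset). A programmatic sketch of what I have so far, with values still to tune and "spark" a placeholder texture in the bundle:

import SpriteKit

func makeFireEmitter() -> SKEmitterNode {
    let fire = SKEmitterNode()
    fire.particleTexture = SKTexture(imageNamed: "spark")  // placeholder asset
    fire.particleBirthRate = 300
    fire.particleLifetime = 1.2
    fire.particlePositionRange = CGVector(dx: 40, dy: 10)
    fire.emissionAngle = .pi / 2            // emit upward
    fire.particleSpeed = 80
    fire.yAcceleration = 50
    fire.particleAlphaSpeed = -0.8          // fade out over lifetime
    fire.particleScale = 0.5
    fire.particleScaleSpeed = -0.3          // shrink as particles rise
    fire.particleColor = .orange
    fire.particleColorBlendFactor = 1
    fire.particleBlendMode = .add           // additive blending for the glow
    return fire
}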
Hello ladies and gentlemen, I'm writing a simple renderer on the main actor using Metal and Swift 6. I am now at the stage where I want to create a render pipeline state using the asynchronous API:
@MainActor
class Renderer {
    let opaqueMeshRPS: MTLRenderPipelineState

    init(/*...*/) async throws {
        let descriptor = MTLRenderPipelineDescriptor()
        // ...
        opaqueMeshRPS = try await device.makeRenderPipelineState(descriptor: descriptor)
    }
}
I get a compilation error if I try to use the asynchronous version of the makeRenderPipelineState method:
Non-sendable type 'any MTLRenderPipelineState' returned by implicitly asynchronous call to nonisolated function cannot cross actor boundary
Which is understandable, since MTLRenderPipelineState is not Sendable. But it looks like no matter where or how I try to access this method, I just can't: you have this API, but you can't use it; only the synchronous versions work.
Am I missing something or is Metal just not usable with Swift 6 right now?
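For what it's worth, the workaround I'm currently trying is below (a sketch; whether @preconcurrency actually downgrades this diagnostic may depend on the toolchain, and the synchronous throwing makeRenderPipelineState(descriptor:) remains a fallback):

// @preconcurrency relaxes Sendable checking for types imported from Metal.
@preconcurrency import Metal

@MainActor
final class Renderer {
    let opaqueMeshRPS: MTLRenderPipelineState

    init(device: MTLDevice, descriptor: MTLRenderPipelineDescriptor) async throws {
        opaqueMeshRPS = try await device.makeRenderPipelineState(descriptor: descriptor)
    }
}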
Our application tries to read all the resolutions of an external monitor. We have observed that, for the external monitor, there is a mismatch between the resolution list in our application and the one in System Settings. We are using the Apple API CGDisplayCopyAllDisplayModes to read the resolutions.
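A sketch of what we are calling, including the option that, as far as I understand, tells the API to also include the duplicate/low-resolution modes that System Settings lists (untested against our exact setup):

import CoreGraphics

let displayID = CGMainDisplayID()  // substitute the external display's ID
let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
    for mode in modes {
        print(mode.width, mode.height, mode.refreshRate)
    }
}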
I notice some metal-cpp classes have static functions like
static URL* fileURLWithPath(const class String* pPath);
static class ComputePassDescriptor* computePassDescriptor();
static class AccelerationStructurePassDescriptor* accelerationStructurePassDescriptor();
which return a new object.
These classes also provide alloc and init functions to create objects explicitly.
For objects created via alloc and init, I use something like NS::SharedPtr or call release directly to free the memory. But the static functions above don't go through an explicit alloc/init.
I wonder how to correctly free objects created by these static functions. Are they managed by an autorelease pool?
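My current understanding (please correct me if wrong): these convenience constructors follow the Cocoa naming convention, so they return autoreleased objects, whereas alloc/init returns a +1 object you must release. A sketch:

#include <Foundation/Foundation.hpp>
#include <Metal/Metal.hpp>

NS::AutoreleasePool* pool = NS::AutoreleasePool::alloc()->init();

// Autoreleased: owned by the pool, so do not call release() on it yourself.
MTL::ComputePassDescriptor* desc = MTL::ComputePassDescriptor::computePassDescriptor();

// To keep it alive beyond the pool's lifetime, take ownership explicitly:
desc->retain();

pool->release();  // drains the pool; unretained autoreleased objects die here
desc->release();  // balances the retain() above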