Hi,
when compiling shaders, the metal command-line tool has more options than MTLDevice::newLibraryWithSource().
For instance, "man metal" mentions ten optimization levels (-O0, -O1, -O2, -O3, -Ofast, -Os, ...), while the MTLCompileOptions documentation only shows two (Default, Size).
Is there a way to pass -O2 as the optimization level to MTLDevice::newLibraryWithSource()?
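For reference, the only levels I can find at runtime are the two exposed by MTLCompileOptions.optimizationLevel (macOS 13 / iOS 16 and later). A minimal sketch, assuming shaderSource holds valid MSL:

```swift
import Metal

let shaderSource = "kernel void k() {}"   // placeholder MSL source

let options = MTLCompileOptions()
options.optimizationLevel = .default      // the documented default level
// options.optimizationLevel = .size      // the documented size level (-Os)

let device = MTLCreateSystemDefaultDevice()!
let library = try device.makeLibrary(source: shaderSource, options: options)
```

Nothing here seems to correspond to -O2, hence the question.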
Thanks
Good afternoon. I had a developer account, and for years I developed several gaming applications for Apple.
A few years ago I looked at my old developer email and found a confirmation of a payment from Apple to me for the profit I earned on my apps. Could you check whether Apple left any payments pending on my old developer account?
I would appreciate an answer, because even my old games were removed from the App Store.
Thank you!
In our multiplayer game prototype, we experience a ping of 300 ms (at best) when using Game Center and GKMatch to send data between players, over the GKMatch.SendDataMode.unreliable channel. This latency is not suitable for a real-time game.
When we tested alternative services like Unity's Relay under identical conditions (location, devices, and Wi-Fi), we achieved a ping of 120 ms.
Is a ping value of 300 ms typical when using Game Center?
I can think of possible reasons in case it's not typical, but I can't be sure:
Is there different behavior (servers relaying peer-to-peer connections) when the game is not yet released on the store?
We're in Europe; maybe this latency is normal in Europe and better in the US?
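For reference, this is roughly how we measure the round trip (a sketch with our own names; it assumes the remote peer echoes each packet back unchanged):

```swift
import GameKit

final class PingProbe: NSObject, GKMatchDelegate {
    private var sentAt: [UInt32: Date] = [:]
    private var nextID: UInt32 = 0

    // Send a 4-byte ping ID over the unreliable channel.
    func sendPing(over match: GKMatch) throws {
        nextID += 1
        sentAt[nextID] = Date()
        let data = withUnsafeBytes(of: nextID) { Data($0) }
        try match.send(data, to: match.players, dataMode: .unreliable)
    }

    // The peer echoes the ID back; the elapsed time is the round trip.
    func match(_ match: GKMatch, didReceive data: Data, fromRemotePlayer player: GKPlayer) {
        let id = data.withUnsafeBytes { $0.load(as: UInt32.self) }
        if let start = sentAt.removeValue(forKey: id) {
            print("RTT: \(Date().timeIntervalSince(start) * 1000) ms")
        }
    }
}
```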
I ended up here following the instructions from the installer for the community extender of AGPT, hoping to provide it the DMG for the official Apple GPTK, but when I followed the link in the installer, it wasn't there. I searched for "GPTK" and "Game Porting Toolkit" to no avail. I'm running Sonoma on a 2023 M2 Pro MBP with 32 GB of RAM, plenty powerful and up to date enough for me to have a genuine reason to use this tool. What gives? Was it taken offline? If so, why?
How can we move the player within a RealityKit/RealityView scene? I am not looking for any animation or gradual movement, just instantaneous position changes.
I am unsure how to access the player (the person wearing the headset) and its transform within the context of a RealityView.
The goal is to allow the player to enter a full space in immersive mode and explore a space with various objects. They should be able to select an object and move closer to it.
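As far as I can tell there is no public API that repositions the wearer directly, so one workaround (a sketch under that assumption, with illustrative names) is to shift the scene's root entity the opposite way, which visually "teleports" the player toward the selected object:

```swift
import RealityKit

// Move the whole scene so `target` (the selected object's world position)
// ends up `distance` meters in front of the user, assuming the user is
// near the origin of the immersive space. `sceneRoot` parents all content.
func teleportPlayer(toward target: SIMD3<Float>, distance: Float, sceneRoot: Entity) {
    let desired = SIMD3<Float>(0, target.y, -distance)
    sceneRoot.position += desired - target
}
```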
USDZ is not getting the job done for AR on iOS. Are there plans to support WebXR in future versions of iOS for iPhones and iPads, so that developers might leverage all the capabilities that the GLB file format provides? By embracing WebXR, Android is providing a much better environment for building AR experiences.
As content creators, we would like to support all of our users with a common code stack and workflow. Thanks for any insights.
All of a sudden (as of when Xcode 15.2 left beta yesterday?) I can't build attachments into my RealityView:
var body: some View {
    RealityView { content, attachments in
        // stuff
    } attachments: {
        // stuff
    }
}
Produces "No exact matches in call to initializer" on the declaration line (RealityView { content, attachments in).
So far as I can tell, this is identical to the sample code provided at the WWDC session, but I've been fussing with various syntaxes for an hour now and I can't figure out what the heck it wants.
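One thing worth checking (an assumption based on the release SDK, since the betas did this differently): the shipping visionOS SDK expects the views in the attachments: builder to be wrapped in Attachment(id:), where earlier betas used tagged views. A minimal sketch matching the release signature:

```swift
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content, attachments in
            // Look the attachment up by the same id used below.
            if let panel = attachments.entity(for: "panel") {
                content.add(panel)
            }
        } attachments: {
            // Release SDK: Attachment(id:) values, not .tag()-ed views.
            Attachment(id: "panel") {
                Text("Hello")
            }
        }
    }
}
```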
I am trying to pass array data in Uniform from Swift to Metal's fragment shader. I am able to pass normal Float numbers that are not arrays with no problem. The structure is as follows
struct Uniforms {
    var test: [Float]
}
The values are as follows
let floatArray: [Float] = [0.5]
As usual, we write and pass it as follows. (As mentioned above, plain Float values can be passed this way without any problem.)
commandEncoder.setFragmentBytes(&uniforms, length: MemoryLayout<Uniforms>.stride, index: 0)
The shader side should be as follows
// uniform
struct Uniforms {
    float test[1];
};
Fragment Shader
// in fragment shader
float testColor = 1.0;
// for statement
for (int i = 0; i < 1; i++) {
    testColor *= uniforms.test[i];
}
float a = 1.0 - testColor;
return float4(1.0, 0.0, 0.0, a);
I expected the 0.5 in the array to come through, but no value is passed.
I assume I am writing something wrong, but how should I write it?
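A likely cause, stated as an assumption since the full project isn't shown: a Swift [Float] keeps its elements on the heap, so the struct contains only a pointer, and setFragmentBytes copies that pointer rather than the float values. A fixed-size tuple, by contrast, is laid out inline and matches a C-style float test[N]:

```swift
// An Array field stores only a pointer inside the struct ...
struct ArrayUniforms {
    var test: [Float]                      // 8 bytes: a pointer to heap storage
}

// ... while a fixed-size tuple is laid out inline, like `float test[4]` in MSL.
struct TupleUniforms {
    var test: (Float, Float, Float, Float) // 16 bytes of actual element data
}
```

For a small array you can also skip the struct and bind the element bytes directly, e.g. floatArray.withUnsafeBytes { encoder.setFragmentBytes($0.baseAddress!, length: $0.count, index: 0) }; either way, the bytes handed to Metal must be the element data itself.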
Is there a SceneKit equivalent of the HoverEffectComponent used in RealityKit to highlight an entity as the user looks around a scene in a VisionOS app?
Hi guys, I've been trying to make my model react to light in the visionOS Simulator by editing the component in Reality Composer Pro and also modifying it in code, but I can only get the shadow if I use it as a USDZ file, and it's not as reflective as when I view it in Reality Converter or Reality Composer Pro. Does anyone else have this problem?
RealityView { content in
    if let bigDonut = try? await ModelEntity(named: "bigdonut", in: realityKitContentBundle) {
        print("LOADED")
        // Create an anchor for horizontal placement on a table
        let anchor = AnchorEntity(.plane(.horizontal, classification: .table, minimumBounds: [0, 0]))
        // Configure scale and position
        bigDonut.setScale([1, 1, 1], relativeTo: anchor)
        bigDonut.setPosition([0, 0.2, 0], relativeTo: anchor)
        // Add the anchor
        content.add(anchor)
        // Enable shadow casting, but this does not work
        bigDonut.components.set(GroundingShadowComponent(castsShadow: true))
    }
}
The error is:
"Error Domain=MTLLibraryErrorDomain Code=3 ":1:10: fatal error: cannot open file './metal_types': Operation not permitted
#include "metal_types""
On my Mac mini (Intel chip), I run a Flutter application under the VS Code LLDB debugger and get this error; the application cannot draw its UI and shows a blank white window.
My Xcode version is the latest, 15.2.
The Flutter application runs normally on a Mac mini M1 under the VS Code LLDB debugger, and runs normally without the debugger on the Intel Mac mini.
In the Metal framework and Core Graphics framework locations, there is no file named "metal_types".
This didn't happen before; I could run normally under the VS Code LLDB debugger on both the Intel Mac mini and the M1.
If anyone knows anything, please comment.
Thank you!
I'm trying to implement multiplayer invitations from Game Center for macOS, iOS, and tvOS, following this: https://developer.apple.com/documentation/gamekit/finding_multiple_players_for_a_game?language=objc
The implementation works for tvOS and iOS: I can invite my friends, start the game, and continue sending data. But macOS still doesn't work at all. I tried using the build that works on iOS to invite a macOS player, but my Mac cannot even receive the invitation notification from Game Center (the game is already installed on the Mac).
Meanwhile, when inviting from the Mac, my invitation process is stuck until it fails.
Is there any specific setting on macOS needed to enable Game Center invitations?
Background:
This is a question asking for a good example of SpriteKit, from a very new iOS developer who is investigating how to start an iOS 2D game project.
As I explored the official Apple developer documentation, SpriteKit appears to be exactly the framework I'm looking for to develop a 2D game.
The reference documentation has clear and succinct descriptions for every API and class I encountered when starting a project in Xcode. However, I haven't been able to finish the project, as I have no general idea of what a typical game built on this framework always needs.
Question:
As an experienced Java Spring programmer, I believe I need a brief example to get started with the SpriteKit framework, one that gives me an idea of the necessary steps for a 2D game.
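To make the question concrete, this is the shape of example I'm after (a sketch with illustrative names, not a complete game): one SKScene subclass that builds its nodes in didMove(to:), reacts to input in touchesBegan, and runs per-frame logic in update(_:).

```swift
import SpriteKit

class GameScene: SKScene {
    let player = SKSpriteNode(color: .cyan, size: CGSize(width: 40, height: 40))

    override func didMove(to view: SKView) {
        // One-time scene setup: background, nodes, physics.
        backgroundColor = .black
        player.position = CGPoint(x: frame.midX, y: frame.midY)
        addChild(player)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let location = touches.first?.location(in: self) else { return }
        // Slide the player to the touched point over half a second.
        player.run(SKAction.move(to: location, duration: 0.5))
    }

    override func update(_ currentTime: TimeInterval) {
        // Per-frame game logic (collisions, spawning, scoring) goes here.
    }
}

// Presented from SwiftUI, for example:
// SpriteView(scene: GameScene(size: CGSize(width: 390, height: 844)))
```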
Hi, I have a series of child entities in a USDZ file that I would like to be able to rotate relative to a joint point between one another. I believe the functionality I am looking for was available in SceneKit with SCNPhysicsHingeJoint. How would I go about replicating this functionality in RealityKit?
Currently, any rotation I apply is relative to the origin of the model as a whole. (see below)
Thanks!
Freddie
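One approach worth trying (a sketch, not a drop-in SCNPhysicsHingeJoint replacement; names are illustrative): insert a pivot entity at the joint location and re-parent the child under it, so rotating the pivot rotates the child about the joint instead of the model origin.

```swift
import RealityKit

// Assumes `parent` sits at the world origin, so the joint's world position
// can be used directly as the pivot's local position.
func makeHinge(for child: Entity, at jointPosition: SIMD3<Float>, in parent: Entity) -> Entity {
    let pivot = Entity()
    pivot.position = jointPosition
    parent.addChild(pivot)
    // Keep the child where it is while moving it under the pivot.
    child.setParent(pivot, preservingWorldTransform: true)
    return pivot
}

// Rotating about the joint then becomes:
// hinge.orientation = simd_quatf(angle: .pi / 4, axis: [0, 0, 1])
```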
Hi guys,
I thought I'd make a visionOS test app with Apple's native robot.usdz file.
My plan was to rotate the robot's limbs programmatically, but while I can see the bones in previous Xcode versions and in Blender, somehow I cannot reach them in Xcode 15.3 or Reality Composer Pro.
Has anyone any experience with that?
Hello everyone,
I am porting my existing 2D game, written with SpriteKit, to visionOS,
and I am creating a SpriteView in a WindowGroup:
let currentScene = BattleScene.newGameScene(gameMode: "endless", dataContext: dataController.container.viewContext)
SpriteView(scene: currentScene)
    .ignoresSafeArea(.all)
    .frame(width: currentScene.frame.width, height: currentScene.frame.height, alignment: .center)
    .onReceive(NotificationCenter.default.publisher(for: GameoverNotification)) { _ in
        stopAllAudio()
    }
    .onTapGesture { location in
        let viewPosition = location
        let touchLocation = CGPoint(x: viewPosition.x, y: viewPosition.y)
        print("touch on vision window: ", touchLocation.x, touchLocation.y)
    }
    .glassBackgroundEffect()
//WindowGameView()
//    .environment(\.managedObjectContext, dataController.container.viewContext)
//    .environment(model)
//    .environment(pressedKeys)
}
.windowStyle(.automatic)
.defaultSize(width: 0.5, height: 1.0, depth: 0.0, in: .meters)
When I run it, it turns out the scene can't receive tap events,
but it works normally if I run it with my iOS target (visionOS, Designed for iPad).
Is there anything I missed?
I have implemented auto-renewable subscriptions in my app, as well as promo codes. Purchases of both monthly and annual subscriptions work correctly. What I don't know is what to "listen for" instead of the product when the user uses a promo code to purchase it. Am I looking for a different product code, or a different product identifier, when an offer code is used to subscribe?
Is it possible to animate a property on a RealityKit component? For example, OpacityComponent has an opacity property that lets you modify the opacity of the entities it's attached to. I would like to animate that property so the entity fades in and out.
I've been looking at the animation API for RealityKit, and it either assumes the animation is coming from a USDZ (which this is not), or it allows properties of entities themselves to be animated using a BindTarget. I'm not sure how either can be adapted to modify component properties.
Am I missing something?
Thanks
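I'm not aware of a documented BindTarget for arbitrary component properties either; one workaround (a sketch, not an official animation API) is to drive the value yourself each frame, e.g. from a SceneEvents.Update subscription, and write it into OpacityComponent:

```swift
// Pure interpolation step, clamped so the value never overshoots the target.
func fadeStep(current: Float, target: Float, dt: Float, speed: Float = 2) -> Float {
    let delta = target - current
    let step = min(abs(delta), speed * dt)
    return current + (delta < 0 ? -step : step)
}

// Applied per frame in RealityKit (illustrative, inside an update handler):
// entity.components.set(OpacityComponent(opacity:
//     fadeStep(current: currentOpacity, target: 0, dt: Float(event.deltaTime))))
```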
Hello All -
I'm receiving .usdz files from a client. When previewing a .usdz file in Reality Converter, the materials show up as expected. But when I load the .usdz in Reality Composer Pro, all the materials show up as grey. I've attached an image of the errors I get inside Reality Converter, to help troubleshoot.
What steps can I take to get these materials working in my Reality Composer Pro project? Thanks!
I have two pictures, one to be rendered only on the left screen and one only on the right screen. What should I do, on Vision Pro?