Let's say my type of project doesn't work well with setting the LSApplicationCategoryType property in Info.plist. How can I trigger it via code on macOS?
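As far as I know, LSApplicationCategoryType is static bundle metadata that the system reads at launch, so there may be no public API to change it at runtime. Here is a minimal sketch of at least reading the declared value from code, in case that's what you need to verify (standard Bundle API, nothing assumed beyond that):

import Foundation

// Read the category this bundle actually declares. As far as I know the key
// can only be read at runtime, not changed; it is set in Info.plist.
let category = Bundle.main.object(forInfoDictionaryKey: "LSApplicationCategoryType") as? String
print(category ?? "LSApplicationCategoryType is not set")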
When developing VR experiences for the Apple Vision Pro using Unreal Engine 5.4, can I deploy the project with deferred rendering, or do I still need to use forward rendering (as on mobile devices)?
Thanks for any information on that, appreciate it!
Hi, I'm displaying linear gray via a CAMetalLayer, using the shader below.
fragment float4 fragmentShader(VertexOut in [[stage_in]],
                               texture2d<float, access::sample> BGRATexture [[texture(0)]])
{
    float color = in.texCoordinates.x;
    return float4(float3(color), 1.0);
}
And my CAMetalLayer has been set to linearSRGB.
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.linearSRGB)
metalLayer.pixelFormat = .bgra8Unorm
Why does the display seem to add gamma? The middle gray reads as 187, not the 128 I'd expect.
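For what it's worth, 187 is almost exactly what the sRGB transfer function gives for linear 0.5, which points at an sRGB encode happening somewhere in the display path. A quick self-contained check (plain Swift; nothing assumed beyond the standard sRGB curve):

import Foundation

// Encode a linear value with the standard sRGB transfer function.
func srgbEncode(_ linear: Double) -> Double {
    linear <= 0.0031308
        ? linear * 12.92
        : 1.055 * pow(linear, 1.0 / 2.4) - 0.055
}

print(srgbEncode(0.5) * 255.0) // ≈ 187.5, matching the observed middle gray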
DS4macOS Partially Compiles and Runs, but Has 30+ Yellow Warnings and Doesn't Show the Settings
Hi Apple and Swift friends.
I'm not really a fluent Swift programmer (or fluent in any language), just knowledgeable enough to follow a vague logic of what's going on. This is a gamepad remapper similar to DS4Windows on the PC and DSX on Steam.
On macOS Sonoma, it successfully compiled and connected to the amazing Sony PlayStation DualSense, but because it has 33 yellow warnings, the Settings view doesn't show. The warnings say something is outdated and needs to be replaced by the newer Network framework. It also says it can't use self:
It should look like these:
What syntax changes would get rid of the 30+ yellow warnings?
The file can be had here:
https://github.com/marcowindt/ds4macos
God bless.
I have my Xbox controller plugged into my MacBook, but the controller won't stay on (no batteries are needed for this controller, and the cable I am using works). Also, in System Settings, when I scroll to Xbox Controllers it says no devices found. How do I fix this?
Hello.
When I try to profile a frame capture of an app running on iPad, I get the following error:
Unexpected Replayer Termination
Abort trap: 6
This is my configuration:
macOS 14.0
Xcode 15.0
iPad Pro (10.5-inch) - A10X
iPadOS 17.2
Is this expected? Is there a way to fix the issue?
Thanks
The touch-input stutter issue that has existed since iOS 16 on devices with ProMotion displays still hasn't been fixed. I filed a bug report in July, but there hasn't been any progress in months.
I see the problem in every game I've tried. My game is fast paced, so the stutters are quite obvious, and I receive a lot of complaints by email.
My game ran smoothly on ProMotion devices with iOS 15. Is there a known workaround? I see other developers having the same issue, but I can't find any solutions.
Other threads about this issue:
iPhone 14 Pro stuttering in most games when using touch controls
FPS drops when tapping the screen on iPhone 13 Pro Max
I am trying to control the orientation of a box in SceneKit (iOS) using gestures. I am using the translation in x and y to update the x and y rotation of the SCNNode.
After a long search I have realised that x and y rotation will always lead to z rotation, thanks to this excellent post: https://gamedev.stackexchange.com/questions/136174/im-rotating-an-object-on-two-axes-so-why-does-it-keep-twisting-around-the-thir?newreg=130c66c673f848a7be2873bf675573a9
So I am trying to determine what causes the z rotation and then remove it from my object by applying the inverse quaternion. However, when I rotate the object 90 degrees around x and then 90 degrees around y, it behaves VERY weirdly.
It is almost behaving as if it is in gimbal lock, but I did not think that using quaternions the way I am would cause gimbal lock like this.
I am sure there is something I am missing, or perhaps the z rotation cannot be removed this way.
Thanks!
I have added a video of the strange behaviour here: https://github.com/marcusraty/RotationExample/blob/main/Example.MP4
And the code example is here: https://github.com/marcusraty/RotationExample
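For comparison, here is a minimal sketch of an approach that avoids accumulating the z twist in the first place (assumptions: a UIPanGestureRecognizer drives the rotation, boxNode is the node in question, and the 0.01 radians-per-point scale is arbitrary). Instead of tracking x and y rotation angles, each frame's delta is applied as a quaternion about the fixed world axes:

import SceneKit
import UIKit

// A sketch, not the poster's code: rotate about the *world* x/y axes by
// pre-multiplying incremental quaternions, so no roll about z accumulates.
class RotationController {
    let boxNode: SCNNode
    init(boxNode: SCNNode) { self.boxNode = boxNode }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: gesture.view)
        let dx = Float(translation.x) * 0.01 // radians per point (arbitrary scale)
        let dy = Float(translation.y) * 0.01

        // Incremental rotations about the fixed world axes.
        let yaw   = simd_quatf(angle: dx, axis: SIMD3<Float>(0, 1, 0))
        let pitch = simd_quatf(angle: dy, axis: SIMD3<Float>(1, 0, 0))

        // Pre-multiplying applies the increments in world space; normalizing
        // keeps floating-point drift from denormalizing the quaternion.
        boxNode.simdWorldOrientation = simd_normalize(yaw * pitch * boxNode.simdWorldOrientation)

        gesture.setTranslation(.zero, in: gesture.view)
    }
}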
I'm using DrawableQueue to create textures that I apply to my ShaderGraphMaterial texture. My Metal renderer uses a range of alpha values as a test.
My objects displayed with the DrawableQueue texture are working as expected, but the alpha component is not working.
Is this an issue with my DrawableQueue descriptor? My ShaderGraphMaterial? A missing setting on my scene objects? Or some limitation of visionOS?
DrawableQueue descriptor
let descriptor = await TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .rgba8Unorm,
    width: textureResource!.width,
    height: textureResource!.height,
    usage: [.renderTarget, .shaderRead, .shaderWrite], // should match how the texture will be used
    //usage: [.renderTarget],
    mipmapsMode: .none // assuming no mipmaps are needed for the text texture
)
let queue = try await TextureResource.DrawableQueue(descriptor)
queue.allowsNextDrawableTimeout = true
await textureResource!.replace(withDrawables: queue)
Draw frame:
guard
    let drawable = try? drawableQueue!.nextDrawable(),
    let commandBuffer = commandQueue?.makeCommandBuffer()//,
    //let renderPipelineState = renderPipelineState
else {
    return
}

let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = drawable.texture
renderPassDescriptor.colorAttachments[0].loadAction = .clear
renderPassDescriptor.colorAttachments[0].storeAction = .store
renderPassDescriptor.colorAttachments[0].clearColor = clearColor
/*renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(
    red: clearColor.red,
    green: clearColor.green,
    blue: clearColor.blue,
    alpha: 0.5)*/
renderPassDescriptor.renderTargetHeight = drawable.texture.height
renderPassDescriptor.renderTargetWidth = drawable.texture.width

guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
    return
}
renderEncoder.pushDebugGroup("DrawNextFrameWithColor")
//renderEncoder.setRenderPipelineState(renderPipelineState)
// Since we are only clearing the drawable to a solid color, there is no need
// to set a pipeline state or draw primitives.
renderEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
drawable.present()
What is the most efficient way to use an MTLTexture (created procedurally at run time) as a RealityKit TextureResource? I update the MTLTexture every frame using regular Metal rendering, so it's not something I can do offline. Is there a way to wrap it without doing a copy?
A specific example would be great.
Thank you!
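The only route I know of is the DrawableQueue approach from the post above, and it is not zero-copy. A sketch, assuming queue is already attached to the TextureResource via replace(withDrawables:) and that the source texture and the drawable match in size and pixel format:

import Metal
import RealityKit

// Not a zero-copy answer: each frame the procedurally rendered MTLTexture is
// blitted into the queue's drawable. The copy stays on the GPU (no CPU round
// trip), but it is still one texture copy per frame.
func pushFrame(from sourceTexture: MTLTexture,
               to queue: TextureResource.DrawableQueue,
               using commandQueue: MTLCommandQueue) {
    guard
        let drawable = try? queue.nextDrawable(),
        let commandBuffer = commandQueue.makeCommandBuffer(),
        let blit = commandBuffer.makeBlitCommandEncoder()
    else { return }

    blit.copy(from: sourceTexture, to: drawable.texture)
    blit.endEncoding()
    commandBuffer.commit()
    drawable.present()
}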
Hello community, this post is to share my complete dissatisfaction with Apple, specifically with developing games for Apple's platforms. There is a lack of support for it, for example for some new gaming technologies, and still no profit or return on all the work and money invested in developing for it. I will close my journey with Apple very unsatisfied. I'm going to give my business's opportunities to other platforms that are really worth it and that support all the new technologies in gaming. And yes, Apple destroyed other game makers with its new services like Arcade, and there seems to be no future for gaming on Apple's platforms. I quit. Goodbye and good luck to everyone.
Hello everyone! After some time thinking about how to proceed with a graphics API, I figured OpenGL will be my first, since I'm completely new to graphics programming. As you may find in my last post, I was talking about MoltenVK and might just use Metal instead, along with the Metal demos I found. For now, and I know this has been said MANY TIMES, Apple has deprecated OpenGL, but I wish to use it because I'm new to graphics programming and want to develop an app (a rendering engine, really) for the iPhone 14 Pro Max and macOS Ventura 13.2 (I think this is the latest). So what do you think? Can I still use OpenGL ES on the 14 Pro Max, along with OpenGL 4+ on the latest macOS, even though it's deprecated?