The CGImageSourceCreateThumbnailAtIndex function isn't generating a CGImage for the majority of images on iOS 17.4. It works if I pass the kCGImageSourceThumbnailMaxPixelSize option, but fails when that key is missing. The function works both with and without kCGImageSourceThumbnailMaxPixelSize on stable OS versions. Is this a new change in the iOS 17.4 betas?
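For reference, the variant that works for me looks like this (the max pixel size value is just an example):

import Foundation
import ImageIO

func thumbnail(for url: URL) -> CGImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    // On iOS 17.4 the call only succeeds for me when an explicit max pixel size is set.
    let options: [CFString: Any] = [
        kCGImageSourceThumbnailMaxPixelSize: 1024  // arbitrary example size
    ]
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}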
Hi, I'm trying to adapt our project to run on visionOS and ran into the problem from the topic title while running this command:
xcrun --sdk xros metal --target=arm64-apple-xros1.0 input.metal -c -o output.air
The full command output looks like this:
While building module 'metal_types' imported from <built-in>:1:
In file included from <built-in>:1:
In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_types:90:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:121:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(2))) bfloat bfloat2;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:122:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(3))) bfloat bfloat3;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_extended_vector:123:49: error: bfloat is not supported on this target
typedef __attribute__((__ext_vector_type__(4))) bfloat bfloat4;
^
While building module 'metal_types' imported from <built-in>:1:
In file included from <built-in>:1:
In file included from /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_types:91:
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:121:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(2))) bfloat packed_bfloat2;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:122:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(3))) bfloat packed_bfloat3;
^
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/lib/clang/32023.98/include/metal/metal_packed_vector:123:52: error: bfloat is not supported on this target
typedef __attribute__((__packed_vector_type__(4))) bfloat packed_bfloat4;
I'm using Xcode 15.2 (15C500b) on a 16-inch MacBook Pro (M1 Pro), and xcrun --sdk xros metal --version gives me this:
Apple metal version 32023.98 (metalfe-32023.98)
Target: air64-apple-darwin23.2.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/metal/ios/bin
It seems that metal-shaderconverter can build a metallib, but I need .air files, which I then link into a single metallib and metallibdsym file. My pipeline is:
HLSL -> dxc -> DXIL -> metal-shaderconverter -> .metallib
But there's no way to link multiple metallibs into a single metallib, is there?
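For reference, my .air link step is just the metallib tool (the file names here are placeholders):

xcrun --sdk xros metallib shader1.air shader2.air -o combined.metallib

As far as I can tell there is no tool that merges already-linked metallibs, which is why I need the .air route rather than metal-shaderconverter's direct metallib output.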
Hi there, I have some existing Metal rendering / shader views that I would like to use to present stereoscopic content on the Vision Pro. Is there a Metal shader function or variable that tells me which eye we're currently rendering to inside my shader? Something like Unity's unity_StereoEyeIndex? I know RealityKit has GeometrySwitchCameraIndex, so I want something similar, but outside of a RealityKit context.
Many thanks,
Rich
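Edit: for anyone who finds this later — when rendering stereo with vertex amplification (as Compositor Services does), the [[amplification_id]] attribute appears to be the closest equivalent: it gives you the index of the view currently being amplified, which you can treat as the eye index and pass down to the fragment stage. A minimal sketch in Metal Shading Language; everything except the built-in attribute names is made up:

#include <metal_stdlib>
using namespace metal;

struct VertexOut {
    float4 position [[position]];
    uint eyeIndex;  // 0 = left, 1 = right, by convention of your view mappings
};

vertex VertexOut stereoVertex(uint vid [[vertex_id]],
                              ushort ampId [[amplification_id]]) {
    VertexOut out;
    out.eyeIndex = ampId;                        // which amplified view (eye) this invocation targets
    out.position = float4(0.0, 0.0, 0.0, 1.0);   // placeholder; apply the per-eye view/projection here
    return out;
}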
Hi,
I have a 2023 MacBook Pro with an M3 Max (64 GB), a 16-core CPU (4 efficiency, 12 performance) and a 40-core GPU.
I've got a game (Cities: Skylines 2) running successfully using Whisky. However, only 4 cores are reported to the game, which leads to a situation where many calculations queue up while my performance cores sit almost idle and the efficiency cores are well utilized.
I suspect this is because the game only sees 4 cores and batches its calculations differently depending on how many cores are available.
Is there a way to override how many cores the game sees, e.g. via an environment variable?
Thanks,
Dominik
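Edit: one thing I still want to try — Proton and some Wine builds reportedly support overriding the reported CPU topology through an environment variable. I have no idea whether Whisky's Wine build honors it, so treat this as an unverified guess (the core list and executable name are placeholders):

WINE_CPU_TOPOLOGY=12:0,1,2,3,4,5,6,7,8,9,10,11 wine CitiesSkylines2.exe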
I captured my office using a 3D scanner and got a USDZ file.
The file contains a 3D model and a physically based material.
I can view the file correctly, with its texture, in Xcode and Reality Composer Pro.
But when I use RealityView to present the model in an immersive space, the model renders completely black.
My guess is that my material doesn't have a shader graph?
Has anyone run into a similar issue? How can I solve it?
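Edit: another possibility I'm looking into — a physically based material can render black in an immersive space simply because nothing is lighting it. A sketch that attaches image-based lighting, assuming an EnvironmentResource named "Sunlight" and a model named "OfficeScan" exist in the bundle (both names are placeholders):

RealityView { content in
    if let model = try? await Entity(named: "OfficeScan"),
       let environment = try? await EnvironmentResource(named: "Sunlight") {
        // Give the PBR material a light source to respond to.
        model.components.set(ImageBasedLightComponent(source: .single(environment)))
        model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: model))
        content.add(model)
    }
}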
I'm really sorry if this is not the proper place for this. I'm developing a game with an online mode that works just fine over Wi-Fi but not over the mobile network. If someone with two Apple devices could test whether multiplayer works over the mobile network, I would appreciate it a lot: https://apps.apple.com/es/app/ufo-snowboard/id6474542185
I have built the plugins from https://github.com/apple/unityplugins and added them to my project, but when I run the game I get these errors:
EntryPointNotFoundException: AppleCore_GetRuntimeEnvironment assembly: type: member:(null)
Apple.Core.Availability.OnApplicationStart () (at Library/PackageCache/com.apple.unityplugin.core@d07a9d20344c/Runtime/Availability.cs:40)
EntryPointNotFoundException: GKLocalPlayer_GetLocal assembly: type: member:(null)
As I understand it, Metal can only be used in the fully immersive ("VR") mode. But is there any way to do custom post-processing in mixed reality mode?
Hi,
I am required to bring my CFD simulation results to the new Vision Pro, where the simulation should be visible as a soft VR/AR object in the room.
I am very new to the developer world. Could someone give me a hint as to which IDE, tools, etc. to use for this task?
SwiftUI, Swift, visionOS, Xcode, ...?
Once I know which IDE/tool/language to use, I will start learning it.
Thanks a lot!!
Hi,
I have a MacBook Pro with an Intel Iris Plus Graphics 640, which supports at most an OpenGL 4.1 context. I would therefore like to use Mesa as an alternative to get a higher context version, e.g. 4.5 or 4.6. I don't care about performance at this stage.
I have built Mesa with LLVM (llvm-config, etc.), but even after placing the built libGL.dylib in the same folder as the executable, or setting DYLD_LIBRARY_PATH or DYLD_INSERT_LIBRARIES, I am not able to get the Mesa renderer.
Does Apple support overriding libGL with a custom build?
Any help/guidance would be really appreciated.
Thanks
device.supportsRaytracing can be used to check whether a device supports ray tracing, and it seems most current devices do. However, only the M3 family and the iPhone 15 Pro support hardware ray tracing. How do I check programmatically whether a device supports hardware ray tracing? Thank you.
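Edit: the closest thing I've found is checking the GPU family, since Apple9 corresponds to the M3 / A17 Pro class that introduced ray-tracing hardware. Treat this as a heuristic rather than an official "hardware ray tracing" query:

import Metal

let device = MTLCreateSystemDefaultDevice()!
let supportsRT = device.supportsRaytracing              // any ray tracing, possibly software
let likelyHardwareRT = device.supportsFamily(.apple9)   // heuristic: Apple9 = M3 / A17 Pro class
print("ray tracing: \(supportsRT), hardware (heuristic): \(likelyHardwareRT)")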
I have two pictures: one should render only on the left screen and one only on the right screen. How do I do that on Vision Pro?
Hello All -
I'm receiving .usdz files from a client. When previewing a .usdz file in Reality Converter, the materials show up as expected. But when I load the .usdz in Reality Composer Pro, all the materials show up as grey. I've attached an image of the errors I get inside Reality Converter to help troubleshoot.
What steps can I take to get these materials working in my Reality Composer Pro project? Thanks!
The error is:
"Error Domain=MTLLibraryErrorDomain Code=3 ":1:10: fatal error: cannot open file './metal_types': Operation not permitted
#include "metal_types""
On my Mac mini (Intel), I run a Flutter application under the VS Code LLDB debugger and get this error; the application cannot draw its UI and shows a blank white window.
My Xcode version is the latest, 15.2.
The Flutter application runs normally on an M1 Mac mini under the VS Code LLDB debugger, and runs normally without the debugger on the Intel Mac mini.
In the Metal framework and Core Graphics framework locations there is no file named "metal_types".
This didn't happen before; I used to be able to run under the VS Code LLDB debugger on both the Intel and M1 Mac minis.
If anyone knows anything, please comment.
Thank you!
I am trying to pass array data in a uniform from Swift to Metal's fragment shader. I can pass plain (non-array) Float values with no problem. The structure is as follows:
struct Uniforms {
var test: [Float]
}
The value is as follows:
let floatArray: [Float] = [0.5]
As usual, I write and pass the uniforms like this (as mentioned above, plain Float values get through without any problem):
commandEncoder.setFragmentBytes(&uniforms, length: MemoryLayout<Uniforms>.stride, index: 0)
The shader side looks like this:
// uniform
struct Uniforms {
float test[1];
};
Fragment Shader
// in fragment shader
float testColor = 1.0;
// for statement
for (int i = 0; i < 1; i++) {
testColor *= uniforms.test[i];
}
float a = 1.0 - testColor;
return float4(1.0,0.0,0.0,a);
I expected the 0.5 in the array to come through, but no value is passed.
I think I am writing something wrong, but how should I write it?
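Edit: I suspect the cause is that a Swift [Float] is a reference to heap storage, so the struct holds a pointer, and setFragmentBytes copies that pointer instead of the float values. One common workaround is to pass the array's contents directly (a sketch, assuming the same buffer index 0):

// Pass the floats themselves rather than a struct containing an array reference.
floatArray.withUnsafeBytes { buffer in
    commandEncoder.setFragmentBytes(buffer.baseAddress!,
                                    length: buffer.count,
                                    index: 0)
}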
USDZ is not getting the job done for AR on iOS. Are there plans to support WebXR in future versions of iOS for iPhones and iPads, so that developers can leverage all the capabilities the GLB file format provides? By embracing WebXR, Android is providing a much better environment for building AR experiences.
As content creators, we would like to support all of our users with a common code stack and workflow. Thanks for any insights.
I ended up here following the instructions from the installer for the community extender of AGPT, hoping to provide it with the DMG for the official Apple GPTK, but when I followed the link in the installer, it wasn't there. I searched for "GPTK" and "Game Porting Toolkit" to no avail. I'm running Sonoma on a 2023 M2 Pro MBP with 32 GB RAM, plenty powerful and up to date enough for me to have a genuine reason to use this tool. What gives? Was it taken offline? If so, why?
Good afternoon. I had a developer account, and for years I developed several gaming applications for Apple platforms.
A few years ago I checked my old developer email and found a confirmation of a payment Apple owed me for the earnings from my apps. Could you check whether Apple left any payments pending on my old developer account?
I would appreciate an answer, because my old games have also been removed from the App Store.
Thank you!
Hi,
when compiling shaders, the metal command-line tool has more options than MTLDevice::newLibraryWithSource().
For instance, "man metal" mentions 10 optimization levels (-O0, -O1, -O2, -O3, -Ofast, -Os, ...), while the MTLCompileOptions documentation only shows two (Default, Size).
Is there a way to pass -O2 as the optimization level to MTLDevice::newLibraryWithSource()?
Thanks
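Edit: for completeness — the runtime path only seems to expose the two documented levels via MTLCompileOptions.optimizationLevel (the property requires a relatively recent OS), so anything finer-grained like -O2 appears to be command-line only, i.e. precompile offline and load the resulting metallib. A minimal sketch of the runtime side:

import Metal

let device = MTLCreateSystemDefaultDevice()!
let shaderSource = "/* your MSL source */"   // placeholder
let options = MTLCompileOptions()
options.optimizationLevel = .size            // only .default and .size exist here
let library = try device.makeLibrary(source: shaderSource, options: options)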