__builtin_ia32_cvtb2mask512() is the GNU C builtin for vpmovb2m k, zmm.
The Intel intrinsic for it is _mm512_movepi8_mask.
It extracts the most-significant bit from each byte, producing an integer mask.
The SSE2 and AVX2 instructions pmovmskb and vpmovmskb do the same thing for 16- or 32-byte vectors, producing the mask in a GPR instead of an AVX-512 mask register (_mm_movemask_epi8 and _mm256_movemask_epi8, respectively).
I would like ARM implementations, both NEON and SVE, that are faster than the scalar version below.
I have attached a basic scalar implementation in C. For those trying to implement this on ARM: we care about the high bit of each byte, and in a 128-bit vector each byte's high bit can easily be shifted down to the low bit with the ARM NEON intrinsic vshrq_n_u8(). Note that I would prefer not to store the bitmap to memory; it should just be the return value of the function, similar to the following function.
#define _(n) __attribute((vector_size(1<<n),aligned(1)))
typedef char V _(6); // 64 bytes, 512 bits
typedef unsigned long U;
#undef _

U generic_cvtb2mask512(V v) {
    U mask = 0;
    int i = 0;
    while (i < 64) {
        // shift mask left by 1 and OR in the MSB of byte v[i]
        mask = (mask << 1) | ((v[i] & 0x80) >> 7);
        i++;
    }
    return mask;
}
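For reference, this is the kind of NEON building block I have in mind for one 16-byte chunk, using vshrq_n_u8 plus the well-known shift-right-and-accumulate idiom. It is only a sketch, not a tuned solution: it returns the bits in pmovmskb order (byte 0 in bit 0), which is the reverse of the scalar loop above, and four of these (one per 16-byte quarter) would still have to be combined for the full 64-byte mask.

#include <arm_neon.h>
#include <stdint.h>

// Sketch: 16-byte movemask, bit i of the result = MSB of byte i.
static inline uint16_t neon_movemask_u8(uint8x16_t v)
{
    // Move each byte's MSB down to bit 0.
    uint16x8_t x = vreinterpretq_u16_u8(vshrq_n_u8(v, 7));
    // Pack pairs of bytes, then pairs of 16-bit lanes, then 32-bit lanes,
    // by shifting right and accumulating (vsra) so the bits pile up at the bottom.
    uint32x4_t y = vreinterpretq_u32_u16(vsraq_n_u16(x, x, 7));
    uint64x2_t z = vreinterpretq_u64_u32(vsraq_n_u32(y, y, 14));
    uint8x16_t p = vreinterpretq_u8_u64(vsraq_n_u64(z, z, 28));
    // Byte 0 now holds the bits for bytes 0..7, byte 8 the bits for bytes 8..15.
    return (uint16_t)(vgetq_lane_u8(p, 0) | ((uint16_t)vgetq_lane_u8(p, 8) << 8));
}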
This is also a dup of: https://stackoverflow.com/questions/79225312
Project: I have some data which could be transformed by a shader, with the result kept in the RGB channels of an image. Great.
But now, how do I mix dozens of those results? Not one by one, image after image, but all at once; something like a "complicated average" color for each pixel across all the delivered images.
Is it possible?
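For what it's worth, this is the kind of operation I mean, written as an untested Core Image sketch (the function name averaged is just illustrative): each image's channels are scaled by 1/N and the scaled images are summed, so every pixel ends up as the plain average across all delivered images.

import CoreImage

// Untested sketch: per-pixel average of same-sized CIImages.
func averaged(_ images: [CIImage]) -> CIImage? {
    guard let first = images.first else { return nil }
    let weight = CGFloat(1.0 / Double(images.count))

    // Scale R, G, B and A of one image by 1/N.
    func scaled(_ image: CIImage) -> CIImage {
        image.applyingFilter("CIColorMatrix", parameters: [
            "inputRVector": CIVector(x: weight, y: 0, z: 0, w: 0),
            "inputGVector": CIVector(x: 0, y: weight, z: 0, w: 0),
            "inputBVector": CIVector(x: 0, y: 0, z: weight, w: 0),
            "inputAVector": CIVector(x: 0, y: 0, z: 0, w: weight)
        ])
    }

    // Sum the scaled images with additive compositing.
    return images.dropFirst().reduce(scaled(first)) { acc, next in
        scaled(next).applyingFilter("CIAdditionCompositing",
                                    parameters: [kCIInputBackgroundImageKey: acc])
    }
}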
I would like to receive some guidance and discussion on the ideas implemented with RealityKit.
Hi. The earliest version of macOS that Unity supports is 10.13. However, it seems that running a game using Unity Plugins on 10.13 causes DLL loading exceptions whenever you try to access part of the GameKit API. The errors look like this:
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/GameKitWrapper
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.so
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.bundle
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
at (wrapper managed-to-native) Apple.GameKit.DefaultNSErrorHandler+Interop.DefaultNSErrorHandler_Set(Apple.Core.Runtime.NSExceptionCallback)
at Apple.GameKit.DefaultNSErrorHandler.Init () [0x00001] in ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs:35
(Filename: ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs Line: 35)
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/GameKitWrapper
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.dylib
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.so
Fallback handler could not load library /Users/love/Desktop/REDACTED/Contents/Frameworks/MonoEmbedRuntime/osx/libGameKitWrapper.bundle
DllNotFoundException: GameKitWrapper assembly:<unknown assembly> type:<unknown type> member:(null)
at (wrapper managed-to-native) Apple.GameKit.DefaultNSExceptionHandler+Interop.DefaultNSExceptionHandler_Set(Apple.Core.Runtime.NSExceptionCallback)
at Apple.GameKit.DefaultNSExceptionHandler.Init () [0x00001] in ./Library/PackageCache/com.apple.unityplugin.gamekit@e3d4ad5a2c8e/Source/DefaultHandlers.cs:14
These errors do not appear on 10.15 or later, which is why I am assuming it's a problem with this particular version of macOS. I have not been able to test 10.14, so I am not sure how it behaves there.
So, here is my question: what is the earliest version of macOS that the Apple Unity plugins support? It's not documented anywhere on the GitHub page.
// Love
There is the Android Dynamic Performance Framework
(https://developer.android.com/games/optimize/adpf), which allows you to monitor the device's thermal state and send performance hints to the OS describing the current workload.
This helps consume resources effectively while hitting target performance. As far as I can see from tracing and profiling, the hints help the OS scheduler switch tasks between cores more effectively, which improves performance stability across multiple runs.
I wonder: is there anything similar for the iOS platform?
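For the thermal-monitoring half, the closest first-party hook I am aware of on iOS is ProcessInfo's thermal state; a minimal, untested sketch is below (ThermalMonitor is just an illustrative name). What I still cannot find is an equivalent of ADPF's performance hints.

import Foundation

// Sketch: observe the device's thermal state on iOS (thermal half of ADPF only).
final class ThermalMonitor {
    private var observer: NSObjectProtocol?

    func start() {
        log(ProcessInfo.processInfo.thermalState)
        observer = NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.log(ProcessInfo.processInfo.thermalState)
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }

    private func log(_ state: ProcessInfo.ThermalState) {
        switch state {
        case .nominal:  print("thermal: nominal")
        case .fair:     print("thermal: fair")
        case .serious:  print("thermal: serious")   // start reducing workload
        case .critical: print("thermal: critical")  // reduce aggressively
        @unknown default: print("thermal: unknown")
        }
    }
}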
Hello reader,
I am facing an issue that I am not able to resolve. I have been able to create a demo project that demonstrates the issue, which I hope enables you to have a look as well and find a way to resolve it.
What is the issue:
I am using SKTileMapNode in order to draw tile maps. Instead of using the tile sets you can build from within the Xcode editor, I prefer to do it all programmatically using tile sheets (for a plethora of reasons that I will leave out of this equation).
This is the code of the gameScene:
import SpriteKit
import GameplayKit

class GameScene: SKScene {
    private let tileSize = CGSize(width: 32, height: 32)

    override func didMove(to view: SKView) {
        super.didMove(to: view)
        let tileSet = createTileSet()
        let tileMap = SKTileMapNode(tileSet: tileSet,
                                    columns: 100,
                                    rows: 100,
                                    tileSize: tileSize)
        for column in 0..<tileMap.numberOfColumns {
            for row in 0..<tileMap.numberOfRows {
                guard let tileGroup = tileSet.tileGroups.randomElement() else {
                    fatalError()
                }
                tileMap.setTileGroup(tileGroup, forColumn: column, row: row)
            }
        }
        addChild(tileMap)
    }

    private func createTileSet() -> SKTileSet {
        let tileSheetTexture = SKTexture(imageNamed: "terrain")
        var tileGroups = [SKTileGroup]()
        let relativeTileSize = CGSize(width: tileSize.width/tileSheetTexture.size().width,
                                      height: tileSize.height/tileSheetTexture.size().height)
        for idx in 0...2 {
            for jdx in 0...2 {
                let tileTexture = SKTexture(rect: .init(x: CGFloat(idx) * relativeTileSize.width,
                                                        y: CGFloat(jdx) * relativeTileSize.height,
                                                        width: relativeTileSize.width,
                                                        height: relativeTileSize.height),
                                            in: tileSheetTexture)
                let tileDefinition = SKTileDefinition(texture: tileTexture,
                                                      size: tileSize)
                let tileGroup = SKTileGroup(tileDefinition: tileDefinition)
                tileGroups.append(tileGroup)
            }
        }
        let tileSet = SKTileSet(tileGroups: tileGroups)
        return tileSet
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        presentSceneAgain()
    }

    func presentSceneAgain() {
        if let frame = view?.frame {
            view?.presentScene(GameScene(size: frame.size),
                               transition: .doorsCloseHorizontal(withDuration: 1.0))
        }
    }
}
This demo project creates a tile map node of 100 x 100 tiles. Then, it fills these 10,000 tiles with a random tile from the tile sheet named "terrain.png". This tile sheet contains many tiles, but I only take the 9 tiles (3 x 3) from the lower left corner as random tile options.
Thus, the 10,000 tiles get filled with one of these 9 tiles. So it doesn't look pretty or anything, but that isn't the purpose.
Now, to create these 9 tile textures, I use the SKTexture(rect:in:) initializer on the source texture, "terrain.png".
I think the code is quite clear in itself; so much for the explanation. When you run it, you should see the map being rendered.
When you tap the scene, the scene will present a new instance of the scene. Nothing more than that.
Now, when you do this, have a look at the RAM usage of the app. You will see it steadily increases over time, each time you tap the scene and a new scene is presented.
I looked deeper into what is happening, and what I see in the memory graph is that for every presentation of the scene, there are 3 SKTexture instances created that are never released. The first time the scene is rendered, there are 11 SKTexture instances allocated (I don't know why there are 11, though; I would expect 10: the source texture and the 9 tile textures).
But then, as mentioned, after a tap and a new presentation, I get 14 SKTexture instances, of which 3 are zombies, see image leak_1.
Moreover, Xcode reports multiple additional leaks from Jet and Metal allocations, see image leak_all.
As far as I know, the code presented is not retaining any references that it should not, and I suspect these leaks are happening somewhere inside SpriteKit. But I am not able to find exactly where, or how to resolve it.
I hope someone can help with this issue.
I have a very basic usdz file from this repo
I call loadTextures() after loading the USDZ via MDLAsset. Inspecting the MDLTexture object, I can tell it is assigning a color space of linear RGB instead of sRGB, although the image file in the USDZ is sRGB.
This causes the textures to ultimately render as oversaturated.
In the code I later convert the MDLTexture to an MTLTexture via MTKTextureLoader, but if I set the SRGB option it seems to be ignored.
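Roughly, the conversion path looks like this (a simplified sketch: the real code walks every material, and the URL and traversal here are only illustrative):

import ModelIO
import Metal
import MetalKit

let usdzURL = URL(fileURLWithPath: "model.usdz")   // placeholder path
let asset = MDLAsset(url: usdzURL)
asset.loadTextures()

let device = MTLCreateSystemDefaultDevice()!
let loader = MTKTextureLoader(device: device)

// Grab the base-color texture of the first mesh (simplified traversal).
if let mesh = asset.object(at: 0) as? MDLMesh,
   let submesh = mesh.submeshes?.firstObject as? MDLSubmesh,
   let property = submesh.material?.property(with: .baseColor),
   let mdlTexture = property.textureSamplerValue?.texture {
    // Asking for sRGB here is the option that seems to be ignored.
    let mtlTexture = try? loader.newTexture(texture: mdlTexture,
                                            options: [.SRGB: NSNumber(value: true)])
    _ = mtlTexture
}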
This significantly impacts the usefulness of Model I/O if it can't load a simple usdz texture correctly. Am I missing something?
Thanks!
I have created a simple scene in Reality Composer (Reality Composer, not Reality Composer Pro).
It contains just a cube and a text item.
I convert this to a USDZ file and load it into an ARKit Swift app.
Since iOS 18 / Xcode 16, the "text" element is not displayed at all.
The cube is displayed, anchors correctly, and can be moved, etc.
The output from usdchecker
➜ Desktop usdchecker GKTUHR1.6.3.usdz -v --arkit
Opening GKTUHR1.6.3.usdz
Checking layer <GKTUHR1.6.3.usdz>.
Checking package <GKTUHR1.6.3.usdz>
Checking prim </Root>.
Checking prim </Root/Scenes>.
Checking prim </Root/Scenes/Scene>.
Checking prim </Root/Scenes/Scene/Gravity>.
Checking prim </Root/Scenes/Scene/sceneGroundPlane>.
Checking prim </Root/Scenes/Scene/sceneGroundPlane/physicsMaterial>.
Checking prim </Root/Scenes/Scene/Children>.
Checking prim </Root/Scenes/Scene/Children/hello>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text/Material>.
Checking prim </Root/Scenes/Scene/Children/hello/Generated/Text/Material/PBRShader>.
Checking shader </Root/Scenes/Scene/Children/hello/Generated/Text/Material/PBRShader>.
Checking prim </Root/Scenes/Scene/Children/hello/Children>.
Checking prim </Root/Scenes/Scene/Children/Box>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Mesh0>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material>.
Checking prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material/PBRShader>.
Checking shader </Root/Scenes/Scene/Children/Box/Generated/Mesh0/Material/PBRShader>.
Checking prim </Root/Scenes/Scene/Children/Box/Children>.
Checking prim </Root/Scenes/Scene/Children/Box/PhysicsMaterial_Box>.
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/sceneGroundPlane>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/hello/Generated/Text>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/Box>. (fails 'MaterialBindingAPIAppliedChecker')
Found material bindings but no MaterialBindingAPI applied on the prim </Root/Scenes/Scene/Children/Box/Generated/Mesh0>. (fails 'MaterialBindingAPIAppliedChecker')
Failed!
I'm writing a Swift app that uses Metal to render textures to the main view. I currently use an NSViewRepresentable to place an MTKView into the window and an MTKViewDelegate to perform the Metal operations. It's running well and I see my Metal view being updated.
However, when I close the window (either through the user clicking the close button or programmatically using the appropriate @Environment(\.dismissWindow) private var dismissWindow) and then reopen the window, I no longer receive calls to MTKViewDelegate draw(in mtkView: MTKView). If I manually call the MTKView::draw() function, my view updates its content as expected, so it seems to still be correctly set up / alive.
As best as I can tell, the CVDisplayLink created by MTKView is no longer active (or at least that's my understanding of how the MTKView::draw() function is called).
I've set up the MTKView like this
let mtkView = MTKView()
mtkView.delegate = context.coordinator // My custom delegate
mtkView.device = context.coordinator.device // The default metal device
mtkView.preferredFramesPerSecond = 60
mtkView.enableSetNeedsDisplay = false
mtkView.isPaused = false
which I was hoping would call the draw function at 60fps while the view is visible.
I've also verified the values don't change while running.
Does anyone have any ideas on how I could restart the CVDisplayLink, or any other methods to avoid this problem?
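For context, the representable is wired up essentially like this (a trimmed-down sketch; the names are simplified and the real draw(in:) does the actual rendering):

import SwiftUI
import Metal
import MetalKit

struct MetalView: NSViewRepresentable {
    func makeCoordinator() -> Renderer { Renderer() }

    func makeNSView(context: Context) -> MTKView {
        let mtkView = MTKView()
        mtkView.delegate = context.coordinator
        mtkView.device = context.coordinator.device
        mtkView.preferredFramesPerSecond = 60
        mtkView.enableSetNeedsDisplay = false
        mtkView.isPaused = false
        return mtkView
    }

    func updateNSView(_ nsView: MTKView, context: Context) {}
}

final class Renderer: NSObject, MTKViewDelegate {
    let device = MTLCreateSystemDefaultDevice()!   // default Metal device

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        // Encode and present a frame here.
    }
}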
Cheers
Jack
I am working on adding synchronized physical properties to EntityEquipment in TabletopKit, allowing seamless coordination during GroupActivities sessions between players.
Setting EntityEquipment's state to DieState is not an option, because it doesn't support custom collision shapes.
I have also tried adding a PhysicsBodyComponent and CollisionComponent to EntityEquipment's Entity. However, the main issue is that the position of the EntityEquipment itself does not synchronize with the Entity's physics body, resulting in two separate instances of one object.
struct PlayerPawn: EntityEquipment {
    let id: ID
    let entity: Entity
    var initialState: BaseEquipmentState

    init(id: ID, entity: Entity) {
        self.id = id
        let massProperties = PhysicsMassProperties(mass: 1.0)
        let material = PhysicsMaterialResource.generate(friction: 0.5, restitution: 0.5)
        let shape = ShapeResource.generateBox(size: [0.4, 0.2, 0.2])
        let physicsBody = PhysicsBodyComponent(massProperties: massProperties, material: material, mode: .dynamic)
        let collisionComponent = CollisionComponent(shapes: [shape])
        entity.components.set(physicsBody)
        entity.components.set(collisionComponent)
        self.entity = entity
        initialState = .init(parentID: .tableID, pose: .init(position: .init(), rotation: .zero), entity: self.entity)
    }
}
I’d appreciate any guidance on the recommended approach to adding synchronized physical properties to EntityEquipment.
When running the sample code below, every 3 seconds the middle sprite is replaced by a new one. When this happens, a flicker is noticeable most of the time. When recording the screen and stepping through the recording frame by frame, I noticed that the flicker is caused by a temporary reordering of the nodes. Below are two screenshots of two consecutive frames where the reordering is clearly visible.
This only happens for a SpriteKit scene used as an overlay for a SceneKit scene. Commenting out
buttons.zPosition = 1
or avoiding the fade in/out animations solves the issue.
I have created FB15945016.
import SceneKit
import SpriteKit
class GameViewController: NSViewController {
    let overlay = SKScene()
    var buttons: SKNode!
    var previousButton: SKSpriteNode!
    var nextButton: SKSpriteNode!
    var pageContainer: SKNode!
    var pageViews = [SKNode]()
    var page = 0

    override func viewDidLoad() {
        super.viewDidLoad()
        let scene = SCNScene(named: "art.scnassets/ship.scn")!
        let scnView = self.view as! SCNView
        scnView.scene = scene
        overlay.anchorPoint = CGPoint(x: 0.5, y: 0.5)
        scnView.overlaySKScene = overlay
        buttons = SKNode()
        buttons.zPosition = 1
        overlay.addChild(buttons)
        previousButton = SKSpriteNode(systemImage: "arrow.uturn.backward.circle")
        previousButton.position = CGPoint(x: -100, y: 0)
        buttons.addChild(previousButton)
        nextButton = SKSpriteNode(systemImage: "arrow.uturn.forward.circle")
        nextButton.position = CGPoint(x: 100, y: 0)
        buttons.addChild(nextButton)
        pageContainer = SKNode()
        pageViews = [SKSpriteNode(systemImage: "square.and.arrow.up"), SKSpriteNode(systemImage: "eraser")]
        overlay.addChild(pageContainer)
        setPage(0)
        Timer.scheduledTimer(withTimeInterval: 3, repeats: true) { [self] _ in
            setPage((page + 1) % 2)
        }
    }

    func setPage(_ page: Int) {
        pageViews[self.page].run(.sequence([
            .fadeOut(withDuration: 0.2),
            .removeFromParent()
        ]), withKey: "fade")
        self.page = page
        let pageView = pageViews[page]
        pageView.alpha = 0
        pageView.run(.fadeIn(withDuration: 0.2), withKey: "fade")
        pageContainer.addChild(pageView)
    }

    override func viewDidLayout() {
        overlay.size = view.frame.size
    }
}

extension SKSpriteNode {
    public convenience init(systemImage: String) {
        self.init()
        let width = 100.0
        let image = NSImage(systemSymbolName: systemImage, accessibilityDescription: nil)!.withSymbolConfiguration(.init(hierarchicalColor: NSColor.black))!
        let scale = NSScreen.main!.backingScaleFactor
        image.size = CGSize(width: width * scale, height: width / image.size.width * image.size.height * scale)
        texture = SKTexture(image: image)
        size = CGSize(width: width, height: width / image.size.width * image.size.height)
    }
}
Hi, I’m creating a game and I’m just wondering if I can integrate GCVirtualController in my SwiftUI app.
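For concreteness, this is the kind of integration I have in mind: an untested sketch that creates and connects a GCVirtualController from a SwiftUI view's lifecycle, assuming iOS 15 or later (GameView and the chosen elements are just illustrative).

import SwiftUI
import GameController

struct GameView: View {
    // Keep a strong reference so the on-screen controller stays alive.
    @State private var virtualController: GCVirtualController?

    var body: some View {
        Color.black   // placeholder for the actual game content
            .onAppear {
                let config = GCVirtualController.Configuration()
                config.elements = [GCInputLeftThumbstick, GCInputButtonA, GCInputButtonB]
                let controller = GCVirtualController(configuration: config)
                controller.connect()
                virtualController = controller
            }
            .onDisappear {
                virtualController?.disconnect()
            }
    }
}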
I am trying to port SceneKit projects to Swift 6, and I just can't figure out how that's possible. I'm even starting to think SceneKit and Swift 6 concurrency just don't go together, and that SceneKit projects should, hopefully only for the time being, stick to Swift 5.
The SCNSceneRendererDelegate methods are called in the SceneKit Thread.
If the delegate is a ViewController:
class GameViewController: UIViewController {
    let aNode = SCNNode()

    func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
        aNode.position.x = 10
    }
}
Then the compiler generates the error "Main actor-isolated instance method 'renderer(_:updateAtTime:)' cannot be used to satisfy nonisolated protocol requirement"
Which is fully understandable.
The compiler even tells you those methods can't be used for protocol conformance, unless:
the conformance is declared as @preconcurrency SCNSceneRendererDelegate, like this:
class GameViewController: UIViewController, @preconcurrency SCNSceneRendererDelegate {
But that just delays the check to runtime, and therefore a crash happens at runtime on the SceneKit thread...
Again, fully understandable.
or the delegate method is declared nonisolated like this:
nonisolated func renderer(_ renderer: any SCNSceneRenderer, updateAtTime time: TimeInterval) {
    aNode.position.x = 10
}
Which generates the compiler error: "Main actor-isolated property 'position' can not be mutated from a nonisolated context".
Again fully understandable.
If the delegate is not a ViewController but a nonisolated class, we also have the problem that SCNNode can't be used.
Nearly 100% of the SCNSceneRendererDelegate implementations I've seen use SCNNode or similar MainActor-bound types, because that is what they are meant for.
So, where am I wrong? What is the solution for using SceneKit's SCNSceneRendererDelegate methods under full Swift 6 compilation? Is that even possible for now?
I have a Mac Studio 2023 M2 Max
Running Sonoma 14.6.1
Developing in Xcode 16.1
It seems that the NSScreen frame settings may be incorrect. The frames received from NSScreen.screens don't seem to match up with the display arrangement in System Settings.
Apologies in advance for this long post!
for screen in NSScreen.screens {
    let name = screen.localizedName
    Globals.logger.debug("Globals initializeScreens - screen \(i) '\(name, privacy: .public)'")
    Globals.logger.debug("Globals initializeScreens - '\(screen.debugDescription, privacy: .public)'")
}
This is what I receive in the log:
Globals initializeScreens - '<NSScreen: 0x600000ef4240;
name="PHL 346E2C";
backingScaleFactor=1.000000;
frame={{0, 0}, {3440, 1440}};
visibleFrame={{0, 0}, {3440, 1415}}>'
Globals initializeScreens - screen 2 'Blackmagic (1)'
Globals initializeScreens - '<NSScreen: 0x600000ef42a0;
name="Blackmagic (1)";
backingScaleFactor=1.000000;
frame={{-3840, 0}, {1920, 1080}};
visibleFrame={{-3840, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 3 'Blackmagic (4)'
Globals initializeScreens - '<NSScreen: 0x600000ef4360;
name="Blackmagic (4)";
backingScaleFactor=1.000000;
frame={{-1920, 0}, {1920, 1080}};
visibleFrame={{-1920, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 4 'Blackmagic (2)'
Globals initializeScreens - '<NSScreen: 0x600000ef43c0;
name="Blackmagic (2)";
backingScaleFactor=1.000000;
frame={{5360, 0}, {1920, 1080}};
visibleFrame={{5360, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 5 'Blackmagic (3)'
Globals initializeScreens - '<NSScreen: 0x600000ef4420;
name="Blackmagic (3)";
backingScaleFactor=1.000000;
frame={{3440, 0}, {1920, 1080}};
visibleFrame={{3440, 0}, {1920, 1055}}>'
It looks like the frame settings for Blackmagic (2) and Blackmagic (4) are switched.
The setup has five monitors. Four are using the USB-C Digital AV Multiport Adapters. The output for these are streamed into a rack of A/V equipment using BlackMagic Design mini converters and monitors.
My Swift application allows users to open four movies, one for each of the AV Adapters. The movies can then be played back in sync for later processing by the A/V equipment.
Here are some screen captures that show my display settings.
Blackmagic (1) and Blackmagic (2) are to the left of the main screen.
Blackmagic (3) and Blackmagic(4) are to the right of the main screen.
The desktop is hard to see but is correct.
The wallpaper settings are all correct.
The wallpaper is correctly ordered when displayed on the monitors.
After opening the movies and using the NSScreen frame settings, the displays are incorrectly ordered. Test B and Test D are switched, which is what I would expect given the NSScreen frame values.
Any ideas? I've tried re-arranging the desktops, rebooting, etc. but no luck.
The code that changes the screen location is similar to this post on Stack Overflow
public func setDisplay(screen: NSScreen) {
    Globals.logger.log("MovieWindowController - setDisplay = \(screen.localizedName, privacy: .public)")
    Globals.logger.debug("MovieWindowController - setDisplay - '\(screen.debugDescription, privacy: .public)'")
    let dx = CGFloat(Constants.midX)
    let dy = CGFloat(Constants.midY)
    var pos = NSPoint()
    pos.x = screen.visibleFrame.midX - dx
    pos.y = screen.visibleFrame.midY - dy
    Globals.logger.debug("MovieWindowController - setDisplay - x = '\(pos.x, privacy: .public)', y = '\(pos.y, privacy: .public)'")
    window?.setFrameOrigin(pos)
}
The log shows just what I would expect given the incorrect frame values.
MovieWindowController - setDisplay = Blackmagic (1)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018e8420; name="Blackmagic (1)"; backingScaleFactor=1.000000; frame={{-3840, 0}, {1920, 1080}}; visibleFrame={{-3840, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-3840.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (2)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018a10e0; name="Blackmagic (2)"; backingScaleFactor=1.000000; frame={{5360, 0}, {1920, 1080}}; visibleFrame={{5360, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '5360.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (3)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018cc8a0; name="Blackmagic (3)"; backingScaleFactor=1.000000; frame={{3440, 0}, {1920, 1080}}; visibleFrame={{3440, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '3440.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (4)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018c9ce0; name="Blackmagic (4)"; backingScaleFactor=1.000000; frame={{-1920, 0}, {1920, 1080}}; visibleFrame={{-1920, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-1920.000000', y = '-12.500000'
Am I correct? I think this is driving me crazy!
Thanks in advance!
Edit: The mouse behavior is correct in moving across the displays!
Hi, I'm trying to use metal-cpp, but I have compile errors:
ISO C++ requires the name after '::' to be found in the same scope as the name before '::'
metal-cpp/Foundation/NSSharedPtr.hpp(162):
template <class _Class>
_NS_INLINE NS::SharedPtr<_Class>::~SharedPtr()
{
    if (m_pObject)
    {
        m_pObject->release();
    }
}
Use of old-style cast
metal-cpp/Foundation/NSObject.hpp(149):
template <class _Dst>
_NS_INLINE _Dst NS::Object::bridgingCast(const void* pObj)
{
#ifdef __OBJC__
    return (__bridge _Dst)pObj;
#else
    return (_Dst)pObj;
#endif // __OBJC__
}
The Xcode project was generated using CMake:
target_compile_features(${MODULE_NAME} PRIVATE cxx_std_20)
target_compile_options(${MODULE_NAME}
    PRIVATE
        "-Wgnu-anonymous-struct"
        "-Wold-style-cast"
        "-Wdtor-name"
        "-Wpedantic"
        "-Wno-gnu"
)
Maybe I need to set some CMake flags for the C++ compiler?
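A possible direction (an untested sketch; METAL_CPP_DIR is only a placeholder for wherever the metal-cpp headers live) would be to add metal-cpp as a SYSTEM include directory, so the strict warning flags above apply to this project's own sources but not to the third-party headers:

# Untested: treat the metal-cpp headers as system headers so -Wold-style-cast,
# -Wdtor-name, etc. are not diagnosed inside them.
target_include_directories(${MODULE_NAME} SYSTEM PRIVATE ${METAL_CPP_DIR})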
So recently I have been trying to install game-porting-toolkit, and when I try to run
brew -v install apple/apple/game-porting-toolkit
I'm getting an error:
Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
apple/apple
I searched some threads on this forum, but I couldn't get any useful advice from them. I would really appreciate any help.
Hi everyone,
I’m working on a project using RealityKit and encountering an issue with object occlusion. Specifically, I need to disable the occlusion of real-world objects (e.g., tables, walls) in my RealityView. I want virtual entities to render fully, even if real-world objects would normally block their view.
I’ve explored options in ARSession and ARWorldTrackingConfiguration but haven’t found anything that affects occlusion in RealityView. I suspect there might be a setting or approach I’ve missed.
Has anyone dealt with a similar scenario or knows how to achieve this? Any insights or pointers would be greatly appreciated!
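In case it helps frame the question, this is the kind of switch I am looking for, written against the iOS ARView API (an untested sketch; I cannot find the equivalent for RealityView):

import RealityKit
import ARKit

// Untested sketch: stop real-world geometry from occluding virtual entities (ARView).
func disableRealWorldOcclusion(on arView: ARView) {
    // Scene-understanding mesh occlusion (LiDAR devices).
    arView.environment.sceneUnderstanding.options.remove(.occlusion)

    // Person segmentation occlusion, if it was enabled on the session.
    if let config = arView.session.configuration as? ARWorldTrackingConfiguration {
        config.frameSemantics.remove(.personSegmentationWithDepth)
        arView.session.run(config)
    }
}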
Thanks in advance,
Nicolas
Hi, I wanted to test whether it's possible to use the Mesa3D Dozen driver (Vulkan on D3D12) + D3DMetal 2b3 to maybe get a better Vulkan driver on Wine than the default MoltenVK; this would support Vulkan Windows apps by going through D3DMetal's D3D12.
I'm using vulkan_dzn.dll, dzn_icd.x86_64.json, and dxil.dll from the x64 folder of: https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z
Using the simple vulkaninfo app and running it like:
wine64 vulkaninfo
I get this error:
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
It also seems the D3DMetal Wine integration in Whisky doesn't expose d3d12core.dll and d3d12.dll like the new Agility D3D12 DLLs or VKD3D do, so I'm getting:
MESA: error: Failed to retrieve D3D12GetInterface MESA: error: Failed to load DXCore
but it anyway seems to try to load the driver:
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
full log:
MESA: error: Failed to retrieve D3D12GetInterface
MESA: error: Failed to load DXCore
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
00bc:fixme:dcomp:DCompositionCreateDevice 0000000000000000, {c37ea93a-e7aa-450d-b16f-9746cb0407f3}, 000000000011E328.
MESA: error: Failed to load DXCore
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
00bc:fixme:dcomp:DCompositionCreateDevice 0000000000000000, {c37ea93a-e7aa-450d-b16f-9746cb0407f3}, 000000000011E578.
ERROR: [Loader Message] Code 0 : setup_loader_term_phys_devs: Call to 'vkEnumeratePhysicalDevices' in ICD c:\windows\system32\.\vulkan_dzn.dll failed with error code -3
ERROR: [Loader Message] Code 0 : setup_loader_term_phys_devs: Failed to detect any valid GPUs in the current config
ERROR at C:\j\msdk0\build\Khronos-Tools\repo\vulkaninfo\vulkaninfo.h:241:vkEnumeratePhysicalDevices failed with ERROR_INITIALIZATION_FAILED
I have multiple CAMetalLayers that I render content to, and I noticed that the graphics overview HUD does not function properly when I have more than one CAMetalLayer. The values reported are very strange. For example, FPS may report 999 or some large negative value. Is the HUD simply not designed to work with multiple CAMetalLayers or MTKViews? When I disable all but one of my CAMetalLayers, the HUD works as expected.
Into an SKScene, I add an SCNSphere as a child of an SKShapeNode, as depicted below.
When the sphere hits another node (the fence in the example), the sphere is deformed as if it were elastic.
I didn't find any information about elastic properties.
Does someone know a way to avoid the deformation?
import SwiftUI
import SpriteKit
import SceneKit

@main struct MyApp: App
{
    var body: some Scene
    {
        WindowGroup {SpriteView(scene: GameSceneSK(size: UIScreen.main.bounds.size))}
    }
}

class GameSceneSK: SKScene
{
    override func sceneDidLoad() {
        var fencePoints = [
            CGPoint(x: 300, y: 0), CGPoint(x: 300, y: 400), CGPoint(x: 0, y: 400)
        ]
        let fence = SKShapeNode(points: &fencePoints,
                                count: fencePoints.count)
        fence.physicsBody = SKPhysicsBody(edgeChainFrom: fence.path!)
        addChild(fence)
        let sphereGeometry = SCNSphere(radius: 20)
        let sphereNode = SCNNode(geometry: sphereGeometry)
        let sphereScnScene = SCNScene()
        sphereScnScene.rootNode.addChildNode(sphereNode)
        let ball3D = SK3DNode(viewportSize: CGSize(width: 40,
                                                   height: 40))
        ball3D.scnScene = sphereScnScene
        let ball = SKShapeNode(circleOfRadius: 20)
        ball.physicsBody = SKPhysicsBody(circleOfRadius: 20)
        ball.addChild(ball3D)
        physicsWorld.gravity = CGVector(dx: 0.2, dy: 0.2)
        addChild(ball)
    }
}