Search results for “show when run”
115,100 results found

Post · Replies · Boosts · Views · Activity

SCNTechnique clearColor Always Shows sceneBackground When Passes Share Depth Buffer
Problem Description I'm encountering an issue with SCNTechnique where the clearColor setting is being ignored when multiple passes share the same depth buffer. The clear color always appears as the scene background, regardless of what value I set. The minimal project for reproducing the issue: https://www.dropbox.com/scl/fi/30mx06xunh75wgl3t4sbd/SCNTechniqueCustomSymbols.zip?rlkey=yuehjtk7xh2pmdbetv2r8t2lx&st=b9uobpkp&dl=0 Problem Details In my SCNTechnique configuration, I have two passes that need to share the same depth buffer for proper occlusion handling: passes: [ box1_pass: [ draw: DRAW_SCENE, includeCategoryMask: 1, colorStates: [ clear: true, clearColor: 0 0 0 0 // Expecting transparent black ], depthStates: [ clear: true, enableWrite: true ], outputs: [ depth: box1_depth, color: box1_color ], ], box2_pass: [ draw: DRAW_SCENE, includeCategoryMask: 2, colorStates: [ clear: true, clearColor: 0 0 0 0 // Also expecting transparent black ], depthStates: [ clear: false, enableWrite: false ], output
Replies: 1 · Boosts: 0 · Views: 702 · Activity: 1w
RealityView camera feed not shown
I have two RealityViews: ParentView and ChildView. When I click the button in ParentView, ChildView is shown as a full screen cover, but the camera feed in ChildView is not shown, only a black screen. If I show ChildView directly, the camera feed works. Please help me with this issue. Thanks. import RealityKit import SwiftUI struct ParentView: View { @State private var showIt = false var body: some View { ZStack { RealityView { content in content.camera = .virtual let box = ModelEntity(mesh: MeshResource.generateSphere(radius: 0.2), materials: [createSimpleMaterial(color: .red)]) content.add(box) } Button("Click here") { showIt = true } } .fullScreenCover(isPresented: $showIt) { ChildView() .overlay( Button("Close") { showIt = false }.padding(20), alignment: .bottomLeading ) } .ignoresSafeArea(.all) } } import ARKit import RealityKit import SwiftUI struct ChildView: View { var body: some View { RealityView { content in content.camera = .spatialTracking } } }
Replies: 5 · Boosts: 0 · Views: 2k · Activity: 1w
Reply to Can a compute pipeline be as efficient as a render pipeline for rasterization?
The render pipeline has dedicated fixed-function hardware for rasterization — vertex assembly, primitive rasterization, depth/stencil testing, and blending all happen in hardware that's purpose-built for this work. A compute pipeline can't match this for rasterization because you'd be reimplementing all of that in software inside your compute kernels. Fragment shaders in a render pipeline do operate on individual pixel data — that's their purpose. The rasterizer determines which pixels a triangle covers, then the fragment shader runs for each of those pixels, giving you full control over the output color. These resources are a good starting point for understanding both pipelines: Using Metal to Draw a View's Contents — basic MetalKit setup and rendering Performing Calculations on a GPU — shows the compute pipeline Metal Sample Code — the full collection of Metal samples, starting with the fundamentals section Working through the render pipeline samples first will give you a practical understanding of how ras
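A minimal Metal Shading Language sketch of the contrast (hypothetical function names, not from the thread): the fragment shader only decides a covered pixel's color, while a compute kernel must bounds-check and address pixels itself, work the rasterizer otherwise does for free.

```metal
#include <metal_stdlib>
using namespace metal;

// Fragment shader: the rasterizer decides which pixels this runs for;
// the shader only has to produce a color.
fragment float4 solidColor(float4 pos [[position]]) {
    return float4(1.0, 0.0, 0.0, 1.0);
}

// Compute kernel: no rasterizer, so the kernel must map threads to pixels
// and guard against out-of-bounds writes itself.
kernel void writePixel(texture2d<float, access::write> out [[texture(0)]],
                       uint2 gid [[thread_position_in_grid]]) {
    if (gid.x >= out.get_width() || gid.y >= out.get_height()) return;
    out.write(float4(1.0, 0.0, 0.0, 1.0), gid);
}
```

Coverage testing, depth rejection, and blending would all be additional software work in the kernel, which is exactly the overhead the reply describes.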
Topic: Graphics & Games · SubTopic: Metal · Activity: 1w
GPTK 3 and D3DMetal issue with Modern Pipeline Creation
Death Stranding 2: On the Beach (v1.0.48.0, Steam) crashes during rendering initialization when running through CrossOver 26 with D3DMetal 3.0 on an Apple M2 Max Mac Studio running macOS Sequoia. The game successfully initializes Streamline, NVAPI, DLSS (Result::eOk), DLSSG (Result::eOk), Reflex, and XeSS — all subsystems report success. The crash occurs immediately after, during rendering pipeline creation, before the game reaches NXStorage initialization or window creation. Minidump analysis confirms the crash is an access violation (0xc0000005) at DS2.exe+0x67233d, writing to address 0x0. RAX=0x0 (null pointer being dereferenced), R12=0xFFFFFFFFFFFFFFFF (error/invalid handle return). The game appears to call a D3D12 API — likely CheckFeatureSupport or a pipeline state creation function — that D3DMetal acknowledges as supported but returns null or invalid data for. The game trusts the response and dereferences the null pointer. Two other Nixxes titles using the same engine and D3DMetal set
Replies: 1 · Boosts: 0 · Views: 710 · Activity: 1w
Reply to RealityKit fill the background environment
Great progress — the screenshots show a big improvement from where you started. To your first question: 468K triangles across 3 chunks is reasonable for Apple Silicon. That shouldn't be a problem on its own. The frame rate drop you're seeing when taking a screenshot is a separate issue — capturing a screenshot forces a synchronous GPU readback, which stalls the rendering pipeline while the GPU finishes its current work and copies the framebuffer to CPU-accessible memory. That stall is expected and isn't related to your triangle count. You should see the frame rate recover immediately after the capture completes. To your second question: there's no built-in tool that converts Reality Composer Pro models directly to LowLevelMesh. The typical workflow is to author your detailed models (trees, bushes, rocks) in a 3D modeling tool like Blender, export them as USDZ, then load them at runtime with ModelEntity and extract the vertex and index data from the MeshResource to repack into your batched LowLevelMes
Topic: Graphics & Games · SubTopic: RealityKit · Activity: 1w
Xcode26 Replay frame broken
Got a broken frame when using Xcode to capture a frame from a Unity game and replay it. The vertex buffer seems to be corrupted; I see a bunch of NaNs in it. However, the game displays correctly when running, and the problem only started after I upgraded my Xcode and iPhone to Xcode 26 and iOS 26.
Replies: 1 · Boosts: 0 · Views: 272 · Activity: 1w
SpriteKit FPS drops significantly on iOS 26.3.1 when touching screen, even in minimal scene
Summary: On a real device running iOS 26.3.1, FPS drops significantly in a very simple SpriteKit scene when touching or moving a finger on the screen. This happens even with a minimal setup where update(_:), touchesBegan, and touchesMoved are not overridden. Because the issue reproduces in a minimal scene, I suspect a performance regression in SpriteKit and/or iOS rather than in app-specific logic. Environment: OS: iOS 26.3.1 Device: iPhone 12 Xcode: 26.3 Build Configuration: Release Framework: SpriteKit FPS measurement: SKView.showsFPS = true Steps to Reproduce: Present a minimal SpriteKit scene. Enable SKView.showsFPS = true. Add only a very small number of nodes to the scene. Do not override update(_:), touchesBegan, or touchesMoved. Repeatedly tap the screen or move a finger around. Actual Result: The scene stays around 60 FPS when idle. FPS drops significantly when touching or moving a finger on
Replies: 1 · Boosts: 0 · Views: 210 · Activity: 1w
Reply to receivedTurnEventForMatch giving stale data
What you're seeing is a propagation timing issue — the push notification that triggers receivedTurnEventForMatch can arrive before the updated match data is fully available on the server. That's why both the GKTurnBasedMatch delivered with the callback and a subsequent loadMatch(withID:) return stale data. A fixed delay is fragile because the propagation time varies with server load and network conditions. A more robust approach is to retry loadMatch(withID:) with a comparison check — for example, if the match still shows the previous participant as the current player, or the matchData hasn't changed from your last known state, wait briefly and retry. Something like an exponential backoff starting at 1 second, up to a reasonable cap, gives the server time to propagate without relying on a magic number. If you haven't already, please file a feedback report with Feedback Assistant and include your sample project. The behavior you're describing — where loadMatch(withID:) returns stale data even after th
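The retry-with-backoff approach can be sketched as follows. This is a minimal Swift concurrency sketch, not the reply author's code; the staleness check against lastKnownData is an assumed stand-in for whatever comparison fits your match state, and the 16-second cap is an arbitrary choice.

```swift
import GameKit

// Hedged sketch: reload the match until its data no longer matches the last
// known (stale) snapshot, backing off exponentially between attempts.
func loadFreshMatch(id: String, lastKnownData: Data?) async throws -> GKTurnBasedMatch {
    var delay: UInt64 = 1_000_000_000        // start at 1 s (in nanoseconds)
    let maxDelay: UInt64 = 16_000_000_000    // arbitrary cap: 16 s

    while true {
        let match = try await GKTurnBasedMatch.load(withID: id)
        if match.matchData != lastKnownData {
            return match                     // server has propagated the turn
        }
        if delay > maxDelay {
            return match                     // give up; return what we have
        }
        try await Task.sleep(nanoseconds: delay)
        delay *= 2
    }
}
```

Returning the possibly-stale match after the cap keeps the caller from hanging forever; you may prefer to throw instead, depending on how your turn handling recovers.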
Activity: 1w
Trying to load image & identifier from photo library with PhotosPicker
I'm updating an older Mac app written in Objective-C and OpenGL to be a multiplatform app in SwiftUI and Metal. The app loads images and creates kaleidoscope animations from them. It is a document-based application, and saves info about the kaleidoscope into the document. On macOS, it creates a security-scoped bookmark to remember the user's chosen image. On iOS, I use a PhotosPicker to have the user choose an image from their photo library to use. I would like to get the itemIdentifier from the image they choose and save that into my document so I can use it to fetch the image when the user reloads the kaleidoscope document in the future. However, the call to loadTransferable is returning nil for the itemIdentifier. Here is my iOS/iPadOS code: #if os(macOS) // Mac code #else PhotosPicker("Choose image", selection: $selectedItem, matching: .images) .onChange(of: selectedItem) { Task { if let newValue = selectedItem { scopeState.isHEIC = newValue.supportedContentTypes.contains(UTType.heic) let data = try? await
Replies: 1 · Boosts: 0 · Views: 386 · Activity: 1w
no policy, cannot allow apps outside /Applications;domain=OSSystemExtensionErrorDomain code=4
Issue Summary We are activating a Network Extension system extension (filter-data) from a signed and notarized macOS app. Activation consistently fails with the following error: Error Message: OSSystemExtensionErrorDomain code=4 Extension not found in App bundle. Unable to find any matched extension with identifier: com.seaskylight.yksmacos.ExamNetFilter.data At the same time, sysextd logs show: no policy, cannot allow apps outside /Applications However, our host app and executable paths are already under /Applications, and the extension bundle physically exists in the expected app bundle location. Environment Information macOS: Darwin 25.4.0 Host App: /Applications/xxx.app Host Bundle ID: com.seaskylight.yksmacos System Extension Bundle ID: com.seaskylight.yksmacos.ExamNetFilter.data Team ID: BVU65MZFLK Device Management: Enrolled via DEP: No; MDM Enrollment: No Reproduction Steps Install the host app to /Applications. Launch the host
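For reference, the standard activation call looks like this. A hedged sketch only: delegate handling is elided, and the extension identifier is the one from the post.

```swift
import SystemExtensions

// Submit an activation request for the filter-data extension.
// A real app must also set request.delegate to an object conforming to
// OSSystemExtensionRequestDelegate to receive the result (or the code=4 error).
let request = OSSystemExtensionRequest.activationRequest(
    forExtensionWithIdentifier: "com.seaskylight.yksmacos.ExamNetFilter.data",
    queue: .main
)
OSSystemExtensionManager.shared.submitRequest(request)
```

If this exact request produces the "Extension not found in App bundle" error, the identifier above is worth diffing character-by-character against the extension's CFBundleIdentifier.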
Replies: 1 · Boosts: 0 · Views: 125 · Activity: 1w
Reply to SystemLanguageModel.Adapter leaks ~100MB of irrecoverable APFS disk space per call
To clarify, this isn't a memory leak; it's an issue with not de-duplicating or evicting copied model weights from SIP-protected disk space. I'm also not sure the bug is CLI-specific. I definitely observed it with my CLI tool, Junco, but I have another SwiftUI application where I'm observing the issue as well. Simply running the app from Xcode copies the weights and metadata to the location above, and I've only been able to delete the accumulated ~104GB (645 model clones) through Recovery Mode.
Activity: 1w
SystemLanguageModel.Adapter leaks ~100MB of irrecoverable APFS disk space per call
FoundationModels framework, macOS Tahoe 26.4.1, MacBook Air M4. Loading a LoRA adapter via SystemLanguageModel.Adapter(fileURL:) leaks ~100MB of APFS disk space per invocation. The space is permanently consumed at the APFS block level with no corresponding file. Calls without an adapter show zero space loss. Running ~300 adapter calls in a benchmark loop leaked ~30GB and nearly filled a 500GB drive. The total unrecoverable phantom space is now ~239GB (461GB allocated on Data volume, 222GB visible to du). Reproduction: Build a CLI tool that loads a .fmadapter and runs one generation Measure before/after with df and du: Before: df free = 9.1 GB, du -smx /System/Volumes/Data = 227,519 MB After: df free = 9.0 GB, du -smx /System/Volumes/Data = 227,529 MB df delta: ~100 MB consumed du delta: +10 MB (background system activity) Phantom: ~90 MB -- no corresponding file anywhere on disk Without --adapter (same code, same model): zero space change du was run with sudo and the -x flag. Files modi
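The before/after measurement can be scripted. A minimal sketch, where free_kib is a helper name chosen here and the workload invocation is a placeholder; on macOS you would point it at /System/Volumes/Data as in the post.

```shell
# Report available KiB on the filesystem containing the given path.
free_kib() { df -Pk "$1" | awk 'NR==2 {print $4}'; }

before=$(free_kib /)
# ... run the adapter workload here (placeholder for the reproduction CLI) ...
after=$(free_kib /)
echo "delta KiB: $((before - after))"
```

Comparing the df delta against du, as the post does, is what separates phantom block-level consumption from ordinary file growth.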
Replies: 7 · Boosts: 0 · Views: 512 · Activity: 1w
D3DMetal Extreme Over Synchronization Issues
Explanation Currently, D3DMetal’s GPU synchronization approach introduces significant compute overhead on the CPU. This specifically affects D3D12 games that use modern rendering pipelines on Apple Silicon. I’ve tested Death Stranding 2: On the Beach to see how it handles its rendering, and the results are extreme: frame times suffer roughly a 42% penalty from synchronization alone. There are obviously other effects at play, such as the overhead introduced by Rosetta and Wine, but neither introduces as much overhead as D3DMetal. This issue isn’t specific to Death Stranding 2: On the Beach; most games running through D3DMetal suffer from it. Most games still seem to force synchronization to ~30 ms to reach the 30 fps target. It could be better with smarter synchronization, such as how DXMT handles it: instead of doubling the work, it lets Metal track resource dependencies internally. This is in part due to the unfortunately poor mapping of D3D12 calls
Replies: 1 · Boosts: 0 · Views: 318 · Activity: 1w
Apple managed asset pack for FoundationModels adapter on Testflight does not download (statusUpdates silent)
Hi, I'm stuck distributing a custom FoundationModels adapter as an Apple-hosted managed asset pack via TestFlight. Everything looks correctly configured end to end, but the download just never starts and the statusUpdates sequence is silent. Here's my configuration: App Info.plist: BAHasManagedAssetPacks, BAUsesAppleHosting, BAAppGroupID = group.com.fiuto.shared. Entitlement com.apple.developer.foundation-model-adapter is on both the app and the asset downloader extension. The asset downloader extension uses StoreDownloaderExtension, returning SystemLanguageModel.Adapter.isCompatible(assetPack) from shouldDownload, and the app group on the app and asset download extension is the same. I have exported the adapter with toolkit 26.0.0, obtaining: adapterIdentifier = fmadapter-FiutoAdapter-1234567 I have packaged the asset pack using xcrun ba-package and uploaded it to App Store Connect via Transporter, and I get the ready for internal and external testing state on App Store Connect, and I have uploaded my app build on TestF
Replies: 2 · Boosts: 0 · Views: 295 · Activity: 1w
Reply to HID Device Access / Mode Switch
Is there any way at all to do this on macOS? I'm not sure exactly which APIs you're interacting with and how that translates to the specific failure you're seeing, so I'm going to outline what should work. Give it a try and then we can dig into the specifics of any issue you run into. Getting into the details: You should be using the IOUSBHost Framework, which is the modern framework for interacting with USB devices from user space. I'll assume you already know how to find your target in the IORegistry, but please let me know if you need guidance on that. You should be locating and connecting to the IOUSBHostDevice, NOT the IOUSBHostInterface (https://developer.apple.com/documentation/iousbhost/iousbhostinterface?language=objc). That gives you full control over the accessory and removes all other drivers, which is what you want for a firmware updater. When you create your host object, you'll need to pass in the IOUSBHostObjectInitOptionsDeviceCapture option. You can also try IOUSBHostObjectInitOptionsDeviceSeize, but I th
Topic: App & System Services · SubTopic: Drivers · Activity: 1w