My goal is to print a debug message from a Metal shader. I'm following the guide, which instructs me to set the -fmetal-enable-logging Metal compiler flag and the following environment variables:

```
MTL_LOG_LEVEL=MTLLogLevelDebug
MTL_LOG_BUFFER_SIZE=2048
MTL_LOG_TO_STDERR=1
```

However, there's an issue with the guide: it only covers Xcode project setup, whereas I'm working on a Swift package. It has a Metal-only target that the main target depends on, like this:

```swift
targets: [
    // A separate target for shaders.
    .target(
        name: "MetalShaders",
        resources: [
            .process("Metal")
        ],
        plugins: [
            // https://github.com/schwa/MetalCompilerPlugin
            .plugin(name: "MetalCompilerPlugin", package: "MetalCompilerPlugin")
        ]
    ),
    // Main target
    .target(
        name: "MegApp",
        dependencies: ["MetalShaders"]
    ),
    .testTarget(
        name: "MegAppTests",
        dependencies: [
            "MegApp",
            "MetalShaders",
        ]
    ),
]
```

So to apply the compiler flag I use MetalCompilerPlugin, which emits debug.metallib; it also allows defining a DEBUG macro for shaders. This code compiles:

```metal
#ifdef DEBUG
logger.log_error("Hello There!");
#endif
```

os_log_default.
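For anyone finding this later: since injecting environment variables into SwiftPM test runs is awkward, there is also a programmatic alternative to MTL_LOG_TO_STDERR. A minimal sketch, assuming the MTLLogState API that shipped alongside shader logging (macOS 15 / iOS 18); treat the exact setup as an assumption, not confirmed for this package layout:

```swift
import Metal

// Minimal sketch: capture shader os_log output in a Swift closure via
// MTLLogState instead of relying on the MTL_LOG_TO_STDERR environment
// variable (macOS 15 / iOS 18 and later).
func makeLoggingCommandQueue(device: MTLDevice) throws -> MTLCommandQueue? {
    let logDescriptor = MTLLogStateDescriptor()
    logDescriptor.level = .debug     // mirrors MTL_LOG_LEVEL=MTLLogLevelDebug
    logDescriptor.bufferSize = 2048  // mirrors MTL_LOG_BUFFER_SIZE=2048

    let logState = try device.makeLogState(descriptor: logDescriptor)
    logState.addLogHandler { subsystem, category, _, message in
        // Messages emitted by os_log calls in the shaders arrive here.
        print("[\(subsystem ?? "-")][\(category ?? "-")] \(message)")
    }

    // Attach the log state to the queue that runs the shaders.
    let queueDescriptor = MTLCommandQueueDescriptor()
    queueDescriptor.logState = logState
    return device.makeCommandQueue(descriptor: queueDescriptor)
}
```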
How do I change the launch screen image (set up with a storyboard) so the picture displays rotated when the iPad is in portrait orientation?

Current launch screen:
- Image, portrait orientation
- Image, landscape orientation
- Info setting

Expected launch screen as below (not working):
- Expected launch screen

I have uploaded the entire sample source here.
Hello guys, I need a little help. I'm building an alarm clock app, a pretty good one, and I have my own sounds I want to use as the alarm ring. But notifications on Apple platforms can't work when the phone is turned off or the device is in silent mode (or at least that's how I understand it), unless the app has the feature called critical alerts, which lets you deliver notifications even when the phone is off or silenced. Without this, the phone can do just one beep, and only when you open the notification does it start ringing. How is that supposed to wake you up? Alarmy has this worked out fine and I can't figure out how; maybe someone here knows. I'm thinking maybe they have critical alerts enabled, but then I don't know why Apple would approve theirs and not mine. I tried to submit for the critical alerts feature, but Apple didn't approve it, saying the app is not the use case, and I'm kind of lost. The whole app could be ruined because of this. So my question is: is there any way I can use my custom sounds as a notification sound?
Topic: App & System Services
SubTopic: Notifications
I’m running into a problem with SwiftUI/AppKit event handling on macOS Tahoe 26.2. I have a layered view setup:
- Bottom: AppKit NSView (NSViewRepresentable)
- Middle: SwiftUI view in an NSHostingView with drag/tap gestures
- Top: Another SwiftUI view in an NSHostingView

On macOS 26.2, the middle NSHostingView no longer receives mouse or drag events when the top NSHostingView is present. Events pass through to the AppKit view below. Removing the top layer immediately restores interaction. Everything works correctly on macOS Sequoia.

I’ve posted a full reproducible example and detailed explanation on Stack Overflow, including a single-file demo: https://stackoverflow.com/q/79862332
I also found a related older discussion here, but couldn’t get the suggested workaround to apply: https://developer.apple.com/forums/thread/759081
Any guidance would be appreciated. Thanks!
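A stripped-down sketch of the layering, illustrative only (the actual single-file demo is in the Stack Overflow post):

```swift
import AppKit
import SwiftUI

// Illustrative reduction of the layered setup described above.
final class LayeredView: NSView {
    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)

        // Middle layer: SwiftUI view with gestures, hosted in AppKit.
        let middle = NSHostingView(rootView:
            Rectangle()
                .fill(.blue.opacity(0.3))
                .onTapGesture { print("middle tapped") }
        )
        middle.frame = bounds
        middle.autoresizingMask = [.width, .height]
        addSubview(middle)

        // Top layer: another hosted SwiftUI view. On macOS 26.2 its mere
        // presence stops the middle layer from receiving events.
        let top = NSHostingView(rootView: Color.clear)
        top.frame = bounds
        top.autoresizingMask = [.width, .height]
        addSubview(top)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```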
Thanks for the replies. You are both quite right that I should have provided more information. When I say that notarization succeeds, I mean that I submit the dmg file produced by the build to the Apple notarization service and receive a status of 'Accepted'. I take this to mean all is well. When I say that notarization fails, I mean that the notarization step produces a status of 'Invalid'. Retrieving the notarization log indicates that the binaries were not signed. I've just gone through this again with my two machines. The build here is performed by scripts that are maintained in source code control and forced to be identical in both setups. The build infrastructure is also the same for both. Before beginning, both machines were powered off for a period of time. Power up one machine. Ensure the source tree is up to date. Run the build to produce a signed dmg. Submit it for notarization. The submission produces a status of Accepted. Power down the first machine. Power up the second machine. Again ensure the source tree is up to date.
Topic: Code Signing
SubTopic: Notarization
Running a print operation on WKWebView, I hit EXC_BREAKPOINT, and there is all kinds of console spew that looks concerning:

```
ERROR: The NSPrintOperation view's frame was not initialized properly before knowsPageRange: returned. (WKPrintingView)
CGContextClipToRect: invalid context 0x0. If you want to see the backtrace, please set CG_CONTEXT_SHOW_BACKTRACE environmental variable.
WebContent[7743] networkd_settings_read_from_file Sandbox is preventing this process from reading networkd settings file at /Library/Preferences/com.apple.networkd.plist, please add an exception.
CRASHSTRING: XPC_ERROR_CONNECTION_INVALID from launchservicesd
CRASHSTRING: rdar://problem/28724618 Process unable to create connection because the sandbox denied the right to lookup com.apple.coreservices.launchservicesd and so this process cannot talk to launchservicesd.
WebContent[7921] The sandbox in this process does not allow access to RunningBoard.
```

Safe to ignore all this?
Dimension can be subclassed to create custom units, but you probably don't want to subclass Dimension for reciprocal units, since there's no good way to convert between measurements of different Dimension types. Instead, you can extend the existing classes to add the new reciprocal units, and then conversion will work as usual.

> subclass UnitConverter to something like UnitConverterInverse

Yep, you'll need that. Interestingly, there's already a reciprocal unit buried in the existing measurement API. It uses a reciprocal converter that is exactly what you need, but unfortunately it isn't part of the public API. Check this out:

```
Welcome to Apple Swift version 6.2.3 (swiftlang-6.2.3.3.21 clang-1700.6.3.2). Type :help for assistance.
  1> import Foundation
  2> print(UnitFuelEfficiency.litersPer100Kilometers.converter)
<_NSStatic_NSStaticUnitConverterLinear_NoConst: 0x20761bfc8> coefficient = 1.000000, constant = 0.000000
  3> print(UnitFuelEfficiency.milesPerGallon.converter)
reciprocalValue = 235.215000
```
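To make the extension approach concrete, here's a minimal sketch. UnitConverterInverse is the made-up name from the question, and kilometersPerLiter is a hypothetical added unit (the base unit of UnitFuelEfficiency is liters per 100 kilometers, and 1 km/L corresponds to 100 L/100km):

```swift
import Foundation

// A reciprocal converter: base = coefficient / value, and back again.
// This is a public reimplementation of what the private converter behind
// UnitFuelEfficiency.milesPerGallon appears to do.
final class UnitConverterInverse: UnitConverter {
    let coefficient: Double
    init(coefficient: Double) { self.coefficient = coefficient }

    override func baseUnitValue(fromValue value: Double) -> Double {
        coefficient / value
    }
    override func value(fromBaseUnitValue baseUnitValue: Double) -> Double {
        coefficient / baseUnitValue
    }
}

// Extend the existing dimension rather than subclassing Dimension, so
// conversion against the built-in units keeps working.
extension UnitFuelEfficiency {
    // Hypothetical added unit: kilometers per liter (1 km/L == 100 L/100km).
    static let kilometersPerLiter = UnitFuelEfficiency(
        symbol: "km/L",
        converter: UnitConverterInverse(coefficient: 100)
    )
}

// Conversion goes through the base unit as usual.
let economy = Measurement(value: 5, unit: UnitFuelEfficiency.litersPer100Kilometers)
print(economy.converted(to: .kilometersPerLiter)) // 20.0 km/L
```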
Topic: App & System Services
SubTopic: General
On a MacBook Pro M3 14" I can profile the Metal app's performance by running it, then clicking on the M icon and choosing profile after replay. On a Mac Studio M2 Ultra I cannot: the profiler starts and crashes. I have tried everything, including reinstalling the OS, Xcode, and the Metal SDK, you name it. The app uses the Metal 4 API. The content of the replayer errorinfo report is shown at the end. Any ideas what is going on here and/or what else I can do to root-cause and fix this? FWIW, it was worse on 26.1 (Xcode just reported Metal 4 profiling not available). In 26.2, Xcode attempts to profile and invariably crashes.

```
=== Error summary: ===
1x DYErrorDomain (512) - guest app crashed (512)
1x com.apple.gputools.MTLReplayer (100) - Abort trap: 6

=== First Error ===
Domain: DYErrorDomain
Error code: 512
Description: guest app crashed (512)
GTErrorKeyPID: 26913
GTErrorKeyProcessName: GPUToolsReplayService
GTErrorKeyCrashDate: 2026-01-09 19:22:52 +0000

=== Underlying Error #1 ===
Domain: com.apple.gputools.MTLR
```
Topic: Graphics & Games
SubTopic: Metal
Hello, is there a 2-device limit for CoreBluetooth on visionOS 2.1? My app connects to 4 BLE peripherals on iOS but fails at the 3rd device on Vision Pro. The 3rd call to centralManager.connect() succeeds and the peripheral enters the .connecting state, but didConnect never fires and it stays in .connecting forever. No errors are reported. The first 2 devices work perfectly. The same code on iOS connects all 4. Has anyone else had this problem? Is there any documentation I can refer to that states a limit like this? Environment: visionOS 2.1, CoreBluetooth, Apple Vision Pro. My BLE peripherals are running on nRF52840.
Issue: When an Entity with a ViewAttachmentComponent is:
- disabled using isEnabled = false, or
- removed using removeFromParent()

and then enabled or added back again, the attached SwiftUI view is rendered correctly, but tap interactions stop working. Specifically:
- Button actions inside the attached view do not fire
- TapGesture closures on child views do not respond

Expected behavior: Tap interactions inside the attached view should continue to work after the Entity is re-enabled or re-added.

Actual behavior: After being disabled or removed once, all tap interactions stop responding.

Comparison: When displaying the same SwiftUI view using RealityViewAttachments, this issue does not occur. Removing and re-displaying the attachment still allows taps to work correctly.

Reproduction: The attached sample code reproduces the issue:
- A RealityView with an Entity that has a ViewAttachmentComponent
- The attached SwiftUI view contains a Toggle
- The toggle updates isEnabled on the Entity
- After toggling off and on, tap interactions stop responding
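A condensed sketch of the repro, assuming the visionOS 26 ViewAttachmentComponent initializer that takes a SwiftUI rootView (names are illustrative; the full sample is attached):

```swift
import SwiftUI
import RealityKit

// Condensed repro sketch: an entity whose attachment stops receiving taps
// after the entity has been disabled and re-enabled once.
struct AttachmentRepro: View {
    @State private var attachmentEntity = Entity()
    @State private var isShown = true

    var body: some View {
        VStack {
            RealityView { content in
                attachmentEntity.components.set(
                    ViewAttachmentComponent(rootView:
                        // Fires before toggling; stops firing after re-enable.
                        Button("Tap me") { print("tapped") }
                    )
                )
                content.add(attachmentEntity)
            }
            Toggle("Show attachment", isOn: $isShown)
                .onChange(of: isShown) { _, newValue in
                    attachmentEntity.isEnabled = newValue
                }
        }
    }
}
```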
Thanks for reporting. While I investigate, you can achieve the pre-iOS 26 result by wrapping the picker in a Menu with a label that shows the selected choice; see the sketch below.

Travis Trotto
DTS Engineer
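A minimal sketch of that workaround (illustrative names; assumes a simple string-based selection):

```swift
import SwiftUI

// Minimal sketch: wrap the Picker in a Menu whose label shows the current
// selection, approximating the pre-iOS 26 appearance.
struct WrappedPicker: View {
    @State private var selection = "Medium"
    private let choices = ["Small", "Medium", "Large"]

    var body: some View {
        Menu {
            Picker("Size", selection: $selection) {
                ForEach(choices, id: \.self) { Text($0) }
            }
        } label: {
            // The label reflects the currently selected choice.
            Label(selection, systemImage: "chevron.up.chevron.down")
        }
    }
}
```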
Topic: UI Frameworks
SubTopic: SwiftUI
Description: I’m noticing that the code coverage metrics in Xcode 26 are not accurate compared to earlier versions. In Xcode 15, the same set of unit tests shows around 38% coverage, but in Xcode 26, even though all the tests run successfully (for example, the SegmentedUI test cases), the code coverage is displayed as 0%. Has anyone else observed this behavior in Xcode 26? Is there any known issue, workaround, or configuration change required to get a correct coverage report?

Environment:
- Xcode 26
- iOS 18 SDK
- Unit tests running under XCTest

Any insights or suggestions would be appreciated.
We have a Matter 1.2 certified device, with device type On/Off Light Switch (0x0103), that we are launching soon. The last tests cover updatability via DCL-published updates: the update shows up immediately via Google Home and Home Assistant, but does not show up at all in Apple Home, even after waiting more than a week. We successfully tested with the TestNet DCL profile.
> I try to update remoteHandle using CXCallUpdate for outgoing calls, but this works only on iOS 15, not on 17 or 18 (I didn't test 16).

What's actually not working here? Are you looking at what the lock screen UI shows, or what you see in other locations (particularly the Recents call list)? There is a longstanding issue (r.126348631) where call updates aren't being displayed on the lock screen interface; however, this is purely about the UI showing old data, not about the update itself completely failing. Notably:

> I use this handle value to implement recall by tapping on a call in the Recents tab of the system address book. But since my calls can transform from p2p to group calls, I need to update the handle value or find some other way to pass call identification info.

...I'd expect this sort of thing to work exactly the way you need/expect, regardless of what the lock screen UI shows (see the sketch below for the update path in question).

Kevin Elliott
DTS Engineer, CoreOS/Hardware
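For reference, the handle-update path under discussion is just this (a minimal sketch; provider and callUUID are assumed to come from the app's existing CallKit setup):

```swift
import CallKit

// Minimal sketch: update the remote handle of an existing call.
// `provider` is the app's CXProvider and `callUUID` is the UUID that was
// used when the call was originally reported.
func updateRemoteHandle(provider: CXProvider, callUUID: UUID, newValue: String) {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: newValue)
    provider.reportCall(with: callUUID, updated: update)
}
```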
Topic: App & System Services
SubTopic: General
I have something like this drawing in an MTKView (see at bottom). I am finding it difficult to figure out when the Swift-land resources used in making the MTLBuffer(s) can be released. Below, for example, is it OK if args goes out of scope (or is otherwise deallocated) at point 1, 2, or 3? Or perhaps even earlier, as soon as argsBuffer has been created? I have been reading through various articles such as:
- Setting resource storage modes
- Choosing a resource storage mode for Apple GPUs
- Copying data to a private resource

but it's a lot to absorb, and I haven't really been able to find an authoritative description of the required lifetime of the resources in CPU land. I should mention that this is Metal 4 code. In previous versions of Metal, the MTLCommandBuffer had the ability to add a completion handler to be called after the GPU has finished running the commands in the buffer, but in Metal 4 there is no such thing (if it were even needed for the purpose I am interested in). Any advice and/or pointers would be appreciated.
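For context, the pre-Metal-4 completion-handler pattern referred to above looks like this (a minimal sketch with illustrative names matching the question, such as args and argsBuffer):

```swift
import Metal

// Pre-Metal-4 pattern: the completion handler keeps the CPU-side data alive
// until the GPU has finished executing the command buffer.
func dispatchWork(queue: MTLCommandQueue, argsBuffer: MTLBuffer, args: [Float]) {
    guard let commandBuffer = queue.makeCommandBuffer() else { return }

    // ... encode passes that read argsBuffer here ...

    commandBuffer.addCompletedHandler { _ in
        // Everything captured by this closure (including `args`) is retained
        // until the GPU finishes, so it can safely be released afterwards.
        _ = args
    }
    commandBuffer.commit()
}
```

An MTLSharedEvent signaled at the end of the work, combined with a notification via MTLSharedEvent.notify(_:atValue:block:), looks like the closest Metal 4 equivalent, but confirming that is part of the question.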
Topic: Graphics & Games
SubTopic: Metal