Hello,
I'm a macOS app developer, and I'm facing an issue with custom notification sounds in my app. After upgrading the app to include new custom notification sounds, the changes are not reflected until the system is restarted; the sounds do not update immediately after the app upgrade.
Is there a way to refresh or reload the custom notification sounds without needing a full system restart? Any guidance or best practices for handling this would be greatly appreciated.
Thank you!
Issue Description
We have an Azure pipeline that runs iOS UI tests. These tests can be triggered by a PR, by CI, or manually.
The UI tests run without issues when triggered by a PR, but they encounter the following error when triggered by CI or manually. All configurations and macOS images are identical. Could someone help us understand the issue and suggest a solution?
2024-07-12 21:13:19.217 xcodebuild[7872:72894] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfda2f0), Clone 1 of iPhone 14, unknown class, 17.2 (21C62), CDC2D287-AC4B-49AB-9824-61CEFBCEAEC5> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:13:19.547 xcodebuild[7872:73563] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfe8590), Clone 2 of iPhone 14, unknown class, 17.2 (21C62), D1BE199B-C7B1-4A2D-8FCC-1D70DAC1F461> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:16:41.608 xcodebuild[7872:72894] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfda2f0), Clone 1 of iPhone 14, unknown class, 17.2 (21C62), CDC2D287-AC4B-49AB-9824-61CEFBCEAEC5> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:16:41.654 xcodebuild[7872:73563] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfe8590), Clone 2 of iPhone 14, unknown class, 17.2 (21C62), D1BE199B-C7B1-4A2D-8FCC-1D70DAC1F461> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:20:00.875 xcodebuild[7872:41113] [MT] IDETestOperationsObserverDebug: 766.921 elapsed -- Testing started completed.
2024-07-12 21:20:00.875 xcodebuild[7872:41113] [MT] IDETestOperationsObserverDebug: 0.000 sec, +0.000 sec -- start
2024-07-12 21:20:00.875 xcodebuild[7872:41113] [MT] IDETestOperationsObserverDebug: 766.921 sec, +766.921 sec -- end
2024-07-12 21:20:06.478 xcodebuild[7872:77327] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfe8590), Clone 2 of iPhone 14, unknown class, 17.2 (21C62), D1BE199B-C7B1-4A2D-8FCC-1D70DAC1F461> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
2024-07-12 21:20:06.575 xcodebuild[7872:79397] iOSSimulator: 📱<DVTiPhoneSimulator (0x7fa0bcfda2f0), Clone 1 of iPhone 14, unknown class, 17.2 (21C62), CDC2D287-AC4B-49AB-9824-61CEFBCEAEC5> unable to connect to "com.apple.instruments.deviceservice.lockdown" - timed out after 120 seconds
I'm trying to capture all trackpad events at the OS level and disable a few of them, say the ones in the left half of the trackpad. Following this question, I could listen to events in the current window's view with the following code.
final class AppKitTouchesView: NSView {
    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        // We're interested in `.indirect` touches only.
        allowedTouchTypes = [.indirect]
        // We'd like to receive resting touches as well.
        wantsRestingTouches = true
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    private func handleTouches(with event: NSEvent) {
        // 1. Change `in` parameter to listen events at OS level
        // 2. Disable all events with `touch.normalizedPosition.x < 0.5`
        let touches = event.touches(matching: .touching, in: self)
    }

    override func touchesBegan(with event: NSEvent) {
        handleTouches(with: event)
    }

    override func touchesEnded(with event: NSEvent) {
        handleTouches(with: event)
    }

    override func touchesMoved(with event: NSEvent) {
        handleTouches(with: event)
    }

    override func touchesCancelled(with event: NSEvent) {
        handleTouches(with: event)
    }
}
I'd like to accomplish two further things:
1. Change the in parameter to listen to events at the OS level (see the sketch below).
2. Disable all touch events on some condition, say touch.normalizedPosition.x < 0.5.
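NSTouch data isn't available outside your own views, so a system-wide approach has to work at the CGEvent level instead. Below is a minimal sketch, assuming a Core Graphics event tap (which requires Accessibility permission and sees mouse and gesture events rather than raw trackpad touches); the left-half test therefore uses the cursor position on the main display, not normalizedPosition:

import CoreGraphics

// Hedged sketch: a system-wide event tap that swallows left-half events.
// Requires the app to be granted Accessibility permission.
let mask: CGEventMask = (1 << CGEventType.leftMouseDown.rawValue) |
                        (1 << CGEventType.leftMouseDragged.rawValue)

guard let tap = CGEvent.tapCreate(
    tap: .cghidEventTap,          // tap events at the HID level, before delivery
    place: .headInsertEventTap,
    options: .defaultTap,         // an active tap, allowed to filter events
    eventsOfInterest: mask,
    callback: { _, _, event, _ in
        // Drop events whose cursor lies in the left half of the main display.
        if event.location.x < CGDisplayBounds(CGMainDisplayID()).midX {
            return nil            // returning nil swallows the event
        }
        return Unmanaged.passUnretained(event)
    },
    userInfo: nil
) else {
    fatalError("Could not create event tap; check Accessibility permission")
}

let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
CGEvent.tapEnable(tap: tap, enable: true)
CFRunLoopRun()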
We noticed that AVPlayerViewController does not always show the "Multi-channel" label in the player's audio settings when playing a video asset with surround sound as an audio track. (see image)
We only serve a multichannel audio track in the HLS master manifest, like this:
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio_0",CHANNELS="6",NAME="Surround",LANGUAGE....
Different tvOS versions give us different outcomes on whether or not the "Multi-channel" label is shown:
DOES NOT SHOW (the label Multi-channel will not show)
Model A1842 (tvOS v 17.5.1)
Model A1625 (tvOS v 16.6)
DOES SHOW (see image)
Model A1625 (tvOS v 15.6)
This gives us the impression that whether the label is shown depends on the tvOS version. Any reason why? This is an ideal way for the user to see that the audio track has surround sound.
I have the same issue. It was reported to Apple in November 2023. Anyone else see this as a huge issue?
In the WWDC24 video https://developer.apple.com/videos/play/wwdc2024/10131/, Jennifer mentioned that the app she's talking about would be available via a link below the video, but no such link is provided.
Where can the mentioned app's code be found?
Hello!
I'm building a Countdown Timer for the Dynamic Island using a Live Activity.
I have two issues for which I can't find any solution:
1. We want to display the time in the "X minutes" format, like in this example. I went through the forum, but all the answers were wrong, because they were using a Text(_:format:), which never updates in the Live Activity, or a Text(timerInterval:), which we can't format (see the sketch below).
2. I want the Live Activity to end once the timer gets to zero. I found the staleDate parameter that I thought would help, but it actually only adds loading indicators to my design once the date is reached. I tried to implement a solution like the answer in this post, but the if (context.isStale) {...} part is never rendered. It also looks like the stale state only gets activated when the app is in focus again.
I tried several fixes and went through a lot of forums and posts, but I can't find any solution.
Thanks!
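For context, a minimal sketch of the self-updating timer text mentioned in issue 1, assuming a hypothetical TimerAttributes type whose ContentState carries the countdown's end date. Text(timerInterval:) keeps ticking inside a Live Activity without pushes, though, as noted above, it offers little control over the format:

import ActivityKit
import SwiftUI
import WidgetKit

// Hypothetical attributes type for illustration; the real one lives in the app.
struct TimerAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var endDate: Date   // when the countdown reaches zero
    }
}

struct TimerActivityView: View {
    let context: ActivityViewContext<TimerAttributes>

    var body: some View {
        // Text(timerInterval:) keeps updating on its own in a Live Activity,
        // unlike Text(_:format:), which is rendered once and then frozen.
        // Assumes endDate is still in the future at render time.
        Text(timerInterval: Date.now...context.state.endDate, countsDown: true)
            .monospacedDigit()
    }
}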
Hey, I am natively an ML engineer, so I'm not really well versed in Swift.
I am currently trying to use a model of mine in a Swift app. The model detects knives and has an input size of 640x640.
I have set up AVCapture, and that works well; I can see the camera on my screen. I have also set up the Core ML model, and it works and detects the objects.
What doesn't work is the way I draw boxes around the objects. For example, the raw model output for the object's box is x=342.25, y=293.0, w=105.375, h=150.5 when I'm in the middle of the screen, which seems OK to me; I want to draw a box with these parameters. But the CGRect object is really weird to me: firstly, the x value behaves like the box's y value (I notice this because when I move the object up and down, the x value changes but not the y value). Then, if I just swap the values, the drawn box moves correctly in the y direction but mirrored in the x direction.
What needs to be considered or changed here? Is it something about the layer setup? See my implementation here:
func setupLayers() {
    previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
    rootLayer = previewView.layer
    previewLayer.frame = rootLayer.bounds
    rootLayer.addSublayer(previewLayer)

    inferenceTimeBounds = CGRect(x: rootLayer.frame.midX-75, y: rootLayer.frame.maxY-70, width: 150, height: 17)
    inferenceTimeLayer = createRectLayer(inferenceTimeBounds, [1,1,1,1])
    inferenceTimeLayer.cornerRadius = 7
    rootLayer.addSublayer(inferenceTimeLayer)

    detectionLayer = CALayer()
    detectionLayer.frame = rootLayer.bounds
    detectionLayer.position = CGPoint(x: rootLayer.bounds.midX, y: rootLayer.bounds.midY)
    rootLayer.addSublayer(detectionLayer)
}
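The swapped axes are consistent with the camera's native landscape sensor orientation leaking through: the model sees the buffer in one orientation while the preview layer shows portrait. A hedged sketch of one way to map a box from 640x640 model space into the preview layer, assuming a center-based (x, y, w, h) output, YOLO-style; the helper name is illustrative, and letting layerRectConverted(fromMetadataOutputRect:) account for videoGravity, rotation, and mirroring is usually safer than hand-flipping axes:

import AVFoundation
import UIKit

// Hypothetical helper: map a raw center-based model box into layer space.
// Assumes the model was fed the frame in the orientation the capture output
// delivers it; a portrait UI over a landscape-native sensor is a common
// source of swapped axes.
func boxInLayer(cx: CGFloat, cy: CGFloat, w: CGFloat, h: CGFloat,
                modelSide: CGFloat = 640,
                previewLayer: AVCaptureVideoPreviewLayer) -> CGRect {
    // 1. Normalize the center-based box from model pixels to 0...1,
    //    converting it to an origin-based rect along the way.
    let normalized = CGRect(x: (cx - w / 2) / modelSide,
                            y: (cy - h / 2) / modelSide,
                            width: w / modelSide,
                            height: h / modelSide)

    // 2. Let the preview layer handle videoGravity, rotation, and mirroring.
    return previewLayer.layerRectConverted(fromMetadataOutputRect: normalized)
}

// Usage inside the detection callback (values from the post):
// let rect = boxInLayer(cx: 342.25, cy: 293.0, w: 105.375, h: 150.5,
//                       previewLayer: previewLayer)
// boxLayer.frame = rect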
I have a Live Activity that works fine when the Lock Screen is showing, but as soon as it dims down for the Always-On display, everything in the widget disappears and a non-animating Activity Indicator (spinner) displays in its place.
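For context, a minimal sketch of the environment value Live Activities can read to supply their own dimmed variant for the Always-On display, rather than leaving the dimmed rendering entirely to the system (names here are illustrative):

import SwiftUI

struct TimerLockScreenView: View {
    // True while the Always-On display is dimmed.
    @Environment(\.isLuminanceReduced) private var isLuminanceReduced

    var body: some View {
        Text("12:34")
            .opacity(isLuminanceReduced ? 0.5 : 1.0)   // dimmed variant
    }
}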
In the WWDC24 session video 'Enhance your UI animations and transitions', Apple shows these new animation methods for UIKit:
switch gesture.state {
case .changed:
    UIView.animate(.interactiveSpring) {
        bead.center = gesture.translation
    }
case .ended:
    UIView.animate(spring) {
        bead.center = endOfBracelet
    }
}
As of iOS 18 beta 2, I get an error for `UIView.animate(.interactiveSpring)`.
Are these new methods not available yet?
After installing the beta update, the app icons are not loading on the Home Screen, even after multiple phone restarts.
My app uses AttributedString for both text formatting and storage, using MarkdownDecodableAttributedStringKey. The new API for Genmoji looks very interesting, but I don't see an AttributeScope for it.
Is it possible to work around this and stick with AttributedString? Am I in the wrong room, or should I send in an enhancement request via Feedback Assistant?
Thanks a lot for your time!
I want my app to be able to work upside down on an iPhone so the mic can point up when it's set on a stand. But it seems that there's no longer support for this on iPhones without a home button. Am I missing something?
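For reference, a minimal sketch of how a view controller opts into upside-down portrait where the hardware allows it (on home-button-less iPhones the system may still decline to rotate); the class name is illustrative:

import UIKit

class RecorderViewController: UIViewController {
    // Ask UIKit for both portrait orientations; the target's supported
    // interface orientations must also include Upside Down.
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        [.portrait, .portraitUpsideDown]
    }
}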
Whenever I make a new app, I end up getting bug reports from people who say they can't see text, or they can just about see white text against an almost-white background. It always turns out to be because their phone is in Dark Mode, which I don't use.
Almost all of my views have the background color set to "System Background Color", which is the default value when you create a view. Why does Dark Mode change the color of the text but not the background?
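For context, a minimal sketch of how the semantic color pairs are meant to travel together; a hard-coded color on either side is what produces white-on-near-white in the opposite appearance:

import UIKit

class ExampleViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Semantic pair: both adapt together when Dark Mode flips.
        view.backgroundColor = .systemBackground  // white in light, near-black in dark

        let label = UILabel()
        label.textColor = .label                  // black in light, white in dark

        // A fixed color opts out of adaptation on one side only, which is
        // how white text can end up against an almost-white background:
        // label.textColor = .white
        view.addSubview(label)
    }
}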
Hello,
I'm building apps for iPhone and iPad.
They work together like a two-screen Nintendo Switch.
Is there any way for the iPhone and iPad to communicate without an outside server?
A PAN (personal area network) could be a solution, but I could not find any information on using it for my case.
If the iPad is connected to the iPhone as a personal hotspot (even though there's no cell tower), they should be able to communicate with each other, but I think there's a better way for my case.
Could anyone tell me where to start looking?
Thanks,
JJ
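For context, a minimal sketch of local peer discovery with MultipeerConnectivity, one framework that connects nearby devices without an outside server; the service type string is hypothetical and must also be declared in Info.plist:

import MultipeerConnectivity
import UIKit

// Hedged sketch: one device advertises, the other browses; once connected,
// the MCSession carries data over local and peer-to-peer Wi-Fi. Delegate
// callbacks (omitted here) handle the actual invite/accept handshake.
let peerID = MCPeerID(displayName: UIDevice.current.name)
let session = MCSession(peer: peerID,
                        securityIdentity: nil,
                        encryptionPreference: .required)

// iPad side: advertise availability under an illustrative service type.
let advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                           discoveryInfo: nil,
                                           serviceType: "my2screen-app")
advertiser.startAdvertisingPeer()

// iPhone side: browse for nearby peers and invite them into the session.
let browser = MCNearbyServiceBrowser(peer: peerID, serviceType: "my2screen-app")
browser.startBrowsingForPeers()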
Hi team,
Currently, I am using two modal prompts consecutively to display force-update popups based on a condition. However, there's an issue where the UI thread occasionally does not bind properly after dismissing the first prompt, so the second prompt fails to appear.
After reviewing everything, I understand this is an iOS core library binding issue that occurs from the latest iOS version onwards. Could you please provide me with a solution to resolve this issue?
Thank you!
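For reference, a hedged sketch of one common way to chain two alerts: present the second only from the main queue after the first has been dismissed, so the two presentations never race. Titles and the helper name are illustrative:

import UIKit

func showForceUpdatePrompts(from presenter: UIViewController) {
    let first = UIAlertController(title: "Update Required",
                                  message: "Please update to continue.",
                                  preferredStyle: .alert)
    first.addAction(UIAlertAction(title: "OK", style: .default) { _ in
        // The action fires while dismissal is underway; deferring one
        // main-queue cycle avoids racing the presentation machinery.
        DispatchQueue.main.async {
            let second = UIAlertController(title: "What's New",
                                           message: "See the latest changes.",
                                           preferredStyle: .alert)
            second.addAction(UIAlertAction(title: "OK", style: .default))
            presenter.present(second, animated: true)
        }
    })
    presenter.present(first, animated: true)
}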
After this bug happened, I couldn't turn the screen back to normal.
I had to restart everything for it to go away.
Hello,
We represent an AI-powered content monetization platform. Recently, we encountered challenges during the approval process for our app on the Apple platform. Specifically, our efforts to disable screenshot and screen recording features within the app were unsuccessful. Consequently, sensitive content uploaded by content creators on our platform has been vulnerable to unauthorized distribution by users, negatively impacting our business model.
We've observed that platforms like WhatsApp have successfully implemented features to prevent screenshots in view-once messages, prompting us to seek guidance on potential solutions or best practices to safeguard content creators' copyrights by disabling screenshot and screen recording capabilities on users' devices.
We appreciate any insights or recommendations you can provide to address this issue effectively.
Thank you for your attention to this matter. We eagerly await your response.
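For context, iOS offers no public API that blocks screenshots or screen recording outright; what apps can do is observe capture and react. A minimal sketch of the two relevant notifications:

import UIKit

// React after a screenshot is taken (it cannot be prevented).
NotificationCenter.default.addObserver(
    forName: UIApplication.userDidTakeScreenshotNotification,
    object: nil, queue: .main
) { _ in
    print("Screenshot detected")
}

// React when screen recording or mirroring starts or stops.
NotificationCenter.default.addObserver(
    forName: UIScreen.capturedDidChangeNotification,
    object: nil, queue: .main
) { _ in
    if UIScreen.main.isCaptured {
        // Capture is active; hide or blur sensitive content here.
        print("Screen capture active")
    }
}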
I can render text from TextKit 2 into a PDF, and everything is fine.
But in this case, the font is embedded into the PDF.
I need the PDF to contain only the paths/glyphs and not the font.
I can't find a solution yet. I don't want to create an image or use UIViews, etc.
It would be nice to get the Bézier path of the text.
I have done this with TextKit 1, but the glyphs are gone with TextKit 2.
Can anyone help me?
Thanks :)
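For context, a minimal sketch of extracting glyph outlines with Core Text, which sidesteps TextKit entirely; drawing the resulting path into a PDF context embeds paths rather than the font. The helper name is illustrative, and it handles a single line of text:

import CoreText
import UIKit

// Illustrative helper: build a CGPath of glyph outlines for one line of text.
func glyphPath(for text: NSAttributedString) -> CGPath {
    let path = CGMutablePath()
    let line = CTLineCreateWithAttributedString(text)
    for run in CTLineGetGlyphRuns(line) as! [CTRun] {
        // The run's attributes carry the (possibly substituted) font.
        let attributes = CTRunGetAttributes(run) as! [NSAttributedString.Key: Any]
        let font = attributes[NSAttributedString.Key(kCTFontAttributeName as String)] as! CTFont
        let count = CTRunGetGlyphCount(run)
        var glyphs = [CGGlyph](repeating: 0, count: count)
        var positions = [CGPoint](repeating: .zero, count: count)
        CTRunGetGlyphs(run, CFRange(location: 0, length: count), &glyphs)
        CTRunGetPositions(run, CFRange(location: 0, length: count), &positions)
        for (glyph, position) in zip(glyphs, positions) {
            // Each glyph becomes a Bézier path, translated to its position.
            if let outline = CTFontCreatePathForGlyph(font, glyph, nil) {
                let transform = CGAffineTransform(translationX: position.x, y: position.y)
                path.addPath(outline, transform: transform)
            }
        }
    }
    return path
}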