Hello,
The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine.
My question then: is it possible / are we "allowed" to trigger haptic feedback on events that weren't initiated by the pencil?
For example, say the user is dragging a slider with a left-hand finger while holding the pencil in their right hand: would it be possible to make the pencil vibrate to indicate that the dragged slider knob reached a certain point?
Or is the rule that vibration is only possible/allowed when the pencil itself generated a touch?
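For reference, the pencil-initiated case that does work looks roughly like this. A minimal sketch, assuming a canvas view and a point supplied by a pencil-driven interaction (both placeholders here):

import UIKit

// Hypothetical helper: fire canvas feedback for a pencil-driven event.
// With Apple Pencil Pro, the system can route this haptic to the pencil.
func pencilSnappedToGuide(on canvas: UIView, at point: CGPoint) {
    let feedback = UICanvasFeedbackGenerator(view: canvas)
    feedback.alignmentOccurred(at: point)
}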
Thanks!
Sound and Haptics
Create meaningful and delightful experiences that engage a wider range of senses with sound and haptic design.
Posts under the Sound and Haptics tag
My MacBook's speakers have been crackling on every sound since macOS 26 Beta 1, and the problem is still present on Beta 9.
It happens especially when Simulator is open.
I want to understand which component types are intended to have associated hint text, haptic feedback, or an earcon for VoiceOver screen reader users. Is there a list somewhere, or a HIG guideline, for which transition types should have a sound?
Some transitions in Apple apps generally include different beep sounds, such as:
opening a new screen
screen dimming
when a VoiceOver user swipes from the header/navbar to the body
a scraping sound when swiping up or down a page
the beginning or end of the body section (for example, in Calculator when swiping from one row to the next)
opening a pop-up menu
I would also appreciate any direction on which APIs are associated with these sounds, and on how custom components can adopt these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that information and then dictate that every component needs a specific beep, since these sounds appear to be reserved for specific purposes.
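For what it's worth, the only public hooks I have found for custom components are hints and accessibility notifications, not the built-in earcons themselves. A minimal SwiftUI sketch, with a hypothetical control and strings:

import SwiftUI
import UIKit

struct CustomDisclosure: View {
    @State private var isExpanded = false

    var body: some View {
        Button(isExpanded ? "Hide details" : "Show details") {
            isExpanded.toggle()
            // .layoutChanged plays VoiceOver's layout-update cue and can move
            // focus to the argument; the system earcons themselves are not
            // directly scriptable.
            UIAccessibility.post(notification: .layoutChanged, argument: nil)
        }
        .accessibilityHint("Expands or collapses the details section")
    }
}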
Hello,
I submitted a question to clarify which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182.
That question stems from a more direct one about specific components: are tab lists and disclosures natively intended to include haptics, a screen reader hint, or other state or properties that indicate to screen reader users where the component begins or ends?
In some web experiences there is screen reader hint text stating "end of..." or "entering" as a way to define the boundaries of these inline dialogs.
I asked about haptics in the prior thread because I do not recall a natively implemented version of this, apart from some haptic cues I have not experienced consistently, so I am not sure whether that is an intended native Swift implementation or something custom.
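For comparison, the web-style "end of..." boundary can be approximated with an invisible accessibility element. This is a custom workaround rather than a documented native behavior, as far as I can tell:

import SwiftUI

struct BoundedDisclosure: View {
    @State private var isExpanded = false

    var body: some View {
        DisclosureGroup("Details", isExpanded: $isExpanded) {
            Text("Disclosure body content")
            // Invisible element that announces the end boundary to VoiceOver.
            Color.clear
                .frame(height: 1)
                .accessibilityElement()
                .accessibilityLabel("End of details")
        }
    }
}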
Topic: Accessibility & Inclusion / SubTopic: General
Tags: iOS, Accessibility, Sound and Haptics, Core Haptics
Is there a way to configure the APNS notification sound volume to be louder?
I am implementing custom sounds (narrated sentences) for APNS. The custom sound does play, but it is not loud enough, even though I set the device volume and the "Ringtone and Alerts" volume to max.
I tried amplifying the custom sound file. It plays louder, but the gain I can add is minimal if I want to maintain the quality of the sound without distortion.
I tried using a Notification Service Extension with AVAudioPlayer and AVAudioSession to play the sound. It does play louder at max volume compared with relying on the default sound payload in APNS, but AVAudioPlayer and AVAudioSession do not seem to be usable when the application is in the background or killed. Is there any other alternative I could use?
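For context, the baseline path, whose volume is capped by the system's Ringtone and Alerts setting, is assigning a UNNotificationSound from the Notification Service Extension. A minimal sketch, with a hypothetical file name:

import UserNotifications

class NotificationService: UNNotificationServiceExtension {
    override func didReceive(
        _ request: UNNotificationRequest,
        withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void
    ) {
        guard let content = request.content.mutableCopy() as? UNMutableNotificationContent else {
            contentHandler(request.content)
            return
        }
        // "narration.caf" is a placeholder; the file must be in the main
        // bundle or in Library/Sounds of the app container, and the system
        // still caps its playback volume.
        content.sound = UNNotificationSound(named: UNNotificationSoundName("narration.caf"))
        contentHandler(content)
    }
}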
I want to implement a feature where a custom notification sound file is downloaded from the server when the app is first launched and stored locally on the device. When a push notification arrives, the stored sound should be played in all app states, including foreground, background, and terminated (killed) state.
Does anyone have an idea on how to implement this in iOS? Specifically, I am looking for guidance on:
1) Downloading and storing the sound file securely on the device.
2) Using the locally stored file for push notification sounds.
3) Ensuring the sound plays correctly in all states, including when the app is not running.
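For 1) and 2), a minimal sketch of one approach: download the file once and store it in Library/Sounds, the app-container directory that UNNotificationSound(named:) also searches. The URL and file name are placeholders:

import Foundation

func installNotificationSound(from remote: URL) async throws {
    let library = try FileManager.default.url(
        for: .libraryDirectory, in: .userDomainMask,
        appropriateFor: nil, create: true)
    let soundsDir = library.appendingPathComponent("Sounds", isDirectory: true)
    try FileManager.default.createDirectory(
        at: soundsDir, withIntermediateDirectories: true)

    let (tempURL, _) = try await URLSession.shared.download(from: remote)
    let destination = soundsDir.appendingPathComponent("narration.caf")
    try? FileManager.default.removeItem(at: destination)  // replace any old copy
    try FileManager.default.moveItem(at: tempURL, to: destination)
}

For 3), the push payload's sound key then references the stored file by name. It plays in the background and terminated states like any notification sound, while foreground presentation is controlled by the notification delegate.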
Topic: Community / SubTopic: Apple Developers
Tags: APNS, User Notifications, Sound and Haptics, Notification Center
I've built an Apple Watch app in Xcode with two Swift files (ContentView.swift and InterfaceController.swift). The sound and haptic feedback code is in InterfaceController.swift.
The app builds, and the haptic feedback plays when I rotate the Apple Watch Digital Crown, but no sound plays with it.
Code:
import WatchKit
import AVFoundation

class InterfaceController: WKInterfaceController {
    // ... your UI elements

    // Keep a strong reference to the player; a local AVAudioPlayer is
    // deallocated as soon as the method returns, which silences the sound.
    private var player: AVAudioPlayer?

    func playSelectionHapticAndSound() {
        // Play a haptic feedback pattern.
        WKInterfaceDevice.current().play(.success)

        // Load and play a selection sound effect.
        guard let soundURL = Bundle.main.url(forResource: "spin", withExtension: "wav") else { return }
        do {
            player = try AVAudioPlayer(contentsOf: soundURL)
            player?.play()
        } catch {
            print("Error playing sound: \(error)")
        }
    }
}
Hello, my submission is based on haptics. Without them the app doesn't make sense, and only a real iPhone can provide them. But it says that Xcode playgrounds will be tested in Simulator.
Is it really like this? What can I do?
Thank you in advance!
I would love to see a feature where the iPhone vibrates when a call is answered by the other person. This would be helpful for users who are walking or holding their phone away from their ear while waiting for the call to connect. The vibration would provide a subtle yet effective notification that the call has been picked up, reducing unnecessary screen-checking.
This feature could be optional and toggled in settings under Sounds & Haptics > Call Feedback for users who prefer it.
I checked the release notes for the latest beta, and there doesn't seem to be a fix for this. Basically, the vibrations you get when you long-press a message to react, or hold down on an app on the Home Screen, stop working after a while.
The issue recurs randomly.
Steps to repro:
I'm not fully sure of the exact steps, but I'm on an iPhone 16 Pro Max running the iOS 18.3 dev beta described in the title. I have the default haptics enabled, which give a vibration when you long-press a message in iMessage or Messenger, and when you long-press an app on the Home Screen.
These seem to stop working after a while, along with any other vibrations (apart from calls and notifications). The only workaround is to restart the iPhone entirely.
Anyone else facing the same?
Our watch app, Regatta Timer, is a specialised countdown timer for sailing competitions. It is crucial that the beeps and haptics continue when the wrist is down on always-on displays. We tried to enable this by adding a background mode, but that only works in the Xcode Apple Watch simulator, not on an actual device with an always-on display. Any idea how we can get this working on the Apple Watch device as well?
In ContentView.swift we currently call:
WKInterfaceDevice.current().play(sound)
but that doesn't work, regardless of whether we gate it on phase == .active or not.
STEPS TO REPRODUCE
Install on an ACTUAL DEVICE with always on display
start the countdown timer: beeps & sounds are OK (each minute,...)
do 'wrist down': the countdown timer continues on the dimmed display, but the sounds & haptics stop working until you raise your wrist to wake up the display.
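For reference, the documented path for alerting the user while the wrist is down appears to be an extended runtime session. A rough sketch, with hypothetical names, and session-type/capability configuration omitted:

import WatchKit

final class RaceSessionController: NSObject, WKExtendedRuntimeSessionDelegate {
    private var session: WKExtendedRuntimeSession?

    func startRaceSequence() {
        let s = WKExtendedRuntimeSession()
        s.delegate = self
        s.start()  // or start(at:) for a scheduled race start
        session = s
    }

    func signalMinuteMark() {
        // notifyUser plays a haptic even while the display is dimmed.
        session?.notifyUser(hapticType: .notification, repeatHandler: nil)
    }

    // MARK: WKExtendedRuntimeSessionDelegate
    func extendedRuntimeSessionDidStart(_ s: WKExtendedRuntimeSession) {}
    func extendedRuntimeSessionWillExpire(_ s: WKExtendedRuntimeSession) {}
    func extendedRuntimeSession(_ s: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                error: Error?) {}
}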
Hi everyone,
I'm currently facing an issue with AVAudioPlayer in my SwiftUI project. Despite ensuring that the sound file "buttonsound.mp3" is properly added to the project's resources (I dragged and dropped it into Xcode), the application is still unable to locate the file when attempting to play it.
Here's the simplified version of the code I'm using:
import SwiftUI
import AVFoundation

// Keep a strong reference outside the function; a local AVAudioPlayer is
// deallocated when the function returns and never gets a chance to play.
private var audioPlayer: AVAudioPlayer?

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play sound") {
                playSound(named: "buttonsound", ofType: "mp3")
            }
        }
    }
}

func playSound(named name: String, ofType type: String) {
    // A nil URL here usually means the file was copied into the project
    // folder but not added to the app target (check Target Membership and
    // the Copy Bundle Resources build phase).
    guard let soundURL = Bundle.main.url(forResource: name, withExtension: type) else {
        print("Sound file not found")
        return
    }
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
        audioPlayer?.prepareToPlay()
        audioPlayer?.play()
    } catch {
        print("Error playing sound: \(error.localizedDescription)")
    }
}
I updated to developer beta 18.3 on my iPad (7th generation). After the update, sound isn't working. I tried restarting and completely resetting the device, but no luck. Also, the YouTube app isn't working after the update.
I have an iPhone 14 with iOS 18 installed, and I noticed the vibration no longer works. The haptic feedback setting is set to "Always on", yet I get no vibration on notifications or incoming calls, and keyboard feedback does not work either.
I also noticed another strange thing: in the ringtones menu, the ringtones do not play when I select them to try them.
I tried updating to iOS 18.1, but even with this version I have the same problem. Could it therefore be a hardware problem rather than a software one?
Is anyone else having the same problem as me?
Thanks
New in iOS 17.5 is UIImpactFeedbackGenerator/impactOccurred(at:), which generates haptic feedback at a specific screen location.
https://developer.apple.com/documentation/uikit/uiimpactfeedbackgenerator/4403143-impactoccurred
Which devices support this?
How does this work if the Taptic Engine has a fixed physical location?
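As I read the documentation, the location-aware call requires a generator created with the new view-based initializer, so the point is interpreted in that view's coordinate space. A minimal sketch, with the view and point as placeholders:

import UIKit

func thump(in view: UIView, at point: CGPoint) {
    // iOS 17.5+: the generator is anchored to a view.
    let generator = UIImpactFeedbackGenerator(style: .medium, view: view)
    generator.prepare()
    generator.impactOccurred(at: point)
}

On iPhone the Taptic Engine's location is fixed, so my assumption is that the point mainly informs routing on hardware that can localize feedback (for example, trackpads or Apple Pencil Pro), but I'd like confirmation.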
After installing iOS 18, my iPhone has consistently low volume, and from that already-low level it suddenly drops further when playing some media, then returns on its own to its normal (still low) level. I have tried several suggestions from the internet, like a forced restart, multiple restarts, and toggling Vocal Shortcuts on and off, but nothing helped. Any suggestions? It's an iPhone 16 Pro Max, by the way.
Just installed iOS 18 Beta 3.
I am seeing AccessibilityUIServer using the microphone, and this is causing no notification sounds, an inability to use Siri by voice, and a grayed-out volume control.
If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app.
Calls still work, since AccessibilityUIServer releases the microphone and the phone rings.
Feedback ID is FB14241838.
Topic: Accessibility & Inclusion / SubTopic: General
Tags: Audio, Accessibility, Sound Analysis, Sound and Haptics