I've built an Apple Watch app in Xcode with two Swift views (ContentView.swift and InterfaceController.swift). The sound and haptic feedback code is in InterfaceController.swift, but no sound plays along with the haptic feedback on the Apple Watch after building the app.
The app is otherwise done: when I rotate the Digital Crown the haptic feedback fires, but no sound plays with it.
code
import WatchKit
import AVFoundation

class InterfaceController: WKInterfaceController {
    // ... your UI elements

    func playSelectionHapticAndSound() {
        // Play a haptic feedback pattern
        WKInterfaceDevice.current().play(.success)

        // Load and play a selection sound effect
        guard let soundURL = Bundle.main.url(forResource: "spin", withExtension: "wav") else { return }
        do {
            let player = try AVAudioPlayer(contentsOf: soundURL)
            player.play()
        } catch {
            print("Error playing sound: \(error)")
        }
    }
}
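One likely cause of this symptom is that the AVAudioPlayer is a local variable, so it is deallocated as soon as the method returns, before playback starts. A minimal sketch of a possible fix, assuming the same "spin.wav" resource, is to keep a strong reference to the player:

import WatchKit
import AVFoundation

class InterfaceController: WKInterfaceController {
    // Keep a strong reference so the player is not deallocated before playback finishes.
    private var audioPlayer: AVAudioPlayer?

    func playSelectionHapticAndSound() {
        // Haptic feedback
        WKInterfaceDevice.current().play(.success)

        // Sound effect: store the player in a property instead of a local variable.
        guard let soundURL = Bundle.main.url(forResource: "spin", withExtension: "wav") else { return }
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
            audioPlayer?.prepareToPlay()
            audioPlayer?.play()
        } catch {
            print("Error playing sound: \(error)")
        }
    }
}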
Sound and Haptics
Create meaningful and delightful experiences that engage a wider range of senses with sound and haptic design.
Hello, my submission is based on haptics. Without them the app doesn't make sense, and only a real iPhone can provide them. But it says that Xcode playgrounds will be tested on the Simulator.
Is that really the case? What can I do?
Thank you in advance!
I would love to see a feature where the iPhone vibrates when a call is answered by the other person. This would be helpful for users who are walking or holding their phone away from their ear while waiting for the call to connect. The vibration would provide a subtle yet effective notification that the call has been picked up, reducing unnecessary screen-checking.
This feature could be optional and toggled in settings under Sounds & Haptics > Call Feedback for users who prefer it.
I want to understand which component types are intended to have hint text, haptic feedback, or an earcon associated with them for VoiceOver screen reader users. Is there a list somewhere, or a HIG guideline for which transition types should have a sound?
Some transitions in Apple apps generally include different beep sounds, such as:
opening a new screen
screen dimming
when a VoiceOver user swipes from the header/navbar to the body
a scraping sound when swiping up or down a page
the beginning or end of the body section
in Calculator, when swiping from one row to the next
opening a pop-up menu
I would also appreciate any direction on which code strings are associated with these sounds, and how custom components can provide these sounds, haptics, or hints where they are expected. On the other hand, I don't want to get that info and then dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
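As a point of reference, here is a minimal sketch (not an authoritative mapping) of how a custom UIKit component can supply a hint and trigger the system's VoiceOver sounds: posting a layoutChanged or screenChanged notification plays the corresponding built-in earcon, and accessibilityHint carries the hint text. The view name and strings are illustrative only.

import UIKit

final class CustomCardView: UIView {
    override func didMoveToWindow() {
        super.didMoveToWindow()
        isAccessibilityElement = true
        accessibilityLabel = "Order summary"
        // Hint text read after a pause when the element is focused.
        accessibilityHint = "Double tap to expand details."
    }

    func contentDidChange() {
        // Posting .layoutChanged plays the system's "layout changed" earcon
        // and moves VoiceOver focus to the passed element.
        UIAccessibility.post(notification: .layoutChanged, argument: self)
    }

    func didPresentNewScreen(firstElement: Any?) {
        // .screenChanged plays the "new screen" sound VoiceOver users hear
        // when a new screen or popup appears.
        UIAccessibility.post(notification: .screenChanged, argument: firstElement)
    }
}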
I checked the latest release notes for the latest beta, and there doesn't seem to be a fix for this. Basically, the vibrations you receive when you long-press a message to react, or hold down on an app on the Home Screen, stop working after a while.
This issue recurs randomly.
Steps to repro:
Not fully sure on this, but I'm on an iPhone 16 Pro Max running the iOS 18.3 dev beta described in the title. I have the default haptics enabled, so you receive a vibration when you long-press a message in iMessage or Messenger, and also when you long-press an app on the Home Screen.
These seem to stop working, along with any other vibrations (apart from calls and notifications), after a while. The only workaround is to restart the iPhone entirely.
Is anyone else facing the same issue?
Our watch app, Regatta Timer, is a specialised countdown timer for sailing competitions. It is crucial that the beeps and haptics continue when the wrist is down on Always On displays. We tried to enable this by adding a background mode, but that only works in the Xcode Apple Watch simulator, not on an actual device with an Always On display. Any idea how we can get this working on the Apple Watch device as well?
In ContentView.swift we currently added this code:
    WKInterfaceDevice.current().play(sound)
but that doesn't work, regardless of whether we check phase == .active or not.
STEPS TO REPRODUCE
Install on an ACTUAL DEVICE with an Always On display.
Start the countdown timer: the beeps and haptics are OK (each minute, ...).
Do 'wrist down': the countdown timer continues on the dimmed display, but the sounds and haptics stop working until you raise your wrist to wake the display.
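One mechanism worth investigating for this is WKExtendedRuntimeSession, which watchOS provides to keep an app running and able to deliver haptics while the wrist is down; whether a sailing timer fits the allowed session types is not confirmed here, and the class and method names around the session below are an assumed, minimal sketch:

import WatchKit

final class TimerSessionController: NSObject, WKExtendedRuntimeSessionDelegate {
    private var session: WKExtendedRuntimeSession?

    func startCountdownSession() {
        // The session type is declared in the target's capabilities; starting it
        // keeps the app scheduled while the display is dimmed.
        let session = WKExtendedRuntimeSession()
        session.delegate = self
        session.start()
        self.session = session
    }

    func signalMinuteMark() {
        // notifyUser(hapticType:) plays a haptic even with the wrist down,
        // for session types that allow it.
        session?.notifyUser(hapticType: .notification)
    }

    // MARK: - WKExtendedRuntimeSessionDelegate
    func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                error: Error?) {}
}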
I updated to developer beta 18.3 on my iPad (7th generation). After the update, sound isn't working. I tried restarting and completely resetting the device, but no luck. Also, the YouTube app isn't working after the update.
I have an iPhone 14 with iOS 18 installed on which I noticed the vibration no longer works. The haptic feedback setting is set to "Always on", I have no vibration on notifications, nor on incoming calls and the keyboard feedback does not work either.
I also noticed another strange thing: going to the ringtones menu, these do not play if I select them to try them.
I tried to update to iOS 18.1, but even with this version I have the same problem. Could it therefore be a hardware problem and not a software problem?
Is anyone else having the same problem as me?
Thanks
New in iOS 17.5 is UIImpactFeedbackGenerator/impactOccurred(at:), which generates haptic feedback at a specific screen location.
https://developer.apple.com/documentation/uikit/uiimpactfeedbackgenerator/4403143-impactoccurred
Which devices support this?
How does this work if the Taptic Engine has a fixed physical location?
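For context, a minimal usage sketch as I understand the iOS 17.5 additions: the generator is associated with a view via the new initializer, and the CGPoint passed to impactOccurred(at:) is interpreted in that view's coordinate space. The view controller, pan handler, and style choice here are assumptions, not from the post.

import UIKit

final class SliderViewController: UIViewController {
    // iOS 17.5+: associating the generator with a view lets UIKit use the location hint.
    private lazy var impactGenerator = UIImpactFeedbackGenerator(style: .light, view: view)

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        let location = gesture.location(in: view)
        impactGenerator.prepare()
        // Plays impact feedback with a hint about where on screen the interaction happened.
        impactGenerator.impactOccurred(at: location)
    }
}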
After installing iOS 18, the iPhone has consistently low volume, and from that already low level it suddenly drops further when playing some media, then returns to its normal (still low) level on its own. I have tried several suggestions from the internet, such as a forced restart, restarting several times, and turning Vocal Shortcuts on and then off again, but nothing helped. Any suggestions? It's an iPhone 16 Pro Max, by the way.
Hello everyone,
I’m experiencing occasional crashes in my app related to the Core Haptics framework, specifically when creating haptic pattern players. The crashes are intermittent and not easily reproducible, so I’m hoping to get some guidance on what might be causing them.
It seems to be connected to the audio resource I'm using within the AHAP file.
Setup:
I use AVAudioSession and AVAudioEngine to record and play continuous audio. After activating the audio session and setting up the audio engine, I initialize the CHHapticEngine as follows:
let engine = try CHHapticEngine(audioSession: .sharedInstance())
...
try engine?.start()
// Recreate all haptic pattern players you had created.
let pattern = createPatternFromAHAP(Pattern.thinking.rawValue)!
thinkingHapticPlayer = try? engine?.makePlayer(with: pattern)
// Repeat for other players...
AHAP file:
"Pattern":
[
... haptic events
{
"Event":
{
"Time": 0.0,
"EventType": "AudioCustom",
"EventWaveformPath": "voice.chat.thinking.mp3",
"EventParameters":
[
{ "ParameterID": "AudioVolume", "ParameterValue": 0.7 }
]
}
}
]
I’m receiving the following crash report:
Crashed: com.apple.main-thread
EXC_BREAKPOINT 0x00000001ba525c68
0  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:].cold.1 + 104
1  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:].cold.1 + 104
2  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:] + 3784
3  CoreHaptics  -[CHHapticPattern resolveExternalResources:error:] + 388
4  CoreHaptics  -[PatternPlayer initWithPlayable:engine:privileged:error:] + 560
5  CoreHaptics  -[CHHapticEngine createPlayerWithPattern:error:] + 256
6  Mind  VoiceChatHapticEngine.swift line 170  VoiceChatHapticEngine.createThinkingHapticPlayer() + 170
Has anyone encountered similar crashes when working with CHHapticEngine and haptic patterns that contain an AudioCustom event?
Thank you so much for your help.
Ondrej
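Here is a hedged sketch of one defensive variant of the setup above, under the assumption that the crash stems from the AHAP's EventWaveformPath failing to resolve when the player is created: verify the referenced file is actually in the bundle and treat player creation as failable. The file and helper names follow the post; the bundle check is an assumption, not a confirmed fix for this crash.

import CoreHaptics

func makeThinkingPlayer(engine: CHHapticEngine) -> CHHapticPatternPlayer? {
    // The AHAP references "voice.chat.thinking.mp3"; confirm it resolves before
    // Core Haptics tries to register it internally.
    guard Bundle.main.url(forResource: "voice.chat.thinking", withExtension: "mp3") != nil else {
        print("Audio resource missing from bundle")
        return nil
    }

    guard let pattern = createPatternFromAHAP(Pattern.thinking.rawValue) else {
        return nil
    }

    do {
        // makePlayer(with:) resolves external audio resources; wrapping it in
        // do/catch at least surfaces the error instead of swallowing it with try?.
        return try engine.makePlayer(with: pattern)
    } catch {
        print("Failed to create haptic player: \(error)")
        return nil
    }
}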
Haptics are often represented as audio for presentation purposes by Apple in videos and learning resources.
I am curious if:
...Apple has released, or is willing to release, any tools that may have been used to synthesize audio representing a haptic pattern (such as in their WWDC19 audio-haptic presentation)?
...there are any current tools available that take haptic instructions as input (like AHAP) and output an audio file?
...there is some low-level access to the signal that drives the Taptic Engine, so that it can be repurposed as an audio stream?
...you have any other suggestions!
I can imagine some crude solutions that hack together preexisting synthesizers and fudge together a process to convert AHAP to MIDI instructions, dialing in some synth settings to mimic the behaviour of an actuator, but I'm not too interested in that rabbit hole just yet.
Any thoughts? Very curious what the process was for the WWDC videos and audio examples in their documentation...
Thank you!
I updated my Vision Pro to VisionOS 2.0 Beta yesterday, and now everything is very quiet even at max volume. I tested with the built in speakers, Beats Pro and Airpods Pro Gen 2 as well and same problem with all of them.
If I turn the volume down to 50%, you can't tell what audio is being played anymore.
I tried restarting the headset and it makes no difference.
Anything else I can try to resolve this issue?
Based on info online, I'm under the impression that we can add spatial audio to USDZ files using Reality Composer Pro; however, I've been unable to hear this audio outside of the preview audio in the scene inspector. Attached is a screenshot of how I've laid out the scene.
I can see the 3D object fine on mobile and Vision Pro, but I can't get the audio to loop. I have made sure the audio file in the scene is linked as the resource for the spatial audio node. Am I setting this up wrong, is it broken, or is this simply not a feature that can be saved back to USDZ? In the following link they note their USDZ could "play an audio track while viewing the model", but the model isn't there anymore.
Can someone confirm where I might be going wrong, please?
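For comparison, here is a minimal RealityKit sketch of attaching looping spatial audio to an entity in code rather than relying on the USDZ export; the entity and file names are placeholders, and this is a programmatic workaround, not a statement about what Reality Composer Pro does or doesn't embed in USDZ.

import RealityKit

func addLoopingSpatialAudio(to entity: Entity) {
    do {
        // Load a bundled audio file as a spatial, looping resource.
        let resource = try AudioFileResource.load(
            named: "ambient.wav",        // placeholder file name
            inputMode: .spatial,
            shouldLoop: true
        )
        // playAudio(_:) starts playback positioned at the entity and returns a controller.
        _ = entity.playAudio(resource)
    } catch {
        print("Failed to load audio resource: \(error)")
    }
}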
Just installed iOS 18 Beta 3.
I am seeing AccessibilityUIServer using the microphone, and this is causing no notification sounds, an inability to use Siri by voice, and the volume control being grayed out.
If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app.
Calls still work, since AccessibilityUIServer will release the microphone and the phone will ring.
Feedback ID is FB14241838.
I'm developing an iPhone App that is using on-device speech recognition. I'd like to create a 'buzz' to confirm the word was recognised without having to look at the screen.
I can get haptics working from a button, but when I try to attach sensoryFeedback to any view element, it does not fire, even though the variable I toggle on successful voice commands flips correctly.
Has anyone tried a similar thing with success, or have any ideas?
Thanks very much.
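For reference, a minimal sketch of the pattern being described, assuming SwiftUI's sensoryFeedback(_:trigger:) modifier and a Boolean that flips on each recognized word; the property names and recognition callback are illustrative, not from the post.

import SwiftUI

struct RecognitionView: View {
    // Flipped by the speech-recognition callback each time a word is recognized.
    @State private var wordRecognized = false

    var body: some View {
        Text("Listening…")
            // Fires a success haptic whenever the trigger value changes.
            .sensoryFeedback(.success, trigger: wordRecognized)
    }

    func handleRecognizedWord() {
        // Must be updated on the main actor for the view (and the feedback) to react.
        wordRecognized.toggle()
    }
}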
I've had this pair of AirPods since 2022, bundled with my iPhone 12. I tried connecting them to my PC just now, and what's annoying is that my AirPods keep disconnecting from my Windows PC by themselves! I'm so upset, and I need your help!
Just got my hands on the Apple Pencil Pro, and was looking forward to being able to add tactile feedback to my apps through the Pencil.
Apple appears to have updated their docs to mention the possibility of haptic feedback via the Pencil, and makes vague references to a certain SwiftUI modifier, but doesn't give any Pencil-specific guidelines, code, or details about its capabilities at all. I'd like to ask if it's possible to use Core Haptics to queue custom haptic feedback (as of now, the standard code which works on iPhone doesn't seem to work on an iPad with a paired Pencil Pro), or, if that's not possible, whether there are any updated resources or example code for triggering the predefined Pencil Pro haptics.
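For what it's worth, a short sketch of the feedback-generator route as I understand the iOS 17.5 additions: UICanvasFeedbackGenerator is the UIKit class for drawing/canvas feedback, and on supported hardware (reportedly including Apple Pencil Pro) the system decides where the haptic is rendered. The view controller, method names that call the generator, and event choices are assumptions.

import UIKit

final class CanvasViewController: UIViewController {
    // iOS 17.5+: generator tied to the canvas view; the system routes the haptic
    // to the appropriate hardware (device or a supported pencil).
    private lazy var canvasFeedback = UICanvasFeedbackGenerator(view: view)

    func strokeDidSnapToGuide(at point: CGPoint) {
        // Feedback for an alignment/snapping event at a canvas location.
        canvasFeedback.alignmentOccurred(at: point)
    }

    func strokeDidClosePath(at point: CGPoint) {
        // Feedback for completing a drawn path.
        canvasFeedback.pathCompleted(at: point)
    }
}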
Can I use Apple's sound recognition in my augmented reality app to trigger content? Or is there another source I can use?
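As one possible direction (not a confirmation that the system Sound Recognition accessibility feature is exposed to apps), the SoundAnalysis framework offers built-in sound classification that an AR app could use to trigger content. A rough sketch, with the observer wiring, audio-engine tap, and confidence threshold assumed:

import SoundAnalysis
import AVFoundation

final class SoundTrigger: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)

        // Built-in classifier with hundreds of labels (applause, dog bark, etc.).
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)

        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try audioEngine.start()
        self.analyzer = analyzer
    }

    // Called with classification results; trigger AR content from here.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let classification = (result as? SNClassificationResult)?.classifications.first,
              classification.confidence > 0.8 else { return }
        print("Detected sound: \(classification.identifier)")
        // e.g. notify the AR scene to spawn or animate content.
    }
}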
Hi, I noticed that several of the top Vision Pro apps such as Disney+ and Max have similar if not identical SFX for basic navigation like Select and Back. I was wondering if Apple has provided an audio SFX library as a resource, or if the similarity is coincidental.
I'm not familiar with Apple providing anything like this in the past, but figured on a totally new platform where they might be trying to establish a paradigm, it was worth looking into!