Sound and Haptics


Create meaningful and delightful experiences that engage a wider range of senses with sound and haptic design.

Posts under Sound and Haptics tag

38 results found
Post not yet marked as solved
23 Views

Custom sound and haptic

Hi, what is the best approach to customizing the notification sound and haptic feedback on Apple Watch Series 6 and later? Can I play a sound or haptic when a remote (possibly silent) notification arrives on the watch? Thanks
Asked by el2000. Last updated.
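As a starting point for questions like this one: watchOS exposes a fixed set of built-in haptic types (each with its paired sound) through WKInterfaceDevice. A minimal sketch, assuming the app or extension is running when you want to play the feedback:

```swift
import WatchKit

// A minimal sketch: WKInterfaceDevice plays one of the standard
// WKHapticType cases; custom waveforms are not part of this API.
func playArrivalFeedback() {
    // Other cases include .success, .failure, .click, .directionUp.
    WKInterfaceDevice.current().play(.notification)
}
```

Note that code like this only runs while your app or extension is active; for a remote notification delivered in the background, the sound and haptic are generally driven by the notification payload rather than by app code.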
Post not yet marked as solved
389 Views

Custom iOS Notification Sounds?

Hi, what are the requirements and restrictions for a custom iOS notification sound? Is this widely available to new apps on the App Store? I know BBC News and Tinder have custom sounds. Thanks
Asked by Dan-S3. Last updated.
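Custom notification sounds are available to any app: you bundle an audio file (30 seconds or shorter, per Apple's documentation, or the system falls back to the default sound) and reference it by name. A minimal local-notification sketch; "chime.caf" is a hypothetical file name:

```swift
import UserNotifications

// A minimal sketch: a local notification using a custom sound.
// "chime.caf" must ship in the app bundle (or Library/Sounds).
func scheduleChime() {
    let content = UNMutableNotificationContent()
    content.title = "Update"
    content.body = "Something happened."
    content.sound = UNNotificationSound(named: UNNotificationSoundName("chime.caf"))

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5, repeats: false)
    let request = UNNotificationRequest(identifier: "chime-demo",
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```

For remote notifications, the same file name goes in the `sound` key of the `aps` payload.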
Post not yet marked as solved
35 Views

'GCVirtualController?' has no member 'changeElement'

In the example code, vc.changeElement(GCInputButtonA) { gives me the error "Value of type 'GCVirtualController?' has no member 'changeElement'". I thought vc was a reference to the view controller, but that did not work either. Any help would be great. Thanks in advance.
Asked by Parystec. Last updated.
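For reference, in the shipping GameController framework the method appears to be updateConfiguration(forElement:configuration:) rather than changeElement(_:); some pre-release sample code used the older name. A hedged sketch (treat the exact property names on the element configuration as an assumption):

```swift
import GameController

// A hedged sketch: vc is the GCVirtualController, which is optional,
// so unwrap or chain it first. GCInputButtonA is the standard element
// name constant from GameController.
func hideButtonA(on vc: GCVirtualController?) {
    vc?.updateConfiguration(forElement: GCInputButtonA) { config in
        config.isHidden = true   // assumption: the element configuration exposes isHidden
        return config
    }
}
```

Here vc is the virtual controller instance you created, not a view controller, which would also explain why treating it as one didn't help.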
Post not yet marked as solved
2k Views

Glitch in Pubg sound

Hi, since I updated to iOS 14, when I turn on my mic my speaker volume is low and I cannot hear any game sounds. Can anyone help me solve this issue?
Post not yet marked as solved
210 Views

Play AVSpeechSynthesizer (TTS) via Apple Watch Speaker

Hi everyone. This is my first post on these forums, because this is the first time I have not found any clues on the internet regarding a question. I need to play sound (text to speech generated by AVSpeechSynthesizer) through the built-in speaker on Apple Watch Series 4+ (watchOS 6.2.6+). I was able to make it work on Series 3. On Series 4, however, it doesn't play through the speaker: it plays through Bluetooth AirPods if they are connected, and if they are not, there is no sound at all. Is it even possible to do? Any advice would be highly appreciated.
Post not yet marked as solved
345 Views

Custom Haptics on Apple Watch

Is it possible to customize haptics on Apple Watch (something like CoreHaptics) rather than using standard haptics out of WatchKit? I'm looking for a way to play a continuous waveform (like a more complex version of Breathe).
Post not yet marked as solved
158 Views

Haptic Feedback in Safari Web Extension

Hi! We're currently building a web extension with iOS as a primary target. Since haptics on iOS are really nice, we wanted to make use of them to give the experience a special feel on iPhone. However, haptics don't seem to work when called from SafariWebExtensionHandler... maybe because the messaging layer between the web extension and the native code runs in the background (does it?), and haptics don't work in apps that are in the background. Anyway, is there any way we can make haptics work regardless?
Post not yet marked as solved
88 Views

Notification Sounds

On iOS 15, has anyone else noticed that notification sounds sometimes do not play? They work fine for a while, then nothing, then return again later.
Asked by paulb00. Last updated.
Post not yet marked as solved
94 Views

iOS 15 Background Sound

Will more background sounds be added? I have a few suggestions: rain with thunder, light and heavy rainfall, restaurant customers talking, and forest surroundings. I feel these would be great additions to Background Sounds.
Post not yet marked as solved
423 Views

Haptic Grip Support for Sony DualSense Game Controller Grips?

Is there support for playing custom haptic files to the grips of Sony's DualSense game controller? The video mentions support for the adaptive triggers on the DualSense, but we are not finding a way to reference the haptic actuators in the grips. We'd like to play custom haptic files to them, as well as audio to the speaker inside the DualSense. We've searched the WWDC21 content as well as the documentation but have not found the answer yet. Thank you!
Post marked as solved
397 Views

RemoteIO Glitch With Sound Recognition Feature

My app uses RemoteIO to record audio; it doesn't do any playback. RemoteIO seems to be broadly compatible with the new Sound Recognition feature in iOS 14, but I'm seeing a glitch when Sound Recognition is first enabled. If my app is started and I initialise RemoteIO, and then turn on Sound Recognition (say via Control Centre), the RemoteIO input callback is not called thereafter, until I tear down the audio unit and set it back up. The sequence is something like the following:

1. Launch the app. RemoteIO is initialised and working; I can record.
2. Turn on Sound Recognition via Settings or the Control Centre widget.
3. Start recording with the already-set-up RemoteIO. The recording callback is never called again. Though no input callbacks are seen, kAudioOutputUnitProperty_IsRunning is reported as true, so the audio unit thinks it is active.
4. Tear down the audio unit and set it up again. Recording works. The buffer size has changed, reflecting some effect of the Sound Recognition feature on the audio session.

I also noticed that when Sound Recognition is enabled, I see several (usually 3) AVAudioSession.routeChangeNotifications in quick succession. When Sound Recognition is disabled while RemoteIO is set up, I don't see this problem. I'm allocating my own buffers, so it's not a problem with their size.

What could be going on here? Am I not handling a route change properly? There doesn't seem to be a reliable sequence of events I can catch to know when to reset the audio unit. The only fix I've found is to hack in a timer that checks for callback activity shortly after starting recording and resets the audio unit if no callback activity is seen. Better than nothing, but not super reliable.
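One direction worth exploring, given the burst of route-change notifications the post describes: observe AVAudioSession.routeChangeNotification and rebuild the audio unit after the burst settles. A sketch of the observer side only (coalescing the burst and the actual RemoteIO teardown/setup are left to the caller); this is a workaround idea, not a confirmed fix:

```swift
import AVFoundation

// A sketch: report each audio route change so the owner can decide
// when to tear down and rebuild its RemoteIO audio unit.
final class RouteChangeWatcher {
    var onRouteChange: ((AVAudioSession.RouteChangeReason?) -> Void)?

    init() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] note in
            // The reason arrives as a raw UInt in userInfo.
            let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt
            self?.onRouteChange?(raw.flatMap(AVAudioSession.RouteChangeReason.init))
        }
    }
}
```

Since the post reports several notifications in quick succession, debouncing (e.g. rebuilding only after no change has arrived for a short interval) would likely be more robust than reacting to each one.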
Post not yet marked as solved
111 Views

button haptic feedback in app

Hey everyone, I am new to app development, but I recently created a basic quiz app that asks coding questions with four multiple-choice options. The problem is that whenever a person selects a wrong answer, I want the haptic engine on the iPhone to vibrate, i.e. to give feedback. If anyone knows how to do this, please share the guidance and the steps :)
Asked by Vikhyat. Last updated.
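For short haptics in response to user actions like this, UIKit's feedback generators are the usual tool. A minimal sketch to call from the answer button's action handler (haptic hardware is required, so older devices and the simulator stay silent):

```swift
import UIKit

// A minimal sketch: play a success or error haptic when an
// answer is selected in the quiz.
func answerSelected(correct: Bool) {
    let generator = UINotificationFeedbackGenerator()
    generator.prepare()   // warms up the Taptic Engine to reduce latency
    generator.notificationOccurred(correct ? .error.self == .error ? (correct ? .success : .error) : .error : .error)
}
```

Calling prepare() shortly before the feedback keeps the haptic in sync with the tap.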
Post not yet marked as solved
273 Views

Error on IOS 14.6 update?

I have experienced some issues with iOS 14.6 on my iPhone 7 Plus. The system hangs: apps sometimes stop running fluently and the phone won't unlock. Apps hang during use, and if I lock and unlock the phone again they start working. The sound is also lower than before updating: the loudspeaker is not as clear or as loud as it was before iOS 14.6, and the ringtone volume is not as high as before either.
Asked by gaanes. Last updated.
Post not yet marked as solved
388 Views

Is there an equivalent accessibilityElement API for SwiftUI - specifically when making visualizations using the new Canvas view in SwiftUI

What's the most effective equivalent to the accessibilityElement API when you're creating visualizations in SwiftUI using the new Canvas drawing mechanism? When I'm drawing with Canvas, I know roughly the same positioning information as when drawing chart visualizations in UIKit or AppKit, but I didn't spot an equivalent API for marking sections of the visualization as associated with specific values (akin to setting the frame coordinates of a UIView accessibility element). I've only just started playing with Canvas, so I may be wrong, but it seems like a monolithic view element that I can't easily break down into sub-components the way I can by applying separate CALayers or UIViews on which to hang accessibility indicators. For making the equivalent bar chart in straight-up SwiftUI, is there a similar mechanism, or is this a place where I'd need to pop the escape hatch to UIKit or AppKit, do the relevant work there, and then host it inside a SwiftUI view?
Asked by heckj. Last updated.
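One candidate answer, assuming the accessibilityChildren modifier that shipped alongside Canvas in the same SwiftUI release: keep the Canvas drawing monolithic, but describe its logical sub-elements through an invisible overlay of labeled views. A hedged sketch with hypothetical sample data:

```swift
import SwiftUI

// A hedged sketch: expose each bar of a Canvas-drawn chart as its
// own accessibility element via accessibilityChildren.
struct BarChart: View {
    let values: [Double] = [3, 7, 5]   // hypothetical sample data

    var body: some View {
        Canvas { context, size in
            let barWidth = size.width / CGFloat(values.count)
            for (index, value) in values.enumerated() {
                let height = CGFloat(value / 10) * size.height
                let rect = CGRect(x: CGFloat(index) * barWidth,
                                  y: size.height - height,
                                  width: barWidth * 0.8,
                                  height: height)
                context.fill(Path(rect), with: .color(.blue))
            }
        }
        .accessibilityLabel("Bar chart")
        .accessibilityChildren {
            // These views are never rendered; they only provide the
            // accessibility tree, laid out to roughly match the bars.
            HStack {
                ForEach(values.indices, id: \.self) { index in
                    Rectangle()
                        .accessibilityLabel("Bar \(index + 1)")
                        .accessibilityValue("\(Int(values[index]))")
                }
            }
        }
    }
}
```

If finer control over element frames is needed than this layout-based approach gives, dropping to UIKit/AppKit hosting, as the post suggests, remains the escape hatch.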