Is it possible to customize haptics on Apple Watch (something like Core Haptics) rather than using the standard WatchKit haptics? I'm looking for a way to play a continuous waveform (like a more complex version of the Breathe app).
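As far as I know, Core Haptics is not available on watchOS, so custom waveforms can't be played there; WatchKit only offers the predefined WKHapticType values via WKInterfaceDevice.play(_:). A rough workaround is to schedule the predefined taps on a timer — a sketch (the interval is a made-up value, and the rhythm is far coarser than a true continuous waveform):

```swift
import WatchKit

// Core Haptics is unavailable on watchOS; the supported API is
// WKInterfaceDevice.play(_:) with fixed haptic types. This crudely
// approximates a repeating rhythm by playing taps on a timer.
final class BreathingHaptics {
    private var timer: Timer?

    func start(interval: TimeInterval = 0.5) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            WKInterfaceDevice.current().play(.click)  // one of the predefined WatchKit haptics
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```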
---
Hey everyone, I'm new to app development, but I recently created a basic quiz app that asks coding questions with four multiple-choice answers. The problem: whenever a person selects a wrong answer, I want the iPhone to vibrate to give feedback. If anyone knows how to do this, please share the steps. :)
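For simple "wrong answer" feedback you usually don't need Core Haptics at all; UIKit's feedback generators are enough. A minimal sketch — the answerSelected(isCorrect:) function is a hypothetical place to hook it in, e.g. your button's action:

```swift
import UIKit

// Call this from the quiz's answer-button handler.
func answerSelected(isCorrect: Bool) {
    let generator = UINotificationFeedbackGenerator()
    generator.prepare()  // wakes the Taptic Engine to reduce latency
    // .error gives the system "failure" buzz; .success a lighter double tap.
    generator.notificationOccurred(isCorrect ? .success : .error)
}
```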
Hi everyone,
I'm having a problem using Core Haptics on an iPhone 7.
I'm getting error -4809.
Does anyone know what the problem is?
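One thing worth checking: to my knowledge the iPhone 7 does not support Core Haptics (capabilitiesForHardware().supportsHaptics reports false there), so engine calls can fail with opaque CoreAudio-style error codes. Guarding on capabilities before creating the engine at least surfaces the condition cleanly — a sketch:

```swift
import CoreHaptics

// Returns a started engine, or nil when the hardware can't do Core Haptics
// (e.g. iPhone 7) or when engine creation/start fails.
func makeEngineIfSupported() -> CHHapticEngine? {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
        print("Core Haptics not supported on this device")
        return nil
    }
    do {
        let engine = try CHHapticEngine()
        try engine.start()
        return engine
    } catch {
        print("Engine creation/start failed: \(error)")
        return nil
    }
}
```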
Hello,
We are trying to do some advanced things with Core Haptics, where we need to navigate within the haptic pattern (start, pause, resume, rewind, or fast-forward). But the seek(toOffset:) method of CHHapticAdvancedPatternPlayer doesn't seem to work on iOS 14 or iOS 15.
Here's some sample code:
import CoreHaptics
import Foundation
import SwiftUI

let events = [
    CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 1),
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8)
        ],
        relativeTime: 0,
        duration: 3
    )
]

let parameters = [
    CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: 1, relativeTime: 0),
    CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: 0.5, relativeTime: 0.5),
    CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: 1, relativeTime: 1),
    CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: 0.5, relativeTime: 1.5),
    CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: 1, relativeTime: 2),
    CHHapticDynamicParameter(parameterID: .hapticIntensityControl, value: 0.5, relativeTime: 2.5)
]

let pattern = try! CHHapticPattern(events: events, parameters: parameters)

struct ContentView: View {
    private var player: CHHapticAdvancedPatternPlayer?

    var body: some View {
        VStack(spacing: 20) {
            Button("Start") {
                try? player?.start(atTime: CHHapticTimeImmediate)
            }
            Button("Pause") {
                try? player?.pause(atTime: CHHapticTimeImmediate)
            }
            Button("Resume") {
                try? player?.resume(atTime: CHHapticTimeImmediate)
            }
            Button("Stop") {
                try? player?.stop(atTime: CHHapticTimeImmediate)
            }
            Button("Seek to offset 1 sec") {
                do {
                    try player?.seek(toOffset: 1)
                    print("No error")
                } catch {
                    print(error)
                }
            }
        }
    }

    init() {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics,
              let engine = try? CHHapticEngine()
        else { return }
        engine.resetHandler = { try? engine.start() }
        try? engine.start()
        self.player = try? engine.makeAdvancedPlayer(with: pattern)
        print(pattern.duration)
    }
}
Is there anything wrong with the code? Can someone provide a working example?
Thanks!
---
Hi,
I'm trying to add haptic feedback to a tabItem inside a TabView with this code (just part of the ContentView file):
@State private var tabSelection = 1

var body: some View {
    TabView(selection: $tabSelection) {
        view(request: URLRequest(url: URL(string: "https://gnets.myds.me/stundenplan/gymnasium.php")!))
            .previewLayout(.fixed(width: .infinity, height: .infinity))
            .tabItem {
                Image("1")
                Text("Gymnasium")
            }
            .onTapGesture {
                self.tabSelection = 1
            }
        view(request: URLRequest(url: URL(string: "https://gnets.myds.me/stundenplan/lyzeum.php")!))
            .previewLayout(.fixed(width: .infinity, height: .infinity))
            .tabItem {
                Image("2")
                Text("Lyzeum")
            }
            .onTapGesture {
                self.tabSelection = 2
            }
        view(request: URLRequest(url: URL(string: "https://gnets.myds.me/stundenplan/nou/")!))
            .previewLayout(.fixed(width: .infinity, height: .infinity))
            .tabItem {
                Image("3")
                Text("Neuigkeiten")
            }
            .onTapGesture {
                self.tabSelection = 3
            }
    }
    .onTapGesture {
        let impactLight = UIImpactFeedbackGenerator(style: .light)
        impactLight.impactOccurred()
    }
}
After testing on an iPhone, the haptic feedback fires on tap of a tabItem and also on tap of the entire view (which is fine), but the view (tab) doesn't change. (I added the tabSelection variable to force the tabItem to change the view, but it still doesn't work.)
What I need is that tapping a tabItem changes the view (as normal) but also triggers a light haptic feedback.
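A likely cause is that onTapGesture swallows the touch before TabView can switch tabs. Instead of intercepting taps, you can observe the selection binding and fire the haptic when it actually changes — a sketch with placeholder content, assuming iOS 14+ for onChange(of:):

```swift
import SwiftUI
import UIKit

struct ContentView: View {
    @State private var tabSelection = 1

    var body: some View {
        TabView(selection: $tabSelection) {
            Text("Gymnasium")  // placeholder for your web view
                .tabItem { Text("Gymnasium") }
                .tag(1)
            Text("Lyzeum")
                .tabItem { Text("Lyzeum") }
                .tag(2)
        }
        // Fires whenever the selected tab actually changes,
        // without blocking TabView's own tap handling.
        .onChange(of: tabSelection) { _ in
            UIImpactFeedbackGenerator(style: .light).impactOccurred()
        }
    }
}
```

Note the .tag(_:) modifiers: without them the selection binding can't match tab values, which may be why forcing tabSelection had no visible effect in the original code.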
---
Dear developers who play music: would you consider adding a haptic metronome to the Watch, to keep a constant beat for musicians as we play gigs?
---
We are using the following code to check for haptics support on the device:
import CoreHaptics
let hapticCapability = CHHapticEngine.capabilitiesForHardware()
let supportsHaptics: Bool = hapticCapability.supportsHaptics
For some reason it returns false on iPhone 7 — not only on my personal device (iOS 13); some of our users reported the haptic settings missing from our app on iPhone 7 (iOS 15) as well.
I have just checked with this private API call that the device does in fact support haptics (obviously, since it has no physical home button):
extension UIDevice {
    var feedbackSupportLevel: Int? {
        value(forKey: "_feedbackSupportLevel") as? Int
    }
}

let supportsHaptics: Bool = UIDevice.current.feedbackSupportLevel == 2
And it returns true, as expected.
The weird thing is that the haptics work perfectly fine; it's just the reported haptics support status that is incorrect.
So my question is: will we be rejected for reading the value of _feedbackSupportLevel on UIDevice, just to make sure we correctly display the available system features to our users on all devices?
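For what it's worth, supportsHaptics == false on iPhone 7 may be correct rather than a bug: as I understand it, Core Haptics requires iPhone 8 or later, while the older UIFeedbackGenerator API still drives the iPhone 7's Taptic Engine (which is why the haptics themselves work). A sketch of a fallback that avoids the private _feedbackSupportLevel key entirely:

```swift
import CoreHaptics
import UIKit

func playTap() {
    if CHHapticEngine.capabilitiesForHardware().supportsHaptics {
        // Device supports Core Haptics: play a custom pattern here.
        // (Engine setup elided.)
    } else {
        // iPhone 7-class devices: fall back to the public
        // feedback-generator API, which still uses the Taptic Engine.
        UIImpactFeedbackGenerator(style: .medium).impactOccurred()
    }
}
```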
---
I'm working on an app that uses CoreHaptics to play a synchronised pattern of vibrations and audio.
The problem is that the audio only gets played through the iPhone's speakers (if the mute switch is not turned on). As soon as I connect my AirPods to the phone, the audio stops playing, but the haptics continue.
My code looks something like this:
let engine = CHHapticEngine()
...
var events = [CHHapticEvent]()
...
let volume: Float = 1
let decay: Float = 0.5
let sustained: Float = 0.5
let audioParameters = [
    CHHapticEventParameter(parameterID: .audioVolume, value: volume),
    CHHapticEventParameter(parameterID: .decayTime, value: decay),
    CHHapticEventParameter(parameterID: .sustained, value: sustained)
]
let breathingTimes = pacer.breathingTimeInSeconds
let combinedTimes = breathingTimes.inhale + breathingTimes.exhale
let audioEvent = CHHapticEvent(
    audioResourceID: selectedAudio,
    parameters: audioParameters,
    relativeTime: 0,
    duration: combinedTimes
)
events.append(audioEvent)
...
let pattern = try CHHapticPattern(events: events, parameterCurves: [])
let player = try engine.makeAdvancedPlayer(with: pattern)
...
try player.start(atTime: CHHapticTimeImmediate)
My idea of activating an audio session before the player starts, to signal to the system that audio is playing, also didn't change the outcome:
try AVAudioSession.sharedInstance().setActive(true)
Is there a way to route the audio from Core Haptics to an output other than the built-in speakers?
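One thing that may be worth trying before just activating the session is configuring its category and creating the engine with an explicit session, so that Core Haptics and AVAudioSession agree on routing; the CHHapticEngine(audioSession:) initializer exists for this purpose. A sketch — untested with AirPods, so treat it as an assumption rather than a confirmed fix:

```swift
import AVFoundation
import CoreHaptics

func makeEngine() throws -> CHHapticEngine {
    let session = AVAudioSession.sharedInstance()
    // Configure the session for playback before the engine is created,
    // so the engine's audio follows the session's output route.
    try session.setCategory(.playback)
    try session.setActive(true)
    // Tie the engine to that session explicitly.
    let engine = try CHHapticEngine(audioSession: session)
    try engine.start()
    return engine
}
```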
---
Loading is a hard part of accessibility: you should indicate when loading has started, is in progress, and has ended.
Safari signals loading with vibration: once per second while loading, and twice at the end.
I implemented something similar, by analogy with this framework:
https://github.com/akaDuality/AccessibleLoading
This worked fine on iOS 13 and iOS 14, but starting with iOS 15, Core Haptics doesn't start until VoiceOver finishes speaking.
I did not find any information about this in the release notes.
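One possible workaround on iOS 15 is to defer starting the haptics until VoiceOver has finished speaking, by observing UIAccessibility.announcementDidFinishNotification — a sketch, assuming the loading haptic is wrapped in a startLoadingHaptic closure of your own:

```swift
import UIKit

final class LoadingHapticCoordinator {
    private var observer: NSObjectProtocol?

    // Runs the haptic immediately when VoiceOver is off; otherwise waits
    // for VoiceOver to finish its current announcement first.
    func startAfterVoiceOverFinishes(_ startLoadingHaptic: @escaping () -> Void) {
        guard UIAccessibility.isVoiceOverRunning else {
            startLoadingHaptic()
            return
        }
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.announcementDidFinishNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            startLoadingHaptic()
            if let observer = self?.observer {
                NotificationCenter.default.removeObserver(observer)
            }
        }
    }
}
```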
---
Is it possible to render the product of two sinusoidal vibrations, sin(2π · 8 · t) · sin(2π · 200 · t), in the .AHAP format, or by any other method?
So far we've produced a 200 Hz vibration in .AHAP by setting HapticSharpnessControl to about 0.9, and modulated its envelope by (1) controlling HapticIntensity/HapticIntensityControl or (2) a ParameterCurve, but both approaches generate |sin(2π · 8 · t)| · sin(2π · 200 · t), which is different from our intention.
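For reference, the product-to-sum identity shows why the target signal is not really an amplitude-modulated 200 Hz carrier:

```latex
% Product-to-sum identity applied to the target signal:
\sin(2\pi \cdot 8t)\,\sin(2\pi \cdot 200t)
  = \tfrac{1}{2}\bigl[\cos(2\pi \cdot 192t) - \cos(2\pi \cdot 208t)\bigr]
```

So the intended signal is a superposition of 192 Hz and 208 Hz components. Since the intensity parameters are constrained to non-negative values, any envelope-based approach necessarily yields the |sin| modulation observed above; rendering the signed product would require synthesizing the two-tone superposition directly, which, to my knowledge, .AHAP does not expose.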
---
import UIKit
import CoreHaptics
import AVFoundation

enum ShiftStatus {
    case capitalized
    case normal
}

@available(iOSApplicationExtension 13.0, *)
class KeyboardViewController: UIInputViewController {
    var engine: CHHapticEngine?

    lazy var supportsHaptics: Bool = {
        let appDelegate = UIApplication.shared.delegate as! AppDelegate
        return appDelegate.supportsHaptics
    }()
I'm currently stuck here: the warning "'shared' is unavailable in application extensions for iOS: Use view controller based solutions where appropriate instead" pops up. I have to do this in an extension. Does anyone know a solution?
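In an app extension you can avoid UIApplication.shared entirely: CHHapticEngine.capabilitiesForHardware() can be queried directly, so there is no need to go through the AppDelegate. A sketch:

```swift
import CoreHaptics
import UIKit

class KeyboardViewController: UIInputViewController {
    var engine: CHHapticEngine?

    // Query the hardware directly instead of reaching for
    // UIApplication.shared, which is unavailable in extensions.
    lazy var supportsHaptics: Bool = {
        CHHapticEngine.capabilitiesForHardware().supportsHaptics
    }()
}
```

Note that keyboard extensions may still be restricted from actually playing haptics at runtime regardless of hardware support, so this only removes the compile error.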
---
Hello. I tried to port Core Haptics to a custom keyboard extension, but failed.
So I adopted the basic haptic feedback APIs as a suboptimal solution, but I haven't solved the audio yet.
Does anyone know which sound to use for the key-click effect that the standard iOS keyboard plays? Thank you for helping me.
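For the standard keyboard click, the supported route is UIDevice.current.playInputClick(), which only works if your input view declares that it wants input clicks. A sketch:

```swift
import UIKit

// The input view must opt in to input clicks via this protocol.
final class ClickingInputView: UIInputView, UIInputViewAudioFeedback {
    var enableInputClicksWhenVisible: Bool { true }
}

// Then, in the key's action handler:
func keyPressed() {
    UIDevice.current.playInputClick()  // plays the system keyboard click
}
```

I believe AudioServicesPlaySystemSound(1104) from AudioToolbox plays a similar click, but playInputClick() is the documented mechanism for keyboards.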