axiom-haptics by charleswiltgen/axiom
npx skills add https://github.com/charleswiltgen/axiom --skill axiom-haptics
cursor76
Comprehensive guide to implementing haptic feedback on iOS. Every Apple Design Award winner uses excellent haptic feedback - Camera, Maps, Weather all use haptics masterfully to create delightful, responsive experiences.
Haptic feedback provides tactile confirmation of user actions and system events. When designed thoughtfully using the Causality-Harmony-Utility framework, axiom-haptics transforms interfaces from functional to delightful.
This skill covers both simple haptics (UIFeedbackGenerator) and advanced custom patterns (Core Haptics), with real-world examples and audio-haptic synchronization techniques.
Apple's audio and haptic design teams established three core principles for multimodal feedback:
Problem : User can't tell what triggered the haptic
Solution : Haptic timing must match the visual/interaction moment
Example from WWDC :
Code pattern :
// ✅ Immediate feedback on touch
@objc func buttonTapped() {
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.impactOccurred() // Fire immediately
performAction()
}
// ❌ Delayed feedback loses causality
@objc func buttonTapped() {
performAction()
DispatchQueue.main.asyncAfter(deadline: .now() + 0.2) {
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.impactOccurred() // Too late!
}
}
Problem : Visual, audio, and haptic don't match
Solution : All three senses should feel like a unified experience
Example from WWDC :
Key insight : A large object should feel heavy, sound low and resonant, and look substantial. All three senses reinforce the same experience.
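One way to apply this in code (a sketch only; the weight-to-style mapping below is an illustrative assumption, not an Apple-documented rule) is to pick the impact style from the perceived mass of the on-screen object:

```swift
import UIKit

// Illustrative mapping: pair the perceived mass of a dragged object
// with an impact style so the haptic reinforces what the eye sees.
enum ObjectWeight {
    case light, medium, heavy
}

func impactStyle(for weight: ObjectWeight) -> UIImpactFeedbackGenerator.FeedbackStyle {
    switch weight {
    case .light:  return .light   // small chip sliding into place
    case .medium: return .medium  // standard card snapping
    case .heavy:  return .heavy   // large panel docking
    }
}

func playDropFeedback(for weight: ObjectWeight) {
    let generator = UIImpactFeedbackGenerator(style: impactStyle(for: weight))
    generator.prepare()
    generator.impactOccurred()
}
```

The same mapping idea extends to audio: a .heavy impact pairs naturally with a low, resonant sound, a .light one with a short, crisp click.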
Problem : Haptics used everywhere "just because we can"
Solution : Reserve haptics for significant moments that benefit the user
When to use haptics :
When NOT to use haptics :
For most apps, UIFeedbackGenerator provides 3 simple haptic types without custom patterns.
Physical collision or impact sensation.
Styles (ordered light → heavy):
.light - Small, delicate tap
.medium - Standard tap (most common)
.heavy - Strong, solid impact
.rigid - Firm, precise tap
.soft - Gentle, cushioned tap

Usage pattern :
class MyViewController: UIViewController {
let impactGenerator = UIImpactFeedbackGenerator(style: .medium)
override func viewDidLoad() {
super.viewDidLoad()
// Prepare reduces latency for next impact
impactGenerator.prepare()
}
@objc func userDidTap() {
impactGenerator.impactOccurred()
}
}
Intensity variation (iOS 13+):
// intensity: 0.0 (lightest) to 1.0 (strongest)
impactGenerator.impactOccurred(intensity: 0.5)
Common use cases : most interactions use .medium; use .light for delicate feedback, .heavy for strong impacts, and .rigid for firm, precise taps.

Discrete selection changes (picker wheels, segmented controls).
Usage :
class PickerViewController: UIViewController {
let selectionGenerator = UISelectionFeedbackGenerator()
func pickerView(_ picker: UIPickerView, didSelectRow row: Int,
inComponent component: Int) {
selectionGenerator.selectionChanged()
}
}
Feels like : Clicking a physical wheel with detents
Common use cases :
System-level success/warning/error feedback.
Types :
.success - Task completed successfully
.warning - Attention needed, but not critical
.error - Critical error occurred

Usage :
let notificationGenerator = UINotificationFeedbackGenerator()
func submitForm() {
// Validate form
if isValid {
notificationGenerator.notificationOccurred(.success)
saveData()
} else {
notificationGenerator.notificationOccurred(.error)
showValidationErrors()
}
}
Best practice : Match haptic type (.success, .error, .warning) to the user outcome.

Call prepare() before the haptic to reduce latency:
// ✅ Good - prepare before user action
@IBAction func buttonTouchDown(_ sender: UIButton) {
impactGenerator.prepare() // User's finger is down
}
@IBAction func buttonTouchUpInside(_ sender: UIButton) {
impactGenerator.impactOccurred() // Immediate haptic
}
// ❌ Bad - unprepared haptic may lag
@IBAction func buttonTapped(_ sender: UIButton) {
let generator = UIImpactFeedbackGenerator()
generator.impactOccurred() // May have 10-20ms delay
}
Prepare timing : System keeps engine ready for ~1 second after prepare().
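Because the readiness window is short, one common pattern (sketched below; WarmImpactGenerator is a hypothetical helper, not a system class) is to re-prepare immediately after each fire so rapid repeated taps stay low-latency:

```swift
import UIKit

final class WarmImpactGenerator {
    private let generator: UIImpactFeedbackGenerator

    init(style: UIImpactFeedbackGenerator.FeedbackStyle = .medium) {
        generator = UIImpactFeedbackGenerator(style: style)
        generator.prepare()
    }

    // Fire, then immediately re-prepare: the ~1 second readiness
    // window restarts, keeping latency low for the next tap.
    func fire() {
        generator.impactOccurred()
        generator.prepare()
    }
}
```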
For apps needing custom patterns, Core Haptics provides full control over haptic waveforms.
Engine (CHHapticEngine) - Link to the phone's actuator
Player (CHHapticPatternPlayer) - Playback control
Pattern (CHHapticPattern) - Collection of events over time
Event (CHHapticEvent) - Building blocks specifying the experience

import CoreHaptics
class HapticManager {
var engine: CHHapticEngine?
func initializeHaptics() {
// Check device support
guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
print("Device doesn't support haptics")
return
}
do {
// Create engine
engine = try CHHapticEngine()
// Handle interruptions (calls, Siri, etc.)
engine?.stoppedHandler = { reason in
print("Engine stopped: \(reason)")
self.restartEngine()
}
// Handle reset (audio session changes)
engine?.resetHandler = {
print("Engine reset")
self.restartEngine()
}
// Start engine
try engine?.start()
} catch {
print("Failed to create haptic engine: \(error)")
}
}
func restartEngine() {
do {
try engine?.start()
} catch {
print("Failed to restart engine: \(error)")
}
}
}
Critical : Always set stoppedHandler and resetHandler to handle system interruptions.
Short, discrete feedback (like a tap).
let intensity = CHHapticEventParameter(
parameterID: .hapticIntensity,
value: 1.0 // 0.0 to 1.0
)
let sharpness = CHHapticEventParameter(
parameterID: .hapticSharpness,
value: 0.5 // 0.0 (dull) to 1.0 (sharp)
)
let event = CHHapticEvent(
eventType: .hapticTransient,
parameters: [intensity, sharpness],
relativeTime: 0.0 // Seconds from pattern start
)
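To build intuition for how these parameters change the feel, a small sketch (assuming an already-created and started CHHapticEngine, as in the HapticManager above) can sweep sharpness across several transient taps:

```swift
import CoreHaptics

// Plays five transient taps at fixed intensity, sweeping sharpness
// from 0.0 (dull thud) to 1.0 (crisp snap), 150 ms apart.
func playSharpnessSweep(on engine: CHHapticEngine) throws {
    let events: [CHHapticEvent] = (0..<5).map { i in
        let sharpness = Float(i) / 4.0  // 0.0, 0.25, 0.5, 0.75, 1.0
        return CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: sharpness)
            ],
            relativeTime: TimeInterval(i) * 0.15
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```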
Parameters :
hapticIntensity: Strength (0.0 = barely felt, 1.0 = maximum)
hapticSharpness: Character (0.0 = dull thud, 1.0 = crisp snap)

Sustained feedback over time (like a vibration motor).
let intensity = CHHapticEventParameter(
parameterID: .hapticIntensity,
value: 0.8
)
let sharpness = CHHapticEventParameter(
parameterID: .hapticSharpness,
value: 0.3
)
let event = CHHapticEvent(
eventType: .hapticContinuous,
parameters: [intensity, sharpness],
relativeTime: 0.0,
duration: 2.0 // Seconds
)
Use cases :
func playCustomPattern() {
// Create events
let tap1 = CHHapticEvent(
eventType: .hapticTransient,
parameters: [
CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
],
relativeTime: 0.0
)
let tap2 = CHHapticEvent(
eventType: .hapticTransient,
parameters: [
CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.7),
CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
],
relativeTime: 0.3
)
let tap3 = CHHapticEvent(
eventType: .hapticTransient,
parameters: [
CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
],
relativeTime: 0.6
)
do {
// Create pattern from events
let pattern = try CHHapticPattern(
events: [tap1, tap2, tap3],
parameters: []
)
// Create player
let player = try engine?.makePlayer(with: pattern)
// Play
try player?.start(atTime: CHHapticTimeImmediate)
} catch {
print("Failed to play pattern: \(error)")
}
}
For continuous feedback (rolling textures, motors), use advanced player:
func startRollingTexture() {
let event = CHHapticEvent(
eventType: .hapticContinuous,
parameters: [
CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
],
relativeTime: 0.0,
duration: 0.5
)
do {
let pattern = try CHHapticPattern(events: [event], parameters: [])
// Use advanced player for looping
let player = try engine?.makeAdvancedPlayer(with: pattern)
// Enable looping
player?.loopEnabled = true
// Start
try player?.start(atTime: CHHapticTimeImmediate)
// Update intensity dynamically based on ball speed
updateTextureIntensity(player: player)
} catch {
print("Failed to start texture: \(error)")
}
}
func updateTextureIntensity(player: CHHapticAdvancedPatternPlayer?) {
let newIntensity = calculateIntensityFromBallSpeed()
let intensityParam = CHHapticDynamicParameter(
parameterID: .hapticIntensityControl,
value: newIntensity,
relativeTime: 0
)
try? player?.sendParameters([intensityParam], atTime: CHHapticTimeImmediate)
}
Key difference : CHHapticPatternPlayer plays once, CHHapticAdvancedPatternPlayer supports looping and dynamic parameter updates.
AHAP (Apple Haptic Audio Pattern) files are JSON files combining haptic events and audio.
{
"Version": 1.0,
"Metadata": {
"Project": "My App",
"Created": "2024-01-15"
},
"Pattern": [
{
"Event": {
"Time": 0.0,
"EventType": "HapticTransient",
"EventParameters": [
{
"ParameterID": "HapticIntensity",
"ParameterValue": 1.0
},
{
"ParameterID": "HapticSharpness",
"ParameterValue": 0.5
}
]
}
}
]
}
{
"Version": 1.0,
"Pattern": [
{
"Event": {
"Time": 0.0,
"EventType": "AudioCustom",
"EventParameters": [
{
"ParameterID": "AudioVolume",
"ParameterValue": 0.8
}
],
"EventWaveformPath": "ShieldA.wav"
}
},
{
"Event": {
"Time": 0.0,
"EventType": "HapticContinuous",
"EventDuration": 0.5,
"EventParameters": [
{
"ParameterID": "HapticIntensity",
"ParameterValue": 0.6
}
]
}
}
]
}
func loadAHAPPattern(named name: String) -> CHHapticPattern? {
guard let url = Bundle.main.url(forResource: name, withExtension: "ahap") else {
print("AHAP file not found")
return nil
}
do {
return try CHHapticPattern(contentsOf: url)
} catch {
print("Failed to load AHAP: \(error)")
return nil
}
}
// Usage
if let pattern = loadAHAPPattern(named: "ShieldTransient") {
let player = try? engine?.makePlayer(with: pattern)
try? player?.start(atTime: CHHapticTimeImmediate)
}
Example iteration : Shield initially used 3 transient pulses (haptic) + progressive continuous sound (audio) → no harmony. Solution: Switch to continuous haptic + ShieldA.wav audio → unified experience.
class ViewController: UIViewController {
let animationDuration: TimeInterval = 0.5
func performShieldTransformation() {
// Start haptic/audio simultaneously with animation
playShieldPattern()
UIView.animate(withDuration: animationDuration) {
self.shieldView.transform = CGAffineTransform(scaleX: 1.2, y: 1.2)
self.shieldView.alpha = 0.8
}
}
func playShieldPattern() {
if let pattern = loadAHAPPattern(named: "ShieldContinuous") {
let player = try? engine?.makePlayer(with: pattern)
try? player?.start(atTime: CHHapticTimeImmediate)
}
}
}
Critical : Fire haptic at the exact moment the visual change occurs, not before or after.
import AVFoundation
class AudioHapticCoordinator {
let audioPlayer: AVAudioPlayer
let hapticEngine: CHHapticEngine
func playCoordinatedExperience() {
// Prepare both systems
hapticEngine.notifyWhenPlayersFinished { _ in
return .stopEngine
}
// Schedule both against a shared small offset so they begin together
let startDelay: TimeInterval = 0.05 // Small delay for sync
// Start audio (AVAudioPlayer schedules against its own deviceCurrentTime)
audioPlayer.play(atTime: audioPlayer.deviceCurrentTime + startDelay)
// Start haptic (CHHapticPatternPlayer schedules against the engine's timebase)
if let pattern = loadAHAPPattern(named: "CoordinatedPattern") {
let player = try? hapticEngine.makePlayer(with: pattern)
try? player?.start(atTime: hapticEngine.currentTime + startDelay)
}
}
}
class HapticButton: UIButton {
let impactGenerator = UIImpactFeedbackGenerator(style: .medium)
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
super.touchesBegan(touches, with: event)
impactGenerator.prepare()
}
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
super.touchesEnded(touches, with: event)
impactGenerator.impactOccurred()
}
}
class HapticSlider: UISlider {
let selectionGenerator = UISelectionFeedbackGenerator()
var lastValue: Float = 0
@objc func valueChanged() {
let threshold: Float = 0.1
if abs(value - lastValue) >= threshold {
selectionGenerator.selectionChanged()
lastValue = value
}
}
}
class PullToRefreshController: UIViewController {
let impactGenerator = UIImpactFeedbackGenerator(style: .medium)
var isRefreshing = false
func scrollViewDidScroll(_ scrollView: UIScrollView) {
let threshold: CGFloat = -100
let offset = scrollView.contentOffset.y
if offset <= threshold && !isRefreshing {
impactGenerator.impactOccurred()
isRefreshing = true
beginRefresh()
}
}
}
func handleServerResponse(_ result: Result<Data, Error>) {
let notificationGenerator = UINotificationFeedbackGenerator()
switch result {
case .success:
notificationGenerator.notificationOccurred(.success)
showSuccessMessage()
case .failure:
notificationGenerator.notificationOccurred(.error)
showErrorAlert()
}
}
Haptics DO NOT work in Simulator. The code runs without errors, but no haptic output is produced.
Solution : Always test on a physical device (iPhone 8 or newer).
func playHaptic(with pattern: CHHapticPattern) {
#if DEBUG
print("🔔 Playing haptic - Engine running: \(engine?.currentTime ?? -1)")
#endif
do {
let player = try engine?.makePlayer(with: pattern)
try player?.start(atTime: CHHapticTimeImmediate)
#if DEBUG
print("✅ Haptic started successfully")
#endif
} catch {
#if DEBUG
print("❌ Haptic failed: \(error.localizedDescription)")
#endif
}
}
Symptom : CHHapticEngine.start() throws error
Causes :
Solution :
func safelyStartEngine() {
guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
print("Device doesn't support haptics")
return
}
do {
try engine?.start()
} catch {
print("Engine start failed: \(error)")
// Fall back to UIFeedbackGenerator
useFallbackHaptics()
}
}
Symptom : Code runs but no haptic felt on device
Debug steps :
Symptom : Audio plays but haptic delayed or vice versa
Causes :
Missing prepare() before the haptic

Solution :
// ✅ Synchronized start
func playCoordinated() {
impactGenerator.prepare() // Reduce latency
// Start both simultaneously
audioPlayer.play()
impactGenerator.impactOccurred()
}
Symptom : AHAP pattern fails to load or play
Cause : Audio file > 4.2 MB or > 23 seconds
Solution : Keep audio files small and short. Use compressed formats (AAC) and trim to essential duration.
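A lightweight pre-flight check can catch oversized waveforms before an AHAP references them (a sketch: the 4.2 MB ceiling comes from the limit above, and the helper name is hypothetical; checking the 23-second duration limit would need AVFoundation and is omitted here):

```swift
import Foundation

// Returns true if the audio file referenced by an AHAP pattern is
// within the size limit noted above (~4.2 MB).
func audioFileIsWithinSizeLimit(at url: URL, maxBytes: Int = 4_200_000) -> Bool {
    let attrs = try? FileManager.default.attributesOfItem(atPath: url.path)
    guard let size = attrs?[.size] as? Int else { return false }
    return size <= maxBytes
}
```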
WWDC : 2021-10278, 2019-520, 2019-223
Docs : /corehaptics, /corehaptics/chhapticengine
Skills : axiom-swiftui-animation-ref, axiom-ui-testing, axiom-accessibility-diag
Weekly Installs : 103
Repository
GitHub Stars : 601
First Seen : Jan 21, 2026
Security Audits : Gen Agent Trust Hub (Pass), Socket (Pass), Snyk (Pass)
Installed on
opencode : 88
codex : 82
gemini-cli : 81
claude-code : 80
github-copilot : 77
cursor : 76