axiom-camera-capture by charleswiltgen/axiom
npx skills add https://github.com/charleswiltgen/axiom --skill axiom-camera-capture
Guides you through implementing camera capture: session setup, photo capture, video recording, responsive capture UX, rotation handling, and session lifecycle management.
Use when you need to:
"How do I set up a camera preview in SwiftUI?" "My camera freezes when I get a phone call" "The photo preview is rotated wrong on front camera" "How do I make photo capture feel instant?" "Should I use deferred processing?" "My camera takes too long to capture" "How do I switch between front and back cameras?" "How do I record video with audio?"
Signs you're making this harder than it needs to be:
- startRunning() on main thread (blocks UI for seconds)
- videoOrientation instead of RotationCoordinator (iOS 17+)
- .photo preset for video (wrong format)
- photoQualityPrioritization set to .quality (slow captures)
- Not handling the .notAuthorized permission state
- Modifying the session without beginConfiguration()/commitConfiguration()
Before implementing any camera feature:
What do you need?
┌─ Just let user pick a photo?
│ └─ Don't use AVFoundation - use PHPicker or PhotosPicker
│ See: /skill axiom-photo-library
│
├─ Simple photo/video capture with system UI?
│ └─ UIImagePickerController (but limited customization)
│
├─ Custom camera UI with photo capture?
│ └─ AVCaptureSession + AVCapturePhotoOutput
│ → Continue with this skill
│
├─ Custom camera UI with video recording?
│ └─ AVCaptureSession + AVCaptureMovieFileOutput
│ → Continue with this skill
│
└─ Both photo and video in same session?
└─ AVCaptureSession + both outputs
→ Continue with this skill
import AVFoundation
func requestCameraAccess() async -> Bool {
let status = AVCaptureDevice.authorizationStatus(for: .video)
switch status {
case .authorized:
return true
case .notDetermined:
return await AVCaptureDevice.requestAccess(for: .video)
case .denied, .restricted:
// Show settings prompt
return false
@unknown default:
return false
}
}
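When access is denied or restricted, the only recovery path is the Settings app; a minimal sketch of the "settings prompt" mentioned in the comment above (alert wiring left to your UI layer):
import UIKit
// Hypothetical helper: deep-link to this app's page in Settings so the user
// can re-enable camera access after denying it.
@MainActor
func openAppSettings() {
    guard let url = URL(string: UIApplication.openSettingsURLString),
          UIApplication.shared.canOpenURL(url) else { return }
    UIApplication.shared.open(url)
}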
Required Info.plist key :
<key>NSCameraUsageDescription</key>
<string>Take photos and videos</string>
For audio (video recording):
<key>NSMicrophoneUsageDescription</key>
<string>Record audio with video</string>
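Recording audio also requires runtime authorization, not just the Info.plist key; a minimal sketch mirroring requestCameraAccess() above:
func requestMicrophoneAccess() async -> Bool {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        return true
    case .notDetermined:
        return await AVCaptureDevice.requestAccess(for: .audio)
    case .denied, .restricted:
        return false
    @unknown default:
        return false
    }
}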
AVCaptureSession
├─ Inputs
│ ├─ AVCaptureDeviceInput (camera)
│ └─ AVCaptureDeviceInput (microphone, for video)
│
├─ Outputs
│ ├─ AVCapturePhotoOutput (photos)
│ ├─ AVCaptureMovieFileOutput (video files)
│ └─ AVCaptureVideoDataOutput (raw frames)
│
└─ Connections (automatic between compatible input/output)
Key rule : All session configuration happens on a dedicated serial queue, never on the main thread.
Use case : Set up camera preview with photo capture capability.
import AVFoundation
class CameraManager: NSObject, ObservableObject { // ObservableObject lets SwiftUI own this with @StateObject (see CameraView below)
let session = AVCaptureSession()
let photoOutput = AVCapturePhotoOutput()
// CRITICAL: Dedicated serial queue for session work
private let sessionQueue = DispatchQueue(label: "camera.session")
func setupSession() {
sessionQueue.async { [self] in
session.beginConfiguration()
defer { session.commitConfiguration() }
// 1. Set session preset
session.sessionPreset = .photo
// 2. Add camera input
guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
for: .video,
position: .back),
let input = try? AVCaptureDeviceInput(device: camera),
session.canAddInput(input) else {
return
}
session.addInput(input)
// 3. Add photo output
guard session.canAddOutput(photoOutput) else { return }
session.addOutput(photoOutput)
// 4. Configure photo output
photoOutput.isHighResolutionCaptureEnabled = true // Note: deprecated in iOS 16; prefer maxPhotoDimensions on newer targets
photoOutput.maxPhotoQualityPrioritization = .quality
}
}
func startSession() {
sessionQueue.async { [self] in
if !session.isRunning {
session.startRunning() // Blocking call - never on main thread!
}
}
}
func stopSession() {
sessionQueue.async { [self] in
if session.isRunning {
session.stopRunning()
}
}
}
}
Cost : 30 min implementation
Use case : Display camera preview in SwiftUI view.
import SwiftUI
import AVFoundation
struct CameraPreview: UIViewRepresentable {
let session: AVCaptureSession
func makeUIView(context: Context) -> PreviewView {
let view = PreviewView()
view.previewLayer.session = session
view.previewLayer.videoGravity = .resizeAspectFill
return view
}
func updateUIView(_ uiView: PreviewView, context: Context) {}
class PreviewView: UIView {
override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }
var previewLayer: AVCaptureVideoPreviewLayer { layer as! AVCaptureVideoPreviewLayer }
}
}
// Usage in SwiftUI
struct CameraView: View {
@StateObject private var camera = CameraManager()
var body: some View {
CameraPreview(session: camera.session)
.ignoresSafeArea()
.onAppear { camera.startSession() }
.onDisappear { camera.stopSession() }
}
}
Cost : 20 min implementation
Use case : Keep preview and captured photos correctly oriented regardless of device rotation.
Why RotationCoordinator : Deprecated videoOrientation requires manual observation of device orientation. RotationCoordinator automatically tracks gravity and provides angles.
import AVFoundation
class CameraManager {
private var rotationCoordinator: AVCaptureDevice.RotationCoordinator?
private var rotationObservation: NSKeyValueObservation?
func setupRotationCoordinator(device: AVCaptureDevice, previewLayer: AVCaptureVideoPreviewLayer) {
// Create coordinator with device and preview layer
rotationCoordinator = AVCaptureDevice.RotationCoordinator(
device: device,
previewLayer: previewLayer
)
// Observe preview rotation changes
rotationObservation = rotationCoordinator?.observe(
\.videoRotationAngleForHorizonLevelPreview,
options: [.new]
) { [weak previewLayer] coordinator, _ in
// Update preview layer rotation on main thread
DispatchQueue.main.async {
previewLayer?.connection?.videoRotationAngle = coordinator.videoRotationAngleForHorizonLevelPreview
}
}
// Set initial rotation
previewLayer.connection?.videoRotationAngle = rotationCoordinator!.videoRotationAngleForHorizonLevelPreview
}
func captureRotationAngle() -> CGFloat {
// Use this angle when capturing photos
rotationCoordinator?.videoRotationAngleForHorizonLevelCapture ?? 0
}
}
When capturing :
func capturePhoto() {
let settings = AVCapturePhotoSettings()
// Apply rotation angle from coordinator
if let connection = photoOutput.connection(with: .video) {
connection.videoRotationAngle = captureRotationAngle()
}
photoOutput.capturePhoto(with: settings, delegate: self)
}
Cost : 45 min implementation, prevents 2+ hours debugging rotation issues
Use case : Make photo capture feel instant with zero-shutter-lag, overlapping captures, and responsive button states.
iOS 17+ introduces four complementary APIs that work together for maximum responsiveness:
Zero-shutter-lag (ZSL) uses a ring buffer of recent frames to "time travel" back to the exact moment you tapped the shutter. It is enabled automatically for apps that link against iOS 17+.
// Check if supported for current format
if photoOutput.isZeroShutterLagSupported {
// Enabled by default for apps linking iOS 17+
// Opt out if causing issues:
// photoOutput.isZeroShutterLagEnabled = false
}
Why it matters : Without ZSL, there's a delay between tap and frame capture. For action shots, the moment is already over.
Requirements : iPhone XS and newer. Does NOT apply to flash captures, manual exposure, bracketed captures, or constituent photo delivery.
Responsive capture allows a new capture to start while the previous one is still processing:
// Check support first
if photoOutput.isZeroShutterLagSupported {
photoOutput.isZeroShutterLagEnabled = true // Required for responsive capture
if photoOutput.isResponsiveCaptureSupported {
photoOutput.isResponsiveCaptureEnabled = true
}
}
Tradeoff : Increases peak memory usage. If your app is memory-constrained, consider leaving disabled.
Requirements : A12 Bionic (iPhone XS) and newer.
Fast-capture prioritization automatically adapts quality when taking multiple photos rapidly (like burst mode):
if photoOutput.isFastCapturePrioritizationSupported {
photoOutput.isFastCapturePrioritizationEnabled = true
// When enabled, rapid captures use "balanced" quality instead of "quality"
// to maintain consistent shot-to-shot time
}
When to enable : User-facing toggle ("Prioritize Faster Shooting" in Camera.app). Off by default because it reduces quality.
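If you expose that toggle, a minimal sketch of applying the user's choice (method name is hypothetical; assumes it lives inside CameraManager so it can reach sessionQueue and photoOutput):
func setFastCapturePrioritization(enabled: Bool) {
    sessionQueue.async { [self] in
        // Only takes effect when the current format supports it.
        guard photoOutput.isFastCapturePrioritizationSupported else { return }
        photoOutput.isFastCapturePrioritizationEnabled = enabled
    }
}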
Critical for UX : the readiness coordinator provides synchronous updates for shutter-button state without async lag.
class CameraManager {
private var readinessCoordinator: AVCapturePhotoOutputReadinessCoordinator!
func setupReadinessCoordinator() {
readinessCoordinator = AVCapturePhotoOutputReadinessCoordinator(photoOutput: photoOutput)
readinessCoordinator.delegate = self
}
func capturePhoto() {
var settings = AVCapturePhotoSettings()
settings.photoQualityPrioritization = .balanced
// Tell coordinator to track this capture BEFORE calling capturePhoto
readinessCoordinator.startTrackingCaptureRequest(using: settings)
photoOutput.capturePhoto(with: settings, delegate: self)
}
}
extension CameraManager: AVCapturePhotoOutputReadinessCoordinatorDelegate {
func readinessCoordinator(_ coordinator: AVCapturePhotoOutputReadinessCoordinator,
captureReadinessDidChange captureReadiness: AVCapturePhotoOutput.CaptureReadiness) {
DispatchQueue.main.async {
switch captureReadiness {
case .ready:
self.shutterButton.isEnabled = true
self.shutterButton.alpha = 1.0
case .notReadyMomentarily:
// Brief delay - disable to prevent double-tap
self.shutterButton.isEnabled = false
case .notReadyWaitingForCapture:
// Flash is firing - dim button
self.shutterButton.alpha = 0.5
case .notReadyWaitingForProcessing:
// Processing previous photo - show spinner
self.showProcessingIndicator()
case .sessionNotRunning:
self.shutterButton.isEnabled = false
@unknown default:
break
}
}
}
}
Why use Readiness Coordinator : Without it, you'd need to track capture state manually and users might spam the shutter button during processing.
photoQualityPrioritization is still useful even without the new APIs:
func capturePhoto() {
var settings = AVCapturePhotoSettings()
// Speed vs Quality tradeoff
// .speed - Fastest capture, lower quality
// .balanced - Good default
// .quality - Best quality, may have delay
settings.photoQualityPrioritization = .speed
// For specific use cases:
// - Social sharing: .speed (users expect instant)
// - Document scanning: .quality (accuracy matters)
// - General photography: .balanced
photoOutput.capturePhoto(with: settings, delegate: self)
}
Deferred Processing (iOS 17+) :
For maximum responsiveness, the capture returns immediately with a proxy image, and full Deep Fusion processing happens in the background:
// Check support and enable deferred processing
if photoOutput.isAutoDeferredPhotoDeliverySupported {
photoOutput.isAutoDeferredPhotoDeliveryEnabled = true
}
Delegate callbacks with deferred processing :
// Called for BOTH regular photos AND deferred proxies
func photoOutput(_ output: AVCapturePhotoOutput,
didFinishProcessingPhoto photo: AVCapturePhoto,
error: Error?) {
guard error == nil else { return }
// Non-deferred photo - save directly
if !photo.isRawPhoto, let data = photo.fileDataRepresentation() {
savePhotoToLibrary(data)
}
}
// Called ONLY for deferred proxies - save to PhotoKit for later processing
func photoOutput(_ output: AVCapturePhotoOutput,
didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy,
error: Error?) {
guard error == nil else { return }
// CRITICAL: Save proxy to library ASAP before app is backgrounded
// App may be force-quit if memory pressure is high during backgrounding
guard let proxyData = deferredPhotoProxy.fileDataRepresentation() else { return }
Task {
try await PHPhotoLibrary.shared().performChanges {
let request = PHAssetCreationRequest.forAsset()
// Use .photoProxy resource type - triggers deferred processing in Photos
request.addResource(with: .photoProxy, data: proxyData, options: nil)
}
}
}
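The non-deferred branch above calls savePhotoToLibrary(_:); a minimal sketch of that helper (it and the proxy path above both assume import Photos and add-only photo library authorization):
import Photos
func savePhotoToLibrary(_ data: Data) {
    Task {
        try await PHPhotoLibrary.shared().performChanges {
            let request = PHAssetCreationRequest.forAsset()
            // .photo resource type - this image is already fully processed
            request.addResource(with: .photo, data: data, options: nil)
        }
    }
}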
When final processing happens :
Fetching images with deferred processing awareness :
// Request with secondary degraded image for smoother UX
let options = PHImageRequestOptions()
options.allowSecondaryDegradedImage = true // New in iOS 17
PHImageManager.default().requestImage(
for: asset,
targetSize: targetSize,
contentMode: .aspectFill,
options: options
) { image, info in
let isDegraded = info?[PHImageResultIsDegradedKey] as? Bool ?? false
if isDegraded {
// First: Low quality (immediate)
// Second: Medium quality (new - while processing)
// Third callback will be final quality
self.showTemporaryImage(image)
} else {
// Final quality - processing complete
self.showFinalImage(image)
}
}
Requirements : iPhone 11 Pro and newer. Not used for flash captures or formats that don't benefit from extended processing.
Important considerations :
Cost : 1 hour implementation, prevents "camera feels slow" complaints
Use case : Handle phone calls, multitasking, system camera usage.
class CameraManager {
private var interruptionObservers: [NSObjectProtocol] = []
func setupInterruptionHandling() {
// Session was interrupted
let interruptedObserver = NotificationCenter.default.addObserver(
forName: .AVCaptureSessionWasInterrupted,
object: session,
queue: .main
) { [weak self] notification in
guard let reason = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
let interruptionReason = AVCaptureSession.InterruptionReason(rawValue: reason) else {
return
}
switch interruptionReason {
case .videoDeviceNotAvailableInBackground:
// App went to background - normal, will resume
self?.showPausedOverlay()
case .audioDeviceInUseByAnotherClient:
// Another app using audio
self?.showInterruptedBanner("Audio in use by another app")
case .videoDeviceInUseByAnotherClient:
// Another app using camera
self?.showInterruptedBanner("Camera in use by another app")
case .videoDeviceNotAvailableWithMultipleForegroundApps:
// Split View/Slide Over - camera not available
self?.showInterruptedBanner("Camera unavailable in Split View")
case .videoDeviceNotAvailableDueToSystemPressure:
// Thermal state - reduce quality or stop
self?.handleThermalPressure()
@unknown default:
self?.showInterruptedBanner("Camera interrupted")
}
}
interruptionObservers.append(interruptedObserver)
// Session interruption ended
let endedObserver = NotificationCenter.default.addObserver(
forName: .AVCaptureSessionInterruptionEnded,
object: session,
queue: .main
) { [weak self] _ in
self?.hideInterruptedBanner()
self?.hidePausedOverlay()
// Session automatically resumes - no need to call startRunning()
}
interruptionObservers.append(endedObserver)
}
deinit {
interruptionObservers.forEach { NotificationCenter.default.removeObserver($0) }
}
}
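The system-pressure case above calls handleThermalPressure(); a minimal sketch that sheds load by dropping to a lighter preset (helper is hypothetical - tune the response to your app):
func handleThermalPressure() {
    showInterruptedBanner("Camera quality reduced to cool down")
    sessionQueue.async { [self] in
        session.beginConfiguration()
        // A lower-resolution preset reduces thermal load.
        if session.canSetSessionPreset(.medium) {
            session.sessionPreset = .medium
        }
        session.commitConfiguration()
    }
}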
Cost : 30 min implementation, prevents "camera freezes" bug reports
Use case : Toggle between front and back cameras.
func switchCamera() {
sessionQueue.async { [self] in
guard let currentInput = session.inputs.first as? AVCaptureDeviceInput else {
return
}
let currentPosition = currentInput.device.position
let newPosition: AVCaptureDevice.Position = currentPosition == .back ? .front : .back
guard let newDevice = AVCaptureDevice.default(
.builtInWideAngleCamera,
for: .video,
position: newPosition
) else {
return
}
session.beginConfiguration()
defer { session.commitConfiguration() }
// Remove old input
session.removeInput(currentInput)
// Add new input
do {
let newInput = try AVCaptureDeviceInput(device: newDevice)
if session.canAddInput(newInput) {
session.addInput(newInput)
// Update rotation coordinator for new device
if let previewLayer = previewLayer {
setupRotationCoordinator(device: newDevice, previewLayer: previewLayer)
}
} else {
// Fallback: restore old input
session.addInput(currentInput)
}
} catch {
session.addInput(currentInput)
}
}
}
Front camera mirroring : Front camera preview is mirrored by default (matches user expectation). Captured photos are NOT mirrored (correct for sharing). This is intentional.
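If your product does want captured front-camera photos to match the mirrored preview, you can opt in per connection; a minimal sketch (currentCameraPosition is a hypothetical property tracking the active device, and automatic adjustment must be disabled before setting the flag):
if let connection = photoOutput.connection(with: .video),
   connection.isVideoMirroringSupported {
    connection.automaticallyAdjustsVideoMirroring = false
    // Mirror the capture only for the front camera.
    connection.isVideoMirrored = (currentCameraPosition == .front)
}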
Cost : 20 min implementation
Use case : Record video with audio to file.
class CameraManager: NSObject {
let movieOutput = AVCaptureMovieFileOutput()
private var currentRecordingURL: URL?
func setupVideoRecording() {
sessionQueue.async { [self] in
session.beginConfiguration()
defer { session.commitConfiguration() }
// Set video preset
session.sessionPreset = .high // Or .hd1920x1080, .hd4K3840x2160
// Add microphone input
if let microphone = AVCaptureDevice.default(for: .audio),
let audioInput = try? AVCaptureDeviceInput(device: microphone),
session.canAddInput(audioInput) {
session.addInput(audioInput)
}
// Add movie output
if session.canAddOutput(movieOutput) {
session.addOutput(movieOutput)
}
}
}
func startRecording() {
guard !movieOutput.isRecording else { return }
let outputURL = FileManager.default.temporaryDirectory
.appendingPathComponent(UUID().uuidString)
.appendingPathExtension("mov")
currentRecordingURL = outputURL
// Apply rotation
if let connection = movieOutput.connection(with: .video) {
connection.videoRotationAngle = captureRotationAngle()
}
movieOutput.startRecording(to: outputURL, recordingDelegate: self)
}
func stopRecording() {
guard movieOutput.isRecording else { return }
movieOutput.stopRecording()
}
}
extension CameraManager: AVCaptureFileOutputRecordingDelegate {
func fileOutput(_ output: AVCaptureFileOutput,
didFinishRecordingTo outputFileURL: URL,
from connections: [AVCaptureConnection],
error: Error?) {
if let error = error {
print("Recording error: \(error)")
return
}
// Video saved to outputFileURL
saveVideoToPhotoLibrary(outputFileURL)
}
}
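The delegate above hands the finished file to saveVideoToPhotoLibrary(_:); a minimal sketch using PhotoKit (assumes import Photos and add-only library authorization, and cleans up the temporary file afterwards):
func saveVideoToPhotoLibrary(_ fileURL: URL) {
    Task {
        try await PHPhotoLibrary.shared().performChanges {
            _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
        }
        // Remove the temporary recording once it is safely in the library.
        try? FileManager.default.removeItem(at: fileURL)
    }
}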
Cost : 45 min implementation
Wrong :
func startCamera() {
session.startRunning() // Blocks UI for 1-3 seconds!
}
Right :
func startCamera() {
sessionQueue.async { [self] in
session.startRunning()
}
}
Why it matters : startRunning() is blocking. On main thread, UI freezes.
Wrong (pre-iOS 17):
// Manually tracking orientation
NotificationCenter.default.addObserver(
forName: UIDevice.orientationDidChangeNotification,
object: nil,
queue: .main
) { _ in
// Manual rotation logic...
}
Right (iOS 17+):
let coordinator = AVCaptureDevice.RotationCoordinator(device: camera, previewLayer: preview)
// Automatically tracks gravity, provides angles
Why it matters : RotationCoordinator handles edge cases (face-up, face-down) that manual tracking misses.
Wrong :
// No interruption handling - camera freezes on phone call
Right :
NotificationCenter.default.addObserver(
forName: .AVCaptureSessionWasInterrupted,
object: session,
queue: .main
) { notification in
// Show UI feedback
}
Why it matters : Without handling, camera appears frozen when interrupted.
Wrong :
session.removeInput(oldInput)
session.addInput(newInput) // May fail mid-stream
Right :
session.beginConfiguration()
session.removeInput(oldInput)
session.addInput(newInput)
session.commitConfiguration() // Atomic change
Why it matters : Without configuration block, session may enter invalid state between calls.
Context : Product wants camera feature shipped. You're considering skipping interruption handling.
Pressure : "It works when I test it, let's ship."
Reality : First user who gets a phone call while using camera will see frozen UI. App Store review may catch this.
Correct action :
Push-back template : "Camera captures work, but the app freezes if a phone call comes in. I need 30 minutes to handle interruptions properly and avoid 1-star reviews."
Context : QA reports photo capture feels sluggish. PM wants it "instant like the system camera."
Pressure : "Just make it faster somehow."
Reality : Default settings prioritize quality over speed. System camera uses deferred processing.
Correct action :
- photoQualityPrioritization = .speed for social/sharing use cases
Push-back template : "We're currently optimizing for image quality. I can make capture feel instant by prioritizing speed and showing the preview immediately while processing continues in background. This is what the system Camera app does."
Context : Designer reports front camera photos look "wrong" - they're not mirrored like the preview.
Pressure : "The preview shows it one way, the photo should match."
Reality : Preview is mirrored (user expectation - like a mirror). Photo is NOT mirrored (correct for sharing - text reads correctly). This is intentional behavior matching system camera.
Correct action :
Push-back template : "This is intentional Apple behavior. The preview is mirrored like a mirror so users can frame themselves, but the captured photo is unmirrored so text reads correctly when shared. We can add optional mirroring in post-processing if our use case requires it."
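If that optional post-processing mirroring is ever needed, a minimal sketch that flips a captured front-camera image (assumes the source image orientation is .up; other orientations need their mirrored counterparts):
import UIKit
func mirrored(_ image: UIImage) -> UIImage {
    guard let cgImage = image.cgImage else { return image }
    // Reinterpret the same pixels with a mirrored orientation - no re-encoding.
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: .upMirrored)
}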
Before shipping camera features:
Session Setup :
- startRunning() never called on main thread
- Session preset set appropriately (.photo for photos, .high for video)
- Configuration changes wrapped in beginConfiguration()/commitConfiguration()
Permissions :
- NSCameraUsageDescription in Info.plist
- NSMicrophoneUsageDescription if recording audio
Rotation :
Responsiveness :
Interruptions :
Camera Switching :
Video Recording (if applicable):
WWDC : 2021-10247, 2023-10105
Docs : /avfoundation/avcapturesession, /avfoundation/avcapturedevice/rotationcoordinator, /avfoundation/avcapturephotosettings, /avfoundation/avcapturephotooutputreadinesscoordinator
Skills : axiom-camera-capture-ref, axiom-camera-capture-diag, axiom-photo-library