Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic. Each entry below ends with its reply count, boost count, view count, and most recent activity.

Impact on iOS Due to Image Policy Changes with Android Target SDK 34
As the image access policy has changed with Android targeting SDK 34, I’m planning to update the way our app accesses photos. We are using the react-native-image-picker library to access images. On Android, the system no longer prompts the user for image access permissions, but on iOS, permission requests still appear. Since Android no longer requires explicit permissions, I’ve removed the permission request logic for Android. In this case, is it also safe to remove the permission request for iOS? In our app, photo access is only used for changing the user profile picture and attaching images when writing a post on the bulletin board. Are there any limitations or considerations for this kind of usage?
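For reference, on iOS the system photo picker runs out of process and does not require photo-library permission at all, which is the usual reason picker-based flows (profile pictures, post attachments) can skip an explicit request. Whether react-native-image-picker routes through PHPickerViewController on a given version is an assumption to verify; a minimal native sketch:

```swift
import PhotosUI
import UIKit

// Sketch: PHPickerViewController needs no NSPhotoLibraryUsageDescription and no
// authorization request for simply picking images.
func presentPhotoPicker(from viewController: UIViewController,
                        delegate: PHPickerViewControllerDelegate) {
    var configuration = PHPickerConfiguration()
    configuration.filter = .images
    configuration.selectionLimit = 1
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = delegate
    viewController.present(picker, animated: true)
}
```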
1 reply · 0 boosts · 94 views · Apr ’25
Failed to launch Photo Editing Extension from Mac Catalyst app
I have an iOS app that includes a Photo Editing Extension and is optimized for Mac Catalyst so you can edit photos in the Photos app on your Mac. This has worked really well, but now I am encountering an error alert when trying to open the photo editing extension:

RBSLaunchRequest error trying to launch plugin com.company.TestEditor.TestPhotoEditor (B7A616A7-25A8-4E02-8B32-5CAB37C8B4B2): ErrorDomain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7f08fafd0 {ErrorDomain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}}

Steps to reproduce:
1. Create a new iOS app project in Xcode.
2. Create a new target and choose iOS > Photo Editing Extension.
3. For both targets in the project, add Mac Catalyst as a supported destination.
4. Run the app on My Mac (Mac Catalyst).
5. Open the Photos app, double click a photo, click Edit, click the more plugins button, and click TestPhotoEditor in the list.

macOS 15.4.1 + Xcode 16.3
1 reply · 0 boosts · 182 views · May ’25
ImageIO fails to encode HEICS in macOS 15.5
ImageIO encoding to HEICS fails in macOS 15.5.

Log:
writeImageAtIndex:1246: *** CMPhotoCompressionSessionAddImageToSequence: err = kCMPhotoError_UnsupportedOperation [-16994] (codec: 'hvc1')

Seems to be related to https://github.com/SDWebImage/SDWebImage/issues/3732

Affected versions: iOS 18.4 (simulator and device), macOS 15.5
Unaffected versions: iOS 18.3 (simulator and device), macOS 15.3
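A minimal reproduction sketch of the failing path, assuming "public.heics" is the uniform type identifier ImageIO uses for an HEIC image sequence (the error above is reported when frames are added to the sequence on the affected OS versions):

```swift
import ImageIO
import CoreGraphics
import Foundation

// Sketch: write several frames into one HEIC sequence via ImageIO.
func writeHEICSequence(to url: URL, frames: [CGImage]) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            "public.heics" as CFString,
                                                            frames.count,
                                                            nil) else { return false }
    for frame in frames {
        CGImageDestinationAddImage(destination, frame, nil)
    }
    // Finalize reportedly fails with kCMPhotoError_UnsupportedOperation on 15.5 / 18.4+.
    return CGImageDestinationFinalize(destination)
}
```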
1 reply · 0 boosts · 115 views · Jun ’25
iOS Camera access issues in Developer mode on real device - PermissionStatus.permanentlyDenied
Xcode Version 16.3 (16E140). App developed in Flutter (Flutter 3.29.3). Test device: iPhone 16 Pro running iOS 18.5.

I have an app that requires camera access. This used to work with iOS 18.4.x. I have stripped my app down to just requesting camera permission, and even then it fails:

flutter: Camera permission: PermissionStatus.denied
flutter: Photos permission: PermissionStatus.denied
flutter: Microphone permission: PermissionStatus.denied
flutter: --- End Debug Info ---
flutter: Loaded translations from asset for en_US
container_create_or_lookup_app_group_path_by_app_group_identifier: client is not entitled (repeated six times)
flutter: CAMERA PERMISSION STATUS: PermissionStatus.permanentlyDenied

Camera permissions don't show up in my app's settings or under Settings -> Privacy and Security -> Camera, and I am at a loss to understand why this is happening.
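For comparison outside the Flutter plugin layer, the native authorization check looks like this; if it also reports denied without ever showing a prompt, the problem is below the permission_handler plugin (a minimal diagnostic sketch, not the plugin's implementation):

```swift
import AVFoundation

// Minimal native check of the camera authorization state.
func checkCameraAuthorization() {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .notDetermined:
        // The prompt has never been shown; requesting access should display it.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            print("Camera access granted: \(granted)")
        }
    case .authorized:
        print("Camera access already authorized")
    case .denied, .restricted:
        print("Camera access denied or restricted; the user must change it in Settings")
    @unknown default:
        print("Unknown camera authorization status")
    }
}
```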
1 reply · 0 boosts · 143 views · May ’25
PHAssetChangeRequest revertAssetContentToOriginal without original asset content
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. This no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
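For context, a minimal sketch of the call being described, issued inside a photo library change block:

```swift
import Photos

// Sketch: revert an asset's content to its original inside performChanges.
func revertToOriginal(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.revertAssetContentToOriginal()
    }) { success, error in
        print("Revert finished: \(success), error: \(String(describing: error))")
    }
}
```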
1 reply · 0 boosts · 130 views · Jun ’25
Memory leak when performing DetectHumanBodyPose3DRequest request
Hi, I'm developing an application for macOS and iOS that has to run the DetectHumanBodyPose3DRequest model in real time to retrieve the 3D skeleton from the camera. I'm experiencing a memory leak every time the model is used (when I comment out that line, the memory stays constant). After a minute it uses about 1 GB of RAM running with Mac Catalyst. I attached a minimal project that has this problem.

CameraView:

```swift
import SwiftUI
import Combine
import Vision

struct CameraView: View {
    @StateObject private var viewModel = CameraViewModel()

    var body: some View {
        HStack {
            ZStack {
                GeometryReader { geometry in
                    if let image = viewModel.currentFrame {
                        Image(decorative: image, scale: 1)
                            .resizable()
                            .scaledToFill()
                            .frame(width: geometry.size.width, height: geometry.size.height)
                            .clipped()
                    } else {
                        ProgressView()
                    }
                }
            }
        }
    }
}

class CameraViewModel: ObservableObject {
    @Published var currentFrame: CGImage?
    @Published var frameRate: Double = 0
    @Published var currentVisionBodyPose: HumanBodyPose3DObservation? // Store current body pose
    @Published var currentImageSize: CGSize? // Store current image size

    private var cameraManager: CameraManager?
    private var humanBodyPose = HumanBodyPose3DDetector()
    private var lastClassificationTime = Date()
    private var frameCount = 0
    private var lastFrameTime = Date()
    private let classificationThrottleInterval: TimeInterval = 1.0
    private var lastPoseSendTime: Date = .distantPast

    init() {
        cameraManager = CameraManager()
        startPreview()
        startClassification()
    }

    private func startPreview() {
        Task {
            guard let previewStream = cameraManager?.previewStream else { return }
            for await frame in previewStream {
                let size = CGSize(width: frame.width, height: frame.height)
                Task { @MainActor in
                    self.currentFrame = frame
                    self.currentImageSize = size
                    self.updateFrameRate()
                }
            }
        }
    }

    private func startClassification() {
        Task {
            guard let classificationStream = cameraManager?.classificationStream else { return }
            for await pixelBuffer in classificationStream {
                self.classifyFrame(pixelBuffer: pixelBuffer)
            }
        }
    }

    private func classifyFrame(pixelBuffer: CVPixelBuffer) {
        humanBodyPose.runHumanBodyPose3DRequestOnImage(pixelBuffer: pixelBuffer) { [weak self] observation in
            guard let self = self else { return }
            DispatchQueue.main.async {
                if let observation = observation {
                    self.currentVisionBodyPose = observation
                    print(observation)
                } else {
                    self.currentVisionBodyPose = nil
                }
            }
        }
    }

    private func updateFrameRate() {
        frameCount += 1
        let now = Date()
        let elapsed = now.timeIntervalSince(lastFrameTime)
        if elapsed >= 1.0 {
            frameRate = Double(frameCount) / elapsed
            frameCount = 0
            lastFrameTime = now
        }
    }
}
```

HumanBodyPose3DDetector:

```swift
import Foundation
import Vision

class HumanBodyPose3DDetector: NSObject, ObservableObject {
    @Published var humanObservation: HumanBodyPose3DObservation? = nil

    private let queue = DispatchQueue(label: "humanbodypose.queue")
    private let request = DetectHumanBodyPose3DRequest()

    private struct SendablePixelBuffer: @unchecked Sendable {
        let buffer: CVPixelBuffer
    }

    public func runHumanBodyPose3DRequestOnImage(pixelBuffer: CVPixelBuffer,
                                                 completion: @escaping (HumanBodyPose3DObservation?) -> Void) {
        let sendableBuffer = SendablePixelBuffer(buffer: pixelBuffer)
        queue.async { [weak self] in
            Task { [weak self, sendableBuffer] in
                do {
                    guard let self = self else { return }
                    let result = try await self.request.perform(on: sendableBuffer.buffer)
                    // process result
                    DispatchQueue.main.async {
                        if result.isEmpty {
                            completion(nil)
                        } else {
                            completion(result[0])
                        }
                    }
                } catch {
                    DispatchQueue.main.async {
                        completion(nil)
                    }
                }
            }
        }
    }
}
```
1 reply · 0 boosts · 146 views · Jun ’25
After iPadOS 26 Beta and iOS 26 Beta, AVCaptureMetadataOutput no longer detects Face on some devices.
I'm creating an app that uses AVCaptureSession to pass camera input to an AVCaptureMetadataOutput, sets the type with [metaout setMetadataObjectTypes:@[AVMetadataObjectTypeFace]], and scans for faces. After updating to iPadOS 26 Beta 2 and iOS 26 Beta 2, an issue has occurred where the delegate method of AVCaptureMetadataOutputObjectsDelegate is not called on some devices. The following devices are experiencing this issue:

- iPad (9th Gen)
- iPad Air (4th Gen)
- iPhone 15

This issue does not occur on any other devices I have. I tried running the AVFoundation sample code from the Apple Developer site on the devices above, and the same problem still occurs: https://developer.apple.com/documentation/avfoundation/capture_setup/avcambarcode_detecting_barcodes_and_faces

Are any additional settings required after the iPadOS 26 and iOS 26 betas? Or is there some problem on the OS side?
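For reference, a minimal Swift equivalent of the face-metadata setup being described (the device input wiring and start of the session are assumed to happen elsewhere):

```swift
import AVFoundation

final class FaceMetadataReader: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()
    private let metadataOutput = AVCaptureMetadataOutput()

    func configure(with input: AVCaptureDeviceInput) {
        guard session.canAddInput(input), session.canAddOutput(metadataOutput) else { return }
        session.addInput(input)
        session.addOutput(metadataOutput)
        metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
        // Face detection can only be requested after the output is attached to the session.
        if metadataOutput.availableMetadataObjectTypes.contains(.face) {
            metadataOutput.metadataObjectTypes = [.face]
        }
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // This is the delegate callback reportedly not firing on the affected devices.
        print("Detected \(metadataObjects.count) face metadata objects")
    }
}
```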
1 reply · 6 boosts · 482 views · Sep ’25
Camera becomes black for a few production users during photo capture
PLATFORM AND VERSION: iOS 18.5

I wanted to bring to your attention a critical issue some of our production users are experiencing with the CoinOut app. Specifically, users are encountering a problem when attempting to capture photos of receipts using the app's customized camera feature. The camera, which utilizes AVCaptureVideoPreviewLayer and AVCaptureDevice, occasionally fails to load the preview, resulting in a black screen instead of the expected camera view. This camera blackout issue is significantly impacting the user experience, as it prevents them from snapping photos of their receipts, which is a core functionality of the CoinOut app. Any help or suggestion for this issue would be greatly appreciated.

STEPS TO REPRODUCE
1. Open the app and tap the camera icon.
2. It will display the camera to capture a photo.
3. The camera shows black for a few production users.

```swift
import UIKit
import AVFoundation
import AudioToolbox

class ViewController: UIViewController {
    @IBOutlet private weak var captureButton: UIButton!

    private var fillLayer: CAShapeLayer!
    private var previewLayer: AVCaptureVideoPreviewLayer!
    private var output: AVCapturePhotoOutput!
    private var device: AVCaptureDevice!
    private var session: AVCaptureSession!
    private var highResolutionEnabled: Bool = false
    private let sessionQueue = DispatchQueue(label: "session queue")

    override func viewDidLoad() {
        super.viewDidLoad()
        setupCamera()
        customiseUI()
    }

    @IBAction func startCamera(sender: UIButton) {
        didTapTakePhoto()
    }

    private func setupCamera() {
        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.high
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        output = AVCapturePhotoOutput()
        device = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
        if let device = self.device {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("\(#fileID):\(#function):\(#line) : Session Input addition failed")
                }
                if session.canAddOutput(output) {
                    output.isHighResolutionCaptureEnabled = self.highResolutionEnabled
                    session.addOutput(output)
                } else {
                    print("\(#fileID):\(#function):\(#line) : Session Input high resolution failed")
                }
                previewLayer.videoGravity = .resizeAspectFill
                previewLayer.session = session
                sessionQueue.async {
                    session.startRunning()
                }
                self.session = session
                self.session.accessibilityElementIsFocused()
                try device.lockForConfiguration()
                if device.isWhiteBalanceModeSupported(AVCaptureDevice.WhiteBalanceMode.autoWhiteBalance) {
                    device.whiteBalanceMode = .autoWhiteBalance
                } else {
                    print("\(#fileID):\(#function):\(#line) : isWhiteBalanceModeSupported no supported")
                }
                if device.isWhiteBalanceModeSupported(AVCaptureDevice.WhiteBalanceMode.continuousAutoWhiteBalance) {
                    device.whiteBalanceMode = .continuousAutoWhiteBalance
                } else {
                    print("\(#fileID):\(#function):\(#line) : isWhiteBalanceModeSupported no supported")
                }
                if device.isFocusModeSupported(.continuousAutoFocus) {
                    device.focusMode = .continuousAutoFocus
                } else if device.isFocusModeSupported(.autoFocus) {
                    device.focusMode = .autoFocus
                }
                device.unlockForConfiguration()
            } catch {
                print("\(#fileID):\(#function):\(#line) : \(error.localizedDescription)")
            }
        } else {
            print("\(#fileID):\(#function):\(#line) : Device found as nil")
        }
    }

    private func customiseUI() {
        let path = UIBezierPath(roundedRect: CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height), cornerRadius: 0)
        let rectangleWidth = view.frame.width - (view.frame.width * 0.16)
        let x = (view.frame.width - rectangleWidth) / 2
        let rectangleHeight = view.frame.height - (view.frame.height * 0.16)
        let y = (view.frame.height - rectangleHeight) / 2
        let roundRect = UIBezierPath(roundedRect: CGRect(x: x, y: y, width: rectangleWidth, height: rectangleHeight), byRoundingCorners: .allCorners, cornerRadii: CGSize(width: 0, height: 0))
        roundRect.move(to: CGPoint(x: self.view.center.x, y: self.view.center.y))
        path.append(roundRect)
        path.usesEvenOddFillRule = true
        fillLayer = CAShapeLayer()
        fillLayer.path = path.cgPath
        fillLayer.fillRule = .evenOdd
        fillLayer.opacity = 0.4
        previewLayer.addSublayer(fillLayer)
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        view.bringSubviewToFront(captureButton)
    }

    private func didTapTakePhoto() {
        let settings = self.getSettings(camera: self.device)
        if device.isAdjustingFocus {
            do {
                try device.lockForConfiguration()
                device.focusMode = .continuousAutoFocus
                device.unlockForConfiguration()
                device.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
            } catch {
                print(error)
            }
        } else {
            output.capturePhoto(with: settings, delegate: self)
        }
    }

    func getSettings(camera: AVCaptureDevice) -> AVCapturePhotoSettings {
        var settings = AVCapturePhotoSettings()
        if let rawFormat = output.availableRawPhotoPixelFormatTypes.first {
            settings = AVCapturePhotoSettings(rawPixelFormatType: OSType(rawFormat))
        }
        settings.isHighResolutionPhotoEnabled = self.highResolutionEnabled
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType] as [String: Any]
        settings.previewPhotoFormat = previewFormat
        return settings
    }
}

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        AudioServicesDisposeSystemSoundID(1108)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let data = photo.fileDataRepresentation() else { return }
        let image = UIImage(data: data)!
        showImage(cropped: image)
    }

    func showImage(cropped: UIImage) {
        let vc = self.storyboard?.instantiateViewController(withIdentifier: "ImagePreviewViewController") as? ImagePreviewViewController
        vc?.captured = cropped
        self.present(vc!, animated: true)
    }
}
```
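One way to narrow down a black preview in the field is to log session interruptions and runtime errors so affected users can report what actually happened; a small diagnostic sketch (observers only, not a fix):

```swift
import AVFoundation

// Diagnostic sketch: record why the capture session stopped delivering frames.
func observeSessionProblems(for session: AVCaptureSession) {
    let center = NotificationCenter.default
    center.addObserver(forName: .AVCaptureSessionWasInterrupted, object: session, queue: .main) { note in
        let reason = note.userInfo?[AVCaptureSessionInterruptionReasonKey] ?? "unknown"
        print("Capture session interrupted, reason: \(reason)")
    }
    center.addObserver(forName: .AVCaptureSessionRuntimeError, object: session, queue: .main) { note in
        let error = note.userInfo?[AVCaptureSessionErrorKey] ?? "unknown"
        print("Capture session runtime error: \(error)")
    }
}
```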
1 reply · 0 boosts · 195 views · Jul ’25
Is a Locked Capture Extension allowed to just "open the app" when the device is unlocked?
Hey, quick question. I noticed that Adobe's new app, Project Indigo, allows you to open the app using the Camera Control button. However, when your device is locked it just shows this screen: [screenshot omitted] Would this normally be approved by the App Store review process? I ask because I would like to do something similar with my camera app. I know that this is not the best user experience, but my app's UI is not built in Swift and I don't have the resources to build the UI again. At least this way the user experience would be improved from what it is now, where users cannot even launch the app. I get many requests per week about this feature and would love to improve the UX for my users, even if it's not the best possible. Thanks, Alex
1 reply · 0 boosts · 290 views · Jul ’25
Live Photos created with PHLivePhoto API show "Motion not available" when setting as wallpaper
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows a "Motion not available" message. Here's my approach for creating Live Photos:

```swift
// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30fps, 2 seconds
// Added still-image-time metadata at middle frame

// 2. Create HEIC image with asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier // Required key for Live Photo
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict

// 3. Generate Live Photo
PHLivePhoto.request(
    withResourceFileURLs: [photoURL, videoURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    // Success - Live Photo created
}

// 4. Save to Photos library
PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: photoURL, options: nil)
PHAssetCreationRequest.forAsset().addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
```

What I've Tried
- Matching exact video specifications from the Camera app (882x1920, H.264, 30fps)
- Adding all documented metadata (content identifier, still-image-time)
- Testing various video durations (1.5s, 2s, 3s)
- Different image formats (HEIC, JPEG)
- Comparing with exiftool against working Live Photos

Expected Behavior: Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.

Actual Behavior: The system shows "Motion not available" and only allows setting the image as a static wallpaper.

Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.

Questions
1. Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
2. Is this a deliberate restriction for third-party apps, or a bug?
3. Has anyone successfully created Live Photos that work as motion wallpapers?

Environment: iOS 17.0 - 18.1, Xcode 16.0, tested on iPhone 16 Pro
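One detail worth double-checking in step 4: both resources are normally attached to a single PHAssetCreationRequest inside one performChanges block. A sketch of just that save step, using the poster's photoURL/videoURL (this is not a confirmed fix for the wallpaper eligibility question):

```swift
import Photos

// Sketch: save the paired photo and video as one Live Photo asset.
PHPhotoLibrary.shared().performChanges({
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo, fileURL: photoURL, options: nil)
    creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
}) { success, error in
    print("Saved Live Photo: \(success), error: \(String(describing: error))")
}
```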
1 reply · 1 boost · 271 views · Aug ’25
Implementation of Audio-Video Synchronization in Swift
I have a feature requirement: to switch the writer for file writing every 5 minutes, and then quickly merge the last two files. How can I ensure that the merged file is seamlessly combined and that the audio and video information remains synchronized? Currently, the merged video has glitches, and the audio is also out of sync. If there are experts who can provide solutions in this area, I would be extremely grateful.
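One common approach is to stitch the segment files together with AVMutableComposition, inserting the video and audio tracks over the same time ranges at the same cursor so they stay aligned; a sketch under that assumption (whether it removes the glitch at the writer hand-off depends on how the segments are cut):

```swift
import AVFoundation

// Sketch: append recorded segment files into one composition, keeping A/V aligned.
func mergeSegments(at segmentURLs: [URL]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    var cursor = CMTime.zero
    for url in segmentURLs {
        let asset = AVURLAsset(url: url)
        // Newer SDKs prefer the async load(.duration)/loadTracks APIs; kept simple here.
        let range = CMTimeRange(start: .zero, duration: asset.duration)
        if let sourceVideo = asset.tracks(withMediaType: .video).first {
            try videoTrack?.insertTimeRange(range, of: sourceVideo, at: cursor)
        }
        if let sourceAudio = asset.tracks(withMediaType: .audio).first {
            try audioTrack?.insertTimeRange(range, of: sourceAudio, at: cursor)
        }
        cursor = CMTimeAdd(cursor, asset.duration)
    }
    // Export with AVAssetExportSession (or play directly) once the composition is built.
    return composition
}
```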
1 reply · 0 boosts · 218 views · Aug ’25
How can I create my own Genlock hardware for the iPhone 17 Pro?
What options do I have if I don't want to use Blackmagic's Camera ProDock as the external Sync Hardware, but instead I want to create my own USB-C hardware accessory which would show up as an AVExternalSyncDevice on the iPhone 17 Pro? Which protocol does my USB-C device have to implement to show up as an eligible clock device in AVExternalSyncDevice.DiscoverySession?
1 reply · 0 boosts · 845 views · Sep ’25
Blurry Depth Data since iPhone 13
I tested the accuracy of the depth map on iPhone 12, 13, 14, 15, and 16, and found that the variance of the depth map on models after the iPhone 12 is significantly greater than on the iPhone 12. Enabling depth filtering causes the depth data to be affected by the previous frame, adding unnecessary noise, especially when the phone is moving. This is not friendly for high-precision reconstruction. I tried adding depth map smoothing in post-processing to compensate for the large depth deviation, but the results are still poor. Are there any depth map smoothing solutions already announced by Apple?
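If the depth frames come from an AVCaptureDepthDataOutput (an assumption; ARKit's smoothedSceneDepth frame semantic is a separate switch), the temporal filtering described above is controlled per output, e.g.:

```swift
import AVFoundation

// Sketch: request unfiltered, per-frame depth so previous frames are not blended in.
func addDepthOutput(to session: AVCaptureSession) -> AVCaptureDepthDataOutput? {
    let depthOutput = AVCaptureDepthDataOutput()
    guard session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(depthOutput)
    depthOutput.isFilteringEnabled = false // true enables temporal smoothing across frames
    return depthOutput
}
```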
1 reply · 0 boosts · 70 views · Sep ’25
Orientation does not work on iPhone 17 and above.
I'm receiving output from an AVCaptureSession and capturing an image using Vision, but the image comes out in landscape orientation instead of portrait. Even when I set the orientation to .up on the CIImage, CGImage, and UIImage, the image is still output in landscape orientation. On iPhone 16 and earlier, the image is output in portrait orientation, but on iPhone 17 and later it is output in landscape orientation. Please help.
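One hedged suggestion: rather than rotating the output images, tell Vision how the incoming pixel buffer is rotated; whether .right is still the correct mapping for the iPhone 17's sensor/connection orientation is exactly the open question here, so treat the value below as an assumption to verify per device:

```swift
import Vision
import AVFoundation

// Sketch: pass the buffer's orientation to Vision instead of rotating images afterwards.
// .right is the usual mapping for a portrait UI with the back camera's landscape-native buffers.
func detectFaces(in sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .right,
                                        options: [:])
    let request = VNDetectFaceRectanglesRequest()
    try? handler.perform([request])
    print("Faces: \(request.results?.count ?? 0)")
}
```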
1 reply · 1 boost · 207 views · 17h
Error capturing ProRAW using iPhone 17 Pro Telephoto with photoQualityPrioritization set to .Quality
Hey, I'm having a very strange issue on my iPhone 17 Pro. I'm trying to capture a 12MP ProRAW image using the Telephoto lens with photoQualityPrioritization set to .quality. Unfortunately I receive this error when trying to capture the image:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSUnderlyingError=0x134f7a1f0 {Error Domain=NSOSStatusErrorDomain Code=-16802 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-16802), AVErrorRecordingFailureDomainKey=4, NSLocalizedDescription=The operation could not be completed}

The photo captures correctly at 7.9x zoom; it's only a problem when the zoom goes over 8x. Also, it's only this particular combination of settings that causes the issue. I'm able to capture an image if I either:

- Set quality to .balanced
- Set max dimensions to 48MP
- Capture a JPEG image instead of a ProRAW image
- Use the TripleCamera fusion lens

Any help would be greatly appreciated. Alex
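For reference, a minimal sketch of the configuration being described (ProRAW format selection plus .quality prioritization); it assumes isAppleProRAWEnabled and maxPhotoQualityPrioritization have already been set on the output:

```swift
import AVFoundation

// Sketch of the failing configuration: ProRAW capture with quality prioritization.
func makeProRAWSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // Requires output.isAppleProRAWEnabled = true, otherwise no ProRAW formats are listed.
    guard let proRAWFormat = output.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
    }) else { return nil }
    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWFormat)
    // Requires output.maxPhotoQualityPrioritization = .quality to have been set beforehand.
    settings.photoQualityPrioritization = .quality
    return settings
}
```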
1 reply · 0 boosts · 246 views · Sep ’25