Device: iPhone 16 Pro Max
OS: iOS 26.0.1
AVCaptureDevice.DeviceType: builtInUltraWideCamera
activeFormat: <AVCaptureDeviceFormat: 0x10ffb9ac0 'vide'/'420v' 1440x1080, { 1- 60 fps}, photo dims:{1440x1080,2016x1512}, fov:101.022, gdc fov:103.625, binned, max zoom:94.50 (upscales @1.40), system zoom range:1.0-3.0, AF System:1, ISO:15.0-3600.0, SS:0.000023-1.000000, system exposure bias range:-2.0-2.0, supports multicam, supports CS RoI, supports Smart Style, supports Smudge Detection>
API: device.isFocusModeSupported(.continuousAutoFocus) == true
setting: device.focusMode = .continuousAutoFocus
Setting the mode succeeds, but continuous autofocus does not actually take effect.
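For reference, here is a minimal sketch of how I'm configuring the device (session setup and error handling trimmed):

import AVFoundation

// Minimal sketch of my configuration path; the session is set up elsewhere.
guard let device = AVCaptureDevice.default(.builtInUltraWideCamera, for: .video, position: .back) else { fatalError("no ultra-wide camera") }
do {
    try device.lockForConfiguration()
    if device.isFocusModeSupported(.continuousAutoFocus) {
        device.focusMode = .continuousAutoFocus   // succeeds, but focus never tracks
    }
    device.unlockForConfiguration()
} catch {
    print("lockForConfiguration failed: \(error)")
}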
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
On the iPhone 17, the front camera's new 18MP square sensor, combined with iOS 26's Center Stage feature, can auto-rotate the capture between portrait and landscape as the native iOS Camera app does.
Is there a Swift or AVFoundation API that allows developers to manually control front camera orientation in the same way the native Camera app does? Or is this auto-rotation strictly handled by the system without public API access?
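For context, the closest approach I've found is a sketch using AVCaptureDevice.RotationCoordinator (iOS 17+) to rotate the capture manually. It assumes frontCamera, previewLayer, and photoOutput already exist; it handles device rotation, but it does not reproduce the automatic square-sensor reframing the system Camera app does:

import AVFoundation

// frontCamera, previewLayer, and photoOutput are assumed to be set up elsewhere.
// In real code, retain the coordinator and KVO-observe its angle properties.
func applyCaptureRotation(frontCamera: AVCaptureDevice,
                          previewLayer: AVCaptureVideoPreviewLayer,
                          photoOutput: AVCapturePhotoOutput) {
    // The coordinator tracks device orientation relative to gravity.
    let coordinator = AVCaptureDevice.RotationCoordinator(device: frontCamera, previewLayer: previewLayer)
    let angle = coordinator.videoRotationAngleForHorizonLevelCapture
    if let connection = photoOutput.connection(with: .video), connection.isVideoRotationAngleSupported(angle) {
        connection.videoRotationAngle = angle
    }
}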
The introduction of PHBackgroundResourceUploadExtension in iOS 26.1 is a welcome addition. I wonder, however, how to attach a debugger and actually get the system to call the extension's process() method. I tried running the extension both from the Photos app and from the main app for testing, but when I take a photo or add (save) photos to the library, the process() method never gets called. Any hints on how to debug a PHBackgroundResourceUploadExtension during development would be appreciated.
We are planning to develop an app that connects to a UVC camera to capture and display video via AVFoundation. Could you please advise on which iPhone models support UVC cameras?
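For reference, this is the discovery code we plan to use, a minimal sketch built on the .external device type (iOS 17+); on models without support, the session should simply return no devices:

import AVFoundation

// Discover connected UVC (external) cameras.
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.external], mediaType: .video, position: .unspecified)
for camera in discovery.devices {
    print(camera.localizedName)
}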
Hello, We are trying to use the new Background Upload Extension to improve uploads of assets (Photos, Live Photos, Videos) in the background in our application.
1. The assets have finished uploading, but I'm unable to retrieve the successful records using PHAssetResourceUploadJob.fetchJobs(action: .acknowledge, options: nil). When will the successful records be returned?
2. How can we retrieve the system's pending tasks? We want to cancel tasks handed over to the system when returning to the main app, to avoid duplicate resource uploads.
3. When we set UploadJobExtensionEnabled = true, will tasks handed over to the system still execute after returning to the main app? Do we need to set UploadJobExtensionEnabled = false upon returning to the main app? If we set UploadJobExtensionEnabled = false, will previously submitted upload tasks be cleared?
I am developing an iOS camera app that can record video directly to external storage connected to an iPhone.
To detect whether an external USB storage device is connected and to obtain its URL, I am considering using AVExternalStorageDeviceDiscoverySession.
However, when checking support using AVExternalStorageDeviceDiscoverySession.isSupported, I observe that it returns true only on Pro model iPhones, and false on non-Pro models in my environment.
I have reviewed Apple’s official documentation, but I could not find any clear description of the supported devices or requirements (for example, whether this API is limited to Pro models or requires specific hardware capabilities).
I would appreciate any information regarding the following points:
●The actual requirements for AVExternalStorageDeviceDiscoverySession to be supported:
∙ Device limitations (Pro vs. non-Pro models)
∙ Hardware requirements (USB controller, external recording capability, etc.)
∙ iOS version dependencies
●Whether support for non-Pro models is planned in the future
Tested environments:
iPhone 16 Pro (iOS 18.7.1) → isSupported == true
iPhone 16e (iOS 26.2) → isSupported == false
iPhone 17 (iOS 26.2) → isSupported == false
iPhone Air (iOS 26.2) → isSupported == false
If anyone has observed similar behavior or has official information from Apple regarding this API, I would greatly appreciate your insights.
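In case it's useful, here is the minimal check we're running (a sketch; on non-Pro devices it never gets past the first line):

import AVFoundation

// Enumerate connected external storage devices when supported.
if AVExternalStorageDeviceDiscoverySession.isSupported,
   let session = AVExternalStorageDeviceDiscoverySession.shared {
    for storage in session.externalStorageDevices {
        // Ask the device for a writable URL with a given path extension.
        let urls = try? storage.nextAvailableURLs(withPathExtensions: ["mov"])
        print(storage.displayName ?? "unknown device", urls ?? [])
    }
}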
Hi everyone,
I'm developing a camera application that requires precise, predictable control over the focus system. I'm encountering unexpected behavior with face-driven autofocus in continuous autofocus mode.
Issue:
When using AVCaptureDevice.FocusMode.continuousAutoFocus, the system continues to prioritize faces for focus even after attempting to disable face-driven autofocus with:
device.automaticallyAdjustsFaceDrivenAutoFocusEnabled = false
device.isFaceDrivenAutoFocusEnabled = false
Observations:
The behavior is inconsistent across different scenes.
In well-lit/properly exposed scenes: focus persistently locks onto faces, ignoring my configuration.
In underexposed scenes: the intended focus behavior is more consistently respected.
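Here is the full configuration sequence I'm using (a sketch with error handling trimmed), in case the ordering matters:

import AVFoundation

func configureFocus(_ device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    // Automatic adjustment must be disabled before the manual flag is honored.
    device.automaticallyAdjustsFaceDrivenAutoFocusEnabled = false
    device.isFaceDrivenAutoFocusEnabled = false
    device.focusMode = .continuousAutoFocus
}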
Hi everyone, does anybody have any resources I could check out regarding the 48->12MP binning behavior on supported sensors? I know the 48MP sensor on iPhone can automatically bin pixels for better low-light performance, but I'm not sure how to reliably make this happen in practice.
On iPhone 14 Pro+ with a 48MP sensor, I want the best of both worlds for ProRAW:
∙ Bright light: 48MP full resolution
∙ Low light: 12MP pixel-binned for better noise
photoOutput.maxPhotoDimensions = CMVideoDimensions(width: 8064, height: 6048)
let settings = AVCapturePhotoSettings(rawPixelFormatType: proRawFormat, processedFormat: [...])
settings.photoQualityPrioritization = .quality
// NOT setting settings.maxPhotoDimensions — always get 12MP
When I omit maxPhotoDimensions, iOS always returns 12MP regardless of lighting. When I set it to 48MP, I always get 48MP.
Is there an API to let iOS automatically choose the optimal resolution based on conditions, or should I detect low light myself (via device.iso / exposureDuration) and set maxPhotoDimensions accordingly?
Any help or direction would be much appreciated!
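For what it's worth, the manual fallback I'm considering looks like this (a sketch; the thresholds are my own guesses, not documented values):

import AVFoundation

// Pick per-shot dimensions from the current exposure state.
func preferredDimensions(for device: AVCaptureDevice) -> CMVideoDimensions {
    // Hypothetical low-light heuristic: high ISO or a slow shutter.
    let lowLight = device.iso > 800 || device.exposureDuration > CMTime(value: 1, timescale: 30)
    return lowLight ? CMVideoDimensions(width: 4032, height: 3024)  // 12MP binned
                    : CMVideoDimensions(width: 8064, height: 6048)  // 48MP full
}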
Dear Apple Technical Support Team,
Greetings! I am an iOS app developer, currently upgrading the photo app I develop. Recently I noticed the new Spatial Photos feature added in iOS 26, which brings an immersive 3D photo experience to users. We hope to integrate similar capabilities into our own app to give users a richer photo-viewing experience.
Through technical research, we found that on Apple Vision Pro a similar spatial-photo display effect can be achieved through the ImagePresentationComponent.Spatial3DImage interface. However, our tests show that this interface is only available on visionOS and cannot be called on iOS. Since iOS 26 already natively supports the Spatial Photos feature, we would like to know how third-party photo apps can gain the same capability.
We would sincerely appreciate technical support from your team, mainly on the following questions:
1. Are there any official APIs, SDKs, or development frameworks for iOS 26 that allow third-party apps to implement core functions such as the generation and display of spatial photos?
2. If no public interfaces are currently available, are there any other compliant technical solutions or alternative paths to achieve a similar effect?
3. For third-party apps integrating the spatial photo feature, are there any relevant development documents, technical specifications, or review requirements that must be followed?
We have completed the app's basic feature iteration and have the technical capability to adapt to new functionality quickly. We hope to receive guidance and support from your team to help us bring a better product experience to iOS users.
Attached are our app's details and a detailed report on interface compatibility from our testing, for your reference. If you need any further information, please feel free to let us know.
Thank you for taking the time to review this email; we look forward to your reply!
On iPhone 16 Pro Max (other devices not tested) there's a noticeable jump in the framing of the preview video when you record in the iOS AVCam sample app. The same jump in camera framing can be observed by switching to the front-facing camera and then back to the rear one.
It looks roughly consistent with switching between the 0.5x and 1x cameras (though not quite a match for the same viewable area in the Camera app), and it only happens when the app initially loads; once recording has started, the preview retains the 'closer' framing no matter how many times recording is stopped and started thereafter.
I'm relatively new to Swift and haven't done anything with the camera before, so odd 'buggy' behaviour in the sample code isn't helping me understand it! :-)
Is there any way to fix this?
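In case it helps to narrow things down, the workaround I've been experimenting with is pinning the zoom factor after the session is configured (purely my own guess, not a confirmed fix):

import AVFoundation

// device is the current back camera. Pin the default zoom so later
// reconfiguration can't change the preview framing.
func pinDefaultZoom(_ device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        device.videoZoomFactor = 1.0
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device: \(error)")
    }
}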
Some of our app's users are repeatedly hitting a crash in NeutrinoCore at -[NUIdentifier initWithNamespace:name:version:] + 2352. From the stack trace, it looks like multiple threads are performing PHFetchRequests, but that shouldn't cause a crash. It's isolated to a small number of users, which makes me think it's something specific to their Photos databases (e.g., data corruption).
Do you have any suggestions for how I might resolve this?
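For context, the public entry point at the bottom of the trace corresponds to a content-editing-input request like the sketch below (asset is a placeholder; our actual call sites differ):

import Photos

// Sketch of the kind of request that produces this stack.
func loadEditingInput(for asset: PHAsset) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: options) { input, info in
        // The abort happens inside Photos/NeutrinoCore before this handler runs.
    }
}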
2   libsystem_c.dylib         abort + 124
3   NeutrinoCore              -[NUAssertionPolicyCrashReport notifyAssertion:] + 66
4   NeutrinoCore              -[NUAssertionPolicyComposite notifyAssertion:] + 160
5   NeutrinoCore              -[NUAssertionPolicyUnique notifyAssertion:] + 176
6   NeutrinoCore              -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7   NeutrinoCore              _NUAssertFailHandler + 176
8   NeutrinoCore              -[NUIdentifier initWithNamespace:name:version:] + 2352
9   NeutrinoCore              -[NUIdentifier initWithName:version:] + 84
10  NeutrinoCore              -[NUIdentifier initWithName:] + 68
11  PhotoImaging              +[PISchema identifier] + 36
12  PhotoImaging              +[PISchema registeredPhotosSchemaIdentifier] + 32
13  PhotoImaging              +[PIPhotoEditHelper newComposition] + 28
14  PhotoImaging              +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15  PhotoImaging              +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16  PhotoLibraryServices      -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1848
17  PhotoLibraryServices      +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18  Photos                    __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19  Photos                    -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20  Photos                    -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21  Photos                    -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 356
22  Photos                    -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23  Photos                    -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24  Photos                    -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 92
25  Photos                    -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26  Photos                    __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27  PhotoLibraryServicesCore  __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.84 + 880
28  CoreFoundation            ___invoking___ + 148
29  CoreFoundation            -[NSInvocation invoke] + 424
30  Foundation                <deduplicated_symbol> + 16
31  Foundation                -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 528
32  Foundation                __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33  libxpc.dylib              _xpc_connection_reply_callout + 124
42  libsystem_pthread.dylib   start_wqthread + 8