Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic. Each entry lists the post, followed by its replies, boosts, views, and latest activity.

Do we need an MFi authentication chip to send video/commands to iPhone?
Hello, my company is developing a product that will send data to/from the phone over cable and Wi-Fi. I have three questions: Do we need an MFi authentication chip in our product if we plan to send video and commands to the iPhone/iPad over a USB or Lightning cable? Likewise, do we need an MFi authentication chip for communication over Wi-Fi? (Informal research suggests the answer to this one is no.) And do we even still need MFi certification at all for Wi-Fi comms? (We are not using HomeKit.) Thank you!
Replies: 0 · Boosts: 0 · Views: 437 · Activity: Feb ’25
PHPhotoLibrary.shared().performChanges raises error 3303
I want to modify a photo's EXIF information: put the original image through CGImageDestinationAddImageFromSource, then supply the modified EXIF dictionary via the properties parameter. Here is the complete process:

static func setImageMetadata(asset: PHAsset, exif: [String: Any]) {
    let options = PHContentEditingInputRequestOptions()
    options.canHandleAdjustmentData = { (adjustmentData) -> Bool in
        return true
    }
    asset.requestContentEditingInput(with: options, completionHandler: { input, map in
        guard let inputNN = input else { return }
        guard let url = inputNN.fullSizeImageURL else { return }
        let output = PHContentEditingOutput(contentEditingInput: inputNN)
        let adjustmentData = PHAdjustmentData(
            formatIdentifier: AppInfo.appBundleId(),
            formatVersion: AppInfo.appVersion(),
            data: Data())
        output.adjustmentData = adjustmentData
        let outputURL = output.renderedContentURL
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return }
        guard let dest = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else { return }
        CGImageDestinationAddImageFromSource(dest, source, 0, exif as CFDictionary)
        let d = CGImageDestinationFinalize(dest)
        // d is true; I checked the content of outputURL — the image has been
        // written correctly and can be converted to a UIImage without problems.
        PHPhotoLibrary.shared().performChanges {
            let changeReq = PHAssetChangeRequest(for: asset)
            changeReq.contentEditingOutput = output
        } completionHandler: { succ, err in
            if !succ {
                print(err) // 3303 here, always!
            }
        }
    })
}
Replies: 1 · Boosts: 0 · Views: 375 · Activity: Feb ’25
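A hedged first step for diagnosing the 3303 in the post above (nothing here is confirmed as the root cause): map the raw code through PHPhotosError instead of guessing from the integer, and verify the rendered file still exists at renderedContentURL when the change block runs. The `asset` and `output` variables are assumed from the post's code:

```swift
import Photos

PHPhotoLibrary.shared().performChanges {
    let changeReq = PHAssetChangeRequest(for: asset)
    changeReq.contentEditingOutput = output
} completionHandler: { success, error in
    if !success, let error {
        // Compare against PHPhotosError.Code cases rather than raw 3303.
        let phError = error as? PHPhotosError
        print(phError?.code as Any, error.localizedDescription)
        // Sanity-check that the rendered JPEG is still on disk when
        // the library goes to pick it up.
        print("rendered file exists:",
              FileManager.default.fileExists(atPath: output.renderedContentURL.path))
    }
}
```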
Setting the capture device color space to Apple Log doesn't work
I set the device format and color space to Apple Log and turn off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

session.sessionPreset = .inputPriority
// get the back camera
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera],
    mediaType: .video,
    position: .back)
backCamera = deviceDiscoverySession.devices.first!
try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()
do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}
// add output
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080,
 "AVVideoCodecKey": apch, <----- ProRes has been enabled.
 "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
*/
previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
}
Replies: 1 · Boosts: 0 · Views: 416 · Activity: Mar ’25
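A hedged observation on the session setup above (not confirmed as the root cause): adding inputs and outputs can trigger a session reconfiguration that resets the device's activeFormat and activeColorSpace, even under .inputPriority. A minimal sketch that wires the session first and asserts Apple Log last, inside begin/commitConfiguration:

```swift
import AVFoundation

func configureAppleLog(session: AVCaptureSession, backCamera: AVCaptureDevice) throws {
    session.beginConfiguration()
    defer { session.commitConfiguration() }
    session.sessionPreset = .inputPriority

    let input = try AVCaptureDeviceInput(device: backCamera)
    if session.canAddInput(input) { session.addInput(input) }

    let output = AVCaptureMovieFileOutput()
    if session.canAddOutput(output) { session.addOutput(output) }

    // Configure the device *after* wiring inputs/outputs: adding I/O can
    // make the session reconfigure the device and silently reset the
    // active format and color space.
    try backCamera.lockForConfiguration()
    if let logFormat = backCamera.formats.first(where: {
        $0.supportedColorSpaces.contains(.appleLog)
    }) {
        backCamera.activeFormat = logFormat
        backCamera.activeColorSpace = .appleLog
    }
    backCamera.unlockForConfiguration()
}
```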
Distortion corrected images
Hi, I am currently developing a 3D reconstruction project, which requires images that are distortion-free (rectilinear) and have known intrinsics. The session I am developing on is a builtInDualWideCamera, with isGeometricDistortionCorrectionEnabled set to false (to be able to get the intrinsic matrix of the images), isVirtualDeviceConstituentPhotoDeliveryEnabled set to true and isAutoVirtualDeviceFusionEnabled set to false (to get both images), and isCameraCalibrationDataDeliveryEnabled set to true (to actually get the calibration data). The distortion correction parameters such as lensDistortionLookupTable are used: the 42-coefficient mapping array is applied with a simple piecewise linear interpolation, as described in the AVCameraCalibrationData header file. There are two questions I would like support on:

1. A way to store the calibration parameters in each image. My current approach writes them into kCGImagePropertyExifDictionary -> "UserComment". Is there a better approach to write calibration parameter data into the images? This feels a bit dirty and there might be a neater approach.

2. For the ultra-wide camera's images, the lensDistortionLookupTable contains several zeros at the end of the array. For example (last 10 elements are zero): "LensDistortionLookupTable":"0.000000000000000,0.000349554029526,0.001385628827848,0.003071037586778,... ,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000". The problem: when the complete array (including the zeros) is used to correct the image, the result is warped like a circle near the edges, which is completely wrong. In contrast, if the lensDistortionLookupTable is used without the trailing zeros and the new size is accommodated, the image looks better (although not as rectilinear as an image from the iPhone's camera app), but definitely less distorted. (Attached images: including zeros (full array); excluding zeros (array size changed).) Am I missing an important point in the usage of the lensDistortionLookupTable where this case (zeros at the end) is addressed? What is the criterion to shrink/exclude elements of the array? Any advice is very much welcome.
Replies: 0 · Boosts: 0 · Views: 417 · Activity: Feb ’25
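For the lookup-table step described in the post above, a minimal sketch of the piecewise-linear interpolation from the AVCameraCalibrationData header comments; the trailing-zero trim is an assumption (one pragmatic reading of the poster's data), not documented behavior:

```swift
// Piecewise-linear lookup: the table spans normalized radii 0...1
// (optical center out to the farthest image corner).
func interpolatedValue(at normalizedRadius: Float, table: [Float]) -> Float {
    guard table.count > 1 else { return table.first ?? 0 }
    let clamped = min(max(normalizedRadius, 0), 1)
    let position = clamped * Float(table.count - 1)
    let index = Int(position)
    guard index < table.count - 1 else { return table[table.count - 1] }
    let fraction = position - Float(index)
    return table[index] + fraction * (table[index + 1] - table[index])
}

// Assumption: trailing zeros are padding rather than real coefficients,
// so trim them (and rescale the radius range) before interpolating.
func trimmingTrailingZeros(_ table: [Float]) -> [Float] {
    var trimmed = table
    while trimmed.count > 1, trimmed.last == 0 { trimmed.removeLast() }
    return trimmed
}
```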
Accessing AV External Storage
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via a USB-C or Lightning connector) on an iPad/iPhone? I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad Gen 10 (iPadOS 18.3.1), and both return false:

// returns false on iPhone 14, iPad gen 10
print(AVExternalStorageDeviceDiscoverySession.isSupported)

The following returns nil when I try to access the external storage discovery session:

// returns nil on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)

The following returns false, without displaying a permission dialog:

AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})

What types of iOS devices are supported by AVExternalStorageDeviceDiscoverySession? What situations is it meant for (e.g. connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, accessing photos from a USB drive)? Is there sample code for using the AVExternalStorage API?
Replies: 0 · Boosts: 0 · Views: 410 · Activity: Feb ’25
Picture in Picture (AVPictureInPictureController) shows a black screen when the app is built with Xcode 16 or later and the system camera is opened
I found that when my app is built with Xcode 16 or a later development tool, after I enable the Picture-in-Picture floating window and then open the system camera, the content in the floating window is no longer displayed — the window goes black. This does not happen when the same code is built with Xcode 15.4. I don't know why.
Replies: 6 · Boosts: 0 · Views: 655 · Activity: Apr ’25
Crash in Photos framework
Hi, this is one of our top crashes. It does not contain any of our code in the stack trace, and we can't reproduce it, which makes this crash very hard to understand and fix. We know that most of the crashes happen on iPhone 13 with iOS 18.x.x. We also see that many cases occur when the app goes into the background (the stack trace contains -[UIApplication _applicationDidEnterBackground]).

2025-03-04_16-06-00.3670_-0500-6a273c7d5da97f098b5cc24898bb9761dc45208e.crash
2025-03-04_20-21-08.6609_-0500-2c08f640900f8a62c4f8a4f6f2a61feb052e66dd.crash
2025-03-04_20-46-27.7138_+0000-4d7ea89b1b564eda22ca63e708f7ad3909c7b768.crash
Replies: 2 · Boosts: 0 · Views: 436 · Activity: Mar ’25
iPhone 16 Camera Control and AVCaptureSlider – Is there a way to detect which slider is active?
I am following the Apple sample code and trying to add a manual focus lens position slider:

@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil
    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}

There are these AVCaptureSessionControlsDelegate methods:

final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}
final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}
final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}
final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}

When self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus, and also from manual focus by the user through the app UI. Is there a way to see whether self.cameraControlFocusSlider is active or being used? Please note that I will have more than one AVCaptureSlider in the final code.
Replies: 0 · Boosts: 0 · Views: 428 · Activity: Mar ’25
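I'm not aware of an official "slider is active" query; one hedged workaround for the post above (assuming AVCaptureSlider's value property is settable from code, and reusing the post's videoDevice/sessionQueue) is to mirror lensPosition into the slider via KVO and treat recent slider callbacks as user interaction, so the two sources don't fight:

```swift
// Inside the camera controller class:
private var lensPositionObservation: NSKeyValueObservation?
private var lastSliderEvent = Date.distantPast

func bindFocusSlider(_ slider: AVCaptureSlider, to device: AVCaptureDevice) {
    // Reflect autofocus-driven lens changes back into the slider UI.
    lensPositionObservation = device.observe(\.lensPosition, options: [.new]) { [weak self] _, change in
        guard let self, let position = change.newValue else { return }
        // Skip updates right after a slider callback so we don't fight the user.
        if Date().timeIntervalSince(self.lastSliderEvent) > 0.5 {
            slider.value = position
        }
    }
    slider.setActionQueue(sessionQueue) { [weak self] focusValue in
        self?.lastSliderEvent = Date()
        // ... lock focus at focusValue, as in the post ...
    }
}
```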
Prevent iOS from Switching Between Back Camera Lenses in getUserMedia (Safari/WebView, iOS 18)
I’m developing a hybrid app (WebView / Turbo Native) that uses getUserMedia to access the back camera for a PPG/heart-rate measurement feature (the user places their finger on the camera). Problem: even when I specify constraints like:

{
  video: {
    deviceId: '...',
    facingMode: { exact: 'environment' },
    advanced: [{ zoom: 1.0 }]
  },
  audio: false
}

on iPhone 15 (iOS 18), iOS unexpectedly switches between the wide, ultra-wide, and telephoto lenses during the measurement. This breaks the heart rate detection and forces the user to move their finger in the middle of the measurement. Question: is there any way, via getUserMedia/WebRTC, to force iOS to use only the wide-angle lens and prevent automatic lens switching? I know that with AVFoundation (Swift) you can pick .builtInWideAngleCamera, but I’m hoping to avoid building a custom native layer and would prefer to stick with WebView/JavaScript if possible to save time and complexity. Any suggestions, workarounds, or updates from Apple would be greatly appreciated! Thanks a lot!
Replies: 1 · Boosts: 0 · Views: 397 · Activity: Mar ’25
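If a native fallback for the getUserMedia post above ever becomes unavoidable, the AVFoundation approach the poster mentions is small. A minimal sketch (function name is illustrative) that pins capture to the single-lens wide-angle device, so iOS has no other lens to switch to:

```swift
import AVFoundation

func makeWideAngleOnlySession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    // .builtInWideAngleCamera is one physical lens; unlike virtual devices
    // (e.g. .builtInTripleCamera), it cannot switch lenses automatically.
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else {
        throw NSError(domain: "Camera", code: 1)
    }
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }
    return session
}
```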
Error creating a Live Photo
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds). Two permission keys have been added in Xcode: Privacy - Photo Library Usage Description and Privacy - Photo Library Additions Usage Description. Simulator: iPhone 16, iOS 18.3. The code in ContentView.swift:

private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}

guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        xxxxx
    } else {
        xxxxx
    }
}

Really need help, thanks!
Replies: 0 · Boosts: 0 · Views: 193 · Activity: Mar ’25
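A hedged pairing check for the Live Photo post above: a still/video pair is only merged into a Live Photo when both carry the same content identifier (in the JPEG's Apple maker note and in the MOV's QuickTime metadata). A small sketch to inspect the video side:

```swift
import AVFoundation

// Sketch: print the video's com.apple.quicktime.content.identifier.
// If it is missing, or doesn't match the identifier in the JPEG,
// the pair won't be saved as a Live Photo.
func printContentIdentifier(of videoURL: URL) async throws {
    let asset = AVURLAsset(url: videoURL)
    let metadata = try await asset.load(.metadata)
    let items = AVMetadataItem.metadataItems(
        from: metadata,
        filteredByIdentifier: .quickTimeMetadataContentIdentifier)
    if let item = items.first {
        print("content identifier:", try await item.load(.stringValue) ?? "nil")
    } else {
        print("no content identifier — pairing will likely fail")
    }
}
```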
PHPickerResult takes a long time to load large videos
On some devices, loadFileRepresentation(forTypeIdentifier:completionHandler:) takes a long time (about two minutes) to call back with the result for some large videos (about 200 MB, taken by the device camera).

Environment:
Model: iPhone 12
Model Number: MGGM3CH/A
iOS Version: 18.3.2

PHPickerResult.NSItemProvider.loadFileRepresentation():

// import PhotosUI
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    guard let provider = results.last?.itemProvider else { return }
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    Task {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url = url else { return }
            // Do some stuff...
        }
    }
}

PS: I also tried some other functions, e.g. provider.loadItem(forTypeIdentifier:), but they didn't work either.
Replies: 1 · Boosts: 0 · Views: 77 · Activity: Mar ’25
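A hedged guess at the delay above: by default the picker may transcode the video into a compatible representation before handing over a file, which for a 200 MB capture can take minutes. Asking for the current representation skips that step, assuming the receiver can handle whatever codec the capture used:

```swift
import PhotosUI

var configuration = PHPickerConfiguration(photoLibrary: .shared())
configuration.filter = .videos
// .current hands over the asset as-is instead of transcoding it first;
// the app must then cope with HEVC/Dolby Vision captures itself.
configuration.preferredAssetRepresentationMode = .current
let picker = PHPickerViewController(configuration: configuration)
```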
PHPickerResult returns different data for the same media
On some devices, when I select the same media multiple times, the data returned by `loadFileRepresentation(forTypeIdentifier:completionHandler:)` is different (`data.count` is not equal).

Environment:
* Model: iPhone 12
* Model Number: MGGM3CH/A
* iOS Version: 18.3.2

```swift
// import PhotosUI
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    guard let provider = results.last?.itemProvider else { return }
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    Task {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url = url else { return }
            if let data = try? Data(contentsOf: url) {
                print("data count is: \(data.count)")
            }
        }
    }
}
```

PS: I also tried some other functions, e.g. `provider.loadItem(forTypeIdentifier:)`, but they didn't work either.
Replies: 1 · Boosts: 0 · Views: 54 · Activity: Mar ’25
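A hedged explanation for the post above: each picker request may produce a freshly transcoded file, so byte counts can legitimately differ between runs. If byte-stable data matters, one sketch copies the original resource instead — this assumes photo-library read access and an assetIdentifier, so the picker must be created with PHPickerConfiguration(photoLibrary:):

```swift
import Photos

// Sketch: fetch the original video resource for a picked asset identifier.
func copyOriginalVideo(assetIdentifier: String, to destination: URL) {
    let assets = PHAsset.fetchAssets(withLocalIdentifiers: [assetIdentifier], options: nil)
    guard let asset = assets.firstObject,
          let resource = PHAssetResource.assetResources(for: asset)
            .first(where: { $0.type == .video }) else { return }
    // Writes the stored bytes directly, with no transcoding step.
    PHAssetResourceManager.default().writeData(
        for: resource, toFile: destination, options: nil) { error in
        print(error ?? "copied original bytes")
    }
}
```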
Vision Framework VNTrackObjectRequest: Minimum Valid Bounding Box Size Causing Internal Error (Code=9)
I'm developing a tennis ball tracking feature using the Vision framework in Swift, specifically utilizing VNDetectedObjectObservation and VNTrackObjectRequest. Occasionally (but not always), I receive the following runtime error:

Failed to perform SequenceRequest: Error Domain=com.apple.Vision Code=9 "Internal error: unexpected tracked object bounding box size" UserInfo={NSLocalizedDescription=Internal error: unexpected tracked object bounding box size}

From my investigation, I suspect the issue arises when the bounding box from the initial observation (VNDetectedObjectObservation) is too small. However, Apple's documentation doesn't clearly define the minimum bounding box size that VNTrackObjectRequest considers valid. Could someone clarify: What is the minimum acceptable bounding box width and height (normalized) that Vision's VNTrackObjectRequest expects? Is there any recommended practice or official guidance for bounding box size validation before creating a tracking request? This information would be extremely helpful to reliably avoid this internal error. Thank you!
Replies: 1 · Boosts: 0 · Views: 106 · Activity: Apr ’25
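Since the minimum isn't documented, a defensive validation step before creating the request at least avoids handing Vision a degenerate box. The 2% threshold below is an assumption to tune empirically, not an official limit:

```swift
import Vision

// Assumption: a normalized minimum side of 0.02 (2% of the frame).
// Vision's real internal limit is undocumented, so tune this value.
let minimumNormalizedSide: CGFloat = 0.02

func makeTrackingRequest(for observation: VNDetectedObjectObservation) -> VNTrackObjectRequest? {
    let box = observation.boundingBox
    guard box.width >= minimumNormalizedSide,
          box.height >= minimumNormalizedSide,
          box.minX >= 0, box.minY >= 0, box.maxX <= 1, box.maxY <= 1 else {
        return nil // too small or out of bounds to track reliably
    }
    let request = VNTrackObjectRequest(detectedObjectObservation: observation)
    request.trackingLevel = .accurate
    return request
}
```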
PHFetchOptions: Full List of Supported Predicate Keys
Hi, could anybody share the full list of supported predicate keys for PHFetchOptions? I'm aware of the list posted in the documentation: https://developer.apple.com/documentation/photos/phfetchoptions. However, I have reason to believe this is not an exhaustive list, and there also seem to be mistakes in this doc: for example, isFavorite does not work but favorite does. Through some experimentation I also found that this works: NSPredicate(format: "adjustmentFormatIdentifier == 'com.pixelmatorteam.touch.x.photo.PhotosAdjustmentData.EmbeddedSlimSidecarFileInfo.compressed'") even though adjustmentFormatIdentifier is not listed as a supported key. Are there other secret keys that you are aware of? Specifically, I want to filter a fetch result for edited items, something like NSPredicate(format: "hasAdjustments == true") (I tried this; it doesn't work). The native Photos app has such a filter, which leads me to believe there probably is a key for this. If one of the framework developers reads this: could you please update the documentation page with this information? Finally, if there really aren't any more secret keys, is there a way to achieve this with adjustmentFormatIdentifier? I have tried a bunch of things already, like adjustmentFormatIdentifier != nil, but for some reason that gives me the exact opposite of what I want: all the photos without edits. 🫠 Any tips on the correct syntax here would be much appreciated.
Replies: 1 · Boosts: 0 · Views: 123 · Activity: Jun ’25
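Not an answer on undocumented predicate keys, but a hedged client-side workaround sketch for the "edited items" filter above: check each fetched asset for an adjustment-data resource. This walks the whole fetch result, so it is O(n) and worth caching:

```swift
import Photos

// Sketch: filter a fetch result down to edited assets by looking for an
// adjustment-data resource on each one. Slow for big libraries — cache it.
func editedAssets(in fetchResult: PHFetchResult<PHAsset>) -> [PHAsset] {
    var edited: [PHAsset] = []
    fetchResult.enumerateObjects { asset, _, _ in
        let resources = PHAssetResource.assetResources(for: asset)
        if resources.contains(where: { $0.type == .adjustmentData }) {
            edited.append(asset)
        }
    }
    return edited
}
```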
NonLiDAR Photogrammetry
Hello! As of iOS 17.5, photogrammetry sessions cannot be performed on iPhones without LiDAR, but I don't think there is much difference in GPU performance between models with and without LiDAR. For example, the iPhone 14 Pro and iPhone 15 use the same A16 Bionic chip, so I believe the GPU performance is the same. Despite this, photogrammetry can be performed on the iPhone 14 Pro but not on the iPhone 15. Why is this? In fact, we have confirmed that if you transfer images taken with an iPhone 16 (no LiDAR) to an iPhone 16 Pro and run a photogrammetry session using those images, a 3D model can be generated. Also, will photogrammetry become possible on high-performance iPhones without LiDAR in the future?
Replies: 0 · Boosts: 0 · Views: 79 · Activity: Apr ’25
AVCapturePhotoOutput crashes at delegate callback on macOS 13.7.5
A functioning Multiplatform app, which includes use of Continuity Camera on an M1 Mac mini running Sequoia 15.5, works correctly capturing photos with AVCapturePhoto. However, that app (and a test app just for Continuity Camera) crashes at the delegate callback when run on a 2017 MacBook Pro under macOS 13.7.5. The app was created with Xcode 16 (various releases) using Swift 6 (also tried 5). Compiling and running the test app with Xcode 15.2 on the 13.7.5 machine also crashes at the delegate callback. The iPhone 15 Continuity Camera gets detected and set up correctly, and preview video works correctly. The crash occurs when the capture-photo code runs. The relevant capture code is:

func capturePhoto() {
    let captureSettings = AVCapturePhotoSettings()
    captureSettings.flashMode = .auto
    photoOutput.maxPhotoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: captureSettings, delegate: PhotoDelegate.shared)
    print("**** CameraManager: capturePhoto")
}

and the delegate callbacks are:

class PhotoDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    nonisolated(unsafe) static let shared = PhotoDelegate()

    // MARK: - Delegate callbacks
    func photoOutput(
        _ output: AVCapturePhotoOutput,
        didFinishProcessingPhoto photo: AVCapturePhoto,
        error: (any Error)?
    ) {
        print("**** CameraManager: didFinishProcessingPhoto")
        guard let pData = photo.fileDataRepresentation() else {
            print("**** photoOutput is empty")
            return
        }
        print("**** photoOutput data is \(pData.count) bytes")
    }

    func photoOutput(
        _ output: AVCapturePhotoOutput,
        willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings
    ) {
        print("**** CameraManager: willBeginCaptureFor")
    }

    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        print("**** CameraManager: willCaptureCapturePhotoFor")
    }
}

The significant parts of the crash report are:

Crashed Thread: 3 Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000000
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [30850]

VM Region Info: 0 is not in any region. Bytes before following region: 4296495104
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
---> __TEXT 100175000-10017f000 [ 40K] r-x/r-x SM=COW ...tinuityCamera

Thread 0:: Dispatch queue: com.apple.main-thread
0 libsystem_kernel.dylib 0x7ff803aed552 mach_msg2_trap + 10
1 libsystem_kernel.dylib 0x7ff803afb6cd mach_msg2_internal + 78
2 libsystem_kernel.dylib 0x7ff803af4584 mach_msg_overwrite + 692
3 libsystem_kernel.dylib 0x7ff803aed83a mach_msg + 19
4 CoreFoundation 0x7ff803c07f8f __CFRunLoopServiceMachPort + 145
5 CoreFoundation 0x7ff803c06a10 __CFRunLoopRun + 1365
6 CoreFoundation 0x7ff803c05e51 CFRunLoopRunSpecific + 560
7 HIToolbox 0x7ff80d694f3d RunCurrentEventLoopInMode + 292
8 HIToolbox 0x7ff80d694d4e ReceiveNextEventCommon + 657
9 HIToolbox 0x7ff80d694aa8 _BlockUntilNextEventMatchingListInModeWithFilter + 64
10 AppKit 0x7ff806ca59d8 _DPSNextEvent + 858
11 AppKit 0x7ff806ca4882 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214
12 AppKit 0x7ff806c96ef7 -[NSApplication run] + 586
13 AppKit 0x7ff806c6b111 NSApplicationMain + 817
14 SwiftUI 0x7ff90e03a9fb 0x7ff90dfb4000 + 551419
15 SwiftUI 0x7ff90f0778b4 0x7ff90dfb4000 + 17578164
16 SwiftUI 0x7ff90e9906cf 0x7ff90dfb4000 + 10340047
17 ContinuityCamera 0x10017b49e 0x100175000 + 25758
18 dyld 0x7ff8037d1418 start + 1896

Thread 1:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0

Thread 2:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0

Thread 3 Crashed:: Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
0 ??? 0x0 ???
1 AVFCapture 0x7ff82045996c StreamAsyncStillCaptureCallback + 61
2 CoreMediaIO 0x7ff813a4358f __94-[CMIOExtensionProviderHostContext captureAsyncStillImageWithStreamID:uniqueID:options:reply:]_block_invoke + 498
3 libxpc.dylib 0x7ff803875b33 _xpc_connection_reply_callout + 36
4 libxpc.dylib 0x7ff803875ab2 _xpc_connection_call_reply_async + 69
5 libdispatch.dylib 0x7ff80398b099 _dispatch_client_callout3 + 8
6 libdispatch.dylib 0x7ff8039a6795 _dispatch_mach_msg_async_reply_invoke + 387
7 libdispatch.dylib 0x7ff803991088 _dispatch_lane_serial_drain + 393
8 libdispatch.dylib 0x7ff803991d6c _dispatch_lane_invoke + 417
9 libdispatch.dylib 0x7ff80399c3fc _dispatch_workloop_worker_thread + 765
10 libsystem_pthread.dylib 0x7ff803b28c55 _pthread_wqthread + 327
11 libsystem_pthread.dylib 0x7ff803b27bbf start_wqthread + 15

Of course, the MacBook Pro is an old device, but Continuity Camera works with the installed Photo Booth app, so it's possible. Any thoughts on solving this situation would be appreciated. Regards, Michaela
Replies: 1 · Boosts: 0 · Views: 440 · Activity: 2w
Impact on iOS Due to Image Policy Changes with Android Target SDK 34
As the image access policy has changed with Android targeting SDK 34, I’m planning to update the way our app accesses photos. We are using the react-native-image-picker library to access images. On Android, the system no longer prompts the user for image access permissions, but on iOS, permission requests still appear. Since Android no longer requires explicit permissions, I’ve removed the permission request logic for Android. In this case, is it also safe to remove the permission request for iOS? In our app, photo access is only used for changing the user profile picture and attaching images when writing a post on the bulletin board. Are there any limitations or considerations for this kind of usage?
Replies: 1 · Boosts: 0 · Views: 94 · Activity: Apr ’25
[iOS 18.0] PHImageManager request image crash
I am seeing a crash on iOS 18.0 in my app due to PHImageManager.default().requestImage. The same crash is seen even while using requestImageDataAndOrientation. Not sure why this is happening only on iOS 18.0. Any help on this would be appreciated.

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000010
Exception Codes: 0x0000000000000001, 0x0000000000000010
VM Region Info: 0x10 is not in any region. Bytes before following region: 4310777840
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START --->
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler [695]
Triggered by Thread: 14

Thread 14 name: Dispatch queue: */file=33) .Generic
Thread 14 Crashed:
0 libobjc.A.dylib 0x1944d7020 objc_msgSend + 32
1 PhotoLibraryServices 0x1b080262c +[PLUniformTypeIdentifier utiWithCompactRepresentation:conformanceHint:] + 40
2 Photos 0x1b0081354 _resourceInfoFromResultDict + 1048
3 Photos 0x1b0080a7c fetchResourcesForChoosing + 584
4 Photos 0x1b0080714 ___fetchNonHintResources_block_invoke.227 + 152
5 PhotoLibraryServices 0x1b026b3c4 __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke + 48
6 CoreData 0x19f171b00 developerSubmittedBlockToNSManagedObjectContextPerform + 228
7 libdispatch.dylib 0x19eeff584 _dispatch_client_callout + 16
8 libdispatch.dylib 0x19eef5728 _dispatch_lane_barrier_sync_invoke_and_complete + 56
9 CoreData 0x19f1c2a0c -[NSManagedObjectContext performBlockAndWait:] + 308
10 PhotoLibraryServices 0x1b026cea4 -[PLManagedObjectContext _directPerformBlockAndWait:] + 144
11 PhotoLibraryServices 0x1b026cdf8 -[PLManagedObjectContext performBlockAndWait:] + 196
12 Photos 0x1b00804a4 _fetchNonHintResources + 292
13 Photos 0x1afeedb38 PHChooserListContinueEnumerating + 144
14 Photos 0x1afeed9f8 -[PHImageResourceChooser presentNextQualifyingResource] + 412
15 Photos 0x1afeed1d0 -[PHImageRequest startRequest] + 2424
16 libdispatch.dylib 0x19eee5aac _dispatch_call_block_and_release + 32
17 libdispatch.dylib 0x19eeff584 _dispatch_client_callout + 16
18 libdispatch.dylib 0x19eeee2d0 _dispatch_lane_serial_drain + 740
19 libdispatch.dylib 0x19eeeede0 _dispatch_lane_invoke + 440
20 libdispatch.dylib 0x19eef91dc _dispatch_root_queue_drain_deferred_wlh + 292
21 libdispatch.dylib 0x19eef8a60 _dispatch_workloop_worker_thread + 540
22 libsystem_pthread.dylib 0x2214d5660 _pthread_wqthread + 292
23 libsystem_pthread.dylib 0x2214d29f8 start_wqthread + 8
Replies: 6 · Boosts: 0 · Views: 143 · Activity: May ’25
PhotosPicker returns an empty result for a selected spatial video
In an Apple Vision Pro project, a picker pops up, and I want it to show only spatial videos. When one of the spatial videos is selected, the selection result comes back empty. How can I obtain the video the user selected and get the file's path and URL? The code is as follows:

PhotosPicker(selection: $selectedItem, matching: .videos) {
    Text("Choose a spatial photo or video")
}

func loadTransferable(from imageSelection: PhotosPickerItem) -> Progress {
    return imageSelection.loadTransferable(type: URL.self) { result in
        DispatchQueue.main.async {
            // guard imageSelection == self.imageSelection else { return }
            print("Successfully loaded selection: \(result)")
            switch result {
            case .success(let url?):
                self.selectSpatialVideoURL = url
                print("Got video URL: \(url)")
            case .success(nil):
                break // Handle the success case with an empty value.
            case .failure(let error):
                print("Spatial video error: \(error)")
                // Handle the failure case with the provided error.
            }
        }
    }
}
Replies: 4 · Boosts: 0 · Views: 126 · Activity: May ’25
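One hedged sketch for the empty result above: PhotosPickerItem generally can't deliver a bare URL for videos, so loading a custom Transferable backed by FileRepresentation (a pattern similar to Apple's PhotosPicker examples) is the usual route; it copies the picked file somewhere the app can keep it. The Movie type and destination path are illustrative:

```swift
import SwiftUI
import CoreTransferable
import UniformTypeIdentifiers

// Illustrative Transferable wrapper: receives the picked video file and
// copies it into the app container before the temporary file disappears.
struct Movie: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .movie) { movie in
            SentTransferredFile(movie.url)
        } importing: { received in
            let copy = URL.documentsDirectory.appendingPathComponent("picked.mov")
            try? FileManager.default.removeItem(at: copy)
            try FileManager.default.copyItem(at: received.file, to: copy)
            return Movie(url: copy)
        }
    }
}

// Usage: selectedItem?.loadTransferable(type: Movie.self) { result in ... }
```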