Hello,
I'm new to the Swift MusicKit API and am starting with the implementation in iOS 16.
I'm getting stuck on an issue where there is no background or text color associated with the Artwork object. Is this something you have to make an additional property request for, and if so, how do you do that?
var request = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: item.id)
let catalogResponse = try await request.response()
guard let firstItem = catalogResponse.items.first else {
    return
}
In this example, firstItem.artwork only contains the url and what look like incorrect max width/height values.
Here's a printout of firstItem.artwork:
Optional(Artwork(
urlFormat: "musicKit://artwork/library/5F37858D-F46B-4F12-BA67-40FA8DD63D87/{w}x{h}?at=item&fat=&id=7718670444435992305&lid=5F37858D-F46B-4F12-BA67-40FA8DD63D87&mt=music&aat=Music122/v4/37/25/f5/3725f515-249f-7b91-77bb-f479cd48201c/22UMGIM32254.rgb.jpg",
maximumWidth: 0,
maximumHeight: 0
))
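For reference, here's how I expected to read the colors, based on the optional color properties on Artwork (a sketch; whether library-backed artwork ever populates these is exactly my question):

if let artwork = firstItem.artwork {
    // These are optional properties on Artwork; for this library item they come back nil.
    print("background:", String(describing: artwork.backgroundColor))
    print("primary text:", String(describing: artwork.primaryTextColor))
    // Resolve the {w}x{h} template into a concrete URL.
    print("url:", String(describing: artwork.url(width: 300, height: 300)))
}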
Hi Apple Developer Forums,
I’m developing an iOS camera app that processes RAW captures using Core Image. I’m seeing a large “first use” performance penalty specifically when creating the CIImage from CIRAWFilter.outputImage.
What’s slow (important detail)
I’m measuring the time for:
let rawFilter = CIRAWFilter(imageData: rawData, identifierHint: hint)
let ciImage = rawFilter.outputImage
This is not CIContext.render(...) / createCGImage(...). It’s just the time to access outputImage (i.e., building the Core Image graph / RAW pipeline setup).
Observed behavior
First time accessing CIRAWFilter.outputImage: ~3 seconds
Second time (same app session, similar RAW): ~3 milliseconds
So something heavy is happening only on first use (decoder initialization, pipeline setup, shader/library compilation, caching, etc.).
Using Metal System Trace, I also noticed that during the slow first call there are many “Create MTLLibrary” events, while the second call doesn’t show this pattern.
Warm-up attempts using bundled DNG
I tried to "warm up" early (e.g., on camera screen entry) by loading a bundled DNG and accessing CIRAWFilter.outputImage before the user takes a photo:
Warm-up with a ~247 KB DNG → first real RAW outputImage cost drops to ~1.42s
Warm-up with a ~25 MB DNG → first real RAW outputImage cost drops to ~843ms
This helps, but it’s still far from the steady-state ~3ms.
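For reference, the warm-up looks roughly like this (a sketch; the bundled-DNG name and the queue choice are my own assumptions):

import CoreImage

func warmUpRAWPipeline() {
    DispatchQueue.global(qos: .utility).async {
        guard let url = Bundle.main.url(forResource: "warmup", withExtension: "dng"),
              let data = try? Data(contentsOf: url),
              let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil) else { return }
        // Accessing outputImage is what triggers the expensive first-use
        // setup (decoder init, MTLLibrary creation, etc.).
        _ = rawFilter.outputImage
    }
}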
Warm-up by capturing a real RAW (works, but concerns)
The only method that fully eliminates the delay is to trigger a real RAW capture programmatically before the user’s first photo, then use that captured rawData to warm up the CIRAWFilter.outputImage path. This brings the first user-facing capture close to the steady-state timing.
However:
In some regions, the camera shutter sound cannot be suppressed, so “hidden warm-up capture” is unacceptable UX.
I’m also unsure whether triggering a real capture without an explicit user action could raise compliance/privacy concerns, even if the image is immediately discarded and never saved/uploaded.
Questions
Is the large first-time cost of CIRAWFilter.outputImage expected (RAW pipeline initialization / shader compilation)?
Is there an Apple-recommended way to pre-initialize the Core Image RAW pipeline / Metal resources so the first outputImage is fast, without taking a real photo?
Are there any best practices (e.g. CIContext creation timing, prepareRender(...), specific options) that reliably reduce this first-use overhead for CIRAWFilter?
Attachments
Figure 1: First RAW capture with no warm-up (~3s outputImage time)
Figure 2: First RAW capture after warm-up with bundled DNG (improved but still hundreds of ms)
Thanks for any guidance or experience sharing!
I'm getting this error when I launch my application on the iPhone 14 Pro via Xcode. Everything builds OK. I'm using the AudioKit plugin and SoundpipeAudioKit.
The error starts as soon as I launch the app and repeats continuously.
I have background processing turned on, as I'd like the sounds to keep playing through headphones when the phone is locked.
I can't find anything online about this error. None of my catches are printing anything in the logs either. So I don't know if this is just something that pops up repeatedly or whether there is something fundamentally wrong.
private func setupAudioSession() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setActive(true, options: .notifyOthersOnDeactivation)
    } catch {
        errorMessage = "Failed to set up audio session: \(error.localizedDescription)"
        print(errorMessage ?? "")
    }
}
// MARK: - Background Task Handling
private func setupBackgroundTaskHandling() {
    // Handle app entering background
    notificationObservers.append(
        NotificationCenter.default.addObserver(
            forName: UIApplication.didEnterBackgroundNotification,
            object: nil,
            queue: .main,
            using: { [weak self] _ in
                // Safely unwrap self
                guard let self = self else { return }
                self.handleBackgroundTransition()
            }
        )
    )
}
I'm not sure if this is the code causing the issue. Any help would be greatly appreciated. This is the first app I'm working on.
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734).
On iOS 17 this gave the warning that the completion handler was not run on the main thread.
I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231
but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.
.task {
await MainActor.run {
let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
var nowPlayingInfo = [String: Any]()
let image = NSImage(named: "image")!
// warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
// Not on main thread here!
return image
})
nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
}
}
I'm wondering if there is an alternative method to set the now playing artwork?
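One workaround I've been experimenting with (a sketch, not a confirmed fix): create the artwork outside the main-actor context so the request handler closure doesn't inherit @MainActor isolation. makeArtwork is a hypothetical helper, and NSImage isn't Sendable, so this assumes the image is never mutated after creation.

import AppKit
import MediaPlayer

// Hypothetical helper: build the artwork off the main actor. The request
// handler may then be invoked on any thread without tripping the checker.
func makeArtwork(from image: NSImage) async -> MPMediaItemArtwork {
    await Task.detached {
        MPMediaItemArtwork(boundsSize: image.size) { _ in image }
    }.value
}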
I have an HDR10+ encoded video that plays back on the Apple Vision Pro when loaded as a .mov. But when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error.
To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac using VLC, but not on the Apple Vision Pro.
The relevant part of the m3u8 is:
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Hi guys,
I'm using ShazamKit in my iOS app and am successfully capturing the currently playing track details when using the device's (iPhone) built-in mic.
When I test with AirPods, though, my app cannot both send output through the AirPods and capture that same output with the AirPods mic for ShazamKit recognition.
I believe this must be possible, because the Shazam widget on iOS can do this.
Is it restricted in some way for third party apps?
If not, I'd appreciate some guidance on how to achieve this in Swift code.
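For context, this is roughly how I'm feeding the microphone into ShazamKit today (a condensed sketch; the session category and options are my guesses at what simultaneous AirPods output and capture would need):

import AVFAudio
import ShazamKit

let session = SHSession() // set session.delegate to receive match results
let engine = AVAudioEngine()

func startListening() throws {
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord, mode: .default,
                                 options: [.allowBluetoothA2DP])
    try audioSession.setActive(true)

    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
        // Stream mic audio into ShazamKit for matching.
        session.matchStreamingBuffer(buffer, at: time)
    }
    try engine.start()
}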
Thanks in advance.
Please consider adding the ability to programmatically download Premium and Enhanced voices. At the moment it is extremely inconvenient for our users, as they have to navigate to Settings themselves to download voices. Our app relies heavily on SpeechSynthesis integration, and it would greatly benefit from this feature.
FB16307193
In my app, I use the API provided by the Photos framework to delete specified photos.
But after upgrading to iOS 26, the delete function no longer works on some iOS devices.
The API never triggers the system confirmation dialog, and the completionHandler is never called.
In the iOS Photos app, deletion works correctly on the same assets, but calling the API from my app does not work.
Steps to Reproduce
Make sure the app has Full Photo Library Access.
Execute the following code:
PHPhotoLibrary.shared().performChanges({
    let assetsToBeDeleted = PHAsset.fetchAssets(withLocalIdentifiers: delUrls, options: nil)
    PHAssetChangeRequest.deleteAssets(assetsToBeDeleted)
}, completionHandler: completionHandler)
Expected Behavior
The system should present a confirmation dialog asking the user to delete the selected photos.
After the user confirms, the deletion should occur, and the completionHandler should be called with success or error.
Actual Behavior
The system delete confirmation dialog does not appear.
The completionHandler is never called.
Environment
iOS Versions: 26.1 / 26.0.1
It looks like an API bug.
I want to check whether this is a known issue and whether it will be fixed. Thanks.
Hello,
My company has an in-store app with FPS SDK 4.x (1024) keys. We've handed those keys over to a trusted third-party and we do not have them. We've been in-store for several years.
The person that created the keys in our organization mistakenly stored them encrypted to our third-party's PGP keys, so we cannot decrypt them, and the third party also has no mechanism to provide us with the keys even though it is in their runtime environment. They only have secure mechanisms for us to upload keys onto their servers.
We are trying to migrate to a different third-party DRM provider, and would like to obtain new keys. Unfortunately, the developer portal won't let me create new keys, saying that we have exceeded the number of keys allowed, which I assume is one.
Additionally, the new DRM provider can only support SDK 4.x keys, and it appears that we can only request SDK 5.x keys on the Apple Developer portal, as the SDK 4.0 option is grayed out. Regardless, it seems that we are not able to request any keys.
We've submitted a request to the support e-mail address and received an automated e-mail that the response should take a few days, but may take longer on occasion. It's now been a month. The e-mail says that the reply address is not monitored. Is there any way we can accelerate this?
Thank you,
Carlos
Please include the line below in follow-up emails for this request.
Case-ID: 11089799
When using AVSpeechUtterance and setting it to speak Mandarin, if Siri is set to Cantonese on iOS 18, the text is spoken in Cantonese. There is no such issue on iOS 17 or 16.
1. Create the utterance with a Mandarin voice:
let utterance = AVSpeechUtterance(string: textView.text)
let voice = AVSpeechSynthesisVoice(language: "zh-CN")
utterance.voice = voice
2. In the phone settings, set Siri to Cantonese.
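A workaround I've been testing (a sketch; it assumes at least one zh-CN voice is installed) is to pin an explicit Mandarin voice by enumerating the installed voices instead of relying on the language code alone:

import AVFAudio

// Pick a concrete Mandarin voice rather than letting the system resolve "zh-CN".
if let mandarin = AVSpeechSynthesisVoice.speechVoices()
    .first(where: { $0.language == "zh-CN" }) {
    utterance.voice = mandarin
}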
Issue:
Under certain conditions, using CallKit does not automatically enable the microphone.
Steps to Reproduce:
1. Start an outgoing call, then the user manually mutes the audio.
2. Receive a native incoming call, end the current call, then answer the new incoming call. (This order is important.)
3. End the incoming call.
4. Start another outgoing call and observe the microphone; do not manually mute or unmute.
Actual Behavior:
The audio icon indicates that the audio is unmuted, but the microphone remains off, and the small yellow dot in the top status bar (which represents the microphone) does not appear.
Expected Behavior:
The microphone should be on, consistent with the audio icon display, and the small yellow dot should appear in the top status bar.
Device:
iPhone 16 Pro & iPhone 15 Pro, iOS 18.0+
Can it be reproduced using Speakerbox (the CallKit demo)?
Yes.
I am working on an iOS application using SwiftUI where I want to convert a JPG and a MOV file into a Live Photo. I am utilizing the LivePhoto class from GitHub for this. The JPG and MOV files are displayed correctly in my WallpaperDetailView, but I am facing issues when trying to generate the Live Photo and save it to the gallery.
Here is the relevant code and the errors I am encountering:
Console prints:
Play button should be visible
Image URL fetched and set: Optional("https://firebasestorage.googleapis.com/...")
Video is ready to play
Video downloaded to: file:///var/mobile/Containers/Data/Application/.../tmp/CFNetworkDownload_7rW5ny.tmp
Failed to generate Live Photo
I have verified that the app has the necessary permissions to access the Photo Library.
The JPEG and MOV files are successfully downloaded and can be displayed in the app.
The issue seems to occur when generating the Live Photo from the downloaded files.
struct WallpaperDetailView: View {
var wallpaper: Wallpaper
@State private var isLoading = false
@State private var isImageSaved = false
@State private var imageURL: URL?
@State private var livePhotoVideoURL: URL?
@State private var player: AVPlayer?
@State private var playerViewController: AVPlayerViewController?
@State private var isVideoReady = false
@State private var showBuffering = false
var body: some View {
ZStack {
if let imageURL = imageURL {
GeometryReader { geometry in
KFImage(imageURL)
.resizable()
...
}
}
if let playerViewController = playerViewController {
VideoPlayerViewController(playerViewController: playerViewController)
.frame(maxWidth: .infinity, maxHeight: .infinity)
.clipped()
.edgesIgnoringSafeArea(.all)
}
}
.onAppear {
PHPhotoLibrary.requestAuthorization { status in
if status == .authorized {
loadImage()
} else {
print("User denied access to photo library")
}
}
}
}
private func loadImage() {
isLoading = true
if let imageURLString = wallpaper.imageURL, let imageURL = URL(string: imageURLString) {
self.imageURL = imageURL
if imageURL.scheme == "file" {
self.isLoading = false
print("Local image URL set: \(imageURL)")
} else {
fetchDownloadURL(from: imageURLString) { url in
self.imageURL = url
self.isLoading = false
print("Image URL fetched and set: \(String(describing: url))")
}
}
}
if let livePhotoVideoURLString = wallpaper.livePhotoVideoURL, let livePhotoVideoURL = URL(string: livePhotoVideoURLString) {
self.livePhotoVideoURL = livePhotoVideoURL
preloadAndPlayVideo(from: livePhotoVideoURL)
} else {
self.isLoading = false
print("No valid image or video URL")
}
}
private func preloadAndPlayVideo(from url: URL) {
self.player = AVPlayer(url: url)
let playerViewController = AVPlayerViewController()
playerViewController.player = self.player
self.playerViewController = playerViewController
let playerItem = AVPlayerItem(url: url)
playerItem.preferredForwardBufferDuration = 1.0
self.player?.replaceCurrentItem(with: playerItem)
...
print("Live Photo Video URL set: \(url)")
}
private func saveWallpaperToPhotos() {
if let imageURL = imageURL, let livePhotoVideoURL = livePhotoVideoURL {
saveLivePhotoToPhotos(imageURL: imageURL, videoURL: livePhotoVideoURL)
} else if let imageURL = imageURL {
saveImageToPhotos(url: imageURL)
}
}
private func saveImageToPhotos(url: URL) {
...
}
private func saveLivePhotoToPhotos(imageURL: URL, videoURL: URL) {
isLoading = true
downloadVideo(from: videoURL) { localVideoURL in
guard let localVideoURL = localVideoURL else {
print("Failed to download video for Live Photo")
DispatchQueue.main.async {
self.isLoading = false
}
return
}
print("Video downloaded to: \(localVideoURL)")
self.generateAndSaveLivePhoto(imageURL: imageURL, videoURL: localVideoURL)
}
}
private func generateAndSaveLivePhoto(imageURL: URL, videoURL: URL) {
LivePhoto.generate(from: imageURL, videoURL: videoURL, progress: { percent in
print("Progress: \(percent)")
}, completion: { livePhoto, resources in
guard let resources = resources else {
print("Failed to generate Live Photo")
DispatchQueue.main.async {
self.isLoading = false
}
return
}
print("Live Photo generated with resources: \(resources)")
self.saveLivePhotoToLibrary(resources: resources)
})
}
private func saveLivePhotoToLibrary(resources: LivePhoto.LivePhotoResources) {
LivePhoto.saveToLibrary(resources) { success in
DispatchQueue.main.async {
if success {
self.isImageSaved = true
print("Live Photo saved successfully")
} else {
print("Failed to save Live Photo")
}
self.isLoading = false
}
}
}
private func fetchDownloadURL(from gsURL: String, completion: @escaping (URL?) -> Void) {
let storageRef = Storage.storage().reference(forURL: gsURL)
storageRef.downloadURL { url, error in
if let error = error {
print("Failed to fetch image URL: \(error)")
completion(nil)
} else {
completion(url)
}
}
}
private func downloadVideo(from url: URL, completion: @escaping (URL?) -> Void) {
let task = URLSession.shared.downloadTask(with: url) { localURL, response, error in
guard let localURL = localURL, error == nil else {
print("Failed to download video: \(String(describing: error))")
completion(nil)
return
}
completion(localURL)
}
task.resume()
}
}
Hello,
I'm developing an app that displays a photo library using UICollectionView and PHCachingImageManager. I'd like to achieve a user experience similar to the native iOS Photos app, where low-quality images are shown quickly while scrolling, and higher-quality images are loaded for visible cells once scrolling stops.
I'm currently using the following approach:
While Scrolling: I'm using the UICollectionViewDataSourcePrefetching protocol. In the prefetchItemsAt method, I call startCachingImages with low-quality options to cache images in advance.
After Scrolling Stops: In the scrollViewDidEndDecelerating method, I intend to load high-quality images for the currently visible cells.
I have a few questions regarding this approach:
What is the best practice for managing both low-quality and high-quality images efficiently with PHCachingImageManager? Is it correct to call startCachingImages with fastFormat options and then call it again with highQualityFormat in scrollViewDidEndDecelerating?
How can I minimize the delay when a low-quality image is replaced by a high-quality one? Are there any additional strategies to help pre-load high-quality images more effectively?
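For what it's worth, here's the shape of what I'm doing now (a sketch; the sizes, options, and the two-pass split are my current guesses at best practice):

import Photos
import UIKit

let cachingManager = PHCachingImageManager()

// While scrolling / in prefetchItemsAt: cache cheap thumbnails ahead of time.
func prefetch(_ assets: [PHAsset], thumbnailSize: CGSize) {
    let fast = PHImageRequestOptions()
    fast.deliveryMode = .fastFormat
    fast.resizeMode = .fast
    cachingManager.startCachingImages(for: assets,
                                      targetSize: thumbnailSize,
                                      contentMode: .aspectFill,
                                      options: fast)
}

// In scrollViewDidEndDecelerating: request full quality for the visible cells.
func loadHighQuality(for asset: PHAsset, targetSize: CGSize,
                     completion: @escaping (UIImage?) -> Void) {
    let hq = PHImageRequestOptions()
    hq.deliveryMode = .highQualityFormat
    hq.isNetworkAccessAllowed = true
    cachingManager.requestImage(for: asset,
                                targetSize: targetSize,
                                contentMode: .aspectFill,
                                options: hq) { image, _ in
        completion(image)
    }
}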
I noticed that AVSampleBufferDisplayLayerContentLayer is not released when the AVSampleBufferDisplayLayer is removed and released.
It is possible to reproduce the issue with the simple code:
import AVFoundation
import UIKit

class ViewController: UIViewController {
    var displayBufferLayer: AVSampleBufferDisplayLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        let displayBufferLayer = AVSampleBufferDisplayLayer()
        displayBufferLayer.videoGravity = .resizeAspectFill
        displayBufferLayer.frame = view.bounds
        view.layer.insertSublayer(displayBufferLayer, at: 0)
        self.displayBufferLayer = displayBufferLayer

        // Remove and release the layer shortly after adding it. The backing
        // AVSampleBufferDisplayLayerContentLayer should be released with it,
        // but it remains in memory.
        DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
            self.displayBufferLayer?.flush()
            self.displayBufferLayer?.removeFromSuperlayer()
            self.displayBufferLayer = nil
        }
    }
}
In my real project I have multiple AVSampleBufferDisplayLayers created and removed in different view controllers. This is problematic because the number of leaked AVSampleBufferDisplayLayerContentLayers keeps increasing.
I wonder whether I should use a pool of AVSampleBufferDisplayLayers and reuse them; however, I'm slightly afraid that this could also lead to strange bugs.
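The pool I have in mind would look roughly like this (a sketch; whether reuse across view controllers is actually safe is exactly what I'm unsure about):

import AVFoundation

final class DisplayLayerPool {
    private var free: [AVSampleBufferDisplayLayer] = []

    // Hand out a recycled layer if one is available, otherwise allocate.
    func acquire() -> AVSampleBufferDisplayLayer {
        free.popLast() ?? AVSampleBufferDisplayLayer()
    }

    // Detach the layer and keep it around for reuse instead of releasing it.
    func release(_ layer: AVSampleBufferDisplayLayer) {
        layer.flush()
        layer.removeFromSuperlayer()
        free.append(layer)
    }
}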
Edit: It doesn't cause leaks on an iOS 18 device, but it does leak on an iPad Pro running iOS 17.5.1.
I'm getting an interesting error attempting to compile my app in Xcode 26 beta.
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
Not sure what to try in order to fix this issue.
I am developing an app that plays HLS audio.
When using AVPlayerItem with AVURLAsset, can AVAssetResourceLoaderDelegate correctly handle HLS segments?
My goal is to use AVAssetResourceLoaderDelegate to add authentication HTTP headers when accessing HLS .m3u8 and .ts files.
I can successfully download the files, but playback fails with errors.
Specifically, I am observing the following cases:
A. AVAssetResourceLoaderDelegate is canceled, and CoreMediaErrorDomain -12881 occurs
In NSURLConnectionDataDelegate’s didReceiveResponse method, set contentInformationRequest
In didReceiveData, call dataRequest respondWithData
resourceLoader didCancelLoadingRequest is called
CoreMediaErrorDomain -12881 occurs
B. CoreMediaErrorDomain -12881 occurs
In NSURLConnectionDataDelegate’s didReceiveResponse method, set contentInformationRequest
In connection didReceiveData, buffer all received data until the end
In connectionDidFinishLoading, pass the buffered data to respondWithData
Call loadingRequest finishLoading
CoreMediaErrorDomain -12881 occurs
In both cases, dataRequest.requestsAllDataToEndOfResource is YES.
For this use case, I am not using AVURLAssetHTTPHeaderFieldsKey because I need to apply the most up-to-date authentication data at the moment each file is accessed.
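For reference, here is my delegate flow in Swift form (a condensed sketch of what both cases A and B do; the custom-scheme rewrite, the header value, and the UTType mapping are details of my setup, not verbatim code):

import AVFoundation
import UniformTypeIdentifiers

final class AuthResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url,
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false)
        else { return false }
        components.scheme = "https" // undo the custom scheme that routes requests to this delegate
        guard let realURL = components.url else { return false }

        var request = URLRequest(url: realURL)
        // Apply the most up-to-date authentication data at access time.
        request.setValue("Bearer <token>", forHTTPHeaderField: "Authorization")

        URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                loadingRequest.finishLoading(with: error)
                return
            }
            if let response = response, let info = loadingRequest.contentInformationRequest {
                // Note: contentType expects a UTI, not a raw MIME type.
                if let mime = response.mimeType, let uti = UTType(mimeType: mime) {
                    info.contentType = uti.identifier
                }
                info.contentLength = response.expectedContentLength
                info.isByteRangeAccessSupported = true
            }
            if let data = data {
                loadingRequest.dataRequest?.respond(with: data)
            }
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}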
I would appreciate any advice or suggestions you might have. Thank you in advance!
Is there any way we can detect the status of the Show When Muted and Show on Skip Back device settings in code?
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows "Motion not available" message.
Here's my approach for creating Live Photos:
// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30fps, 2 seconds
// Added still-image-time metadata at middle frame
// 2. Create HEIC image with asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier // Required key for Live Photo
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict
// 3. Generate Live Photo
PHLivePhoto.request(
withResourceFileURLs: [photoURL, videoURL],
placeholderImage: nil,
targetSize: .zero,
contentMode: .aspectFit
) { livePhoto, info in
// Success - Live Photo created
}
// 4. Save to Photos library
PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: photoURL, options: nil)
PHAssetCreationRequest.forAsset().addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
What I've Tried
Matching exact video specifications from Camera app (882x1920, H.264, 30fps)
Adding all documented metadata (content identifier, still-image-time)
Testing various video durations (1.5s, 2s, 3s)
Different image formats (HEIC, JPEG)
Comparing with exiftool against working Live Photos
Expected Behavior
Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.
Actual Behavior
System shows "Motion not available" and only allows setting as static wallpaper.
Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.
Questions
Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
Is this a deliberate restriction for third-party apps, or a bug?
Has anyone successfully created Live Photos that work as motion wallpapers?
Environment
iOS 17.0 - 18.1
Xcode 16.0
Tested on iPhone 16 Pro
Hi there,
We're working on offline playback of DRM tracks. The persistent keys (also known as track licenses) for offline playback are stored locally on the device and are served from cache when a user initiates playback of a downloaded track.
Our persistent keys have a limited validity time and need to be refreshed when they expire. To prevent a situation where a persistent key expires while the user is offline, we've decided to eagerly refresh these keys one week before their expiration date. To make that happen we need to be able to obtain the expiration date of the given track license.
We've been attempting to use the makeSecureTokenForExpirationDateOfPersistableContentKey API to facilitate this process. The documentation states that this API returns a secret token representing the persistent key, which we can then exchange with our license server for the expiration date: https://developer.apple.com/documentation/avfoundation/avcontentkeysession/makesecuretokenforexpirationdate(ofpersistablecontentkey:completionhandler:)?language=objc
However, every time we call makeSecureTokenForExpirationDateOfPersistableContentKey, we receive an error with code -46250. We haven't been able to find any public references or documentation for this specific error code, which is preventing us from troubleshooting the issue. We are conducting our tests on a physical device, as the simulator does not support FairPlay playback. We don't use dual expiry approach.
Is our understanding of how to obtain the expiration timestamp correct? Are we using the makeSecureTokenForExpirationDateOfPersistableContentKey API as it was intended? What does the -46250 error code mean, and what steps should we take to fix our FairPlay implementation to make this work?
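For completeness, this is how we invoke the API (a sketch; keySession, storedPersistentKeyData, and sendToLicenseServer come from our own key-management layer and are placeholders):

// keySession: AVContentKeySession configured for FairPlay
// storedPersistentKeyData: the persistable content key we saved at download time
keySession.makeSecureToken(forExpirationDateOfPersistableContentKey: storedPersistentKeyData) { token, error in
    if let error = error as NSError? {
        // We consistently land here with code -46250.
        print("secure token error: \(error.domain) \(error.code)")
        return
    }
    guard let token = token else { return }
    // Exchange the token with our license server for the expiration date.
    sendToLicenseServer(token) // hypothetical helper
}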
Thanks in advance for your assistance.
We are encountering a critical, intermittently occurring crash issue when accessing photo data using PHAssetResourceManager.writeDataForAssetResource on iOS 18. The problem does not arise on iOS 17 or earlier versions.
We have been unable to identify a consistent reproduction path. Based on user feedback, the issue seems to involve Live Photo and Raw image files.
Our investigation has revealed that the crash occurs in the +[PISchema identifier] method of the PhotoImaging Framework. When called manually, this method causes a crash on iOS 18 but works without issues on iOS 17.
Reproduction Steps:
1. Fetch the PHAsset.
2. Get the PHAssetResource via [PHAssetResource assetResourcesForAsset:].
3. Call [PHAssetResourceManager writeDataForAssetResource:toFile:options:completionHandler:].
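In Swift terms, the call that crashes looks roughly like this (a sketch; asset and destinationURL are placeholders):

import Photos

let resources = PHAssetResource.assetResources(for: asset)
if let resource = resources.first {
    PHAssetResourceManager.default().writeData(for: resource,
                                               toFile: destinationURL,
                                               options: nil) { error in
        // On iOS 18 the process can abort inside +[PISchema identifier]
        // before this completion handler ever runs.
        print("write finished, error: \(String(describing: error))")
    }
}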
Crash Log:
Incident Identifier: CFD60092-FDB1-43B4-BA42-3F507F7B8B96
CrashReporter Key: 260b4780989083a54e0cb451930fe9a3bed64862
Hardware Model: iPhone13,4
AppStoreTools: 16C5031b
AppVariant: 1:iPhone13,4:18
Code Type: ARM-64 (Native)
Role: Foreground
Parent Process: launchd [1]
Date/Time: 2025-02-15 19:07:57.7054 +0800
Launch Time: 2025-02-15 19:07:55.4106 +0800
OS Version: iPhone OS 18.3.1 (22D72)
Release Type: User
Baseband Version: 5.20.03
Report Version: 104
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: mCloud_iPhone [11109]
Triggered by Thread: 11
Application Specific Information:
abort() called
Thread 11 name: Dispatch queue: com.apple.NSXPCConnection.m-user.com.apple.photos.service
Thread 11 Crashed:
0 libsystem_kernel.dylib 0x1e850b2d4 __pthread_kill + 8
1 libsystem_pthread.dylib 0x221b4959c pthread_kill + 268
2 libsystem_c.dylib 0x19ec24b08 abort + 128
3 NeutrinoCore 0x1bdcdbdec -[NUAssertionPolicyAbort notifyAssertion:] + 68
4 NeutrinoCore 0x1bdcdbbf4 -[NUAssertionPolicyComposite notifyAssertion:] + 160
5 NeutrinoCore 0x1bdcdc098 -[NUAssertionPolicyUnique notifyAssertion:] + 176
6 NeutrinoCore 0x1bdcdb524 -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7 NeutrinoCore 0x1bdcdc4bc _NUAssertFailHandler + 176
8 NeutrinoCore 0x1bdc8ea98 -[NUIdentifier initWithNamespace:name:version:] + 2352
9 NeutrinoCore 0x1bdc8eba8 -[NUIdentifier initWithName:version:] + 84
10 NeutrinoCore 0x1bdc8ec10 -[NUIdentifier initWithName:] + 68
11 PhotoImaging 0x1bda54ce4 +[PISchema identifier] + 36
12 PhotoImaging 0x1bda550fc +[PISchema registeredPhotosSchemaIdentifier] + 32
13 PhotoImaging 0x1bd9d7128 +[PIPhotoEditHelper newComposition] + 28
14 PhotoImaging 0x1bd940798 +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15 PhotoImaging 0x1bd9412ec +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16 PhotoLibraryServices 0x1afabf75c -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1856
17 PhotoLibraryServices 0x1afabffe4 +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18 Photos 0x1af4ac360 __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19 Photos 0x1af4ac67c -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20 Photos 0x1af4ac4b0 -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21 Photos 0x1af4abb8c -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 404
22 Photos 0x1af4a911c -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23 Photos 0x1af2c1d10 -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24 Photos 0x1af2c11e8 -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 88
25 Photos 0x1af505184 -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26 Photos 0x1af5050a0 __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27 PhotoLibraryServicesCore 0x1b001be8c __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.86 + 864
28 CoreFoundation 0x196dd8e34 __invoking___ + 148
29 CoreFoundation 0x196dd7e7c -[NSInvocation invoke] + 428
30 Foundation 0x195a64ae0 __NSXPCCONNECTION_IS_CALLING_OUT_TO_EXPORTED_OBJECT__ + 16
31 Foundation 0x195a63514 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 532
32 Foundation 0x195a6653c __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33 libxpc.dylib 0x221babb80 _xpc_connection_reply_callout + 116
34 libxpc.dylib 0x221b9e2d0 _xpc_connection_call_reply_async + 80
35 libdispatch.dylib 0x19eb6b028 _dispatch_client_callout3 + 20
36 libdispatch.dylib 0x19eb88b64 _dispatch_mach_msg_async_reply_invoke + 340
37 libdispatch.dylib 0x19eb7242c _dispatch_lane_serial_drain + 352
38 libdispatch.dylib 0x19eb73158 _dispatch_lane_invoke + 432
39 libdispatch.dylib 0x19eb7e38c _dispatch_root_queue_drain_deferred_wlh + 288
40 libdispatch.dylib 0x19eb7dbd8 _dispatch_workloop_worker_thread + 540
41 libsystem_pthread.dylib 0x221b44680 _pthread_wqthread + 288
42 libsystem_pthread.dylib 0x221b42474 start_wqthread + 8