Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under General subtopic

Each entry below lists the post, followed by its replies, boosts, views, and latest activity.

Apple Music playlist create works but DELETE returns 401 — and MusicKit write APIs are macOS-unavailable. How to build a playlist editor on macOS?
I’m trying to build a playlist editor on macOS. I can create playlists via the Apple Music HTTP API, but DELETE always returns 401, even immediately after creation with the same tokens.

Minimal repro:

#!/usr/bin/env bash
set -euo pipefail

BASE_URL="https://api.music.apple.com/v1"
PLAYLIST_NAME="${PLAYLIST_NAME:-blah}"
: "${APPLE_MUSIC_DEV_TOKEN:?}"
: "${APPLE_MUSIC_USER_TOKEN:?}"

create_body="$(mktemp)"
delete_body="$(mktemp)"
trap 'rm -f "$create_body" "$delete_body"' EXIT

curl -sS --compressed -o "$create_body" -w "Create status: %{http_code}\n" \
  -X POST "${BASE_URL}/me/library/playlists" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json" \
  -d "{\"attributes\":{\"name\":\"${PLAYLIST_NAME}\"}}"

playlist_id="$(python3 - "$create_body" <<'PY'
import json, sys
with open(sys.argv[1], "r", encoding="utf-8") as f:
    data = json.load(f)
print(data["data"][0]["id"])
PY
)"

curl -sS --compressed -o "$delete_body" -w "Delete status: %{http_code}\n" \
  -X DELETE "${BASE_URL}/me/library/playlists/${playlist_id}" \
  -H "Authorization: Bearer ${APPLE_MUSIC_DEV_TOKEN}" \
  -H "Music-User-Token: ${APPLE_MUSIC_USER_TOKEN}" \
  -H "Content-Type: application/json"

I capture the response bodies like this:

cat "$create_body"
cat "$delete_body"

Result:

Create: 201
Delete: 401

I also checked the latest macOS SDK’s MusicKit interfaces, and MusicLibrary.createPlaylist/edit/add(to:) are marked @available(macOS, unavailable), so I can’t create or delete via MusicKit on macOS either.

Question: How can I implement a playlist editor on macOS (create/delete/modify) if:
- MusicKit write APIs are unavailable on macOS, and
- the HTTP API can create but DELETE returns 401?

Any guidance or official workaround would be hugely appreciated.
Replies: 0 · Boosts: 0 · Views: 144 · Activity: 2w
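For the playlist question above, a minimal Swift sketch of the same DELETE request via URLSession may help isolate whether the 401 is specific to the shell repro (quoting, header encoding) or to the tokens themselves. The function name and the expectation of a 2xx success status are assumptions; the endpoint and both token headers are taken directly from the post.

import Foundation

// Sketch only: issue the same DELETE that the curl repro sends.
// devToken (developer JWT), userToken (Music-User-Token), and playlistID
// (the id returned by the create call) are assumed to be obtained elsewhere.
func deleteLibraryPlaylist(playlistID: String,
                           devToken: String,
                           userToken: String) async throws -> Int {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)")!
    var request = URLRequest(url: url)
    request.httpMethod = "DELETE"
    request.setValue("Bearer \(devToken)", forHTTPHeaderField: "Authorization")
    request.setValue(userToken, forHTTPHeaderField: "Music-User-Token")

    let (body, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    if !(200...299).contains(status) {
        // A 401 body sometimes indicates which of the two tokens was rejected.
        print("Delete failed (\(status)): \(String(data: body, encoding: .utf8) ?? "")")
    }
    return status
}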
Apple Music API no longer returns standalone singles as “single” albums
I’ve been using the Apple Music API for quite a while now, and a recent change seems to have been made that is quite disruptive. On many occasions, artists release singles from an album as part of promoting that album. For recent examples, Harry Styles released “Aperture” (a single) to promote his upcoming album “Kiss All The Time. Disco, Occasionally”. Similarly, Bruno Mars released “I Just Might”, a single from the upcoming album “The Romantic”.

Previously, those would be returned by the endpoint "artists/{artist_id}/albums" with a “- Single” suffix. But it seems a recent change was made so that they only appear as playable tracks inside the album. This behavior is also evident in the Apple Music app itself: those singles no longer appear under “Singles & EPs”. Instead, they are only visible if the single becomes popular enough to be shown under “Top Songs”; otherwise one would have to know to tap on the (future) album to discover whether any singles have been released. Meanwhile, Spotify’s API returns those as singles properly, just like the Apple Music API used to.

This change must be recent, but the question is whether it’s intentional, and if so, how the API can be used from here on out to “extract” those singles and represent them.
Replies: 0 · Boosts: 1 · Views: 152 · Activity: 1w
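To make the report above concrete, here is a minimal sketch of the client-side convention that used to work: listing an artist’s albums and treating entries whose names end in “- Single” as promotional singles. The storefront path ("us"), developer-token auth, and use of JSONSerialization are illustrative assumptions; under the new behavior these entries simply no longer appear in the response, so this shows what broke rather than a workaround.

import Foundation

// Sketch of the previous workflow described in the post: fetch an artist's
// albums and pick out the entries whose names carry the "- Single" suffix.
// Storefront ("us") and auth handling are assumptions for illustration.
func fetchSingleNames(artistID: String, devToken: String) async throws -> [String] {
    let url = URL(string: "https://api.music.apple.com/v1/catalog/us/artists/\(artistID)/albums")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(devToken)", forHTTPHeaderField: "Authorization")

    let (data, _) = try await URLSession.shared.data(for: request)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any] ?? [:]
    let albums = json["data"] as? [[String: Any]] ?? []

    return albums.compactMap { album -> String? in
        guard let attributes = album["attributes"] as? [String: Any],
              let name = attributes["name"] as? String,
              name.hasSuffix("- Single") else { return nil }
        // Promotional singles used to appear as separate "albums" with this suffix.
        return name
    }
}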
AVAssetWriterInput.PixelBufferReceiver.append hangs indefinitely (suspends and never resumes)
I’ve been struggling with a very frustrating issue using the new iOS 26 Swift Concurrency APIs for video processing. My pipeline reads frames using AVAssetReader, processes them via CIContext (Lanczos upscale), and then appends the result to an AVAssetWriter using the new PixelBufferReceiver.

The problem: execution randomly stops at the await append(...) call. The task suspends and never resumes.
- It is completely unpredictable: it might hang on the very first run, or it might work fine for 4-5 runs and then hang on the next one.
- It is independent of video duration: it happens with 5-second clips just as often as with long videos.
- There is no feedback from the system: no crash, no error thrown, and CPU usage drops to zero. The thread just stays in the suspended state indefinitely.

If I manually cancel the operation and restart the VideoEngine, it usually starts working again for a few more attempts, which makes me suspect some internal resource exhaustion or a deadlock between the GPU context and the writer's input.

The code: here is a simplified version of my processing loop:

private func processVideoPipeline(
    readerOutputProvider: AVAssetReaderOutput.Provider<CMReadySampleBuffer<CMSampleBuffer.DynamicContent>>,
    pixelBufferReceiver: AVAssetWriterInput.PixelBufferReceiver,
    nominalFrameRate: Float,
    targetSize: CGSize
) async throws {
    while !Task.isCancelled, let payload = try await readerOutputProvider.next() {
        let sampleBufferInfo: (imageBuffer: CVPixelBuffer?, presentationTimeStamp: CMTime) = payload.withUnsafeSampleBuffer { sampleBuffer in
            return (sampleBuffer.imageBuffer, sampleBuffer.presentationTimeStamp)
        }

        guard let currentPixelBuffer = sampleBufferInfo.imageBuffer else {
            throw AsyncFrameProcessorError.missingImageBuffer
        }

        guard let pixelBufferPool = pixelBufferReceiver.pixelBufferPool else {
            throw NSError(domain: "PixelBufferPool", code: -1,
                          userInfo: [NSLocalizedDescriptionKey: "No pixel buffer pool available"])
        }

        let newPixelBuffer = try pixelBufferPool.makeMutablePixelBuffer()
        let newCVPixelBuffer = newPixelBuffer.withUnsafeBuffer({ $0 })

        try upscale(currentPixelBuffer,
                    outputPixelBuffer: newCVPixelBuffer,
                    targetSize: targetSize)

        let presentationTime = sampleBufferInfo.presentationTimeStamp
        try await pixelBufferReceiver.append(.init(unsafeBuffer: newCVPixelBuffer), with: presentationTime)
    }
}

Does anyone know how to fix it?
Replies: 0 · Boosts: 0 · Views: 85 · Activity: 1w
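One piece of diagnostic scaffolding that could narrow down the hang above, sketched under the assumption that the AVAssetWriter object is reachable from the loop: race the suspicious await against a timeout so a stuck append surfaces as a thrown error and the writer's status and error can be logged. The helper name and timeout value are hypothetical, and cancelling the group will not unblock whatever is stuck inside append; it only makes the stall observable.

import Foundation

struct OperationTimedOut: Error {}

// Hypothetical helper: run an async operation, but throw if it has not
// completed within `seconds`, so a silent suspension becomes an error.
func withTimeout<T: Sendable>(seconds: Double,
                              _ operation: @escaping @Sendable () async throws -> T) async throws -> T {
    try await withThrowingTaskGroup(of: T.self, returning: T.self) { group in
        group.addTask { try await operation() }
        group.addTask {
            try await Task.sleep(nanoseconds: UInt64(seconds * 1_000_000_000))
            throw OperationTimedOut()
        }
        // The first child to finish (result or timeout) decides the outcome.
        let result = try await group.next()!
        group.cancelAll()
        return result
    }
}

Wrapping the append call (for example, try await withTimeout(seconds: 10) { try await pixelBufferReceiver.append(...) }) and printing the writer's status and error when OperationTimedOut is caught would at least show whether the writer has silently entered a failed state at the moment the suspension happens.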
AVAudioEngine fails to start during FaceTime call (error 2003329396)
Is it possible to perform speech-to-text using AVAudioEngine to capture microphone input while on a FaceTime call at the same time? I tried implementing this, but whenever I attempt to start the AVAudioEngine while a FaceTime call is active, I get the following error:

“The operation couldn’t be completed. (OSStatus error 2003329396)”

I assume this might be due to microphone resource restrictions during FaceTime, but I’d like to confirm whether this limitation is at the system level or if there’s any possible workaround or entitlement that allows concurrent microphone access. Has anyone encountered this issue or found a solution?
Replies: 1 · Boosts: 1 · Views: 430 · Activity: 3d
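For the FaceTime question above, a small sketch that configures and activates the audio session explicitly before starting the engine can at least separate a session-activation failure from an engine-start failure. The category, mode, and options here are assumptions for experimentation, not a confirmed way to obtain the microphone during a call; the system may still refuse concurrent capture.

import AVFoundation

// Sketch: activate the session explicitly so the failure point is visible.
// The mixing option is an assumption; iOS may still deny mic access mid-call.
func startCaptureEngine(_ engine: AVAudioEngine) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .measurement,
                            options: [.mixWithOthers])
    try session.setActive(true, options: .notifyOthersOnDeactivation)

    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Forward the buffer to the speech recognizer
        // (e.g. an SFSpeechAudioBufferRecognitionRequest) here.
    }

    engine.prepare()
    try engine.start()   // The OSStatus in the post would surface here or from setActive.
}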
USDZ model files turn black when opened in Preview
On macOS 26.2 (Tahoe), the Preview app fails to render many USDZ models correctly. The issue appears inconsistent:
• Some USDZ files open normally in Preview
• Some USDZ files open but the viewport turns completely black (no geometry, no material, no lighting)
• All the same files open correctly when opened using “Open With → Xcode”

This was not the behavior on macOS 15.7.2, where the exact same USDZ files rendered consistently in Preview without failures.
Replies: 1 · Boosts: 0 · Views: 94 · Activity: 19h
Background Upload Extension
Hello,

We are trying to use the new Background Upload Extension to improve uploads of assets (photos, Live Photos, videos) in the background in our application.

1 - We implemented the code to upload still images. We would like to do the same for Live Photos, but we are not sure how to proceed, since a Live Photo is composed of two resources that we would like to upload in the same job so that we keep the association between them. Steps: a Live Photo is captured on the device. Our application is notified of new content in the Photo Library, and the asset is queued for upload by our application. The system calls the background upload extension and the Live Photo is prepared for upload, but we can schedule a job for only one resource (still photo or paired video), so we cannot upload both resources in a single job.

2 - Is there a way to synchronise the application and the extension so that the application does not process the same data or execute the same requests as the extension? For example: our application works with tokens, and we would like to prevent those tokens from being consumed by the application and the extension at the same time.

3 - Is there a command in Xcode or Terminal to start or stop the extension, similar to what exists for processing tasks (https://developer.apple.com/documentation/backgroundtasks/starting-and-terminating-tasks-during-development)?
Replies: 3 · Boosts: 1 · Views: 535 · Activity: 15h
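On the Live Photo part of the question above, a small sketch of gathering both resources with PHAssetResource is shown here; the function name is hypothetical, and it does not answer how to put two resources into one background upload job, which remains the open question. Tagging each scheduled job with the asset’s localIdentifier (or another shared key) is one assumed way to let the two uploads be re-paired later.

import Photos

// Sketch (assumed naming): collect the still photo and the paired video of a
// Live Photo so that each scheduled upload can carry a shared identifier and
// be re-associated after both jobs complete.
func livePhotoResources(for asset: PHAsset) -> (photo: PHAssetResource?, pairedVideo: PHAssetResource?) {
    let resources = PHAssetResource.assetResources(for: asset)
    let photo = resources.first { $0.type == .photo || $0.type == .fullSizePhoto }
    let video = resources.first { $0.type == .pairedVideo || $0.type == .fullSizePairedVideo }
    return (photo, video)
}

For the app/extension coordination in point 2, a shared app group container (for example a small state file accessed through NSFileCoordinator, or a shared keychain access group for the tokens) is the usual mechanism for this kind of mutual exclusion, though whether the Background Upload Extension offers anything more direct is part of what the post is asking.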