Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic (all subtopics). Each post below is followed by its replies, boosts, views, and latest activity.

Uploading PhotoKit Resources in the Background
PHBackgroundResourceUploadExtension is a welcome addition in iOS 26.1. However, I can't figure out how to attach a debugger and actually get the system to call the extension's process() method. I tried running the extension both inside the Photos app (and also inside the main app for testing), but when I take a photo or save photos to the library, process() never gets called. Any hints on debugging a PHBackgroundResourceUploadExtension during development would be appreciated.
0 replies · 1 boost · 154 views · Nov ’25
Audio session activation occasionally fails from CarPlay
I'm working on adding CarPlay support to an audio app and am running into an issue. Occasionally, when a user opens the app from CarPlay while the main app scene is either not connected or currently in the background, I receive an error when attempting to activate the audio session. The code below mimics my setup:

    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .spokenAudio)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print(error) // NSOSStatusErrorDomain - 560557684: Session activation failed
    }

That error code maps to AVAudioSession.ErrorCode.cannotInterruptOthers. Once in this state, all subsequent attempts to play different pieces of content fail. However, things start working normally if the user opens the app on their phone and tries again from CarPlay (while the app is in the foreground on the phone). I'm not sure why it behaves this way, and I'll note that I do have the audio background mode capability enabled. Has anyone else encountered this? Are there any workarounds or changes I could make to prevent it from happening?
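A hedged workaround sketch, not a confirmed fix: detect the cannotInterruptOthers case and retry activation shortly afterward rather than failing playback outright. The error-code comparison assumes the NSError code matches AVAudioSession.ErrorCode's raw value, as the log above suggests; the retry delay is arbitrary.

    import AVFoundation

    func activatePlaybackSession() {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playback, mode: .spokenAudio)
            try session.setActive(true)
        } catch let error as NSError
            where error.code == AVAudioSession.ErrorCode.cannotInterruptOthers.rawValue {
            // The session could not interrupt another session; defer and retry
            // once, instead of leaving the player in a dead state.
            DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
                try? session.setActive(true)
            }
        } catch {
            print("Audio session activation failed: \(error)")
        }
    }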
0 replies · 1 boost · 161 views · Apr ’25
AVPlayer HLS High Bitrate Problem on Apple TV HD (A1625) tvOS 26
Hello, we have a video streaming app that serves HLS VOD content in 1080p and 4K. Users could watch 1080p content before tvOS 26, but after updating to tvOS 26 they can no longer play it. We have not changed anything on the HLS playlist side or in the application version. The problem only occurs on the Apple TV HD (4th generation, A1625) running tvOS 26; there is no problem on newer Apple TV devices. Would you help us resolve this? Thanks in advance.
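A hedged diagnostic sketch while waiting for an answer: the player item's access and error logs usually show which variant was selected and what the pipeline reported when it failed. Nothing here is specific to tvOS 26; it is just the standard AVPlayerItem logging surface.

    import AVFoundation

    // After playback fails on the A1625, dump the item's logs to see which
    // bitrate tier was chosen and what errors were reported.
    func dumpLogs(for item: AVPlayerItem) {
        if let access = item.accessLog() {
            for event in access.events {
                print("variant bitrate:", event.indicatedBitrate,
                      "stalls:", event.numberOfStalls)
            }
        }
        if let errors = item.errorLog() {
            for event in errors.events {
                print("error:", event.errorStatusCode, event.errorComment ?? "")
            }
        }
    }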
2 replies · 1 boost · 418 views · Oct ’25
Regarding FPS SDK Upgrade
Hi Apple Team, we have had FairPlay Streaming Server SDK v3 integrated into our MDRM platform since 2017; the system has been stable and untouched. As you know, both Widevine and PlayReady require regular server SDK upgrades. Does Apple impose similar requirements for upgrading the FPS SDK, or may we continue using the old version without any updates? Thanks for your support!
0 replies · 1 boost · 366 views · Feb ’25
Accessing External Timecode from Blackmagic ProDock in Custom App
Hi everyone, I’m exploring using the iPhone 17 Pro with the Blackmagic ProDock in a custom capture app. The genlock functionality seems accessible via AVExternalSyncDevice and related APIs, which is great. I’m specifically curious about external timecode coming in from the ProDock:

• Is there a public way to access the timecode feed in a custom app via AVFoundation or another Apple API?
• If so, what is the recommended approach to read or apply that timecode during capture?
• Are there any current limitations or entitlements required to access timecode from the ProDock in a third-party app?

I’m excited to start integrating synchronized capture in my app, and any guidance or sample patterns would be greatly appreciated. Thanks in advance! — [Artem]
2 replies · 0 boosts · 265 views · Oct ’25
Graceful shutdown during background audio playback.
Hello. My team and I believe our app is being asked to shut down gracefully, followed by a SIGTERM. As we’ve learned, this is normally not an issue. However, it also seems to be happening while our app (an audio streamer) is actively playing in the background. From our perspective, starting playback indicates strong user intent. We understand there can be extreme circumstances where background audio needs to be killed, but should that be considered part of normal operation? We hope not. All we see in the logs is the graceful shutdown request, but we can say with high certainty that it's happening, since we know playback was running within 0.5 seconds of the crash, without any other tracked user interaction. Can you verify whether this is intended behavior, and whether there's something we can do about it on our end? From our logs it doesn't look related to memory usage, either within the app or the system as a whole. Best, John
0 replies · 1 boost · 101 views · Jun ’25
Failure of AudioUnitSetProperty when using Mac Catalyst (works on macOS)
I was trying to set a custom audio output device for generated audio on Mac Catalyst, using:

    let status = AudioUnitSetProperty(outputUnit,
                                      kAudioOutputUnitProperty_CurrentDevice,
                                      kAudioUnitScope_Global,
                                      0,
                                      &outputDeviceID,
                                      UInt32(MemoryLayout<AudioDeviceID>.size))

Under Catalyst, kAudioOutputUnitProperty_CurrentDevice is rejected and status is -10879 (kAudioUnitErr_InvalidProperty).

STEPS TO REPRODUCE
1. Set the run destination to macOS and run the program. "AudioUnitSetProperty: 0" is printed, indicating it works fine.
2. Set the run destination to Mac Catalyst and run the program. "Error setting output device: -10879" is printed, indicating an error.
4 replies · 1 boost · 661 views · Mar ’25
CoreMediaErrorDomain -42709 error
Hello, I am developing a video streaming service that uses FairPlay. Since around February 20th, we have been receiving reports of CoreMediaErrorDomain -42709 errors. Unfortunately, there is no Apple documentation explaining what this error means, so we are not sure how to address or fix the issue. Most of the users reporting the error are on iOS 18.2.1 or iOS 18.3.1. Could you please advise what we should check, or how we might resolve this error?
1 reply · 1 boost · 552 views · Mar ’25
iOS Radio App: Need to extract and stream audio-only from HLS streams with video content
I'm developing an iOS radio app that plays various HLS streams. The challenge is that some stations broadcast HLS streams containing both audio and video (example: https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8), but I want to:

• Extract and play only the audio track
• Support AirPlay for audio-only streaming
• Minimize data usage by not downloading video content

Technical details: iOS 17+, Swift 5.9, AVFoundation for playback; the current implementation uses AVPlayer with AVPlayerItem.

Current code structure:

    class StreamPlayer: ObservableObject {
        @Published var isPlaying = false
        private var player: AVPlayer?
        private var playerItem: AVPlayerItem?

        func playStream(url: URL) {
            let asset = AVURLAsset(url: url)
            playerItem = AVPlayerItem(asset: asset)
            player = AVPlayer(playerItem: playerItem)
            player?.play()
        }
    }

Stream analysis, from FFmpeg:

    Input #0, hls, from 'https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8':
      Stream #0:0: Video: h264, yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps
      Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp

Attempted solutions. First, using MobileFFmpeg to re-stream the audio over UDP:

    let command = [
        "-i", streamUrl,
        "-vn",
        "-acodec", "aac",
        "-ac", "2",
        "-ar", "44100",
        "-b:a", "128k",
        "-f", "mpegts",
        "udp://127.0.0.1:12345"
    ].joined(separator: " ")
    ffmpegProcess = MobileFFmpeg.execute(command)

Issue: while FFmpeg successfully extracts the audio, playback through AVPlayer doesn't work reliably. Second, using HLS output:

    let command = [
        "-i", streamUrl,
        "-vn",
        "-acodec", "aac",
        "-ac", "2",
        "-ar", "44100",
        "-b:a", "128k",
        "-f", "hls",
        "-hls_time", "2",
        "-hls_list_size", "3",
        outputUrl.path
    ]

Issue: this creates temporary files but faces synchronization issues with live streams.

Requirements:
• Real-time audio extraction from the HLS stream
• Maintained live-streaming capability
• Full AirPlay support
• Minimal data usage (avoid downloading video content)
• Graceful handling of network interruptions

Questions:
1. What's the most efficient way to extract only audio from an HLS stream in real time?
2. Is there a way to tell AVPlayer to ignore video tracks completely?
3. Are there better alternatives to FFmpeg for this specific use case?
4. What's the recommended approach for handling AirPlay with modified streams?

Any guidance or alternative approaches would be greatly appreciated. Thank you!
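On question 2 above, a hedged sketch: AVPlayer has no switch to drop video outright, but when a playlist advertises a low-bandwidth or audio-only rendition, capping preferredPeakBitRate steers variant selection toward it. Whether this particular stream offers such a rendition is an assumption, and the cap value is a guess to tune.

    import AVFoundation

    // Assumption: the master playlist includes a rendition below the cap.
    // Keeping playback inside AVPlayer also preserves AirPlay support.
    func playAudioLeaning(url: URL) -> AVPlayer {
        let item = AVPlayerItem(url: url)
        item.preferredPeakBitRate = 200_000 // bits/s; set below the video tiers
        let player = AVPlayer(playerItem: item)
        player.play()
        return player
    }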
1 reply · 1 boost · 490 views · Dec ’24
iOS 26.0 (23A5276f) - Bluetooth Call Audio Broken (AirPods + Car)
I’m experiencing a Bluetooth audio issue on iOS 26.0 (build 23A5276f): I cannot make or receive phone calls properly using Bluetooth devices. This affects both my car’s Bluetooth system and my AirPods Pro (2nd generation). Notably:

• Regular phone calls have no audio (either I can’t hear the other person, or they can’t hear me).
• WhatsApp and other VoIP apps work fine with the same Bluetooth devices.
• Media playback (music, video, etc.) works without issues over Bluetooth.

It seems this bug is limited to the native Phone app, or to the system audio routing for regular cellular calls. Please advise if this is a known issue or if a fix is expected in upcoming beta releases.
1 reply · 1 boost · 283 views · Jun ’25
Creating RTP-MIDI Sessions via MIDINetworkSession C API (dlopen/dlsym) on macOS 15?
I’m an amateur developer working on a free utility for composers/producers. The macOS release needs to create and name RTP-MIDI sessions in Audio MIDI Setup from the command line (so I can ship a small C helper instead of telling users to click through the UI). Here’s what I’ve tried so far, without luck:

• Plist hacks: injecting entries into ~/Library/Audio/MIDI Configurations/*.mcfg works when AMS is closed, but AMS immediately locks and reverts my changes when it’s open.
• CoreMIDI C API: I can create virtual ports with MIDISourceCreate, but attempting MIDIObjectGetDataProperty on the apple.midirtp.session plugin always returns err -10836.
• Obj-C & Swift: loading MIDINetworkSession and calling defaultSession, init, setNetworkName: and setting enabled = YES doesn’t produce a new session object in the Network panel.
• dlopen/dlsym: I extracted the real CoreMIDI binary out of the dyld shared cache and tried binding _MIDINetworkSessionCreate, _SetName, _SetEnabled, etc., but all the symbols come back null or my tool segfaults.
• Plugin registration: I’ve pulled the factory UUID (70C9C5EA-7C65-11D8-B317-000393A34B5A) from /System/Library/Extensions/AppleMIDIRTPDriver.plugin/Contents/Info.plist and called CFPlugInRegisterFactories, but it still never exposes the session-creation calls.

At this point I’m convinced I’m either loading the wrong binary or missing one critical step in registering the RTP-MIDI plugin’s private API. Can anyone point me to:

• The exact path of the dylib or bundle that actually exports the MIDINetworkSessionCreate/MIDINetworkSessionSetName/MIDINetworkSessionSetEnabled symbols?
• A minimal working snippet (C or Obj-C) that reliably creates and names a Network-MIDI session?

Any pointers, sample code, or even ideas about where Apple hides this functionality on macOS 15 would be hugely appreciated. Thanks!
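For reference, a minimal sketch of the documented Swift route the third bullet describes; whether MIDINetworkSession honors this from a command-line process on macOS 15 is exactly the open question, so treat it as a baseline to compare against rather than a fix. Setting connectionPolicy is the one step not mentioned above.

    import CoreMIDI

    let session = MIDINetworkSession.default()
    session.isEnabled = true
    // Without a permissive connection policy, the session can exist but
    // never accept peers, which may look like "no session" in AMS.
    session.connectionPolicy = .anyone
    print("enabled:", session.isEnabled, "port:", session.networkPort)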
0 replies · 1 boost · 104 views · Jun ’25
Audio always comes from the most recently connected external USB mic
Hello! I have two mics connected to a USB hub, and the hub is connected to my iPad. Both mics appear in the audio session's list of available inputs. The problem is that regardless of which mic I select in my app (using setPreferredInput() on the audio session), the audio keeps coming from the mic that was most recently connected to the hub. Does anyone know if this is a limitation of iPadOS/iOS?
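For context, a minimal sketch of the selection path described above; the port name is a placeholder. This is the standard AVAudioSession route, so if audio still follows the most recently attached device, that points at the route-selection layer rather than the calling code.

    import AVFoundation

    func preferUSBMic(named name: String) {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playAndRecord)
            try session.setActive(true)
            // Pick a specific USB input out of the available ports.
            let usbInputs = session.availableInputs?.filter { $0.portType == .usbAudio } ?? []
            if let chosen = usbInputs.first(where: { $0.portName.contains(name) }) {
                try session.setPreferredInput(chosen)
            }
        } catch {
            print("Input selection failed: \(error)")
        }
    }

    preferUSBMic(named: "Mic A") // placeholder name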
1 reply · 1 boost · 203 views · Jul ’25
VideoToolbox Encoder's Unregistered User Data SEI NAL UUID
I was advised to post here by a Code-Level Support representative. Below is a copy of my initial issue report; my minimal reproduction project can be found at the following GitHub repository: https://github.com/PierceLBrooks/vtUudSeiNalCmake

DESCRIPTION OF PROBLEM
When encoding H.264 video data using the VTCompressionSession API available through the VideoToolbox framework on macOS, the resulting bitstream invariably includes Unregistered User Data SEI NAL units carrying the UUID "47564adc-5c4c-433f-94ef-c5113cd143a8". The proprietary decoders we are working with currently struggle to filter out these NAL units. Can you explain what purpose they serve, what the byte-wise unit payloads mean, and which configuration settings of the VideoToolbox encoder instance trigger their insertion?

STEPS TO REPRODUCE
1. Instantiate a new VideoToolbox H.264 encoder by calling VTCompressionSessionCreate with appropriate configuration flags.
2. Push frames through the encoder, receiving their encoded byte buffers through an asynchronous callback.
3. Write that encoded data to a buffer that will contain the totality of the encoder's output.
4. Inspect the NAL units at the start of this output bitstream.
5. Observe at least one Unregistered User Data SEI NAL unit carrying the "47564adc-5c4c-433f-94ef-c5113cd143a8" UUID near the beginning of the output segment.
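Until there's an official answer, a hedged client-side sketch for the decoder team: detect SEI NAL units (type 6) whose payload contains that 16-byte UUID and drop them before handing the stream to the decoder. This assumes Annex B framing and unfragmented SEI payloads; it is a workaround sketch, not an explanation of the units.

    import Foundation

    // 47564adc-5c4c-433f-94ef-c5113cd143a8 as raw bytes.
    let vtSEIUUID = Data([0x47, 0x56, 0x4a, 0xdc, 0x5c, 0x4c, 0x43, 0x3f,
                          0x94, 0xef, 0xc5, 0x11, 0x3c, 0xd1, 0x43, 0xa8])

    // True for an H.264 NAL unit (start code already stripped) that is an
    // SEI (nal_unit_type == 6) containing the UUID anywhere in its payload.
    func isVideoToolboxUserDataSEI(_ nal: Data) -> Bool {
        guard let header = nal.first, header & 0x1F == 6 else { return false }
        return nal.range(of: vtSEIUUID) != nil
    }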
1 reply · 2 boosts · 198 views · Apr ’25
Unable to play audio via MusicKit
Hey folks, I'm suddenly running into an odd issue with an app that previously had a working MusicKit integration. I'm using ApplicationMusicPlayer to play Apple Music albums and songs. I'm testing on a physical device, signed in to an Apple ID, with a valid subscription; Apple Music via the first-party app works entirely fine on this device. Attempting to play back any content at all logs:

    <ICUserIdentityStoreACAccountBackend: 0x1070bf3e0> Failed to initialize primary apple account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
    [ICUserIdentityStore] - initializing account histories with activeAccountDSID = nil, activeLockerAccountDSID = nil, timestamp = 14605951908
    [ICUserIdentityStore] Failed to fetch local store account with error: Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}

The album artwork, track names, etc. all appear in the Control Center playback controls, but the music doesn't play. Trying to trigger playback from Control Center just skips to the next track, which doesn't play either. This exact code used to work. I have the MusicKit service selected in App Store Connect, and since access isn't entitlement-based, I'm not sure how else to check that I'm set up correctly. I've tried deleting/reinstalling the app, restarting the device, cleaning/rebuilding, and deleting DerivedData, to no avail. Any help? Running Xcode 16.4 (16F6), testing on iOS 18.5 (22F76).
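For anyone trying to reproduce, a minimal sketch of the playback path in question; the search term is arbitrary. This is standard MusicKit usage, which suggests the error above is environmental rather than a coding issue.

    import MusicKit

    func playFirstMatch() async throws {
        // Authorization must be granted before any catalog request.
        guard await MusicAuthorization.request() == .authorized else { return }

        var request = MusicCatalogSearchRequest(term: "Daft Punk", types: [Song.self]) // arbitrary term
        request.limit = 1
        let response = try await request.response()
        guard let song = response.songs.first else { return }

        let player = ApplicationMusicPlayer.shared
        player.queue = [song]
        try await player.play()
    }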
0 replies · 1 boost · 144 views · Jun ’25
macOS Sequoia 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encounter this error on macOS Sequoia. Please help me:

    Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’
    UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’},
    NSLocalizedFailureReason=The decoder required for this media cannot be found.,
    AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}

Thanks!
2 replies · 1 boost · 643 views · Mar ’25
Problem with UVC Device Access on visionOS
No external cameras show up in the app on visionOS. We use this sample code as a basis for our tests: https://developer.apple.com/documentation/visionos/displaying-video-from-connected-devices We also received the needed entitlement from Apple, but every camera we have tried so far fails to show up on visionOS. We tried the following devices and hubs:

Insta360 X4
Somikon Endoscope Camera: USB HD Endoscope Camera
EMEET Full HD Webcam - C960
BENFEI Video/Audio Capture Card, 4K HDMI to USB C/A
Logitech C920 HD PRO Webcam
Anker PowerConf C200
Insta360 GO 3S
Anker 341 USB-C Hub
UGREEN Revodok Pro 10Gbps USB-C Hub

All Vision Pro devices we tried run visionOS 2.3. When trying the same code on iPad, we can actually use external cameras.

Steps to reproduce: start the app on a Vision Pro device and connect an external camera. The connected camera does not show up in the dropdown.

Development environment: Xcode 16.2, macOS 15.3. Run-time configuration: iOS 18.3, visionOS 2.3.
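For reference, a hedged sketch of the discovery call the linked sample is built around; if this session's devices array stays empty with the entitlement in place, the failure is below the app layer. The .external device type is the standard one for UVC cameras, though its exact behavior on visionOS 2.3 is the open question here.

    import AVFoundation

    // Enumerate external (UVC) cameras; on visionOS this additionally
    // requires the entitlement mentioned above.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )
    print("external cameras:", discovery.devices.map(\.localizedName))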
2 replies · 0 boosts · 612 views · Feb ’25
FPS Certificate Re-issuance and Validity of Existing Certificate
Keywords: FairPlay, FPS Certificate, DRM, FairPlay Streaming, license server

Hi all, we are currently using FairPlay Streaming in production and already have an FPS certificate in place. However, the passphrase for the existing FPS certificate has unfortunately been lost. We are now considering reissuing a new FPS certificate, and I would like to confirm a few points before proceeding:

1️⃣ If we reissue a new FPS certificate, will the existing certificate be automatically revoked? Or will it remain valid until its original expiration date?
2️⃣ Is it possible to have both the newly issued and the existing certificates valid at the same time? In other words, can we serve DRM licenses using either certificate, depending on the packaging or client?
3️⃣ Are there any caveats or best practices we should be aware of when reissuing an FPS certificate? For example, would existing packaged content become unplayable, or would CDN/packaging-server configurations need to be updated carefully?

Since this affects our production environment, we would like to minimize any service disruption or compatibility issues. When we contacted Apple support directly, we were advised to post this question here in the forums for additional guidance. Any advice or experiences would be greatly appreciated! Thank you in advance.
1 reply · 0 boosts · 169 views · Jun ’25
AVCaptureDevice rotationCoordinator modifying CALayer on switching devices
I am trying to use the AVCaptureDevice.RotationCoordinator API to observe angles for preview and capture, and there seems to be an issue with the API when it is used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) and the cameras are switched. Here is my setup. The function below is defined in an actor class called CameraManager that performs setup of the rotation coordinator:

    func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
        guard let device = sessionConfiguration.activeVideoInput?.device,
              let displayLayer = displayLayer else { return }
        cancellables.removeAll()

        rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
        guard let coordinator = rotationCoordinator else { return }

        coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
            .receive(on: DispatchQueue.main)
            .sink { degrees in
                let radians = degrees * .pi / 180
                MainActor.assumeIsolated {
                    callback(radians)
                }
            }
            .store(in: &cancellables)
    }

This works the very first time, but when I switch cameras and call this function again, it throws a runtime error saying the view's layer was modified from a non-main thread. This happens at the very line where the rotation coordinator is recreated. It's not clear why initializing the rotation coordinator should modify CALayer properties right in its init method.

    Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace: (
    0  UIKitCore 0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
    1  UIKitCore 0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
    2  QuartzCore 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
    3  QuartzCore 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
    4  QuartzCore 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
    5  QuartzCore 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
    6  AVFCapture 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
    7  MyApp.debug.dylib 0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
    8  MyApp.debug.dylib 0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
    9  MyApp.debug.dylib 0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
    10 MyApp.debug.dylib 0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
    11 MyApp.debug.dylib 0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
    12 MyApp.debug.dylib 0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
    13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
    )
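A hedged workaround sketch, given that the init itself appears to touch the layer: construct the coordinator on the main actor (which means making updateRotationCoordinator async) and leave the Combine wiring in the actor. Names mirror the snippet above.

    // Inside the async variant of updateRotationCoordinator:
    // hop to the main actor for the part that touches the view's layer.
    let coordinator = await MainActor.run {
        AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    }
    rotationCoordinator = coordinator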
2 replies · 0 boosts · 569 views · Feb ’25
Why do personal music recommendations contain no more than 12 items?
Hi! I fetch personal recommendations as a MusicItemCollection using this code:

    func getRecommendations() async throws -> MusicItemCollection<MusicPersonalRecommendation> {
        let request = MusicPersonalRecommendationsRequest()
        let response = try await request.response()
        let recommendations = response.recommendations
        return recommendations
    }

However, every recommendation contains no more than 12 MusicItems, while the Music app provides many more for some recommendations; for example, for the "You recently listened" recommendation, the Music app displays 40 items. Each recommendation has an items property containing a collection of music items, MusicItemCollection<MusicPersonalRecommendation.Item>, and the hasNextBatch property of these collections is always false. I expected that some collections would allow loading additional items. Please tell me if I'm doing something wrong, or is this a MusicKit bug? Thank you!
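One hedged thing to rule out: MusicPersonalRecommendationsRequest exposes a limit property, but (assumption) it caps the number of recommendation groups rather than the items inside each one, so the 12-item cap may be server-side. A quick check:

    import MusicKit

    func getRecommendations() async throws -> MusicItemCollection<MusicPersonalRecommendation> {
        var request = MusicPersonalRecommendationsRequest()
        // `limit` affects how many recommendation groups come back; whether
        // anything raises the per-recommendation item count is exactly the
        // open question in this post.
        request.limit = 30
        let response = try await request.response()
        return response.recommendations
    }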
1 reply · 0 boosts · 452 views · Feb ’25