Hello Apple team and developer community,
I am preparing a visionOS app for a trade fair environment, where we want to automatically stream the current experience to a nearby monitor via AirPlay, without requiring guests or staff to interact with Control Center or the AirPlay picker each time.
The goal is to provide a smooth, frictionless setup so attendees can focus on the demo, not the configuration.
Feature Request:
A supported API or method to programmatically start/stop AirPlay video streaming (mirroring or external playback) from within a visionOS app, allowing the current experience to be instantly displayed on an external monitor or Apple TV for the audience.
Context & Rationale:
In a trade fair or exhibition setting, rapid guest turnaround and minimal staff intervention are crucial. Having to manually guide each visitor through AirPlay setup is impractical.
As I understand it, AVRoutePickerView can be used for this on iOS/macOS, but it is not available on visionOS. Enabling similar automated streaming on visionOS would make the device far more suitable for live demos and public showcases.
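For context, this is roughly what we rely on today on iOS (a minimal UIKit sketch; AVRoutePickerView and prioritizesVideoDevices are the real AVKit API, but the controller and layout here are illustrative):
import AVKit
import UIKit
final class DemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // The picker only presents the system route sheet; there is no public
        // API to select or start an AirPlay route without user interaction,
        // which is exactly the gap this request describes.
        let picker = AVRoutePickerView(frame: CGRect(x: 20, y: 60, width: 44, height: 44))
        picker.prioritizesVideoDevices = true // surface video routes such as Apple TV first
        view.addSubview(picker)
    }
}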
Questions:
Are there any supported workarounds or best practices I may have missed for enabling automated screen streaming or AirPlay initiation on visionOS in public demo environments?
Is Apple considering adding programmatic AirPlay control or accessibility features to support such use cases in future visionOS releases?
Thank you for considering this request! If there are recommended patterns, entitlements, or accessibility solutions we could explore for trade fair scenarios, your guidance would be greatly appreciated.
Best regards,
Julian Zürn - IPI, HS Kempten
Streaming
Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.
Hello,
I'm investigating an issue with LL-HLS playback using AVPlayer, specifically during live DVR seeking (seeking back to an earlier point in the stream).
I noticed that in certain seeking scenarios, AVPlayer sends a Blocking Playlist Reload request that includes the _HLS_msn parameter but is missing the _HLS_part parameter.
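For illustration, the two request shapes look like this (hypothetical playlist URL and sequence numbers; _HLS_msn and _HLS_part are the standard LL-HLS delivery directives):
GET /live/stream.m3u8?_HLS_msn=1027&_HLS_part=2    <- steady-state blocking reload
GET /live/stream.m3u8?_HLS_msn=1027                <- what we see on some DVR seeks (no _HLS_part)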
While I understand this is compliant with the HLS spec, I would like to know the specific criteria AVPlayer uses to decide when to drop the _HLS_part parameter. Does AVPlayer intentionally omit the part info when it determines that loading a specific partial segment is unnecessary during a seek operation?
Clarification on this behavior would help us greatly in debugging our stream delivery.
Thanks in advance.
On iOS 17 we've had no problem playing Apple FairPlay encrypted content with keys delivered from our key server running on FairPlay Streaming Server SDK 5.1 and, subsequently, FairPlay Streaming Server SDK 26. The app is built and deployed using Xcode 26.1.1 (17B100) with no changes to the code, and, as expected, the content continued to be successfully decrypted and played (so far so good). However, as soon as a device was updated to iOS 26, that device would no longer play the encrypted content.
Devices remaining on iOS 17 continue to work normally, and the debugging logs confirm that. Is anyone else experiencing this issue?
Here's the code (you should be able to drop it into a fresh iOS Xcode project and provide a server URL, content URL, and certificate).
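In outline, the client side does the following (a minimal sketch of the AVContentKeySession flow, not the full project; the URL and the SPC/CKC exchange are placeholders):
import AVFoundation
final class FairPlayKeyDelegate: NSObject, AVContentKeySessionDelegate {
    // Placeholder endpoint; substitute your own application certificate.
    let certificateURL = URL(string: "https://example.com/fps/cert.der")!
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        guard let certificate = try? Data(contentsOf: certificateURL),
              let skdURL = keyRequest.identifier as? String,
              let contentID = skdURL.data(using: .utf8) else { return }
        keyRequest.makeStreamingContentKeyRequestData(
            forApp: certificate,
            contentIdentifier: contentID,
            options: nil
        ) { spcData, error in
            guard let spcData, error == nil else { return }
            // POST spcData to the key server and receive the CKC (omitted),
            // then complete the request:
            // let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
            // keyRequest.processContentKeyResponse(response)
        }
    }
}
// Usage: attach the session to the asset before creating the player item.
// let session = AVContentKeySession(keySystem: .fairPlayStreaming)
// session.setDelegate(delegate, queue: DispatchQueue(label: "fps.keys"))
// session.addContentKeyRecipient(asset)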
Hello, I am developing a custom player SDK using AVPlayer to support HLS and LL-HLS live streaming. I have some questions about the internal logic of AVPlayer regarding ABR, as this information is not explicitly covered in the documentation.
ABR Switching Logic: Does AVPlayer trigger bitrate switching primarily based on stall occurrences (buffer starvation)? I am curious if the switching logic is reactive to stalls or if it proactively switches to prevent them based on throughput estimation.
Developer Controls for ABR: To influence or control the ABR selection, are preferredPeakBitRate and preferredForwardBufferDuration the only properties available to developers? Are there any other recommended APIs to assist with ABR decisions?
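For reference, here is how we currently apply those two knobs (a minimal sketch; the URL and values are placeholders, and preferredMaximumResolution is another documented constraint worth mentioning):
import AVFoundation
let item = AVPlayerItem(url: URL(string: "https://example.com/live/master.m3u8")!)
// Cap variant selection at roughly 4 Mbps; 0 means no cap.
item.preferredPeakBitRate = 4_000_000
// Ask for about 10 seconds of forward buffer; 0 lets AVPlayer decide.
item.preferredForwardBufferDuration = 10
let player = AVPlayer(playerItem: item)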
Thank you for your help.
Hi,
I understand that AVPlayer/AVFoundation doesn’t natively play MPEG-DASH manifests (.mpd) today, while HLS is supported and widely documented by Apple.
I’m not asking for roadmap commitments, but I’d like to understand whether there is any publicly documented rationale for not supporting DASH/MPD in AVFoundation (e.g., technical constraints, platform integration, DRM ecosystem, power/performance considerations, etc.).
Questions:
Is there any Apple statement / documentation explaining why DASH (MPD) isn’t supported in AVFoundation?
Is Apple’s recommended approach still “provide HLS for Apple clients” (potentially sharing CMAF segments and generating separate manifests)?
If there’s no public rationale, is filing Feedback Assistant the best channel for requesting MPD playback support?
Thanks!
Hi
We’re updating our KSM to support SPC v2/v3 and currently operate with both legacy SDK4 credentials (ASK + 1024 cert) and SDK26 credentials (certificate bundle + provisioning data + 1024/2048 keys).
Our client apps run across a wide range of iOS/tvOS versions, so we want to follow Apple’s recommended client strategy for certificate selection. The docs describe SHA‑1 vs SHA‑256 in the SPC header, but do not specify which OS versions should use SDK4 vs SDK26 credentials.
Could you clarify:
Is there an official minimum iOS/tvOS version where you recommend SDK26 credentials for client apps?
For older OS versions (e.g. iOS 15), is SDK4 still the recommended choice for client apps?
Are there any official migration guidelines for client apps moving from SDK4 to SDK26 credentials?
Thanks in advance.
I have an HDR10+ encoded video that plays back on the Apple Vision Pro when loaded as a .mov. However, when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error.
To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac in VLC, but not on the Apple Vision Pro.
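For reference, the invocations are essentially the following (file names and output paths are placeholders; only -iso-fragmented is taken from the run above, so check the tools' usage text for the rest):
# segment the MV-HEVC/HDR10 source into fMP4 media segments
mediafilesegmenter -iso-fragmented -f ./out source_mvhevc_hdr10.mov
# combine per-variant playlists into a multivariant playlist
variantplaylistcreator -o master.m3u8 {{variant-url}} {{variant-plist}}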
The relevant part of the m3u8 is:
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Hello,
Our users have started to see a new fatal AVPlayer error during playback starting with iOS/tvOS 18.0. The error is reported as "CoreMediaErrorDomain Code=-15486".
We have not been able to reproduce this issue locally within our development team.
Is there any documentation on the cause of this error or steps to recover from this error?
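For anyone debugging the same error, the most detail we have been able to extract on the client comes from the item's error log (a minimal sketch using the documented AVPlayerItemErrorLog API):
import AVFoundation
func dumpErrorLog(for player: AVPlayer) {
    guard let log = player.currentItem?.errorLog() else { return }
    for event in log.events {
        // errorStatusCode is where -15486 shows up; uri and serverAddress
        // help correlate the failure with CDN-side logs.
        print("code:", event.errorStatusCode,
              "domain:", event.errorDomain,
              "comment:", event.errorComment ?? "-",
              "uri:", event.uri ?? "-",
              "server:", event.serverAddress ?? "-")
    }
}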
Thank you,
Howard
Hello, our application is unable to output FairPlay-protected content over HDMI to a TV via the official Lightning HDMI AV Adapter. By checking the console log from mediaplaybackd, we found that a CoreMediaErrorDomain Code=-19156 is raised, but we are unable to find what this error code means.
default 11:18:15.121584+0800 mediaplaybackd keyboss ckb_customURLReadCallback: 0x7fa62f800 60/0 customURLReqID 4 isComplete 1 err -19156 error <private> (0) dokeyCallbacksExist 0
default 11:18:15.121670+0800 mediaplaybackd keyboss ckb_processErrorForRequest: 0x7fa62f800 60/0 handler 4 err 0
default 11:18:15.121752+0800 mediaplaybackd <<<< FigCustomURLHandling >>>> curll_cancelRequestOnQueue: 0x7fa031360: requestID: 4
default 11:18:15.121932+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 reqFin err Error Domain=CoreMediaErrorDomain Code=-19156 (-19156) dokeyCallbacksExist 0
default 11:18:15.122025+0800 mediaplaybackd keyboss ckb_transitionRequestToTerminalState: 0x7fa62f800 60/0 retry
default 11:18:15.123195+0800 mediaplaybackd <<<< FigCPECryptorPKD >>>> PostKeyRequestErrorOccurred: 0x7fab7be80 029592C2-093D-400D-B57F-7AB06CC292D1 key request error: Error Domain=CoreMediaErrorDomain Code=-19160 (-19160)
Hi everyone,
After updating my Apple TV HD (model A1625) to tvOS 26, I've noticed a significant spike in CPU usage, up to 3x higher than before the update: from around 40% to 120%.
Model: Apple TV HD (A1625)
tvOS version: 26 (stable release) and the 26.1 beta
Impact: the app downgrades the stream due to the lack of CPU headroom
If anyone else is experiencing this, please share your findings or workarounds.
Would love to hear from Apple engineers or other developers if this is a known regression or if there’s a recommended fix.
Thanks!
We are getting reports from customers that they are not able to play videos in our app after updating their phones to iOS 18.3.1.
(Further checking indicates that it happens on all iOS 18 versions; it suddenly started occurring on February 18th, 2025.)
When checking logs we see that playback is failing due to CoreMediaErrorDomain error -42709.
This is an undocumented error code and hence we do not know the cause of the playback issue.
Does anyone know what this error code means and how the app should handle it?
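While waiting for documentation, the failure can at least be observed and handled generically (a sketch; playerItem stands for the failing AVPlayerItem):
import AVFoundation
// Observe the standard failure notification for the item.
let observer = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemFailedToPlayToEndTime,
    object: playerItem, queue: .main
) { note in
    // The underlying NSError (CoreMediaErrorDomain -42709 in our logs)
    // is delivered in userInfo under this key.
    let error = note.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError
    print("playback failed:", error?.localizedDescription ?? "unknown")
    // One possible recovery: replace the failed item with a fresh one.
}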
Reported as FB16638501.
Hi everyone,
We’re currently developing a music-based app using MusicKit, and we recently noticed that iOS 26 beta introduces a new “Automix” feature in the Apple Music app. This enables seamless DJ-style transitions between songs—beyond the standard crossfade functionality.
We’re trying to understand:
Will this Automix feature be accessible to third-party apps that use MusicKit?
If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update?
Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit?
This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists.
Thanks.
Hello,
I am currently developing a video player using a custom AVPlayer-based SDK and testing LL-HLS live streaming.
I encountered a specific error, CoreMediaErrorDomain -15418, during playback. I have searched through the official documentation and the forums, but I could not find any information regarding this error code.
I would like to inquire about the following:
Description & Cause: What does the error code -15418 specifically represent in the context of CoreMedia and LL-HLS?
Severity: Is this a critical error that halts playback, or is it merely a warning?
Environment Details:
iOS Version: iOS 26.2
Device: iPhone 15 Pro Max
Stream Type: LL-HLS (Low-Latency HLS)
Impact: Quality drops
Any insights or references to documentation would be greatly appreciated.
Thank you.
Hi,
After updating to iOS 26, our app is experiencing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error:
Domain [CoreMediaErrorDomain]
Code [-15628]
Description [The operation couldn’t be completed.]
Underlying Error Domain [(null)]
Code [0]
Description [(null)]
Environment:
iOS version: iOS 26
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour:
We don’t have concrete steps to reproduce the issue, but so far, we have observed that this error tends to occur under low network conditions.
I'm seeing a crash in an app that plays videos when the user activates closed captions.
I was able to replicate the issue in an empty project. The crash happens when the AVPlayerLayer is used to instantiate an AVPictureInPictureController.
This is the example project where I reproduced the crash:
import SwiftUI
import AVFoundation

struct ContentView: View {
var body: some View {
VStack {
VideoPlaylistView()
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.background(Color.black.ignoresSafeArea())
}
}
class VideoPlaylistViewModel: ObservableObject {
// Test with other videos
var player: AVPlayer? = AVPlayer(url: URL(string:"https://d2ufudlfb4rsg4.cloudfront.net/newsnation/WIpkLz23h/adaptive/WIpkLz23h_master.m3u8")!)
}
struct VideoPlaylistView: View {
@StateObject var viewModel = VideoPlaylistViewModel()
var body: some View {
ScrollView {
VideoCellView(player: viewModel.player)
.onAppear {
viewModel.player?.play()
}
}
.scrollTargetBehavior(.paging)
.ignoresSafeArea()
}
}
struct VideoCellView: View {
let player: AVPlayer?
@State var isCCEnabled: Bool = false
var body: some View {
ZStack {
PlayerView(player: player)
.accessibilityIdentifier("Player View")
}
.containerRelativeFrame([.horizontal, .vertical])
.overlay(alignment: .bottom) {
Button {
player?.currentItem?.asset.loadMediaSelectionGroup(for: .legible) { group, error in
if let group {
let option = !isCCEnabled ? group.options.first : nil
player?.currentItem?.select(option, in: group)
isCCEnabled.toggle()
}
}
} label: {
Text("Close Captions")
.font(.subheadline)
.foregroundStyle(isCCEnabled ? .red : .primary)
.buttonStyle(.bordered)
.padding(8)
.background(Color.blue.opacity(0.75))
}
.padding(.bottom, 48)
.accessibilityIdentifier("Button Close Captions")
}
}
}
import Foundation
import UIKit
import SwiftUI
import AVFoundation
import AVKit
struct PlayerView: UIViewRepresentable {
let player: AVPlayer?
func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
}
func makeUIView(context: Context) -> UIView {
let view = PlayerUIView()
view.playerLayer.player = player
view.layer.addSublayer(view.playerLayer)
view.layer.backgroundColor = UIColor.red.cgColor
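// Per the report above, the crash reproduces when an AVPictureInPictureController
// is created from this player layer (while toggling closed captions).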
view.pipController = AVPictureInPictureController(playerLayer: view.playerLayer)
view.pipController?.requiresLinearPlayback = true
view.pipController?.canStartPictureInPictureAutomaticallyFromInline = true
view.pipController?.delegate = view
return view
}
}
class PlayerUIView: UIView, AVPictureInPictureControllerDelegate {
let playerLayer = AVPlayerLayer()
var pipController: AVPictureInPictureController?
override init(frame: CGRect) {
super.init(frame: frame)
}
required init?(coder: NSCoder) {
fatalError("init(coder:) has not been implemented")
}
override func layoutSubviews() {
super.layoutSubviews()
playerLayer.frame = bounds
playerLayer.backgroundColor = UIColor.green.cgColor
}
func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: any Error) {
print("Error starting Picture in Picture: \(error.localizedDescription)")
}
}
class AppDelegate: NSObject, UIApplicationDelegate {
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey : Any]? = nil) -> Bool {
let audioSession = AVAudioSession.sharedInstance()
do {
try audioSession.setCategory(.playback, mode: .moviePlayback)
try audioSession.setActive(true)
} catch {
print("ERR: \(error.localizedDescription)")
}
return true
}
}
UI test that reproduces the crash:
import XCTest

final class VideoPlaylistSampleUITests: XCTestCase {
func testCrashiOS26ToggleCloseCaptions() throws {
let app = XCUIApplication()
app.launch()
let videoPlayer = app.otherElements["Player View"]
XCTAssertTrue(videoPlayer.waitForExistence(timeout: 30))
let closeCaptionButton = app.buttons["Button Close Captions"]
for _ in 0..<2000 {
closeCaptionButton.tap()
}
}
}
In our Apple TV application, we use the native AVPlayer for live playback functionality. Until tvOS 17.6 and during the tvOS 18 beta, the Pause/Resume feature worked as expected, allowing us to pause live playback. However, after updating to tvOS 18.1, the pause functionality no longer works.
The same app still works fine on tvOS 17, but on tvOS 18, attempting to pause live playback has no effect. We reviewed the tvOS 18 release notes but couldn't find any relevant changes or deprecations related to AVPlayer or live playback behavior.
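One sanity check that might help narrow it down is whether the item still advertises a DVR window while live, since pausing a live item only works within the seekable range (a minimal sketch):
import AVFoundation
import CoreMedia
func logSeekableWindow(of item: AVPlayerItem) {
    // A pausable live stream should report a non-empty seekable range
    // (the server-side DVR window).
    for value in item.seekableTimeRanges {
        let range = value.timeRangeValue
        print("seekable:",
              CMTimeGetSeconds(range.start), "to",
              CMTimeGetSeconds(CMTimeRangeGetEnd(range)))
    }
}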
Has there been any change in the handling of live playback or the Pause/Resume functionality in tvOS 18.1? Any guidance or suggestions to address this issue would be greatly appreciated.
Thank you!
Hi,
After updating to iOS 26, our app is facing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.
Error: Domain [CoreMediaErrorDomain], Code [-15628], Description [The operation couldn’t be completed.], Underlying Error Domain [(null)], Code [0], Description [(null)]
Environment:
iOS version: iOS 26
React Native: 0.69
Video library: react-native-video (AVPlayer under the hood)
Stream type: HLS (m3u8) with segment (.ts) files
Observed behaviour:
Playback works initially on iOS 26.
On iOS 26, the stream fails at runtime after a few seconds/minutes (not on first load).
Network logs show 307 redirects on some segment requests. After this, AVPlayer throws the above error.
Playback fails intermittently on slow/unstable networks.