Streaming


Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.


Posts under Streaming subtopic

Post · Replies · Boosts · Views · Activity

Are there known cases where DepthData is empty while Face ID is working?
We are experiencing an issue with DepthData from the TrueDepth camera on a specific device. On December 1 we tested with the complainant’s device (iPhone 14, iOS 26.0.1) and observed that the depth image is received with empty values. However, the same implementation works normally on iPhone 17 Pro Max (iOS 26.1) and iPhone 13 Pro Max (iOS 26.0.1), where depth data is delivered correctly.

In the problematic case:
- The TrueDepth camera is active.
- Face ID works normally.
- The app receives a DepthData object, but all values are empty (0), not nil.

Because the DepthData object is not nil, it is difficult to detect the issue through software fallback handling. We developed the feature with reference to the following Apple sample: https://developer.apple.com/documentation/AVFoundation/streaming-depth-data-from-the-truedepth-camera

We would like to ask:
- Are there known cases where Face ID functions normally but DepthData from the TrueDepth camera is returned as empty values?
- If so, is there a recommended approach for identifying or handling this situation?

Any guidance from Apple engineers or the community would be greatly appreciated. Thank you.
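One way to make the bad case detectable, as a minimal sketch: convert the depth data to 32-bit float and scan the map for the all-zero condition the post describes. The helper name and the conversion step are illustrative assumptions, not part of the original post.

```swift
import AVFoundation

// Sketch: returns true when every depth sample is zero, so the app can
// trigger a software fallback even though the AVDepthData object is non-nil.
func depthMapIsAllZero(_ depthData: AVDepthData) -> Bool {
    // Normalize to Float32 so the scan below is format-independent.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = converted.depthDataMap

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(map) else { return true }
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(map)

    for row in 0..<height {
        let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)
        for col in 0..<width where rowPtr[col] != 0 {
            return false // Found a non-zero sample; the depth map is usable.
        }
    }
    return true
}
```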
Replies: 0 · Boosts: 0 · Views: 137 · Activity: 2w
Use AVPlayer for multiple videos
I'm developing a tutorial-style tvOS app with multiple videos. The examples I've seen so far deal with only one video: the player and source URL are defined before the body view,

```swift
let avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../video.mp4")!)
```

and then the video is displayed in the body view:

```swift
VideoPlayer(player: avPlayer)
```

This allows options such as stop/start etc. When I try something similar with a video title passed into this view, I can't define the player with this title variable:

```swift
var vTitle: String
var avPlayer = AVPlayer(url: URL(string: "https://domain.com/.../.../" + vTitle + ".mp4")!)

var body: some View {
```

I get an error that vTitle can't be used in the URL above the body view. Any thoughts or suggestions? (A possible fix is sketched below.) Thanks
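The error comes from Swift itself rather than AVFoundation: a stored property initializer runs before self exists, so it cannot read vTitle. A sketch of one workaround, creating the player once the view appears (the base URL here is a placeholder, not the poster's real path):

```swift
import SwiftUI
import AVKit

struct TutorialVideoView: View {
    let vTitle: String
    @State private var avPlayer: AVPlayer?

    var body: some View {
        VideoPlayer(player: avPlayer)
            .onAppear {
                // Build the player here, where `vTitle` is available.
                if avPlayer == nil,
                   let url = URL(string: "https://domain.com/videos/\(vTitle).mp4") {
                    avPlayer = AVPlayer(url: url)
                }
            }
    }
}
```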
Replies: 1 · Boosts: 0 · Views: 807 · Activity: Dec ’24
iOS Radio App: Need to extract and stream audio-only from HLS streams with video content
I'm developing an iOS radio app that plays various HLS streams. The challenge is that some stations broadcast HLS streams containing both audio and video (example: https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8), but I want to:

- Extract and play only the audio track
- Support AirPlay for audio-only streaming
- Minimize data usage by not downloading video content

Technical details: iOS 17+, Swift 5.9, AVFoundation for playback. The current implementation uses AVPlayer with AVPlayerItem.

Current code structure:

```swift
class StreamPlayer: ObservableObject {
    @Published var isPlaying = false
    private var player: AVPlayer?
    private var playerItem: AVPlayerItem?

    func playStream(url: URL) {
        let asset = AVURLAsset(url: url)
        playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)
        player?.play()
    }
}
```

Stream analysis, using FFmpeg on the video stream:

```
Input #0, hls, from 'https://svs.itworkscdn.net/smcwatarlive/smcwatar/chunks.m3u8':
  Stream #0:0: Video: h264, yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 25 fps
  Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp
```

Attempted solutions. Using MobileFFmpeg:

```swift
let command = [
    "-i", streamUrl,
    "-vn",
    "-acodec", "aac",
    "-ac", "2",
    "-ar", "44100",
    "-b:a", "128k",
    "-f", "mpegts",
    "udp://127.0.0.1:12345"
].joined(separator: " ")
ffmpegProcess = MobileFFmpeg.execute(command)
```

Issue: while FFmpeg successfully extracts audio, playback through AVPlayer doesn't work reliably.

Tried using HLS output:

```swift
let command = [
    "-i", streamUrl,
    "-vn",
    "-acodec", "aac",
    "-ac", "2",
    "-ar", "44100",
    "-b:a", "128k",
    "-f", "hls",
    "-hls_time", "2",
    "-hls_list_size", "3",
    outputUrl.path
]
```

Issue: creates temporary files but faces synchronization issues with live streams.

Requirements:
- Real-time audio extraction from the HLS stream
- Maintain live streaming capabilities
- Full AirPlay support
- Minimal data usage (avoid downloading video content)
- Handle network interruptions gracefully

Questions:
1. What's the most efficient way to extract only the audio from an HLS stream in real time?
2. Is there a way to tell AVPlayer to ignore video tracks completely? (One partial approach is sketched after this post.)
3. Are there better alternatives to FFmpeg for this specific use case?
4. What's the recommended approach for handling AirPlay with modified streams?

Any guidance or alternative approaches would be greatly appreciated. Thank you!
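On question 2, a partial sketch (an approach to verify, not a confirmed fix): AVPlayerItemTrack exposes isEnabled, which can switch off video rendering once the item's tracks load. For HLS this stops decoding and display, but it may not stop the video media from being downloaded, so it addresses only part of the data-usage goal.

```swift
import AVFoundation

// Sketch: observe the item's tracks and disable any video track as soon
// as it appears. For HLS this disables rendering; it may not prevent the
// video segments from being fetched.
final class AudioOnlyEnforcer {
    private var observation: NSKeyValueObservation?

    func attach(to item: AVPlayerItem) {
        observation = item.observe(\.tracks, options: [.initial, .new]) { item, _ in
            for track in item.tracks where track.assetTrack?.mediaType == .video {
                track.isEnabled = false
            }
        }
    }
}
```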
Replies: 1 · Boosts: 1 · Views: 489 · Activity: Dec ’24
ApplicationMusicPlayer stops with error after skipping quickly over playlist entries
ApplicationMusicPlayer with a queue created from a playlist crashes with random occurrence shortly after skipping back or forth using the controls embedded in the notification, with this error in the console log:

```
applicationController: xpc service connection interrupted
```

I've noticed that the issue occurs more frequently the shorter the time between skipping entries is. Since ApplicationMusicPlayer runs in a remote process, the main app does not crash, but the music stops playing without any exception, and the playback controls revert to an uninitialized state.

Here is how I'm initializing the queue:

```swift
let entries = playlist
    .with(.entries).entries!
    .map { ApplicationMusicPlayer.Queue.Entry($0) }

ApplicationMusicPlayer.shared.queue = .init(
    entries,
    startingAt: entries.last
)
```

Please give me some tips on how to solve this. (A throttling workaround is sketched below.)

EDIT: The issue does not occur when navigating quickly through a station.
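A hedged workaround sketch, based only on the observation above that faster skipping makes the XPC interruption more likely (not an Apple-documented fix): space out skip commands so rapid presses cannot hammer the remote player process.

```swift
import MusicKit

// Sketch: serialize skip requests and enforce a minimum gap between the
// calls that reach the remote ApplicationMusicPlayer process.
actor SkipThrottler {
    private var lastSkip: Date = .distantPast
    private let minimumInterval: TimeInterval = 0.5 // tuning assumption

    func skipForward() async {
        if Date().timeIntervalSince(lastSkip) < minimumInterval {
            try? await Task.sleep(for: .seconds(minimumInterval))
        }
        lastSkip = Date()
        try? await ApplicationMusicPlayer.shared.skipToNextEntry()
    }
}
```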
Replies: 1 · Boosts: 0 · Views: 510 · Activity: Jan ’25
The operation couldn’t be completed error during live restart playback in native AVPlayer
In our Apple TV application, we use the native AVPlayer for live playback. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.

Error:

```
The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:
```

Scenario: the live event is scheduled from 7:00 AM to 8:00 AM. Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real time. As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays the error above, and playback continues in the background.
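A small diagnostic sketch (standard AVFoundation error-log plumbing, not a fix for -16839 itself): subscribing to the item's error-log notification records which playlist request fails as the restart timeline crosses the event boundary, which is useful evidence for a bug report.

```swift
import AVFoundation

// Sketch: print each new HLS error-log entry. Keep the returned token
// alive for as long as the observation should last.
func observeErrorLog(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVPlayerItem.newErrorLogEntryNotification,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.errorLog()?.events.last else { return }
        print("HLS error \(event.errorStatusCode):",
              event.errorComment ?? "-",
              "uri:", event.uri ?? "-")
    }
}
```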
Replies: 1 · Boosts: 0 · Views: 825 · Activity: Jan ’25
EXT-X-DISCONTINUITY not properly handled in native iOS and Safari player (AVPlayer), broken playback.
Hi folks, when doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps drift slightly and an #EXT-X-DISCONTINUITY tag is created in either the audio or the video playlist (in an ABR setup), the tag is not correctly handled by the player, leading to broken playback with a black screen or no audio (depending on which playlist the tag appears in). We noticed this often happens when the number of tags differs between the playlists (e.g. the audio playlist containing 1 tag while the video playlist contains 2 results in a black screen with audio). Playing the same "broken" source with Shaka Player instead doesn't break playback at all. Is there any possible fix (existing or upcoming) for AVPlayer?
Replies: 1 · Boosts: 0 · Views: 608 · Activity: Jan ’25
CoreMediaErrorDomain error -12927 with HEVC and DRM
I can't play video content with HEVC and DRM. Tested HEVC only: OK. Tested DRM+AVC: OK. Tested two players (Clappr/Stevie and BitMovin). The master, variants, and EXT-X-MAPs are downloaded OK, the DRM keys are OK, and then, for instance with the BitMovin player:

```
[BMP] [Player] [Error] Event: SourceError, Data: {"code":2001,"data":{"message":"The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","code":-12927},"message":"Source Error. The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","timestamp":1740320663.4505711,"type":"onSourceError"}

code: 2001 [Data code: -12927, message: The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.), underlying error: Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"]
```

Attachments: 4k-master.m3u8.txt, 4k.m3u8.txt, 4k-audio.m3u8.txt
Replies: 1 · Boosts: 0 · Views: 533 · Activity: Feb ’25
How to delete FPS Certificate from Apple developer account
Hello All, I am looking for assistance with our FairPlay Streaming (FPS) certificates. We are in the process of migrating to a new video streaming vendor and need to create a new FPS certificate using SDK 4. However, we have reached the limit of allowed FPS certificates in our account and cannot create a new one.

Issue details:
• We currently have two FPS certificates active in our developer account.
• One of these was created using SDK 5, but our new vendor (Mux) requires an FPS certificate based on SDK 4.
• Since Apple does not allow deleting FPS certificates from the developer portal, we are unable to create a new SDK 4 certificate.
• We kindly request Apple to revoke one of our existing FPS certificates to allow us to generate a new SDK 4 certificate.

Request: We would greatly appreciate guidance on how to delete one of our existing FPS certificates so that we can proceed with creating a new SDK 4 certificate for our vendor integration. Thank you for your support.
Replies: 1 · Boosts: 1 · Views: 611 · Activity: Oct ’25
CoreMediaErrorDomain -42709 error
Hello, I am developing a video streaming service that uses FairPlay. Since around February 20th, we have started receiving reports of CoreMediaErrorDomain -42709 errors. Unfortunately, there is no documentation from Apple that explains what this error means, so we are not sure how to address or fix the issue. Most of the users who reported this error are using iOS 18.2.1 and iOS 18.3.1. Could you please advise on what we should check or how we might resolve this error?
Replies: 1 · Boosts: 1 · Views: 552 · Activity: Mar ’25
"Conduct marketing and request technological support."
A few months ago, I had the opportunity to receive a 2018 iMac, and I’ve been using it to create content for my social media. I was truly impressed by the power of its processors. Even with this older model, I’ve been able to grow my presence online—something I couldn’t achieve with newer computers from other brands that I previously purchased. I would love to become a promoter of your brand in the gaming world. All I ask for is technological support with more recent equipment and a minimal payment for collaborating with you. I am genuinely interested in being part of your company and leveraging the potential and reputation of Apple to reach even greater heights.
Replies: 1 · Boosts: 0 · Views: 154 · Activity: Mar ’25
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths. Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures. Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads, or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files?
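A defensive sketch for this failure mode, under one assumption: the transferred Core Data records can point at .movpkg locations that no longer exist on the new device. Validating each stored location before playback turns the silent failure into a re-download. DownloadRecord and markForRedownload are illustrative stand-ins, not the app's real types.

```swift
import Foundation

// Sketch: reconcile stored download records with what is actually on disk.
struct DownloadRecord {
    let title: String
    let localURL: URL
}

func pruneMissingDownloads(_ records: [DownloadRecord],
                           markForRedownload: (DownloadRecord) -> Void) {
    for record in records {
        if !FileManager.default.fileExists(atPath: record.localURL.path) {
            // Metadata survived the device transfer but the HLS package
            // did not; queue a re-download instead of attempting playback.
            markForRedownload(record)
        }
    }
}
```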
Replies: 1 · Boosts: 0 · Views: 159 · Activity: Mar ’25
Support for encrypted MP4 on Apple devices.
Hello, I hope everybody is doing well. I have some reels content (9:16 aspect ratio) that I want to play back on iOS phones. My question is: does AVPlayer support out-of-the-box playback of encrypted MP4? Please note, this is not HLS fMP4, but rather unfragmented MP4 content. If it is supported, what encryption algorithm should be used? Please let me know.
Replies: 1 · Boosts: 0 · Views: 98 · Activity: Mar ’25
Processing AVCaptureVideoDataOutput video stream with appleLog and HLG_BT2020 AVCaptureColorSpace input
I’m building a professional camera app where users can customize the video recording format and color grading. In the func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) method, I handle video frames and use Metal for real-time color grading. This works well when device.activeColorSpace is sRGB or P3, and the results are great. However, when the color space is HLG_BT2020 or appleLog, the MTKTextureLoader.newTexture(cgImage: cgImage, options: options) method throws an error. After researching, I found that the video frame in these color spaces has a bit-per-channel (bpc) greater than 8 after being converted to CGImage, causing the texture creation to fail. I tried converting the CGImage to a lower bpc to successfully create the texture, but the final output image is garbled and not as expected. Is there a solution to this issue?
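A sketch of an alternative path, assuming the frames arrive in a 10-bit biplanar YCbCr format such as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange (common for HLG and Apple Log capture): wrap the CVPixelBuffer planes in Metal textures via CVMetalTextureCache instead of round-tripping through CGImage, which sidesteps the 8-bit-per-channel limitation described above.

```swift
import Metal
import CoreVideo

// Sketch: zero-copy texture creation from a captured pixel buffer.
// .r16Unorm matches the 10-bit luma plane stored in 16-bit containers;
// the chroma plane would use .rg16Unorm with planeIndex 1.
final class FrameTextureSource {
    private var cache: CVMetalTextureCache?

    init?(device: MTLDevice) {
        guard CVMetalTextureCacheCreate(nil, nil, device, nil, &cache) == kCVReturnSuccess else {
            return nil
        }
    }

    func lumaTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
        guard let cache else { return nil }
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(
            nil, cache, pixelBuffer, nil,
            .r16Unorm, width, height, 0, &cvTexture)
        return cvTexture.flatMap(CVMetalTextureGetTexture)
    }
}
```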
Replies: 1 · Boosts: 0 · Views: 121 · Activity: Apr ’25
Low-Level Networking on watchOS for Duplex Audio Streaming
I did watch WWDC 2019 Session 716 and understand that an active audio session is key to unlocking low-level networking on watchOS. I’m configuring my audio session and engine as follows:

```swift
private func configureAudioSession(completion: @escaping (Bool) -> Void) {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playAndRecord, mode: .voiceChat, options: [])
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)

        // Retrieve sample rate and configure the audio format.
        let sampleRate = audioSession.sampleRate
        print("Active hardware sample rate: \(sampleRate)")
        audioFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)

        // Configure the audio engine.
        audioInputNode = audioEngine.inputNode
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: audioFormat)
        try audioEngine.start()
        completion(true)
    } catch {
        print("Error configuring audio session: \(error.localizedDescription)")
        completion(false)
    }
}

private func setupUDPConnection() {
    let parameters = NWParameters.udp
    parameters.includePeerToPeer = true
    connection = NWConnection(host: "***.***.xxxxx.***", port: 0000, using: parameters)
    setupNWConnectionHandlers()
}

private func setupTCPConnection() {
    let parameters = NWParameters.tcp
    connection = NWConnection(host: "***.***.xxxxx.***", port: 0000, using: parameters)
    setupNWConnectionHandlers()
}

private func setupWebSocketConnection() {
    guard let url = URL(string: "ws://***.***.xxxxx.***:0000") else {
        print("Invalid WebSocket URL")
        return
    }
    let session = URLSession(configuration: .default)
    webSocketTask = session.webSocketTask(with: url)
    webSocketTask?.resume()
    print("WebSocket connection initiated")
    sendAudioToServer()
    receiveDataFromServer()
    sendWebSocketPing(after: 0.6)
}

private func setupNWConnectionHandlers() {
    connection?.stateUpdateHandler = { [weak self] state in
        DispatchQueue.main.async {
            switch state {
            case .ready:
                print("Connected (NWConnection)")
                self?.isConnected = true
                self?.failToConnect = false
                self?.receiveDataFromServer()
                self?.sendAudioToServer()
            case .waiting(let error), .failed(let error):
                print("Connection error: \(error.localizedDescription)")
                DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                    self?.setupNetwork()
                }
            case .cancelled:
                print("NWConnection cancelled")
                self?.isConnected = false
            default:
                break
            }
        }
    }
    connection?.start(queue: .main)
}
```

Duplex in this context refers to two-way audio transmission: simultaneously recording and sending audio while also receiving and playing back incoming audio, similar to a VoIP/SIP call. The setup works fine on the simulator, which suggests that the core logic is correct. However, since the simulator doesn’t fully replicate watchOS hardware behavior, especially for audio sessions and networking, issues might arise when running on a real device. The problem likely lies in the Watch’s actual hardware limitations, permission constraints, or specific audio session configurations.

I am reaching out to seek further assistance with the challenges I’ve been experiencing establishing UDP, TCP, and WebSocket connections on watchOS using NWConnection for duplex audio streaming. Despite implementing the recommendations provided earlier, I am still encountering difficulties. From what I can see, your implementation is focused on streaming audio playback with the server.

In my case, I’m looking for a slightly different approach: I want to capture audio and send buffers of a specific size to the server while playing audio simultaneously, essentially achieving full-duplex streaming similar to a VoIP call. Additionally, I’d like to ensure that if no external audio route is connected, the Apple Watch speaker is used by default. Any thoughts or insights on adapting this setup for those requirements would be very welcome.
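A sketch of the capture side, under two assumptions: the NWConnection from the post is already in the .ready state, and raw Float32 PCM is an acceptable wire format for the server. It installs a tap with a fixed buffer size and forwards each buffer, while the existing player node handles the receive path.

```swift
import AVFoundation
import Network

// Sketch: capture microphone audio in fixed-size buffers and send each
// buffer over the established NWConnection. Any conversion to the server's
// expected encoding is omitted for brevity.
func startSendingAudio(engine: AVAudioEngine, connection: NWConnection) {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    // 4096 frames per buffer is an illustrative choice, not a requirement.
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        guard let channel = buffer.floatChannelData?[0] else { return }
        let data = Data(bytes: channel,
                        count: Int(buffer.frameLength) * MemoryLayout<Float>.size)
        connection.send(content: data, completion: .contentProcessed { error in
            if let error { print("Send failed: \(error)") }
        })
    }
}
```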
Replies: 1 · Boosts: 0 · Views: 132 · Activity: Apr ’25
AVAssetWriterInputTaggedPixelBufferGroupAdaptor Hanging With Tagged Buffers
We've successfully implemented an AVAssetWriter to produce HLS streams (all code is Objective-C++ for interop with the existing codebase) but are struggling to extend the operations to use tagged buffers. We're starting to wonder if the tagged buffers required for an MV-HEVC signal are fully supported when producing HLS segments in a live-stream setting.

We generate a live stream of data using something like:

```objc
UTType *t = [UTType typeWithIdentifier:AVFileTypeMPEG4];
m_writter = [[AVAssetWriter alloc] initWithContentType:t];

// - videoHint describes HEVC and width/height
// - m_videoConfig includes compression settings and, when using MV-HEVC,
//   the correct keys are added (i.e. kVTCompressionPropertyKey_MVHEVCVideoLayerIDs).
//   The app was throwing an exception without these, which was
//   useful to know when we got the configuration right.
m_video = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                         outputSettings:m_videoConfig
                                       sourceFormatHint:videoHint];
```

For either path we're producing CVPixelBufferRefs that contain the raw pixel information (i.e. 32BGRA), so we use an adaptor to make that as simple as possible. If we use a single view and an AVAssetWriterInputPixelBufferAdaptor, things work out very well: we produce segments and the delegate is called. However, if we use the AVAssetWriterInputTaggedPixelBufferGroupAdaptor, as demonstrated in the SideBySideToMVHEVC demo project, things go poorly.

We create the tagged buffers with something like:

```objc
CMTagCollectionRef collections[2];

CMTag leftTags[] = {
    CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)0),
    CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_LeftEye)
};
CMTagCollectionCreate(kCFAllocatorDefault, leftTags, 2, &(collections[0]));

CMTag rightTags[] = {
    CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)1),
    CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_RightEye)
};
CMTagCollectionCreate(kCFAllocatorDefault, rightTags, 2, &(collections[1]));

CFArrayRef tagCollections = CFArrayCreate(
    kCFAllocatorDefault, (const void **)collections, 2, &kCFTypeArrayCallBacks);

CVPixelBufferRef buffers[] = {*b, *alt};
// Distinct name to avoid redeclaring `b` from the buffer pointer above.
CFArrayRef bufferArray = CFArrayCreate(
    kCFAllocatorDefault, (const void **)buffers, 2, &kCFTypeArrayCallBacks);

CMTaggedBufferGroupRef bufferGroup;
OSStatus res = CMTaggedBufferGroupCreate(
    kCFAllocatorDefault, tagCollections, bufferArray, &bufferGroup);
```

Perhaps there's something about this Objective-C code that I've buggered up? Hopefully! Anyway, when I submit this tagged buffer group to the adaptor:

```objc
if (![mvVideoAdapter appendTaggedPixelBufferGroup:bufferGroup withPresentationTime:pts]) {
    // report error...
}
```

Appending does not raise any errors; eventually it just hangs on us and we never return from it.

Real issue: so either

- the delegate assigned to the AVAssetWriter doesn't fire its assetWriter callback, which should produce the segments, or
- the adaptor hangs on appendTaggedPixelBufferGroup before a segment is ready to be completed (but it succeeds for a number of buffer groups before this happens).

This is the same delegate class that's assigned to the non-multi-view code path when MV-HEVC is turned off, which works perfectly.
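One hedged thing to rule out (an assumption drawn from general AVAssetWriter real-time behavior, not a confirmed diagnosis of the tagged-buffer path): appending while the input is not ready can block. Gating each append on isReadyForMoreMediaData, with expectsMediaDataInRealTime set on the input, turns a stall into dropped frames instead of a hang. Sketched in Swift; the pattern maps directly onto the Objective-C++ code above.

```swift
import AVFoundation

// Sketch: never let the append call block the capture loop. Assumes the
// input was configured with expectsMediaDataInRealTime = true.
func appendIfReady(_ group: CMTaggedBufferGroup,
                   at time: CMTime,
                   input: AVAssetWriterInput,
                   adaptor: AVAssetWriterInputTaggedPixelBufferGroupAdaptor) -> Bool {
    guard input.isReadyForMoreMediaData else {
        // Dropping is preferable to stalling in a live-stream setting.
        print("Dropping tagged group at \(time.seconds): input not ready")
        return false
    }
    return adaptor.append(group, withPresentationTime: time)
}
```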
Replies: 1 · Boosts: 0 · Views: 93 · Activity: Apr ’25
FairPlay HLS Downloaded Asset Fails to Play on First Attempt When Offline, Works on Retry
Hello, we're seeing an intermittent issue when playing back FairPlay-protected HLS downloads while the device is offline.

Assets are downloaded using AVAggregateAssetDownloadTask with FairPlay protection. After download, asset.assetCache.isPlayableOffline == true. On the first playback attempt (offline), ~8% of downloads fail; retrying playback always works. We recreate the asset and player on each attempt.

During playback setup, we try to load variants via:

```swift
try await asset.load(.variants)
```

This call sometimes fails with:

```
Error Domain=NSURLErrorDomain Code=-1009 “The Internet connection appears to be offline.”
UserInfo={NSUnderlyingError=0x105654a00 {Error Domain=NSURLErrorDomain Code=-1009 “The Internet connection appears to be offline.” UserInfo={NSDescription=The Internet connection appears to be offline.}},
NSErrorFailingURLStringKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/,
NSErrorFailingURLKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/,
NSURL=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/,
AVErrorFailedDependenciesKey=( “assetProperty_HLSAlternates” ),
NSLocalizedDescription=The Internet connection appears to be offline.}
```

This variant load is used to determine available audio tracks, check for Dolby support, and apply user language preferences. After this step, the AVPlayerItem also fails via Combine's publisher for .status. However, retrying the entire process immediately afterwards (same offline conditions, same asset path, new AVURLAsset) results in successful playback.

Assets are represented using the following class:

```swift
public class DownloadedAsset: AVURLAsset {
    public let id: String
    public let localFileUrl: URL
    public let fairplayLicenseUrlString: String?
    public let drmToken: String?

    var isProtected: Bool {
        return fairplayLicenseUrlString != nil
    }

    public init(id: String,
                localFileUrl: URL,
                fairplayLicenseUrlString: String?,
                drmToken: String?) {
        self.id = id
        self.localFileUrl = localFileUrl
        self.fairplayLicenseUrlString = fairplayLicenseUrlString
        self.drmToken = drmToken
        super.init(url: localFileUrl, options: nil)
    }
}
```

We use user-selected quality levels to control bitrate and multichannel (e.g. Dolby 5.1) downloads:

```swift
let downloadQuality = UserDefaults.standard.downloadVideoQuality
let bitrate: Int
let shouldDownloadMultichannelTracks: Bool

switch downloadQuality {
case .dataSaver:
    shouldDownloadMultichannelTracks = false
    bitrate = 596564
case .standard:
    shouldDownloadMultichannelTracks = false
    bitrate = 1503844
case .best:
    shouldDownloadMultichannelTracks = true
    bitrate = 7038970
}

var selections = multichannelIdentifiedMediaSelections
if !shouldDownloadMultichannelTracks {
    selections = selections.filter { !$0.isMultichannel }
}

let task = session.aggregateAssetDownloadTask(
    with: asset,
    mediaSelections: selections.map { $0.mediaSelection },
    assetTitle: title,
    assetArtworkData: nil,
    options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: bitrate]
)
```

Seen on devices running iOS 16, iOS 17, and iOS 18.

What could cause the initial failure of an otherwise valid, offline-ready FairPlay HLS asset? Could .load(.variants) internally trigger a failed network resolution, even when offline? Is there an internal caching or initialization behavior in AVFoundation that might explain why the second attempt works? Any guidance would be appreciated.
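A workaround sketch grounded only in the behavior described above (a second attempt with a fresh AVURLAsset reliably succeeds); it does not explain the underlying -1009 failure:

```swift
import AVFoundation

// Sketch: retry the variant load with a freshly created asset, mirroring
// the manual retry that is observed to work offline.
func loadVariantsWithRetry(makeAsset: () -> AVURLAsset,
                           attempts: Int = 2) async throws -> [AVAssetVariant] {
    var lastError: Error?
    for attempt in 1...attempts {
        let asset = makeAsset() // new AVURLAsset each time, as in the manual retry
        do {
            return try await asset.load(.variants)
        } catch {
            lastError = error
            print("load(.variants) attempt \(attempt) failed: \(error)")
        }
    }
    throw lastError ?? CancellationError()
}
```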
Replies: 1 · Boosts: 0 · Views: 202 · Activity: Jun ’25