Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Clarification on SFSpeechRecognizer system alert message and service URLs for whitelisting
Hello Apple Engineers, I am developing a feature that uses SFSpeechRecognizer (import Speech), and I have two questions:

1. After adding the NSSpeechRecognitionUsageDescription key to my Info.plist, the system shows an authorization alert when I initialize an SFSpeechRecognizer instance. The alert contains a predefined message from Apple: “Speech data from this app will be sent to Apple to process your requests. This will also help Apple improve its speech recognition technology.” Is it possible to remove or customize this message?

2. My app runs in a network environment with a whitelist. Which URL does the SFSpeechRecognizer instance connect to, and on which port, so that I can add it to the whitelist?

Thank you very much for your support! Best regards, Yu Cheng
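For context, a minimal sketch of the authorization flow (my own illustration, not the poster's code), including the on-device option that keeps speech data off the network entirely where the device and locale support it, which may sidestep both questions:

import Speech

// Sketch: request authorization, then prefer on-device recognition.
// Assumption to verify: on-device support depends on device and locale.
func startRecognition() {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "zh-CN"))
        else { return }
        let request = SFSpeechAudioBufferRecognitionRequest()
        // Audio buffer appending omitted for brevity.
        if recognizer.supportsOnDeviceRecognition {
            request.requiresOnDeviceRecognition = true // no server round trip
        }
        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result { print(result.bestTranscription.formattedString) }
        }
    }
}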
Replies: 0 · Boosts: 0 · Views: 66 · Created: Sep ’25
Orientation does not work on iPhone 17 and above.
I'm receiving output from an AVCaptureSession and capturing an image using Vision, but the image comes out in landscape orientation instead of portrait. Even when I set the orientation to .up on the CIImage, CGImage, and UIImage, the image is still landscape. On iPhone 16 and earlier, the image is output in portrait orientation, but on iPhone 17 and later it is output in landscape. Please help.
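One thing worth trying (an assumption on my part, not a confirmed fix): since iOS 17 the orientation API moved from videoOrientation to videoRotationAngle, and newer hardware can have a different native sensor rotation, so setting the angle explicitly from a RotationCoordinator may help:

import AVFoundation

// Sketch: derive the capture rotation for the current device orientation
// and apply it to the output's video connection before capturing.
// Note: retain the coordinator if you also want KVO updates.
func applyRotation(device: AVCaptureDevice, output: AVCaptureOutput) {
    let coordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: nil)
    let angle = coordinator.videoRotationAngleForHorizonLevelCapture
    if let connection = output.connection(with: .video),
       connection.isVideoRotationAngleSupported(angle) {
        connection.videoRotationAngle = angle
    }
}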
Replies: 1 · Boosts: 1 · Views: 245 · Created: Sep ’25
Problems recording audio on Tahoe 26.0 (Intel only)
I have some tried-and-tested code that records and plays back audio via AUHAL which breaks on Tahoe on Intel. The same code works fine on Sequoia and also works on Tahoe on Apple Silicon. To start with something simple, the following code to request access to the microphone doesn't work as it should:

bool RequestMicrophoneAccess ()
{
    __block AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType: AVMediaTypeAudio];

    if (status == AVAuthorizationStatusAuthorized)
        return true;

    __block bool done = false;

    [AVCaptureDevice requestAccessForMediaType: AVMediaTypeAudio completionHandler: ^ (BOOL granted)
    {
        status = (granted) ? AVAuthorizationStatusAuthorized : AVAuthorizationStatusDenied;
        done = true;
    }];

    while (!done)
        CFRunLoopRunInMode (kCFRunLoopDefaultMode, 2.0, true);

    return status == AVAuthorizationStatusAuthorized;
}

On Tahoe on Intel, the code runs to completion but granted is always returned as NO. Tellingly, the popup asking the user to grant microphone access is never displayed, and the app never appears in the Privacy pane. On Apple Silicon, everything works fine. There are some other problems, but I'm hoping they have a common underlying cause and that the Apple guys can figure out what's wrong from the information in this post. I'd be happy to test any potential fix. Thanks.
Replies: 2 · Boosts: 0 · Views: 436 · Created: Sep ’25
Failed to change the TTS language to CN or TW
I have a question about TTS. My device's default language is zh-HK (Cantonese); the device is an iPhone 16 Pro on iOS 18.6. I created a function speakMandarin because I want the device to speak zh-CN (Putonghua); however, the device only speaks zh-HK (Cantonese), even though I already set the AVSpeechSynthesisVoice language to zh-CN.

func speakMandarin(text: String) {
    print("speakMandarin, \(text)")
    lastError = nil // Reset error

    // Stop any ongoing speech before starting new
    if synthesizer.isSpeaking {
        synthesizer.stopSpeaking(at: .immediate)
    }

    // Configure speech utterance
    let utterance = AVSpeechUtterance(ssmlRepresentation: text)!
    utterance.rate = 0.5 // Natural speaking speed
    utterance.pitchMultiplier = 1.0
    utterance.volume = 1.0
    utterance.voice = AVSpeechSynthesisVoice(language: "zh-CN")

    let preferredLanguages = ["zh-CN", "zh-TW"]
    var selectedVoice: AVSpeechSynthesisVoice?
    for lang in preferredLanguages {
        if let voice = AVSpeechSynthesisVoice(language: lang) {
            print(lang)
            selectedVoice = voice
            utterance.voice = voice
            break
        }
    }

    // If no Mandarin voice found, use system default
    if selectedVoice == nil {
        selectedVoice = AVSpeechSynthesisVoice(language: nil)
        lastError = "未偵測到普通話語音包,將使用系統預設語音"
        print(lastError)
    }
    utterance.voice = selectedVoice
    print(utterance)
    synthesizer.speak(utterance)
}

Here is my log:

speakMandarin, <speak>你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?</speak>
zh-CN
[AVSpeechUtterance 0x1194efb80] String: 你好!我們來聊聊喜歡的動物吧。你喜歡什麼動物呢?
Voice: [AVSpeechSynthesisVoice 0x104ceff90] Language: zh-CN, Name: Tingting, Quality: Default [com.apple.voice.compact.zh-CN.Tingting]
Rate: 0.50 Volume: 1.00 Pitch Multiplier: 1.00
Delays: Pre: 0.00(s) Post: 0.00(s)
Replies: 0 · Boosts: 0 · Views: 235 · Created: Sep ’25
CoreMediaErrorDomain -15628 playback failure in iOS 26 (React Native / AVPlayer, HLS stream)
Hi, after updating to iOS 26, our app is facing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.

Error: Domain[CoreMediaErrorDomain]:Code[-15628]:Desc[The operation couldn’t be completed.]:Underlying Error Domain[(null)]:Code[0]:Desc[(null)]

Environment:
iOS version: iOS 26
React Native: 0.69
Video library: react-native-video (AVPlayer under the hood)
Stream type: HLS (m3u8) with segment (.ts) files

Observed behaviour: Playback works initially on iOS 26, but the stream then fails at runtime after a few seconds/minutes (not on first load). Network logs show 307 redirects on some segment requests; after this, AVPlayer throws the above error. Playback fails intermittently on slow/unstable networks.
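For diagnosis, a small sketch (my addition, not from the report) that dumps the player item's HLS error log whenever a new entry arrives; the segment URI and status code it prints should show what precedes the -15628 failure, e.g. the 307 redirects:

import AVFoundation

// Sketch: log every HLS error-log event for a playing item.
func observeErrorLog(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(forName: .AVPlayerItemNewErrorLogEntry,
                                           object: item,
                                           queue: .main) { _ in
        guard let events = item.errorLog()?.events else { return }
        for event in events {
            print("domain: \(event.errorDomain), code: \(event.errorStatusCode), " +
                  "uri: \(event.uri ?? "-"), comment: \(event.errorComment ?? "-")")
        }
    }
}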
Replies: 5 · Boosts: 18 · Views: 1.1k · Created: Sep ’25
Convert CoreAudio AudioObjectID to IOUSB LocationID
Is there a recommended way on macOS 26 Tahoe to take a CoreAudio AudioObjectID and use it to look up the underlying USB LocationID?

I previously used the AudioObjectID to query the corresponding DeviceUID with kAudioDevicePropertyDeviceUID. Then I queried for the IOService matching kIOAudioEngineClassName with property kIOAudioEngineGlobalUniqueIDKey matching the DeviceUID, and I loaded kUSBDevicePropertyLocationID from the result. This fails on macOS 26 because the IO Registry entry for the device is provided by usbaudiod rather than AppleUSBAudioEngine, and usbaudiod does not include a kIOAudioEngineGlobalUniqueIDKey property (or any other property that maps it to a CoreAudio DeviceUID).

My use case is a piece of audio recording software that allows configuring a set of supported audio devices via USB HID prior to recording. I present the user with a list of CoreAudio devices to use, but without a way to look up the underlying USB LocationID, I cannot guarantee that the configured device matches the selected device (e.g. if the user plugged in two identical microphones).
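For concreteness, a sketch of the pre-Tahoe lookup described above (the property and class names are the ones the post mentions; the exact traversal is my assumption to verify):

import CoreAudio
import IOKit

// Sketch: AudioObjectID -> DeviceUID -> matching IOAudioEngine -> locationID.
func locationID(for deviceID: AudioObjectID) -> UInt32? {
    // 1. AudioObjectID -> DeviceUID
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyDeviceUID,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var uid: CFString = "" as CFString
    var size = UInt32(MemoryLayout<CFString>.size)
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &uid) == noErr
    else { return nil }

    // 2. Find the IOAudioEngine whose GlobalUniqueID matches the DeviceUID.
    var iterator: io_iterator_t = 0
    guard IOServiceGetMatchingServices(kIOMainPortDefault,
                                       IOServiceMatching("IOAudioEngine"),
                                       &iterator) == KERN_SUCCESS else { return nil }
    defer { IOObjectRelease(iterator) }

    var service = IOIteratorNext(iterator)
    while service != 0 {
        let guid = IORegistryEntryCreateCFProperty(
            service, "IOAudioEngineGlobalUniqueID" as CFString,
            kCFAllocatorDefault, 0)?.takeRetainedValue() as? String
        if guid == (uid as String) {
            // 3. Walk up the registry to the USB device's locationID.
            let location = IORegistryEntrySearchCFProperty(
                service, kIOServicePlane, "locationID" as CFString,
                kCFAllocatorDefault,
                IOOptionBits(kIORegistryIterateRecursively | kIORegistryIterateParents))
            IOObjectRelease(service)
            return (location as? NSNumber)?.uint32Value
        }
        IOObjectRelease(service)
        service = IOIteratorNext(iterator)
    }
    return nil
}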
Replies: 2 · Boosts: 0 · Views: 517 · Created: Sep ’25
Memory leak when processing stereoscopic video frames, makeMutablePixelBuffer()
Hi, I downloaded and ran https://developer.apple.com/documentation/realitykit/rendering-stereoscopic-video-with-realitykit and noticed that memory usage grows linearly. I replaced the sample video with a different 8K side-by-side video, and the app crashed almost immediately due to the memory leak. It looks like the culprit is the makeMutablePixelBuffer() function: the allocated pixel buffers are not recycled after being used. The screenshot is from a physical device.
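Since the sample's makeMutablePixelBuffer() allocates a fresh buffer per frame, one workaround to sketch (my assumption, not the sample's code) is a CVPixelBufferPool, which recycles buffers once all references to them are released:

import CoreVideo

// Sketch: a pool-backed allocator; buffers return to the pool for reuse
// when released, instead of growing memory with a new allocation per frame.
final class PixelBufferRecycler {
    private let pool: CVPixelBufferPool

    init?(width: Int, height: Int) {
        let attrs: [CFString: Any] = [
            kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey: width,
            kCVPixelBufferHeightKey: height,
            kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
        ]
        var newPool: CVPixelBufferPool?
        guard CVPixelBufferPoolCreate(kCFAllocatorDefault, nil,
                                      attrs as CFDictionary, &newPool) == kCVReturnSuccess,
              let newPool else { return nil }
        self.pool = newPool
    }

    func makeBuffer() -> CVPixelBuffer? {
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
        return buffer
    }
}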
Replies: 0 · Boosts: 0 · Views: 320 · Created: Sep ’25
iPhone 17 smart framing API not working
I tried to modify the AVCam sample code by copying the code from https://developer.apple.com/documentation/avfoundation/adopting-smart-framing-in-your-camera-app#Configure-the-smart-framing-monitor. I can confirm that the activeFormat supports smart framing, but the supported frames in the monitor are always nil. In another project of mine the value is populated, but the observation is never triggered; I then tried printing the recommended frame continuously, and it's always nil. Could the engineers embed the code into AVCam rather than posting a few code snippets?
Replies: 0 · Boosts: 0 · Views: 146 · Created: Sep ’25
Disabling Hardware OIS via AVFoundation — Clarification on AVCaptureVideoStabilizationMode
Hello everyone, I'm looking for a definitive clarification on how to completely disable all video stabilization, including the hardware OIS, using AVFoundation. The goal is to achieve a completely raw, unstabilized video feed, which is crucial when using external equipment like gimbals to avoid conflicting stabilization motions.

My research points to using the AVCaptureConnection property preferredVideoStabilizationMode and setting it to AVCaptureVideoStabilizationMode.off. The documentation for the .off case states: "A mode that doesn’t stabilize video capture." This description is slightly ambiguous: it's unclear whether this only affects software-level stabilization (EIS, EIS+OIS, etc.) or if it guarantees the complete deactivation of the physical OIS module. For professional video applications, this is a critical distinction.

So, I'd like to ask the community:

Has anyone been able to definitively confirm that setting preferredVideoStabilizationMode to .off also disables the hardware OIS?
Are there any known tests or documentation that prove this behavior?
Is there an alternative or more direct method to ensure the OIS module is physically inactive during video capture?
What is the community's best practice for ensuring absolutely no stabilization is applied to the video pipeline?

Any insights or shared experiences on this topic would be greatly appreciated. Thank you!
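For reference, a minimal sketch of the software-side setting under discussion; the API itself is documented, but whether it parks the physical OIS module is exactly the open question:

import AVFoundation

// Sketch: request no stabilization on the output's video connection.
// This disables the stabilization *mode*; whether hardware OIS is also
// deactivated is what this post is asking.
func disableStabilization(on output: AVCaptureOutput) {
    if let connection = output.connection(with: .video),
       connection.isVideoStabilizationSupported {
        connection.preferredVideoStabilizationMode = .off
    }
}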
Replies: 0 · Boosts: 1 · Views: 264 · Created: Sep ’25
-46250 error when calling `makeSecureTokenForExpirationDateOfPersistableContentKey`
Hi there,

We're working on offline playback of DRM tracks. The persistent keys (also known as track licenses) for offline playback are stored locally on the device and are served from cache when a user initiates playback of a downloaded track. Our persistent keys have a limited validity time and need to be refreshed when they expire. To prevent a situation where a persistent key expires while the user is offline, we've decided to eagerly refresh these keys one week before their expiration date. To make that happen, we need to be able to obtain the expiration date of a given track license.

We've been attempting to use the makeSecureTokenForExpirationDateOfPersistableContentKey API to facilitate this process. The documentation states that this API returns a secret token representing the persistent key, which we can then exchange with our license server for the expiration date: https://developer.apple.com/documentation/avfoundation/avcontentkeysession/makesecuretokenforexpirationdate(ofpersistablecontentkey:completionhandler:)?language=objc

However, every time we call makeSecureTokenForExpirationDateOfPersistableContentKey, we receive an error with code -46250. We haven't been able to find any public references or documentation for this specific error code, which is preventing us from troubleshooting the issue. We are conducting our tests on a physical device, as the Simulator does not support FairPlay playback. We don't use a dual-expiry approach.

Is our understanding of how to obtain the expiration timestamp correct? Are we using the makeSecureTokenForExpirationDateOfPersistableContentKey API as it was intended? What does the -46250 error code mean, and what steps should we take to fix our FairPlay implementation to make this work?

Thanks in advance for your assistance.
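For reference, a minimal sketch of the call in question (mirroring the documented Swift signature; the stored key data is whatever the app persisted at download time):

import AVFoundation

// Sketch: ask for the secure expiration token for a stored persistable key.
// A failure surfaces as an NSError (the post reports code -46250 here).
func fetchExpirationToken(session: AVContentKeySession, persistableKeyData: Data) {
    session.makeSecureTokenForExpirationDate(ofPersistableContentKey: persistableKeyData) { token, error in
        if let token {
            // Exchange this token with the license server for the expiry date.
            print("secure token: \(token.base64EncodedString())")
        } else if let error {
            print("token request failed: \(error)")
        }
    }
}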
Replies: 1 · Boosts: 0 · Views: 256 · Created: Sep ’25
macOS Tahoe 26.0 (25A354) sound glitches when opening the Simulator app
Hey there, I just upgraded to macOS Tahoe on an Apple MacBook Pro 2019 16-inch. I'm using IntelliJ IDEA and Flutter to develop a mobile app, which I test in the Simulator app running iOS 18.4.

The issue: when I start the Simulator app (both during its loading phase and once it is running), the audio from an already-open YouTube tab in Safari glitches and becomes noise (this happens in Chrome as well). A fix I found online is to kill the audio daemon on macOS with the command sudo killall coreaudiod. This kills the audio process (while the Simulator is running); macOS then restarts the audio daemon, and the audio works fine alongside the open Simulator. I just want to ask: is there a permanent fix for this? Is Apple working on a fix in an upcoming update?
Replies: 3 · Boosts: 5 · Views: 1.2k · Created: Sep ’25
CoreMediaErrorDomain error -12848
Good day. A video I created via iOS AVAssetWriter with the following settings:

let videoWriterInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 2_000_000,
            AVVideoMaxKeyFrameIntervalKey: 30
        ],
    ]
)

let audioWriterInput = AVAssetWriterInput(
    mediaType: .audio,
    outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 2,
        AVSampleRateKey: 44100,
        AVEncoderBitRateKey: 128000
    ]
)

When it is split into fMP4 HLS format using ffmpeg, the video cannot be played on iOS, failing with the following error:

CoreMediaErrorDomain error -12848

However, the video plays normally in Android, browser HLS players, and also VLC Media Player. Please assist. Thank you.
Replies: 1 · Boosts: 0 · Views: 372 · Created: Sep ’25
Crash iOS 26.0: [__NSSingleObjectArrayI selectedMediaOptionInMediaSelectionGroup:]: unrecognized selector sent to instance
I'm having a crash in an app that plays videos when the user activates closed captions. I was able to replicate the issue in an empty project. The crash happens when the AVPlayerLayer is used to instantiate an AVPictureInPictureController. This is the example project where I reproduced the crash:

struct ContentView: View {
    var body: some View {
        VStack {
            VideoPlaylistView()
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color.black.ignoresSafeArea())
    }
}

class VideoPlaylistViewModel: ObservableObject {
    // Test with other videos
    var player: AVPlayer? = AVPlayer(url: URL(string: "https://d2ufudlfb4rsg4.cloudfront.net/newsnation/WIpkLz23h/adaptive/WIpkLz23h_master.m3u8")!)
}

struct VideoPlaylistView: View {
    @StateObject var viewModel = VideoPlaylistViewModel()

    var body: some View {
        ScrollView {
            VideoCellView(player: viewModel.player)
                .onAppear {
                    viewModel.player?.play()
                }
        }
        .scrollTargetBehavior(.paging)
        .ignoresSafeArea()
    }
}

struct VideoCellView: View {
    let player: AVPlayer?
    @State var isCCEnabled: Bool = false

    var body: some View {
        ZStack {
            PlayerView(player: player)
                .accessibilityIdentifier("Player View")
        }
        .containerRelativeFrame([.horizontal, .vertical])
        .overlay(alignment: .bottom) {
            Button {
                player?.currentItem?.asset.loadMediaSelectionGroup(for: .legible) { group, error in
                    if let group {
                        let option = !isCCEnabled ? group.options.first : nil
                        player?.currentItem?.select(option, in: group)
                        isCCEnabled.toggle()
                    }
                }
            } label: {
                Text("Close Captions")
                    .font(.subheadline)
                    .foregroundStyle(isCCEnabled ? .red : .primary)
                    .buttonStyle(.bordered)
                    .padding(8)
                    .background(Color.blue.opacity(0.75))
            }
            .padding(.bottom, 48)
            .accessibilityIdentifier("Button Close Captions")
        }
    }
}

import Foundation
import UIKit
import SwiftUI
import AVFoundation
import AVKit

struct PlayerView: UIViewRepresentable {
    let player: AVPlayer?

    func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
    }

    func makeUIView(context: Context) -> UIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.layer.addSublayer(view.playerLayer)
        view.layer.backgroundColor = UIColor.red.cgColor
        view.pipController = AVPictureInPictureController(playerLayer: view.playerLayer)
        view.pipController?.requiresLinearPlayback = true
        view.pipController?.canStartPictureInPictureAutomaticallyFromInline = true
        view.pipController?.delegate = view
        return view
    }
}

class PlayerUIView: UIView, AVPictureInPictureControllerDelegate {
    let playerLayer = AVPlayerLayer()
    var pipController: AVPictureInPictureController?

    override init(frame: CGRect) {
        super.init(frame: frame)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
        playerLayer.backgroundColor = UIColor.green.cgColor
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: any Error) {
        print("Error starting Picture in Picture: \(error.localizedDescription)")
    }
}

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setCategory(.playback, mode: .moviePlayback)
            try audioSession.setActive(true)
        } catch {
            print("ERR: \(error.localizedDescription)")
        }
        return true
    }
}

UI test that makes the app crash:

final class VideoPlaylistSampleUITests: XCTestCase {
    func testCrashiOS26ToggleCloseCaptions() throws {
        let app = XCUIApplication()
        app.launch()

        let videoPlayer = app.otherElements["Player View"]
        XCTAssertTrue(videoPlayer.waitForExistence(timeout: 30))

        let closeCaptionButton = app.buttons["Button Close Captions"]
        for _ in 0..<2000 {
            closeCaptionButton.tap()
        }
    }
}
Replies: 0 · Boosts: 5 · Views: 299 · Created: Sep ’25
Unable to use writeHEIFRepresentation - failed to add image to the PhotoCompressionSession.
I'm getting an error writing a CIImage as a HEIF image:

// Create CIImage directly from pixel buffer
let ciImage = CIImage(cvPixelBuffer: pixelBuffer, options: [CIImageOption.properties: combinedMetadata])

// Write HEIC synchronously
do {
    try ciContext.writeHEIFRepresentation(of: ciImage, to: url, format: .RGBA8, colorSpace: colorSpace)
} catch {
    print(error)
}

The error I'm getting is:

Error Domain=CINonLocalizedDescriptionKey Code=3 "(null)" UserInfo={CINonLocalizedDescriptionKey=failed to write HEIC data to file., NSUnderlyingError=0x11b1a1ec0 {Error Domain=CINonLocalizedDescriptionKey Code=10 "(null)" UserInfo={CINonLocalizedDescriptionKey=failed to add image to the PhotoCompressionSession.}}}

Both

try ciContext.writeJPEGRepresentation(of: copiedCIImage, to: url, colorSpace: colorSpace, options: options)

and

try ciContext.writePNGRepresentation(of: copiedCIImage, to: url, format: .RGBA8, colorSpace: colorSpace)

work. I also verified that the code works on iOS 18. Is there something new we need to do for HEIF images? Thanks in advance.
Replies: 7 · Boosts: 0 · Views: 625 · Created: Sep ’25
AVPlayer loading performance problem in iOS 26
Hi, I have an app that displays tens of short (<1 MB) MP4 videos, stored on a remote server, in a vertical UICollectionView that has horizontally scrollable sections. I cache all MP4 files on disk after downloading, and I also have an in-memory cache that holds a limited number (around 30) of players. The players are simple views that wrap an AVPlayerLayer and its AVPlayerItem, along with a few additional UI components.

Scrolling performance was good before iOS 26, but with the iOS 26 release I noticed significant stuttering during scrolling while creating players with a file URL. It happens even if I use the same video file cached on disk for every cell for testing. I also started getting this kind of log message after the players are deinitialized:

<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1107
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095
<<<< PlayerRemoteXPC >>>> signalled err=-12785 at <>:1095

There's also another log message that I see occasionally, but I don't know what triggers it:

<< FigXPC >> signalled err=-16152 at <>:1683

Has anyone else experienced this kind of problem with the latest release? I'm also wondering about the best way to resolve it. I could increase the size of the memory cache to something large like 100, but I'm not sure that is an acceptable solution because: 1) there would be 100 player instances in memory at all times, and 2) there would still be stuttering during the initial loading of the videos from the web. Any help is appreciated!
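One mitigation worth sketching (an assumption, not a verified fix; AssetCache is a hypothetical helper name): cache AVURLAsset instances rather than whole players, so each cell builds a cheap AVPlayerItem from an already-opened asset instead of re-opening the file:

import AVFoundation

// Sketch: an NSCache of assets keyed by file URL; NSCache evicts
// automatically under memory pressure, unlike a plain dictionary.
final class AssetCache {
    static let shared = AssetCache()
    private let cache = NSCache<NSURL, AVURLAsset>()

    func item(for fileURL: URL) -> AVPlayerItem {
        if let asset = cache.object(forKey: fileURL as NSURL) {
            return AVPlayerItem(asset: asset)
        }
        let asset = AVURLAsset(url: fileURL)
        cache.setObject(asset, forKey: fileURL as NSURL)
        return AVPlayerItem(asset: asset)
    }
}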
Replies: 1 · Boosts: 0 · Views: 328 · Created: Sep ’25
Blurry Depth Data since iPhone 13
I tested the accuracy of the depth map on iPhone 12, 13, 14, 15, and 16, and found that the variance of the depth map on models after the iPhone 12 is significantly greater than on the iPhone 12 itself. Enabling depth filtering causes the depth data to be affected by the previous frame, adding more unnecessary noise, especially when the phone is moving. This is not friendly for high-precision reconstruction. I tried adding depth-map smoothing in post-processing to compensate for the large deviation, but the results are still poor. Are there any depth-map smoothing solutions already announced by Apple?
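For what it's worth, the filtering toggle mentioned above, assuming the capture pipeline uses AVCaptureDepthDataOutput (my assumption): with filtering disabled, each depth map is delivered without temporal smoothing from the previous frame:

import AVFoundation

// Sketch: add a depth output with temporal filtering turned off.
func addDepthOutput(to session: AVCaptureSession) -> AVCaptureDepthDataOutput? {
    let depthOutput = AVCaptureDepthDataOutput()
    guard session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(depthOutput)
    depthOutput.isFilteringEnabled = false // no frame-to-frame smoothing
    return depthOutput
}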
Replies: 1 · Boosts: 0 · Views: 70 · Created: Sep ’25
ALAssetsLibrary build error in iOS 26
macOS version: 26.0 (25A354)
Xcode version: Version 26.0 (17A324)

The project fails to compile with the following errors:

SwiftExplicitDependencyCompileModuleFromInterface arm64 /Users/zhz/Library/Developer/Xcode/DerivedData/ModuleCache.noindex/AssetsLibrary-HTIJ05N58KN3.swiftmodule

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:10:25: error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
 8 | public import _StringProcessing
 9 | public import _SwiftConcurrencyShims
10 | extension AssetsLibrary.ALAssetsLibrary {
   |                         `- error: 'ALAssetsLibrary' is unavailable in iOS: Use PHPhotoLibrary from the Photos framework instead
11 | #if compiler(>=5.3) && $NonescapableTypes
12 | @available(iOS, introduced: 9.0, deprecated: 9.0, obsoleted: 26.0)

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/System/Library/Frameworks/AssetsLibrary.framework/Headers/ALAssetsLibrary.h:80:12: note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
78 |
79 | OS_EXPORT AL_DEPRECATED(4, "Use PHPhotoLibrary from the Photos framework instead")
80 | @interface ALAssetsLibrary : NSObject {
   |            `- note: 'ALAssetsLibrary' was obsoleted in iOS 26.0
81 | @package
82 | id _internal;

/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS26.0.sdk/usr/lib/swift/AssetsLibrary.swiftmodule/arm64e-apple-ios.swiftinterface:1:1: error: failed to build module 'AssetsLibrary'; this SDK is not supported by the compiler (the SDK is built with 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.17.14 clang-1700.3.17.1)', while this compiler is 'Apple Swift version 6.2 effective-5.10 (swiftlang-6.2.0.19.9 clang-1700.3.19.1)'). Please select a toolchain which matches the SDK.
Replies: 3 · Boosts: 4 · Views: 1.3k · Created: Sep ’25
Metal CIKernel instances with arbitrarily structured data arguments
Hi,

In the iOS 13 and macOS Catalina release notes it says: "Metal CIKernel instances now support arguments with arbitrarily structured data." I've been trying to use this functionality in a CIKernel with mixed results. I'm particularly interested in passing data in the form of a dynamically sized array. It seems to work up to a certain size; beyond that threshold the excess data is discarded and the kernel becomes unstable. I assume there is some kind of memory-alignment issue going on, but I've tried various types in my array and always get a similar result.

I have not found any documentation or sample code regarding this. It would be great to know how this is intended to work and what the limitations are. In the forums there are two similar unanswered questions about data arguments, so I'm sure there are a few out there with similar issues. Thanks!

Michael
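For anyone hitting the same wall, here is roughly the pattern in question: the NSData packing on the Swift side is shown below; the matching Metal parameter (e.g. constant float* plus a count) is my assumption of how the feature is meant to be used, not documented behavior:

import CoreImage

// Sketch: pass a dynamically sized float array to a CI Metal kernel as Data.
// The kernel is assumed to declare e.g. `constant float *values, int count`.
func applyKernel(_ kernel: CIKernel, to image: CIImage, values: [Float]) -> CIImage? {
    let data = values.withUnsafeBufferPointer { Data(buffer: $0) }
    return kernel.apply(extent: image.extent,
                        roiCallback: { _, rect in rect },
                        arguments: [image, data, values.count])
}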
Replies: 5 · Boosts: 0 · Views: 438 · Created: Sep ’25