Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic


iOS - AVAudioRecorder fails to record
Hi, I'm trying to record audio on the iPhone with AVAudioRecorder and Xcode 26.0.1. Maybe the problem is that I can't record audio in the simulator, but the simulator does have an audio menu. In the plist I added 'Privacy - Microphone Usage Description', and I ask for permission before recording:

```swift
if await AVAudioApplication.requestRecordPermission() {
    print("permission granted")
    recordPermission = true
} else {
    print("permission denied")
}
```

Permission is granted.

```swift
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 12000,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
recorder = try AVAudioRecorder(url: filename, settings: settings)
let prepared = recorder.prepareToRecord()
print("prepared started: \(prepared)")
let started = recorder.record()
print("recording started: \(started)")
```

`started` is always false, and I have tried many settings. Error messages:

```
AddInstanceForFactory: No factory registered for id <CFUUID 0x600000211480> F8BB1C28-BAE8-11D6-9C31-00039315CD46
AudioConverter.cpp:1052 Failed to create a new in process converter -> from 0 ch, 12000 Hz, .... (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame to 1 ch, 12000 Hz, aac (0x00000000) 0 bits/channel, 0 bytes/packet, 1024 frames/packet, 0 bytes/frame, with status -50
AudioQueueObject.cpp:1892 BuildConverter: AudioConverterNew returned -50 from: 0 ch, 12000 Hz, .... (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame to: 1 ch, 12000 Hz, aac (0x00000000) 0 bits/channel, 0 bytes/packet, 1024 frames/packet, 0 bytes/frame
prepared started: true
AudioQueueObject.cpp:7581 ConvertInput: aq@0x10381be00: AudioConverterFillComplexBuffer returned -50, packetCount 5
recording started: false
```

All the examples I can find look the same, so apparently something else must be different.
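The "from 0 ch" in the converter errors says the input side of the recording has no channels at all, which is what an unconfigured audio session (or a simulator without an input device) looks like, and the snippet above never configures the shared AVAudioSession for recording. A minimal sketch of the session setup that would normally precede AVAudioRecorder use, assuming the poster's `recorder` and `filename` stay as shown:

```swift
import AVFAudio

// Hypothetical missing step, not from the original post: configure and
// activate the shared session for recording before creating the recorder.
func activateRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    // The default session category is playback-only; without .record or
    // .playAndRecord the input route exposes zero channels.
    try session.setCategory(.playAndRecord, mode: .default)
    try session.setActive(true)
}
```

If the session is already configured and the failure only reproduces in the simulator, testing on a physical device is the usual next step, since simulator audio input support is limited.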
Replies: 1 · Boosts: 0 · Views: 226 · Created: Oct ’25
Help! Green Video stream from iPhone 17 Pro/Pro Max with WebRTC
I'm at my wit's end with a problem I'm facing while developing an app. The app sends video captured on an iPhone to a browser application for real-time display. It works on many older iPhone models, but whenever I test it on an iPhone 17 Pro or 17 Pro Max, the video displayed in the browser becomes a solid green screen, or a colorful, garbled image that is mostly green. My main suspicion is an encoding failure: the resolution of the ultra-wide and telephoto cameras was significantly increased on the 17 Pro and Pro Max (from 12 MP to 48 MP), and this may be overwhelming the encoder. I'm really hoping someone here has encountered a similar issue or has any suggestions. I'm open to any information or ideas you might have. Please help!

Environment information:
WebRTC library: GoogleWebRTC version 1.1 (via CocoaPods)
Signaling server: AWS Kinesis Video Streams
Problem occurs on:
Model: iPhone18,1, OS: 26.0
Model: iPhone18,1, OS: 26.1
Works fine on:
Many models before iPhone17,5
Model: iPhone18,1, OS: 26.0
Model: iPhone18,3, OS: 26.0
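If the 48 MP default format is the culprit, a quick way to test that hypothesis is to pin the capture device to a format the encoder is known to handle before frames ever reach WebRTC. A hedged sketch; the function and the 1920-pixel cap are illustrative assumptions, not the poster's code:

```swift
import AVFoundation

// Hypothetical experiment: pin the camera to a format at or below a
// known-good width instead of the device default.
func pinFormat(of device: AVCaptureDevice, maxWidth: Int32 = 1920) throws {
    // Pick the highest-resolution format that stays at or below maxWidth.
    let candidate = device.formats
        .filter { CMVideoFormatDescriptionGetDimensions($0.formatDescription).width <= maxWidth }
        .max { a, b in
            CMVideoFormatDescriptionGetDimensions(a.formatDescription).width <
            CMVideoFormatDescriptionGetDimensions(b.formatDescription).width
        }
    guard let format = candidate else { return }
    try device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()
}
```

If the green frames disappear with a capped format, that localizes the failure to the encoder's handling of the new sensor resolutions rather than the WebRTC transport.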
Replies: 0 · Boosts: 0 · Views: 107 · Created: Oct ’25
Video shot on an iPhone 17 Pro / 17 Pro Max turns green in the browser when sent via WebRTC
I'm developing an app that sends video captured on an iPhone to a browser application and displays it on screen. When the app is used on an iPhone 17 Pro or 17 Pro Max, the video shown on the browser side becomes solid green, or a colorful image that is mostly green. Looking into it, the pixel counts of the ultra-wide and telephoto cameras changed on the 17 Pro and 17 Pro Max (12 MP → 48 MP), so I suspect encoding is failing. Any information at all would be appreciated.

Environment information:
WebRTC library: GoogleWebRTC version 1.1 (installed via CocoaPods)
Signaling server: AWS Kinesis Video Streams
Devices where the problem occurs:
Model: iPhone18,1, OS: 26.0
Model: iPhone18,1, OS: 26.1
Devices where the problem does not occur:
Many models before iPhone17,5
Model: iPhone18,1, OS: 26.0
Model: iPhone18,3, OS: 26.0
Replies: 2 · Boosts: 0 · Views: 186 · Created: Oct ’25
Enterprise API passthrough in screen capture not working after visionOS 26 update
We have been using the Enterprise API's passthrough-in-screen-capture capability with a broadcast upload extension. This worked in visionOS 2.2, but after updating to visionOS 26 it no longer does: a few seconds after the broadcast session starts, it fails with "Invalid broadcast session started." Is there a bug filed for this, or is it a known issue?
Replies: 2 · Boosts: 0 · Views: 387 · Created: Oct ’25
"No signal" message when connecting LG tv via HDM
Hi everyone, I am currently on macOS Tahoe (26.1), and for some weird reason my Mac is not connecting via HDMI. To be accurate: it is connecting, and the LG TV shows up in the Displays settings, but no image appears on it, and I have no idea why. This used to work; I've tried this cable before with the same exact TV. The cable is a basic Amazon Basics HDMI one. Allow me to advance this question a little: Terminal-level suggestions are welcome, whereas basic checks like "have you connected it right" have already been covered.
Replies: 4 · Boosts: 0 · Views: 664 · Created: Oct ’25
How to record voice, auto-transcribe, translate (auto-detect input language), and play back translated audio on the same device in iOS Swift?
Hi everyone 👋 I'm building an iOS app in Swift where I want to do the following:
- Record the user's voice
- Transcribe the spoken sentence (speech-to-text)
- Auto-detect the spoken language
- Translate it to another language selected by the user (e.g., English → Spanish or Hindi → English)
- Speak back (text-to-speech) the translated text on the same device

Is it possible to record via the phone mic and play the transcribed voice into the headphones' audio? A sketch of the non-translation steps follows below.
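All of the non-translation pieces have first-party APIs (Speech, NaturalLanguage, AVFoundation). Below is a minimal sketch under some assumptions: the class and method names are mine (not Apple sample code), the audio is already recorded to a file, and the translation hop (a translation framework or server API) is left out. Since SFSpeechRecognizer needs a locale up front, one workable pattern is to transcribe with a default locale, detect the dominant language of the transcript with NLLanguageRecognizer, and re-run recognition if the detection disagrees.

```swift
import Speech
import NaturalLanguage
import AVFoundation

// Hypothetical sketch; names are illustrative. The translation step is omitted.
final class SpeakBackPipeline {
    private let synthesizer = AVSpeechSynthesizer()

    // 1. Speech-to-text from a recorded file.
    func transcribe(fileURL: URL, locale: Locale,
                    completion: @escaping (String?) -> Void) {
        guard let recognizer = SFSpeechRecognizer(locale: locale) else {
            completion(nil)
            return
        }
        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        // In real code, keep a reference to the task so it can be cancelled.
        _ = recognizer.recognitionTask(with: request) { result, error in
            guard let result, result.isFinal else { return }
            completion(result.bestTranscription.formattedString)
        }
    }

    // 2. Auto-detect the transcript's language.
    func detectLanguage(of text: String) -> NLLanguage? {
        NLLanguageRecognizer.dominantLanguage(for: text)
    }

    // 3. Text-to-speech on the same device.
    func speak(_ text: String, languageCode: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
        synthesizer.speak(utterance)
    }
}
```

Speech recognition requires its own authorization (SFSpeechRecognizer.requestAuthorization) and the "Privacy - Speech Recognition Usage Description" plist key, in addition to microphone permission. Recording from the phone mic while playing synthesized audio into headphones is an AVAudioSession category/routing question rather than something these APIs control.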
Replies: 0 · Boosts: 0 · Views: 192 · Created: Oct ’25
[AVFCore] iOS 26.0 EXC_BAD_ACCESS from _customCompositorShouldCancelPendingFrames
Hi, I'm working on a video editing app that lets you composite and export videos. I use a custom compositor to apply my effects, etc. In my crash dashboard, I'm seeing reports of an EXC_BAD_ACCESS crash from objc_msgSend. Below is the stack trace.

```
libobjc.A.dylib objc_msgSend
libdispatch.dylib _dispatch_sync_invoke_and_complete_recurse
libdispatch.dylib _dispatch_sync_f_slow
[symbolication failed]
libdispatch.dylib _dispatch_client_callout
libdispatch.dylib _dispatch_lane_barrier_sync_invoke_and_complete
AVFCore -[AVCustomVideoCompositorSession(AVCustomVideoCompositorSession_FigCallbackHandling) _customCompositorShouldCancelPendingFrames]
AVFCore _customCompositorShouldCancelPendingFramesCallback
MediaToolbox remoteVideoCompositor_HandleVideoCompositorClientMessage
CoreMedia __figXPCConnection_CallClientMessageHandlers_block_invoke
libdispatch.dylib _dispatch_call_block_and_release
libdispatch.dylib _dispatch_client_callout
libdispatch.dylib _dispatch_lane_serial_drain
libdispatch.dylib _dispatch_lane_invoke
libdispatch.dylib _dispatch_root_queue_drain_deferred_wlh
libdispatch.dylib _dispatch_workloop_worker_thread
libsystem_pthread.dylib _pthread_wqthread
libsystem_pthread.dylib start_wqthread
```

What stands out is that this is only reported from iOS 26.0+ devices. Part of the stack trace failed to symbolicate ([symbolication failed]). I'm 90% confident this is Apple code, not my app's code, and I cannot reproduce it locally. Is this a known issue? What are the possible root causes, and how can I verify or eliminate them? Thanks!
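Since the crash sits in the _customCompositorShouldCancelPendingFrames callback path, one plausible app-side root cause to rule out is a race in the custom compositor's own cancellation handling, where a composition request is finished on one thread while cancelAllPendingVideoCompositionRequests touches it on another. A hedged sketch of a serialized cancellation path; the names and structure are illustrative, not the poster's code:

```swift
import AVFoundation

// Hypothetical sketch: serialize access to pending requests so that
// cancellation and render completion never race (a common source of
// EXC_BAD_ACCESS in objc_msgSend).
final class MyCompositor: NSObject, AVVideoCompositing {
    private let queue = DispatchQueue(label: "compositor.requests")
    private var pending: [AVAsynchronousVideoCompositionRequest] = []

    var sourcePixelBufferAttributes: [String: Any]? {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }
    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        queue.async {
            self.pending.append(request)
            // ... render, then finish the request and remove it from pending ...
        }
    }

    func cancelAllPendingVideoCompositionRequests() {
        // Called by AVFoundation (see the stack trace above); must not
        // touch requests another thread is concurrently finishing.
        queue.sync {
            self.pending.forEach { $0.finishCancelledRequest() }
            self.pending.removeAll()
        }
    }
}
```

If the compositor is already serialized like this, the remaining suspects are framework-side, and a feedback report with the full crash log is the usual route.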
Replies: 0 · Boosts: 0 · Views: 82 · Created: Oct ’25
macOS Tahoe: Can't set up AVAudioEngine with playthrough
Hi, I'm trying to set up an AVAudioEngine for USB audio recording with monitoring playthrough. As soon as I try to set up playthrough, I get an error in the console:

```
AVAEInternal.h:83 required condition is false: [AVAudioEngineGraph.mm:1361:Initialize: (IsFormatSampleRateAndChannelCountValid(outputHWFormat))]
```

Any ideas how to fix it?

```swift
// Set the input device
try? setupInputDevice(deviceID: inputDevice)
let input = audioEngine.inputNode

// Force a stereo format
let inputHWFormat = input.inputFormat(forBus: 0)
let stereoFormat = AVAudioFormat(commonFormat: inputHWFormat.commonFormat,
                                 sampleRate: inputHWFormat.sampleRate,
                                 channels: 2,
                                 interleaved: inputHWFormat.isInterleaved)
guard let format = stereoFormat else {
    throw AudioError.deviceSetupFailed(-1)
}
print("Input format: \(inputHWFormat)")
print("Forced stereo format: \(format)")

audioEngine.attach(monitorMixer)
audioEngine.connect(input, to: monitorMixer, format: format)

// MonitorMixer -> MainMixer (output)
// Problem here; format: format also breaks.
audioEngine.connect(monitorMixer, to: audioEngine.mainMixerNode, format: nil)
```
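The failed assertion names outputHWFormat, so it may be the output side, not the forced stereo input format, that is invalid at connect time (for example, a device reporting 0 Hz or 0 channels while switching). A small diagnostic sketch, assuming the same audioEngine instance; this is a check, not a confirmed fix:

```swift
import AVFAudio

// Hypothetical diagnostic: inspect the output hardware format before
// wiring the graph. A 0 Hz or 0-channel value here would explain the
// IsFormatSampleRateAndChannelCountValid failure.
func outputFormatLooksValid(in engine: AVAudioEngine) -> Bool {
    let outputHWFormat = engine.outputNode.outputFormat(forBus: 0)
    print("Output HW format: \(outputHWFormat)")
    // If this returns false, the output device isn't ready yet:
    // reselect it or defer graph setup until it reports a real format.
    return outputHWFormat.sampleRate > 0 && outputHWFormat.channelCount > 0
}
```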
Replies: 0 · Boosts: 0 · Views: 168 · Created: Oct ’25
How to safely switch between mic configurations on iOS?
I have an iPadOS M-processor application with two different running configurations. In config1, the shared AVAudioSession is configured for .videoChat mode using the built-in microphone, the input/output nodes of the AVAudioEngine are configured with voice processing enabled, and the built-in mic is formatted for 1 channel at 48 kHz. In config2, the shared AVAudioSession is configured for .measurement mode using an external USB microphone, the input/output nodes of the AVAudioEngine are configured with voice processing disabled, and the external mic is formatted for 2 channels at 44.1 kHz.

I've written a configuration manager designed to safely switch between these two configurations. It works by stopping AVAudioEngine and detaching all but the input and output nodes, updating the shared audio session for the desired mic and sample rates, and setting the appropriate state for voice processing, either true or false as required by the configuration. Finally, the new audio graph is constructed by attaching the appropriate nodes, connecting them, and restarting AVAudioEngine.

I'm experiencing what I believe is a race condition between switching voice processing on or off and then trying to rebuild and start the new audio graph. Even though notifications dumped to the console indicate that my requested input and sample-rate settings are in place, I crash when trying to start the audio engine because the sample rate is wrong. Investigating further, it looks like the switch from remote I/O to voice-processing I/O, or vice versa, has not yet actually completed. I introduced a 100 ms delay and that seems to help, but it is obviously not a reliable way to build software that must work consistently.

How can I make sure that these apparently asynchronous configuration changes to the shared audio session and the input/output nodes have completed before I go on? I tried using route change notifications from the shared AVAudioSession, but these lie: they say my preferred mic input and sample-rate settings are in place, but when I dump the AVAudioEngine graph to the debugger console, I still see the wrong sample rate assigned to the input/output nodes. These are also the wrong AU nodes; that is, VPIO is still in place when RIO should be, or vice versa. How can I make the switch reliable without arbitrary time delays? Is my configuration-manager approach appropriate (question for Apple engineers)?
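One hedged alternative to the fixed 100 ms delay, assuming the crash really is the engine starting before the I/O unit swap settles: verify the observable state the configuration depends on (the input node's actual format) in a bounded polling loop instead of trusting the route-change notification. This is a workaround sketch, not an Apple-documented pattern:

```swift
import AVFAudio

// Hypothetical workaround: poll the engine's actual input format until it
// matches what the new configuration requires, with a bounded timeout,
// instead of sleeping for an arbitrary interval.
func waitForInputFormat(engine: AVAudioEngine,
                        sampleRate: Double,
                        channels: AVAudioChannelCount,
                        timeout: TimeInterval = 1.0) async -> Bool {
    let deadline = Date().addingTimeInterval(timeout)
    while Date() < deadline {
        let format = engine.inputNode.inputFormat(forBus: 0)
        if format.sampleRate == sampleRate && format.channelCount == channels {
            return true   // The hardware/graph has actually settled.
        }
        try? await Task.sleep(nanoseconds: 10_000_000)  // 10 ms between checks
    }
    return false          // Caller should retry the switch or surface an error.
}
```

It still polls, but it confirms the state you actually depend on rather than an arbitrary delay, and it fails loudly on timeout instead of crashing in engine start.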
Replies: 1 · Boosts: 0 · Views: 154 · Created: Oct ’25
Apple TV A1625 Using 3× More CPU After tvOS 26 Update
Hi everyone, after updating my Apple TV HD (model A1625) to tvOS 26, I've noticed a significant spike in CPU usage, up to 3× higher than before the update: it goes from around 40% to 120%.

Model: Apple TV HD (A1625)
tvOS version: 26 (stable release) and beta version of 26.1
Effect: the app downgrades the stream due to lack of CPU power

If anyone else is experiencing this, please share your findings or workarounds. I would love to hear from Apple engineers or other developers whether this is a known regression or if there's a recommended fix. Thanks!
Replies: 4 · Boosts: 0 · Views: 189 · Created: Oct ’25
AVPlayerView with .inline controlsStyle on macOS 26
My audio app shows a control bar at the bottom of the window. The controls show nicely, but there is a black "slab" appearing behind the inline controls, the same size as the playerView. Setting the player view's background color does nothing:

```swift
playerView.wantsLayer = true
playerView.layer?.backgroundColor = NSColor.clear.cgColor
```

How can I clear the background? If I use the .floating controlsStyle, I don't get the background "slab".
Replies: 0 · Boosts: 0 · Views: 146 · Created: Oct ’25
Clean up render files saved to PHContentEditingOutput.renderedContentURL
I discovered that when editing photos with the PhotoKit API, PHContentEditingOutput's renderedContentURL is a file in the app container's tmp directory with a filename that seems to follow the format render.<uuid>.JPG, and that file does not get deleted if the edit does not complete successfully (the user cancels the edit request, an error occurs, the app crashes, etc.). I understand the system is supposed to automatically delete tmp files every once in a while, but some users are noticing my app's Documents & Data inflating, so I'm considering deleting these render files each time the app is launched. But I don't want to delete everything in the tmp directory, as there could be other data in there.

What's the best way to remove those temporary files? Does the filename always start with render. no matter the device language? I thought I'd delete files in NSTemporaryDirectory() with that prefix, but then I discovered that in Mac Catalyst the location is not the tmp directory directly; they're in tmp/TemporaryItems/<bundleid>. Thanks!
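Assuming the render. prefix holds (it is observed behavior, not a documented contract), a best-effort cleanup pass at launch could look like the sketch below. The recursive enumeration also covers the Mac Catalyst tmp/TemporaryItems/<bundleid> location mentioned above:

```swift
import Foundation

// Hypothetical cleanup sketch based on the filename pattern observed in
// the post (render.<uuid>.JPG); treat the prefix match as best-effort.
func removeStaleRenderFiles() {
    let fm = FileManager.default
    let tmp = URL(fileURLWithPath: NSTemporaryDirectory())
    // Enumerate tmp and all subdirectories, so both the plain tmp layout
    // and the Catalyst tmp/TemporaryItems/<bundleid> layout are covered.
    guard let enumerator = fm.enumerator(at: tmp,
                                         includingPropertiesForKeys: nil) else { return }
    for case let url as URL in enumerator {
        if url.lastPathComponent.hasPrefix("render.") {
            try? fm.removeItem(at: url)
        }
    }
}
```

Matching only the known prefix keeps the pass from touching unrelated data in tmp, which addresses the poster's concern about deleting the whole directory.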
Replies: 0 · Boosts: 0 · Views: 86 · Created: Oct ’25
Background GPU access in iOS 26 for iPhones
We build mobile apps that let creators edit their videos. After editing, the creator has to export the video so it can be uploaded to YouTube. The export is a time-consuming, GPU-intensive process, and the creator can exit the app for various reasons, such as receiving a call or putting the app in the background. This causes the export to fail :(

With this limitation in mind, Apple announced that iOS 26 would start to support background GPU access. Here is the official documentation: https://developer.apple.com/documentation/BundleResources/Entitlements/com.apple.developer.background-tasks.continued-processing.gpu

When we tried using this feature, we were not able to get it to work on iOS 26. We stumbled upon this ticket (https://developer.apple.com/forums/thread/797538?answerId=854825022#854825022) in the Apple Developer forum, in which possibly an Apple engineer claims it is supported ONLY on iPadOS 26. This is a very big bummer for us: 96% of our users are on iPhone (compared to iPad), and the official documentation above claims that this feature should work on iOS 26. This feature is extremely important for delivering the best user experience and reducing user frustration, and it would be useful for other video editing apps as well. Looking forward to a resolution.
Replies: 1 · Boosts: 0 · Views: 223 · Created: Oct ’25
AirPods with H2 and studio-quality recording - how to replicate Camera video capture
Using an iPhone 12 Pro running iOS 26.0.1, with AirPods Pro 3 (APP3). The Camera app does capture video with what seems to be "Studio Quality Recording" (SQR). I'm trying to replicate that SQR in my own Camera-like app, and while I can pull audio in from the APP3 mic, and my video capture app records a 48,000 Hz high-bitrate video, the audio still sounds non-SQR. I'm seeing bluetoothA2DP, bluetoothLE, and bluetoothHFP as the portType, and I'm not sure whether SQR depends on one of those. Is there sample code demonstrating an SQR capture? Never mind video and camera; just audio, even?

Also, I don't understand what SQR is doing between the APP3 and the iPhone. What codec is that? What bitrate is that? If I capture video using Capture and inspect the audio stream, I see mono 74.14 kbit/s MPEG-4 AAC, 48,000 Hz. But I assume that's been recompressed and isn't really giving me any insight into the APP3 H2 transmission?
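Which transport SQR rides on is exactly the open question, but enumerating the session's inputs and their port types is a safe first diagnostic. The sketch below is an assumption-laden starting point, not a confirmed SQR recipe; it lists each input and prefers the Bluetooth HFP mic:

```swift
import AVFAudio

// Hypothetical diagnostic: list every input port and its type, then set
// the AirPods mic as the preferred input. Whether the high-quality path
// uses this portType at all is the unanswered question.
func selectBluetoothInput() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth])
    for input in session.availableInputs ?? [] {
        print("input: \(input.portName), type: \(input.portType.rawValue)")
        if input.portType == .bluetoothHFP {
            try session.setPreferredInput(input)
        }
    }
    try session.setActive(true)
}
```

Comparing the recorded audio with each port type selected should at least narrow down which transport the Camera app's capture path is using.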
Replies: 1 · Boosts: 0 · Views: 118 · Created: Oct ’25
Low-Latency HLS stream playback issue on iOS 26
We have a Low-Latency HLS stream, and on iOS 26, even though the bandwidth is sufficient, AVPlayer (used with AVPlayerViewController) still selects a low-bandwidth rendition (e.g., RESOLUTION=640x360) for playback instead of a higher-bandwidth one (e.g., RESOLUTION=1920x1080). This works fine on iOS 18 and earlier versions. What could be the solution to this issue?
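As a diagnostic (not a fix), the player item's access log shows the bitrate the ABR controller thinks it observed next to the rendition it selected; on iOS 26, comparing observedBitrate with the chosen variant should show whether this is a measurement problem or a selection cap. A minimal sketch:

```swift
import AVFoundation

// Hypothetical diagnostic: dump the item's access log so the observed
// throughput can be compared with the indicated bitrate of the rendition
// the player actually chose.
func dumpAccessLog(for item: AVPlayerItem) {
    guard let log = item.accessLog() else { return }
    for event in log.events {
        print("indicated: \(event.indicatedBitrate), observed: \(event.observedBitrate), stalls: \(event.numberOfStalls), uri: \(event.uri ?? "-")")
    }
}
```

If observedBitrate is high while the selected variant stays at 640x360, that points at the selection logic rather than the network, which is worth including in a feedback report.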
Replies: 1 · Boosts: 0 · Views: 184 · Created: Oct ’25
Broadcast Upload Extension stops data transmission
Currently, I am using a Broadcast Upload Extension to obtain sample buffer data, and transmitting the screen-recording data to the app through an App Group and IPC (a local Unix domain socket). However, when the app goes to the background to view videos in the photo album, or plays other audio or video, the data transmission stops and the app can no longer obtain the screen-recording data. I would like to ask how to solve this problem. I suspect the system has suspended the extension's screen recording.
Replies: 0 · Boosts: 0 · Views: 122 · Created: Oct ’25
iTunes Search API no longer returning explicit results?
My app has been using the iTunes Search API (itunes.apple.com/search) for a few years now, but at some point over the last week or so (late Sept. 2025) it stopped returning track results with explicit content, regardless of whether I provide "explicit=Yes" (which is the default anyway, according to the API documentation: https://performance-partners.apple.com/search-api). Has anyone else experienced this with this API, and have you figured out a workaround? FYI, I also use the more robust Apple Music API in another part of my app, which isn't having this issue, so I know it's technically an alternative; I just need to stick with the iTunes Search API in this particular case. Thanks.
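For reference, a minimal request against that endpoint with the parameter spelled out (the search term is a placeholder): per the linked documentation, explicit=Yes is the default, so if this still returns no explicit tracks the change is almost certainly server-side, not in the client code.

```swift
import Foundation

// Sketch of a request to the endpoint in question; the term value is a
// placeholder, and term/media/entity/explicit are documented parameters.
func searchExplicitTracks() async throws {
    var components = URLComponents(string: "https://itunes.apple.com/search")!
    components.queryItems = [
        URLQueryItem(name: "term", value: "example artist"),
        URLQueryItem(name: "media", value: "music"),
        URLQueryItem(name: "entity", value: "song"),
        URLQueryItem(name: "explicit", value: "Yes"),
    ]
    let (data, _) = try await URLSession.shared.data(from: components.url!)
    print(String(decoding: data, as: UTF8.self))
}
```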
Replies: 1 · Boosts: 2 · Views: 235 · Created: Oct ’25