Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic
New playback error on iOS/tvOS 18.x "CoreMediaErrorDomain Code=-15486"
Hello,

Our users have started to see a new fatal AVPlayer error during playback, starting with iOS/tvOS 18.0. The error is reported as "CoreMediaErrorDomain Code=-15486". We have not been able to reproduce this issue locally within our development team. Is there any documentation on the cause of this error, or on steps to recover from it?

Thank you, Howard
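Since the failure only shows up in the field, one option is to harvest the player item's HLS error log from production builds and correlate the -15486 occurrences with specific URIs and server responses. A minimal diagnostic sketch (standard AVFoundation APIs; the function name is illustrative):

```swift
import AVFoundation

// Log HLS error-log entries as they arrive; useful for correlating
// CoreMediaErrorDomain -15486 with specific segments or responses.
func attachErrorLogging(to item: AVPlayerItem) {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: item,
        queue: .main
    ) { notification in
        guard let item = notification.object as? AVPlayerItem,
              let event = item.errorLog()?.events.last else { return }
        // Forward these fields to analytics to build a picture of the failure.
        print("status \(event.errorStatusCode) domain \(event.errorDomain) " +
              "comment \(event.errorComment ?? "-") uri \(event.uri ?? "-")")
    }
}
```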
Replies: 0 · Boosts: 2 · Views: 723 · Created: Feb ’25
Setting the capture device color space to Apple Log does not work
I set the device format and color space to Apple Log and turned off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

```swift
session.sessionPreset = .inputPriority

// get the back camera
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera],
    mediaType: .video,
    position: .back)
backCamera = deviceDiscoverySession.devices.first!

try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false

let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))

backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// add output
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920,
 "AVVideoHeightKey": 1080,
 "AVVideoCodecKey": apch, <----- ProRes is enabled.
 "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
*/

previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
```
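One thing worth ruling out (an assumption, not a confirmed diagnosis): by default AVCaptureSession.automaticallyConfiguresCaptureDeviceForWideColor is true, and the session may silently override a manually chosen activeColorSpace when the configuration changes. A sketch of the ordering that avoids this, using the names from the snippet above:

```swift
// Keep the session from overriding the manually chosen color space,
// then configure the device after the session topology is final.
session.beginConfiguration()
session.automaticallyConfiguresCaptureDeviceForWideColor = false
session.sessionPreset = .inputPriority
// ... add the device input and movie file output here ...
session.commitConfiguration()

try backCamera.lockForConfiguration()
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
backCamera.unlockForConfiguration()
```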
Replies: 1 · Boosts: 0 · Views: 454 · Created: Feb ’25
kCGImageSourceDecodeToHDR and CGImageSourceCopyPropertiesAtIndex
Hi,

I'm using Core Graphics to load a .DNG photo shot with a Leica Q3. The photo is shot in portrait, but the embedded preview is rotated 90 degrees to landscape. I load the photo like this:

```swift
let options = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
let source = CGImageSourceCreateWithData(data as CFData, nil)!
let cgimage = CGImageSourceCreateImageAtIndex(source, 0, options)
let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
```

When I do this, the orientation property is 1, indicating 'Up' — which it isn't. If I don't specify the kCGImageSourceDecodeToHDR option (essentially passing nil for options), the orientation property is 8 (rotated 90 degrees). What puzzles me is that a change to the CGImageSourceCreateImageAtIndex call can influence the later call to CGImageSourceCopyPropertiesAtIndex — I would expect these to work independently.

Cheers, Thomas
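A quick diagnostic that may help narrow this down (a sketch, not a fix): CGImageSourceCopyPropertiesAtIndex also accepts an options dictionary, so the properties can be read with and without the HDR decode request and compared, confirming whether the option itself changes the reported metadata:

```swift
import ImageIO

// Read the properties twice — once plain, once with the HDR decode
// request — to see whether the option changes the reported orientation.
let hdrOptions = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
if let source = CGImageSourceCreateWithData(data as CFData, nil) {
    let plain = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
    let hdr = CGImageSourceCopyPropertiesAtIndex(source, 0, hdrOptions) as? [CFString: Any]
    print("orientation (no options): \(plain?[kCGImagePropertyOrientation] ?? "nil")")
    print("orientation (HDR decode): \(hdr?[kCGImagePropertyOrientation] ?? "nil")")
}
```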
Replies: 1 · Boosts: 0 · Views: 473 · Created: Feb ’25
AVAudioEngine. Select input device on macOS
Hello! I'm using AVFoundation to preview video and audio from a selected capture device, and I'd like to use AVAudioEngine to monitor the audio in real time — but I can't figure out how to select the input device, so I only ever hear my built-in microphone. So far I've been using AVCaptureAudioPreviewOutput for real-time monitoring, but it seems to add latency. On iOS this is easy with AVAudioEngine, but on macOS I'm stuck.
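On macOS, AVAudioEngine's inputNode follows the system default input. One commonly used workaround (a sketch, assuming you already have the AudioDeviceID of the chosen device) is to point the input node's underlying HAL audio unit at that device before starting the engine:

```swift
import AVFoundation
import CoreAudio

// Route the engine's input node to a specific capture device.
func setInput(device deviceID: AudioDeviceID, on engine: AVAudioEngine) -> OSStatus {
    guard let audioUnit = engine.inputNode.audioUnit else { return kAudioUnitErr_Uninitialized }
    var device = deviceID
    return AudioUnitSetProperty(audioUnit,
                                kAudioOutputUnitProperty_CurrentDevice,
                                kAudioUnitScope_Global,
                                0,
                                &device,
                                UInt32(MemoryLayout<AudioDeviceID>.size))
}
```

The AudioDeviceID can be looked up via the HAL by matching kAudioDevicePropertyDeviceUID against the AVCaptureDevice's uniqueID; after the property is set, connect inputNode into the engine graph as usual.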
Replies: 1 · Boosts: 0 · Views: 526 · Created: Feb ’25
Finding which effect has been selected for a Live Photo
I am writing an iOS app to present a slide show of the assets in a Photos album, in random order, including videos and Live Photos. I have it all working quite nicely, but for a Live Photo I need to know which effect is selected (Live, Loop, Bounce, Long Exposure, Live Off) to display the image correctly. I can't find any mention of how to get this information in the documentation. Does anyone know how to do this? Thanks in advance. Adrian. (Xcode 16.1, iOS 18.0)
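There doesn't appear to be a public API that names the chosen effect directly, but PHAsset.playbackStyle distinguishes several of the cases. The mapping below is an observed assumption based on how Photos renders the effects, not documented behavior:

```swift
import Photos

// Infer how to present an asset from its playback style.
func describePlayback(for asset: PHAsset) -> String {
    switch asset.playbackStyle {
    case .livePhoto:
        return "Live (or Live Off — present the Live Photo either way)"
    case .videoLooping:
        return "Loop or Bounce (both appear to report videoLooping)"
    case .image:
        return "Still image (Long Exposure renders to a still)"
    case .video:
        return "Video"
    case .imageAnimated:
        return "Animated image"
    case .unsupported:
        return "Unsupported"
    @unknown default:
        return "Unknown"
    }
}
```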
Replies: 0 · Boosts: 1 · Views: 307 · Created: Feb ’25
CoreMediaErrorDomain error -12927 with HEVC and DRM
I can't play video content with HEVC and DRM:

Tested HEVC only: OK.
Tested DRM+AVC: OK.
Tested two players (Clappr/Stevie and BitMovin).

The master playlist, variants and EXT-X-MAPs download fine and the DRM keys are delivered OK; then, for instance with the BitMovin player:

```
[BMP] [Player] [Error] Event: SourceError, Data: {"code":2001,"data":{"message":"The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","code":-12927},"message":"Source Error. The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","timestamp":1740320663.4505711,"type":"onSourceError"}
code: 2001 [Data code: -12927, message: The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.), underlying error: Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"]
```

Attachments: 4k-master.m3u8.txt, 4k.m3u8.txt, 4k-audio.m3u8.txt
Replies: 1 · Boosts: 0 · Views: 575 · Created: Feb ’25
How to reduce CMSampleBuffer volume
Hello,

Basically, I am reading and writing an asset. To simplify: I'm just reading the asset and rewriting it into an output video without any modifications, but I want to add a fade-out effect to the last three seconds of the output, and I don't know how to do it. So far, before appending each CMSampleBuffer to the output video, I've tried reducing its volume using an extension on CMSampleBuffer; I passed 0.4 for testing, aiming to reduce the overall volume by 60%. My question is: how can I directly adjust the volume of a CMSampleBuffer? Here is the extension:

```swift
extension CMSampleBuffer {
    func adjustVolume(by factor: Float) -> CMSampleBuffer? {
        guard let blockBuffer = CMSampleBufferGetDataBuffer(self) else { return nil }

        var length = 0
        var dataPointer: UnsafeMutablePointer<Int8>?
        guard CMBlockBufferGetDataPointer(blockBuffer,
                                          atOffset: 0,
                                          lengthAtOffsetOut: nil,
                                          totalLengthOut: &length,
                                          dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
              let dataPointer else { return nil }

        // Assumes the samples are 16-bit signed integer PCM.
        let sampleCount = length / MemoryLayout<Int16>.size
        dataPointer.withMemoryRebound(to: Int16.self, capacity: sampleCount) { pointer in
            for i in 0..<sampleCount {
                let sample = Float(pointer[i])
                pointer[i] = Int16(sample * factor)
            }
        }
        return self
    }
}
```
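For a fade-out specifically, it may be simpler to leave the sample buffers untouched and apply a volume ramp with AVAudioMix instead. A sketch, assuming the asset duration is known and error handling is omitted:

```swift
import AVFoundation

// Build an audio mix that ramps volume from 1.0 to 0.0 over the
// last three seconds of the asset.
func fadeOutMix(for asset: AVAsset, audioTrack: AVAssetTrack) -> AVAudioMix {
    let fadeDuration = CMTime(seconds: 3, preferredTimescale: 600)
    let fadeStart = CMTimeSubtract(asset.duration, fadeDuration)
    let params = AVMutableAudioMixInputParameters(track: audioTrack)
    params.setVolumeRamp(fromStartVolume: 1.0,
                         toEndVolume: 0.0,
                         timeRange: CMTimeRange(start: fadeStart, duration: fadeDuration))
    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    return mix
}
```

The resulting mix can be assigned to AVAssetExportSession.audioMix, AVPlayerItem.audioMix, or — for a manual read/write pass like the one described above — to an AVAssetReaderAudioMixOutput's audioMix property, so the fade is applied during reading.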
Replies: 4 · Boosts: 0 · Views: 427 · Created: Feb ’25
MusicKit Web Playback States
In MusicKit Web the playback states are provided as numbers. For example, the playbackStateDidChange event listener returns {oldState: 2, state: 3, item: ...} when the state changes from playing (2) to paused (3). Those two are easy to guess, but I'm having a hard time with the others: completed, ended, loading, none, paused, playing, seeking, stalled, stopped, waiting. I cannot find a mapping of states to numbers documented anywhere; I got the state names above from an enum in a d.ts file that is often incorrect or incomplete. Can someone point to the docs or provide a mapping? Thanks.
Replies: 2 · Boosts: 0 · Views: 440 · Created: Feb ’25
Distortion corrected images
Hi,

I am currently developing a 3D reconstruction project that requires distortion-free (rectilinear) images with known intrinsics. The capture session uses a builtInDualWideCamera with: isGeometricDistortionCorrectionEnabled set to false (so the intrinsic matrix of the images is available), isVirtualDeviceConstituentPhotoDeliveryEnabled set to true and isAutoVirtualDeviceFusionEnabled set to false (to receive both constituent images), and isCameraCalibrationDataDeliveryEnabled set to true (to actually get the calibration data). I undistort using the distortion-correction parameters such as lensDistortionLookupTable — the 42-coefficient mapping array described in the AVCameraCalibrationData header — with simple piecewise linear interpolation.

There are two questions I would like support on:

1. A way to attach the calibration parameters to each image. My current approach writes them into kCGImagePropertyExifDictionary -> "UserComment". Is there a better way to embed calibration data in the images? This feels a bit dirty, and there might be a neater approach (see the sketch after this list).

2. For the ultra-wide camera's images, the lensDistortionLookupTable contains several zeros at the end of the array. For example (last 10 elements are zero):

"LensDistortionLookupTable":"0.000000000000000,0.000349554029526,0.001385628827848,0.003071037586778,... ,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000"

The problem is that when the complete array is used to correct the image (including the zeros), the result is wrapped into a circle-like distortion near the edges, which is completely wrong. In contrast, if the lensDistortionLookupTable is used without the trailing zeros (and the size adjusted accordingly), the image looks better — although not as rectilinear as a shot from the iPhone's Camera app — and definitely less distorted.

Including zeros (full array): (image attachment)
Excluding zeros (array size changed): (image attachment)

Am I missing something about how the lensDistortionLookupTable is meant to be used when it has trailing zeros? What is the criterion for shrinking/excluding elements of the array? Any advice is very much welcome.
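On question 1: if you stay with the EXIF UserComment approach, a minimal sketch of writing it with ImageIO follows (the JSON serialization of the calibration data is an assumption — any string encoding works; the function name is illustrative):

```swift
import Foundation
import ImageIO

// Re-encode image data with calibration info stored in EXIF UserComment.
func embedCalibration(_ calibrationJSON: String, in imageData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(imageData as CFData, nil),
          let type = CGImageSourceGetType(source) else { return nil }
    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output as CFMutableData, type, 1, nil) else { return nil }
    let properties: [CFString: Any] = [
        kCGImagePropertyExifDictionary: [kCGImagePropertyExifUserComment: calibrationJSON]
    ]
    // Copies the image and merges the new metadata into the destination.
    CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return output as Data
}
```

A sidecar file (e.g. a JSON file stored next to each image) is a common alternative if you'd rather not overload EXIF fields.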
Replies: 0 · Boosts: 0 · Views: 454 · Created: Feb ’25
Regarding FPS SDK Upgrade
Hi Apple Team,

We have had FairPlay Streaming Server SDK v3 integrated into our MDRM platform since 2017; the system works stably and has remained untouched. As you know, both Widevine and PlayReady require their server SDKs to be upgraded regularly. We want to know whether Apple imposes similar requirements for upgrading the FPS SDK, or whether we may continue using the old one without any updates. Thanks for your support!
Replies: 0 · Boosts: 1 · Views: 379 · Created: Feb ’25
HDR video metadata
On an iOS 18 phone, I use AVCaptureSession to capture HDR video in the x420 format. The output CMSampleBuffer is in the HLG color space, and the propagated attachments contain kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey. When I render the CVPixelBuffer to the screen with CAMetalLayer, the image is brighter than with AVSampleBufferDisplayLayer. Here is my code:

```objc
- (void)_updateColorSpaceIfNeed:(CVPixelBufferRef)pixelBuffer {
    CAMetalLayer *layer = (CAMetalLayer *)_mtkView.layer;
    if (![layer isKindOfClass:CAMetalLayer.class]) return;
    layer.wantsExtendedDynamicRangeContent = YES;

    CFDataRef ambientViewingEnvironment = (CFDataRef)CVBufferCopyAttachment(pixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, NULL);
    NSData *data = (__bridge NSData *)ambientViewingEnvironment;

    CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadataWithAmbientViewingEnvironment:data];
    // CAEDRMetadata *metadata = [CAEDRMetadata HLGMetadata];
    layer.EDRMetadata = metadata;
    layer.pixelFormat = MTLPixelFormatRGBA16Float;

    // Release the copied attachment only after it has been consumed;
    // __bridge does not retain it.
    if (ambientViewingEnvironment) CFRelease(ambientViewingEnvironment);

    CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_2100_HLG);
    layer.colorspace = colorspace;
    if (colorspace) CGColorSpaceRelease(colorspace);
}
```

Why does the CAEDRMetadata class provide HLGMetadataWithAmbientViewingEnvironment: and HLGMetadata, but no HLGMetadataWithAmbientViewingEnvironment:sceneIllumination: variant? I want to know how kCVImageBufferAmbientViewingEnvironmentKey and kCVImageBufferSceneIlluminationKey affect tone mapping. Is there any documentation I can refer to?
Replies: 1 · Boosts: 0 · Views: 464 · Created: Feb ’25
AVContentKeySession reuse
Context

We develop an iOS/Apple TV app that plays HLS+FairPlay live streams (with a custom playback UI), some of which use the same FairPlay content key id. All FairPlay content keys are requested from the same content key server.

Implementation

Despite Apple's documentation warning against reusing AVContentKeySessions, we use a single AVContentKeySession for all channels, which allows the system to reuse a content key when a content key id is met again. As seen in another thread, people seem to think this is OK.

Issue

When the AVContentKeySession is reused and the user tunes channels quickly (up to 2 or 3 times per second using gestures), an inconsistency can occur where the content key request for a previous stream is delivered to the delegate after a new stream is already being prepared and its AVURLAsset has already been added as the content key session's AVContentKeyRecipient. Note that the previous recipient is removed before the new one is added. We have also received crash reports (though I haven't reproduced the crash myself) triggered by rapid channel tuning, which makes us think the AVContentKeySession should definitely not be reused.

Note: if, on the other hand, a new AVContentKeySession is used for each stream, the system systematically requests a content key even when previous streams used the same content key id. In that case neither the crash nor the inconsistency is observed, but the number of calls to the content key server increases dramatically.

Questions

Should AVContentKeySessions definitely not be reused? Otherwise, how should the inconsistency described above be handled?
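Not an authoritative answer, but one pattern that might narrow the race while keeping a single session: let one serial queue own every mutation of the key session, so a stale asset can never still be a recipient when a newer channel's key request arrives. A sketch (class and queue names are illustrative):

```swift
import AVFoundation

final class KeySessionCoordinator {
    private let queue = DispatchQueue(label: "keysession.serial")
    private let session = AVContentKeySession(keySystem: .fairPlayStreaming)
    private var currentAsset: AVURLAsset?

    // Serialize recipient swaps: remove the old recipient and add the
    // new one in a single queued unit of work.
    func switchRecipient(to asset: AVURLAsset) {
        queue.async {
            if let old = self.currentAsset {
                self.session.removeContentKeyRecipient(old)
            }
            self.session.addContentKeyRecipient(asset)
            self.currentAsset = asset
        }
    }
}
```

This doesn't rule out in-flight delegate callbacks for the previous stream, so the delegate would still need to drop requests whose asset is no longer currentAsset.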
Replies: 0 · Boosts: 0 · Views: 449 · Created: Feb ’25
How to delete FPS Certificate from Apple developer account
Hello All,

I am looking for assistance with our FairPlay Streaming (FPS) certificates. We are migrating to a new video streaming vendor and need to create a new FPS certificate using SDK 4. However, we have reached the limit of allowed FPS certificates in our account and cannot create a new one.

Issue details:
• We currently have two active FPS certificates in our developer account.
• One of these was created with SDK 5, but our new vendor (Mux) requires an FPS certificate based on SDK 4.
• Since Apple does not allow deleting FPS certificates from the developer portal, we are unable to create a new SDK 4 certificate.
• We kindly request that Apple revoke one of our existing FPS certificates so we can generate a new SDK 4 certificate.

Request: We would greatly appreciate guidance on how to delete one of our existing FPS certificates so that we can proceed with creating a new SDK 4 certificate for our vendor integration.

Thank you for your support.
Replies: 1 · Boosts: 1 · Views: 640 · Created: Feb ’25
Accessing AV External Storage
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via a USB-C or Lightning connector) on an iPad/iPhone?

I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad 10th generation (iPadOS 18.3.1), and both return false:

```swift
// returns false on iPhone 14, iPad gen 10
print(AVExternalStorageDeviceDiscoverySession.isSupported)
```

The following returns nil when I try to access the shared discovery session:

```swift
// returns nil on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)
```

And this returns false without displaying a permission dialog:

```swift
AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})
```

Which iOS devices are supported by AVExternalStorageDeviceDiscoverySession? What situations is it intended for (e.g. connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, accessing photos from a USB drive)? Is there any sample code for using the AVExternalStorage API?
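For reference, a sketch of the discovery flow as it appears intended to work. Note the caveat in the comment: the device scope is an assumption from observed behavior, not documented.

```swift
import AVFoundation

// In practice isSupported appears to be true only on models that can
// record captures directly to external storage (e.g. recent USB-C
// iPhone Pro models) — that scope is an assumption, not documentation.
func listExternalStorage() async {
    guard AVExternalStorageDeviceDiscoverySession.isSupported,
          let discovery = AVExternalStorageDeviceDiscoverySession.shared else {
        print("External storage discovery unsupported on this device")
        return
    }
    let granted = await AVExternalStorageDevice.requestAccess()
    guard granted else {
        print("Access not granted")
        return
    }
    print(discovery.externalStorageDevices)
}
```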
Replies: 0 · Boosts: 0 · Views: 438 · Created: Feb ’25
Is AVPlayerViewController not supported on macCatalyst?
When building an iOS application for Mac Catalyst, a link error like the one below occurs:

```
Undefined symbol: _OBJC_CLASS_$_AVPlayerViewController
```

The AVPlayerViewController documentation appears to say macCatalyst is supported, but what is the reality? https://developer.apple.com/documentation/avkit/avplayerviewcontroller?language=objc

The versions of the environment are as follows:
Xcode 16.2
macOS deployment target: macOS 10.15
iOS deployment target: iOS 13.0

Thank you for your support.
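For what it's worth, AVPlayerViewController is documented as available on Mac Catalyst 13.0+, so an undefined-symbol error usually suggests AVKit is not being linked for the Catalyst variant of the target (an assumption — worth checking the target's "Link Binary With Libraries" phase and the framework's platform filters). A minimal usage that should link once AVKit is present on both platforms:

```swift
import AVKit
import UIKit

// Present a player; compiles for both iOS and Mac Catalyst.
func presentPlayer(from host: UIViewController, url: URL) {
    let controller = AVPlayerViewController()
    controller.player = AVPlayer(url: url)
    host.present(controller, animated: true) {
        controller.player?.play()
    }
}
```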
Replies: 4 · Boosts: 0 · Views: 433 · Created: Feb ’25
ScreenCaptureKit System Audio Capture Crashes with EXC_BAD_ACCESS
Bug Report: ScreenCaptureKit System Audio Capture Crashes with EXC_BAD_ACCESS

Summary
When using ScreenCaptureKit to capture system audio for extended periods, the application crashes with EXC_BAD_ACCESS in Swift's error-handling runtime. The crash occurs in swift_getErrorValue when processing an error from the SCStream delegate method didStopWithError. This appears to be a framework-level issue in ScreenCaptureKit or its underlying ReplayKit implementation.

Environment
macOS Sonoma 14.6.1
Swift 5.8
ScreenCaptureKit framework

Detailed Description
Our application captures system audio using ScreenCaptureKit's audio capture capabilities. After capturing successfully for several minutes (typically after 3-4 segments of 60-second recordings), the application crashes with EXC_BAD_ACCESS. The crash happens when the Swift runtime attempts to process an error in SCStreamDelegate.stream(_:didStopWithError:), and it consistently lands in swift_getErrorValue while reading the class of what appears to be a null object. This suggests that the error passed from the system framework to our delegate is malformed or references invalid memory.

Steps to Reproduce
1. Create an SCStream with audio capture enabled.
2. Add an audio output to the stream.
3. Start capture and write audio data to disk.
4. Let the capture run for several minutes (3-5 minutes typically triggers the issue).
5. The app crashes with EXC_BAD_ACCESS in swift_getErrorValue.

Code Sample

```swift
func stream(_ stream: SCStream, didStopWithError error: Error) {
    print("Stream stopped with error: \(error)") // Crash occurs before this line executes
}

func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of type: SCStreamOutputType) {
    guard type == .audio, sampleBuffer.isValid else { return }
    // Process audio data...
}
```

Expected Behavior
The error should be properly propagated to the delegate method, allowing graceful error handling and recovery.

Actual Behavior
The application crashes with EXC_BAD_ACCESS when the Swift runtime attempts to process the error in swift_getErrorValue.

Crash Log Details

```
Thread #35, queue = 'com.apple.NSXPCConnection.m-user.com.apple.replayd', stop reason = EXC_BAD_ACCESS (code=1, address=0x0)
frame #0: 0x0000000194c3088c libswiftCore.dylib`swift::_swift_getClass(void const*) + 8
frame #1: 0x0000000194c30104 libswiftCore.dylib`swift_getErrorValue + 40
frame #2: 0x00000001057fba30 shadow`NewScreenCaptureService.stream(stream=0x0000600002de6700, error=Swift.Error @ 0x000000016b7b5e30) at NEW+ScreenCaptureService.swift:365:15
frame #3: 0x00000001057fc050 shadow`@objc NewScreenCaptureService.stream(_:didStopWithError:) at <compiler-generated>:0
frame #4: 0x0000000219ec5ca0 ScreenCaptureKit`-[SCStreamManager stream:didStopWithError:] + 456
frame #5: 0x00000001ca68a5cc ReplayKit`-[RPScreenRecorder stream:didStopWithError:] + 84
frame #6: 0x00000001ca696ff8 ReplayKit`-[RPDaemonProxy stream:didStopWithError:] + 224
```

(While inspecting stream._streamQueue, lldb also emitted unrelated "'id' is unavailable in Swift" expression-evaluation errors, omitted here.)

Before the crash, we observed this error message in the console:

```
[ERROR] *****SCStream*****RemoteAudioQueueOperationHandlerWithError:1015 Error received from the remote queue -16665
```

Additional Context
The issue occurs consistently after approximately 3-4 successful 60-second audio segment recordings. Commenting out our custom segment-rotation logic does not prevent the crash. The crash involves XPC communication with Apple's ReplayKit daemon, and the error appears to be corrupted or malformed when crossing the XPC boundary.

Workarounds Attempted
Added proper thread safety for all published properties using DispatchQueue.main.async, and implemented more robust error handling in the delegate methods. Neither approach prevented the crash, since it occurs at the Swift runtime level before our code executes.

Impact
This issue prevents reliable long-duration audio capture using ScreenCaptureKit and significantly limits its usefulness for any application requiring continuous system audio capture for more than a few minutes. It might be related to a macOS bug where the system dialog indicates the screen is being shared even though nothing is actually being shared; moreover, attempting to stop sharing does nothing.
Replies: 2 · Boosts: 0 · Views: 684 · Created: Feb ’25
CoreAudio HAL plugin vs dext
The presentation "create audio drivers with DriverKit" from WWDC 2021 demonstrates how to use a dext to implement a virtual audio driver. It also says " If a virtual audio driver or device is all that is needed, the audio server plug-in driver model should continue to be used". Indeed, in AudioDriverKit/AudioDriverKitTypes.h, there is no IOUserAudioTransportType Virtual, although CoreAudio/AudioHardwareBase.h includes kAudioDeviceTransportTypeVirtual. For one of our products, we require virtual devices to implement a software loopback "cable". We've implemented this using the "traditional" HAL plugin, and as a proof-of-concept, also using a dext. In the dext, I tried setting the transport type to 'virt', which seems to only have the effect of changing the icon shown in Audio Midi Setup. HAL plugins require an installer, and the installer has to kill coreaudiod in a post-install script. You have to turn off SIP to debug them. Just like AudioDriverKit drivers, they are out-of-process and run in a process not owned by the hosting app. Our HAL plugin's interface is property based; we had to write a lot of boiler-plate code to implement required properties. Writing an AudioDriverKit driver is in most respects easier - a lot of the scaffolding is implemented in the base driver, which we only alter where required. Debugging and installation is much easier. The dext works just fine, as far as we can ascertain, just as well as a HAL plugin. So, my question is - is the advice to use a HAL plugin for a virtual device still correct in 2025? And if so, what's the objection? We'd really prefer to ship the AudioDriverKit virtual audio device.
Replies: 2 · Boosts: 1 · Views: 517 · Created: Feb ’25
Urdu Language Keyboard Bug
Hello Apple, I've been using iOS for many years and never had any issues with the Urdu keyboard, but since the 18.4 beta update some words no longer type correctly. For example, my friend's name "راعنیہ" can no longer be typed as one word — the keyboard keeps separating it as "راعنی ہ". It is frustrating to use like this, and it's not just one word; many other words are affected too. The new font and the missing letter spacing also strain my eyes while reading or typing. I hope Apple fixes this as soon as possible. Thank you.
Replies: 1 · Boosts: 0 · Views: 490 · Created: Feb ’25
Media Library Access not working in my App since iOS 18.2
Hi there,

I have an app that accesses the media library to save and load files. Since iOS 18.2, access to the media library has stopped working, and I've noticed that our app no longer appears in the list of apps with access to Files (Privacy & Security -> Files & Folders). Strangely, one iPhone on iOS 18.3.1 can access Files while others on the same iOS 18.3.1 cannot. Tests on the Simulator (Mac) work fine as well. My Info.plist has contained the media library access keys for a long time (unchanged for at least the last four years), including "Privacy - Media Library Usage Description". I've also noticed that the popup requesting access to the media library on first launch no longer appears — the network (Wi-Fi) access prompt still shows up, but not the media library one. I'm using Visual Studio with Xamarin on a Mac. I'd really appreciate any help, because this is very odd behavior and it started with iOS 18.2.
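For what it's worth, the authorization state can be checked explicitly (shown here in Swift; the Xamarin binding exposes the same MPMediaLibrary API). If the status is already denied rather than notDetermined, iOS will not show the prompt again and access has to be re-enabled in Settings:

```swift
import MediaPlayer

// Check and, if still undetermined, request media library access.
func ensureMediaLibraryAccess() {
    switch MPMediaLibrary.authorizationStatus() {
    case .notDetermined:
        MPMediaLibrary.requestAuthorization { status in
            print("Media library authorization: \(status.rawValue)")
        }
    case .denied, .restricted:
        // The system prompt is only shown once; after a denial the user
        // must re-enable access in Settings > Privacy & Security.
        print("Media library access denied or restricted")
    case .authorized:
        print("Media library access authorized")
    @unknown default:
        break
    }
}
```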
Replies: 0 · Boosts: 0 · Views: 398 · Created: Feb ’25