Keywords: FairPlay, FPS Certificate, DRM, FairPlay Streaming, license server
Hi all,
We are currently using FairPlay Streaming in production and already have an FPS certificate in place.
However, the passphrase for the existing FPS certificate has unfortunately been lost.
We are now considering reissuing a new FPS certificate, and I would like to confirm a few points before proceeding:
1️⃣ If we reissue a new FPS certificate, will the existing certificate be automatically revoked?
Or will it remain valid until its original expiration date?
2️⃣ Is it possible to have both the newly issued and the existing certificates valid at the same time?
In other words, can we serve DRM licenses using either certificate depending on the packaging or client?
3️⃣ Are there any caveats or best practices we should be aware of when reissuing an FPS certificate?
For example, would existing packaged content become unplayable, or would CDN/packaging server configurations need to be updated carefully?
Since this affects our production environment, we would like to minimize any service disruption or compatibility issues.
Unfortunately, when we contacted Apple support directly, we were advised to post this question here in the Forums for additional guidance.
Any advice or experiences would be greatly appreciated!
Thank you in advance.
iOS 26 added smoothness to CIRoundedRectangleGenerator, for use with CIFilter.roundedRectangleGenerator. What should the smoothness value be to achieve the same corner curve as CALayerCornerCurve.continuous? Does it need to be calculated based on the extent size, if so, how?
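For context, here is roughly how I am configuring the generator (a sketch; the smoothness assignment assumes the new iOS 26 property mentioned above, and the right value is exactly what I am unsure about):
import CoreImage
import CoreImage.CIFilterBuiltins
let filter = CIFilter.roundedRectangleGenerator()
filter.extent = CGRect(x: 0, y: 0, width: 300, height: 200)
filter.radius = 40
filter.color = .white
// Placeholder value; what should this be to match CALayerCornerCurve.continuous,
// and does it need to be derived from the extent size?
filter.smoothness = 0.6
let image = filter.outputImage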
Some users have reported an error editing portrait photo assets in my app:
The operation couldn’t be completed. (CINonLocalizedDescriptionKey error 3.)
What is that error? Will affected photos always encounter this error (due to data corruption for example) or can it be resolved in a future iOS update?
FB16241301
I use the iTunes Library framework in one of my apps. Starting with macOS Sequoia 15.1, I can't create the ITLibrary object anymore; it fails with the following error:
Connection to amplibraryd was interrupted. clientName:iTunesLibrary(ITLibraryLoader)
Error connecting to the server via proxy object Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.amp.library.framework" UserInfo={NSDebugDescription=connection to service named com.apple.amp.library.framework}
configure failed: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.amp.library.framework" UserInfo={NSDebugDescription=connection to service named com.apple.amp.library.framework}
I created a new sandboxed macOS app, added the Music folder read permission, and it reproduces the error:
import SwiftUI
import iTunesLibrary

@main
struct ITLibraryLoaderApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Button("Load ITLibrary") {
            loadLibrary()
        }
    }

    func loadLibrary() {
        do {
            // Creating the ITLibrary object is what triggers the amplibraryd XPC error above.
            let _ = try ITLibrary(apiVersion: "1.0", options: .none)
        }
        catch {
            print(error)
        }
    }
}
I restarted my development machine and the Music app, with no luck.
Several unknown pictures have been added to my iPhone Photos library. I'm developing an iOS app in SwiftUI, and I'm sure that my app in development never asked for permission to save photos to the device, and that these pictures were not taken by me, downloaded from the web, or synced from my iCloud. Why does this happen? Has my account or my iPhone been attacked or hacked? By the way, my device has Developer Mode turned on; could that setting introduce a vulnerability, or could some debug tool add photos to my albums?
The attached pictures are the unknown photos in my Photos app. I've searched for this photo with Google and found that many developers on Stack Overflow use this picture when they run into trouble with image display or other image manipulation. Where does this photo come from, and why is it in my photo albums?
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. However, this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
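For reference, this is the call in question (a sketch, assuming a PHAsset named asset fetched earlier and photo library authorization already granted):
import Photos
PHPhotoLibrary.shared().performChanges({
    // Per the docs this should fail if the original content is not on this device,
    // which is the behavior that no longer seems to occur.
    let request = PHAssetChangeRequest(for: asset)
    request.revertAssetContentToOriginal()
}, completionHandler: { success, error in
    print("revert finished:", success, error ?? "no error")
})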
FxPlug is one of Apple's official SDKs, recently updated to version 4.3.2. In theory the SDK should guarantee that third parties can build plug-ins that are backward compatible with older versions of Final Cut Pro, Motion and Compressor.
The FxPlug SDK includes two frameworks that third-party developers like me end up bundling inside our plug-ins: FxPlug.framework and PlugInManager.framework.
Behind the scenes, the SDK relies on PlugInKit, but FxPlug.framework provides abstractions so that third parties don't have to handle the intricacies of XPC directly.
The most recent version of FxPlug.framework included with the SDK was possibly built with an error: its Info.plist shows an LSMinimumSystemVersion entry of 14.6, suggesting the binary may have been compiled and linked with MACOSX_DEPLOYMENT_TARGET set to 14.6 by accident.
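For what it's worth, that value can be double-checked from code with something like this sketch, which simply reads the bundled framework's Info.plist (the framework location is an assumption and depends on how the plug-in embeds it):
import Foundation
if let frameworkURL = Bundle.main.privateFrameworksURL?.appendingPathComponent("FxPlug.framework"),
   let bundle = Bundle(url: frameworkURL),
   let minOS = bundle.object(forInfoDictionaryKey: "LSMinimumSystemVersion") as? String {
    // Reportedly prints 14.6 for the SDK build in question.
    print("FxPlug.framework LSMinimumSystemVersion:", minOS)
}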
The problem: when an older version of Final Cut Pro or Motion loads a third-party plug-in (itself built with an appropriate deployment target, macOS 11 or 12, for example) on a system earlier than macOS 14.6, the dynamic linker loads Apple's own FxPlug.framework, and the process crashes immediately:
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libobjc.A.dylib 0x7ff81e065955 map_images_nolock + 5399
1 libobjc.A.dylib 0x7ff81e0643d6 map_images + 67
2 dyld 0x10bd551fb invocation function for block in dyld4::RuntimeState::setObjCNotifiers(void (*)(unsigned int, char const* const*, mach_header const* const*), void (*)(char const*, mach_header const*), void (*)(char const*, mach_header const*)) + 275
3 dyld 0x10bd506c9 dyld4::RuntimeState::withLoadersReadLock(void () block_pointer) + 41
4 dyld 0x10bd550e2 dyld4::RuntimeState::setObjCNotifiers(void (*)(unsigned int, char const* const*, mach_header const* const*), void (*)(char const*, mach_header const*), void (*)(char const*, mach_header const*)) + 82
5 dyld 0x10bd68d45 dyld4::APIs::_dyld_objc_notify_register(void (*)(unsigned int, char const* const*, mach_header const* const*), void (*)(char const*, mach_header const*), void (*)(char const*, mach_header const*)) + 79
6 libobjc.A.dylib 0x7ff81e064244 _objc_init + 1279
7 libdispatch.dylib 0x7ff81e01d993 _os_object_init + 13
8 libdispatch.dylib 0x7ff81e02b1b8 libdispatch_init + 311
9 libSystem.B.dylib 0x7ff828fd585f libSystem_initializer + 238
10 dyld 0x10bd5ae4f invocation function for block in dyld4::Loader::findAndRunAllInitializers(dyld4::RuntimeState&) const + 182
11 dyld 0x10bd81aad invocation function for block in dyld3::MachOAnalyzer::forEachInitializer(Diagnostics&, dyld3::MachOAnalyzer::VMAddrConverter const&, void (unsigned int) block_pointer, void const*) const + 242
12 dyld 0x10bd78e26 invocation function for block in dyld3::MachOFile::forEachSection(void (dyld3::MachOFile::SectionInfo const&, bool, bool&) block_pointer) const + 557
13 dyld 0x10bd47db3 dyld3::MachOFile::forEachLoadCommand(Diagnostics&, void (load_command const*, bool&) block_pointer) const + 129
14 dyld 0x10bd78bb7 dyld3::MachOFile::forEachSection(void (dyld3::MachOFile::SectionInfo const&, bool, bool&) block_pointer) const + 179
15 dyld 0x10bd81604 dyld3::MachOAnalyzer::forEachInitializer(Diagnostics&, dyld3::MachOAnalyzer::VMAddrConverter const&, void (unsigned int) block_pointer, void const*) const + 466
16 dyld 0x10bd5ad82 dyld4::Loader::findAndRunAllInitializers(dyld4::RuntimeState&) const + 144
17 dyld 0x10bd6165a dyld4::PrebuiltLoader::runInitializers(dyld4::RuntimeState&) const + 30
18 dyld 0x10bd6e76e dyld4::APIs::runAllInitializersForMain() + 38
19 dyld 0x10bd4c38d dyld4::prepare(dyld4::APIs&, dyld3::MachOAnalyzer const*) + 3443
20 dyld 0x10bd4b4e4 start + 388
Can someone at Apple with the right domain expertise confirm that this is the type of crash you would see when a framework built assuming macOS 14.6 and later runs in an older environment, where it lacks the extra code that would ensure backward compatibility with the earlier Objective-C runtime found on macOS 12.x?
Hello all! I've been having this issue for a while, on my iPhone 12 Pro.
The volume randomly drops when I'm listening to music or watching YouTube, TikTok, etc.: the audio slider stays at max volume, but the sound gets very quiet. I've followed various instructions, such as turning off audio awareness and other settings, but nothing seems to be working. It happens on phone calls too. Has anyone else had this issue and managed to fix it?
Topic: Media Technologies
SubTopic: Audio
Does anyone have a template of an Apple Projected Media Profile format description, or a file of a stereo wideFOV video?
Use case: I have two compatible cameras that I stereo-sync, and I want to move the projection information from the compatible video into the spatial video that combines them.
Every version I can come up with crashes the Apple Vision Pro, and when viewing it as spatial in Tahoe I just get a black screen.
Hi guys,
How to achieve the following feature on macOS when a USB device (Camera/Mic/Speaker) is connected:
When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (Camera/Mic/Speaker).
When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (Camera/Mic/Speaker).
I want to use an IOKit extension to hide or ignore the built-in camera to meet this requirement.
However, the extension can't be loaded due to invalid permissions on macOS 15.4.1 (build 24E263). I also tried running it on macOS 14.4.1, where it can be loaded, but it doesn't load automatically after restarting the laptop because the KDK version doesn't match.
Could you please give me some suggestions? Is it possible to hide the built-in camera on Apple silicon (M-series) Macs? Is there any other way to achieve this? Thanks a lot.
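One API I have been looking at (a sketch, not a full answer, and it does not hide the built-in camera): on macOS 14 and later, external cameras can be discovered with AVCaptureDevice.DiscoverySession, and setting the user-preferred camera influences apps that adopt the system's preferred-camera selection:
import AVFoundation
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.external],
                                                 mediaType: .video,
                                                 position: .unspecified)
if let usbCamera = discovery.devices.first {
    // Only apps that honor the system/user preferred camera will follow this.
    AVCaptureDevice.userPreferredCamera = usbCamera
}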
Can anyone explain how AVAssetExportSession works in iOS 18 and earlier versions?
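For context, this is the minimal usage pattern I have in mind (a sketch; the asset and output URL are placeholders), covering both the iOS 18 async export API and the older callback-based one:
import AVFoundation
func exportVideo(asset: AVAsset, to outputURL: URL) async throws {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else { return }
    if #available(iOS 18.0, *) {
        // iOS 18: async export that takes the destination and file type directly.
        try await session.export(to: outputURL, as: .mp4)
    } else {
        // Earlier iOS versions: configure the session, then use the callback-based export.
        session.outputURL = outputURL
        session.outputFileType = .mp4
        await withCheckedContinuation { continuation in
            session.exportAsynchronously { continuation.resume() }
        }
        if let error = session.error { throw error }
    }
}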
In my project, I want to add an emoji to my video, but the emoji image becomes dark when added to an HDR video. I've tried converting my emoji image to an HDR format, or creating it in an HDR format, but neither seems to work. I hope someone with experience of this case can help me.
I am developing an iOS app with video call functionality and implementing Picture in Picture (PiP) mode for video calls. The issue I am facing is that the camera stops capturing video when the app goes to the background, even though the PiP view is still visible.
I have noticed that some apps, like Telegram, manage to keep the camera working in PiP mode while the app is in the background. How can I achieve this in my app?
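What I am currently looking at (a sketch; I have not confirmed this is what those apps use) is AVCaptureSession's multitasking camera access support, available since iOS 16 for call-style apps, which keeps capture running while the app is in the background when the device supports it:
import AVFoundation
let session = AVCaptureSession()
// Check support first; it depends on the device and the app's use case (e.g. video calls).
if session.isMultitaskingCameraAccessSupported {
    session.isMultitaskingCameraAccessEnabled = true
}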
I have a CarPlay implementation and I want to show previous/next track buttons on the player UI:
MPRemoteCommandCenter.shared().seekForwardCommand.isEnabled = false
MPRemoteCommandCenter.shared().seekBackwardCommand.isEnabled = false
MPRemoteCommandCenter.shared().previousTrackCommand.isEnabled = true
MPRemoteCommandCenter.shared().nextTrackCommand.isEnabled = true
It works correctly in the CarPlay Simulator, but in some cars only the seek buttons are shown.
I have to assume it is a problem on the car side, but I would like your opinion; maybe there is a piece I'm missing.
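In case it matters, I also register handlers for these commands (a sketch of what I have; my understanding is that a command without a target may not surface a button at all):
import MediaPlayer
let center = MPRemoteCommandCenter.shared()
_ = center.nextTrackCommand.addTarget { _ in
    // Advance playback to the next track here (player call is app-specific).
    return .success
}
_ = center.previousTrackCommand.addTarget { _ in
    // Go back to the previous track here.
    return .success
}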
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:
let playerItem: AVPlayerItem = ...
let allMetrics = playerItem.allMetrics()
Task.init {
    print("metrics task started")
    do {
        for try await metricEvent in allMetrics {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}
Running this in Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed when the video associated with this AVPlayerItem is playing. Only the "metric task started" diagnostic message is seen.
What am I doing wrong that prevents metric data being received?
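As a follow-up experiment I am considering subscribing to a single event type with the typed metrics API instead of the merged stream (a sketch; the event type below is only an example and an assumption on my part):
Task {
    do {
        for try await event in playerItem.metrics(forType: AVMetricPlayerItemRateChangeEvent.self) {
            print("rate change event: \(event)")
        }
    } catch {
        print("typed metrics error \(error)")
    }
}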
Hello everyone, I need some help with the following. If you know anything about it, please comment.
Overview
We are planning to develop an app using the “Support external cameras in your iPadOS app” feature introduced in iPadOS 17.
Before implementing this feature, it is necessary for the iPad to recognize external cameras. However, among the iPad models compatible with iPadOS 17, we have found that some of the iPads owned by our development team can recognize external cameras, while others cannot.
If you have any reports regarding compatibility issues or information on how to resolve these problems, please share them with us.
Detailed Explanation:
The results of our investigation are as follows:
External cameras used: 360-degree cameras
- RICOH Theta X, firmware 2.61.0 (2024/12/26, latest)
- RICOH Theta Z1
Tested iPads:
- 12.9-inch iPad Pro (3rd generation), iPadOS 17.5.1: OK (external camera recognized)
- 11-inch iPad Pro (M4), iPadOS 18.2: NG (not recognized)
Verification Method
Step 1: Power on the iPad and the external camera, ensuring both are ready for connection.
Step 2: Connect the iPad and the external camera using a USB-C cable.
Step 3: Launch FaceTime on the iPad and check the displayed camera feed.
If the external camera is recognized, the feed from the external camera will be displayed.
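In addition to the FaceTime check, we are considering a small programmatic probe (iPadOS 17 or later; a sketch) to see whether AVFoundation itself reports the external camera:
import AVFoundation
let discovery = AVCaptureDevice.DiscoverySession(deviceTypes: [.external],
                                                 mediaType: .video,
                                                 position: .unspecified)
print("External cameras:", discovery.devices.map(\.localizedName))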
Hello!
Just curious if AVAssetWriter append() randomly fails for anyone but me? Seems to happen when live streaming (encoding with VTCompressionSession) and recording (with AVAssetWriter) at the same time.
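For context, my appends are wrapped roughly like this (a sketch with placeholder names): readiness is checked before each append, and the writer's status and error are logged when append returns false:
import AVFoundation
func append(_ sampleBuffer: CMSampleBuffer,
            to input: AVAssetWriterInput,
            of writer: AVAssetWriter) {
    // Appending while the input is not ready is a common source of failed appends
    // when something else (e.g. a VTCompressionSession) is competing for the same buffers.
    guard input.isReadyForMoreMediaData else { return }
    if !input.append(sampleBuffer) {
        // When append returns false, the writer usually transitions to .failed with an error.
        print("append failed, status:", writer.status.rawValue, "error:", writer.error ?? "none")
    }
}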
Topic: Media Technologies
SubTopic: Streaming
Does the library exist in Xcode 16.4?
"import WorldCaptureKit" gives the error "No such module 'WorldCaptureKit'".
And I cannot find any information about the library in the Apple documentation.
But AI tools keep suggesting that I use the library.
Topic: Media Technologies
SubTopic: Photos & Camera
Hello there,
I need to move through video loaded in an AVPlayer one frame at a time back or forth. For that I tried to use AVPlayerItem's method step(byCount:) and it works just fine.
However, I need to know when the stepping has happened, and as far as I can observe it is not immediate. If I check currentTime() right after calling the method, it is unchanged; if I check slightly later (depending on the video itself), it shows the correct "jumped" time.
To achieve my goal I tried subclassing AVPlayerItem and implementing my own async method using NotificationCenter and the timeJumpedNotification, assuming it would be delivered when the time actually jumps, but that is not the case.
Here is my "stripped" and simplified version of the custom Player Item:
import AVFoundation

final class PlayerItem: AVPlayerItem {
    private var jumpCompletion: ((CMTime) -> ())?

    override init(asset: AVAsset, automaticallyLoadedAssetKeys: [String]?) {
        super.init(asset: asset, automaticallyLoadedAssetKeys: automaticallyLoadedAssetKeys)
        NotificationCenter.default.addObserver(self, selector: #selector(timeDidChange(_:)), name: AVPlayerItem.timeJumpedNotification, object: self)
    }

    deinit {
        NotificationCenter.default.removeObserver(self, name: AVPlayerItem.timeJumpedNotification, object: self)
        jumpCompletion = nil
    }

    @discardableResult func step(by count: Int) async -> CMTime {
        await withCheckedContinuation { continuation in
            step(by: count) { time in
                continuation.resume(returning: time)
            }
        }
    }

    func step(by count: Int, completion: @escaping ((CMTime) -> ())) {
        guard jumpCompletion == nil else {
            completion(currentTime())
            return
        }
        jumpCompletion = completion
        step(byCount: count)
    }

    @objc private func timeDidChange(_ notification: Notification) {
        switch notification.name {
        case AVPlayerItem.timeJumpedNotification where notification.object as? AVPlayerItem == self:
            jumpCompletion?(currentTime())
            jumpCompletion = nil
        default: return
        }
    }
}
In short, the notification never gets posted, so the above does not work.
I guess the key is what the docs say about timeJumpedNotification:
"A notification the system posts when a player item's time changes discontinuously."
So step(byCount:) is apparently not considered a discontinuous operation and doesn't trigger it.
I'd be really grateful if somebody could help, as I don't want to use seek(to:toleranceBefore:toleranceAfter:), mainly because it's not accurate in terms of the exact next/previous frame: the video might have a variable frame rate, which sometimes causes repeated frames or even skips one.
Thanks a lot
Howdy. I'm trying to access media from a user's song library and receive:
<ICUserIdentityStoreACAccountBackend: 0x148f8af30> Failed to initialize active account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
I'm told I need to add a Media Library Access capability. Nothing like this shows up in Xcode under Signing & Capabilities > + Capability, and I can't find anything like it for my account on developer.apple.com.
How do I enable myself, and a test user on another iPhone, to access my music and their music respectively?
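For context, what I have found so far (not authoritative): the NSAppleMusicUsageDescription Info.plist key plus a runtime authorization request seem to be involved, and as far as I know the MusicKit app service is enabled per App ID under Certificates, Identifiers & Profiles on developer.apple.com rather than in Xcode's capabilities list. A minimal sketch of the runtime request:
import MusicKit
func requestMusicAccess() async {
    // Prompts the user; NSAppleMusicUsageDescription must be set in Info.plist.
    let status = await MusicAuthorization.request()
    print("MusicKit authorization status:", status)
}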
Thanks!
Topic: Media Technologies
SubTopic: General
Tags: App Tracking Transparency, Media Player, iOS, MusicKit