Hi everyone,
I am developing a .NET MAUI Mac Catalyst app (sandboxed) that communicates with a custom vendor-specific HID USB device.
Within the Catalyst app, I am using a native library (built with Objective-C and IOKit) and calling into it from C# via P/Invoke.
The HID communication layer relies on IOHIDManager and IOUSBInterface APIs.
The device is correctly detected and opened using IOHIDManager APIs.
However, the callback registered with IOHIDDeviceRegisterInputReportCallback never fires; I don't receive any input reports.
To investigate, I also tried the low-level IOKit USB APIs, again via P/Invoke into the native library.
When attempting to open the USB interface with IOUSBInterfaceOpen() or IOUSBInterfaceOpenSeize(), both calls fail with kIOReturnNotPermitted (0xe00002e2), an access-denied error, even though the device enumerates and opens successfully.
Interestingly, when I call IOHIDDeviceSetReport(), it returns status = 0, meaning I can successfully send feature reports to the device.
Only input reports (via the InputReportCallback) fail to arrive.
I’ve confirmed this is not a device issue — the same hardware and protocol work perfectly under Windows using the HIDSharp library, where both input and output reports function correctly.
What I've verified:
• Disabling sandboxing doesn't change the behavior.
• The device uses a vendor-specific usage page (not a standard HID usage like keyboard/mouse).
• Enumeration, open, and SetReport all succeed; only reading input reports fails.
• Tried polling with an IOHIDQueue, but the Input_Misc elements fail to add to the queue.
• Tried calling GetReport in a loop, with no success. (The minimal registration pattern I compared against is sketched below.)
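For reference, the sketch below (plain Swift, written as a command-line check, with placeholder VID/PID) is the minimal registration pattern I compared against. As far as I can tell, the two details that most often keep the callback silent are scheduling the manager on a run loop that is actually running and keeping the report buffer alive for the lifetime of the registration:

import Foundation
import IOKit.hid

let manager = IOHIDManagerCreate(kCFAllocatorDefault, IOOptionBits(kIOHIDOptionsTypeNone))
// Placeholder VID/PID; substitute the device's vendor-specific values.
IOHIDManagerSetDeviceMatching(manager, [kIOHIDVendorIDKey: 0x1234, kIOHIDProductIDKey: 0x0006] as CFDictionary)
// Input report callbacks are delivered only through a scheduled, running run loop.
IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetCurrent(), CFRunLoopMode.defaultMode.rawValue)
IOHIDManagerOpen(manager, IOOptionBits(kIOHIDOptionsTypeNone))

if let devices = IOHIDManagerCopyDevices(manager) as? Set<IOHIDDevice>, let device = devices.first {
    let bufferSize = 64 // at least the device's maximum input report size
    let buffer = UnsafeMutablePointer<UInt8>.allocate(capacity: bufferSize) // must outlive the registration
    IOHIDDeviceRegisterInputReportCallback(device, buffer, bufferSize, { _, _, _, _, reportID, _, length in
        print("input report \(reportID): \(length) bytes")
    }, nil)
}
CFRunLoopRun() // callbacks fire only while this run loop runs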
Problem Description:
Our USB hubs are capable of sending Vendor Defined Messages (VDMs) over a USB Type-C cable connection, so they can programmatically place iOS, iPadOS, and macOS devices into DFU mode without requiring any physical button interaction.
Recently, we identified an issue when invoking DFU mode on an iPhone 15 using this method. Upon entering DFU mode, the device enumerates with USB Product ID 0x1881 ("Debug USB", the KIS interface). At that point, the deviceinterfaced daemon (launched by launchd) immediately detects the device and claims exclusive access to the USB interface.
As a result, when our API Service attempts to communicate with the device through standard IOKit methods, it fails with the following error:
0xe00002c5 ((iokit/common) exclusive access and device already open)
This prevents our libraries from reading the iBoot string (the USB serial number string) that Apple devices normally expose in standard or recovery modes, information that includes ECID, CPID, CPRV, CPFM, BDID, and SCEP. This is a significant barrier: without that critical information, our API service cannot perform the subsequent device restoration operations.
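As a possible stopgap, we are also considering reading the string from the IORegistry instead, since registry property reads do not require opening the interface that deviceinterfaced holds. The sketch below assumes the iBoot serial string is still published as the device's "USB Serial Number" property while in KIS mode, which we have not yet confirmed:

import IOKit
import IOKit.usb

guard let matching = IOServiceMatching("IOUSBHostDevice") else { exit(1) }
(matching as NSMutableDictionary)["idVendor"] = 0x05ac   // Apple
(matching as NSMutableDictionary)["idProduct"] = 0x1881  // Debug USB / KIS

var iterator: io_iterator_t = 0
if IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == KERN_SUCCESS {
    var service = IOIteratorNext(iterator)
    while service != 0 {
        // A registry read, unlike IOUSBInterfaceOpen, should not hit the exclusive-access error.
        if let serial = IORegistryEntryCreateCFProperty(service, "USB Serial Number" as CFString, kCFAllocatorDefault, 0)?.takeRetainedValue() as? String {
            print("iBoot string: \(serial)") // CPID/CPRV/BDID/ECID/etc.
        }
        IOObjectRelease(service)
        service = IOIteratorNext(iterator)
    }
    IOObjectRelease(iterator)
}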
Request for Guidance:
I've included the following context for your analysis and review. Using the launchctl unload command can temporarily stop deviceinterfaced; however, I'd like to know whether there is an API-level mechanism to programmatically prevent it from claiming access, from within our API Service.
Could you please advise on the following points?
1. Managing deviceinterfaced Access
• What is the proper way to stop or prevent deviceinterfaced from claiming exclusive access in this case, so that the API Service can read the device information and start restoring the device from that point?
• Is there a recommended method or entitlement that allows third-party services to communicate with Apple devices while they are in Debug USB (KIS) mode?
2. Guidelines and API Access
• Are there any Apple-supported APIs or developer guidelines that would permit controlled access to the iBoot interface without conflicting with deviceinterfaced?
I have an iOS/iPadOS app, and I'm trying to communicate with a USB smart card reader using CryptoTokenKit on all platforms (iOS/iPadOS/macOS).
Minimal Repro Code
import CryptoTokenKit
import SwiftUI

struct ContentView: View {
    @State var status = ""

    var body: some View {
        VStack {
            Text("Status: \(status)")
        }
        .padding()
        .onAppear {
            let manager = TKSmartCardSlotManager.default
            if manager != nil {
                status = "Initialized"
            } else {
                status = "Unsupported"
            }
        }
    }
}
And my entitlement file has only one key:
com.apple.security.smartcard = YES
Behavior
• iPadOS (on device): status = "Initialized" ✅
• macOS (native macOS app, with the required CryptoTokenKit entitlement): status = "Initialized" ✅
• macOS (Designed for iPad, regardless of CryptoTokenKit entitlement): status = "Unsupported" → TKSmartCardSlotManager.default is nil ❌
Expectation
Given that the same iPadOS build initializes TKSmartCardSlotManager, I expected the iPad app running in Designed for iPad mode on an Apple silicon Mac to behave the same (or to have a documented limitation).
Questions
Is CryptoTokenKit (and specifically TKSmartCardSlotManager) supported for iPad apps running on Mac in Designed for iPad mode?
If support exists, what entitlements / capabilities are required for USB smart-card access in this configuration?
If not supported, is Mac Catalyst the correct/only path on macOS to access USB smart-card readers via CryptoTokenKit?
Are there recommended alternatives for iPad apps on Mac (Designed for iPad) to communicate with USB smart-card readers (e.g., ExternalAccessory, DriverKit, etc.), or is this scenario intentionally unsupported?
Thanks!
Hi,
We are facing an issue where commissioning our Matter device to Google Home via an iOS device fails 100% of the time.
Here is our test summary regarding the issue:
TestCase1 [OK]: Commissioning our Matter 1.4.0 device to a Google Nest Hub 2 with an Android device (see log DoorWindow_2.0.1_Google_Success.txt)
TestCase2 [NG]: Commissioning our Matter 1.4.0 device to a Google Nest Hub 2 with an iPhone 13 or iPhone 16 (see log DoorWindow_2.0.1_Google_by_iOS_NG.txt)
TestCase3 [OK]: Commissioning our Matter 1.3.0 device to a Google Nest Hub 2 with an iPhone 13
In TestCase2, we noticed that the device was first commissioned to iOS (Apple Keychain), and then iOS opened a commissioning window again to commission it into Google's ecosystem; the device failed at that second step. So we also tried the following:
Commissioning the device to Apple Home works as expected; then sharing the device to the Google Home app on iOS also fails.
Commissioning the device to Apple Home works as expected; then sharing the device to the Google Home app on Android works as expected, and the device pops up in Google Home on iOS as well.
Could you help check what's the issue of TestCase2?
Appended below is our test environment:
NestHub 2 version
Google Home app version
Hello.
I am attempting to wrap the C library libnfc as a Swift library. This is not for use on macOS - it's mainly for use on Linux (Raspberry Pi).
I have a USB reader and my code appears to work so far; however, the code/test/debug cycle is suboptimal when I'm running the code on the Pi.
As I use a Mac for day-to-day coding, I'd prefer to use Xcode and my Mac for development. macOS appears to capture the NFC hardware for its own frameworks, and attempting to open a connection to the USB device gives an "Unable to claim USB interface (Permission denied)" error.
ioreg shows that the hardware is claimed by an Apple framework:
"UsbExclusiveOwner" = "pid 10946, com.apple.ifdbun"
Is there a way to temporarily override that system and use the hardware myself? I've tried Googling, but most of the replies are out of date, and Claude's advice of launchctl unload /System/Library/LaunchDaemons/com.apple.ifdreader.plist doesn't appear to work...
I'm wary of disabling SIP - is there a simple way to have access to the hardware myself?
Thanks.
Which HomeKit API provides the Home app's scene (HMActionSet) functionality "Remove from Home View" and "Add to Home View"?
There must be a public API for this, because at least one third-party application shows/hides scenes appropriately as they are set up in Home; nevertheless, whatever I try, I can't find the API.
Thanks!
I am creating a barcode reader using the AVFoundation framework for iOS and iPadOS. The read result goes into payloadStringValue, but I want to check the control characters contained in the symbol, so I am using the raw data from description, a property VNBarcodeObservation inherits via NSObjectProtocol. However, I noticed that if the raw data is longer than 26 bytes, some of it is omitted from the description output. My question is: is it possible to configure this so that all of the raw data is written out without omission? If so, could you please tell me how to set this up? Also, if you know of any other way to extract the raw barcode data, I would appreciate it if you could let me know.
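One idea I am considering, since description is only a debug dump that Foundation truncates for long data: reading the bytes directly instead of parsing the description. If I understand the newer API correctly, VNBarcodeObservation gained a payloadData property in iOS 17; the sketch below is untested and relies on that assumption.

import Vision

// Untested sketch: prefer the raw payload over the truncated debug description.
func rawBytes(from observation: VNBarcodeObservation) -> [UInt8]? {
    if #available(iOS 17.0, *) {
        guard let data = observation.payloadData else { return nil } // assumed API, iOS 17+
        return [UInt8](data) // full payload, control characters included
    }
    return nil // earlier systems expose only payloadStringValue
}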
Thank you.
I am using NFC, and when the phone is near the NFC reader, it sometimes produces the error below:
2024-07-15 15:43:03.608427+0800 TestNFC[16022:1038141] [xpc.exceptions] <NSXPCConnection: 0x282ba90e0> connection to service with pid 58 named com.apple.nfcd.service.corenfc: Exception caught during decoding of received selector didDetectExternalReaderWithNotification:, dropping incoming message.
Exception: Exception while decoding argument 0 (#2 of invocation):
Exception: decodeObjectForKey: class "NFFieldNotification" not loaded or does not exist
my code:
#import <CoreNFC/CoreNFC.h>

@interface ViewController () <NFCTagReaderSessionDelegate>
@property (strong, nonatomic) NFCTagReaderSession *session;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    if (@available(iOS 13.0, *)) {
        // Initialize NFC and set the NFCTagReaderSessionDelegate.
        if (NFCNDEFReaderSession.readingAvailable) {
            self.session = [[NFCTagReaderSession alloc] initWithPollingOption:NFCPollingISO14443 delegate:self queue:nil];
            // Message shown in the NFC prompt.
            self.session.alertMessage = @"Ready to scan. Hold the card near the phone.";
            // Start NFC.
            [self.session beginSession];
        }
    } else {
    }
}
#pragma mark - NFCTagReaderSessionDelegate
// Failure callback; this method is still called even after a successful read.
- (void)tagReaderSessionDidBecomeActive:(NFCTagReaderSession *)session API_AVAILABLE(ios(13.0)) {
    NSLog(@"tagReaderSessionDidBecomeActive");
}

- (void)tagReaderSession:(NFCTagReaderSession *)session didInvalidateWithError:(NSError *)error API_AVAILABLE(ios(13.0)) {
    NSLog(@"readerSession:didInvalidateWithError: (%@)", [error localizedDescription]);
}

- (void)tagReaderSession:(NFCTagReaderSession *)session didDetectTags:(NSArray<__kindof id<NFCTag>> *)tags API_AVAILABLE(ios(13.0)) {
}

@end
I have a 4-input, 4-output hardware device and an 8-input, 8-output virtual device, which I combine into an aggregate device. I am using the SimplyCoreAudio library to get the channel count. The code is as follows:
aggregationDevice!.channels(scope: .input)  // => 12
aggregationDevice!.channels(scope: .output) // => 12
When the program's MicrophoneMode is set to standard, the channel count is correct. However, when I set the MicrophoneMode to voiceIsolation, the channel count is incorrect:
aggregationDevice!.channels(scope: .input)  // => 4
aggregationDevice!.channels(scope: .output) // => 12
Below is the code for creating the aggregate device:
func createAggregateDevice(mainDevice: AudioDevice,
                           secondDevice: AudioDevice?,
                           named name: String,
                           uid: String) -> AudioDevice? {
    guard let mainDeviceUID = mainDevice.uid else { return nil }

    var deviceList: [[String: Any]] = [
        [
            kAudioSubDeviceUIDKey: mainDeviceUID,
            kAudioSubDeviceDriftCompensationKey: 1
        ]
    ]

    // make sure same device isn't added twice
    if let secondDeviceUID = secondDevice?.uid, secondDeviceUID != mainDeviceUID {
        deviceList.append([
            kAudioSubDeviceUIDKey: secondDeviceUID,
            kAudioSubDeviceDriftCompensationKey: 1,
            kAudioSubDeviceInputChannelsKey: 8
        ])
    }

    let desc: [String: Any] = [
        kAudioAggregateDeviceNameKey: name,
        kAudioAggregateDeviceUIDKey: uid,
        kAudioAggregateDeviceSubDeviceListKey: deviceList,
        kAudioAggregateDeviceMainSubDeviceKey: mainDeviceUID,
        kAudioAggregateDeviceIsPrivateKey: false,
    ]

    var deviceID: AudioDeviceID = 0
    let error = AudioHardwareCreateAggregateDevice(desc as CFDictionary, &deviceID)
    guard error == noErr else { return nil }

    return AudioDevice.lookup(by: deviceID)
}
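For cross-checking the numbers against the HAL directly, independent of SimplyCoreAudio, a query like this sketch should report the same counts (deviceID is assumed to be the aggregate's AudioDeviceID):

import CoreAudio

// Sums the channels across the device's input stream configuration.
func inputChannelCount(of deviceID: AudioObjectID) -> Int {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyStreamConfiguration,
        mScope: kAudioDevicePropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain
    )
    var size: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(deviceID, &address, 0, nil, &size) == noErr else { return 0 }
    let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(size), alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, raw) == noErr else { return 0 }
    let list = UnsafeMutableAudioBufferListPointer(raw.assumingMemoryBound(to: AudioBufferList.self))
    return list.reduce(0) { $0 + Int($1.mNumberChannels) }
}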
I hope someone can tell me the reason. Thank you!
We just updated our ATS to the latest 8.3.0 version and tried to run the iAP2 Session Test via the BPA100 Bluetooth Analyzer, and we are experiencing the EXC_BAD_INSTRUCTION crash below. The same test still works on ATS version 6. Please advise.
Process: ATS [1782]
Path: /private/var/folders/*/ATS.app/Contents/MacOS/ATS
Identifier: com.apple.ATSMacApp
Version: 8.3.0 (1826)
Build Info: ATSMacApp-1826000000000000~2 (1A613)
Code Type: X86-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2025-01-27 11:05:21.1334 -0800
OS Version: macOS 15.2 (24C101)
Report Version: 12
Bridge OS Version: 9.2 (22P2093)
Anonymous UUID: 098E2BB5-CB98-CA1C-CEFE-188AF6EFE8CF
Time Awake Since Boot: 9700 seconds
System Integrity Protection: enabled
Crashed Thread: 2 com.apple.ATSMacApp.FrontlineFrameworkInterface
Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 4 Illegal instruction: 4
Terminating Process: exc handler [1782]
Hello everyone,
I want to send haptics to a PS4 controller.
CHHapticPatternPlayer and CHHapticAdvancedPatternPlayer both work well on iPhone.
With the PS4 controller, if I use CHHapticPatternPlayer everything works, but if I use CHHapticAdvancedPatternPlayer I get an error. I want to use CHHapticAdvancedPatternPlayer for its additional settings. I haven't found any information on how to fix this:
CHHapticEngine.mm:624 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn't be completed. (com.apple.CoreHaptics error -4811.)'
The engine stopped because a system error occurred.
AVHapticClient.mm:1228 -[AVHapticClient getSyncDelegateForMethod:errorHandler:]_block_invoke: ERROR: Sync XPC call for 'loadAndPrepareHapticSequenceFromEvents:reply:' (client ID 0x21) failed: Couldn't communicate with a helper application.
Failed to create or play the pattern: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics" UserInfo={NSDebugDescription=connection to service with pid 5087 named com.apple.GameController.gamecontrollerd.haptics}
My haptics class:
import Foundation
import CoreHaptics
import GameController

protocol HapticsControllerDelegate: AnyObject {
    func didConnectController()
    func didDisconnectController()
    func enginePlayerStart(value: Bool)
}

final class HapticsControllerManager {
    static let shared = HapticsControllerManager()

    private var isSetup = false
    private var hapticEngine: CHHapticEngine?
    private var hapticPlayer: CHHapticAdvancedPatternPlayer?

    weak var delegate: HapticsControllerDelegate? {
        didSet {
            if delegate != nil {
                startObserving()
            }
        }
    }

    deinit {
        NotificationCenter.default.removeObserver(self)
    }

    private func startObserving() {
        guard !isSetup else { return }
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidConnect),
            name: .GCControllerDidConnect,
            object: nil
        )
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(controllerDidDisconnect),
            name: .GCControllerDidDisconnect,
            object: nil
        )
        isSetup = true
    }

    @objc private func controllerDidConnect(notification: Notification) {
        delegate?.didConnectController()
        self.createAndStartHapticEngine()
    }

    @objc private func controllerDidDisconnect(notification: Notification) {
        delegate?.didDisconnectController()
        hapticEngine = nil
        hapticPlayer = nil
    }

    private func createAndStartHapticEngine() {
        guard let controller = GCController.controllers().first else {
            print("No controller connected")
            return
        }
        guard controller.haptics != nil else {
            print("Haptics not supported on this controller")
            return
        }
        hapticEngine = createEngine(for: controller, locality: .default)
        hapticEngine?.playsHapticsOnly = true
        do {
            try hapticEngine?.start()
        } catch {
            print("Failed to start the haptic engine: \(error)")
        }
    }

    private func createEngine(for controller: GCController, locality: GCHapticsLocality) -> CHHapticEngine? {
        guard let engine = controller.haptics?.createEngine(withLocality: locality) else {
            print("Failed to create engine.")
            return nil
        }
        print("Successfully created engine.")
        engine.stoppedHandler = { reason in
            print("The engine stopped because \(reason.message)")
        }
        engine.resetHandler = {
            print("The engine reset --> Restarting now!")
            do {
                try engine.start()
            } catch {
                print("Failed to restart the engine: \(error)")
            }
        }
        return engine
    }

    func startHapticFeedback(haptics: [CHHapticEvent]) {
        do {
            let pattern = try CHHapticPattern(events: haptics, parameters: [])
            hapticPlayer = try hapticEngine?.makeAdvancedPlayer(with: pattern)
            hapticPlayer?.loopEnabled = true
            try hapticPlayer?.start(atTime: 0)
            self.delegate?.enginePlayerStart(value: true)
        } catch {
            self.delegate?.enginePlayerStart(value: false)
            print("Failed to create or play the pattern: \(error)")
        }
    }

    func stopHapticFeedback() {
        do {
            try hapticPlayer?.stop(atTime: 0)
            self.delegate?.enginePlayerStart(value: false)
        } catch {
            self.delegate?.enginePlayerStart(value: true)
            print("Failed to stop haptic playback: \(error)")
        }
    }
}
extension CHHapticEngine.StoppedReason {
    var message: String {
        switch self {
        case .audioSessionInterrupt:
            return "the audio session was interrupted."
        case .applicationSuspended:
            return "the application was suspended."
        case .idleTimeout:
            return "an idle timeout occurred."
        case .systemError:
            return "a system error occurred."
        case .notifyWhenFinished:
            return "playback finished."
        case .engineDestroyed:
            return "the engine was destroyed."
        case .gameControllerDisconnect:
            return "the game controller disconnected."
        @unknown default:
            return "an unknown error occurred."
        }
    }
}
Custom haptic events:
static func changeVibrationPower(power: HapricPower) -> [CHHapticEvent] {
    let continuousEvent = CHHapticEvent(eventType: .hapticContinuous, parameters: [
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0),
        CHHapticEventParameter(parameterID: .hapticIntensity, value: power.value)
    ], relativeTime: 0, duration: 0.5)
    return [continuousEvent]
}
Hello,
I am currently working on a USB HID-class device and I wanted to test communications between various OSes and the device.
I was able to communicate through standard USB with the device on other OSes such as Windows and Linux, through their integrated kernel modules and generic HID drivers. As a last test, I wanted to test macOS as well.
This is my code, running in a Swift-based command line utility:
import Foundation
import CoreHID

let matchingCriteria = HIDDeviceManager.DeviceMatchingCriteria(vendorID: 0x1234, productID: 0x0006) // This is the VID/PID combination that the device is actually listed under
let manager = HIDDeviceManager()

for try await notification in await manager.monitorNotifications(matchingCriteria: [matchingCriteria]) {
    switch notification {
    case .deviceMatched(let deviceReference):
        print("Device Matched!")
        guard let client = HIDDeviceClient(deviceReference: deviceReference) else {
            fatalError("Unable to create client. Exiting.") // crash on purpose
        }
        let report = try await client.dispatchGetReportRequest(type: .input)
        print("Get report data: [\(report.map { String(format: "%02x", $0) }.joined(separator: " "))]")
    case .deviceRemoved(_):
        print("A device was removed.")
    default:
        continue
    }
}
The client.dispatchGetReportRequest(...) line always fails, and if I turn the try expression into a forced one (try!), the code, unsurprisingly, crashes.
The line throws a CoreHID.HIDDeviceError.unknown() error with a seemingly meaningless IOReturn code (the last time I tried, I got an IOReturn value of -536870211).
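For what it's worth, the raw number at least decodes cleanly with a quick sketch:

let ioReturn: Int32 = -536870211
let bits = UInt32(bitPattern: ioReturn)
print(String(format: "0x%08X", bits))              // 0xE00002BD
print(String(format: "code 0x%X", bits & 0x3FFF))  // err_get_code() portion: 0x2BD

If I'm reading <IOKit/IOReturn.h> correctly, 0xE00002BD is kIOReturnNoMemory, which is not an error I would expect from a simple GET_REPORT request.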
The first instinct is to blame my own custom USB device for not working properly, but the code doesn't cooperate with ANY USB device currently connected: not a keyboard (with permissions granted), not a controller, nothing.
I did make sure to enable USB device access in the entitlements (when I tried to run this code in a simple Cocoa app) as well.
...What am I doing wrong here? What does the IOReturn code mean?
Thanks in advance for anybody willing to help out!
I have an iPhone XS Max, and the battery health has degraded to 69%.
I noticed that whenever I put it on charge, it just restarts, and it keeps doing that until I start using it or keep the screen on while it charges.
Please, is it my charger, or is it because the battery health has degraded to 69%?
The battery level value is inaccurate: the API returns the battery percentage in multiples of 5. For example, the actual battery percentage is 42, but the API returns 40. Please fix this issue if possible, because I checked and devices running iOS versions below 17 appear to work fine.
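For reference, this is essentially how the value is read (a minimal sketch of the standard UIKit battery API):

import UIKit

// Battery monitoring must be enabled before batteryLevel returns a valid value.
UIDevice.current.isBatteryMonitoringEnabled = true
let percent = Int((UIDevice.current.batteryLevel * 100).rounded())
print("Battery: \(percent)%") // on iOS 17+, this comes back in steps of 5 (e.g. 40 instead of 42)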
I'm having an issue with OneDrive that is affecting our company iPads. Users were able to drag and drop any folder or file over, and now they can't. They are on the latest updates for OneDrive and iOS. Can someone look at this? I also reached out to Microsoft, and they said nothing has changed on their end.
When are you going to fix the CarPlay issues in this new update? I use CarPlay for work, and it's really an issue: nothing is working, and it takes up entirely too much space.
Our company is developing an MFi headset with a button that we would like to use for initiating PTT.
We can detect the button press and initiate PTT successfully, even when the app is not in the foreground, using the ExternalAccessory framework.
But I wonder whether this is a coincidence or a scenario that should reliably work with Push to Talk. For context, the flow is essentially the sketch below.
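Here is a reduced sketch; the class and property names are just illustrative, and the channel join and delegate setup are omitted:

import PushToTalk

// Sketch of our flow: the ExternalAccessory button handler ends up calling this.
// channelManager and channelUUID come from our existing Push to Talk channel setup.
final class AccessoryPTTBridge {
    var channelManager: PTChannelManager?
    var channelUUID: UUID?

    func accessoryButtonPressed() {
        guard let uuid = channelUUID else { return }
        // This currently succeeds even when the app is not in the foreground,
        // which is exactly the behavior we'd like confirmed as supported.
        channelManager?.requestBeginTransmitting(channelUUID: uuid)
    }
}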
After updating to iOS 18.4, the 3D scanning function, including the AR function in Apple Clips, cannot be used. Does anyone else have the same problem?
Every time after I download an app, this window opens and never closes. How do I close it?
My audio and MIDI sequencer application consumes about 600% CPU with 10 different instruments during playback, and approximately 100% while idle.
What is the maximum CPU power an application can consume? Are there any limits, and could they be modified?
I am asking because if I add more instruments, the real-time behaviour gets bad at around 700% CPU.
My hardware:
MacBook Pro
14-inch, Nov 2024
Apple M4 Pro
24 GB