Hi everyone,
I’m working on a custom camera implementation in iOS using native code. My goal is to capture unprocessed, realistic images directly from the camera, without any filters or post-processing applied by the system.
I’ve implemented RAW image capture using the native camera APIs (AVFoundation) and successfully received .dng files. However, even the RAW outputs don’t look like the real environment — the colors, tone, and exposure still seem processed or corrected in some way.
I’ve tried various configurations such as photoSettings.rawPhotoPixelFormatType, experimenting with AVCaptureDevice and AVCapturePhotoOutput settings, and reviewing ProRAW and standard RAW behavior, but I’m still not getting truly unprocessed results that reflect the actual sensor data.
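One note on expectations: a .dng stores the raw sensor mosaic plus calibration metadata (color matrices, white-balance tags), and whatever app displays it applies demosaicing, color correction, and tone mapping at render time, so even a true Bayer RAW never looks "straight off the sensor" on screen. Below is a minimal sketch of explicitly requesting a Bayer RAW format rather than ProRAW, assuming a configured AVCaptureSession with photoOutput already attached (delegate handling omitted):

import AVFoundation

// Minimal sketch: prefer a Bayer (unprocessed mosaic) RAW format over
// Apple ProRAW, which is a processed, demosaiced DNG.
func makeBayerRAWSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    // Turn ProRAW off so Bayer formats are offered where supported.
    if photoOutput.isAppleProRAWSupported {
        photoOutput.isAppleProRAWEnabled = false
    }
    // Pick the first Bayer RAW pixel format the device offers.
    guard let bayerFormat = photoOutput.availableRawPhotoPixelFormatTypes
        .first(where: { AVCapturePhotoOutput.isBayerRAWPixelFormat($0) }) else {
        return nil
    }
    return AVCapturePhotoSettings(rawPixelFormatType: bayerFormat)
}

Even with this, the ISP's capture-time exposure and gain decisions still shape the sensor readout; what you avoid is the demosaic/tone pipeline, not exposure control.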
Has anyone experienced similar results when capturing RAW images on iOS, or found a way to bypass Apple’s image signal processing (ISP) pipeline for more realistic captures?
Any insights or references from Apple’s camera framework behavior would be greatly appreciated.
Thank you!
Hi everyone,
I submitted an app for review and was met with a rejection for unresolved issues.
This was what was asked in the rejection:
Provide detailed answers to the following questions:
-Does your app interact with any hardware?
Would that be referring to the camera/microphone of the device? My app uses haptics when you select an option. I didn't see anywhere in App Store Connect where I needed to specify the use of haptics.
Also, does this mean that when the reviewer answers me I have to resubmit as version 1.1? I'm not sure what I would need to change. This is my first app, so I'm not entirely sure of the procedure.
Hi everyone,
I am developing a .NET MAUI Mac Catalyst app (sandboxed) that communicates with a custom vendor-specific HID USB device.
Within the Catalyst app, I am using a native iOS library (built with Objective-C and IOKit) and calling into it via P/Invoke from C#.
The HID communication layer relies on IOHIDManager and IOUSBInterface APIs.
The device is correctly detected and opened using IOHIDManager APIs.
However, IOHIDDeviceRegisterInputReportCallback never triggers — I don’t receive any input reports.
To investigate, I also tried using low-level IOKit USB APIs via P/Invoke from my Catalyst app, calling into a native iOS library.
When attempting to open the USB interface using IOUSBInterfaceOpen() or IOUSBInterfaceOpenSeize(), both calls fail with kIOReturnNotPermitted (0xe00002e2), indicating an access-denied error, even though the device enumerates and opens successfully.
Interestingly, when I call IOHIDDeviceSetReport(), it returns status = 0, meaning I can successfully send feature reports to the device.
Only input reports (via the InputReportCallback) fail to arrive.
I’ve confirmed this is not a device issue — the same hardware and protocol work perfectly under Windows using the HIDSharp library, where both input and output reports function correctly.
What I’ve verified
•Disabling sandboxing doesn’t change the behavior.
•The device uses a vendor-specific usage page (not a standard HID like keyboard/mouse).
•Enumeration, open, and SetReport all succeed — only reading input reports fails.
•Tried polling via HID queues; the Input_Misc elements failed to be added to the queue.
•Tried calling get-report in a loop, with no success.
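For comparison, a minimal sketch of the input-report path in Swift (the vendor/product IDs are placeholders): the two requirements that most often break this silently are a report buffer that must outlive the registration and a run loop that must actually be serviced for the callback to fire.

import IOKit.hid

// Match the vendor-specific device (placeholder IDs).
let manager = IOHIDManagerCreate(kCFAllocatorDefault, IOOptionBits(kIOHIDOptionsTypeNone))
let matching: [String: Any] = [kIOHIDVendorIDKey: 0x1234, kIOHIDProductIDKey: 0x5678]
IOHIDManagerSetDeviceMatching(manager, matching as CFDictionary)
IOHIDManagerScheduleWithRunLoop(manager, CFRunLoopGetMain(), CFRunLoopMode.defaultMode.rawValue)
IOHIDManagerOpen(manager, IOOptionBits(kIOHIDOptionsTypeNone))

// The report buffer must stay alive as long as the callback is registered.
let reportSize = 64
let reportBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: reportSize)

if let deviceSet = IOHIDManagerCopyDevices(manager) as NSSet?,
   let first = deviceSet.anyObject() {
    let device = first as! IOHIDDevice
    IOHIDDeviceRegisterInputReportCallback(device, reportBuffer, reportSize,
        { _, _, _, _, reportID, _, length in
            print("input report \(reportID), \(length) bytes")
        }, nil)
}
CFRunLoopRun() // in a GUI app the main run loop already runs; this is for CLI tests

Worth checking in the Catalyst case: whether the run loop you scheduled on is the one actually being serviced along your P/Invoke call path, since that differs between a plain macOS app and a Catalyst/.NET host.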
I'm running macOS Tahoe and I have the proper nvram boot-args set; however, when I poke the log stream I'm not getting any verb information related to the card I'm using. The audio system I'm using is AppleHDA.kext from the Beta 1 KDK.
I've tried asking AI, but it doesn't make a difference what it suggests. In the meantime, while I'm asking for assistance here, I'll go ahead and have it template a kernel extension that just forwards the verb traffic to the log for me; hopefully that isn't filtered out, because what I suspect is happening is that the log stream actually masks some of this information.
Why am I doing this? Not for the Linux driver itself; it's so I can see from the log where the verb traffic came from, since this is what the developer of GitHub/davidjo/snd_hda_macbookpro said he did. The machine is the Kaby Lake iMac.
Device: iPhone [model], iOS 18.6.2
Xcode: 16.0.x
Team: Individual paid Apple Developer Program (not Personal Team), shows as my full name in Xcode
I’m trying to use CoreNFC via NFCTagReaderSession in a small SwiftUI app (part of a larger project).
So far I’ve done:
• Enrolled in the Apple Developer Program (individual).
• Confirmed that in Certificates, Identifiers & Profiles → Identifiers, my App ID for com.<…> has Near Field Communication Tag Reading enabled.
• Created an iOS App Development provisioning profile for that App ID, including:
• my Apple Development certificate
• my iPhone device
• Downloaded the profile, double-clicked it, and set it in Xcode under Signing & Capabilities with:
• Team = my full-name team
• “Automatically manage signing” off, using the custom profile.
• Added the NFC Scan capability in Signing & Capabilities.
• Added Privacy - NFC Scan Usage Description (NFCReaderUsageDescription) in Info.plist with a non-empty string.
The app builds and runs on device. When I start the session:
func beginScanning() {
    print("NFCTagReaderSession.readingAvailable =", NFCTagReaderSession.readingAvailable)
    session = NFCTagReaderSession(pollingOption: [.iso14443, .iso15693],
                                  delegate: self,
                                  queue: nil)
    session?.alertMessage = "Hold your iPhone near your Ori tag."
    session?.begin()
}

func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {
    print("NFC session invalidated:", error.localizedDescription)
}
readingAvailable is false, and I immediately see:
NFC session invalidated: Session invalidated unexpectedly
Earlier in this process I was seeing XPC sandbox messages like:
Error Domain=NSCocoaErrorDomain Code=4099
"The connection to service named com.apple.nfcd.service.corenfc was invalidated: failed at lookup with error 159 - Sandbox restriction."
Those went away after I created the explicit iOS App Development profile and pointed the target at it, but the session still invalidates right away and readingAvailable never becomes true.
Safari can read NDEF URL tags on this device, so the NFC hardware is working.
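One thing worth double-checking (an assumption on my part, since it isn't shown above): readingAvailable staying false is the classic symptom of the NFC entitlement missing from the target's .entitlements file, which is separate from both the App ID capability and the provisioning profile. Adding the capability in Xcode should leave an entry like this in the .entitlements file; for NFCTagReaderSession the TAG format is required:

<key>com.apple.developer.nfc.readersession.formats</key>
<array>
    <string>TAG</string>
</array>

If that key is present and the profile includes the capability, the remaining suspects are device support (tag reading needs an iPhone 7 or later) and a mismatch between the profile's App ID and the target's bundle identifier.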
Question:
Is there anything else required on the App ID / provisioning / team side to enable CoreNFC with NFCTagReaderSession for an individual (non-enterprise) developer account? Or any known issues where readingAvailable stays false even with NFC Tag Reading enabled and a custom iOS App Development profile?
Any hints on what I might still be missing would be greatly appreciated.
We have an app that connects to an external device we developed in-house that measures electroencephalography (EEG), as well as PPG and IMU data. This is not a medical device, and we have stated that many times, but App Review keeps rejecting the app for the same reason, 1.4.1 - Safety, Physical Harm, because they say it connects to a medical device.
We have submitted FCC certification documentation for safety, but we do not have FDA certification because the device is not used for medical purposes; it is purely for wellness. Despite several messages explaining that it is not a medical device, the response is always the same, without actually addressing any of the supporting documents we have sent.
Any help finding a way to explain to the Apple team that not all EEG devices are medical, and that in fact most are NOT FDA approved, would be appreciated; it seems like whoever is reviewing the app doesn't understand that.
We have developed an accessory that supports Find My. When using the Find My app to set it up, it occasionally gets stuck at the final "Setting Up" screen, and the app just stays there. We would like to know what could cause this situation and how to resolve it.
Thanks a lot.
iOS 26.2 Beta 23C5044b
The phone's battery reports 0% health. Leaving it on multiple chargers (high and low wattage) doesn't change anything, though the icon changes to charging.
I replaced the battery with a high-quality aftermarket one at ~70% charge; same issue.
I'm unable to remove the beta or update to RC2 due to the 20% minimum charge required... not sure what to do.
Where can I get the hardware VIDs and PIDs (USB vendor and product IDs) for all of Apple's devices?
Hi,
We are facing an issue where commissioning our Matter device to Google Home through an iOS device fails 100% of the time.
Here is our test summary regarding the issue:
TestCase1 [OK]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 by Android device (see log DoorWindow_2.0.1_Google_Success.txt)
TestCase2 [NG]: Commissioning our Matter 1.4.0 device to Google Nest Hub 2 by iPhone13 or iPhone16 (see log DoorWindow_2.0.1_Google_by_iOS_NG.txt)
TestCase3 [OK]: Commissioning our Matter 1.3.0 device to Google Nest Hub 2 by iPhone13
In TestCase2, we noticed that the device was first commissioned to iOS (Apple Keychain), then iOS opened a commissioning window again to commission it into Google's ecosystem, and the device failed at that second step. So we also tried:
Commissioning the device to Apple Home works as expected, but then sharing the device to the Google Home app on iOS also fails.
Commissioning the device to Apple Home works as expected, and then sharing the device to the Google Home app on Android works as expected; the device pops up in Google Home on iOS as well.
Could you help us check what the issue is in TestCase2?
Our testing environment:
NestHub 2 version
Google Home app version
We are currently planning to develop a third‑party hardware accessory that supports Wi‑Fi Aware using AccessorySetupKit on iOS, based on the official documentation:
https://developer.apple.com/documentation/accessorysetupkit/
Before finalizing our hardware and firmware design, we would like to better understand the real‑world behavior and user experience of Wi‑Fi Aware in actual third‑party accessories.
Specifically, we would like to ask:
Existing Third‑Party Hardware
Are there any commercially available third‑party accessories (not Apple products) that already support Wi‑Fi Aware via AccessorySetupKit?
If so, are there any public examples, reference designs, or recommended products we can purchase to observe the real onboarding, discovery, and pairing experience?
Reference or Evaluation Hardware
Does Apple provide any reference hardware, evaluation kits, or recommended vendor solutions (for example, based on common Wi‑Fi chipsets) that are known to work well with Wi‑Fi Aware on iOS?
Are there specific Wi‑Fi chipset vendors that have validated interoperability with AccessorySetupKit?
Practical Behavior and Limitations
In real usage, what are the typical discovery latency, reliability, and background/foreground behavior developers should expect?
Are there known limitations or best practices when designing hardware that relies on Wi‑Fi Aware for initial accessory discovery and setup?
Our goal is to evaluate the feasibility and user experience of Wi‑Fi Aware for third‑party accessories by testing against existing implementations or recommended hardware, before investing heavily in custom hardware development.
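For concreteness, the picker flow we plan to evaluate looks roughly like the sketch below. It uses only the generic iOS 18 ASDiscoveryDescriptor fields; the accessory name, image, and SSID prefix are placeholders, and the Wi-Fi Aware-specific descriptor properties would need to be verified against the current AccessorySetupKit headers.

import AccessorySetupKit
import UIKit

final class AccessoryPairing {
    private let session = ASAccessorySession()

    func start() {
        // Placeholder descriptor; a Wi-Fi Aware accessory would use the
        // corresponding Wi-Fi Aware fields from the current SDK instead.
        let descriptor = ASDiscoveryDescriptor()
        descriptor.ssidPrefix = "EXAMPLE-"

        let item = ASPickerDisplayItem(
            name: "Example Accessory",  // placeholder
            productImage: UIImage(systemName: "antenna.radiowaves.left.and.right")!,
            descriptor: descriptor
        )

        session.activate(on: .main) { event in
            print("ASK event:", event.eventType)
        }
        session.showPicker(for: [item]) { error in
            if let error { print("Picker failed:", error) }
        }
    }
}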
Any guidance, examples, or pointers to existing accessories or partners would be greatly appreciated.
Hi,
I’m developing a Matter commissioning flow and would like to clarify Apple Home’s support for concatenated (multi-device) QR codes.
In my implementation, I generate a single QR code that contains multiple Matter onboarding payloads (concatenated payloads), intended to commission multiple devices in one scan, similar to a multi-pack / multi-accessory flow.
What I’ve tested:
Standard single-device Matter QR codes work as expected in the Apple Home app
A concatenated QR code (multiple Matter payloads combined into one QR) does not get recognized / commissioned by Apple Home
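For reference (this is my reading of the Matter specification, not a statement about Apple Home): the spec's concatenated form joins base38 payloads with a "*" delimiter under a single "MT:" prefix, and whether Apple Home parses that form is exactly the open question below. A hypothetical Swift sketch that splits such a string back into single-device payloads as a fallback (the payload values are the spec's test payloads, used as placeholders):

// Split a concatenated Matter QR string into single payloads so each
// device can be commissioned with a standard single-device code.
let concatenated = "MT:Y.K9042C00KA0648G00*-24J0AFN00KA0648G00"
let payloads = concatenated
    .dropFirst("MT:".count)
    .split(separator: "*")
    .map { "MT:" + String($0) }
// payloads == ["MT:Y.K9042C00KA0648G00", "MT:-24J0AFN00KA0648G00"]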
My questions:
Does Apple Home officially support commissioning via concatenated or multi-device Matter QR codes?
If yes, is there a specific payload format or delimiter that Apple Home expects?
If not, is this a known limitation or something planned for future iOS/Home releases?
I want to add a Matter device to my own fabric, not the same one HomeKit uses in the Home app.
I implemented a demo that adds a MatterSupport extension, and that succeeds. But when I use MTRDeviceController to commission, it goes wrong; below is the log:
Couldn't read values in CFPrefsPlistSource<0x1062ec100> (Domain: group.wxx.MatterTest, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd
<<5 [E:46634i S:0 M:188511265] (U) Msg Retransmission to 0:0000000000000000 failure (max retries:4)
PASESession timed out while waiting for a response from the peer. Expected message type was 33
controller(_:commissioningSessionEstablishmentDone:) error = nil
Error on commissioning step 'AttestationVerification': 'src/controller/CHIPDeviceController.cpp:1288: CHIP Error 0x000000AC: Internal error'
Failed verifying attestation information. Now checking DAC chain revoked status.
Failed in verifying 'Attestation Information' command received from the device: err 101. Look at AttestationVerificationResult enum to understand the errors
Error on commissioning step 'AttestationRevocationCheck': 'src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error'
Failed to send Solitary ack for MessageCounter:265529558 on exchange 46643i:src/messaging/ExchangeContext.cpp:99: CHIP Error 0x00000002: Connection aborted
Creating NSError from src/controller/CHIPDeviceController.cpp:1337: CHIP Error 0x000000AC: Internal error (context: (null))
controller(_:commissioningComplete:nodeID:metrics:) error = Optional(Error Domain=MTRErrorDomain Code=1 "General error: 172" UserInfo={NSLocalizedDescription=General error: 172, errorCode=172})
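The run is aborted by the AttestationVerification failure above, which is commonly seen with development hardware that lacks production device attestation certificates (DACs). A sketch, assuming a configured MTRDeviceController and test-only hardware: attach a device attestation delegate and explicitly continue past the failure (development only, never in production):

import Matter

// Development-only: accept a failed device attestation so commissioning
// can proceed on test hardware without production DAC certificates.
final class DevAttestationDelegate: NSObject, MTRDeviceAttestationDelegate {
    func deviceAttestationFailed(for controller: MTRDeviceController,
                                 opaqueDeviceHandle: UnsafeMutableRawPointer,
                                 error: Error) {
        try? controller.continueCommissioningDevice(opaqueDeviceHandle,
                                                    ignoreAttestationFailure: true)
    }
}

let attestationDelegate = DevAttestationDelegate() // keep a strong reference
let params = MTRCommissioningParameters()
params.deviceAttestationDelegate = attestationDelegate
// Pass `params` when kicking off commissioning on the controller.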
Are there any suggestions for dealing with this issue?
I followed the instructions on the page https://mfi.apple.com/en/help/login-help/How-to-Register-Your-Existing-Apple-ID.html to apply for the MFi Program. According to step 7 of the guide: "You have now created and registered your Apple Account. You will be automatically directed to the MFi Portal to begin the enrollment process," I should have been taken to the enrollment process after logging in.
However, instead of accessing the enrollment page, a pop-up message appears stating: "The Apple Account you signed in with does not have permission to view this page. If you believe your company is currently enrolled in the MFi Program, please contact your company’s Account Administrator to request access to the MFi Portal. If your company is not currently enrolled in the MFi Program, please click here to learn about the program and start the enrollment process."
This has created an endless loop: I cannot proceed to the enrollment process as instructed, and the pop-up only redirects me to information that leads back to the same login and permission issue. Could you please provide guidance on how to resolve this and successfully access the MFi Program enrollment process?
Hello Apple Forums,
We are developing an iOS application that connects to a custom BLE accessory and sends control commands to it.
Our system architecture is as follows:
A separate hardware device collects data and sends it to our backend server via Wi-Fi.
The backend evaluates state changes and determines when the BLE accessory should update its display.
The iOS app acts purely as a BLE command executor for this accessory.
Our goal is to:
Maintain a BLE connection with the accessory while the app is in the background.
Receive state-change events from our backend server.
Upon receiving such events, send a BLE command to the accessory to update its state.
We understand that iOS does not allow arbitrary background execution. We would like to confirm whether there is any supported mechanism, entitlement, or program that allows:
Long-running background execution for BLE control, or
Server-originated events (other than APNs) to trigger background BLE actions.
If this is not supported, we would appreciate confirmation that APNs (silent push) is the only supported way to trigger such background BLE actions, or guidance on any recommended alternative architectures.
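For the baseline that is supported today, a sketch (the restoration identifier string is a placeholder, and it assumes bluetooth-central is listed under UIBackgroundModes in Info.plist): a CBCentralManager created with a restoration identifier keeps delivering BLE events while backgrounded, and iOS can relaunch the app into willRestoreState if it was terminated when a relevant BLE event arrives. A server-originated trigger would still have to arrive via APNs on top of this.

import CoreBluetooth

final class AccessoryController: NSObject, CBCentralManagerDelegate {
    // The restoration identifier (placeholder) opts into BLE state restoration.
    private lazy var central = CBCentralManager(
        delegate: self,
        queue: nil,
        options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.accessory-central"]
    )

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Reconnect to the accessory once the manager is .poweredOn.
    }

    func centralManager(_ central: CBCentralManager,
                        willRestoreState dict: [String: Any]) {
        // iOS relaunched us for a BLE event; reattach to restored peripherals here.
    }
}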
Thank you for your guidance.
Summary
On Mac Studio systems (no built-in camera), macOS does not initialize camera services after a normal reboot if no physical camera is present. As a result, Continuity Camera does not appear anywhere in the system.
Observed behavior
System Information → Camera reports “No video capture devices were found.”
Continuity Camera (iPhone) is completely absent from camera lists.
Plugging in any USB UVC webcam immediately initializes camera services and causes both the USB camera and the iPhone (Continuity Camera) to appear.
The USB camera can then be unplugged and Continuity Camera continues working until the next reboot.
Reproduction steps
Use a Mac Studio (no built-in camera) on recent macOS.
Ensure no USB webcam or external camera is connected.
Reboot the Mac normally.
After login, open System Information → Camera.
Expected
Camera services should initialize even when no physical camera is present, allowing Continuity Camera to be available as the primary camera.
Actual
No camera devices are present unless a physical USB camera is connected at least once after boot.
This reproduces 100% of the time on Mac Studio and appears to be a camera service bootstrap issue where Continuity Camera cannot be the first camera device.
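A quick way to observe the same state from code (a sketch, assuming macOS 14 or later, where USB webcams surface under the .external device type and Continuity Camera has its own device type):

import AVFoundation

// List the video capture devices the system currently exposes; on an
// affected Mac Studio this should print an empty list after a clean boot.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .external, .continuityCamera],
    mediaType: .video,
    position: .unspecified
)
print(discovery.devices.map(\.localizedName))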
Issue has been filed via Feedback Assistant.
My team has developed an app with a brief Matter commissioner feature using the Matter framework and the MatterSupport extension.
Our app supports iOS and Android. However, we ran into a problem: the control certificate generated by the iOS app cannot control the device on the Android side, and the control certificate generated by the Android app cannot control the device on the iOS side.
The Matter library used on Android is built from connectedhomeip.
Has anyone run into the same problem? How can we solve it?
Thank you
My iOS application needs to connect to a device over Wi-Fi and exchange data with it.
Transmission uses the UDP protocol, and most of the time it works well. But some iOS devices always lose packets; even re-opening the application, rebooting the Wi-Fi device, or rebooting the iOS device does not solve it.
Only resetting the network settings on the iOS device fixes it.
But that is not guaranteed to hold: if the problem recurs, the user has to reset network settings again.
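One way to gather more signal when this happens (a sketch; the host and port are placeholders): Network.framework's NWConnection reports viability, path, and state transitions, which can show whether the system still considers the route usable at the moment packets start disappearing.

import Network

let connection = NWConnection(host: "192.168.1.50", port: 9000, using: .udp)
connection.viabilityUpdateHandler = { viable in
    print("UDP path viable:", viable) // false = stack thinks the route is unusable
}
connection.pathUpdateHandler = { path in
    print("Path status:", path.status)
}
connection.stateUpdateHandler = { state in
    print("Connection state:", state)
}
connection.start(queue: .main)

connection.send(content: Data("ping".utf8), completion: .contentProcessed { error in
    if let error { print("Send failed:", error) }
})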
Has anyone encountered this phenomenon?
Thanks for taking the time to look into this.
I want to track the activity of an iPhone user using the Core Motion framework. Please guide me through it.
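A minimal starting point, assuming an NSMotionUsageDescription entry in Info.plist and a device with a motion coprocessor:

import CoreMotion

let activityManager = CMMotionActivityManager()
if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity else { return }
        // Each update carries booleans for the detected activity class.
        print("walking:", activity.walking,
              "running:", activity.running,
              "automotive:", activity.automotive,
              "confidence:", activity.confidence.rawValue)
    }
}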