A macOS Monterey 12.6.7 virtual machine created with VirtualBox on Windows 10 does not detect an iPhone 7.
The iPhone 7 is connected to a USB 3.0 port on the host machine (Windows 10), and the phone has already trusted the computer.
In Windows 10 I can see "This PC\Apple iPhone", so the host machine does recognize the phone.
Now, with the macOS virtual machine running, the USB icon in the bottom-right corner of the VM window shows "Apple Inc. iPhone [0901]" and it is checked, yet the VM still does not see the device, so Xcode cannot see it either.
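For reference, the host-side USB capture can also be double-checked from a Windows command prompt with VBoxManage (a sketch only: the install path is the VirtualBox default, "macOS-Monterey" is a placeholder for the actual VM name, and the device UUID comes from the first command's output):
rem List the USB devices VirtualBox can see on the host
"C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" list usbhost
rem Attach the phone to the running VM (UUID taken from the output above)
"C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" controlvm "macOS-Monterey" usbattach <device-uuid>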
With the VM running, I unplugged and replugged the iPhone 7, and via
sudo log show --predicate 'eventMessage contains "usbmuxd"' --info
I saw the following error messages:
2025-02-13 10:31:06.541201+0800 0xa3c Error 0x0 0 0 kernel: (Sandbox) 1 duplicate report for System Policy: usbmuxd(22583) deny(1) file-write-mode /private/var/db/lockdown
2025-02-13 10:31:07.090321+0800 0xf807 Error 0x0 140 0 sandboxd: [com.apple.sandbox.reporting:violation] System Policy: usbmuxd(22583) deny(1) file-write-mode /private/var/db/lockdown
Violation: deny(1) file-write-mode /private/var/db/lockdown
Process: usbmuxd [22583]
Path: /usr/local/sbin/usbmuxd
Load Address: 0x10564b000
Identifier: usbmuxd
Version: ??? (???)
Code Type: x86_64 (Native)
Parent Process: sudo [22582]
Responsible: /System/Applications/Utilities/Terminal.app/Contents/MacOS/Terminal
User ID: 0
Date/Time: 2025-02-13 10:31:06.793 GMT+8
OS Version: macOS 12.6.7 (21G651)
Release Type: User
Report Version: 8
MetaData: {"vnode-type":"DIRECTORY","hardlinked":false,"pid":22583,"process":"usbmuxd","primary-filter-value":"/private/var/db/lockdown","platform-policy":true,"binary-in-trust-cache":false,"path":"/private/var/db/lockdown","primary-filter":"path","action":"deny","matched-extension":false,"process-path":"/usr/local/sbin/usbmuxd","file-flags":0,"responsible-process-path":"/System/Applications/Utilities/Terminal.app/Contents/MacOS/Terminal","flags":21,"platform-binary":false,"rdev":0,"summary":"deny(1) file-write-mode /private/var/db/lockdown","target":"/private/var/db/lockdown","mount-flags":76582912,"profile":"platform","matched-user-intent-extension":false,"apple-internal":false,"storage-class":"Lockdown","platform_binary":"no","operation":"file-write-mode","profile-flags":0,"normalized_target":["private","var","db","lockdown"],"file-mode":448,"errno":1,"build":"macOS 12.6.7 (21G651)","policy-description":"System Policy","responsible-process-signing-id":"com.apple.Terminal","hardware":"Mac","uid":0,"release-type":"User"}
Thread 0 (id: 63477):
0 libsystem_kernel.dylib 0x00007ff80d8368ae __chmod + 10
1 usbmuxd 0x000000010565584e main + 3582 (main.c:816)
2 dyld 0x0000000114e3f52e start + 462
Binary Images:
0x10564b000 - 0x10565afff usbmuxd (0) <0fc9b657-d311-38b5-bf02-e294b175a615> /usr/local/sbin/usbmuxd
0x114e3a000 - 0x114ea3567 dyld (960) <2517e9fe-884a-3855-8532-92bffba3f81c> /usr/lib/dyld
0x7ff80d832000 - 0x7ff80d869fff libsystem_kernel.dylib (8020.240.18.701.6) /usr/lib/system/libsystem_kernel.dylib
2025-02-13 10:35:39.751714+0800 0x27f Default 0x0 0 0 kernel: (Sandbox) Sandbox: usbmuxd(119) allow iokit-get-properties kCDCDoNotMatchThisDevice
2025-02-13 10:35:45.025063+0800 0x27f Default 0x0 0 0 kernel: (Sandbox) Sandbox: usbmuxd(119) allow iokit-get-properties kCDCDoNotMatchThisDevice
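For reference, the directory that the denied chmod targets can be inspected from Terminal inside the VM (shown only as a way to see its current owner and permissions, not as a fix; output will vary):
# Inspect the lockdown directory that usbmuxd was denied write access to
ls -ld /private/var/db/lockdown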
What happens when a deprecated method is used in a project submitted to App Store Connect?
Generally the Apple documentation gives an alternative like "Use ***: instead", but sometimes the documentation does not suggest anything.
What should be done when such a warning appears?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).
What I Have So Far:
Successfully capturing mesh using ARMeshAnchor.
Converting mesh into MDLAsset and exporting .obj.
I need help generating the .jpg texture and linking it to the .mtl file.
private func exportScannedObject() {
    guard
        let camera = arView.session.currentFrame?.camera
    else { return }

    func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }

        let asset = MDLAsset()

        for anchor in meshAnchors {
            let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)

            // Apply a gray material to the mesh
            let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
            material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color

            if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                for submesh in submeshes {
                    submesh.material = material
                }
            }

            asset.add(mdlMesh)
        }

        return asset
    }

    func export(asset: MDLAsset) throws -> URL {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let url = directory.appendingPathComponent("scanned.obj")

        if MDLAsset.canExportFileExtension("obj") {
            do {
                try asset.export(to: url)
                return url
            } catch let error {
                fatalError(error.localizedDescription)
            }
        } else {
            fatalError("Can't export OBJ")
        }
    }

    if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
       let asset = convertToAsset(meshAnchors: meshAnchors) {
        do {
            let url = try export(asset: asset)
            showScanPreview(url)
        } catch {
            print("export error")
        }
    }
}
extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
        return vertex
    }

    // helps from StackOverflow:
    // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
    func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
        func convertVertexLocalToWorld() {
            let verticesPointer = vertices.buffer.contents()

            for vertexIndex in 0..<vertices.count {
                let vertex = self.vertex(at: UInt32(vertexIndex))

                var vertexLocalTransform = matrix_identity_float4x4
                vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3

                let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                let componentStride = vertices.stride / 3
                verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
            }
        }
        convertVertexLocalToWorld()

        let allocator = MTKMeshBufferAllocator(device: device)
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float3,
                                                            offset: 0,
                                                            bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                           vertexCount: vertices.count,
                           descriptor: vertexDescriptor,
                           submeshes: [submesh])
        return mesh
    }
}
What I Need Help With:
How do I generate the JPG texture from the AR scene?
How do I save an MTL file linking the OBJ model to the texture?
How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?
I appreciate any guidance, including sample code or resources! If you have a complete working solution, I’d love to discuss further via private channels.
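For the MTL question specifically, here is a rough, untested sketch of one way to write a minimal .mtl next to the exported .obj and point it at a JPG. The texture file name, the material name, and the writeMaterialFile helper are all made up for illustration, and an external viewer will only actually show the texture if the mesh has UV coordinates, which ARMeshGeometry does not provide on its own:
import Foundation

// Writes a minimal .mtl file next to the exported .obj and points it at a JPG texture.
// "scanned.jpg" is assumed to be produced separately (e.g. from a captured camera frame).
func writeMaterialFile(objURL: URL, textureFileName: String = "scanned.jpg") throws -> URL {
    let mtlURL = objURL.deletingPathExtension().appendingPathExtension("mtl")
    let mtl = """
    newmtl ScannedMaterial
    Ka 1.0 1.0 1.0
    Kd 1.0 1.0 1.0
    map_Kd \(textureFileName)
    """
    try mtl.write(to: mtlURL, atomically: true, encoding: .utf8)

    // Prepend mtllib/usemtl statements so external viewers pick up the material.
    // A real implementation would place usemtl before the faces it applies to;
    // this sketch applies it globally.
    var obj = try String(contentsOf: objURL, encoding: .utf8)
    obj = "mtllib \(mtlURL.lastPathComponent)\nusemtl ScannedMaterial\n" + obj
    try obj.write(to: objURL, atomically: true, encoding: .utf8)
    return mtlURL
}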
I’ve encountered a bug in Swift Playgrounds within “Learn to Code 2”, specifically in the “Variables: Seeking Seven Gems” section. No matter what I do—or even if I do nothing at all—an error always occurs.
I’ve tested this on both an iPad (9th generation) and an iPad Pro 11-inch (4th generation), and the issue happens on both devices.
Has anyone else experienced this? Any ideas on how to fix or work around it?
Thanks!
I have 2 versions of my app. In this version I issued the "Clean Build Folder..." command; after doing so it will no longer compile/build. I've restored the directory to a copy made prior to the Clean Build Folder, with the same result. The other version of this code builds and executes.
Command SwiftCompile failed with a nonzero exit code
/Users/martinwoscek/develop/iOS_switft_DEVELOP/FullScreenCamera-master-no_drivemonitor/AV Foundation/Base.lproj/LaunchScreen.storyboard: Encountered an error communicating with IBAgent-iOS.
Showing Recent Issues
.
.
.
Command SwiftCompile failed with a nonzero exit code
Command SwiftCompile failed with a nonzero exit code
Build failed 2/11/25, 8:15 AM 1.4 seconds
Hello,
I want to know how to open the host app directly instead of the App Clip during App Clip development and testing.
Hello,
I am doing the SwiftUI tutorial with Xcode 16.2.
For the watch app, the simulator works fine, but the preview keeps saying "This app cannot run on the selected target device," even though the preview itself is working ……
For the macOS part, the simulator works fine, but the preview is totally broken with -> "ContentView.swift" not found in any targets,
while it is correctly added to the target …
Please fix this or make the tutorial clearer.
Ever since I updated to Xcode 16.2, no device running iOS 18 shows up in Xcode. I can see the devices in Finder and Apple Configurator, but not in Xcode. When I run xcrun devicectl list devices, I also do not see the devices, so I assume the issue is in CoreDevice somewhere.
I've tried many things I've seen suggested in similar threads (toggling developer mode, reinstalling Xcode, using a different physical cable, etc) and so far nothing has helped. This same behavior is also happening for my entire development team, which has been as difficult as you might imagine.
Any info on how to resolve this would be much appreciated.
HStack {
    FormLable(text: NSLocalizedString("Color", comment: ""))
    Spacer()
    Image(systemName: "circle.fill")
        .font(.system(size: AppFontSize.baseSize.size() * 1.2, weight: .medium))
        .foregroundColor(color)
        .overlay(ColorPicker("Color", selection: $color).labelsHidden().opacity(0.015))
}
This is how I use the color picker. I used the same code in two different apps, and the color picker appeared in one app but not in the other. By the way, I upgraded Xcode to the latest version.
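For what it's worth, here is a minimal, self-contained version of the same overlay trick that can be dropped into either project to check whether project configuration is what breaks it (FormLable and AppFontSize are replaced with plain equivalents since they are app-specific):
import SwiftUI

struct ColorRow: View {
    @State private var color: Color = .blue

    var body: some View {
        HStack {
            Text("Color")
            Spacer()
            // The visible swatch; the ColorPicker is overlaid almost transparently on top
            Image(systemName: "circle.fill")
                .font(.system(size: 20, weight: .medium))
                .foregroundColor(color)
                .overlay(
                    ColorPicker("Color", selection: $color)
                        .labelsHidden()
                        .opacity(0.015)
                )
        }
        .padding()
    }
}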
I have an iPhone application in Objective-C and I wanted to add an iPad version. I added the required values in the .plist via the editor (and added iPad to the hardware selection) and in the Info of the app: the iPad screen was not loaded, with no error message on the console. Absolutely no error message. Simulator and real iPad produce the same result.
I did one more test. File > New > File from template, and created a new project (Objective-C) with the simple Dashboard and ViewController that are installed by default and populated by Xcode. It works without errors on iPhone. If I add the iPad values in the plist and Info: no iPad screen loaded, no error in the console.
Another test, this time with Swift as the language (I don't use this language; it was just for a test). I created a new project, a simple Swift application with a storyboard and a ViewController. I added the .plist and Info values (screenshot) and an iPad storyboard, and the exact same problem occurs.
The iPad storyboard does not load, and there is no error message in the console. The iPad storyboard itself is correct: if I force it as the main screen, in "Storyboard Name", it loads without an error on the iPad and the iPhone. But I don't want that; I want TWO storyboards, with iPad users sent to their own interface, with their own launch image.
IS THIS A BUG IN XCODE 16.2? DID I MISS SOMETHING? I did not find any other "iPad" strings to add to the plist/Info. I use Xcode Version 16.2 (16C5032a).
I have been trying to enroll in the Apple Developer Program, but every time after filling out all the forms and agreeing to the agreements, I'm repeatedly prompted with "We are unable to process your request".
I'd like to know what the issue is and how to fix it so I can properly enroll in the developer program.
I'm attempting to test an in-app purchase for my app on my phone (not in a simulator, not sandbox testing). I'm getting an error that the certificate check has failed.
Could this have anything to do with the SHA-1 warnings that Apple has recently mentioned?
I've tried regenerating my StoreKit file, cleaning the build, restarting Xcode, and resetting all of my device's purchases from the Debug > StoreKit menu, all with no luck.
Any help would be greatly appreciated.
2025-01-10 19:52:19.974564-0500 MyApp[74478:30675548] [Default] Failed to verify certificate chain due to client recoverable failure:
Error Domain=NSOSStatusErrorDomain Code=-67818 "“StoreKit Testing in Xcode” certificate is expired" UserInfo={NSLocalizedDescription=“StoreKit Testing in Xcode” certificate is expired, NSUnderlyingError=0x3027b9d40 {Error Domain=NSOSStatusErrorDomain Code=-67818 "Certificate 0 “StoreKit Testing in Xcode” has errors: Certificate is not temporally valid;" UserInfo={NSLocalizedDescription=Certificate 0 “StoreKit Testing in Xcode” has errors: Certificate is not temporally valid;}}}
2025-01-10 19:52:19.978233-0500 MyApp[74478:30675483] [Default] Failed to verify signature for Transaction, will assume invalid: failedToVerifyCertificateChain
Purchase succeeded but verification failed: Certificate Chain Invalid
Failed to purchase Premium: invalidCertificateChain
saveUnencrypted: Started saving form_info.json
saveUnencrypted: Saved form_info.json to Documents Directory in 9 ms (JSONEncoder chunk-based copy-on-write, 1 chunks) at ...
I was just comparing the build settings of two of my apps to try to understand why they behave differently (one of them uses the full screen on iPad, and the other one has small top and bottom black borders, although that's not the issue I want to discuss now). I saw that the option CLANG_CXX_LANGUAGE_STANDARD is set to gnu++0x for the older project, while it's set to gnu++17 for the newer one. The documentation lists different possible values and also a default one:
Compiler Default: Tells the compiler to use its default C++ language dialect. This is normally the best choice unless you have specific needs. (Currently equivalent to GNU++98.)
If it really is the best choice (normally), why is it not used when creating a new default Xcode project? Or is it better to select a newer compiler version (GNU++98 sounds quite old compared to GNU++17)? Also, does this affect Swift code?
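For reference, rather than relying on the compiler default, the dialect can also be pinned explicitly; in an xcconfig file (or via the build setting of the same name in the Xcode UI) that would look something like this, using C++17 as an example:
// Shared.xcconfig (hypothetical file name)
CLANG_CXX_LANGUAGE_STANDARD = gnu++17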
Hi,
Overview
I am using Xcode Cloud for my multi-platform app.
The macOS test cases fail; however, the iOS test cases run and succeed.
I don't have any UI test cases written; the test cases are simple and have nothing platform-specific (macOS) in them.
Questions
What can I do to fix this?
Is there any user privileges needed to launch the macOS app for testing? I ask because when I ran the UI tests locally it launched the app and asked for my macOS user password. Just wondering if that is the reason it didn't launch in Xcode Cloud.
Error:
<Appname> encountered an error (Failed to install or launch the test runner. If you believe this error represents a bug, please attach the result bundle at /Volumes/workspace/resultbundle.xcresult. (Underlying Error: Could not launch "AppnameTests". The LaunchServices launcher has returned an error. Please check the system logs for the underlying cause of the error. (Underlying Error: The operation couldn't be completed. Launch failed. (Underlying Error: Launch job spawn failed) )))
× Could not launch "<Appname>"
× Could not launch "AppnameTests"
× AppnameUITests.testExample()
Failed to get launch progress for <XCUIApplicationImpl: 0x600000564630 <BundleID> at /Volumes/workspace/TestProducts/Debug-Dev/<Appname>.app>: Could not launch "<Appname>". The LaunchServices launcher has returned an error. Please check the system logs for the underlying cause of the error. (Underlying Error: The operation couldn't be completed. Launch failed. (Underlying Error: Launch job spawn failed))
AppnameUITests.swift:28
× AppnameUITests.testLaunchPerformance()
Failed to get launch progress for <XCUIApplicationImpl: 0x60000054630 <BundleID> at /Volumes/workspace/TestProducts/Debug-Dev/<Appname>.app>: Could not launch "<Appname>". The LaunchServices launcher has returned an error. Please check the system logs for the underlying cause of the error. (Underlying Error: The operation couldn't be completed. Launch failed. (Underlying Error: Launch job spawn failed))
AppnameUITests.swift:37
× AppnameUITestsLaunchTests.testLaunch()
Failed to get launch progress for <XCUIApplicationImpl: 0x60000054630 <BundleID> at /Volumes/workspace/TestProducts/Debug-Dev/<Appname>.app>: Could not launch "<Appname>". The LaunchServices launcher has returned an error. Please check the system logs for the underlying cause of the error. (Underlying Error: The operation couldn't be completed. Launch failed.
Hello,
I enrolled in the Apple Developer Program on the 14th of February, and it says enrollment should take up to 48 hours. I have received my order number, which is W1510513031, but my account still shows as Pending when I log in. How long should it take? When I log in, my account still shows the purchase button for something I have already paid for.
I implemented the AppTrackingTransparency framework, but Apple is rejecting the app in review because the request appears to be failing on iPadOS 18.3.1, which seems odd since it works on all other devices. Has anyone faced the same issue?
I have tested this on a physical iPhone with iOS 18.3.1 and on several devices using Xcode's simulator runtimes, including iPad and iPhone on different iOS versions up to 18.2 (which I understand is the latest available in Xcode).
The problem is that an 18.3.1 simulator runtime is not available yet. Does anyone have more information on when it will be available, or what to do in these cases?
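For reference, the simulator runtimes currently installed (and therefore selectable in Xcode) can be listed from Terminal in recent Xcode versions; a runtime missing from this list would have to be downloaded once Apple publishes it:
# List the simulator runtimes Xcode currently has installed
xcrun simctl runtime list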
Question:
I have created a workspace containing a SwiftUI app and two frameworks:
B Framework: Handles UI components
C Framework: Handles service classes
I added a Podfile to manage dependencies. My Podfile is structured as follows:
inhibit_all_warnings!

workspace 'ABC.xcworkspace'

def shared_pods
  # Shared pods here
  pod 'Alamofire'
end

target 'A' do
  use_frameworks!
  project 'A/A.xcodeproj'
  shared_pods
end

target 'B' do
  use_frameworks!
  project 'B/B.xcodeproj'
  shared_pods
end

target 'C' do
  use_frameworks!
  project 'C/C.xcodeproj'
  shared_pods
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['IPHONEOS_DEPLOYMENT_TARGET'] = '13.0'
    end
  end
end
After installing pods and building the project, everything works fine except for Xcode’s SwiftUI Preview, which crashes with the following error:
xcpreviewagent crashed because Alamofire.framework is missing
Question:
How can I resolve this issue and make SwiftUI Previews work without Xcode crashing?
How do I fix the SwiftUI preview not showing on a real device?
Example:
I have a state var curHighScore declared in ContentView.
I select it, right click and select "Find Selected Symbol in Workspace"
The result is: 1 result in 1 file, i.e. the line where the var is declared.
But obviously, the same var is used at line #102:
How come this occurrence is not found by the search command?
This happens every now and then. Is there something I am missing? What is causing this? Thanks for your help.