Build, test, and submit your app using Xcode, Apple's integrated development environment.

Posts under Xcode subtopic

Command ClangStatCache failed with a nonzero exit code
I'm encountering an issue while trying to build my iOS app in Xcode for the simulator.

Environment:
- Xcode version: 15.x
- macOS version: 14.x (Sonoma)
- Simulator: iPhone 16 Pro

Error message: Command ClangStatCache failed with a nonzero exit code

Steps taken:
- Cleaned the build folder (Shift + Cmd + K)
- Deleted DerivedData (rm -rf ~/Library/Developer/Xcode/DerivedData)
- Updated Xcode and checked for iOS SDK updates
- Disabled the Clang Static Analyzer cache

None of these steps resolved the issue. Any help would be appreciated! Thanks in advance.
0 replies · 0 boosts · 238 views · Mar ’25
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).

What I have so far:
- Successfully capturing the mesh using ARMeshAnchor.
- Converting the mesh into an MDLAsset and exporting .obj.
- I need help generating the .jpg texture and linking it to the .mtl file.

    private func exportScannedObject() {
        guard let camera = arView.session.currentFrame?.camera else { return }

        func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
            guard let device = MTLCreateSystemDefaultDevice() else { return nil }

            let asset = MDLAsset()
            for anchor in meshAnchors {
                let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)

                // Apply a gray material to the mesh
                let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
                material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color

                if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                    for submesh in submeshes {
                        submesh.material = material
                    }
                }

                asset.add(mdlMesh)
            }
            return asset
        }

        func export(asset: MDLAsset) throws -> URL {
            let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
            let url = directory.appendingPathComponent("scanned.obj")

            if MDLAsset.canExportFileExtension("obj") {
                do {
                    try asset.export(to: url)
                    return url
                } catch let error {
                    fatalError(error.localizedDescription)
                }
            } else {
                fatalError("Can't export OBJ")
            }
        }

        if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
           let asset = convertToAsset(meshAnchors: meshAnchors) {
            do {
                let url = try export(asset: asset)
                showScanPreview(url)
            } catch {
                print("export error")
            }
        }
    }

    extension ARMeshGeometry {
        func vertex(at index: UInt32) -> SIMD3<Float> {
            assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
            let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
            let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
            return vertex
        }

        // helper from Stack Overflow:
        // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
        func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
            func convertVertexLocalToWorld() {
                let verticesPointer = vertices.buffer.contents()

                for vertexIndex in 0..<vertices.count {
                    let vertex = self.vertex(at: UInt32(vertexIndex))

                    var vertexLocalTransform = matrix_identity_float4x4
                    vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                    let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3

                    let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                    let componentStride = vertices.stride / 3

                    verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                    verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                    verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
                }
            }
            convertVertexLocalToWorld()

            let allocator = MTKMeshBufferAllocator(device: device)
            let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
            let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

            let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
            let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
            let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                     indexCount: faces.count * faces.indexCountPerPrimitive,
                                     indexType: .uInt32,
                                     geometryType: .triangles,
                                     material: nil)

            let vertexDescriptor = MDLVertexDescriptor()
            vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
            vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

            let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                               vertexCount: vertices.count,
                               descriptor: vertexDescriptor,
                               submeshes: [submesh])
            return mesh
        }
    }

What I need help with:
- How do I generate the JPG texture from the AR scene?
- How do I save an MTL file linking the OBJ model to the texture?
- How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?

I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
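
One possible approach to the missing pieces, sketched under assumptions: the single captured camera frame serves as the texture image, and the helper name exportTexture(from:objURL:) plus the scannedMaterial material name are illustrative, not from the post. The idea is to encode the current ARFrame's capturedImage as a JPEG with Core Image, write a minimal MTL file whose map_Kd entry names that JPEG, and prepend mtllib/usemtl statements to the exported OBJ. One caveat: an external viewer will only apply the texture if the mesh also carries UV coordinates mapping vertices into the image, which ARMeshGeometry does not supply; projecting each world-space vertex through the camera (ARCamera's projectPoint) to produce UVs is a separate step not shown here.

    import ARKit
    import CoreImage

    // Hypothetical helper: writes scanned.jpg and scanned.mtl next to the OBJ,
    // then prepends an mtllib/usemtl reference to the exported OBJ file.
    func exportTexture(from frame: ARFrame, objURL: URL) throws {
        let directory = objURL.deletingLastPathComponent()
        let jpgURL = directory.appendingPathComponent("scanned.jpg")
        let mtlURL = directory.appendingPathComponent("scanned.mtl")

        // 1. Encode the captured camera frame as a JPEG.
        let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
              let jpgData = CIContext().jpegRepresentation(of: ciImage, colorSpace: colorSpace) else {
            throw NSError(domain: "TextureExport", code: 1)
        }
        try jpgData.write(to: jpgURL)

        // 2. Write a minimal MTL file; map_Kd names the diffuse texture image.
        let mtl = """
        newmtl scannedMaterial
        Ka 1.0 1.0 1.0
        Kd 1.0 1.0 1.0
        map_Kd scanned.jpg
        """
        try mtl.write(to: mtlURL, atomically: true, encoding: .utf8)

        // 3. Point the OBJ at the material library and material.
        let obj = try String(contentsOf: objURL, encoding: .utf8)
        try ("mtllib scanned.mtl\nusemtl scannedMaterial\n" + obj)
            .write(to: objURL, atomically: true, encoding: .utf8)
    }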
0 replies · 0 boosts · 482 views · Feb ’25
Icon dark mode not working on iPadOS
I updated the icons in Xcode to customize the appearance of my app icons to be light, dark, or tinted. The app is allowed to be downloaded and run on iPadOS. The app icon behaves as expected on iOS; however, the app icon does not change on the iPad. Am I missing a step for the iPad? To be clear, the app does not target iPadOS; rather, it is just running the iOS version. Thanks, kof
0 replies · 0 boosts · 47 views · Apr ’25
How to retrieve required Apple Pay parameters for PayFort payment request in Swift?
I'm integrating Apple Pay with PayFort in a Swift iOS application, and I'm currently working on preparing a valid purchase request using Apple Pay, as described in PayFort's documentation:
🔗 https://docsbeta.payfort.com/docs/api/build/index.html?shell#apple-pay-authorization-purchase-request

The documentation outlines the following required parameters:
- apple_data
- apple_signature
- apple_header
- apple_transactionId
- apple_ephemeralPublicKey
- apple_publicKeyHash
- apple_paymentMethod
- apple_displayName
- apple_network
- apple_type
- Optional: apple_applicationData

I understand these should be derived from the PKPayment object after Apple Pay authorization, but I'm having trouble mapping everything correctly. Here's what I'm seeing in code:

    payment.token
    // Returns something like:
    // <PKPaymentToken: 0x28080ae80; transactionIdentifier: "..."; paymentData: 3780 bytes>

    payment.token.paymentData
    // Contains 3780 bytes of encrypted data

    payment.token.paymentData.base64EncodedString()
    // Returns a long base64 string, which at first glance seems like it could be used for apple_data,
    // but PayFort doesn't accept it as-is, so this value appears to be incomplete or incorrectly formatted

I can successfully retrieve apple_displayName, apple_network, and apple_type from payment.token.paymentMethod. However, I'm still unsure how to extract or build the following in the format accepted by PayFort: apple_data, apple_signature, apple_header, apple_transactionId, apple_ephemeralPublicKey, apple_publicKeyHash, and apple_paymentMethod. These may be contained within the paymentData JSON, but I'm not sure how to decode it or whether Apple allows decrypting it in a way that matches PayFort's expected format.

How can I correctly extract or build apple_data, apple_signature, and apple_header from the Apple Pay token? Also, how should I handle the decryption or decoding (if necessary) of paymentData to retrieve values like apple_transactionId, apple_ephemeralPublicKey, and apple_publicKeyHash?

If anyone has successfully set this up or has example code that bridges Apple Pay and PayFort's expected request format, it would be super helpful! Thanks in advance 🙏
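
For anyone digging into this: payment.token.paymentData is not opaque. It is UTF-8 JSON with top-level keys data, signature, header, and version, where header carries ephemeralPublicKey, publicKeyHash, and transactionId (this layout is documented in Apple's Payment Token Format Reference, and no on-device decryption is needed; the payment processor decrypts server-side). A rough sketch of extracting those fields, with the assumption, worth confirming against PayFort's docs, that the apple_* parameters map onto them one-to-one:

    import PassKit

    // Decoded shape of PKPaymentToken.paymentData for EC_v1 tokens,
    // per Apple's Payment Token Format Reference.
    struct ApplePayTokenData: Decodable {
        struct Header: Decodable {
            let ephemeralPublicKey: String
            let publicKeyHash: String
            let transactionId: String
        }
        let data: String       // base64 encrypted payload; presumably apple_data
        let signature: String  // detached PKCS #7 signature; presumably apple_signature
        let header: Header
        let version: String    // e.g. "EC_v1"
    }

    func payfortParameters(from payment: PKPayment) throws -> [String: String] {
        let token = try JSONDecoder().decode(ApplePayTokenData.self,
                                             from: payment.token.paymentData)
        let method = payment.token.paymentMethod
        return [
            "apple_data": token.data,
            "apple_signature": token.signature,
            "apple_transactionId": token.header.transactionId,
            "apple_ephemeralPublicKey": token.header.ephemeralPublicKey,
            "apple_publicKeyHash": token.header.publicKeyHash,
            "apple_displayName": method.displayName ?? "",
            "apple_network": method.network?.rawValue ?? "",
            "apple_type": String(describing: method.type), // translate to PayFort's expected values
        ]
    }

If PayFort expects apple_header as a nested object grouping apple_transactionId, apple_ephemeralPublicKey, and apple_publicKeyHash, nest those three fields instead of sending them flat.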
0 replies · 0 boosts · 84 views · Apr ’25
watchOS, Xcode 16.4: error sampling live RR intervals from the PPG sensor
Hi, I am getting an error stating "Argument passed to call that takes no arguments". I want this Apple Watch app to measure and store RR intervals from the PPG sensor on the Apple Watch for heart rate variability calculations. Please help me fix this, I can't figure it out. Here is my code:

    heartbeatQuery = HKHeartbeatSeriesQuery(predicate: predicate, dataReceivedHandler: { (query, timeSinceLastBeat, ended, error) in
        // Switch to main thread for UI updates
        DispatchQueue.main.async {
            if let error = error {
                print("Heartbeat query error: \(error.localizedDescription)")
                self.fetchErrorMessage = "Heartbeat query error: \(error.localizedDescription)"
                // Consider stopping the workout session if the query fails critically
                // self.stopWorkoutSession()
                return
            }
            if ended {
                print("Heartbeat query indicates series ended.")
            }
            // Append valid RR intervals
            if timeSinceLastBeat > 0 {
                self.rrIntervals.append(timeSinceLastBeat)
                self.beatCount += 1
            }
        } // End DispatchQueue.main.async
    }) // End query data handler
    // --- END OF PROBLEMATIC INITIALIZER ---
    // Execute the query if it was created successfully

Xcode's suggested fix is to delete the entire (predicate: predicate, dataReceivedHandler: { ... }) argument list, but after I remove it the next error is "Cannot assign value of type 'HKHeartbeatSeriesQuery.Type' to type 'HKHeartbeatSeriesQuery'". PLEASE HELP ME. Thanks
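
The compiler is right to reject that call: HKHeartbeatSeriesQuery has no (predicate:dataReceivedHandler:) initializer. Its only initializer is init(heartbeatSeries:dataHandler:), which takes an HKHeartbeatSeriesSample, so the flow is two queries: first fetch heartbeat series samples for HKSeriesType.heartbeat(), then enumerate each sample's beats. The follow-on error appears because, with the arguments removed, the code assigns the type itself rather than an instance. A rough sketch of the two-step flow, reusing the post's rrIntervals/beatCount names and deriving RR intervals by differencing consecutive timeSinceSeriesStart values (the class wrapper and fetchRRIntervals name are illustrative):

    import HealthKit

    final class RRIntervalFetcher {
        let healthStore = HKHealthStore()
        var rrIntervals: [TimeInterval] = []
        var beatCount = 0

        // Step 1: fetch recorded heartbeat series samples. Beat-to-beat data is
        // not streamed directly; HealthKit records it as HKHeartbeatSeriesSample
        // objects (e.g. during an active workout session) that you then enumerate.
        func fetchRRIntervals(matching predicate: NSPredicate?) {
            let sampleQuery = HKSampleQuery(sampleType: HKSeriesType.heartbeat(),
                                            predicate: predicate,
                                            limit: HKObjectQueryNoLimit,
                                            sortDescriptors: nil) { [weak self] _, samples, error in
                guard let self, let series = samples as? [HKHeartbeatSeriesSample], error == nil else { return }
                for sample in series {
                    self.enumerateBeats(in: sample)
                }
            }
            healthStore.execute(sampleQuery)
        }

        // Step 2: enumerate each series with the real initializer,
        // init(heartbeatSeries:dataHandler:).
        private func enumerateBeats(in sample: HKHeartbeatSeriesSample) {
            var previousBeat: TimeInterval?
            let query = HKHeartbeatSeriesQuery(heartbeatSeries: sample) {
                [weak self] _, timeSinceSeriesStart, precededByGap, done, error in
                guard let self, error == nil else { return }
                DispatchQueue.main.async {
                    // An RR interval is the gap between consecutive beats; it is
                    // only valid when no detection gap separates them.
                    if let previous = previousBeat, !precededByGap {
                        self.rrIntervals.append(timeSinceSeriesStart - previous)
                        self.beatCount += 1
                    }
                    previousBeat = timeSinceSeriesStart
                    if done { print("Finished enumerating heartbeat series.") }
                }
            }
            healthStore.execute(query)
        }
    }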
0 replies · 0 boosts · 109 views · Apr ’25
How to exclude header files from 3rd party pods from getting the Apple Clang warning policies?
I have the 'warnings as errors' policy set to Yes for both Apple Clang and the Swift compiler. As far as I know, there is no way to make exceptions to the Swift compiler's warning policies (correct me if I am wrong), but what about the Clang warning policies, specifically when it comes to header files from 3rd-party pods? They don't show up in Compile Sources, so I can't apply flags such as -Wno-error=deprecated to them. Adding #pragma GCC... to them won't help, as a fresh pod install will wipe the #pragma statements out. Is there a way to exclude the header files from pods from the Clang warning policies, so I won't see errors from them when compiling the Clang modules that are part of the build process?
0 replies · 0 boosts · 320 views · Feb ’25
Latest Xcode build giving multiple system-level errors
With the latest Xcode version, I am getting all kinds of errors. A few main ones:

    /Users/akashbhatia/MyApp/ios/Pods/Target Support Files/ReactCodegen/ReactCodegen-prefix.pch:2:9
    Could not build module 'UIKit'

    /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator18.4.sdk/System/Library/Frameworks/CFNetwork.framework/Headers/CFNetwork.h:18:10
    Could not build module 'CoreFoundation'

    /Users/akashbhatia/MyApp/ios/Pods/Headers/Public/RCT-Folly/folly/portability/Config.h:20:10
    'folly/folly-config.h' file not found

I have reinstalled Xcode, removed derived data, and reinstalled the pods over and over again, but I have realized from the first error that this has more to do with Xcode and React Native. Any help will be appreciated.
0 replies · 0 boosts · 83 views · Apr ’25
Xcode 16: Fastlane get_version_number fails after converting project from groups to folders
My project is using Fastlane 2.226.0. After converting groups to folders in Xcode 16, I got an error when executing the function below in my Fastfile:

    get_version_number(xcodeproj: "MyProject.xcodeproj", target: "MyProject")

Below is the error message output after I ran fastlane:

    Unable to find XCode build setting: MARKETING_VERSION
0 replies · 0 boosts · 109 views · Mar ’25
Privacy - Siri Usage Description being reset to default text "Describe why your app needs Siri access" on generating archive
I have an iOS app with CarPlay enabled. I have the Siri capability, and the feature has been tested in a car; the voice commands are working perfectly fine. However, I am facing a weird issue, as described below. The key NSSiriUsageDescription is set to custom text in Info.plist. After generating an archive, I exported and checked the package contents, in which the key NSSiriUsageDescription was reset to the default text ("Describe why your app needs Siri access") in Info.plist. I do not have any dynamic build process that writes to Info.plist. Only the Siri key is being reset; the rest of the keys, like the camera/location permissions, are intact. Kindly suggest what needs to be done at my end.
0 replies · 0 boosts · 131 views · May ’25
Cannot update system time on Apple Watch using simctl
I am attempting to generate screenshots at various logical times using a watchOS 11 simulator.

    % xcrun simctl status_bar "Apple Watch Series 10 (46mm)" override --time 9:06
    An error was encountered processing the command (domain=NSPOSIXErrorDomain, code=45):
    Simulator device failed to complete the requested operation.
    Operation not supported
    Underlying error (domain=NSPOSIXErrorDomain, code=45):
    Status bar overrides not supported on this platform. Operation not supported

Using the watchOS Clock settings gives me at most 59 minutes in which to succeed. Even if I had a physical watch for every series and size, I would have to wait up to 23 hours before being able to generate my marketing screenshots to support various user scenarios. With three settings variations, a minimal wait time to get to good marketable content would be on the order of 1 month per app version.
0 replies · 0 boosts · 64 views · May ’25
App suspended during active VoIP call on Xcode 16
We have started facing an issue after updating Xcode from version 15.2 to 16. We have a VoIP application with a web view and CallKit, and we have the Background Modes capabilities: Voice over IP, Audio, Background fetch, and Background processing. We had no problem on Xcode 15, but ever since updating to Xcode 16 and SDK 18, when the app goes into the background during an active call, the app is suspended and no events are triggered UNTIL the app is resumed to the foreground.
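
As background for comparing against a working setup: what keeps a VoIP app running while backgrounded during a call is the combination of a call reported to CallKit and the audio session that CallKit activates for it; if either is missing, the system is free to suspend the process. A minimal sketch of that baseline pattern, assuming the call arrives via PushKit (this is the standard shape, not the poster's code; the class and method names are illustrative):

    import CallKit
    import AVFAudio

    final class CallManager: NSObject, CXProviderDelegate {
        private let provider: CXProvider

        override init() {
            provider = CXProvider(configuration: CXProviderConfiguration())
            super.init()
            provider.setDelegate(self, queue: nil)
        }

        // Report an incoming call to CallKit. A reported, active call is what
        // lets the process keep running in the background on iOS 13+.
        func reportIncomingCall(uuid: UUID, handle: String) {
            // Configure the audio session up front; CallKit activates it and
            // hands it back in provider(_:didActivate:).
            try? AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .voiceChat)

            let update = CXCallUpdate()
            update.remoteHandle = CXHandle(type: .generic, value: handle)
            provider.reportNewIncomingCall(with: uuid, update: update) { error in
                if let error { print("reportNewIncomingCall failed: \(error)") }
            }
        }

        func providerDidReset(_ provider: CXProvider) {}

        func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
            // Start the audio engine here; keeping this session active for the
            // duration of the call sustains background execution.
        }

        func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
            // The call is over; expect suspension shortly after returning.
        }
    }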
0 replies · 0 boosts · 152 views · May ’25
XCFrameworks deploying .a / include to output target directory
Hello friends, we have a strange bug where Xcode is deploying static binaries from within an .xcframework into the CONFIGURATION_BUILD_DIR. An example of our xcframework's Info.plist structure is:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>AvailableLibraries</key>
        <array>
            <dict>
                <key>BinaryPath</key>
                <string>libglfw3.a</string>
                <key>HeadersPath</key>
                <string>Headers</string>
                <key>LibraryIdentifier</key>
                <string>macos-arm64_x86_64</string>
                <key>LibraryPath</key>
                <string>libglfw3.a</string>
                <key>SupportedArchitectures</key>
                <array>
                    <string>arm64</string>
                    <string>x86_64</string>
                </array>
                <key>SupportedPlatform</key>
                <string>macos</string>
            </dict>
        </array>
        <key>CFBundlePackageType</key>
        <string>XFWK</string>
        <key>XCFrameworkFormatVersion</key>
        <string>1.0</string>
    </dict>
    </plist>

This is for the open source creative coding toolkit https://github.com/openframeworks/openFrameworks. Can you see any issues in the above? One idea is to generate an xcarchive instead of deploying the .a in the xcframework; however, that does not explain the include directory being packaged there, so I think this might be an Xcode issue.
0 replies · 0 boosts · 175 views · Mar ’25
Network Instability in TestFlight Builds When Using Xcode 16+
We are experiencing networking issues in our iOS application when it is built with Xcode 16 or later. Our app includes a video conferencing feature that works reliably when built with Xcode 15 or earlier; we can sustain hour-long video sessions without interruption. However, when the app is built using Xcode 16 or higher, network connections drop after 2–3 minutes during a session. This triggers an auto-reconnect, which succeeds, but the connection drops again after another 2–3 minutes. This loop continues indefinitely.

Key details:
- The issue only occurs in TestFlight builds; when running the app via the Xcode debugger, it does not occur.
- The issue is consistently reproducible in TestFlight builds created with Xcode 16 or later.
- TestFlight builds created with Xcode 15 do not exhibit this issue.
- All the video conferencing runs on C and C++ code.

What we've tried:
- Reviewed the Xcode 16+ release notes but found no relevant changes or deprecations.
- Verified app configuration and entitlements.
- Confirmed that no app-side changes occurred between the working and broken builds.

Request: We're seeking guidance on what changes in Xcode 16+ could be affecting networking behavior in release/TestFlight builds. Any insight into relevant build settings, compiler changes, or runtime behavior differences would be greatly appreciated. Thank you in advance for your assistance.
0 replies · 0 boosts · 95 views · May ’25
Non-deterministic xctestplan files
Working with .xctestplan files reminds me very much of working with project files, nibs, and storyboards back in the day 😅 It seems that they are non-deterministic: every time we add a new test module or edit the inclusion rules at all, the whole file is recomputed. But it doesn't have a fixed order, so everything in the file changes. This makes it very difficult to see what has happened in PR reviews and causes conflicts any time anything is changed. To be honest, because it isn't alphabetical, even just finding a particular test suite is difficult without changes. Are there any plans to bring this file format up to date with the likes of xcodeproj and xib files? It would help massively while working with source control in a larger team. Thanks
0 replies · 0 boosts · 297 views · Mar ’25