Posts under Developer Tools & Services topic

Post · Replies · Boosts · Views · Created

A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?
When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this depends on your ChatGPT account settings, which allow you to opt out (the setting defaults to on). When using Xcode with accounts for other model providers, check your provider's policies. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same as a debug build, but maybe there's a trick.
Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build. The new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those build phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files; that is a huge way to improve your build performance. (A sketch of such a phase appears after this summary.)

Are we able to provide additional context for the models, like coding standards, documentation for third-party dependencies, or documentation on your own codebase that explains things like architecture?
In general, Xcode will automatically search for the right context based on the question and the evolving answer, as the model can interact multiple times with your project while it develops an answer. This automatically picks up the coding style of the code it sees, and can include files that contain architecture comments, etc. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context like rule files, though Xcode does not support these at this time.

Once ChatGPT is enabled for Coding Intelligence in Xcode 26 and I sign in to my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations on Xcode development done previously in the ChatGPT Mac app?
Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or ChatGPT app.

Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?
SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why the UI isn't doing what you want. This year, we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.
Yes, Xcode 26 supports logging in to any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?
Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, and compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can then use it to influence the translucency, blur, and specular highlights for your icon.

What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?
One feature we're particularly excited about is the new power profiler for iOS, which gives you further insight into the energy consumption of your app beyond what was possible with the energy instrument previously. You can learn more about how to use this instrument and how it can help you greatly reduce your app's battery usage in the documentation, as well as in the session "Profile and optimize power usage in your app". There were also improvements in accessibility this year with Voice Control, where you can naturally speak your Swift code to Xcode, and it understands the Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26".

We have a software advisory council that is very sensitive to having our private information going to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?
One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can use your proxy endpoint to inspect the network traffic for anything that you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Is there a list of recommended LLMs to use with Xcode via Intelligence/Local? I've tried Gemma3-12B, but.. I hope there are better options?
Apple doesn't have a published list of recommended local models. This is a fast-moving space, and a recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know! (continued below)
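The build-phase advice above is easy to act on. As a hedged illustration only (SwiftLint, the SKIP_LINT variable, and the stamp-file path are my assumptions, not something stated in the Group Lab), a lint phase that cooperates with the build system might look like this:

# A hedged sketch of a "Run SwiftLint" shell script build phase.
# Assumptions: swiftlint is on PATH for the build, the phase declares your Swift sources
# as its input files (e.g. via an .xcfilelist), and it declares the stamp file below as
# its output file so the build system can skip the phase when nothing changed.
set -e

# Project-specific escape hatch (not an Xcode-defined flag): set SKIP_LINT=1 in the
# scheme's environment to bypass linting locally.
if [ "${SKIP_LINT:-0}" = "1" ]; then
  echo "note: SKIP_LINT=1, skipping SwiftLint"
  exit 0
fi

if command -v swiftlint >/dev/null 2>&1; then
  swiftlint --quiet
else
  echo "warning: SwiftLint is not installed; skipping the lint phase"
fi

# Record that the phase ran; declare this exact path as the phase's output file.
mkdir -p "${DERIVED_FILE_DIR}"
touch "${DERIVED_FILE_DIR}/swiftlint.stamp"

Declaring the sources as inputs and the stamp file as the output is what lets the build system skip the phase on unchanged builds, which is the performance win the panel described.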
Replies: 1 · Boosts: 0 · Views: 807 · Created: Jul ’25
Questions about macOS App Store update package generation and optimization
Hello,
According to the documentation, the App Store does not re-download the entire app when updating, but instead generates an update package containing only the content that changed compared to the previous version. I'd like to clarify the following points:
1. Granularity of file changes: If only part of a large file changes, does the update package include the entire file, or does it patch only the modified portions within that file?
2. Guideline on separating files: The documentation recommends separating files that are likely to change from those that are not. How should this be interpreted in practice?
3. Verifying the diff result: Is there a way for developers to check the actual diff result of the update package generated by the App Store without submitting the app? Is there a diff command-line tool or a comparison method closer to the actual App Store update process?
4. Estimating update size during development: For apps with large-scale resources, minimizing update size is critical. Are there any tools or best practices to estimate the size of the update package before submitting to the App Store?
Any clarification or reference materials would be greatly appreciated. Thank you.
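As far as I know, no public tool reproduces the App Store's actual delta packaging, so any local estimate is only a rough upper bound. As a hedged sketch (the bundle paths are placeholders), a per-file comparison of two exported builds looks like this:

# Rough, unofficial comparison of two exported builds; the paths are placeholders.
OLD_APP="Export-1.0/MyApp.app"
NEW_APP="Export-1.1/MyApp.app"

# Files whose contents differ, plus files present in only one of the builds.
diff -rq "$OLD_APP" "$NEW_APP"

# Crude upper bound on update size: total size of the changed files in the new build
# (assumes no spaces in paths; text and binary diff lines both end in " differ").
diff -rq "$OLD_APP" "$NEW_APP" | grep ' differ$' |
  awk '{print $(NF-1)}' | tr '\n' '\0' | xargs -0 du -ch | tail -n 1

Since the actual packaging and compression are not documented in detail, treat this only as a hint about which resources churn between releases, not as the size users will download.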
Replies: 0 · Boosts: 0 · Views: 15 · Created: 2h
Gitlab CI: Unable to find a device matching the provided destination specifier
GitLab runner on an M4 Mac mini (2024), Xcode 26.2, supported platforms: iOS, native UIKit Swift project, CocoaPods. Trying to run tests via fastlane:

run_tests(workspace: 'Project.xcworkspace', destination: "platform=macOS,id=#{deviceId}", scheme: "Release", clean: true)

And getting this error:

$ xcodebuild -showBuildSettings -workspace Project.xcworkspace -scheme Release -destination platform\=macOS,id\=01234567-0123456789B9001C 2>&1
[12:23:53]: Could not read the "SUPPORTED_PLATFORMS" build setting, assuming that the project supports iOS only.
[12:23:54]: Found simulator "iPhone 16 Pro (18.5)"

The tests run just fine in Xcode on the "My Mac (Designed for iPad)" device on the very same machine. I compared the output of xcodebuild -showdestinations. In a terminal:

$ xcodebuild -showdestinations -workspace Project.xcworkspace -scheme Release
Available destinations for the "Release" scheme:
{ platform:macOS, arch:arm64, variant:Designed for [iPad,iPhone], id:01234567-0123456789B9001C, name:My Mac }
{ platform:iOS, id:dvtdevice-DVTiPhonePlaceholder-iphoneos:placeholder, name:Any iOS Device }
{ platform:iOS Simulator, id:dvtdevice-DVTiOSDeviceSimulatorPlaceholder-iphonesimulator:placeholder, name:Any iOS Simulator Device }
{ platform:iOS Simulator, arch:arm64, id:A7E98A79-B29B-4E9A-8103-ECBA55ABC0C6, OS:26.2, name:iPhone 17 Pro }

But when I run the same command from the GitLab runner, I get this:

Available destinations for the "Release" scheme:
{ platform:iOS, id:dvtdevice-DVTiPhonePlaceholder-iphoneos:placeholder, name:Any iOS Device }
{ platform:iOS Simulator, id:dvtdevice-DVTiOSDeviceSimulatorPlaceholder-iphonesimulator:placeholder, name:Any iOS Simulator Device }
{ platform:iOS Simulator, arch:arm64, id:A7E98A79-B29B-4E9A-8103-ECBA55ABC0C6, OS:26.2, name:iPhone 17 Pro }

The "My Mac" destination is missing. I suspect the GitLab runner is breaking some environment variable. I googled the most common issues and tried everything I found:

LANG: "en_US.UTF-8"
LC_ALL: "en_US.UTF-8"
unset CFProcessPath

None of it worked. I also compared the printenv output in a terminal and in the GitLab runner and didn't find anything suspicious. I've run out of ideas. How can I troubleshoot this?
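A hedged troubleshooting sketch rather than a known fix: since the destination list differs between the two contexts on the same machine, comparing those contexts directly is a reasonable next step. Everything below uses standard tools; the output file names are arbitrary.

# Run these both in an interactive terminal and inside a CI job on the same runner,
# then diff the resulting files.

# 1. Confirm both contexts resolve the same Xcode.
xcode-select -p
xcodebuild -version

# 2. Capture the destinations each context sees.
xcodebuild -showdestinations -workspace Project.xcworkspace -scheme Release > destinations.txt

# 3. Capture the full environment, sorted, so the two dumps diff cleanly.
env | sort > environment.txt

# 4. Check what kind of launchd session the context runs in (if your launchctl supports
#    this subcommand). A runner registered as a launch daemon, with no GUI login session,
#    is one commonly reported reason that device and "My Mac" destinations disappear.
launchctl managername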
Replies: 0 · Boosts: 0 · Views: 18 · Created: 2h
Xcode Cloud: Preparing build for App Store Connect failed
Error: Preparing build for App Store Connect failed. There are no error logs in the log view. Downloading the archive and trying to submit it to the App Store manually gives these detailed logs:

2025-12-17 07:26:00 +0000 [MT] Command line name "app-store" is deprecated. Use "app-store-connect" instead.
2025-12-17 07:26:00 +0000 App Store Connect request for store configuration failed for account Session Proxy Provider (Account "Session Proxy Provider": Unable to authenticate with App Store Connect (Error Domain=DVTITunesSoftwareServiceFoundation.DVTServicesSessionProviderCredentialITunesAuthenticationContextError Code=1 "(null)"))
2025-12-17 07:26:00 +0000 App Store Connect response failed with unknown failure for credential <DVTFoundation.DVTServicesSessionProviderCredential: 0x600002e099c0>; response (null); error (null).
2025-12-17 07:26:00 +0000 App Store Connect request for store configuration for credential <DVTFoundation.DVTServicesSessionProviderCredential: 0x600002e099c0>; configuration: (null); error DVTITunesSoftwareServiceFoundation.DVTServicesSessionProviderCredentialITunesAuthenticationContextError.proxy
2025-12-17 07:26:00 +0000 [MT] Failed to find an account with App Store Connect access for team <IDEProvisioningBasicTeam: 0x6000029c0240; teamID='984L9QX9X5', teamName='(null)'>
2025-12-17 07:26:00 +0000 [MT] Running step: IDEDistributionFetchAppRecordStep with <IDEDistributionContext: 0x7fa62b216680; archive(resolved)="<IDEArchive: 0x600000315ea0>", distributionTask(resolved)="2", distributionDestination(resolved)="1", distributionMethod(resolved)="<IDEDistributionMethodiOSAppStoreDistribution: 0x600002a1f560>", team(resolved)="<IDEProvisioningBasicTeam: 0x6000029c0240; teamID='984L9QX9X5', teamName='(null)'>">
Replies: 0 · Boosts: 0 · Views: 24 · Created: 3h
Developer app on the Mac doesn't play videos in full screen
Hi,
Problem: When playing videos in the Developer app on the Mac, there is no full-screen button for the video, so it is not possible to play the video in full screen.
Note: I am not referring to the app going into full screen; the issue is that there is no option to play the video itself in full screen.
Environment: macOS 26.2 (25C56). Developer app: Version 10.8.3 (1083.3.1).
Feedback: FB21343934
Recording
Question: Are others also facing this issue? Is there a workaround?
Suggestion: This seems to be a recurring problem that gets broken again with app/OS releases. Please write some UI tests to ensure the full-screen button is present. I would really appreciate it if this gets fixed.
Replies: 0 · Boosts: 0 · Views: 40 · Created: 6h
Where is Find My network Supplementary Agreement?
I am new to Find My network development, and I am going to use the Nordic solution for my FMN application. I have asked the MFi representative to enable "Find My network" in our MFi portal, but there is just a set of PDFs under "Find My network" in the "Technology" section of the MFi Portal. Is there a Find My network Supplementary Agreement in the MFi portal? Is it a PDF, and where can I find it? I need to sign this document and send it back to the Nordic representative, but it seems there is no such FMN Supplementary Agreement.
Replies: 0 · Boosts: 0 · Views: 39 · Created: 8h
Instruments Crash using swiftui instrument
Instruments is crashing when the SwiftUI instrument is stopped (the session is finished) and the transfer from the device begins:

Crashed Thread: 11 Dispatch queue: com.apple.swiftuitracingsupport.reading
Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 4 Illegal instruction: 4
Terminating Process: exc handler [1633]

I've tried removing derived data, reinstalling Xcode, updating Xcode (I originally thought this might be the issue -- I needed to update to 26.2 from the 26 RC -- but the update didn't fix the crash or change the crash report), and restarting both devices. I'm running Instruments/Xcode 26.2 on a MacBook Pro 15" (2018) running macOS 15.7.2 (24G325) with an iPhone 16 Pro Max running 26.2. Hoping someone else might have seen this or could help me troubleshoot. I find the SwiftUI instrument helpful and would like to keep using it :) I can post a complete crash report as well.
Replies: 1 · Boosts: 0 · Views: 18 · Created: 10h
Getting Valgrind to run on macOS 10.15 Catalina, reboot
I posted about this about 5 years ago, and now at last it's close to being finished. The main problem that I have now is matching up DWARF debuginfo with global variables. This works fairly well on macOS 10.14; on 10.15, much less so, and I think the reason is how the Mach-O data is mmap'd. When Valgrind runs, it does the job of the OS and loads the guest exe into memory. It then loads and runs dyld under Valgrind. I can get memory-map debug traces. In one test with a problem I see 4 load segments. __DATA_CONST and __DATA both have prot 3 (RW), so we load them RW and not R then RW. Then I think that dyld munmaps and re-mmaps the __DATA_CONST segment as RO. Valgrind works based on mmaps triggering the reading of debuginfo, and I don't think it handles munmap correctly. I need to debug that a lot more - I can see the changed mappings in the debug output, but I don't see exactly what is happening with munmap and mmap (unless dyld is doing that on a section-by-section basis). Does what I'm saying about the mappings make any sense?
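One quick way to sanity-check this theory from outside Valgrind, sketched with a placeholder binary path and pid: compare the segment protections recorded in the Mach-O load commands with the protections of the live mappings.

# On-disk view: maxprot/initprot for each LC_SEGMENT_64 (look for __DATA_CONST and __DATA).
otool -l /path/to/guest_exe | grep -A 10 'cmd LC_SEGMENT_64'

# Runtime view: protections of the live mappings of a running process; __DATA_CONST is
# typically shown read-only once dyld has finished applying fixups.
vmmap <pid> | grep -E '__DATA_CONST|__DATA '

If vmmap shows __DATA_CONST as read-only while the load command reports RW init protection, that supports the idea that the mapping changes after the initial load, whether through munmap/mmap or a plain protection change.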
Replies: 2 · Boosts: 0 · Views: 20 · Created: 13h
Xcode 26.1 re-release?
The developer downloads page now lists an Xcode 26.1 which was released on 11th Dec (the original Xcode 26.1 was posted on 3rd Nov). Strangely, this new Xcode 26.1 has a CFBundleShortVersionString of 26.1.1 and a DTXcodeBuild of 17B55:

% ls -ln
total 4413136
-rw-r--r--@ 1 503 20 2259523057 16 Dec 19:01 Xcode_26.1_Apple_silicon.xip
% xip --expand Xcode_26.1_Apple_silicon.xip
xip: signing certificate was "Software Update" (validation not attempted)
xip: expanded items from "/Users/me/Downloads/temp/Xcode_26.1_Apple_silicon.xip"
% plutil -p Xcode.app/Contents/Info.plist | grep CFBundleShort
"CFBundleShortVersionString" => "26.1.1"
% plutil -p Xcode.app/Contents/Info.plist | grep DTXcodeBuild
"DTXcodeBuild" => "17B55"

17B55 does correspond to the original Xcode 26.1 final release. The Xcode 26.1.1 release that was previously posted had a DTXcodeBuild of 17B100, though. The pairing of 26.1.1 and 17B55 looks new, and is probably a packaging error?
Replies: 2 · Boosts: 1 · Views: 79 · Created: 15h
Controlling UIDesignRequiresCompatibility via Remote Config
Hello,
I am currently in the process of gradually adding support for LiquidGlass to my app. The transition is taking place incrementally, i.e., new screens and minor features are gradually being adapted to the new design and already deployed. Currently, the old design is still active via the feature flag UIDesignRequiresCompatibility, as the existing UI should remain locked for the public App Store version until the transition is complete.
My challenge is as follows: I would like to work with the new LiquidGlass design during development without having to manually change the UIDesignRequiresCompatibility flag with each deployment. Ideally, I am looking for a solution where:
• the new design is only activated for me (e.g., a specific account or specific devices)
• the old design remains active for all other users
• the App Store version can be delivered unchanged
So my question is: Is it possible to control UIDesignRequiresCompatibility via remote config or server-side logic in order to activate the new design specifically for certain users or devices?
I have observed similar behavior on WhatsApp: two devices with the same app version, but only one shows the new design. This suggests server-side or remote config-based control.
Do you have any experience or recommendations on how to implement something like this cleanly?
Kind regards
Heinz
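Not an authoritative answer, but worth noting when weighing options: UIDesignRequiresCompatibility is an Info.plist key, and an installed app cannot rewrite its own signed Info.plist, so a pure server-side toggle of this particular flag is unlikely to work; apps such as WhatsApp presumably gate their own UI code rather than this key. One hedged workaround is to flip the key per build configuration so internal builds get the new design while App Store builds keep the old one. The script below is an illustration only (the run-script phase, the Debug check, and the PlistBuddy calls are my assumptions, not Apple guidance):

# Run-script build phase (placed after "Copy Bundle Resources"), editing the built
# product's Info.plist before code signing.
PLIST="${TARGET_BUILD_DIR}/${INFOPLIST_PATH}"

if [ "${CONFIGURATION}" = "Debug" ]; then
  DESIRED=false    # internal/dev builds: adopt the new design
else
  DESIRED=true     # App Store builds: keep the compatibility (old) design
fi

/usr/libexec/PlistBuddy -c "Set :UIDesignRequiresCompatibility ${DESIRED}" "$PLIST" ||
  /usr/libexec/PlistBuddy -c "Add :UIDesignRequiresCompatibility bool ${DESIRED}" "$PLIST"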
Replies: 1 · Boosts: 0 · Views: 22 · Created: 16h
Real Time Spatial Video Streaming with Vision Pro
Hello,
I am trying to build an AVP app for real-time, "zero-latency" spatial video streaming, and I am trying to figure out, at a high level, the best way to do this. Currently my method is:
The server sends stereo images via a WebRTC service (i.e., LiveKit).
The WebRTC stream is converted to CVPixelBuffers, written to a file, played via AVPlayer, and a VideoMaterial is applied to a plane entity.
However, this is a bit hacky, and it seems like it won't be compatible with Apple's spatial experiences. To my understanding, Apple supports HLS streaming for spatial experiences and APMP content. However, HLS (and even Low-Latency HLS) introduces a second or more of latency, likely due to the segmented nature of HLS, so HLS will not work for us. Another alternative I've thought of is streaming the live video via WebRTC from the server to a local computer on the AVP's network, and then using LL-HLS to stream from the local computer to the Vision Pro. Still, it seems like this would introduce latency on the order of seconds.
Is my current approach the best way to implement this? Or could anyone suggest a better way, perhaps something compatible with AVP's spatial experiences?
Replies: 0 · Boosts: 0 · Views: 11 · Created: 18h
Does Xcode's predictive code completion model have some censoring?
It seems Xcode's predictive code completion model is censored. Specifically, when typing the word "torrent," the model stops working completely. It doesn't matter whether the word is written directly in the code or in a comment. It could also be part of another word, such as "qBittorrent." In either case, the model stops working.
Reproducing this issue is fairly simple: create a Swift file and type the word "torrent." The model will stop generating code.
Xcode Version 26.2 (17C52)
Predictive Code Completion Model:
[com.apple.fm.code.generate_small_v2.base: 700.0.81600.13.202379,0]
[com.apple.fm.code.generate_safety_guardrail.base: 1.6.81619.13.202072,0]
[com.apple.gm.safety_deny.input.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
[com.apple.gm.safety_deny.output.code_intelligence.base: 32025010.20251009.91600.100.1651,0]
(Installed)
Replies: 0 · Boosts: 0 · Views: 74 · Created: 1d
AR location errors on cellular + WiFi model iPad with device connected to Wi-Fi
I am developing an Augmented Reality (AR) navigation application for the iPad, utilizing the ARCL library to place Points of Interest (POIs) in the real world. The application's behavior varies significantly based on the device's networking configuration:
Cellular Network (Expected Behavior): On an iPad with a cellular modem, when using the cellular network, all POIs are placed accurately with correct orientation.
Wi-Fi Only (Expected Behavior): On a Wi-Fi-only model (no GPS chip), POI placement is inaccurate, confirming the need for an external GPS receiver for that hardware configuration.
Cellular + Wi-Fi (Anomalous Behavior): The iPad is a cellular model (equipped with GNSS/GPS). The device is connected to a Wi-Fi network (enforced via an MDM profile, preventing the user from disabling Wi-Fi). When actively connected to this specific Wi-Fi network, the AR POIs consistently display with an incorrect orientation and placement, even though the device hardware has a dedicated GPS chip.
The placement error strongly suggests that the device's determined location or heading is erroneous. It appears that the active Wi-Fi connection is somehow interfering with or overriding the high-accuracy GNSS/GPS data, leading to a flawed Core Location determination that negatively impacts ARCL world tracking and anchor placement.
Has anyone experienced a scenario where an active Wi-Fi connection on a cellular iPad model causes Core Location to prioritize less accurate location data (potentially Wi-Fi-based location services) over the device's built-in GNSS/GPS, resulting in severe orientation errors? We have also observed that Apple Maps (the native application) shows the wrong location and orientation when the device is connected to this Wi-Fi network.
Replies: 0 · Boosts: 0 · Views: 62 · Created: 1d
Trouble setting up AWFK-configured watches to use TestFlight
I am developing a simple watch app, and I use my personal watch for development with Xcode. The personal watch is a Series 10, GPS only. I have two other watches that I want to use for testing the app without needing them to be connected to Xcode. The test watches have the cellular option, and I need a cell plan per watch because the watches need to be standalone (not counting initial setup). To get the standalone cell plan, the watches need to be configured using AWFK.
Here is what I have tried and the current issues:
I switch between all three watches on my phone using the Watch app.
I originally tried to put the test watches in developer mode, thinking I would connect them to Xcode, but developer mode is not available when a watch is set up using AWFK.
I pushed the watch app to App Store Connect, set up a TestFlight group, added the test users and my phone's user, and accepted the invites. TestFlight is installed on my phone, and I can see the TestFlight setup for the watch app.
I switch to a test watch using the Watch app on the phone and run Install for the test app from TestFlight on the phone; the spinner moves for a while and then goes back to Install. I am not able to get the watch app installed on the test watches from the phone.
Is what I am attempting to do supported? I haven't found much specific documentation on this. If I pair the test watches as regular watches and set them to developer mode, can I pair them again as AWFK, and will developer mode survive the switch? Or is there something really simple that I'm overlooking? I'd appreciate any help.
Replies: 0 · Boosts: 0 · Views: 64 · Created: 1d
How to delete iOS simulator runtimes?
There are multiple iOS simulator runtimes located at /System/Library/AssetsV2/com_apple_MobileAsset_iOSSimulatorRuntime which I don't need. I tried the following approaches to delete them, but they aren't working:
Xcode > Settings > Components > Delete (it can delete some, but not all of them)
xcrun simctl runtime list (it doesn't show the old runtimes)
sudo rm (Operation not permitted):
sudo rm -rf cc1f035290d244fca4f74d9d243fcd02d2876c27.asset
Password:
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/AssetData/096-69246-684.dmg: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/AssetData: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/Info.plist: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset/version.plist: Operation not permitted
rm: cc1f035290d244fca4f74d9d243fcd02d2876c27.asset: Operation not permitted
I have two questions:
Why is "xcrun simctl runtime list" unable to list some old runtimes?
How can I delete them?
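Not a confirmed answer, but the supported cleanup path goes through simctl rather than rm; the asset directories under /System/Library/AssetsV2 appear to be system-protected, which is why rm fails even as root. A minimal sketch (the runtime identifier is a placeholder taken from whatever the list command prints):

# Show the runtimes simctl knows about, with their identifiers.
xcrun simctl runtime list

# Delete one runtime by the identifier/UUID from the listing above.
xcrun simctl runtime delete <runtime-identifier>

# Stale simulator devices may also keep runtime assets around; removing devices whose
# runtime is no longer available can help the cleanup along.
xcrun simctl delete unavailable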
Replies: 0 · Boosts: 0 · Views: 43 · Created: 1d
Forecasts missing in WeatherKit
I am using the WeatherKit REST API with the hourlyStart/hourlyEnd parameters to request up to 240 hours of forecast data. However, when requesting later in the day, the API returns fewer than 240 hourly forecasts: e.g., 239 at 08:00, 238 at 09:00, and so on, down to 224 at 23:00. It appears the returned list is contiguous but truncated at the end compared to the full 240-hour window. I have also tried fetching the data again after some time (for example, the 09:00 data at 09:45), but the same hours were still missing at the end. Is this expected WeatherKit behavior or a bug? If it's expected, is there documentation explaining how the "forecast horizon" is determined and when it is updated? Thank you.
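For reference, a hedged way to see exactly what the service returns for a given window (the token, coordinates, and timestamps are placeholders; the URL shape and field names follow my reading of the WeatherKit REST documentation and may need adjusting):

TOKEN="<weatherkit-jwt>"
START="2025-12-17T08:00:00Z"
END="2025-12-27T08:00:00Z"      # START + 240 hours

# Count the hours returned and print the first and last forecastStart values.
curl -s -H "Authorization: Bearer ${TOKEN}" \
  "https://weatherkit.apple.com/api/v1/weather/en/37.3349/-122.0090?dataSets=forecastHourly&hourlyStart=${START}&hourlyEnd=${END}" |
  jq '.forecastHourly.hours | [length, .[0].forecastStart, .[-1].forecastStart]'

Comparing the last forecastStart against the requested hourlyEnd at several times of day would show whether the horizon is pinned to a fixed model run (which would explain the shrinking count) rather than rolling forward every hour.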
Replies: 0 · Boosts: 0 · Views: 41 · Created: 1d