I am developing an app in SwiftUI using Xcode 12.3, deployment target iOS 14.0. The launch screen is set up through Info.plist by specifying 'background color' and 'image name'. The file used for 'image name' is from the asset catalog (PNG format, 300 x 300, with corresponding @2x and @3x resolutions). What I have observed is that when the app is installed for the first time, the launch image is centered and shown at its original resolution, but all subsequent launches show the launch image stretched to cover the full screen. Any ideas why this is happening and how to get more consistent behavior either way?
I have tried the 'respect safe area' option, but it does not make a difference.
Thank you.
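For reference, the launch screen is configured with the standard UILaunchScreen keys in Info.plist; roughly like this (key names per Apple's documentation, values illustrative):
<key>UILaunchScreen</key>
<dict>
    <key>UIColorName</key>
    <string>LaunchBackground</string>
    <key>UIImageName</key>
    <string>LaunchImage</string>
    <key>UIImageRespectsSafeAreaInsets</key>
    <true/>
</dict>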
In WWDC25 video 284: Build a UIKit app with the new design, there is mention of a cornerConfiguration property on UIVisualEffectView. But this property isn't documented, and Xcode 26 isn't aware of any such property.
I'm trying to replicate the results of that video in the section titled Custom Elements starting at the 19:15 point. There are a lot of missing details and typos in the code associated with that video.
My attempts with UIGlassEffect and UIVisualEffectView do not result in any capsule shapes. I just get rectangles with no rounded corners at all.
As an experiment, I am trying to recreate the capsule with the layers/location buttons in the iOS 26 version of the Maps app.
I put the following code in a view controller's viewDidLoad method:
// Buttons for the two controls (layers and location).
let imgCfgLayer = UIImage.SymbolConfiguration(hierarchicalColor: .systemGray)
let imgLayer = UIImage(systemName: "square.2.layers.3d.fill", withConfiguration: imgCfgLayer)
var cfgLayer = UIButton.Configuration.plain()
cfgLayer.image = imgLayer
let btnLayer = UIButton(configuration: cfgLayer, primaryAction: UIAction(handler: { _ in
    print("layer")
}))

var cfgLoc = UIButton.Configuration.plain()
let imgLoc = UIImage(systemName: "location")
cfgLoc.image = imgLoc
let btnLoc = UIButton(configuration: cfgLoc, primaryAction: UIAction(handler: { _ in
    print("location")
}))

// Glass background hosting both buttons.
let bgEffect = UIGlassEffect()
bgEffect.isInteractive = true
let bg = UIVisualEffectView(effect: bgEffect)
bg.contentView.addSubview(btnLayer)
bg.contentView.addSubview(btnLoc)
view.addSubview(bg)

btnLayer.translatesAutoresizingMaskIntoConstraints = false
btnLoc.translatesAutoresizingMaskIntoConstraints = false
bg.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    btnLayer.leadingAnchor.constraint(equalTo: bg.contentView.leadingAnchor),
    btnLayer.trailingAnchor.constraint(equalTo: bg.contentView.trailingAnchor),
    btnLayer.topAnchor.constraint(equalTo: bg.contentView.topAnchor),
    btnLoc.centerXAnchor.constraint(equalTo: bg.contentView.centerXAnchor),
    btnLoc.topAnchor.constraint(equalTo: btnLayer.bottomAnchor, constant: 15),
    btnLoc.bottomAnchor.constraint(equalTo: bg.contentView.bottomAnchor),
    bg.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
    bg.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 40),
])
The result is pretty close other than the complete lack of capsule shape.
What changes would be needed to get the capsule shape? Is this even the proper approach?
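One direction that might be worth trying (an assumption on my side, not something shown in the video) is rounding the effect view's layer manually once its size is known; whether this composes correctly with UIGlassEffect is unverified. Assuming bg is promoted to a property:
// Not from the video: the conventional way to round a UIVisualEffectView,
// applied once the layout pass has produced a final size.
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    bg.layer.cornerRadius = bg.bounds.width / 2   // vertical capsule
    bg.clipsToBounds = true
}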
In tvOS 18, onMoveCommand misses the first press after a view is loaded and every time the direction is changed. It also misses the first press on a button after a focus change. This appears to only impact the newer silver remote, not the older black remote or IR remotes.
With the code below, press any direction 3 times and it will only log twice.
struct ButtonTest: View {
    var body: some View {
        VStack {
            Button {
                debugPrint("button 1")
            } label: {
                Text("Button 1")
            }
            Button {
                debugPrint("button 2")
            } label: {
                Text("Button 2")
            }
            Button {
                debugPrint("button 3")
            } label: {
                Text("Button 3")
            }
        }
        .onMoveCommand(perform: { direction in
            debugPrint("move \(direction)")
        })
        .padding()
    }
}
I'm using GoogleMaps in my project.
Legacy preview works well, but the new preview (Xcode 16.3.1 beta) produces an error. It doesn't seem to find GoogleMaps.a.
== PREVIEW UPDATE ERROR:
FailedToLaunchAppError: Failed to launch ***
==================================
| [Remote] JITError
|
| ==================================
|
| | [Remote] CouldNotLoadInputStaticArchiveFile: Could not load static archive during preview: /Users/xxx/Library/Developer/Xcode/DerivedData/BOA-eiluspltxasszsfkpqrnnsxsjhth/Build/Products/Debug_BOA_Inhouse-iphonesimulator/GoogleMaps.a
| |
| | path: /Users/xxx/Library/Developer/Xcode/DerivedData/BOA-eiluspltxasszsfkpqrnnsxsjhth/Build/Products/Debug_BOA_Inhouse-iphonesimulator/GoogleMaps.a
| |
| | ==================================
| |
| | | [Remote] XOJITError
| | |
| | | XOJITError: arm64 slice of /Users/xxx/Library/Developer/Xcode/DerivedData/BOA-eiluspltxasszsfkpqrnnsxsjhth/Build/Products/Debug_BOA_Inhouse-iphonesimulator/GoogleMaps.a does not contain an archive
The sample code provided in "Building a document-based app with SwiftUI" (https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui) does not work as expected.
The DocumentGroup/StoryView toolbar does not appear for documents opened in the App.
By removing the DocumentGroupLaunchScene block from the App the toolbar does appear and works as expected - but of course the App's DocumentGroupLaunchScene customizations are lost.
I've tested this on 18.0 devices, as well as production 18.0 and 18.1 beta 6 simulators.
If I modify StoryView by wrapping its content in a NavigationStack I can make some progress, but the results are unstable and hard to pin down: with this change, the first time a document is opened in the WritingApp the toolbar appears as expected, but when a document is opened subsequently the toolbar is corrupted.
Please, is this a bug, or is there a good example of incorporating DocumentGroupLaunchScene customizations at the App level while retaining the toolbar in documents presented via DocumentGroup?
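For context, the scene setup in question looks roughly like this (a sketch based on the sample; StoryDocument and StoryView come from Apple's project, and the background image name is illustrative):
import SwiftUI

@main
struct WritingApp: App {
    var body: some Scene {
        // Documents open in this group; StoryView's toolbar is what disappears.
        DocumentGroup(newDocument: StoryDocument()) { file in
            StoryView(document: file.$document)
        }
        // Removing this launch scene brings the toolbar back, per the description above.
        DocumentGroupLaunchScene("Writing App") {
            NewDocumentButton("Start Writing")
        } background: {
            Image("LaunchBackground")   // illustrative asset name
                .resizable()
                .aspectRatio(contentMode: .fill)
        }
    }
}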
Topic:
UI Frameworks
SubTopic:
SwiftUI
I integrated an Advanced App Clip Experience into my app. When trying to test the App Clip Card, the card does not appear when I tap the link on my device that is associated with the Advanced App Clip Experience.
Here is some contextual information:
Testing on device running on iOS 18.3.1
Associated Domains:
Main target app: applinks:
App clips target app: applinks: and appclips:
Archived and uploaded build to App Store Connect.
Green "Testing" status via Testflight.
On Distribution tab, green "Valid" status for build domain.
Advanced App Clip Experience green "Received" status.
Developer App Clip Testing Diagnostics:
Green "Register Advanced Experience" status
Green "App Clip Code" status
Warning "App Clip Published on App Store"
Orange Circle "Associated Domains"
After looking at countless threads, I still cannot for the life of me find a solution to test the App Clip. Any guidance would be extremely appreciated. I had also submitted a support ticket with case ID #102552504973.
Fatal Exception: NSInternalInconsistencyException
Cannot remove an observer <WKWebView 0x135137800> for the key path "configuration.enforcesChildRestrictions" from <STScreenTimeConfigurationObserver 0x13c6d7460>, most likely because the value for the key "configuration" has changed without an appropriate KVO notification being sent. Check the KVO-compliance of the STScreenTimeConfigurationObserver class.
I noticed that on iOS 26, WKWebView registers an STScreenTimeConfigurationObserver. Is this an iOS 26 system issue? What should I do?
Hello, community and Apple engineers. I need your help.
Our app has the following issue: NavigationStack pushes a view twice if the NavigationStack is inside a TabView and the NavigationStack uses a navigation path of custom Hashable elements.
Our app exhibits this issue with Xcode 18 Beta 13 + iOS 18.0. The same issue happened on previous beta versions of Xcode 18.
The issue isn't present in iOS 17.x, and everything worked well before the iOS 18.0 beta releases.
I was able to reproduce the same issue in a clean project with two simple views. I will paste the code below.
Several notes:
We use a centralised routing system in our app where all possible routes for the navigation path are implemented in a View extension called withAppRouter().
We have an enum RouterDestination that contains all possible routes and is resolved in the withAppRouter() extension.
We use a Router class that contains @Published var path: [RouterDestination] = [], and this @Published property is bound to the NavigationStack. In the real app, we need access to this path property for programmatic navigation.
Our app uses the ObservableObject / @StateObject approach.
import SwiftUI

struct ContentView: View {
    @StateObject private var router = Router()

    var body: some View {
        TabView {
            NavigationStack(path: $router.path) {
                NavigationLink(value: RouterDestination.next, label: {
                    Label("Next", systemImage: "plus.circle.fill")
                })
                .withAppRouter()
            }
        }
    }
}

enum RouterDestination: Hashable {
    case next
}

struct SecondView: View {
    var body: some View {
        Text("Screen 2")
    }
}

class Router: ObservableObject {
    @Published var path: [RouterDestination] = []
}

extension View {
    func withAppRouter() -> some View {
        navigationDestination(for: RouterDestination.self) { destination in
            switch destination {
            case .next:
                return SecondView()
            }
        }
    }
}
[GIF demonstrating the double push omitted]
What I tried:
Use the iOS 17+ @Observable approach (sketched below). It didn't help.
Using @State var path: [RouterDestination] = [] directly inside the View seems to help, but it is not what we want: we need this property to be @Published and located inside the Router class, where we can access it and use it for programmatic navigation if needed.
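Roughly, the @Observable variant mentioned above looked like this (a sketch; RouterDestination and withAppRouter() are as defined earlier, and it showed the same double push for us):
import SwiftUI
import Observation

@Observable
final class ObservableRouter {
    var path: [RouterDestination] = []
}

struct ObservableContentView: View {
    @State private var router = ObservableRouter()

    var body: some View {
        TabView {
            NavigationStack(path: $router.path) {
                NavigationLink(value: RouterDestination.next) {
                    Label("Next", systemImage: "plus.circle.fill")
                }
                .withAppRouter()
            }
        }
    }
}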
I ask Apple engineers to help with this, please, and if it is a bug in the iOS 18 beta, please fix it in an upcoming iOS 18.0 release.
Basic Information
Please provide a descriptive title for your feedback:
Sheet presentationDetents breaks after rapid open/dismiss cycles
Which platform is most relevant for your report?
iOS
Description
Steps to Reproduce:
1. Create a sheet with presentationDetents([.medium])
2. Rapidly perform these actions multiple times (usually 3-4 times):
   a. Open the sheet
   b. Immediately scroll down to dismiss
3. Open the sheet again
4. Observe that the sheet now appears at .large size, ignoring the .medium detent
Expected Result:
Sheet should consistently maintain .medium size regardless of how quickly it is opened and dismissed.
Actual Result:
After rapid open/dismiss cycles, the sheet ignores the .medium detent and appears at .large size.
Reproduction Rate:
Occurs consistently after 3-4 rapid open/dismiss cycles
More likely to occur with faster open/dismiss actions
Configuration:
iOS 18
Xcode 16.0 (16A242d)
SwiftUI
Device: iPhone 14
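For reference, a minimal view matching the setup described in the steps above (names are illustrative):
struct SheetDetentDemo: View {
    @State private var showSheet = false

    var body: some View {
        Button("Open sheet") { showSheet = true }
            .sheet(isPresented: $showSheet) {
                // After several rapid open/dismiss cycles, this sheet comes up
                // at .large despite the .medium detent.
                ScrollView { Text("Sheet content") }
                    .presentationDetents([.medium])
            }
    }
}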
I have set up a collection view cell programmatically, which I am registering in the cell registration of a diffable data source. I have set up a delegate for the cell, with the view controller that contains the collection view acting as the cell's delegate. However, calling the delegate method from the cell is not calling the method in the view controller.
This is my code:
// In my view controller:
cell.delegate = self
// In my collection view cell:
delegate?.repromptLLMForIconName(iconName, index: index, emotion: emotion, red: red, green: green, blue: blue)
But although the delegate call in the collection view cell is executed, my method implementation in the view controller does not get called. What could possibly be going wrong?
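For reference, a stripped-down sketch of the wiring described above, with hypothetical names (IconCell, IconCellDelegate) and a reduced parameter list; the registration handler is where the delegate gets assigned:
import UIKit

protocol IconCellDelegate: AnyObject {
    func repromptLLMForIconName(_ iconName: String, index: Int)
}

final class IconCell: UICollectionViewCell {
    weak var delegate: IconCellDelegate?   // weak to avoid a retain cycle

    func didTapReprompt() {
        delegate?.repromptLLMForIconName("star", index: 0)
    }
}

final class IconsViewController: UIViewController, IconCellDelegate {
    var dataSource: UICollectionViewDiffableDataSource<Int, String>!

    func makeDataSource(for collectionView: UICollectionView) {
        // The registration handler runs on every dequeue, so the delegate
        // is (re)assigned each time a cell is configured.
        let registration = UICollectionView.CellRegistration<IconCell, String> { [weak self] cell, _, _ in
            cell.delegate = self
        }
        dataSource = UICollectionViewDiffableDataSource(collectionView: collectionView) { collectionView, indexPath, item in
            collectionView.dequeueConfiguredReusableCell(using: registration, for: indexPath, item: item)
        }
    }

    func repromptLLMForIconName(_ iconName: String, index: Int) {
        print("reprompt", iconName, index)
    }
}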
Topic:
UI Frameworks
SubTopic:
UIKit
Is there a way to access an Icon Composer .icon file in Swift or Objective-C? Any way to get this in an NSImage object that I can display in an image view? Thanks.
I'm working on a NavigationStack based app. Somewhere I'm using:
@Environment(\.dismiss) private var dismiss
and when trying to navigate to that view it gets stuck.
I used Self._printChanges() and discovered the environment variable dismiss is changing repeatedly. Obviously I am not changing that variable explicitly. I wasn't able to reproduce this in a small project so far, but does anybody have any idea what kind of thing I could be doing that might be causing this issue?
iOS 17.0.3
I'm trying to piece together how I should reason about multiple windows when activating menus and items.
For instance, I have an email command. If a document is open, it emails just that document. If the collection view holding that document is open, it generates an attachment representing the collection. (This happens with buttons right now, but it really feels like it should be a menu item.) How do I tell which window is active (document or collection) in order to have the right menu item available?
If the user is adding a document I don't want the "New" command available; I only want one editing view. I saw sample code to include or remove commands, but not to disable them. I feel like there's a whole conceptual layer I need to understand about the interplay with scenes, but I don't know where to look for documentation. I searched here, but there are hardly any threads on this.
TIA
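One possible direction (an assumption, not an established answer from this thread): UIKit's responder-based menu validation lets each window's responder chain decide a command's state, so the key window effectively determines which items are available or disabled. A sketch:
import UIKit

// Hypothetical document view controller; validate(_:) is called by the system
// when it refreshes the state of menu items that target this responder.
final class DocumentViewController: UIViewController {

    @objc func emailDocument(_ sender: Any?) {
        // Email just this document.
    }

    override func validate(_ command: UICommand) {
        super.validate(command)
        if command.action == #selector(emailDocument(_:)) {
            // Disable (rather than remove) the command while editing.
            command.attributes = isEditing ? [.disabled] : []
        }
    }
}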
Topic:
UI Frameworks
SubTopic:
UIKit
I was hoping for an update to SwiftData that adopted shared and public CloudKit containers, in the same way it already supports the private CloudKit container.
So firstly, a big request to any Apple devs reading for this to be a thing!
Secondly, what would be a sensible way of adding a shared CloudKit container to an existing app that is already using SwiftData?
Would it be possible to use the new DataStore API to manage CloudKit syncing with a public or shared container?
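For what it's worth, a sketch of how the existing private-database setup is configured today (the container identifier and model are illustrative); the question is whether equivalent options for the shared and public databases could be added:
import SwiftData

@Model
final class Note {
    var text: String
    init(text: String) { self.text = text }
}

// Today, SwiftData + CloudKit sync is limited to the private database.
func makeContainer() throws -> ModelContainer {
    let config = ModelConfiguration(
        cloudKitDatabase: .private("iCloud.com.example.Notes")   // illustrative identifier
    )
    return try ModelContainer(for: Note.self, configurations: config)
}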
The following code won't work:
- (void)windowDidLoad {
    [super windowDidLoad];
    self.window.isVisible = NO;
}
The main window still shows on application startup (in a minimal, newly created app).
One of my published apps on the App Store relies on this behavior, which had been working for many years, since I started Xcode development.
Topic:
UI Frameworks
SubTopic:
AppKit
I am trying to implement a Live Activity in my app. I am following the Apple docs.
Link: https://developer.apple.com/documentation/activitykit/displaying-live-data-with-live-activities
Example code:
struct LockScreenLiveActivityView: View {
    let context: ActivityViewContext<PizzaDeliveryAttributes>

    var body: some View {
        VStack {
            Spacer()
            Text("\(context.state.driverName) is on their way with your pizza!")
            Spacer()
            HStack {
                Spacer()
                Label {
                    Text("\(context.attributes.numberOfPizzas) Pizzas")
                } icon: {
                    Image(systemName: "bag")
                        .foregroundColor(.indigo)
                }
                .font(.title2)
                Spacer()
                Label {
                    Text(timerInterval: context.state.deliveryTimer, countsDown: true)
                        .multilineTextAlignment(.center)
                        .frame(width: 50)
                        .monospacedDigit()
                } icon: {
                    Image(systemName: "timer")
                        .foregroundColor(.indigo)
                }
                .font(.title2)
                Spacer()
            }
            Spacer()
        }
        .activitySystemActionForegroundColor(.indigo)
        .activityBackgroundTint(.cyan)
    }
}
Actually, the code is pretty straightforward. We can use timerInterval for the countdown animation. But when the timer ends, I want to update the Live Activity view. If the user re-opens the app, I can update it, but what happens if the user doesn't open the app? Is there a way to update the Live Activity without using push notifications?
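For context, the in-app update path mentioned above looks roughly like this (a sketch; the state fields follow the pizza-delivery example, and this only runs while the app gets execution time, which is exactly the limitation in question):
import ActivityKit

// Update any running pizza-delivery activities once the timer has ended.
func markTimerFinished() async {
    for activity in Activity<PizzaDeliveryAttributes>.activities {
        let finishedState = PizzaDeliveryAttributes.ContentState(
            driverName: activity.content.state.driverName,
            deliveryTimer: Date.now...Date.now   // timer finished
        )
        await activity.update(ActivityContent(state: finishedState, staleDate: nil))
    }
}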
Xcode downloaded a crash report for my app which I don't quite understand. It seems the following line caused the crash:
myEntity.image = newImage
where myEntity is of type MyEntity:
class MyEntity: NSObject, Identifiable {
    @objc dynamic var image: NSImage!
    ...
}
The code is called on the main thread. According to the crash report, thread 0 makes that assignment, and at the same time thread 16 is calling [NSImageView asynchronousPreparation:prepareResultUsingParameters:].
What could cause such a crash? Could I be doing something wrong or is this a bug in macOS?
After updating to Sonoma, the following is logged in the Xcode console when an editable text field becomes key. This doesn't occur for every text field, but it seems to happen when the text field is within an NSPopover or an NSSavePanel.
ViewBridge to RemoteViewService Terminated: Error Domain=com.apple.ViewBridge Code=18 "(null)" UserInfo={com.apple.ViewBridge.error.hint=this process disconnected remote view controller -- benign unless unexpected, com.apple.ViewBridge.error.description=NSViewBridgeErrorCanceled}
What does this mean?
Topic:
UI Frameworks
SubTopic:
AppKit
When I build my app for iPadOS (either 26 or 18.5), as well as for iOS 16.5, from Xcode 26 with UIDesignRequiresCompatibility enabled, my app crashes as it loads the main UIViewController, a subclassed UITabBarController that is loaded programmatically from a storyboard by another SplashScreen view controller.
On i(Pad)OS 18.5 I get this error:
Thread 1: "Could not instantiate class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ because no class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ was found; the class needs to be defined in source code or linked in from a library (ensure the class is part of the correct target)"
On iPadOS 26 I get this error:
UIKitCore/UICoreHostingView.swift:54: Fatal error: init(coder:) has not been implemented
There is no issue when building from Xcode 16.4, regardless of the targeted i(Pad)OS version.
In the WWDC 2025 session "Build a UIKit app with the new design", at the 23:22 mark, the presenter says:
And finally, when you no longer need the glass on screen animate it out by setting the effect to nil.
The video shows a UIVisualEffectView, whose effect is set to a UIGlassEffect, animating away as its effect is set to nil. But when I do this in my app (or a sample app), setting the effect to nil does not remove the glass appearance. Is this expected? Is the video out of date? Or is this a bug?
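For reference, the pattern described in the session boils down to something like this sketch (glassView is assumed to already be on screen with a UIGlassEffect, as set up earlier):
import UIKit

// Animate the glass away by setting the effect to nil inside an animation block,
// as the session describes; in my testing the glass appearance remains.
func hideGlass(_ glassView: UIVisualEffectView) {
    UIView.animate(withDuration: 0.3) {
        glassView.effect = nil
    }
}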