There's an easily reproducible SwiftUI bug on macOS where an app's UI state no longer updates/re-renders for "Designed for iPad" apps (i.e. ProcessInfo.processInfo.isiOSAppOnMac == true). The bug occurs whether the app is launched from Xcode or run on its own.
The bug occurs when:
the user Hides the app (i.e. it goes into the background)
the user puts the Mac to sleep (e.g. Apple menu > Sleep)
a total of ~60 seconds transpires (i.e. macOS puts the app into the "suspended state")
when the app is brought back into the foreground the UI no longer updates properly
The only way I have found to fix this is to manually open a new app window via File > New, in which case the app works fine again in the new window.
The following extremely simple code in a default Xcode project illustrates the issue:
import SwiftUI

@main
struct staleApp: App {
    @State private var isBright = true

    var body: some Scene {
        WindowGroup {
            ZStack {
                (isBright ? Color.white : Color.black).ignoresSafeArea()
                Button("TOGGLE") { isBright.toggle(); print("TAPPED") }
            }
            .onAppear { print("\(isBright ? "light" : "dark") view appeared") }
        }
    }
}
For the code above, after Hiding the app and putting the computer to sleep for 60 seconds or more, the button no longer swaps views, although the print statements still appear in the console when I tap the button. Also, while in this buggy state, I can get the view to update to the current state (i.e. the view triggered by the last tap) by manually dragging the corner of the app window to resize it. But after resizing, the view again does not update on button taps until I resize the window again.
So it appears the diffing engine is broken, or that the Scene or WindowGroup is no longer correctly running on the main thread.
I have tried rebuilding the entire view hierarchy by changing .id() on views, but this approach does NOT work. I have tried many other options/hacks but have not been able to reset the 'view engine' other than by opening a new window manually, or by using @Environment(\.openWindow) private var openWindow.
openWindow could be a viable solution, except there's no way to programmatically close the old window when isiOSAppOnMac is true (@Environment(\.dismissWindow) private var dismissWindow doesn't work for iOS).
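For reference, here is a minimal sketch of the openWindow workaround I experimented with (assuming the WindowGroup is given the hypothetical id "main"); it is not a real fix, since the stale window cannot be closed:

struct ResetWindowButton: View {
    @Environment(\.openWindow) private var openWindow
    @Environment(\.dismissWindow) private var dismissWindow

    var body: some View {
        Button("Reset window") {
            // Opening another window from the same WindowGroup gives a working UI again...
            openWindow(id: "main")
            // ...but this call has no effect for iOS-on-Mac, so the stale window stays open.
            dismissWindow(id: "main")
        }
    }
}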
I have a DocumentGroup working with a FileDocument, and that's fine.
However, when someone creates a new document, I want them to have to save it immediately. From what I understand, this is the behavior on iPadOS and iOS (you select where the file goes before it is created).
There seems to be no way to do this on macOS?
I basically want to have someone:
create a new document
enter some basic data
hit "create" which saves the file
then lets the user start editing it
(1), (2), and (4) are done and fairly trivial.
(3) seems impossible, though...?
This really only needs to support macOS but any pointers would be appreciated.
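For context, here is a rough sketch of the flow I'm after, using fileExporter as a stand-in for step (3); MyDocument, the content type, and the fields are placeholders:

struct NewDocumentSheet: View {
    @State private var title = ""
    @State private var document = MyDocument()          // a FileDocument, placeholder type
    @State private var showingExporter = false

    var body: some View {
        Form {
            TextField("Title", text: $title)             // (2) enter some basic data
            Button("Create") {                           // (3) force an immediate save
                document.title = title
                showingExporter = true
            }
        }
        .fileExporter(isPresented: $showingExporter,
                      document: document,
                      contentType: .myDocumentType,      // placeholder UTType
                      defaultFilename: title) { result in
            // (4) on success, open the saved URL and let the user start editing
        }
    }
}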
I want to record the screen in my app using startCaptureWithHandler:completionHandler:. The sampleBuffer passed to the handler is supposed to exist, but it sometimes becomes nil. There's another problem as well: when I want to stop recording and save the video, I first check [RPScreenRecorder sharedRecorder].recording, and it is sometimes false. These problems are unusual and unexpected, seen on iOS 18.3.2 on an iPhone XS Max. Here is my code:
- (void)startCaptureScreen {
    NSLog(@"AKA++ startCaptureScreen");
    if ([[RPScreenRecorder sharedRecorder] isRecording]) {
        return;
    }

    // Screen recording
    [[RPScreenRecorder sharedRecorder] setMicrophoneEnabled:YES];
    NSLog(@"AKA++ MicrophoneEnabled AAAA startCaptureScreen");
    [[RPScreenRecorder sharedRecorder] setCameraEnabled:YES];

    [[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
        // Lazily create the asset writer from the first sample buffer.
        if (self.assetWriter == nil) {
            if (self.AVAssetWriterStatus == 0) {
                [self setupAssetWriterAndStartWith:sampleBuffer];
            }
        }
        if (self.AVAssetWriterStatus != 2) {
            return;
        }
        if (error) {
            // deal with error
            return;
        }
        if (self.assetWriter.status != AVAssetWriterStatusWriting) {
            [self assetWriterAppendSampleBufferFailWith:bufferType];
            return;
        }
        if (bufferType == RPSampleBufferTypeVideo) {
            if (self.assetWriter.status == 0 || self.assetWriter.status > 2) {
                // Writer is in an unusable state; skip this buffer.
            } else if (self.videoAssetWriterInput.readyForMoreMediaData) {
                BOOL success = [self.videoAssetWriterInput appendSampleBuffer:sampleBuffer];
            }
        }
        if (bufferType == RPSampleBufferTypeAudioMic) {
            if (self.assetWriter.status == 0 || self.assetWriter.status > 2) {
                // Writer is in an unusable state; skip this buffer.
            } else if (self.audioAssetWriterInput.readyForMoreMediaData) {
                BOOL success = [self.audioAssetWriterInput appendSampleBuffer:sampleBuffer];
            }
        }
    } completionHandler:^(NSError * _Nullable error) {
        // deal with error
    }];
}
And then, when I want to save it:
- (void)stopRecording {
    if ([[RPScreenRecorder sharedRecorder] isRecording]) {
        // The problem is sporadic: recording is sometimes NO here, which is confusing.
    }
    [[RPScreenRecorder sharedRecorder] stopCaptureWithHandler:^(NSError * _Nullable error) {
        if (!error) {
            // post message
        }
    }];
}
Hi, I'm working with RealityView and I have two entities in RCP. To set views for both entities, I have to create two separate attachments, one per entity. What I want to achieve is that when I hover (by eye) over one entity's attachment, it triggers the hover effect on the other entity's attachment as well. I tried using hoverEffectGroup, but it only activates the hover effect within a subview, not in a completely separate view. I was following the WWDC instructions for hover effects:
https://developer.apple.com/videos/play/wwdc2024/10152/
I've been trying to add a header to a TabSection of a TabView on tvOS 18+.
init(
    @TabContentBuilder<SelectionValue> content: () -> Content,
    @ViewBuilder header: () -> Header
) where Header : View, Footer == EmptyView
The header clearly has to conform to View, but I can't quite fit a Label with a UIImage as the icon into this. When I add this Label to any other view, the image is the specified 50 x 50 size, but inside the header it behaves weirdly and becomes huge. Note also that if I simply use a plain icon here, it renders correctly. So what is the problem here? Can someone help me? I'm supposed to show the user's profile image and name in the header, and I don't think there's any other way to do it.
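For reference, this is roughly what I'm passing as the header (the tab views, profileName, and profileImage are placeholders):

TabSection {
    Tab("Movies", systemImage: "film") { MoviesView() }
    Tab("Shows", systemImage: "tv") { ShowsView() }
} header: {
    Label {
        Text(profileName)
    } icon: {
        Image(uiImage: profileImage)
            .resizable()
            .frame(width: 50, height: 50)   // respected in other views, but huge inside the header
            .clipShape(Circle())
    }
}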
How should I program the Globe key? If possible, could you show me how in C?
We've seen a spike in crashes on iOS 18.4 across both iPhone & iPad. We can't reproduce it, but it looks like it happens when the app goes into the background.
Crash Log
Hi Apple team and community,
We’re encountering a strange issue with Live Activity that seems related to memory management or background lifecycle.
❓ Issue:
Our app updates a Live Activity regularly (every 3 minutes) using .update(...). However, after the app remains in the background for around 8 hours, the Live Activity reverts to the initial state that was passed into .request(...).
Even though the app continues sending updates in the background, the UI on the Lock Screen and Dynamic Island resets to the original state.
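For reference, the update path is essentially this (the attributes type and its fields are placeholders):

func refreshLiveActivity(_ activity: Activity<MyAttributes>, progress: Double) async {
    // Called roughly every 3 minutes while the app is in the background.
    let state = MyAttributes.ContentState(progress: progress)
    await activity.update(ActivityContent(state: state, staleDate: nil))
}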
I'm trying to determine if it’s possible to detect when a user interacts with a Slide Over window while my app is running in the background on iPadOS. I've explored lifecycle methods such as scenePhase and various UIApplication notifications (e.g., willResignActiveNotification) to detect focus loss, but these approaches don't seem to capture the event reliably. Has anyone found an alternative solution or workaround for detecting this specific state change? Any insights or recommended practices would be greatly appreciated.
Why is it that after the app is uploaded to TestFlight, the App Clip can no longer be launched via NFC? Is this only supported after the app is released on the App Store?
We are using the contactAccessPicker modifier connected to a Button to allow the user to change the selection of contacts they allow our app to use. In the two places where the iOS UI screen refers to our app:
"manage which contacts [app name] can access." at the top,
and again below in the explanatory text,
the app name appears to be taken from the app's PRODUCT_NAME. Instead, we need it to be either CFBundleName or CFBundleDisplayName.
In our case they are different (PRODUCT_NAME is legacy, kept for rebranding reasons, which is very common in apps).
Is there a specific reason why iOS uses PRODUCT_NAME (or something similar) in the contactAccessPicker UI screen instead of the user-facing CFBundleName or CFBundleDisplayName? Or is this a bug?
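For reference, this is roughly how we present the picker (simplified; the view, button label, and the assumption that the completion hands back contact identifiers are mine, not verbatim from our code):

import ContactsUI
import SwiftUI

struct ContactsSettingsRow: View {
    @State private var showingPicker = false

    var body: some View {
        Button("Change selected contacts") { showingPicker = true }
            .contactAccessPicker(isPresented: $showingPicker) { identifiers in
                // Handle the identifiers of newly authorized contacts.
            }
    }
}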
I have a popover/sheet in iOS which allows users to search and add items to a list. When the sheet is shown, the search should always be active.
I am using searchable on a NavigationStack inside the sheet. I am using the isPresented parameter to activate search.
My issue is with the animation of the search activation. Even if I use...
isPresented: .constant(true)
...the search isn't activated until the sheet has completed its entrance animation, resulting in two stages of animation.
I can't add a video here, but the two images below show the steps I am seeing. First a slide up animation, with the search in the navigation drawer, then a second animation, once the sheet is fully in place, as the search becomes active.
Is it possible to merge these two animations, so search is in place when the sheet animates up?
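For reference, a simplified version of the sheet looks like this (ItemSearchView, results, query, and showingAddItems are placeholders):

.sheet(isPresented: $showingAddItems) {
    NavigationStack {
        ItemSearchView(results: results)
            .searchable(text: $query,
                        isPresented: .constant(true),   // still only activates after the sheet finishes animating in
                        placement: .navigationBarDrawer(displayMode: .always))
    }
}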
Hi, does anyone know if there are changes to the view lifecycle in Xcode 16 / iOS 18? I've noticed that viewWillAppear now does what viewDidAppear used to do in older versions, which is a problem because the rest of iOS behaves differently. Has anyone else run into this?
Or does anyone know if there was a documented change to the lifecycle?
Hello, I have a question I hope can be answered. I'm working on a music project with Mac Catalyst and need music files such as FLAC to offer my Mac Catalyst app in the right-click > Open With menu. With the configuration below, after debugging a newly created macOS project, I can see my app as an option in the right-click Open With menu. But when I created an iOS project, converted it to a Mac Catalyst app, and modified its Info.plist with the same configuration, I couldn't see my app in the Open With menu after debugging. How can I solve this? Do I need to configure any additional permissions or capabilities in the Mac Catalyst project? I have been searching for a long time but have not found a solution. Thank you.
Here is the configuration of my macOS project:
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeExtensions</key>
        <array>
            <string>flac</string>
        </array>
        <key>CFBundleTypeIconSystemGenerated</key>
        <integer>1</integer>
        <key>CFBundleTypeName</key>
        <string>FLAC Audio File</string>
        <key>CFBundleTypeRole</key>
        <string>Viewer</string>
        <key>LSHandlerRank</key>
        <string>Default</string>
    </dict>
</array>
Note: Sandbox permissions have been enabled for both the macOS project and the iOS-to-Mac Catalyst project. The Mac Catalyst project also has the additional com.apple.security.files.user-selected.read-write entitlement.
I want to add the option to choose an alternative icon inside the app.
Is there a way to load an icon asset from within the app? I downloaded Apple’s alternative icon sample, which is supposed to show a list of icons to choose from, but even in the sample, it did not work.
So the current solution is to add every alternative icon along with another image asset of the same image to display to the user. This sounds like a waste of bytes.
Thank you in advance for any help.
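What I was hoping for is something along these lines: read the alternate icon names from Info.plist and reuse the icon files themselves for the in-app preview, instead of duplicating every icon in the asset catalog. This is only a sketch; the part I'm unsure about is whether UIImage(named:) can find the icon files at all:

import UIKit

func alternateIconPreviews() -> [(name: String, image: UIImage?)] {
    guard let icons = Bundle.main.object(forInfoDictionaryKey: "CFBundleIcons") as? [String: Any],
          let alternates = icons["CFBundleAlternateIcons"] as? [String: Any] else {
        return []
    }
    return alternates.compactMap { name, value in
        guard let entry = value as? [String: Any],
              let files = entry["CFBundleIconFiles"] as? [String],
              let file = files.first else {
            return nil
        }
        // Icon files listed here are not asset-catalog images, so UIImage(named:)
        // may only work if they are also bundled as plain resources.
        return (name, UIImage(named: file))
    }
}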
import SwiftUI

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Button 1") {
                print("Button 1")
            }
            .keyboardShortcut("k", modifiers: .command)

            Button("Button 2") {
                print("Button 2")
            }
            .keyboardShortcut("k", modifiers: .command)
        }
    }
}
In the above snippet, I have assigned the same keyboard shortcut (Cmd+K) to two different buttons. According to the docs, if multiple controls are associated with the same shortcut, the first one found is used.
How do I figure out whether Button 1 or Button 2 would be found first during the traversal?
Is it based on the order of declaration? Is it always the case that Button 1 would be found first since it was declared before Button 2?
We shipped an app update that contained no changes related to the widget. Right after updating, the widget turns black in some cases. It also appears black in the widget gallery. Removing it and adding it again did not help in this case; only after an iOS restart does it work fine again.
This is the log
2025-03-20 02:14:05.961611 +0800 Content load failed: unable to find or unarchive file for key: [com.aa.bb::com.aa.bb.widget:cc_widget:systemMedium::360.00/169.00/23.00:(null)~(null)] on no host. The session may still produce one shortly. Error: Using url file:///private/var/mobile/Containers/Data/PluginKitPlugin/51C5E4F2-6F1F-4466-A428-73C73B9CC887/SystemData/com.apple.chrono/placeholders/cc_widget/systemMedium----360.00w--169.00h--23.00r--1f--0.00t-0.00l-0.00b0.00t.chrono-timeline ... Error Domain=NSCocoaErrorDomain Code=4 "file“systemMedium----360.00w--169.00h--23.00r--1f--0.00t-0.00l-0.00b0.00t.chrono-timeline”not exist。" UserInfo={NSFilePath=/private/var/mobile/Containers/Data/PluginKitPlugin/51C5E4F2-6F1F-4466-A428-73C73B9CC887/SystemData/com.apple.chrono/placeholders/cc_widget/systemMedium----360.00w--169.00h--23.00r--1f--0.00t-0.00l-0.00b0.00t.chrono-timeline, NSUnderlyingError=0xa693d3a80 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}
Hi,
I have an existing Mac app on the App Store with a couple of widgets as part of the app. I want to now add a new widget to the WidgetBundle. When I build the updated app with Xcode, and then run the updated app, the widgets list doesn't seem to get updated in Notification Center or in the WidgetKit Simulator.
I do have the App Store version installed in the /Applications folder as well, so there might be some conflict. What's the trick to getting the widgets list to run the debug version?
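The updated bundle is essentially this (the widget names are placeholders); the new entry simply never shows up in the debug build's widget list:

import SwiftUI
import WidgetKit

@main
struct MyWidgets: WidgetBundle {
    var body: some Widget {
        ExistingWidgetA()
        ExistingWidgetB()
        NewWidget()   // added in this update, never appears in the list
    }
}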
I need to decrease the main window's size when the user opens an Immersive Space in my project.
I tried using .frame, but it doesn't change the main window's size; it only changes the view's frame.
My assumption has always been that [NSApp runModalForWindow:] runs a modal window in NSModalPanelRunLoopMode.
However, while -[NSApplication _doModalLoop:peek:] seems to use NSModalPanelRunLoopMode when pulling out the next event to process via nextEventMatchingMask:untilDate:inMode:dequeue:, the current runloop doesn't seem to be running in that mode, so during -[NSApplication(NSEventRouting) sendEvent:] of the modal-specific event, NSRunLoop.currentRunLoop.currentMode returns kCFRunLoopDefaultMode.
From what I can tell, this means that any event processing code that e.g. uses [NSTimer addTimer:forMode:] based on the current mode will register a timer that will not fire until the modal session ends.
Is this a bug? Or if not, is the correct way to run a modal session something like this?
[NSRunLoop.currentRunLoop performInModes:@[NSModalPanelRunLoopMode] block:^{
    [NSApp runModalForWindow:window];
}];
[NSRunLoop.currentRunLoop limitDateForMode:NSModalPanelRunLoopMode];
Alternatively, if the mode of the runloop should stay the same, I've seen suggestions to run modal sessions like this:
NSModalSession session = [NSApp beginModalSessionForWindow:theWindow];
for (;;) {
    if ([NSApp runModalSession:session] != NSModalResponseContinue)
        break;
    [NSRunLoop.currentRunLoop limitDateForMode:NSModalPanelRunLoopMode];
}
[NSApp endModalSession:session];
Which would work around the fact that the timer/callbacks were scheduled in the "wrong" mode. But running NSModalPanelRunLoopMode during a modal session seems a bit scary. Won't that potentially break the modality?