Explore the various UI frameworks available for building app interfaces. Discuss the use cases for different frameworks, share best practices, and get help with specific framework-related questions.

Posts under the UI Frameworks topic (all subtopics)

Post · Replies · Boosts · Views · Activity

NSTextLists not rendered when NSTextContentStorageDelegate textContentStorage(_:textParagraphWith:) is implemented
I have a UITextView that contains paragraphs with bulleted text lists (via NSTextList). I also implement NSTextContentStorageDelegate.textContentStorage(_:textParagraphWith:) in order to apply some custom attributes to the text without affecting the underlying attributed text. My implementation returns a new NSTextParagraph that modifies the foreground color of the text. I based this on the example in the WWDC21 session "Meet TextKit 2". UITextView stops rendering the bullets when I implement the delegate function and return a custom paragraph. Why?

func textContentStorage(_ textContentStorage: NSTextContentStorage, textParagraphWith range: NSRange) -> NSTextParagraph? {
    guard let originalText = textContentStorage.textStorage?.attributedSubstring(from: range) else {
        return nil
    }

    let updatedText = NSMutableAttributedString(attributedString: originalText)
    updatedText.addAttribute(.foregroundColor,
                             value: UIColor.green,
                             range: NSRange(location: 0, length: updatedText.length))

    let paragraph = NSTextParagraph(attributedString: updatedText)

    // Verify that the text still contains NSTextList
    if let paragraphStyle = paragraph.attributedString.attribute(.paragraphStyle, at: 0, effectiveRange: nil) as? NSParagraphStyle {
        assert(!paragraphStyle.textLists.isEmpty)
    } else {
        assertionFailure("Paragraph has lost its text lists")
    }

    return paragraph
}
0
0
250
May ’25
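One direction that may be worth trying for the question above: since foreground color is a rendering-only attribute, it can be applied through NSTextLayoutManager's rendering attributes instead of returning a modified paragraph, which leaves the stored paragraph style (and its textLists) untouched. A minimal sketch, assuming iOS 16+ and a TextKit 2 UITextView; the helper name is made up for illustration, and treating this as a workaround for the missing bullets is an assumption, not a confirmed fix:

import UIKit

// Sketch (not the poster's code): recolor a character range with TextKit 2
// rendering attributes so the stored attributes, including NSTextList, are
// never replaced.
@available(iOS 16.0, *)
func applyGreenRenderingAttribute(to textView: UITextView, over characterRange: NSRange) {
    guard let layoutManager = textView.textLayoutManager,
          let contentManager = layoutManager.textContentManager else { return }

    // Convert the NSRange into an NSTextRange relative to the document start.
    let docStart = contentManager.documentRange.location
    guard let start = contentManager.location(docStart, offsetBy: characterRange.location),
          let end = contentManager.location(start, offsetBy: characterRange.length),
          let textRange = NSTextRange(location: start, end: end) else { return }

    // Rendering attributes affect drawing only; layout (and the bullets) is unchanged.
    layoutManager.addRenderingAttribute(.foregroundColor, value: UIColor.green, for: textRange)
}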
ARKit Camera Feed Zoom & Macro Support for Close-Range Objects
I am currently developing an AR experience using ARKit with SceneKit and am looking to implement functionality that enables: Zooming into the AR camera feed, ideally leveraging the ultra-wide or telephoto lenses available on supported devices. Macro-style focus capabilities, allowing users to view and interact with virtual content closely aligned with small or nearby real-world objects (within a few centimeters). My objective is to ensure that ARKit continues to render the scene accurately while enabling a zoomed-in view or macro-level focus for better detail visibility and alignment. Could you please advise on: Whether ARKit currently supports camera zoom or allows access to macro or ultra-wide cameras within an ARSession. Limitations or considerations when using multi-camera setups in conjunction with ARKit. Any guidance or references to documentation or sample code would be greatly appreciated.
0
0
147
May ’25
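For the zoom/macro questions above, ARKit does not (to my knowledge) expose a zoom control on the session's camera feed; what it does expose is a choice of video formats, some of which are backed by the ultra-wide camera on supported devices. A sketch of picking such a format (captureDeviceType requires iOS 14.5+); treating format selection as a substitute for true macro focus is an assumption:

import ARKit

// Sketch: prefer a video format backed by the ultra-wide camera, if the
// device offers one; otherwise keep the default format.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()

    if #available(iOS 14.5, *) {
        if let ultraWideFormat = ARWorldTrackingConfiguration.supportedVideoFormats
            .first(where: { $0.captureDeviceType == .builtInUltraWideCamera }) {
            configuration.videoFormat = ultraWideFormat
        }
    }
    return configuration
}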
Regarding ARKit camera feed zoom and macro support for close-range objects
I am currently developing an AR experience using ARKit with SceneKit and am looking to implement functionality that enables: Zooming into the AR camera feed, ideally leveraging the ultra-wide or telephoto lenses available on supported devices. Macro-style focus capabilities, allowing users to view and interact with virtual content closely aligned with small or nearby real-world objects (within a few centimeters). My objective is to ensure that ARKit continues to render the scene accurately while enabling a zoomed-in view or macro-level focus for better detail visibility and alignment. Could you please advise on: Whether ARKit currently supports camera zoom or allows access to macro or ultra-wide cameras within an ARSession. Limitations or considerations when using multi-camera setups in conjunction with ARKit. Any guidance or references to documentation or sample code would be greatly appreciated. Best regards, Ayush
0
0
119
May ’25
Updating sort order of items in a LazyVGrid
I have a grid setup where I'm displaying multiple images, which is working fine. Images are ordered by the date they're added, newest to oldest. I'm trying to set it up so that the user can change the sort order themselves, but am having trouble getting the view to update. I'm setting the fetch request using oldest to newest as the default when initialising the view, then, when it appears, updating the sort descriptor:

struct ProjectImagesListView: View {
    @Environment(\.managedObjectContext) private var viewContext

    var project: Project

    let columns = [
        GridItem(.flexible()),
        GridItem(.flexible()),
        GridItem(.flexible()),
        GridItem(.flexible())
    ]

    @FetchRequest var pictures: FetchedResults<Picture>

    var body: some View {
        ScrollView {
            LazyVGrid(columns: columns) {
                ForEach(pictures) { picture in
                    NavigationLink(destination: ProjectImageDetailView(picture: picture)) {
                        if let pictureData = picture.pictureThumbnailData,
                           let uiImage = UIImage(data: pictureData) {
                            Image(uiImage: uiImage)
                                .resizable()
                                .scaledToFit()
                                .frame(height: 100)
                        } else {
                            Image("missing")
                                .resizable()
                                .scaledToFit()
                                .frame(height: 100)
                        }
                    }
                }
            }
        }
        .navigationBarTitle("\(project.name ?? "") Images", displayMode: .inline)
        .onAppear() {
            guard let sortOrder = getSettingForPhotoOrder() else { return }
            guard let sortOrderValue = sortOrder.settingValue else { return }
            NSLog("sortOrderPhotos: \(String(describing: sortOrder.settingValue))")
            if sortOrderValue == "Newest" {
                NSLog("sortOrderPhotos: Change from default")
                let newSortDescriptor = NSSortDescriptor(keyPath: \Picture.dateTaken, ascending: false)
                pictures.nsSortDescriptors = [newSortDescriptor]
            }
        }
    }

    func getSettingForPhotoOrder() -> Setting? {
        let fetchRequest: NSFetchRequest<Setting> = Setting.fetchRequest()
        fetchRequest.predicate = NSPredicate(format: "name = %@", "photoSortOrder")
        fetchRequest.fetchLimit = 1
        do {
            let results = try viewContext.fetch(fetchRequest)
            return results.first
        } catch {
            print("Fetching Failed")
        }
        return nil
    }

    init(project: Project) {
        self.project = project
        _pictures = FetchRequest(
            entity: Picture.entity(),
            sortDescriptors: [
                NSSortDescriptor(keyPath: \Picture.dateTaken, ascending: true)
            ],
            predicate: NSPredicate(format: "project == %@", project)
        )
    }
}
Topic: UI Frameworks SubTopic: SwiftUI
0
0
87
May ’25
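For the sort-order question above, one arrangement that avoids mutating the fetch request in onAppear is to drive FetchedResults.nsSortDescriptors (settable on iOS 15+) from UI state in onChange. A sketch assuming the post's Picture entity with an optional dateTaken attribute; the toggle stands in for the settings plumbing:

import SwiftUI
import CoreData

// Sketch: assigning nsSortDescriptors re-runs the fetch and updates the view.
struct SortablePicturesView: View {
    @FetchRequest(
        entity: Picture.entity(),
        sortDescriptors: [NSSortDescriptor(keyPath: \Picture.dateTaken, ascending: true)]
    )
    private var pictures: FetchedResults<Picture>

    @State private var newestFirst = false

    var body: some View {
        List(pictures) { picture in
            Text(picture.dateTaken?.formatted() ?? "Undated")
        }
        .toolbar { Toggle("Newest first", isOn: $newestFirst) }
        .onChange(of: newestFirst) { newestFirst in
            pictures.nsSortDescriptors = [
                NSSortDescriptor(keyPath: \Picture.dateTaken, ascending: !newestFirst)
            ]
        }
    }
}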
AVAudioSession dropping wired USBAudio source
My app inputs electrical waveforms from an IV485B39 2-channel USB device using an AVAudioSession. Before attempting to acquire data I make sure the input device is available as follows:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:&err];
NSArray *inputs = [audioSession availableInputs];

I have been using this code for about 10 years. My app is scriptable, so a user can acquire data from the IV485B39 multiple times with various parameter settings (sampling rates and sample duration). Recently the scripts have been failing to complete, and what I have noticed is that when they fail, the list of available inputs is missing the USBAudio input. While debugging I have noticed that when working properly the list of inputs includes both the internal microphone and the USBAudio device, as shown below.

VIB_TimeSeriesViewController:***Available inputs = (
    "<AVAudioSessionPortDescription: 0x11584c7d0, type = MicrophoneBuiltIn; name = iPad Microphone; UID = Built-In Microphone; selectedDataSource = Front>",
    "<AVAudioSessionPortDescription: 0x11584cae0, type = USBAudio; name = 485B39 200095708064650803073200616; UID = AppleUSBAudioEngine:Digiducer.com :485B39 200095708064650803073200616:000957 200095708064650803073200616:1; selectedDataSource = (null)>"
)

But when it fails I only see the built-in microphone.

VIB_TimeSeriesViewController:***Available inputs = (
    "<AVAudioSessionPortDescription: 0x11584cef0, type = MicrophoneBuiltIn; name = iPad Microphone; UID = Built-In Microphone; selectedDataSource = Front>"
)

If I only see the built-in microphone, I immediately repeat the three lines of code, and most of the time "inputs" then contains both the internal microphone and the USBAudio device:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:&err];
NSArray *inputs = [audioSession availableInputs];

This fix always works on my M2 iPad Pro and my iPhone 14, but some of my customers have older devices, and even with 3 tries they still get faults about 1 in 10 tries. I rolled back my code to a released version from about 12 months ago where I know we never had this problem, compiled it against the current libraries, and the problem still exists. I assume this is a problem caused by a change in the AVAudioSession framework libraries. I need to find a way to work around the issue or get the library fixed.
0
0
95
May ’25
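The retry loop in the post above suggests the USB input appears slightly after the session is configured. Below is a sketch of an alternative that waits for a route change instead of retrying a fixed number of times. The notification name and port type are standard AVAudioSession API; the timeout policy, and the premise that a route change fires when the accessory becomes available, are assumptions rather than a known fix:

import AVFoundation

// Sketch: call completion with the USB input once it shows up, or nil on timeout.
final class USBInputWaiter {
    private var observer: NSObjectProtocol?

    func waitForUSBInput(timeout: TimeInterval = 2.0,
                         completion: @escaping (AVAudioSessionPortDescription?) -> Void) {
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.record)

        // Already present? Done.
        if let usb = session.availableInputs?.first(where: { $0.portType == .usbAudio }) {
            completion(usb)
            return
        }

        // The available-inputs list can change asynchronously; listen for route changes.
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: session,
            queue: .main
        ) { [weak self] _ in
            if let usb = session.availableInputs?.first(where: { $0.portType == .usbAudio }) {
                if let observer = self?.observer { NotificationCenter.default.removeObserver(observer) }
                self?.observer = nil
                completion(usb)
            }
        }

        // Give up after the timeout so scripted runs never hang.
        DispatchQueue.main.asyncAfter(deadline: .now() + timeout) { [weak self] in
            if let observer = self?.observer {
                NotificationCenter.default.removeObserver(observer)
                self?.observer = nil
                completion(nil)
            }
        }
    }
}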
A wrinkle converting a UIKit Document-based app to SwiftUI Document Group
The app I'm converting includes two unique document types. UI-wise they have key similarities (e.g. contents are password protected), but serialization/model-wise they are different documents. I have not been able to find any documentation on options for implementing this (e.g. use an (abstract?) base class derived from FileDocument with two concrete subclasses? or maybe just a single subclass of FileDocument that contains model details for both file types?). Stepping back from implementation options, am I crazy for attempting to use DocumentGroup to create a single app that would need to be able to open/modify/save multiple unique document types? Any/all guidance much appreciated.
Topic: UI Frameworks SubTopic: SwiftUI
0
0
67
May ’25
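On the multiple-document-types question above: a SwiftUI App can declare more than one DocumentGroup scene, one per document type, which avoids forcing both models into a single FileDocument. A minimal sketch with placeholder type names and UTIs (not the poster's model); the corresponding exported types would still need to be declared in Info.plist:

import SwiftUI
import UniformTypeIdentifiers

// Sketch: two independent FileDocument types, each with its own DocumentGroup.
struct LedgerDocument: FileDocument {
    static let readableContentTypes: [UTType] = [.init(exportedAs: "com.example.ledger")] // placeholder UTI
    var data = Data()
    init() {}
    init(configuration: ReadConfiguration) throws {
        data = configuration.file.regularFileContents ?? Data()
    }
    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        FileWrapper(regularFileWithContents: data)
    }
}

struct NotesDocument: FileDocument {
    static let readableContentTypes: [UTType] = [.init(exportedAs: "com.example.notes")] // placeholder UTI
    var data = Data()
    init() {}
    init(configuration: ReadConfiguration) throws {
        data = configuration.file.regularFileContents ?? Data()
    }
    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        FileWrapper(regularFileWithContents: data)
    }
}

@main
struct TwoDocTypesApp: App {
    var body: some Scene {
        // The system routes an opened file to the group whose document type matches it.
        DocumentGroup(newDocument: LedgerDocument()) { file in
            Text("Ledger editor for \(file.fileURL?.lastPathComponent ?? "untitled")")
        }
        DocumentGroup(newDocument: NotesDocument()) { file in
            Text("Notes editor for \(file.fileURL?.lastPathComponent ?? "untitled")")
        }
    }
}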
Mixing NavigationLink types (value: and non-value: types)
Hello, I was wondering if someone could clear up my thinking here. E.g. consider the code below. It has a RootView with a NavigationLink to a ChildView, which in turn has links to GrandchildViews. The root view uses basic links, NavigationLink { View } label: { View }. The child view uses value-based links, NavigationLink(value:) { View } with .navigationDestination(for:) { View }. I would expect the basic links to work in the root view and the value-based ones to work in the child view. However, it appears that both are active when one taps on a link in the child view. E.g. user actions: Start -> RootView is the only view on the stack -> (tap on 'Child View') -> ChildView is top of the stack -> tap on 'Alice' -> a second ChildView is top of the stack with a GrandchildView underneath… Why does this happen? Why are the basic links also applied to the child view's links? Thanks.

struct Thing: Identifiable, Hashable {
    let id = UUID()
    let name: String
}

struct RootView: View {
    var body: some View {
        NavigationStack {
            List {
                NavigationLink {
                    ChildView()
                } label: {
                    Label("Child View", systemImage: "figure.and.child.holdinghands")
                }
                NavigationLink {
                    Text("Hello")
                } label: {
                    Label("Another navLink item in the list", systemImage: "circle")
                }
            }
            .padding()
        }
    }
}

struct ChildView: View {
    private var things = [
        Thing(name: "Alice"),
        Thing(name: "Bob"),
        Thing(name: "Charlie"),
    ]

    var body: some View {
        Text("This is the child view")
        List {
            ForEach(things) { thing in
                NavigationLink(value: thing) {
                    Text(thing.name)
                }
            }
        }
        .navigationTitle("Child View")
        .navigationDestination(for: Thing.self) { thing in
            GrandchildView(thing: thing)
        }
    }
}

struct GrandchildView: View {
    let thing: Thing
    var body: some View {
        Text("This is the GrandchildView: \(thing.name)")
    }
}
Topic: UI Frameworks SubTopic: SwiftUI
0
0
128
Mar ’25
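For the mixed-links question above, one consistent arrangement is to make every link value-based and declare all destinations on the stack's root content. The sketch below uses the post's Thing and GrandchildView types; it does not explain the observed double push, it just avoids mixing the two link styles in the same hierarchy:

import SwiftUI

// Sketch: one route enum for the root's rows, all destinations declared once
// at the stack root, value-based links everywhere.
enum Route: Hashable { case child, hello }

struct RootView2: View {
    var body: some View {
        NavigationStack {
            List {
                NavigationLink(value: Route.child) {
                    Label("Child View", systemImage: "figure.and.child.holdinghands")
                }
                NavigationLink(value: Route.hello) {
                    Label("Another item", systemImage: "circle")
                }
            }
            .navigationDestination(for: Route.self) { route in
                switch route {
                case .child: ChildView2()
                case .hello: Text("Hello")
                }
            }
            .navigationDestination(for: Thing.self) { thing in
                GrandchildView(thing: thing)
            }
        }
    }
}

struct ChildView2: View {
    private let things = [Thing(name: "Alice"), Thing(name: "Bob")]
    var body: some View {
        List(things) { thing in
            NavigationLink(value: thing) { Text(thing.name) }
        }
        .navigationTitle("Child View")
    }
}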
AppEntity with @Parameter Options Works in Shortcuts App but Not with Siri
I’m working with AppIntents and AppEntity to integrate my app’s data model into Shortcuts and Siri. In the example below, I define a custom FoodEntity and use it as a @Parameter in an AppIntent. I’m providing dynamic options for this parameter via an optionsProvider. In the Shortcuts app, everything works as expected: when the user runs the shortcut, they get a list of food options (from the dynamic provider) to select from. However, in Siri, the experience is different. Instead of showing the list of options, Siri asks the user to say the name of the food, and then tries to match it using EntityStringQuery. I originally assumed this might be a design decision to allow hands-free use with voice, but I found that if you use an AppEnum instead, Siri does present a tappable list of options. So now I’m wondering: why the difference? Is there a way to get the @Parameter with AppEntity + optionsProvider to show a tappable list in Siri like it does in Shortcuts or with an AppEnum? Any clarification on how EntityQuery.suggestedEntities() and DynamicOptionsProvider interact with Siri would be appreciated!

struct CaloriesShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddCaloriesInteractive(),
            phrases: [
                "Add to \(.applicationName)"
            ],
            shortTitle: "Calories",
            systemImageName: "fork"
        )
    }
}

struct AddCaloriesInteractive: AppIntent {
    static var title: LocalizedStringResource = "Add to calories log"
    static var description = IntentDescription("Add Calories using Shortcuts.")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Calorie Entry SUMMARY")
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(stringLiteral: "Add to calorie log")
    }

    @Dependency
    private var persistenceManager: PersistenceManager

    @Parameter(title: LocalizedStringResource("Food"), optionsProvider: FoodEntityOptions())
    var foodEntity: FoodEntity

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        return .result(dialog: .init("Added \(foodEntity.name) to calorie log"))
    }
}

struct FoodEntity: AppEntity {
    static var defaultQuery = FoodEntityQuery()

    @Property var name: String
    @Property var calories: Int

    init(name: String, calories: Int) {
        self.name = name
        self.calories = calories
    }

    static var typeDisplayRepresentation: TypeDisplayRepresentation {
        TypeDisplayRepresentation(name: "Calorie Entry")
    }

    static var typeDisplayName: LocalizedStringResource = "Calorie Entry"

    var displayRepresentation: AppIntents.DisplayRepresentation {
        DisplayRepresentation(title: .init(stringLiteral: name), subtitle: "\(calories)")
    }

    var id: String { return name }
}

struct FoodEntityQuery: EntityQuery {
    func entities(for identifiers: [FoodEntity.ID]) async throws -> [FoodEntity] {
        var result = [FoodEntity]()
        for identifier in identifiers {
            if let entity = FoodDatabase.allEntities().first(where: { $0.id == identifier }) {
                result.append(entity)
            }
        }
        return result
    }

    func suggestedEntities() async throws -> [FoodEntity] {
        return FoodDatabase.allEntities()
    }
}

extension FoodEntityQuery: EntityStringQuery {
    func entities(matching string: String) async throws -> [FoodEntity] {
        return FoodDatabase.allEntities().filter({ $0.name.localizedCaseInsensitiveCompare(string) == .orderedSame })
    }
}

struct FoodEntityOptions: DynamicOptionsProvider {
    func results() async throws -> ItemCollection<FoodEntity> {
        ItemCollection {
            ItemSection("Section 1") {
                for entry in FoodDatabase.allEntities() {
                    entry
                }
            }
        }
    }
}

struct FoodDatabase {
    // Fake data
    static func allEntities() -> [FoodEntity] {
        [
            FoodEntity(name: "Orange", calories: 2),
            FoodEntity(name: "Banana", calories: 2)
        ]
    }
}
0
1
95
May ’25
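For the Siri matching question above, one small improvement that is safe to make regardless of how Siri presents options is a more forgiving string query than the exact compare in the post, so spoken input still resolves. This does not change whether Siri shows a tappable list; it only widens matching. The sketch below is a standalone variant of the post's query (declared as a separate type only so the snippet stands on its own, reusing the post's FoodEntity and FoodDatabase):

import AppIntents

// Sketch: case-insensitive "contains" matching instead of an exact compare.
struct LenientFoodEntityQuery: EntityStringQuery {
    func entities(for identifiers: [FoodEntity.ID]) async throws -> [FoodEntity] {
        FoodDatabase.allEntities().filter { identifiers.contains($0.id) }
    }

    func suggestedEntities() async throws -> [FoodEntity] {
        FoodDatabase.allEntities()
    }

    func entities(matching string: String) async throws -> [FoodEntity] {
        FoodDatabase.allEntities().filter {
            $0.name.localizedCaseInsensitiveContains(string)
        }
    }
}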
How to remove an AppIntent dialog programmatically?
When the perform method of my AppIntent returns a dialog with a custom view, and I then click the "Click Test" button, my app is launched but the dialog does not close. How can I close it?

struct QuestionResultView: View {
    var body: some View {
        VStack {
            if #available(iOS 17.0, *) {
                Button(role: .cancel, intent: OpenAppIntent()) {
                    Text("Click Test")
                }
            }
        }
        .frame(height: 300)
    }
}

struct OpenAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Open my app"
    static let openAppWhenRun: Bool = true
    static let isDiscoverable: Bool = false

    @MainActor
    func perform() async throws -> some IntentResult {
        return .result()
    }
}

struct OpenPhotoRecognizing: AppIntent {
    static let title: LocalizedStringResource = "Read photo"
    static let description = IntentDescription("")
    static let openAppWhenRun: Bool = false

    func perform() async throws -> some IntentResult & ShowsSnippetView & ProvidesDialog {
        return .result(dialog: "Demo Test") {
            DemoResultView()
        }
    }
}
Topic: UI Frameworks SubTopic: SwiftUI
0
0
72
May ’25
SwiftUI Playground on iPad won't save a .png image using fileExporter
The SwiftUI Playground code below demonstrates that a .jpeg image can be read from and written to the iOS file system, while a .png image can only be read; the write request appears to be ignored. Can anyone please tell me how to save a .png image to the iOS file system using SwiftUI?

Code:

import SwiftUI
import UniformTypeIdentifiers

/*
 (Copied from Playground 'Help' menu popup.)

 UIImage
 Summary: An object that manages image data in your app.
 You use image objects to represent image data of all kinds, and the UIImage class is capable of managing data for all image formats supported by the underlying platform. Image objects are immutable, so you always create them from existing image data, such as an image file on disk or programmatically created image data. An image object may contain a single image or a sequence of images for use in an animation.
 You can use image objects in several different ways:
 - Assign an image to a UIImageView object to display the image in your interface.
 - Use an image to customize system controls such as buttons, sliders, and segmented controls.
 - Draw an image directly into a view or other graphics context.
 - Pass an image to other APIs that might require image data.
 Although image objects support all platform-native image formats, it's recommended that you use PNG or JPEG files for most images in your app. Image objects are optimized for reading and displaying both formats, and those formats offer better performance than most other image formats. Because the PNG format is lossless, it's especially recommended for the images you use in your app's interface.

 Declaration: class UIImage : NSObject
 UIImage Class Reference
 */

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ImageFileDoc: FileDocument {
    static var readableContentTypes = [UTType.jpeg, UTType.png]
    static var writableContentTypes = [UTType.jpeg, UTType.png]

    var someUIImage: UIImage = UIImage()

    init(initialImage: UIImage = UIImage()) {
        self.someUIImage = initialImage
    }

    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let some = UIImage(data: data) else {
            throw CocoaError(.fileReadCorruptFile)
        }
        self.someUIImage = some
    }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        switch configuration.contentType {
        case UTType.png:
            if let data = self.someUIImage.pngData() {
                return .init(regularFileWithContents: data)
            }
        case UTType.jpeg:
            if let data = self.someUIImage.jpegData(compressionQuality: 1.0) {
                return .init(regularFileWithContents: data)
            }
        default:
            break
        }
        throw CocoaError(.fileWriteUnknown)
    }
}

struct ContentView: View {
    @State private var showingExporterPNG = false
    @State private var showingExporterJPG = false
    @State private var showingImporter = false
    @State var message = "Hello, World!"
    @State var document: ImageFileDoc = ImageFileDoc()
    @State var documentExtension = ""

    var body: some View {
        VStack {
            Image(systemName: "globe")
                .imageScale(.large)
                .foregroundColor(.accentColor)

            Text(message)

            Button("export") {
                if documentExtension == "png" {
                    message += ", showingExporterPNG is true."
                    showingExporterPNG = true
                }
                if documentExtension == "jpeg" {
                    message += ", showingExporterJPG is true."
                    showingExporterJPG = true
                }
            }
            .padding(20)
            .border(.white, width: 2.0)
            .disabled(documentExtension == "")

            Button("import") {
                showingImporter = true
            }
            .padding(20)
            .border(.white, width: 2.0)

            Image(uiImage: document.someUIImage)
                .resizable()
                .padding()
                .frame(width: 300, height: 300)
        }
        // exporter .png
        .fileExporter(isPresented: $showingExporterPNG, document: document, contentType: UTType.png) { result in
            switch result {
            case .success(let url):
                message += ", .\(documentExtension) Saved to \(url.lastPathComponent)"
            case .failure(let error):
                message += ", Some error saving file: " + error.localizedDescription
            }
        }
        // exporter .jpeg
        .fileExporter(isPresented: $showingExporterJPG, document: document, contentType: UTType.jpeg) { result in
            switch result {
            case .success(let url):
                message += ", .\(documentExtension) Saved to \(url.lastPathComponent)"
            case .failure(let error):
                message += ", Some error saving file: " + error.localizedDescription
            }
        }
        // importer
        .fileImporter(isPresented: $showingImporter, allowedContentTypes: [.png, .jpeg]) { result in
            switch result {
            case .failure(let error):
                message += ", Some error reading file: " + error.localizedDescription
            case .success(let url):
                let gotAccess = url.startAccessingSecurityScopedResource()
                if !gotAccess {
                    message += ", Unable to Access \(url.lastPathComponent)"
                    return
                }
                documentExtension = url.pathExtension
                guard let fileContents = try? Data(contentsOf: url) else {
                    message += ",\n\nUnable to read file: \(url.lastPathComponent)\n\n"
                    url.stopAccessingSecurityScopedResource()
                    return
                }
                url.stopAccessingSecurityScopedResource()
                message += ", Read file: \(url.lastPathComponent)"
                message += ", path extension is '\(documentExtension)'."
                if let uiImage = UIImage(data: fileContents) {
                    self.document.someUIImage = uiImage
                } else {
                    message += ", File Content is not an Image."
                }
            }
        }
    }
}
0
0
372
Feb ’25
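A way to narrow down the .png question above is to test the export path with a PNG-only document, separately from the mixed jpeg/png document. The sketch below is a diagnostic, not a known fix; the type and view names are placeholders. If this saves correctly in the same Playground, the problem likely sits in how the mixed document negotiates its contentType rather than in PNG writing itself:

import SwiftUI
import UniformTypeIdentifiers

// Sketch: a document that only knows about PNG.
struct PNGDocument: FileDocument {
    static let readableContentTypes: [UTType] = [.png]
    static let writableContentTypes: [UTType] = [.png]

    var image: UIImage

    init(image: UIImage = UIImage()) { self.image = image }

    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let image = UIImage(data: data) else {
            throw CocoaError(.fileReadCorruptFile)
        }
        self.image = image
    }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        guard let data = image.pngData() else { throw CocoaError(.fileWriteUnknown) }
        return FileWrapper(regularFileWithContents: data)
    }
}

struct PNGExportButton: View {
    @State private var exporting = false
    let document: PNGDocument

    var body: some View {
        Button("Export PNG") { exporting = true }
            .fileExporter(isPresented: $exporting,
                          document: document,
                          contentType: .png,
                          defaultFilename: "Export.png") { result in
                print(result)
            }
    }
}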
CPTabBarTemplate in CarPlay Simulator: Tab Becomes Inactive on Re-selection
I am facing an issue in my CarPlay app using CPTabBarTemplate. The app has two tabs, and on launch, the first tab is correctly selected. However, when I tap on the first tab again, instead of staying active, it becomes inactive. This behavior is unexpected, as re-selecting the active tab should typically maintain its selected state. Has anyone else encountered this issue or found a workaround to prevent the tab from becoming inactive?
0
0
58
May ’25
macOS 15: NSSavePanel occasionally returns the wrong path when saving files
macOS: 15.0, macFUSE: 4.8.3. I am using rclone + macFUSE and mount my netdisk, which has three subdirectories in its root directory: /user, /share, and /group. When I save a file to /[root]/user using NSSavePanel and name it test.txt, I expect the file to be saved as /[root]/user/test.txt. However, occasionally, the delegate method

- (BOOL)panel:(id)sender validateURL:(NSURL *)url error:(NSError **)outError

returns an incorrect path: /[root]/test.txt. This issue only occurs when selecting /user; the same operation works correctly for /share and /group. Are there any logs I could provide to help solve this issue? Many thanks!
Topic: UI Frameworks SubTopic: AppKit
0
0
390
Feb ’25
Issue with Session Sharing Between Safari and ASWebAuthenticationSession
We are experiencing an issue with session sharing on iOS and would appreciate your guidance. We operate and control our own OpenID Connect (OIDC) server. Our iOS application uses ASWebAuthenticationSession to authenticate users. We're unable to get the authentication session to be shared between the Safari app and the app's ASWebAuthenticationSession. This results in users having to re-authenticate despite being logged in via Safari. We've attempted various configurations related to cookie SameSite settings. These adjustments resolved the session sharing issue on Android using Chrome Custom Tabs. However, no changes we've tried have enabled session sharing to work as expected on iOS. According to documentation from Apple, Microsoft, Okta, and Auth0, session sharing between Safari and ASWebAuthenticationSession should work. Question: Are there any additional settings, configurations, or platform limitations we should be aware of that could impact session sharing on iOS? Where else can we look to resolve this issue?
Topic: UI Frameworks SubTopic: General
0
0
131
May ’25
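For the session-sharing question above, the one client-side switch that directly controls sharing with Safari is prefersEphemeralWebBrowserSession: it must be false (the default) for any cookie or session sharing to be possible. A minimal sketch; the URL, callback scheme, and context provider are placeholders, and this alone may not resolve server-side cookie (SameSite) or ITP-related issues:

import AuthenticationServices

// Sketch: start an authentication session that is allowed to share browsing
// data with the user's normal Safari session.
func startLogin(authorizeURL: URL,
                callbackScheme: String,
                presentationContext: ASWebAuthenticationPresentationContextProviding) {
    let session = ASWebAuthenticationSession(url: authorizeURL,
                                             callbackURLScheme: callbackScheme) { callbackURL, error in
        print(String(describing: callbackURL), String(describing: error))
    }
    session.presentationContextProvider = presentationContext
    // false (the default) permits sharing cookies/session data with Safari;
    // true forces a private session with no sharing at all.
    session.prefersEphemeralWebBrowserSession = false
    session.start()
}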
How to correctly and simply remove the edges of listStyle sidebar?
Hello, I've managed to get rid of these spaces in different ways: using a ScrollView, giving negative insets, rewriting modifiers from scratch with the plain style, etc. But I couldn't solve this with a simple solution. I've read comments from many people experiencing similar problems online. It seems like there isn't a simple modifier to remove these spaces when we use sidebar as the list style in SwiftUI, or I couldn't find it. I wonder what's the simplest and correct way to remove these spaces?

struct ContentView: View { // enclosing view restored; the original snippet omitted this line
    let numbers = Array(1...5)
    @State private var selected: Int?

    var body: some View {
        List(numbers, id: \.self, selection: $selected) { number in
            HStack {
                Text("Test")
                Spacer()
            }
            .frame(maxWidth: .infinity, alignment: .leading)
        }
        .listStyle(.sidebar)
    }
}
0
0
190
Mar ’25
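For the sidebar-spacing question above, the smallest lever is usually zeroing the per-row insets with listRowInsets. A sketch; whether it removes every bit of the sidebar style's leading inset on all OS versions is not guaranteed, so treat it as a starting point:

import SwiftUI

// Sketch: same list as the post, with the default row padding removed.
struct EdgelessSidebarList: View {
    let numbers = Array(1...5)
    @State private var selected: Int?

    var body: some View {
        List(numbers, id: \.self, selection: $selected) { number in
            HStack {
                Text("Test \(number)")
                Spacer()
            }
            .listRowInsets(EdgeInsets())   // remove the row's default insets
        }
        .listStyle(.sidebar)
    }
}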
NavigationSplitView and NavigationPaths
A NavigationStack with a singular enum for .navigationDestination() works fine. Both NavigationLinks(value:) and directly manipulating the NavigationPath work fine for moving around views. Zero problems. The issue is when we instead use a NavigationSplitView, I've only dabbled with two-column splits (sidebar and detail) so far. Now, if the sidebar has its own NavigationStack, everything works nicely on an iPhone, but on an iPad, you can't push views onto the detail from the sidebar. (They're pushed on the sidebar) You can solve this by keeping a NavigationStack ONLY on the detail. Sidebar links now properly push onto the detail, and the detail can move around views by itself. However, if you mix NavigationLink(value:) with manually changing NavigationPath, it stops working with no error. If you only use links, you're good, if you only change the NavigationPath you're good. Mixing doesn't work. No error in the console either, the breakpoints hit .navigationDestination and the view is returned, but never piled up. (Further attempts do show the NavigationPath is being changed properly, but views aren't changing) This problem didn't happen when just staying on NavigationStack without a NavigationSplitView. Why mix? There's a few reasons to do so. NavigationLinks put the appropriate disclosure indicator (can't replicate its look 100% without it), while NavigationPaths let you trigger navigation without user input (.onChange, etc) Any insights here? I'd put some code samples but there's a metric ton of options I've tested here.
0
0
200
Mar ’25
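For the question above, one layout to try is a single NavigationPath owned by the detail column's NavigationStack, with both NavigationLink(value:) taps and programmatic appends feeding that same path. This is a sketch of that single-source-of-truth arrangement, not a confirmed fix for the mixed-navigation bug described:

import SwiftUI

// Sketch: sidebar drives selection; the detail column owns the one path.
struct SplitRoot: View {
    @State private var path = NavigationPath()
    @State private var selection: Int?

    var body: some View {
        NavigationSplitView {
            List(Array(1...5), id: \.self, selection: $selection) { item in
                Text("Item \(item)")
            }
        } detail: {
            NavigationStack(path: $path) {
                DetailHome(selection: selection, path: $path)
                    .navigationDestination(for: String.self) { Text("Pushed: \($0)") }
            }
        }
    }
}

struct DetailHome: View {
    let selection: Int?
    @Binding var path: NavigationPath

    var body: some View {
        VStack(spacing: 16) {
            Text("Detail for \(selection.map { String($0) } ?? "nothing")")
            // Link-driven navigation: appends "link" to the same path.
            NavigationLink("Push via link", value: "link")
            // Programmatic navigation: appends to the same path directly.
            Button("Push via path") { path.append("button") }
        }
    }
}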
How to Use AccessorySetupKit to Automatically Trigger Hardware Pairing Screen in the Background?
Hello everyone, I’m currently developing an iOS app and would like to leverage the AccessorySetupKit framework introduced in iOS 18 to implement pairing functionality with our company's custom hardware product. The specific requirements are as follows: Our hardware supports both Bluetooth and Wi-Fi connections, and both are enabled. When the hardware device is in proximity to an iPhone, I want the device to be automatically recognized, and a pairing screen similar to the one in AccessorySetupKit should appear. Users should be able to perform the pairing process without needing to open our app, even if the app is not in the foreground. The system-level pairing screen should show the hardware information and allow the user to proceed with the pairing. My questions are: Does AccessorySetupKit allow the pairing screen to trigger when the app is running in the background, or must the app be in the foreground? How should I configure AccessorySetupKit to automatically recognize and display my company’s hardware device information? Are there any specific configurations or code implementations needed? Do I need to implement any specific Bluetooth/Wi-Fi advertising broadcasts to ensure the device is correctly detected by the iOS system when in proximity? Are there any additional permissions or configurations required, especially for handling background tasks? Thank you very much for your help, and I look forward to your advice and insights!
Topic: UI Frameworks SubTopic: UIKit
0
0
78
May ’25
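For the AccessorySetupKit questions above, a minimal foreground pairing sketch follows. The service UUID, accessory name, and image are placeholders, and the API shapes (ASAccessorySession.activate(on:eventHandler:), showPicker(for:completionHandler:), ASPickerDisplayItem, ASDiscoveryDescriptor) are written from memory of the iOS 18 framework and should be checked against the documentation. Note that showPicker is called by the app itself, so this sketch does not cover triggering the picker from the background:

import AccessorySetupKit
import CoreBluetooth
import UIKit

// Sketch (iOS 18+): present the AccessorySetupKit picker for a Bluetooth accessory.
final class AccessoryPairingController {
    private let session = ASAccessorySession()

    func start() {
        session.activate(on: DispatchQueue.main) { event in
            // React to accessory lifecycle events (added, removed, changed, …).
            print("AccessorySetupKit event: \(event.eventType)")
        }
    }

    func presentPicker() {
        let descriptor = ASDiscoveryDescriptor()
        descriptor.bluetoothServiceUUID = CBUUID(string: "1234") // placeholder service UUID

        let item = ASPickerDisplayItem(
            name: "Example Accessory",                    // placeholder name
            productImage: UIImage(systemName: "sensor")!, // placeholder image
            descriptor: descriptor
        )

        session.showPicker(for: [item]) { error in
            if let error { print("Picker failed: \(error)") }
        }
    }
}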
Button Behavior between fullScreenCover and sheet
The behavior of a Button inside a ScrollView differs depending on how the view is presented modally. When the view is presented as a .fullScreenCover, if the button is touched and the user scrolls without releasing the finger, the touch event is canceled and the Button's action is not called. On the other hand, if the view is presented as a .sheet, the touch event is not canceled even if the view is scrolled without lifting the finger, and the action is called when the finger is released. In order to prevent accidental interaction, I feel that the behavior of .fullScreenCover is better, as it cancels the event immediately when scrolling. Can I change the behavior of .sheet? Demo movie is here: https://x.com/kenmaz/status/1896498312737611891

Sample code:

import SwiftUI

@main
struct SampleApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State private var showSheet = false
    @State private var showFullScreen = false

    var body: some View {
        VStack(spacing: 16) {
            Button("Sheet") {
                showSheet.toggle()
            }
            Button("Full screen") {
                showFullScreen.toggle()
            }
        }
        .sheet(isPresented: $showSheet) {
            SecondView()
        }
        .fullScreenCover(isPresented: $showFullScreen) {
            SecondView()
        }
        .font(.title)
    }
}

struct SecondView: View {
    @Environment(\.dismiss) var dismiss

    var body: some View {
        ScrollView {
            Button("Dismiss") {
                dismiss()
            }
            .buttonStyle(MyButtonStyle())
            .padding(.top, 128)
            .font(.title)
        }
    }
}

private struct MyButtonStyle: ButtonStyle {
    func makeBody(configuration: Self.Configuration) -> some View {
        configuration.label
            .foregroundStyle(.red)
            .background(configuration.isPressed ? .gray : .clear)
    }
}
Topic: UI Frameworks SubTopic: SwiftUI
0
0
189
Mar ’25
Reliable APIs to check if a Hotkey/Shortcut is already in use?
In our application we have two use cases for a hotkey/shortcut identification API/method. We ship some predefined shortcuts with our macOS application. They may or may not change dynamically, based on what the user has already set as shortcuts/hotkeys, and also to avoid any important system-wide shortcuts that the user may or may not have changed. We also allow the user to customize the shortcuts/hotkeys in our application, so we want to show which shortcuts the user already has in use system-wide and across their OS experience. This gives rise to the need for an API that tells us which shortcuts/hotkeys are currently in use by the user, as well as the current system-wide OS shortcuts. Please let me know if there are any APIs in AppKit or SwiftUI we can use for the above.
0
0
227
Mar ’25
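For the question above, AppKit and SwiftUI do not offer an obvious "is this shortcut taken?" query that I know of; the closest system-level data point is the Carbon call CopySymbolicHotKeys, which lists the system's symbolic hot keys and whether each is enabled. A sketch follows; the constant names come from the HIToolbox headers, and this does not surface shortcuts that other apps register privately:

import Carbon.HIToolbox

// Sketch: enumerate the system-defined (symbolic) hot keys.
struct SystemHotKey {
    let keyCode: Int
    let modifiers: Int
    let enabled: Bool
}

func systemSymbolicHotKeys() -> [SystemHotKey] {
    var unmanagedArray: Unmanaged<CFArray>?
    guard CopySymbolicHotKeys(&unmanagedArray) == noErr,
          let entries = unmanagedArray?.takeRetainedValue() as? [[String: Any]] else {
        return []
    }
    return entries.map { entry in
        SystemHotKey(
            keyCode: entry[kHISymbolicHotKeyCode as String] as? Int ?? -1,
            modifiers: entry[kHISymbolicHotKeyModifiers as String] as? Int ?? 0,
            enabled: entry[kHISymbolicHotKeyEnabled as String] as? Bool ?? false
        )
    }
}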