Hello!
I'm trying to set UIRefreshControl.tintColor:
.onAppear {
    UIRefreshControl.appearance().tintColor = UIColor.systemBlue
}
But instead of the color in the first screenshot, I get the one in the second. The color in the second picture is a high-contrast version of the first one. I can't understand why it works this way.
I also tried the following.
UIRefreshControl.appearance().tintColor = UIColor(red: 0, green: 0.478, blue: 1, alpha: 1) // doesn't work
UIRefreshControl.appearance().tintColor = UIColor(named: "RefreshControlColor") // doesn't work; the color asset has "High Contrast" enabled and Universal set to systemBlueColor
Perhaps I missed something?
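For reference, here is a minimal sketch of the kind of setup involved; the list contents and the refresh work are placeholders, not my real code:

import SwiftUI
import UIKit

struct RefreshableListView: View {
    @State private var items = ["One", "Two", "Three"]   // placeholder data

    var body: some View {
        List(items, id: \.self) { Text($0) }
            .refreshable {
                // Placeholder refresh work.
                try? await Task.sleep(nanoseconds: 500_000_000)
            }
            .onAppear {
                // The appearance proxy affects every UIRefreshControl created afterwards.
                UIRefreshControl.appearance().tintColor = UIColor.systemBlue
            }
    }
}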
In my CarPlay app, I am hiding the navigation bar by using the following:
self.mapTemplate?.automaticallyHidesNavigationBar = true
self.mapTemplate?.hidesButtonsWithNavigationBar = false
I don't want the navigation bar to show unless a user interacts with the map by tapping it.
Strangely, when I present a CPNavigationAlert the navigation bar will often appear and then disappear after the alert is dismissed.
Is there a setting or reason that the navigation bar would be appearing when presenting this alert? I would like to keep the nav bar hidden during this time.
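For context, the alert is presented roughly like this (the title, image, and duration are placeholders, not my exact values):

import CarPlay

func presentArrivalAlert(on mapTemplate: CPMapTemplate) {
    let dismiss = CPAlertAction(title: "OK", style: .default) { _ in }
    let alert = CPNavigationAlert(
        titleVariants: ["Arriving soon"],   // placeholder text
        subtitleVariants: [],
        image: nil,
        primaryAction: dismiss,
        secondaryAction: nil,
        duration: 5)                        // seconds; placeholder
    mapTemplate.present(navigationAlert: alert, animated: true)
}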
I'm a student and a hobbyist developer.
I have a problem using a custom PNG as the thumb of a slider that controls the volume of an audio file in my app. The slider built into Swift runs fine. When I use a custom PNG, it shows in the GUI, but when I move the thumb to the right it disappears beyond the maximum, and when I move it to the left the minimum sits at the middle of the slider scale.
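If it helps to picture it, this is roughly the kind of setup I mean, assuming the custom thumb is set on a UISlider wrapped for SwiftUI; the asset name "SliderThumb" is a placeholder:

import SwiftUI
import UIKit

struct VolumeSlider: UIViewRepresentable {
    @Binding var volume: Float

    func makeUIView(context: Context) -> UISlider {
        let slider = UISlider()
        slider.minimumValue = 0
        slider.maximumValue = 1
        // Custom thumb image; "SliderThumb" is a placeholder asset name.
        slider.setThumbImage(UIImage(named: "SliderThumb"), for: .normal)
        slider.addTarget(context.coordinator,
                         action: #selector(Coordinator.valueChanged(_:)),
                         for: .valueChanged)
        return slider
    }

    func updateUIView(_ uiView: UISlider, context: Context) {
        uiView.value = volume
    }

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject {
        var parent: VolumeSlider
        init(_ parent: VolumeSlider) { self.parent = parent }

        @objc func valueChanged(_ sender: UISlider) {
            parent.volume = sender.value
        }
    }
}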
Topic:
UI Frameworks
SubTopic:
SwiftUI
We recently migrated our app to use NavigationSplitView on iPad with a sidebar and detail setup, and we got reports that the navigation buttons on the sidebar disappear when returning to our app after using a different app. I reproduced the issue from a new empty project with the following code (issue tested on iOS 17.4 and iOS 18.3, was not able to reproduce on iOS 16.4):
import SwiftUI

@main
struct TestApp: App {
    var body: some Scene {
        WindowGroup {
            NavigationSplitView {
                Text("sidebar")
                    .toolbar {
                        ToolbarItem(placement: .topBarLeading) {
                            Button(action: {}) {
                                Image(systemName: "square.and.arrow.down")
                            }
                        }
                        ToolbarItem(placement: .topBarTrailing) {
                            Button(action: {}) {
                                Image(systemName: "square.and.arrow.up")
                            }
                        }
                    }
            } detail: {
                Text("detail")
                    .toolbar {
                        ToolbarItem(placement: .topBarLeading) {
                            Button(action: {}) {
                                Image(systemName: "eraser")
                            }
                        }
                        ToolbarItem(placement: .topBarTrailing) {
                            Button(action: {}) {
                                Image(systemName: "pencil")
                            }
                        }
                    }
            }
        }
    }
}
Please check the following GIF for the simple reproduction steps; notice how the nav bar buttons in the detail view do not disappear:
Here's the console output, it shows that the constraints break internally:
Hi, I'm working on RealityView and I have two entities in RCP. In order to set views for both entities, I have to create two separate attachments, one for each entity. What I want to achieve is that when I hover (by eye) on one entity's attachment, it triggers the hover effect of the other entity's attachment. I tried to use hoverEffectGroup, but it only activates the hover effect within a subview, not in a completely separate view. I referred to the following WWDC session on hover effects:
https://developer.apple.com/videos/play/wwdc2024/10152/
I am a developer on an enterprise application. Our team just updated our pipeline to build our app against the iOS 18 SDK instead of the 17.4 SDK, and this has caused a lot of our UI elements to change and several crashes within the app, resulting in just the simple error message "Swift runtime failure: unhandled C++ / Objective-C exception".
Why is just updating the SDK causing all these issues? Is there any way to keep the previous version, or will we have to go component by component to fix the constraints and crashes? These issues seem to be happening to our users on iOS 18 and beyond.
I'm trying to update my app to use TextKit 2. The one thing that I'm still not sure about is how I can get the selection frame. My app uses it to auto-scroll the text to keep the cursor at the same height when the text wraps onto a new line or a newline is manually inserted. Currently I'm using NSLayoutManager's boundingRect(forGlyphRange:in:).
The code below almost works. When editing the text or changing the selection, the current selection frame is printed out. My expectation is that the selection frame after a text or selection change should be equal to the selection frame before the next text change. I've noticed that this is not always true when the text has an NSParagraphStyle with spacing > 0. As long as I type at the end of the text, everything's fine, but if I insert some lines, then move the selection somewhere into the middle of the text and insert another newline, the frame printed after manually moving the selection is different from the frame before the newline is inserted. It seems that the offset between the two frames is exactly the same as the paragraph style's spacing. When moving the selection with the arrow keys instead, the printed frames are correct.
I've filed FB17104954.
class ViewController: NSViewController, NSTextViewDelegate {
    private var textView: NSTextView!

    override func loadView() {
        let scrollView = NSScrollView(frame: CGRect(x: 0, y: 0, width: 400, height: 400))
        textView = NSTextView(frame: scrollView.frame)
        textView.autoresizingMask = [.width, .height]
        textView.delegate = self
        let paragraphStyle = NSMutableParagraphStyle()
        paragraphStyle.lineSpacing = 40
        textView.typingAttributes = [.foregroundColor: NSColor.labelColor, .paragraphStyle: paragraphStyle]
        scrollView.documentView = textView
        scrollView.hasVerticalScroller = true
        view = scrollView
    }

    func textView(_ textView: NSTextView, shouldChangeTextIn affectedCharRange: NSRange, replacementString: String?) -> Bool {
        print("before", selectionFrame.maxY, selectionFrame)
        return true
    }

    func textDidChange(_ notification: Notification) {
        print("after ", selectionFrame.maxY, selectionFrame)
    }

    func textViewDidChangeSelection(_ notification: Notification) {
        print("select", selectionFrame.maxY, selectionFrame)
    }

    var selectionFrame: CGRect {
        guard let selection = textView.textLayoutManager!.textSelections.first?.textRanges.first else {
            return .null
        }
        var frame = CGRect.null
        textView.textLayoutManager!.ensureLayout(for: selection)
        textView.textLayoutManager!.enumerateTextSegments(in: selection, type: .selection, options: [.rangeNotRequired]) { _, rect, _, _ in
            frame = rect
            return false
        }
        return frame
    }
}
Hello,
is there a way to implement Continuity Markup in our own apps?
(This is what I'm talking about: https://support.apple.com/en-us/102269 , scroll down to "Use Continuity Markup").
Also, why does a QuickLook panel (QLPreviewPanel.shared()) not display the markup options when triggered from my app for png image files in my app's Group Container? Do I need to implement certain NSServicesMenuRequestor methods for that?
Sadly, I could not find any docs on that.
Thank you,
– Matthias
So I am looking to use a custom NSWindow subclass in my application (so I can implement some enhanced resizing/dragging behavior that is only possible by overriding NSWindow).
The problem is my whole application is currently SwiftUI-based (see the project here: https://github.com/msdrigg/Roam/blob/50a2a641aa5f2fccb4382e14dbb410c1679d8b0c/Roam/RoamApp.swift).
I know there is a way to make this work by dropping my @main SwiftUI app and replacing it with a SwiftUI root view hosted in a standard AppKit root app, but that feels like I'm going backwards.
Is there another way to get access (and override) the root NSWindow for a SwiftUI app?
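For illustration, one way to at least get a reference to (though not subclass) the hosting window from inside the SwiftUI hierarchy is a representable along these lines; a sketch with names of my own, not a framework API:

import SwiftUI
import AppKit

// Reports the NSWindow that ends up hosting this view, once it is attached to one.
struct HostingWindowFinder: NSViewRepresentable {
    var onWindow: (NSWindow?) -> Void

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        // The window isn't set until the view is in a hierarchy, so defer the lookup.
        DispatchQueue.main.async { onWindow(view.window) }
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {}
}

// Usage: attach as a background to any view in the scene, e.g.
// .background(HostingWindowFinder { window in
//     window?.isMovableByWindowBackground = true   // an example tweak; still not a subclass override
// })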
I am facing the same issue, with a major crash when returning from this function.
Basically I am using collectionView.dequeueReusableCell as part of a size calculation.
func getSizeOfFavouriteCell(_ collectionView: UICollectionView, at indexPath: IndexPath, item: FindCircleInfoCellItem) -> CGSize {
    guard let dummyCell = collectionView.dequeueReusableCell(
        withReuseIdentifier: TAButtonAddCollectionViewCell.reuseIdentifier,
        for: indexPath) as? TAButtonAddCollectionViewCell else {
        return CGSize.zero
    }
    dummyCell.title = item.title
    dummyCell.subtitle = item.subtitle
    dummyCell.icon = item.icon
    dummyCell.layoutIfNeeded()
    var targetSize = CGSize.zero
    if viewModel.favoritesDataSource.isEmpty.not,
       viewModel.favoritesDataSource.count > FindSheetViewControllerConstants.minimumFavoritesToDisplayInSection {
        targetSize = CGSize(width: collectionView.frame.size.width / 2, height: collectionView.frame.height)
        var estimatedSize: CGSize = dummyCell.systemLayoutSizeFitting(targetSize)
        if estimatedSize.width > targetSize.width {
            estimatedSize.width = targetSize.width
        }
        return CGSize(width: estimatedSize.width, height: targetSize.height)
    }
    return targetSize // fallback when the favourites threshold isn't met (added here so the snippet compiles)
}
We resolved the issue with the size calculation by checking for nil. It works fine in Xcode 15 and 16+.
Note: please help me understand the reason for the crash. Is it because Xcode 16.2 onwards performs **stricter checks on UICollectionView**?
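In case it's useful for comparison, one commonly suggested alternative (a sketch assuming the cell can be created directly with init(frame:), not our actual fix) is to measure with a standalone prototype cell so the sizing path never goes through dequeueReusableCell:

// A standalone prototype cell used purely for measurement; never dequeued or displayed.
private let sizingCell = TAButtonAddCollectionViewCell(frame: .zero)

func estimatedFavouriteCellSize(for item: FindCircleInfoCellItem, fitting targetSize: CGSize) -> CGSize {
    sizingCell.title = item.title
    sizingCell.subtitle = item.subtitle
    sizingCell.icon = item.icon
    sizingCell.layoutIfNeeded()
    var size = sizingCell.systemLayoutSizeFitting(targetSize)
    size.width = min(size.width, targetSize.width)
    return CGSize(width: size.width, height: targetSize.height)
}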
Hello everyone,
The setup:
I have an iPadOS app.
The app does not require full screen (Requires full screen option is disabled).
The problem:
The app starts looking unpolished when the canvas becomes too small.
What I tried:
I am trying to limit the canvas size for our app when run in Stage Manager.
How:
I saw that UIWindowScene has sizeRestrictions. This property is not always set as per documentation:
https://developer.apple.com/documentation/uikit/uiwindowscene/sizerestrictions
From my experiments, it is only set when the app runs on macOS (in compatibility mode, in our case).
Console logs:
Stage Manager - Requires full screen - OFF
willConnectToSession - sizeRestrictions: nil
sceneDidBecomeActive - sizeRestrictions: nil
Stage Manager - Requires full screen - ON
willConnectToSession - sizeRestrictions: nil
sceneDidBecomeActive - sizeRestrictions: nil
Stage Manager - Requires full screen - OFF - RUN on MacOS
willConnectToSession - sizeRestrictions: Available
sceneDidBecomeActive - sizeRestrictions: Available
Question:
Is there a way to enforce this minimum canvas size?
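For reference, this is roughly how I'm checking and setting the restriction in the scene delegate (a sketch; the 600×600 minimum is an arbitrary placeholder):

import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        print("willConnectToSession - sizeRestrictions:",
              windowScene.sizeRestrictions == nil ? "nil" : "Available")
        // Only has an effect when sizeRestrictions is non-nil
        // (which, in my testing, is the Mac Catalyst / compatibility case).
        windowScene.sizeRestrictions?.minimumSize = CGSize(width: 600, height: 600)
    }
}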
Topic:
UI Frameworks
SubTopic:
UIKit
My assumption has always been that [NSApp runModalForWindow:] runs a modal window in NSModalPanelRunLoopMode.
However, while -[NSApplication _doModalLoop:peek:] seems to use NSModalPanelRunLoopMode when pulling out the next event to process via nextEventMatchingMask:untilDate:inMode:dequeue:, the current runloop doesn't seem to be running in that mode, so during -[NSApplication(NSEventRouting) sendEvent:] of the modal-specific event, NSRunLoop.currentRunLoop.currentMode returns kCFRunLoopDefaultMode.
From what I can tell, this means that any event processing code that e.g. uses [NSTimer addTimer:forMode:] based on the current mode will register a timer that will not fire until the modal session ends.
Is this a bug? Or if not, is the correct way to run a modal session something like this?
[NSRunLoop.currentRunLoop performInModes:@[NSModalPanelRunLoopMode] block:^{
    [NSApp runModalForWindow:window];
}];
[NSRunLoop.currentRunLoop limitDateForMode:NSModalPanelRunLoopMode];
Alternatively, if the mode of the runloop should stay the same, I've seen suggestions to run modal sessions like this:
NSModalSession session = [NSApp beginModalSessionForWindow:theWindow];
for (;;) {
    if ([NSApp runModalSession:session] != NSModalResponseContinue)
        break;
    [NSRunLoop.currentRunLoop limitDateForMode:NSModalPanelRunLoopMode];
}
[NSApp endModalSession:session];
Which would work around the fact that the timer/callbacks were scheduled in the "wrong" mode. But running NSModalPanelRunLoopMode during a modal session seems a bit scary. Won't that potentially break the modality?
Why is the pitch slider always visible in the SwiftUI tvOS map view? It doesn't even appear to be supported there, let alone the fact that I specify mapControlVisibility(.hidden). Am I missing something or is Apple? See attached screenshot. This really messes up my UI.
Here is my code:
import SwiftUI
import MapKit

struct ContentView: View {
    @State var position = MapCameraPosition.region(MKCoordinateRegion(
        center: CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194),
        span: MKCoordinateSpan(latitudeDelta: 0.05, longitudeDelta: 0.05)))

    var body: some View {
        Map(position: $position)
            .mapControlVisibility(.hidden)
            .mapStyle(.standard(pointsOfInterest: .including(.airport)))
    }
}
Hello, I have a question that I hope to get an answer to. I am working on a music project for Mac Catalyst and need music files such as FLAC to offer my Mac Catalyst app when right-clicking them in Finder. With the configuration below, a newly created macOS project does show my app in the right-click "Open With" menu after debugging. But when I created an iOS project, converted it to a Mac Catalyst app, and modified its Info.plist with the same configuration, my app does not appear in the "Open With" menu after debugging. How can I solve this? Do I need to configure any additional permissions or capabilities in the Mac Catalyst project? I have been searching for a long time but have not found a solution. Thank you.
Here is the configuration of my macOS project:
<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeExtensions</key>
        <array>
            <string>flac</string>
        </array>
        <key>CFBundleTypeIconSystemGenerated</key>
        <integer>1</integer>
        <key>CFBundleTypeName</key>
        <string>FLAC Audio File</string>
        <key>CFBundleTypeRole</key>
        <string>Viewer</string>
        <key>LSHandlerRank</key>
        <string>Default</string>
    </dict>
</array>
Note: Sandbox permissions have been enabled for both the macOS project and the iOS-to-Mac Catalyst project. The Mac Catalyst project also has the additional com.apple.security.files.user-selected.read-write entitlement.
Hi,
I am developing a new SwiftUI app. Running under macOS, I see very high CPU usage (I am generating lots of GPU-based updates, which shouldn't affect the CPU).
I have used the profiler to ensure my Swift property updates are minimal, yet the CPU usage coming from SwiftUI is high.
It seems the high CPU usage is coming from NSAppearance, specifically CUICopyMeasurements, for a single button? But the Swift updates don't show any buttons being updated.
Topic:
UI Frameworks
SubTopic:
SwiftUI
Hi,
I have an existing Mac app on the App Store with a couple of widgets as part of the app. I want to now add a new widget to the WidgetBundle. When I build the updated app with Xcode, and then run the updated app, the widgets list doesn't seem to get updated in Notification Center or in the WidgetKit Simulator.
I do have the App Store version installed in the /Applications folder as well, so there might be some conflict. What's the trick to getting the widgets list to run the debug version?
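For context, the bundle is structured roughly like this (a self-contained sketch with placeholder widgets standing in for my real ones):

import WidgetKit
import SwiftUI

// Placeholder entry/provider standing in for the app's real timeline code.
struct SimpleEntry: TimelineEntry { let date: Date }

struct SimpleProvider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry { SimpleEntry(date: .now) }
    func getSnapshot(in context: Context, completion: @escaping (SimpleEntry) -> Void) {
        completion(SimpleEntry(date: .now))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        completion(Timeline(entries: [SimpleEntry(date: .now)], policy: .never))
    }
}

// Stand-in for the widgets that already ship with the app.
struct ExistingWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ExistingWidget", provider: SimpleProvider()) { entry in
            Text(entry.date, style: .time)
        }
    }
}

// The newly added widget that isn't showing up in the widget list.
struct NewWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "NewWidget", provider: SimpleProvider()) { entry in
            Text(entry.date, style: .date)
        }
    }
}

@main
struct MyAppWidgets: WidgetBundle {
    var body: some Widget {
        ExistingWidget()
        NewWidget()
    }
}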
There are two issues with SFSafariViewController.
After rotating from landscape to portrait:
The topAnchor constraint is broken.
The specified bar tint color and control tint color are invalidated (they return to the system colors).
Regarding the second issue, I've found a temporary workaround:
Override viewWillTransition(to:with:) and keep it empty. Don't call super.viewWillTransition(to:with:).
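A minimal sketch of that workaround (the subclass name is mine):

import SafariServices
import UIKit

final class TintPreservingSafariViewController: SFSafariViewController {
    override func viewWillTransition(to size: CGSize,
                                     with coordinator: UIViewControllerTransitionCoordinator) {
        // Intentionally empty and intentionally not calling super:
        // skipping super is what keeps the bar/control tint colors from resetting on rotation.
    }
}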
Since UIKit is not open source, I don’t know the exact cause, but I found something that could be the key to the issue. So, I reported it to Apple Feedback Assistant. You can check the details and the sample project in the GitHub repository below.
https://github.com/ueunli/SafariViewer
I'm attempting to write a macOS version of https://stackoverflow.com/a/74935849/2178159.
From my understanding, I should be able to set the menu property of an NSResponder and it will automatically show on right click.
I've tried a couple things:
A: set menu on an NSHostingController's view - when I do this and right or ctrl click, nothing happens.
B: set menu on NSHostingController directly - when I do this I get a crash Abstract method -[NSResponder setMenu:] called from class _TtGC7SwiftUI19NSHostingControllerGVS_21_ViewModifier_...__. Subclasses must override
C: manually call NSMenu.popup in a custom subclasses of NSHostingController or NSView's rightMouseDown method - nothing happens.
extension View {
    func contextMenu(menu: NSMenu) -> some View {
        modifier(ContextMenuViewModifier(menu: menu))
    }
}

struct ContextMenuViewModifier: ViewModifier {
    let menu: NSMenu

    func body(content: Content) -> some View {
        Interaction_UI(
            view: { content },
            menu: menu
        )
        .fixedSize()
    }
}

private struct Interaction_UI<Content: View>: NSViewRepresentable {
    typealias NSViewType = NSView

    @ViewBuilder var view: Content
    let menu: NSMenu

    func makeNSView(context: Context) -> NSView {
        let v = NSHostingController(rootView: view)
        // option A - no effect
        v.view.menu = menu
        // option B - crash
        v.menu = menu
        return v.view
    }

    func updateNSView(_ nsView: NSViewType, context: Context) {
        // part of option A
        nsView.menu = menu
    }
}
When my CPMapButton is selected/focused, I would like to be able to provide a focusedImage to correctly show the button when the blue focus is shown. Currently I have:
What do I need to do to create an image that works more like the panning interface buttons?
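For context, a generic sketch of a CPMapButton that supplies a separate focused image (the asset names are placeholders, not my actual code):

import CarPlay
import UIKit

// "compass" and "compassFocused" are placeholder asset names.
func makeCompassButton() -> CPMapButton {
    let button = CPMapButton { _ in
        // Handle the button tap here.
    }
    button.image = UIImage(named: "compass")
    button.focusedImage = UIImage(named: "compassFocused")
    return button
}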
My app inputs electrical waveforms from an IV485B39 2 channel USB device using an AVAudioSession. Before attempting to acquire data I make sure the input device is available as follows:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:&err];
NSArray *inputs = [audioSession availableInputs];
I have been using this code for about 10 years.
My app is scriptable, so a user can acquire data from the IV485B39 multiple times with various parameter settings (sampling rates and sample durations). Recently the scripts have been failing to complete, and what I have noticed is that when it fails, the list of available inputs is missing the USBAudio input. While debugging I have noticed that when working properly the list of inputs includes both the internal microphone and the USBAudio device, as shown below.
VIB_TimeSeriesViewController:***Available inputs = (
"<AVAudioSessionPortDescription: 0x11584c7d0, type = MicrophoneBuiltIn; name = iPad Microphone; UID = Built-In Microphone; selectedDataSource = Front>",
"<AVAudioSessionPortDescription: 0x11584cae0, type = USBAudio; name = 485B39 200095708064650803073200616; UID = AppleUSBAudioEngine:Digiducer.com :485B39 200095708064650803073200616:000957 200095708064650803073200616:1; selectedDataSource = (null)>"
)
But when it fails I only see the built in microphone.
VIB_TimeSeriesViewController:***Available inputs = (
"<AVAudioSessionPortDescription: 0x11584cef0, type = MicrophoneBuiltIn; name = iPad Microphone; UID = Built-In Microphone; selectedDataSource = Front>"
)
If I only see the built-in microphone, I immediately repeat the three lines of code, and most of the time "inputs" then contains both the internal microphone and the USB audio device.
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:&err];
NSArray *inputs = [audioSession availableInputs];
This fix always works on my M2 iPad Pro and my iPhone 14, but some of my customers have older devices, and even with 3 tries they still get faults on about 1 in 10 attempts.
I rolled back my code to a released version from about 12 months ago where I know we never had this problem and compiled it against the current libraries and the problem still exists. I assume this is a problem caused by a change in the AVAudioSession framework libraries. I need to find a way to work around the issue or get the library fixed.
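For what it's worth, the retry workaround I'm describing boils down to something like this sketch (written in Swift here for brevity; the 3 attempts and the 0.5 s pause are arbitrary):

import AVFoundation

// Retries the category/availableInputs sequence a few times when the USB input
// hasn't shown up yet; returns whatever inputs are available on the last attempt.
func availableInputsWithRetry(maxAttempts: Int = 3) -> [AVAudioSessionPortDescription] {
    let session = AVAudioSession.sharedInstance()
    for attempt in 1...maxAttempts {
        try? session.setCategory(.record, mode: .default, options: [])
        let inputs = session.availableInputs ?? []
        if inputs.contains(where: { $0.portType == .usbAudio }) || attempt == maxAttempts {
            return inputs
        }
        Thread.sleep(forTimeInterval: 0.5)   // arbitrary pause before retrying
    }
    return []
}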