I have a UIViewController that uses MKMapView to display a motion-history trajectory. Repeatedly entering and exiting this view controller eventually causes a crash; the crash stack is as follows:
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_INVALID_ADDRESS at 0x000000014bfc0fc8
Exception Codes: 0x0000000000000001, 0x000000014bfc0fc8
VM Region Info: 0x14bfc0fc8 is not in any region. Bytes after previous region: 217033 Bytes before following region: 61496
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
VM_ALLOCATE 14bf88000-14bf8c000 [ 16K] rw-/rwx SM=PRV
---> GAP OF 0x44000 BYTES
VM_ALLOCATE 14bfd0000-14bfd4000 [ 16K] rw-/rwx SM=PRV
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler [1881]
Triggered by Thread: 8
Thread 8 name: Dispatch queue: com.apple.root.background-qos
Thread 8 Crashed:
0 CoreFoundation 0x19e36ac40 CFRelease + 44
1 VectorKit 0x1ce16af6c md::TileGroupNotificationManager::~TileGroupNotificationManager() + 132
2 VectorKit 0x1cd6f7178 <deduplicated_symbol> + 76
3 VectorKit 0x1cdba8d74 -[VKSharedResources .cxx_destruct] + 32
4 libobjc.A.dylib 0x19b3321f8 object_cxxDestructFromClass(objc_object*, objc_class*) + 116
5 libobjc.A.dylib 0x19b32df20 objc_destructInstance_nonnull_realized(objc_object*) + 76
6 libobjc.A.dylib 0x19b32d4a4 _objc_rootDealloc + 72
7 VectorKit 0x1cdba93fc -[VKSharedResources dealloc] + 476
8 VectorKit 0x1cdafa3fc -[VKSharedResourcesManager _removeResourceUser] + 68
9 VectorKit 0x1cdafa380 +[VKSharedResourcesManager removeResourceUser] + 44
10 VectorKit 0x1cdafa2fc __37-[VKIconManager _internalIconManager]_block_invoke + 168
11 libdispatch.dylib 0x1d645b7ec _dispatch_client_callout + 16
12 libdispatch.dylib 0x1d6446664 _dispatch_continuation_pop + 596
13 libdispatch.dylib 0x1d6459528 _dispatch_source_latch_and_call + 396
14 libdispatch.dylib 0x1d64581fc _dispatch_source_invoke + 844
15 libdispatch.dylib 0x1d6453f48 _dispatch_root_queue_drain + 364
16 libdispatch.dylib 0x1d64546fc _dispatch_worker_thread2 + 180
17 libsystem_pthread.dylib 0x1f9b7e37c _pthread_wqthread + 232
18 libsystem_pthread.dylib 0x1f9b7d8c0 start_wqthread + 8
I have checked the code and did not find any issues, and I have also tested on iOS 15, 16, and 18 without any problems. Could this be a bug in iOS 26 itself? Has anyone else run into this? I hope to receive an answer. Thank you.
I'm experiencing an issue where my List selection (using EditButton) gets cleared when a confirmationDialog is presented, making it impossible to delete the selected items.
Environment:
Xcode 16.0.1
Swift 5
iOS 18 (targeting iOS 17+)
Issue Description:
When I select items in a List using EditButton and tap a delete button that shows a confirmationDialog, the selection is cleared as soon as the dialog appears. This prevents me from accessing the selected items to delete them.
Code:
State variables:
@State var itemsSelection = Set<Item>()
@State private var showDeleteConfirmation = false
List with selection:
List(currentItems, id: \.self, selection: $itemsSelection) { item in
NavigationLink(value: item) {
ItemListView(item: item)
}
}
.navigationDestination(for: Item.self) { item in
ItemViewDetail(item: item)
}
.toolbar {
ToolbarItem(placement: .primaryAction) {
EditButton()
}
}
Delete button with confirmation:
Button {
if itemsSelection.count > 1 {
showDeleteConfirmation = true
} else {
deleteItemsSelected()
}
} label: {
Image(systemName: "trash.fill")
.font(.system(size: 12))
.foregroundStyle(Color.red)
}
.padding(8)
.confirmationDialog(
"Delete?",
isPresented: $showDeleteConfirmation,
titleVisibility: .visible
) {
Button("Delete", role: .destructive) {
deleteItemsSelected()
}
Button("Cancel", role: .cancel) {}
} message: {
Text("Going to delete: \(itemsSelection.count) items?")
}
Expected Behavior:
The selected items should remain selected when the confirmationDialog appears, allowing me to delete them after confirmation.
Actual Behavior:
As soon as showDeleteConfirmation becomes true and the dialog appears, itemsSelection becomes empty (count = 0), making it impossible to delete the selected items.
What I've Tried:
Moving the confirmationDialog to different view levels
Checking if this is related to the NavigationLink interaction
Has anyone encountered this issue? Is there a workaround to preserve the selection when showing a confirmation dialog?
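One workaround sketch that may help (untested; `pendingDeletion` and `deleteItems(_:)` are names I'm introducing here): copy the selection into a separate piece of state before presenting the dialog, so the confirmation acts on the snapshot even if the List clears its live selection:

```swift
@State private var pendingDeletion = Set<Item>()
@State private var showDeleteConfirmation = false

Button {
    pendingDeletion = itemsSelection          // snapshot while still populated
    if pendingDeletion.count > 1 {
        showDeleteConfirmation = true
    } else {
        deleteItems(pendingDeletion)
    }
} label: {
    Image(systemName: "trash.fill")
}
.confirmationDialog("Delete?", isPresented: $showDeleteConfirmation, titleVisibility: .visible) {
    Button("Delete", role: .destructive) {
        deleteItems(pendingDeletion)          // use the snapshot, not the live selection
        pendingDeletion.removeAll()
    }
    Button("Cancel", role: .cancel) { pendingDeletion.removeAll() }
} message: {
    Text("Going to delete: \(pendingDeletion.count) items?")
}
```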
Hi Everyone,
I’m encountering a layout issue in SwiftUI starting with iOS 26 / Xcode 26 (also reproduced on iPadOS 26). I would like to see whether anyone else has seen the same behaviour, and if there’s any known workaround or fix from Apple.
Reproduction
Minimal code:
struct ContentView: View {
@State private var inputText: String = ""
var body: some View {
TextField("", text: $inputText)
.font(.system(size: 14, weight: .semibold))
.background(Capsule().foregroundStyle(.gray))
}
}
Environment:
Xcode: 26.1 (17B55)
Devices / Simulators: iOS 26.0, iOS 26.0.1, iOS 26.1 on iPhone 17 Pro Max; iPadOS 26 on iPad.
OS: iOS 26 / iPadOS 26
Observed behaviour
When the TextField is tapped for the first time (focus enters), its height suddenly decreases compared to the unfocused state.
After this initial focus-tap, subsequent unfocus/focus cycles preserve the height (i.e., height remains “normal” and stable).
Prior to iOS 26 (i.e., in older iOS versions) I did not observe this behaviour — same code worked fine.
I have not explicitly set .frame(height:) or extra padding, and I’m relying on the default intrinsic height + the Capsule background.
Work-arounds
Adding .frame(height: X) (fixed height) removes the issue (height stable).
Thanks in advance for any feedback!
[Submitted as FB20978913]
When using .navigationTransition(.zoom) with a fullScreenCover, deleting the source item from the destination view causes a brief black flash during the dismiss animation. This is only visible in Light mode.
REPRO STEPS
Build and run the sample code below.
Set the device to Light mode.
Tap any row to open its detail view.
In the detail view, tap Delete.
Watch the dismiss animation as the list updates.
EXPECTED
The zoom transition should return smoothly to the list with no dark or black flash.
ACTUAL
A visible black flash appears over the deleted row during the collapse animation. It starts black, shortens, and fades out in sync with the row-collapse motion. The flash lasts about five frames and is consistently visible in Light mode.
NOTES
Occurs only when deleting from the presented detail view.
Does not occur when deleting directly from the list.
Does not occur or is not visible in Dark mode.
Reproducible on both simulator and device.
Removing .navigationTransition(.zoom) or using .sheet instead of .fullScreenCover avoids the issue.
SYSTEM INFO
Version 26.1 (17B55)
iOS 26.1
Devices: iPhone 17 Pro simulator, iPhone 13 Pro hardware
Appearance: Light
Reproducible 100% of the time
SAMPLE CODE
import SwiftUI
struct ContentView: View {
@State private var items = (0..<20).map { Item(id: $0, title: "Item \($0)") }
@State private var selectedItem: Item?
@Namespace private var ns
var body: some View {
NavigationStack {
List {
ForEach(items) { item in
Button {
selectedItem = item
} label: {
HStack {
Text(item.title)
Spacer()
}
.padding(.vertical, 8)
.contentShape(Rectangle())
}
.buttonStyle(.plain)
.matchedTransitionSource(id: item.id, in: ns)
.swipeActions {
Button(role: .destructive) {
delete(item)
} label: {
Label("Delete", systemImage: "trash")
}
}
}
}
.listStyle(.plain)
.navigationTitle("Row Delete Issue")
.navigationSubtitle("In Light mode, tap item then tap Delete to see black flash")
.fullScreenCover(item: $selectedItem) { item in
DetailView(item: item, ns: ns) {
delete(item)
selectedItem = nil
}
}
}
}
private func delete(_ item: Item) {
withAnimation {
items.removeAll { $0.id == item.id }
}
}
}
struct DetailView: View {
let item: Item
let ns: Namespace.ID
let onDelete: () -> Void
@Environment(\.dismiss) private var dismiss
var body: some View {
NavigationStack {
VStack(spacing: 30) {
Text(item.title)
Button("Delete", role: .destructive, action: onDelete)
}
.navigationTitle("Detail")
.toolbar {
Button("Close") { dismiss() }
}
}
.navigationTransition(.zoom(sourceID: item.id, in: ns))
}
}
struct Item: Identifiable, Hashable {
let id: Int
let title: String
}
SCREEN RECORDING
In my AppKit app I have an outline view where each cell has a title (NSTextField), an image view (NSImageView), and 4 NSButtons that are essentially static badges (their .bezelStyle is set to .badge). I've been seeing very low performance when doing quick reloads in succession, for example when filtering data. My app is built on macOS Tahoe.
Profiling the app in Instruments, I can see that each cell initialization takes more than 70 ms. Does this make sense?
Thanks!
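70 ms per cell strongly suggests each reload is constructing cell views from scratch. A common mitigation (a sketch, assuming a view-based NSOutlineView; `BadgeCellView` and the identifier are names of my own invention) is to give cells a reuse identifier and let makeView(withIdentifier:owner:) hand back recycled views:

```swift
// Sketch: recycle cells instead of rebuilding them on every reload.
func outlineView(_ outlineView: NSOutlineView,
                 viewFor tableColumn: NSTableColumn?,
                 item: Any) -> NSView? {
    let id = NSUserInterfaceItemIdentifier("BadgeCell") // hypothetical identifier
    if let cell = outlineView.makeView(withIdentifier: id, owner: self) as? BadgeCellView {
        cell.configure(with: item)   // cheap path: only update title/image/badge state
        return cell
    }
    let cell = BadgeCellView()       // expensive path: runs only for the first screenful
    cell.identifier = id
    cell.configure(with: item)
    return cell
}
```

With reuse in place, the construction cost is paid only for the initial screenful of rows; filtering then just reconfigures existing views. It may also help to use reloadData(forRowIndexes:columnIndexes:) rather than a full reloadData() when only a subset changes.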
Topic:
UI Frameworks
SubTopic:
AppKit
The code below is a test that triggers UI updates 30 times per second. I'm trying to keep most work off the main thread and only push to main once I have the string (which is built on a background queue). Why is updating SwiftUI 30 times per second so expensive? This code causes 10% CPU on my M4 Mac, but if I comment out the following line:
Text(model.timeString)
it drops to 0% CPU. The reason I think I have too much work on the main thread is this trace from Instruments, but I'm no Instruments expert.
import SwiftUI
import UniformTypeIdentifiers
@main
struct RapidUIUpdateTestApp: App {
var body: some Scene {
DocumentGroup(newDocument: RapidUIUpdateTestDocument()) { file in
ContentView(document: file.$document)
}
}
}
struct ContentView: View {
@Binding var document: RapidUIUpdateTestDocument
@State private var model = PlayerModel()
var body: some View {
VStack(spacing: 16) {
Text(model.timeString) // only this changes
.font(.system(size: 44, weight: .semibold, design: .monospaced))
.transaction { $0.animation = nil } // no implicit animations
HStack {
Button(model.running ? "Pause" : "Play") {
model.running ? model.pause() : model.start()
}
Button("Reset") { model.seek(0) }
Stepper("FPS: \(Int(model.fps))", value: $model.fps, in: 10...120, step: 1)
.onChange(of: model.fps) { _, _ in model.applyFPS() }
}
}
.padding()
.onAppear { model.start() }
.onDisappear { model.stop() }
}
}
@Observable
final class PlayerModel {
// Publish ONE value to minimize invalidations
var timeString: String = "0.000 s"
var fps: Double = 30
var running = false
private var formatter: NumberFormatter = {
let f = NumberFormatter()
f.minimumFractionDigits = 3
f.maximumFractionDigits = 3
return f
}()
@ObservationIgnored private let q = DispatchQueue(label: "tc.timer", qos: .userInteractive)
@ObservationIgnored private var timer: DispatchSourceTimer?
@ObservationIgnored private var startHost: UInt64 = 0
@ObservationIgnored private var pausedAt: Double = 0
@ObservationIgnored private var lastFrame: Int = -1
// cache timebase once
private static let secsPerTick: Double = {
var info = mach_timebase_info_data_t()
mach_timebase_info(&info)
return Double(info.numer) / Double(info.denom) / 1_000_000_000.0
}()
func start() {
guard timer == nil else { running = true; return }
let desiredUIFPS: Double = 30 // or 60, 24, etc.
let periodNs = UInt64(1_000_000_000 / desiredUIFPS)
running = true
startHost = mach_absolute_time()
let t = DispatchSource.makeTimerSource(queue: q)
// ~30 fps, with leeway to let the kernel coalesce wakeups
t.schedule(
deadline: .now(),
repeating: .nanoseconds(Int(periodNs)), // 33_333_333 ns ≈ 30 fps
leeway: .milliseconds(30) // allow coalescing
)
t.setEventHandler { [weak self] in self?.tick() }
timer = t
t.resume()
}
func pause() {
guard running else { return }
pausedAt = now()
running = false
}
func stop() {
timer?.cancel()
timer = nil
running = false
pausedAt = 0
lastFrame = -1
}
func seek(_ seconds: Double) {
pausedAt = max(0, seconds)
startHost = mach_absolute_time()
lastFrame = -1 // force next UI update
}
func applyFPS() { lastFrame = -1 } // next tick will refresh string
// MARK: - Tick on background queue
private func tick() {
let s = now()
let str = formatter.string(from: s as NSNumber) ?? String(format: "%.3f", s)
let display = "\(str) s"
DispatchQueue.main.async { [weak self] in
self?.timeString = display
}
}
private func now() -> Double {
guard running else { return pausedAt }
let delta = mach_absolute_time() &- startHost
return pausedAt + Double(delta) * Self.secsPerTick
}
}
nonisolated struct RapidUIUpdateTestDocument: FileDocument {
var text: String
init(text: String = "Hello, world!") {
self.text = text
}
static let readableContentTypes = [
UTType(importedAs: "com.example.plain-text")
]
init(configuration: ReadConfiguration) throws {
guard let data = configuration.file.regularFileContents,
let string = String(data: data, encoding: .utf8)
else {
throw CocoaError(.fileReadCorruptFile)
}
text = string
}
func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
let data = text.data(using: .utf8)!
return .init(regularFileWithContents: data)
}
}
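For comparison, a driver that may reduce the per-tick overhead (a sketch, not a claim about the root cause of the CPU usage): TimelineView(.periodic) lets SwiftUI schedule the redraws itself, so there is no background timer and no dispatch to main on every tick:

```swift
// Sketch: let SwiftUI own the 30 Hz schedule instead of a DispatchSourceTimer.
struct ClockText: View {
    let start = Date()
    var body: some View {
        TimelineView(.periodic(from: .now, by: 1.0 / 30.0)) { context in
            let s = context.date.timeIntervalSince(start)
            Text(String(format: "%.3f s", s))
                .font(.system(.title, design: .monospaced))
        }
    }
}
```

This narrows the invalidation to the TimelineView's closure, which may make the Instruments trace easier to interpret even if it doesn't change the total cost.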
In our app there is a UIWindow makeKeyAndVisible crash; so far it has appeared only once. The crash stack:
the crash detail:
crash.txt
In the RCWindowSceneManager class's makeWindowKeyAndVisible method, we check and set a window's windowScene and call makeKeyAndVisible:
public func makeWindowKeyAndVisible(_ window: UIWindow?) {
guard let window else {
return
}
if let currentWindowScene {
if window.windowScene == nil || window.windowScene != currentWindowScene {
window.windowScene = currentWindowScene
}
window.makeKeyAndVisible()
}
}
I also set a breakpoint on a normal, non-crashing flow; the stack there is:
Why does it crash, and how can we avoid it? Thank you.
The watchOS app and view lifecycles for WatchKit and SwiftUI are documented in https://developer.apple.com/documentation/watchkit/working-with-the-watchos-app-life-cycle and https://developer.apple.com/documentation/swiftui/migrating-to-the-swiftui-life-cycle.
watchOS 26 appears to change the app & view lifecycle from the behavior in watchOS 11, and no longer matches the documented lifecycles.
On watchOS 11, with a @WKApplicationDelegateAdaptor set, the following sequence of events would occur on app launch:
WKApplicationDelegate applicationDidFinishLaunching in WKApplicationState .inactive.
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .inactive.
View .onAppear @Environment(.scenePhase) .inactive
App onChange(of: @Environment(.scenePhase)): .active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) .active
In watchOS 26, this is now:
WKApplicationDelegate applicationDidFinishLaunching in WKApplicationState .inactive.
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .inactive.
App onChange(of: @Environment(.scenePhase)): .active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
View .onAppear @Environment(.scenePhase) .active
When resuming from the background in watchOS 11:
App onChange(of: @Environment(.scenePhase)): inactive
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .background.
App onReceive(.willEnterForegroundNotification): WKApplicationState(rawValue: 2)
View .onChange of: @Environment(.scenePhase) inactive
App onChange(of: @Environment(.scenePhase)): active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) active
The resume-from-background process in watchOS 26 is baffling and seems like it must be a bug:
App onChange(of: @Environment(.scenePhase)): inactive
WKApplicationDelegate applicationWillEnterForeground in WKApplicationState .background.
App onReceive(.willEnterForegroundNotification): WKApplicationState(rawValue: 2)
App onChange(of: @Environment(.scenePhase)): active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) active
App onChange(of: @Environment(.scenePhase)): inactive
WKApplicationDelegate applicationWillResignActive in WKApplicationState .active.
App onReceive(.willResignActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) inactive
App onChange(of: @Environment(.scenePhase)): active
WKApplicationDelegate applicationDidBecomeActive in WKApplicationState .active.
App onReceive(.didBecomeActiveNotification): WKApplicationState(rawValue: 0)
View .onChange of: @Environment(.scenePhase) active
The app becomes active, then inactive, then active again.
The issues with these undocumented changes are:
It is undocumented.
If you relied on the previous process, this change can break your app.
A view no longer receives the .onChange of: @Environment(.scenePhase) .active state change during the launch process.
This bizarre applicationWillEnterForeground - applicationDidBecomeActive - applicationWillResignActive - applicationDidBecomeActive process on app resume does not match the documented process and is just...strange.
Is this new process what is intended? Is it a bug? Can an Apple engineer explain this new App resume from background process and why the View is created slightly later in the App launch process, so it does not receive the .onChange of @Environment(.scenePhase) message?
In contrast, the iOS 26 app lifecycle has not changed, and the iOS 18/26 app lifecycle closely follows the watchOS 11 app lifecycle (or watchOS 11 closely mimics iOS 18/26 with the exception that watchOS does not have a SceneDelegate).
Hi there! I'm making an app that stores the user's profile data with SwiftData. I was originally going to use UserDefaults; I thought SwiftData could save Images natively, but that's not true, so I really could switch back to UserDefaults and save images as Data. Still, I'd like to try to get this working first. Essentially I have text fields and I save their values through a class, allProfileData. Here's the code for that:
import SwiftData
import SwiftUI
@Model
class allProfileData {
var profileImageData: Data?
var email: String
var bio: String
var username: String
var profileImage: Image {
if let data = profileImageData,
let uiImage = UIImage(data: data) {
return Image(uiImage: uiImage)
} else {
return Image("DefaultProfile")
}
}
init(email:String, profileImageData: Data?, bio: String, username:String) {
self.profileImageData = profileImageData
self.email = email
self.bio = bio
self.username = username
}
}
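As a side note on the image storage itself: if you do keep image bytes in SwiftData, you can ask it to keep large blobs outside the database file. A sketch of how that attribute reads on a model (a trimmed, hypothetical model for illustration, not your full class):

```swift
import SwiftData
import Foundation

@Model
final class ProfileData {            // hypothetical name for illustration
    @Attribute(.externalStorage)     // hint: store large blobs as separate files
    var profileImageData: Data?
    var email: String
    init(email: String, profileImageData: Data? = nil) {
        self.email = email
        self.profileImageData = profileImageData
    }
}
```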
To save this, I create a new instance (I think, I'm new) and save it through the ModelContext:
import SwiftUI
import SwiftData
struct CreateAccountView: View {
@Query var profiledata: [allProfileData]
@Environment(\.modelContext) private var modelContext
@State private var email = ""   // text field value referenced below
let newData = allProfileData(email: "", profileImageData: nil, bio: "", username: "")
var body: some View {
Button("Button") {
newData.email = email
modelContext.insert(newData)
try? modelContext.save()
print(newData.email)
}
}
}
To fetch the data, I originally thought that @Query would fetch it, but I saw that it updates asynchronously, so I attempted to fetch manually as well. Both fetched nothing:
import SwiftData
import SwiftUI
@Query var profiledata: [allProfileData]
@Environment(\.modelContext) private var modelContext
let fetchRequest = FetchDescriptor<allProfileData>()
let fetchedData = try? modelContext.fetch(fetchRequest)
print("Fetched count: \(fetchedData?.count ?? 0)")
if let imageData = profiledata.first?.profileImageData,
let uiImage = UIImage(data: imageData) {
profileImage = Image(uiImage: uiImage)
} else {
profileImage = Image("DefaultProfile")
}
No errors. Thanks in advance
Hello
I'm trying to use a TabView inside the sidebar of a NavigationSplitView.
I want to use .listStyle(.sidebar) in order to get the Liquid Glass effect.
However, I cannot find a way to remove the background of the TabView without changing the behavior of the TabView itself to paging with .tabViewStyle(.page).
NavigationSplitView {
TabView(
selection: .constant("List")
) {
Tab(value: "List") {
List {
Text("List")
}
}
}
} detail: {
}
Note: I want to change the background of the TabView container itself, not the TabBar.
Hey there,
I have an app that allows picking any folder via UIDocumentPickerViewController. Up until iOS 18, users were able to pick folders from connected servers (servers connected in the Files app) as well.
On iOS 26, the picker allows browsing into the connected servers, but the Select button is greyed out and does nothing when tapped.
Is this a known issue? It breaks the whole premise of my file-synchronization application.
I'm looking for clarification on a SwiftUI performance point mentioned in the recent Optimize your app's speed and efficiency | Meet with Apple video.
(YouTube link not allowed, but the video is available on the Apple Developer channel.)
At the 1:48:50 mark, the presenter says:
Writing a value to the Environment doesn't only affect the views that read the key you're updating. It updates any view that reads from any Environment key. [abbreviated quote]
That statement seems like a big deal if your app relies heavily on Environment values.
Context
I'm building a macOS application with a traditional three-panel layout. At any given time, there are many views on screen, plus others that exist in the hierarchy but are currently hidden (for example, views inside tab views or collapsed splitters).
Nearly every major view reads something from the environment—often an @Observable object that acts as a service or provider.
However, there are a few relatively small values that are written to the environment frequently, such as:
The selected tab index
The currently selected object on a canvas
The Question
Based on the presenter's statement, I’m wondering:
Does writing any value to the environment really cause all views in the entire SwiftUI view hierarchy that read any environment key to have their body re-evaluated?
Do environment writes only affect child views, or do they propagate through the entire SwiftUI hierarchy?
Example:
View A
└─ View B
├─ View C
└─ View D
If View B updates an environment value, does that affect only C and D, or does it also trigger updates in A and B (assuming each view has at least one @Environment property)?
Possible Alternative
If all views are indeed invalidated by environment writes, would it be more efficient to “wrap” frequently-changing values inside an @Observable object instead of updating the environment directly?
// Pseudocode
@Observable final class SelectedTab {
var index: Int
}
ContentView()
.environment(\.selectedTab, selectedTab)
struct TabView: View {
@Environment(\.selectedTab) private var selectedTab
var body: some View {
Button("Action") {
// Would this avoid invalidating all views using the environment?
selectedTab.index = 1
}
}
}
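For completeness, the pseudocode above writes through a custom environment key; one minimal way to declare such a key (a sketch using the @Entry macro from Xcode 16, and assuming SelectedTab has a default initializer) would be:

```swift
extension EnvironmentValues {
    // @Entry synthesizes the EnvironmentKey boilerplate.
    @Entry var selectedTab = SelectedTab()
}
```

Because SelectedTab is @Observable, a view's body should be re-evaluated only when a property it actually reads changes, which is the basis of the "wrap frequently-changing values" idea above.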
Summary
From what I understand, it sounds like the environment should primarily be used for stable, long-lived objects—not for rapidly changing values—since writes might cause far more view invalidations than most developers realize.
Is that an accurate interpretation?
Follow-Up
In Xcode 26 / Instruments, is there a way to monitor writes to @Environment?
Since iPadOS 26.1 I've noticed a new annoying bug when changing the system's dark mode option. The appearance of the UI changes, but no longer for view controllers that are presented as a popover. For these view controllers the method traitCollectionDidChange() is still called (though sometimes with a very large delay), but checking the traitCollection property of the view controller there no longer returns the correct appearance (which is probably why the visual appearance of the popover doesn't change anymore). So if dark mode was just switched on, traitCollectionDidChange() is called, but the traitCollection.userInterfaceStyle property still tells me that the system is in light mode.
More concrete, traitCollection.userInterfaceStyle seems to be set correctly only(!) when opening the popover, and while the popover is open, it is never updated anymore when the dark mode changes.
This is also visible in the iPad's standard apps, like Apple Maps: just tap the map icon at the top right to open the "Map mode" view. While the view is open, change the dark mode. All of the Maps app changes its appearance, with the exception of this "Map mode" view.
Does anyone know an easy workaround? Or do I really need to manually change the colors for all popup view controllers whenever the dark mode changes? Using dynamic UIColors won't help, because these rely on the "userInterfaceStyle" property, and this is no longer correct.
Bugreport: FB20928471
I've been playing around with long press gestures in a scroll view and noticed that .gesture(LongPressGesture()) doesn't seem to cooperate with the scroll view's scrolling, which doesn't look like intended behavior to me.
Take the following example: the blue rectangle is modified with onLongPressGesture and the red rectangle is modified with LongPressGesture (_EndedGesture<LongPressGesture> to be specific).
ScrollView {
Rectangle()
.fill(.blue)
.frame(width: 200, height: 200)
.onLongPressGesture {
print("onLongPressGesture performed")
} onPressingChanged: { _ in
print("onLongPressGesture changed")
}
.overlay {
Text("onLongPressGesture")
}
Rectangle()
.fill(.red)
.frame(width: 200, height: 200)
.gesture(LongPressGesture()
.onEnded { _ in
print("gesture ended")
})
.overlay {
Text("gesture(LongPressGesture)")
}
}
If you start scrolling from either of the rectangles (that is, start scrolling with your finger on either of the rectangles), the ScrollView will scroll.
However, if the LongPressGesture has either onChanged or .updating attached, the ScrollView won't scroll when the drag starts from the red rectangle. Even setting maximumDistance to 0 doesn't help. Its counterpart onLongPressGesture behaves differently: even with onPressingChanged added, scrolling still works when it starts from the onLongPressGesture-modified view.
ScrollView {
Rectangle()
.fill(.blue)
.frame(width: 200, height: 200)
.onLongPressGesture {
print("onLongPressGesture performed")
} onPressingChanged: { _ in
print("onLongPressGesture changed")
}
.overlay {
Text("onLongPressGesture")
}
Rectangle()
.fill(.red)
.frame(width: 200, height: 200)
.gesture(LongPressGesture(maximumDistance: 0)
// scroll from the red rectangle won't work if I add either `updating` or `onChanged` but I put both here just to demonstrate
// you will need to add `@GestureState private var isPressing = false` to your view body
.updating($isPressing) { value, state, transaction in
state = value
print("gesture updating")
}
.onChanged { value in
print("gesture changed")
}
.onEnded { _ in
print("gesture ended")
})
.overlay {
Text("gesture(LongPressGesture)")
}
}
This doesn't seem right to me. I would expect the view modified by LongPressGesture(), no matter if the gesture has onChanged or updating, should be able to start scroll in a scroll view, just like onLongPressGesture.
I observed this behavior in a physical device running iOS 26.1, and I do not know the behavior on other versions.
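One workaround sketch worth trying (an assumption on my part, not verified on iOS 26.1): attach the long press with .simultaneousGesture instead of .gesture, which lets the scroll view's drag recognizer run in parallel:

```swift
Rectangle()
    .fill(.red)
    .frame(width: 200, height: 200)
    // simultaneousGesture lets the ScrollView's own drag proceed,
    // so scrolling can still start on this view.
    .simultaneousGesture(
        LongPressGesture()
            .onEnded { _ in print("long press ended") }
    )
```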
In one of my apps I am using .glassEffect(_:in:) to add a glass effect to various elements. The app always crashes when a UI element with the glassEffect(_:in:) modifier is rendered. This only happens on a device running the iOS 26 public beta; I know this for certain because I connected that device to Xcode and ran the app on it. When I comment out the glassEffect modifier, the app doesn't crash.
Is it possible to check for particular releases with #available? If not, how should something like this be handled? Also, how do I handle such OS-level errors without the app crashing? Thanks.
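On the #available question: availability checks do accept minor versions, so specific point releases can be gated. A sketch (the 26.1 number is purely an assumption about where a fix might land):

```swift
if #available(iOS 26.1, *) {
    // apply .glassEffect on releases at or above 26.1
} else {
    // fall back to a plain background on earlier releases
}
```

Note that #available is a "this version or later" check, so it cannot exclude a single broken release in the middle of a supported range; for that, a flag keyed off ProcessInfo.processInfo.operatingSystemVersion may be needed.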
The same code runs fine on iOS 26.0, but on iOS 26.1 there's a delay before the Button with role .confirm is shown properly and tinted.
Shown in the screen recording here ->
https://imgur.com/a/uALuW50
This is my code, which shows a slightly different button on iOS 18 vs iOS 26.
var body: some View {
if #available(iOS 26.0, *) {
Button("Save", systemImage: "checkmark", role: .confirm) {
action()
}.labelStyle(.iconOnly)
} else {
Button("Save") {
action()
}
}
}
Topic:
UI Frameworks
SubTopic:
SwiftUI
I’m trying to figure out how to extend PaperKit beyond a single fixed-size canvas.
From what I understand, calling PaperMarkup(bounds:) creates one finite drawing region, and so far I have not figured out a reliable way to create multi-page or infinite canvases.
Are any of these correct?
Creating multiple PaperMarkup instances, each managed by its own PaperMarkupViewController, and arranging them in a ScrollView or similar paged container to represent multiple pages?
Overlaying multiple PaperMarkup instances on top of PDFKit pages for paged annotation workflows?
Or possibly another approach that works better with PaperKit’s design?
I mean it has to be possible, right? Apple's native Preview app almost certainly uses it, and there are so many other notes apps that get this behavior working correctly, even if it requires using a legacy thing other than PaperKit.
Curious if others have been able to find the right pattern for going beyond a single canvas.
Such a simple piece of code:
import SwiftUI
import WebKit
struct ContentView: View {
var body: some View {
WebView(url: URL(string: "https://www.apple.com"))
}
}
When I run this, the web content shows under the top notch’s safe area, and buttons inside that region aren’t tappable. I tried a bunch of things and the only “fix” that seems to work is .padding(.top, 1), but that leaves a noticeable white strip in non-portrait orientations.
What’s the proper way to solve this? Safari handles the safe area correctly and doesn’t render content there.
Hi there! I'm having an issue with my main window: there's a big empty space at the top with no logical explanation (at least for my poor knowledge).
Using the code below I'm getting this window layout:
Does anybody have any guidance on how to get rid of that extra space at the top?
Thanks a lot!
import SwiftUI
import SwiftData
#if os(macOS)
import AppKit
#endif
// Helper to access and control NSWindow for size/position persistence
#if os(macOS)
struct WindowAccessor: NSViewRepresentable {
let onWindow: (NSWindow) -> Void
func makeNSView(context: Context) -> NSView {
let view = NSView()
DispatchQueue.main.async {
if let window = view.window {
onWindow(window)
}
}
return view
}
func updateNSView(_ nsView: NSView, context: Context) {
DispatchQueue.main.async {
if let window = nsView.window {
onWindow(window)
}
}
}
}
#endif
@main
struct KaraoPartyApp: App {
    @StateObject private var songsModel = SongsModel()
    @Environment(\.openWindow) private var openWindow

    var body: some Scene {
        Group {
            WindowGroup {
                #if os(macOS)
                WindowAccessor { window in
                    window.minSize = NSSize(width: 900, height: 700)
                    // Configure window to eliminate title bar space
                    window.titleVisibility = .hidden
                    window.titlebarAppearsTransparent = true
                    window.styleMask.insert(.fullSizeContentView)
                }
                #endif
                ContentView()
                    .environmentObject(songsModel)
            }
            .windowToolbarStyle(.unifiedCompact)
            .windowResizability(.contentSize)
            .defaultSize(width: 1200, height: 900)
            .windowStyle(.titleBar)
            #if os(macOS)
            .windowToolbarStyle(.unified)
            #endif

            WindowGroup("CDG Viewer", id: "cdg-viewer", for: CDGWindowParams.self) { $params in
                if let params = params {
                    ZStack {
                        #if os(macOS)
                        WindowAccessor { window in
                            window.minSize = NSSize(width: 600, height: 400)
                            // Restore window frame if available
                            let key = "cdgWindowFrame"
                            let defaults = UserDefaults.standard
                            if let frameString = defaults.string(forKey: key) {
                                let frame = NSRectFromString(frameString)
                                if window.frame != frame {
                                    window.setFrame(frame, display: true)
                                }
                            } else {
                                // Open CDG window offset from main window
                                if let mainWindow = NSApp.windows.first {
                                    let mainFrame = mainWindow.frame
                                    let offsetFrame = NSRect(x: mainFrame.origin.x + 60, y: mainFrame.origin.y - 60, width: 800, height: 600)
                                    window.setFrame(offsetFrame, display: true)
                                }
                            }
                            // Observe frame changes and save
                            NotificationCenter.default.addObserver(forName: NSWindow.didMoveNotification, object: window, queue: .main) { _ in
                                let frameStr = NSStringFromRect(window.frame)
                                defaults.set(frameStr, forKey: key)
                            }
                            NotificationCenter.default.addObserver(forName: NSWindow.didEndLiveResizeNotification, object: window, queue: .main) { _ in
                                let frameStr = NSStringFromRect(window.frame)
                                defaults.set(frameStr, forKey: key)
                            }
                        }
                        #endif
                        CDGView(
                            cancion: Cancion(
                                title: params.title ?? "",
                                artist: params.artist ?? "",
                                album: "",
                                genre: "",
                                year: "",
                                bpm: "",
                                playCount: 0,
                                folderPath: params.cdgURL.deletingLastPathComponent().path,
                                trackName: params.cdgURL.deletingPathExtension().lastPathComponent + ".mp3"
                            ),
                            backgroundType: params.backgroundType,
                            videoURL: params.videoURL,
                            cdfContent: params.cdfContent.flatMap { String(data: $0, encoding: .utf8) },
                            artist: params.artist,
                            title: params.title
                        )
                    }
                } else {
                    Text("No se pudo abrir el archivo CDG.")
                }
            }
            .windowResizability(.contentSize)
            .defaultSize(width: 800, height: 600)

            WindowGroup("Metadata Editor", id: "metadata-editor") {
                MetadataEditorView()
                    .environmentObject(songsModel)
            }
            .windowResizability(.contentSize)
            .defaultSize(width: 400, height: 400)

            WindowGroup("Canciones DB", id: "canciones-db") {
                CancionesDBView()
            }
            .windowResizability(.contentSize)
            .defaultSize(width: 800, height: 500)

            WindowGroup("Importar canciones desde carpeta", id: "folder-song-importer") {
                FolderSongImporterView()
            }
            .windowResizability(.contentSize)
            .defaultSize(width: 500, height: 350)
        }
        .modelContainer(for: Cancion.self)
        // Add menu command under Edit
        .commands {
            CommandGroup(replacing: .pasteboard) { }
            CommandMenu("Edit") {
                Button("Actualizar Metadatos") {
                    openWindow(id: "metadata-editor")
                }
                .keyboardShortcut(",", modifiers: [.command, .shift])
            }
            CommandMenu("Base de Datos") {
                Button("Ver Base de Datos de Canciones") {
                    openWindow(id: "canciones-db")
                }
                .keyboardShortcut("D", modifiers: [.command, .shift])
            }
        }
    }

    init() {
        print("\n==============================")
        print("[KaraoParty] Nueva ejecución iniciada: \(Date())")
        print("==============================\n")
    }
}
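For context, one thing I suspect (purely a guess on my part) is that I'm mixing two window-styling APIs: the scene declares `.windowStyle(.titleBar)` while `WindowAccessor` later hides the title bar through AppKit, and those may be fighting. A variant I've been meaning to try, assuming `.hiddenTitleBar` is the scene-level counterpart of the AppKit tweaks:

```swift
// Sketch (untested): let SwiftUI hide the title bar at the scene level
// instead of mutating NSWindow after the window appears.
WindowGroup {
    ContentView()
        .environmentObject(songsModel)
}
.windowStyle(.hiddenTitleBar) // replaces the titleVisibility / .fullSizeContentView tweaks
.windowToolbarStyle(.unifiedCompact)
.windowResizability(.contentSize)
.defaultSize(width: 1200, height: 900)
```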
Hi everyone,
I’m building a full-screen Map (MapKit + SwiftUI) with persistent top/bottom chrome (menu buttons on top, session stats + map controls on bottom). I have three working implementations and I’d like guidance on which pattern Apple recommends long-term (gesture correctness, safe areas, Dynamic Island/home indicator, and future compatibility).
Version 1 — overlay(alignment:) on Map
Idea: Draw chrome using .overlay(alignment:) directly on the map and manage padding manually.
Map(position: $viewModel.previewMapCameraPosition, scope: mapScope) {
    UserAnnotation {
        UserLocationCourseMarkerView(angle: viewModel.userCourse - mapHeading)
    }
}
.mapStyle(viewModel.mapType.mapStyle)
.mapControls {
    MapUserLocationButton().mapControlVisibility(.hidden)
    MapCompass().mapControlVisibility(.hidden)
    MapPitchToggle().mapControlVisibility(.hidden)
    MapScaleView().mapControlVisibility(.hidden)
}
.overlay(alignment: .top) { mapMenu }         // manual padding inside
.overlay(alignment: .bottom) { bottomChrome } // manual padding inside
Version 2 — ZStack + .safeAreaPadding
Idea: Place the map at the back, then lay out top/bottom chrome in a VStack inside a ZStack, and use .safeAreaPadding(.all) so content respects safe areas.
ZStack(alignment: .top) {
    Map(...).ignoresSafeArea()
    VStack {
        mapMenu
        Spacer()
        bottomChrome
    }
    .safeAreaPadding(.all)
}
Version 3 — .safeAreaInset on the Map
Idea: Make the map full-bleed, then reserve top/bottom space with safeAreaInset and let SwiftUI manage the insets.
Map(...).ignoresSafeArea()
    .mapStyle(viewModel.mapType.mapStyle)
    .mapControls {
        MapUserLocationButton().mapControlVisibility(.hidden)
        MapCompass().mapControlVisibility(.hidden)
        MapPitchToggle().mapControlVisibility(.hidden)
        MapScaleView().mapControlVisibility(.hidden)
    }
    .safeAreaInset(edge: .top) { mapMenu }         // manual padding inside
    .safeAreaInset(edge: .bottom) { bottomChrome } // manual padding inside
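One variable I haven't isolated here is modifier order: in this version `.ignoresSafeArea()` is applied to the map before `.safeAreaInset`, so (if I understand the semantics correctly) the map extends under the inset content and the chrome still needs its own padding. A reordering I plan to test, just a sketch using the same placeholder views as above:

```swift
// Sketch (untested): add the insets first, then extend only the map's
// horizontal edges, so the top/bottom insets keep their effect.
Map(...)
    .safeAreaInset(edge: .top) { mapMenu }
    .safeAreaInset(edge: .bottom) { bottomChrome }
    .ignoresSafeArea(.container, edges: .horizontal)
```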
Questions
I noticed:
Safe-area / padding behavior
– Version 2 requires the least extra padding and seems to add a small (though only partial) amount of safe-area spacing automatically.
– Version 3 still needs roughly the same manual padding as Version 1, even though it uses safeAreaInset. Why doesn't safeAreaInset fully handle that spacing?
Rotation crash (Metal)
When using Version 3 (safeAreaInset + ignoresSafeArea), rotating the device portrait↔landscape several times triggers a Metal crash:
failed assertion 'The following Metal object is being destroyed while still required… CAMetalLayer Display Drawable'
The same crash can happen with Version 1, though less often. I haven’t tested it much with Version 2.
Is this a known issue, or a race condition between the Map's internal Metal rendering and view-layout changes?
Expected behavior
What’s the intended or supported interaction between safeAreaInset, safeAreaPadding, and overlay when embedding persistent chrome inside a SwiftUI Map?
Should safeAreaInset normally remove the need for manual padding, or is that by design?
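For anyone who wants to poke at the rotation crash or the inset behavior without my view model, here's a stripped-down stand-in for Version 3 (placeholder chrome only; assumes iOS 17+ for MapCameraPosition):

```swift
import SwiftUI
import MapKit

// Minimal stand-in for Version 3: full-bleed map plus top/bottom
// chrome reserved via safeAreaInset.
struct MinimalMapChromeView: View {
    @State private var position: MapCameraPosition = .automatic

    var body: some View {
        Map(position: $position)
            .ignoresSafeArea()
            .safeAreaInset(edge: .top) {
                Text("Top chrome")
                    .padding()
                    .frame(maxWidth: .infinity)
                    .background(.thinMaterial)
            }
            .safeAreaInset(edge: .bottom) {
                Text("Bottom chrome")
                    .padding()
                    .frame(maxWidth: .infinity)
                    .background(.thinMaterial)
            }
    }
}
```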