import SwiftUI

struct ContentView: View {
    @State private var isPresented = false

    var body: some View {
        Button {
            isPresented.toggle()
        } label: {
            Text("Button")
        }
        .sheet(isPresented: $isPresented) {
            SubView()
        }
    }
}

struct SubView: View {
    @State private var text = ""

    var body: some View {
        NavigationStack {
            TextEditor(text: $text)
                .toolbar {
                    ToolbarItemGroup(placement: .bottomBar) {
                        Button("Click") {
                        }
                    }
                    ToolbarItemGroup(placement: .keyboard) {
                        Button("Click") {
                        }
                    }
                }
        }
    }
}
[Also submitted as FB19313064]
The .disabled() modifier doesn't visually disable buttons inside a ToolbarItem container on iOS 26.0 (23A5297i) devices. The button looks enabled, but tapping it doesn't trigger the action.
When the deployment target is lowered to iOS 18 and the app is deployed to an iOS 18 device, it works correctly. It still fails on an iOS 26 device, even with an iOS 18-targeted build.
This occurs in both the Simulator and on a physical device.
Code
import SwiftUI

struct ContentView: View {
    @State private var isButtonDisabled = false

    private var osTitle: String {
        let version = ProcessInfo.processInfo.operatingSystemVersion
        return "iOS \(version.majorVersion)"
    }

    var body: some View {
        NavigationStack {
            VStack {
                Button("Body Button") {
                    print("Body button tapped")
                }
                .buttonStyle(.borderedProminent)
                .disabled(isButtonDisabled)

                Toggle("Disable buttons", isOn: $isButtonDisabled)

                Spacer()
            }
            .padding()
            .navigationTitle("Device: \(osTitle)")
            .navigationBarTitleDisplayMode(.large)
            .toolbar {
                ToolbarItem {
                    Button("Toolbar") {
                        print("Toolbar button tapped")
                    }
                    .buttonStyle(.borderedProminent)
                    .disabled(isButtonDisabled)
                }
            }
        }
    }
}
I'm not seeing Liquid Glass on any standard components. A month ago, around July 17th, I ran our app and saw Liquid Glass on our tab view and various other standard components. Those components have not been changed, and yet I'm no longer seeing Liquid Glass in our app at all.
Components that were previously Liquid Glass but now are not include TabView and back navigation buttons.
I set the UIDesignRequiresCompatibility key explicitly to false, but no luck. I was seeing this in Beta 7 and Beta 8, both on a real device and in the Simulator.
When I show or hide the content in .safeAreaBar(edge: .bottom), especially content with a large height, the background animation of the toolbar is very laggy.
iOS 26 RC
Feedback ID - FB19768797
import SwiftUI

struct ContentView: View {
    @State private var isShown: Bool = false

    var body: some View {
        NavigationStack {
            Button("Toggle") {
                withAnimation {
                    isShown.toggle()
                }
            }
            ScrollView(.vertical) {
                ForEach(0..<100) { index in
                    Text("\(index)")
                        .padding()
                        .border(.blue)
                        .background(.blue)
                        .frame(maxWidth: .infinity)
                }
            }
            .scrollEdgeEffectStyle(.soft, for: .bottom)
            .safeAreaBar(edge: .bottom) {
                if isShown {
                    Text("Safe area bar")
                        .padding(64)
                        .background(.red)
                }
            }
        }
    }
}

#Preview {
    ContentView()
}
I have checked the sample project from the documentation page and noticed an issue: the image (or images) passed as preselected are not actually preselected in the picker. The issue happens on iOS 26.1 and above (I also checked the iOS 26.2 beta).
I couldn't find any change to the PhotoPicker in the documentation.
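As I understand it, the preselection setup being tested is roughly the following (a sketch; the asset identifiers and function name are placeholders):

import PhotosUI
import UIKit

func presentPicker(from presenter: UIViewController, preselectedIdentifiers: [String]) {
    // Preselection requires the photo-library-based configuration.
    var configuration = PHPickerConfiguration(photoLibrary: .shared())
    configuration.selectionLimit = 0          // unlimited
    configuration.selection = .ordered
    // On iOS 26.1+ these identifiers no longer appear preselected in the picker.
    configuration.preselectedAssetIdentifiers = preselectedIdentifiers
    let picker = PHPickerViewController(configuration: configuration)
    presenter.present(picker, animated: true)
}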
I have a UIKit app with a UIHostingController embedded as a child controller.
In this UIHostingController there's a SwiftUI view which expands and collapses with an animation to show/hide content within it.
The hosting controller uses .intrinsicContentSize sizing option.
This all works fine, and the expand/collapse animation looks good as far as the SwiftUI view is concerned, in a preview for example.
But when running the app, the hosting controller doesn't animate its view's size alongside the SwiftUI view animating its size.
Instead, the hosting controller jumps between the correct start and end sizes without any animation.
So although it technically has the right size once the SwiftUI view is expanded or collapsed, this is not a nice UX: the hosting controller jumps immediately from its small size to its larger one on expanding, and vice versa on collapsing, while the SwiftUI content animates nicely.
I'm assuming there's somewhere I should be calling (in some form or another) something like:

UIView.animate(withDuration: 0.4) {
    hostingController.view.layoutIfNeeded()
}

alongside the SwiftUI animations, but I can't see any hook between my SwiftUI view's .animation(_:value:) modifier and somewhere the hosting controller could jump in alongside that animation and animate its view's frame.
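To make it concrete, this is the kind of thing I have in mind: the SwiftUI view calls back into UIKit right before it changes size, and the parent wraps layoutIfNeeded() in a UIView animation. A sketch only; ExpandableView and its onSizeWillChange callback are hypothetical names:

import SwiftUI
import UIKit

final class ContainerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // ExpandableView is the SwiftUI view that expands/collapses;
        // onSizeWillChange is a callback I would add to it myself.
        var rootView = ExpandableView()
        rootView.onSizeWillChange = { [weak self] in
            // Animate the hosting view's frame in step with the SwiftUI change.
            UIView.animate(withDuration: 0.4) {
                self?.view.layoutIfNeeded()
            }
        }

        let host = UIHostingController(rootView: rootView)
        host.sizingOptions = .intrinsicContentSize

        addChild(host)
        view.addSubview(host.view)
        host.view.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            host.view.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            host.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            host.view.trailingAnchor.constraint(equalTo: view.trailingAnchor)
        ])
        host.didMove(toParent: self)
    }
}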
I am using AlarmKit to schedule alarms in an app I am working on; however, my scheduled alarms only show up on the Lock Screen. If I am on the Home Screen or elsewhere, I only hear the sound of the alarm, but no UI shows up.
Environment:
iOS 26 beta 3
Xcode 26 beta 3
I was wondering what the recommended way is to persist user settings with SwiftData.
It seems the SwiftData API is focused on querying for multiple objects, but what if you just want one UserSettings object that is persisted across devices, for example to store the user's age or sorting preferences?
Do we just create one object and then query for it or is there a better way of doing this?
Right now I am just creating:
import SwiftData

@Model
final class UserSettings {
    var age: Int = 0
    var sortAtoZ: Bool = true

    init(age: Int = 0, sortAtoZ: Bool = true) {
        self.age = age
        self.sortAtoZ = sortAtoZ
    }
}
In my view I am doing as follows:
import SwiftUI
import SwiftData

struct SettingsView: View {
    @Environment(\.modelContext) var context
    @Query var settings: [UserSettings]

    var body: some View {
        ForEach(settings) { setting in
            let bSetting = Bindable(setting)
            Toggle("Sort A-Z", isOn: bSetting.sortAtoZ)
            TextField("Age", value: bSetting.age, format: .number)
        }
        .onAppear {
            if settings.isEmpty {
                context.insert(UserSettings(age: 0, sortAtoZ: true))
            }
        }
    }
}
Unfortunately, there are two issues with this approach:
1. I am having to fetch multiple items when I only ever want one.
2. Sometimes when running on a new device, it will create a second UserSettings while it is waiting for the original one to sync from CloudKit.
AppStorage is not an option here as I am looking to persist for the user across devices and use CloudKit syncing.
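One pattern I've been considering is a fetch-or-create helper, so the view never has to hold an array. Just a sketch; it doesn't solve the CloudKit duplicate problem on its own:

import SwiftData

extension UserSettings {
    /// Sketch of a fetch-or-create helper. Note: CloudKit may still deliver a
    /// duplicate record later, which would need separate deduplication.
    static func fetchOrCreate(in context: ModelContext) throws -> UserSettings {
        var descriptor = FetchDescriptor<UserSettings>()
        descriptor.fetchLimit = 1
        if let existing = try context.fetch(descriptor).first {
            return existing
        }
        let settings = UserSettings()
        context.insert(settings)
        return settings
    }
}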
Since macOS 26 Beta 1, I've noticed that the window reopening behavior has changed.
Say there are two desktops (spaces), one might:
open an app window in desktop 1
close that window
switch to desktop 2
reopen the app window (by click on dock tile, spotlight search...)
Prior to macOS 26, that window would always reopen in the current desktop. This is, IMO, the right behavior, because these windows are most likely transient (a message app, chat app, utility app, or note app).
In macOS 26, however, the system switches to desktop 1 (where the window was closed) and reopens the window in desktop 1.
This is weird to me because:
The window was "closed", so it should no longer be attached to desktop 1, unlike a minimized window.
Switching desktops interrupts the user's current workflow. It's annoying to switch back, especially when there are many desktops.
The behavior is inconsistent: some apps reopen in the current desktop, some in the previous one. Music, Notes, and Calendar reopened in the previous desktop, while Mail, Messages, and Freeform reopened in the current desktop.
I did a bit of experimenting and found that the apps that reopen in the current desktop most likely do so because they take an extra step to release the window when it's closed.
I believe this is a bug, so I filed feedback (FB18016497) back in beta 1, but I haven't received any response or seen similar reports from others, to the point that I wonder whether this is intended.
I can easily force my app to reopen in the current desktop by nilling out my window controller in windowWillClose, but this behavior essentially changes how one can use the Spaces feature, so I thought I should bring it up with the community and see what other developers or engineers think about it.
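For reference, the workaround I mentioned looks roughly like this (a sketch; the class and nib names are made up, the point is just to release the window controller in windowWillClose):

import AppKit

final class MainWindowManager: NSObject, NSWindowDelegate {
    private var windowController: NSWindowController?

    func showWindow() {
        if windowController == nil {
            let controller = NSWindowController(windowNibName: "MainWindow")
            controller.window?.delegate = self
            windowController = controller
        }
        windowController?.showWindow(nil)
    }

    func windowWillClose(_ notification: Notification) {
        // Releasing the controller here makes the next "reopen" create a fresh
        // window, which (in my testing) comes up in the current desktop.
        windowController = nil
    }
}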
It appears that hidesBottomBarWhenPushed no longer works in iOS 26 Beta 1.
Is it supposed to work, is it going away, or is there an alternate behavior we should be using?
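For clarity, the usage in question is the standard pattern, roughly:

// Hide the tab bar while a pushed detail controller is on screen.
let detail = DetailViewController()   // hypothetical view controller
detail.hidesBottomBarWhenPushed = true
navigationController?.pushViewController(detail, animated: true)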
Issue Summary:
On iOS 26.0.1 to 26.3, apps using multiple UITextFields for OTP input face a critical issue where the system autofill pastes the entire OTP string into a single text field, usually the focused one, rather than splitting digits across fields. Delegate events like textDidChange: do not trigger consistently on autofill, breaking existing input handling logic.
Expected Behavior:
OTP autofill should distribute each digit correctly across all OTP UITextFields.
Delegate or control events should fire on autofill to enable manual handling.
- (BOOL)textField:(UITextField *)textField
    shouldChangeCharactersInRange:(NSRange)range
    replacementString:(NSString *)string {
    if (string.length > 1) {
        // Autofill detected - distribute the OTP manually across the fields.
        for (NSUInteger i = 0; i < string.length && i < self.arrayOTPText.count; i++) {
            UITextField *field = self.arrayOTPText[i];
            field.text = [NSString stringWithFormat:@"%C", [string characterAtIndex:i]];
        }
        // Focus the last field that actually received a digit.
        UITextField *lastField = self.arrayOTPText[MIN(string.length, self.arrayOTPText.count) - 1];
        [lastField becomeFirstResponder];
        return NO;
    }
    // Handle normal single-character or deletion input here.
    return YES;
}

// Setup of the UITextFields - set .oneTimeCode on the first field only.
for (NSUInteger i = 0; i < self.arrayOTPText.count; i++) {
    UITextField *field = self.arrayOTPText[i];
    field.delegate = self;
    if (@available(iOS 12.0, *)) {
        field.textContentType = (i == 0) ? UITextContentTypeOneTimeCode : UITextContentTypeNone;
    }
}
What We’ve Tried:
Setting textContentType properly.
Handling OTP distribution in delegate method.
Verifying settings and keyboard use.
Testing on multiple iOS 26.x versions.
Impact:
Major usability degradation during OTP entry.
Forces fragile workarounds.
Inconsistent autofill reduces user confidence.
Request:
We request that Apple fix OTP autofill to natively support multi-field UITextField OTP input, or provide enhanced delegate callbacks for consistent behavior.
Has anyone else faced this issue recently with iOS 26.0.1 through the 26.3 beta?
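In the meantime, one possible workaround (a sketch only, shown in Swift, with hypothetical names): use a single effectively invisible text field with .oneTimeCode to receive the autofilled code, and have the visible boxes only display its contents.

final class OTPViewController: UIViewController {
    private let codeField = UITextField()
    private var digitLabels: [UILabel] = []   // the visible "boxes", laid out elsewhere

    override func viewDidLoad() {
        super.viewDidLoad()
        codeField.textContentType = .oneTimeCode
        codeField.keyboardType = .numberPad
        // Keep the field in the hierarchy but visually invisible, so it can
        // still become first responder and receive the autofill.
        codeField.textColor = .clear
        codeField.tintColor = .clear
        codeField.addTarget(self, action: #selector(codeChanged), for: .editingChanged)
        view.addSubview(codeField)
    }

    @objc private func codeChanged() {
        let digits = Array(codeField.text ?? "")
        for (index, label) in digitLabels.enumerated() {
            label.text = index < digits.count ? String(digits[index]) : ""
        }
    }
}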
On testing my app with tvOS 18, I have noticed the Siri Remote back button no longer provides the system-provided behavior when interacting with tab bar controller pages. Instead of moving focus back to the tab bar when pressed, the back button closes the app, as if the Home button had been pressed. This occurs both on device and in the Simulator.
Create tvOS project with a tab bar controller.
Create pages/tabs which contain focusable items (ie. buttons)
Scroll down to any focusable item (ie. a button or UICollectionView cell)
Hit the Siri Remote back button. See the expected behavior below:
Expected behavior: System-provided behavior should move focus back to the tab bar at the top of the screen.
Actual results: App is closed and user is taken back to the Home Screen.
Has anyone else noticed this behavior?
My app doesn't respond on iPhone Air running iOS 26.1.
After startup, my app shows the main view with a tab bar controller containing 4 navigation controllers. However, when a second-level view controller is pushed onto any of the navigation controllers, the UI freezes and becomes unresponsive. The iPhone simulator running iOS 26.1 exhibits the same problem.
The debug profile shows CPU usage at 100%.
However, other devices and simulators do not have this problem.
I'm developing a rhythm game for iOS which has four buttons spanning the width of the screen in portrait. I noticed that my testers were having some missed inputs on the buttons on the left and right due to the fact that iOS, by default, tries to ignore accidental touches on the edges of the screen. So I enabled "Defer System Gestures" on the left and right edges, but then quickly started to notice a new, very specific, issue.
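For context, the deferral itself is just the standard UIKit override; a minimal sketch of what I mean by enabling it on the left and right edges:

import UIKit

final class GameViewController: UIViewController {
    // Defer system gestures on the left and right screen edges.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        [.left, .right]
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Ask the system to re-query the override above.
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
    }
}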
Description of the issue
If you have finger #1 touching and holding anywhere in the middle of the screen, and finger #2 touches on the far right or left edge of the screen just below the horizontal position of finger #1, those touches are inconsistently not recognized. If finger #1 is not present, this issue does not occur. If finger #2 is above or well below finger #1, this issue also does not occur. A dead zone is created on the right and left edges of the screen just below the horizontal position of the first touch.
Here is a rough representative example of where touches #1 and #2 need to be for this issue to manifest, in case the text above is not clear.
|                |
|                |
|                |
|                |
|    1           |
|               2|
|                |
It just so happens that this issue is causing major usability problems with my game, as it results in what the user sees as sporadic and inconsistent response when the game calls for two notes to be played at the same time.
Steps to recreate the issue
Here are the steps if you want to recreate the problem yourself using the "Create New Gesture" pane in "Assistive Touch" (Note that this problem is not specific to the Settings app, but rather is an issue across the system—however this panel defers system gestures and shows where touches are being read, so it is a great place to demonstrate):
(1) Go to Settings > Accessibility > Touch > Assistive Touch > Create New Gesture...;
(2) With one finger, touch the middle of the screen and hold it through step 3;
(3) With a second finger, tap 4 times along the right (or left) edge of the screen in the following places:
(a) well above the vertical position of the first touch,
(b) just above the vertical position of the first touch,
(c) just below the vertical position of the first touch, and
(d) well below the vertical position of the first touch;
(4) Notice how, more than half the time, touch (c) does not register. I have found that this problem is more reproducible when the first touch is on the lower half of the screen, but I have been able to replicate it when the first finger is higher as well, just not as consistently.
Here are the four positions described in the steps above:
Position a: both touches register
|                |
|                |
|                |
|               2|
|    1           |
|                |
|                |
Position b: both touches usually register
|                |
|                |
|                |
|                |
|    1          2|
|                |
|                |
Position c: only touch 1 registers
|                |
|                |
|                |
|                |
|    1           |
|               2|
|                |
Position d: both touches register
|                |
|                |
|                |
|                |
|    1           |
|                |
|               2|
Is there anything I can do to resolve this behavior? My app requires gesture deferral to be on for the expected user experience, and this bug is causing other issues for my testers that really need to be resolved before I can confidently release the game.
An app built with Xcode 26 (beta 4) presents various UIViewControllers. Presentation of any UIViewController that contains a UIToolbar leads to a UIKit crash when run on an iOS 18.5 device; it does not crash when run on iOS 26.
The exception logged:
*** Terminating app due to uncaught exception 'NSInvalidUnarchiveOperationException', reason: 'Could not instantiate class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ because no class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ was found; the class needs to be defined in source code or linked in from a library (ensure the class is part of the correct target)'
Anyone else seen this?
I've submitted a bug report via Feedback Assistant, including a minimal Xcode workspace that reproduces the crash, hoping this will get some attention.
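In case it helps anyone hitting the same thing, a workaround worth trying (just a sketch, and an untested assumption on my part) is to drop the UIToolbar from the affected storyboard scenes and build it in code, so nothing tries to unarchive the private toolbar visual-provider class on iOS 18:

import UIKit

final class DetailViewController: UIViewController {
    // Hypothetical example: build the toolbar in code instead of archiving
    // it in the storyboard.
    private let toolbar = UIToolbar()

    override func viewDidLoad() {
        super.viewDidLoad()
        toolbar.items = [
            UIBarButtonItem(barButtonSystemItem: .add, target: self, action: #selector(addTapped))
        ]
        toolbar.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(toolbar)
        NSLayoutConstraint.activate([
            toolbar.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            toolbar.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            toolbar.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor)
        ])
    }

    @objc private func addTapped() { }
}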
Hi,
I’m seeing a crash when running my app on iOS 18 simulators or devices using Xcode 26 beta 3.
My app’s minimum deployment target is iOS 17, and the crash does not happen when running from Xcode 16.4.
The crash occurs specifically at this line:
return UIStoryboard(name: storyboard.rawValue, bundle: nil)
.instantiateViewController(withIdentifier: view.rawValue)
Crash Details:
*** Terminating app due to uncaught exception 'NSInvalidUnarchiveOperationException', reason: 'Could not instantiate class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ because no class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ was found; the class needs to be defined in source code or linked in from a library (ensure the class is part of the correct target)'
*** First throw call stack:
(0x191c3321c 0x18f0cdabc 0x191c91ea0 0x19d740774 0x19d740a18 0x19d740cac 0x194626680 0x194dbc784 0x19d740890 0x19d740cac 0x1949aadd8 0x19d740890 0x19d740a18 0x19d740cac 0x194802e24 0x1945f008c 0x194ed1808 0x107a8bfa0 0x107a8c05c 0x1945ec128 0x19d740890 0x19d740cac 0x1945eba60 0x19d740890 0x19d740a18 0x19d740cac 0x1945f07dc 0x1945eaea4 0x19492ee80 0x10763de00 0x1076e56fc 0x1076e5674 0x1076e5e04 0x19496108c 0x194f9b9a0 0x1949072c4 0x194f998cc 0x194f9af04 0x19445139c 0x19445ac28 0x194467508 0x1079afaec 0x1079aff5c 0x1944189a0 0x194417be4 0x1944114e4 0x194411404 0x194410ab4 0x19440c1e4 0x191b28a8c 0x191b288a4 0x191b28700 0x191b29080 0x191b2ac3c 0x1ded09454 0x19453d274 0x194508a28 0x1073564f4 0x1b89fff08)
terminating due to uncaught exception of type NSException
The crash occurs immediately at app launch, when attempting to load a storyboard-based UITabBarController.
Works as expected on:
Xcode 16.4 + iOS 18 (simulator/device)
Xcode 26 beta 3 + iOS 26 (simulator/device)
Running from Xcode 26 beta 3 onto iOS 18 simulators or devices immediately crashes when loading that particular storyboard.
Setup:
Xcode: 26 beta 3
macOS: 15.5
iOS Simulators: iOS 18.5
Minimum deployment target: iOS 17
UIKit-based app (not using SwiftUI)
No custom toolbars or host views in use explicitly
Is this a known compatibility issue when building with the iOS 26 SDK and running on iOS 18?
Are there any workarounds or recommendations for running apps targeting iOS 17+ on iOS 18 simulators when using Xcode 26?
We're observing new crashes specifically on iOS 18.4 devices with this pattern:
Exception Type: SIGTRAP
Exception Codes: fault addr: 0x000000019bc0f088
Crashed Thread: 0
Thread 0
0 libsystem_malloc.dylib _xzm_xzone_malloc_from_tiny_chunk.cold.1 + 36
1 libsystem_malloc.dylib __xzm_xzone_malloc_from_tiny_chunk + 612
2 libsystem_malloc.dylib __xzm_xzone_find_and_malloc_from_tiny_chunk + 112
3 libsystem_malloc.dylib __xzm_xzone_malloc_tiny_outlined + 312
4 CoreGraphics CG::Path::Path(CG::Path const&) + 132
5 CoreGraphics _CGPathCreateMutableCopyByTransformingPath + 112
6 CoreGraphics _CGFontCreateGlyphPath + 144
7 CoreGraphics _CGGlyphBuilderLockBitmaps + 1112
8 CoreGraphics _render_glyphs + 292
9 CoreGraphics _draw_glyph_bitmaps + 1116
10 CoreGraphics _ripc_DrawGlyphs + 1464
11 CoreGraphics CG::DisplayList::executeEntries(std::__1::__wrap_iter<std::__1::shared_ptr<CG::DisplayListEntry const>*>, std::__1::__wrap_iter<std::__1::shared_ptr<CG::DisplayListEntry const>*>, CGContextDelegate*, CGRenderingState*, CGGStack*, CGRect const*, __CFDictionary const*, bool) + 1328
12 CoreGraphics _CGDisplayListDrawInContextDelegate + 340
13 QuartzCore _CABackingStoreUpdate_ + 612
14 QuartzCore ____ZN2CA5Layer8display_Ev_block_invoke + 120
15 QuartzCore -[CALayer _display] + 1512
16 QuartzCore CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 420
17 QuartzCore CA::Context::commit_transaction(CA::Transaction*, double, double*) + 476
18 QuartzCore CA::Transaction::commit() + 644
19 UIKitCore ___34-[UIApplication _firstCommitBlock]_block_invoke_2 + 36
20 CoreFoundation ___CFRUNLOOP_IS_CALLING_OUT_TO_A_BLOCK__ + 28
21 CoreFoundation ___CFRunLoopDoBlocks + 352
22 CoreFoundation ___CFRunLoopRun + 868
23 CoreFoundation _CFRunLoopRunSpecific + 572
24 GraphicsServices _GSEventRunModal + 168
25 UIKitCore -[UIApplication _run] + 816
26 UIKitCore _UIApplicationMain + 336
27 app _main + 132
28 dyld __dyld_process_info_create + 33284
Key Observations:
Crash occurs during font glyph path creation (CGFontCreateGlyphPath)
Involves memory allocation in malloc's xzone implementation
100% reproducible on iOS 18.4, not seen in prior OS versions
Occurs during standard CALayer rendering operations
Not tied to any specific font family or glyph content
Questions for Apple:
Is this crash signature recognized as a known issue in iOS 18.4's CoreGraphics?
Could changes to xzone memory management in iOS 18.4 interact poorly with font rendering?
Are there specific conditions that might trigger SIGTRAP in CGPathCreateMutableCopyByTransformingPath?
Any recommended mitigations for text rendering while awaiting system updates?
When testing with the iOS 18.4 Beta on iPhones that support the Dynamic Island, after a Face ID authentication, the time before the AppDelegate method applicationDidBecomeActive(_:) is called is longer than on iPhones that do not support the Dynamic Island. It takes about double the time: 1.2 seconds vs 2.5 seconds on average. This does not occur with versions before the 18.4 Beta.
Anyone else seeing this?
I have been playing around with the new AsyncImage API in SwiftUI.
I am using the initialiser that passes a closure with the AsyncImagePhase so I can see why an image may not load. When I looked at the error that is passed in when the phase is .failure, its localised description is "Cancelled", but this is happening before the view is even displayed.
I am loading these images in a list, so I imagine I am probably doing something which is causing the system to decide to cancel the loading, but I cannot see what.
Are there any tips to investigate this further?
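For reference, my setup is roughly this (a simplified sketch; the URL list is a placeholder):

import SwiftUI

struct ImageListView: View {
    let urls: [URL]   // placeholder data source

    var body: some View {
        List(urls, id: \.self) { url in
            AsyncImage(url: url) { phase in
                switch phase {
                case .empty:
                    ProgressView()
                case .success(let image):
                    image.resizable().scaledToFit()
                case .failure(let error):
                    // This is where I see "Cancelled" as the localised description.
                    Text(error.localizedDescription)
                @unknown default:
                    EmptyView()
                }
            }
            .frame(height: 120)
        }
    }
}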
I'm noticing that Monterey defaults to the NSWindowToolbarStyleAutomatic / NSWindowToolbarStyleUnified toolbar style, which suppresses the "Use Small Size" menu item and customization checkbox.
So I've set the window to use NSWindowToolbarStyleExpanded. However, the toolbar will no longer change to a smaller icon size, as it did in macOS 10.14, 10.15, and 11.0.
I've tried setting the toolbar item sizing to "Automatic" for all of our toolbar icons, but that results in bad positioning in both Regular and Small Size mode: the height is way too big.
The native size of the icon .png files is 128 x 128. What's odd is that if I resize the window with the toolbar to be wider, the NSToolbarItems that were in the overflow area get displayed in the toolbar at 128 x 128, while the rest of the toolbar icons are displayed at 32 x 32.
The only way to get the layout remotely correct is to give each NSToolbarItem an explicit minimum size of 24 x 24 and a maximum size of 32 x 32. And that USED to allow "small size", but on Monterey it no longer does.
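For reference, the setup described above looks roughly like this in code (a simplified Swift sketch; the function name is just for illustration):

import AppKit

func configureToolbar(for window: NSWindow, items: [NSToolbarItem]) {
    // Expanded style, because the unified styles suppress "Use Small Size".
    window.toolbarStyle = .expanded

    for item in items {
        // Explicit sizes keep the 128 x 128 PNGs from blowing up the row height,
        // but they also seem to defeat small-size mode on Monterey.
        // (If I recall correctly, minSize/maxSize are also deprecated around macOS 12.)
        item.minSize = NSSize(width: 24, height: 24)
        item.maxSize = NSSize(width: 32, height: 32)
    }
}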
Anyone had any success with small size icons on Monterey?