Hi all — I wanted to share an idea I recently submitted through Feedback Assistant that I think could improve safety and usability for drivers using CarPlay:
Add an option to overlay live weather radar (rain, snow, storms, etc.) directly onto CarPlay Maps while navigating. Similar to how traffic conditions are shown now, this would allow drivers to visually track incoming weather in real time without switching apps or relying on separate devices.
Why this matters:
• Enhances driver safety by increasing situational awareness
• Helps with trip planning and route adjustments around severe weather
• Reduces distractions by integrating everything into one screen
• Useful for everyday drivers, long-haul travelers, and first responders
I submitted this via Feedback Assistant, but I’d love to know what others think. If you also see value in this feature, consider submitting your own version via Feedback Assistant so Apple sees there’s interest.
Let’s push for smarter, safer navigation — thanks for reading!
Hello. I've made a shape in my app that looks like the "hello" greeting shown on Apple products at startup. Is this considered plagiarism, or is it acceptable to use it in an app?
P.S.: I used a Path for it and drew it with curves.
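A rough sketch of what I mean (the control points here are illustrative, not my actual shape):

import SwiftUI

// A custom Shape built from Bézier curves, stroked like a cursive line.
// The control points are placeholders, not the real "hello" shape.
struct CursiveStroke: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: rect.minX, y: rect.midY))
        path.addCurve(
            to: CGPoint(x: rect.midX, y: rect.midY),
            control1: CGPoint(x: rect.width * 0.25, y: rect.minY),
            control2: CGPoint(x: rect.width * 0.25, y: rect.maxY)
        )
        path.addCurve(
            to: CGPoint(x: rect.maxX, y: rect.midY),
            control1: CGPoint(x: rect.width * 0.75, y: rect.minY),
            control2: CGPoint(x: rect.width * 0.75, y: rect.maxY)
        )
        return path
    }
}

// Usage: CursiveStroke().stroke(lineWidth: 4)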
My newly released app Snapshot-Chess-Move, #1592848671, is not creating a public database of chess moves as I expect. What steps do I need to take in order for my app to use a public database? It appears that each of my devices (iPhone, iPad, and Mac mini) has its own private database of chess moves. When I change my data on the iPad, I expect the new data to appear (with slight delays) on the Mac. I do not know what to do next. Please help me. This was working in Development mode but not in Production when I submitted my app for release.
UPDATE:
The cloud data is copied locally to an @Query variable and updated using the .insert, .delete, and .save commands. So I deleted and re-downloaded my app on each device (iPad, iPhone, and Mac) and obtained the same cloud data. So how do users get the most recent copy of the cloud data? Do they need to delete the app and start over? Is there an .update command that can do this updating for me? Also, I pushed the app out of the background and restarted it to obtain the updated cloud data.
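For context, here is roughly how my container is set up (the model and container identifier are illustrative, not my actual code). As I understand it, SwiftData's built-in mirroring syncs through the user's private CloudKit database, which may explain why no public database appears:

import SwiftData
import SwiftUI

// Illustrative model; not the real Snapshot-Chess-Move schema.
@Model
final class ChessMove {
    var notation: String
    init(notation: String) { self.notation = notation }
}

@main
struct SnapshotChessMoveApp: App {
    let container: ModelContainer = {
        // SwiftData mirroring targets the *private* CloudKit database;
        // the container identifier below is a placeholder.
        let config = ModelConfiguration(
            cloudKitDatabase: .private("iCloud.com.example.SnapshotChessMove")
        )
        return try! ModelContainer(for: ChessMove.self, configurations: config)
    }()

    var body: some Scene {
        WindowGroup {
            Text("Snapshot-Chess-Move")
        }
        .modelContainer(container)
    }
}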
Will the draw (animation) functionality of SF Symbols be available when symbols are cut/pasted into a Keynote presentation? Better yet, will the draw tools from SF Symbols 7 be available in Keynote for objects created in Keynote?
I am creating a shooting game that uses Game Center for multiplayer, and I want to make the matchmaker view look different. How can I do that?
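One approach I've considered is skipping GKMatchmakerViewController entirely and driving matchmaking programmatically behind my own UI. A rough sketch (the player counts are illustrative):

import GameKit

// GKMatchmakerViewController's appearance can't really be restyled, so
// this drives matchmaking directly and leaves the UI up to the app.
func findMatch() {
    let request = GKMatchRequest()
    request.minPlayers = 2
    request.maxPlayers = 4

    GKMatchmaker.shared().findMatch(for: request) { match, error in
        if let error {
            print("Matchmaking failed: \(error.localizedDescription)")
            return
        }
        if let match {
            // Present a custom lobby here and start the game once
            // match.expectedPlayerCount reaches zero.
            print("Matched with \(match.players.count) players")
        }
    }
}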
Hi community, my name is Adam Robles. I have a question. I recently downloaded the beta on my iPad Pro 13-inch, and I noticed the iPad wallpapers are missing; I only get one wallpaper. What happened to the wallpapers that were released with iPad last year?
I was wondering whether errors are common for the code below, which saves SwiftData data, and what the best way to handle them would be (a popup, closing the app)?
do {
    try modelContext.save()
} catch {
    print("error")
}
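One option I'm considering is surfacing the failure in an alert instead of only printing. A rough sketch, with illustrative names like SaveButton and saveErrorMessage:

import SwiftUI
import SwiftData

// Catches a failed save and presents it to the user via an alert.
struct SaveButton: View {
    @Environment(\.modelContext) private var modelContext
    @State private var saveErrorMessage: String?

    var body: some View {
        Button("Save") {
            do {
                try modelContext.save()
            } catch {
                // Keep the message so the UI can present it.
                saveErrorMessage = error.localizedDescription
            }
        }
        .alert("Couldn't Save", isPresented: Binding(
            get: { saveErrorMessage != nil },
            set: { if !$0 { saveErrorMessage = nil } }
        )) {
            Button("OK", role: .cancel) {}
        } message: {
            Text(saveErrorMessage ?? "")
        }
    }
}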
I accidentally updated my iPhone to iOS 18 and I dislike it. The emoji display is too large and the photo gallery is kind of messy. I hope Apple can fix this by bringing back the old emoji display and the gallery settings.
Hi there,
I have developed an offline algorithm for calculating tides, which works from a built-in database of tidal stations in some regions. The algorithm works correctly, and its results match real data.
I would like to retrieve moon data from WeatherKit and add it to the algorithm for performance-improvement experiments. What requirements does my application need to meet so that I can use WeatherKit data in my algorithm?
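For context, the kind of call I have in mind, as a minimal sketch, assuming the WeatherKit capability and its entitlement are enabled for the App ID:

import CoreLocation
import WeatherKit

// Daily forecasts carry moon data (phase, moonrise, moonset)
// via DayWeather.moon.
func moonEvents(for location: CLLocation) async throws -> [MoonEvents] {
    let daily = try await WeatherService.shared.weather(for: location, including: .daily)
    return daily.forecast.map(\.moon)
}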
With the code below, JSON data is parsed and stored in the variable data in the .onAppear function; however, an empty array is passed to DataView. How can that be fixed so that the JSON data reaches DataView?
struct ContentView: View {
    @State var data: [Data]
    @State var index: Int = 0

    var body: some View {
        VStack {
            DataView(data: data[index])
        }
        .onAppear {
            let filePath = Bundle.main.path(forResource: "data", ofType: "json")
            let url = URL(fileURLWithPath: filePath!)
            data = getData(url: url)
        }
    }

    func getData(url: URL) -> [Data] {
        do {
            let data = try Data(contentsOf: url)
            let jsonDecoded = try JSONDecoder().decode([Data].self, from: data)
            return jsonDecoded
        } catch let error as NSError {
            print("Fail: \(error.localizedDescription)")
        } catch {
            print("Fail: \(error)")
        }
        return []
    }
}
https://developer.apple.com/documentation/appstoreconnectapi/devicecreaterequest/data-data.dictionary/attributes-data.dictionary
According to the API documentation above, the platform parameter accepts three values: IOS, MAC_OS, and UNIVERSAL. After debugging, I found that IOS and MAC_OS work normally, but UNIVERSAL produces the error: 'UNIVERSAL' is not a valid value for the attribute 'platform'. Expected one of: 'IOS', 'MAC_OS'. I would like to know whether this value has been deprecated or whether it requires a newer version of the API, and how to use it. Please help me solve this! Thank you!
Liquid Glass was introduced as a universal design language for all platforms, but it isn't supported in the visionOS 26 beta.
For a small team creating a visionOS app targeted for release in fall 2026, should we focus our design work on Liquid Glass or on the existing visionOS design language?
Hi guys, I noticed that when you export an icon from Icon Composer, it does not export a square image. Is there any way of doing that, or is that just how it is now?
Checking the icon on the iPhone, something doesn't seem right...
How can I achieve the glass-effect buttons shown in the sample videos at WWDC25? I have tried a lot of approaches and am still far from matching the video.
I would like something like the attached pictures. Could someone send sample code to get the same result?
Thanks
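For reference, a rough sketch of what I've been trying, assuming the iOS 26 SDK's glass APIs; exact names and availability may differ between betas:

import SwiftUI

struct GlassButtons: View {
    var body: some View {
        VStack(spacing: 16) {
            // The built-in glass button style.
            Button("Confirm") {}
                .buttonStyle(.glass)

            // Applying the material to a custom control.
            Label("Navigate", systemImage: "location.fill")
                .padding()
                .glassEffect(.regular, in: .capsule)
        }
    }
}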
As of right now, Icon Composer does not support creating app icons for visionOS and tvOS. It appears that only system apps can provide glass icons on those platforms. How should developers handle this? In extreme cases, the flat icon on those platforms will look wildly different from its glass counterpart.
From what I have seen, visionOS and tvOS also do not apply any automatic treatment like on iOS, where legacy icons get a glass effect.
So third-party app icons are just going to look out of place for (hopefully just) a year on those platforms? What is the recommended approach here? You could obviously fake the effect, but I feel like that would be worse.
This has been an issue since I installed the first beta, and I haven't seen anyone else mention it, so I feel like I'm going kind of bonkers. While using Dark Mode, double-tapping to select a word with the Liquid Glass keyboard, such as in Messages or Safari, produces a flashlight-like effect over the text with each tap. While this is a fine animation and pretty helpful when moving the cursor, the brightness of the flashlight effect on white text in Dark Mode makes it impossible to see what you're actually tapping to select. The brightness seems to intensify dramatically when rapidly tapping. This is not an issue in Light Mode, where the text itself is black.
To reproduce this, turn your phone to Dark Mode, go into Messages, and type a few words into a thread. Try to double-tap to select a word in the text box. The brightness of the selected word should intensify to the point that you're unable to see the word itself.
Has anyone else been bothered by this? Is there a way to fix or adjust it? I've tried reducing transparency and a bunch of other settings, but nothing has worked.
I’m delighted with the introduction of the new color folders, though I can’t help but wonder why we still need both color folders and tags. Aren’t the color folders sufficient for our needs?
We have found that on the iOS 26 beta, some of our app icons built from an Xcode 16 asset catalog containing a single 1024x1024 .png file have a Liquid Glass effect applied to them, while others do not.
The documentation states that
If you choose not to use Icon Composer, you can still use an AppIcon asset catalog in your project containing individual app icon images and let the system apply the Liquid Glass material.
and
If you prefer, you can take advantage of the system’s automatically generated treatment that is applied to all app icons.
Is there any insight into how the system treats app icons that have not yet been updated with Icon Composer?
Feedback id: FB16140301
Below are the steps to reproduce the bug in the Contacts app.
1. Open the Contacts app.
2. Search for a contact and select it.
3. Slowly swipe right from the view's leading edge, as if to pop the view, but release your finger before completing the swipe. The back button in the navigation bar disappears, and tapping the back button's position does not dismiss the view.
4. Now swipe fully from left to right to dismiss (pop) the current view.
5. The search bar is missing. That's the bug.
On earlier iOS versions, the Live Activity displays correctly according to the mode set. I can't find an open issue for this.
Version: iOS 26
Device: iPhone 16