Context & Issue
I am developing an iOS application.
My app icon uses colors that are relatively close to each other.
When the user enables Accessibility → Display & Text Size → Color Filters → Grayscale (or similar modes), the icon becomes harder to distinguish because it loses color and contrast is reduced.
Goal
When iOS switches to grayscale mode, I want the app icon to maintain good contrast between its elements so it remains clearly recognizable.
What I’ve tried
Redesigned the icon with more contrasting colors.
Added strokes/outlines, but it still doesn’t look much better in grayscale.
Researched how iOS renders app icons when grayscale is enabled, but couldn’t find a way to override or provide an alternative icon.
Specific questions
Is there any API or mechanism in iOS that allows providing a different version of the app icon when the user has grayscale mode enabled?
If there’s no direct API, are there any best practices for designing iOS app icons to ensure good contrast when converted to grayscale?
Do we have to design a separate grayscale version of the app icon?
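For the first question, this is the kind of thing I have in mind. It's an untested sketch: the alternate icon name is hypothetical, and I don't know whether reacting to the grayscale setting this way is an acceptable use of alternate icons.

import UIKit

// Untested sketch. Assumes an alternate icon named "AppIcon-HighContrast"
// (a hypothetical name) is declared in the target's settings. Note that iOS
// shows an alert to the user whenever the icon is switched.
func updateIconForGrayscale() {
    guard UIApplication.shared.supportsAlternateIcons else { return }
    let iconName = UIAccessibility.isGrayscaleEnabled ? "AppIcon-HighContrast" : nil
    UIApplication.shared.setAlternateIconName(iconName) { error in
        if let error { print("Icon switch failed: \(error)") }
    }
}

// Re-evaluate whenever the user toggles the Grayscale color filter.
let grayscaleObserver = NotificationCenter.default.addObserver(
    forName: UIAccessibility.grayscaleStatusDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    updateIconForGrayscale()
}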
Thank you!
Hi everyone,
I’m currently testing iOS 26 on my iPhone as part of the developer program. According to Apple’s documentation and demo materials, a new screenshot animation was introduced in this version. However, when I take a screenshot on my device, the animation remains the same as in previous iOS versions.
I’ve double-checked that I’m running the correct build of iOS 26, and I haven’t found any settings that might enable or disable this feature.
Is anyone else experiencing the same issue? Could this new animation be device-specific, region-limited, or require additional configuration?
Any insight would be appreciated!
Thanks in advance,
Alonso Rivera
Hi all,
Very new to this. I'm just getting into SwiftData, and am frustrated with the canvas not working with model containers. My understanding is that they work if inMemory = true, but not in the default case where data persists after the app is quit.
Can anyone tell me if it's possible to conditionally create the ModelContainer based on a flag (if running in the canvas, inMemory = true; otherwise false), and then use that flag for all my data models so my list views populate in the canvas without having to run the simulator each time? I assume you could also pre-populate the in-memory store if it's empty; I sketch what I mean below.
Or is there a simple and obvious solution that I am oblivious to?
If it is possible, is it worth the time, hassle, and any possible issues?
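Here's the rough shape of what I mean, as an untested sketch. Item is a placeholder model, and XCODE_RUNNING_FOR_PREVIEWS is an environment variable I've seen used to detect the canvas, though I'm not sure it's officially documented.

import Foundation
import SwiftData

@Model
final class Item {
    var name: String
    init(name: String) { self.name = name }
}

// Build the container in-memory when running in the canvas, persistent otherwise.
@MainActor
func makeContainer() throws -> ModelContainer {
    let isCanvas = ProcessInfo.processInfo.environment["XCODE_RUNNING_FOR_PREVIEWS"] == "1"
    let config = ModelConfiguration(isStoredInMemoryOnly: isCanvas)
    let container = try ModelContainer(for: Item.self, configurations: config)

    // Pre-populate the empty in-memory store so list views have something to show.
    if isCanvas {
        container.mainContext.insert(Item(name: "Sample"))
    }
    return container
}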
I'm coding an iPhone app using Swift and I'm getting this scoping error. Attached.
The new iOS 26 beta 3 helps with some stability and performance issues, but most of the Liquid Glass has been removed or given a very frosted look, which defeats the whole purpose of a big redesign. Even though the changes were made because of readability and contrast complaints, they shouldn't take away the Liquid Glass design. I think Apple should consider adding a toggle or choice so users can pick between a more frosted look and the more liquid glass look of the original plan.
Hello,
I'm new to Xcode; I've been taking some classes and watching YouTube videos as well as using AI. I'm having an issue I cannot find a video on, and AI just keeps screwing up my layout and sizing.
Here is the issue: I have a custom-made image for the Sign In button on my login page in Xcode. The problem is that I can barely see the button, and when I try to adjust its size the whole layout gets screwed up. My logo image (supposed to take up the top 50% of the screen) takes over the whole bottom of the screen, and I lose my username and password text fields and images. I guess my question is: is this an issue with the size of the image I've uploaded, or an issue with my code? I changed the size of the image I created in Canva to 900 pixels wide by 300 pixels high, and that did absolutely nothing to the image in Xcode.
Below is the Button and Create Account section of my code that seems to be having issues. Please help me.
var body: some View {
    NavigationStack(path: $navigationPath) {
        ZStack {
            // Background image
            Image("Background1")
                .resizable()
                .scaledToFill()
                .ignoresSafeArea()
                .clipped()
            // Main content
            ScrollView {
                VStack(spacing: 20) {
                    // Logo
                    Image("DynastyStatDropLogo")
                        .resizable()
                        .scaledToFit()
                        .padding(.top, -160)
                        .padding(.bottom, -30)
                    // Form elements
                    // Username field
                    ZStack {
                        Image("UsernameBar")
                            .resizable()
                            .aspectRatio(contentMode: .fill)
                            .padding()
                        TextField("UserName:", text: $textInput)
                            .padding(.horizontal, 75)
                            .background(Color.clear)
                            .foregroundColor(.red)
                            .focused($focus, equals: .username)
                            .submitLabel(.next)
                            .onSubmit {
                                focus = .password
                            }
                    }
                    .frame(height: 50)
                    .clipShape(RoundedRectangle(cornerRadius: 10))
                    .padding(.horizontal)
                    // Password field and Forgot Password link
                    VStack(spacing: 20) {
                        ZStack {
                            Image("PasswordBar")
                                .resizable()
                                .aspectRatio(contentMode: .fill)
                                .padding()
                            SecureField("Password:", text: $textInput2)
                                .padding(.horizontal, 75)
                                .background(Color.clear)
                                .foregroundColor(.red)
                                .focused($focus, equals: .password)
                                .submitLabel(.go)
                                .onSubmit {
                                    submitForm()
                                }
                        }
                        .frame(height: 50)
                        .clipShape(RoundedRectangle(cornerRadius: 10))
                        .padding(.horizontal)
                        // Forgot Password link (right-aligned)
                        HStack {
                            Spacer()
                            Text("Forgot Password?")
                                .foregroundColor(.blue)
                                .onTapGesture {
                                    navigationPath.append("passwordRecovery")
                                }
                        }
                        .padding(.horizontal, 90)
                    }
                    Spacer(minLength: -110)
                    // SignIn Button - Explicitly showing it
                    HStack {
                        Spacer()
                        Button {
                            submitForm()
                        } label: {
                            Image("signinButton")
                                .resizable()
                                .frame(width: 500, height: 400)
                        }
                        Spacer()
                    }
                    Spacer(minLength: -300)
                    // Create Account (centered)
                    HStack {
                        Spacer()
                        Text("Create Account")
                            .foregroundColor(.blue)
                            .onTapGesture {
                                navigationPath.append("accountCreation")
                            }
                        Spacer()
                    }
                    .padding(.bottom, -10)
                }
            }
        }
        .onAppear {
            focus = .username
        }
        .navigationDestination(for: String.self) { destination in
            switch destination {
            case "dashboard":
                DSDDashboard()
            case "passwordRecovery":
                PasswordRecoveryView()
            case "accountCreation":
                AccountCreationView()
            default:
                EmptyView()
            }
        }
        .alert(isPresented: $showAlert) {
            Alert(
                title: Text("Missing Information"),
                message: Text("Enter UserName and Password to continue to DSD"),
                dismissButton: .default(Text("OK"))
            )
        }
    }
}

// Function to handle form submission
func submitForm() {
    focus = nil
    if textInput.isEmpty || textInput2.isEmpty {
        showAlert = true
    } else {
        print("Login with username: \(textInput), password: \(textInput2)")
        navigationPath.append("dashboard")
    }
}

// Enum to manage focus states
enum FormFieldFocus: Hashable {
    case username, password
}
}
As someone who genuinely appreciated the Liquid Glass effect introduced in iOS 26 Beta 1–2, I am deeply disappointed by its reduction in Beta 3. Liquid Glass wasn't just eye candy; it gave iOS a unique identity, a futuristic feel, and a visual soul.
Now the UI looks flat, generic, and indistinguishable from other platforms. I feel Apple is stepping back from a bold vision because of readability complaints that could have been solved with an option or toggle, not by removing the whole design language.
Please consider restoring the full Liquid Glass look, or at least offer a toggle so users who believe in Apple’s design language can choose it. Don’t let this innovation fade because of short-term complaints.
Hello! I am developing an ebook reader iOS app that uses a C/C++ codec as a page renderer.
The codec uses TrueType as a font rendering engine that requires access to .ttf (or .ttc) files.
Currently, I supply TrueType with fonts embedded in the app package, so they live within the app sandbox.
The codec supports the whole Unicode plane and many languages that ebooks may use, but the fonts I supply are missing some important glyphs (e.g., katakana or hangul).
I see that iOS has its own font storage, located in the /System/Library/Fonts/ directory. The codec is able to parse this directory and read the .ttf files inside, using these fonts as a fallback when the supplied fonts can't draw certain glyphs.
I use opendir and fopen (in "rb" mode) to read the data, and it works well.
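For reference, this is roughly the Swift-side equivalent of what the codec does, as an untested sketch (the directory path is the one mentioned above; the extension filtering is just illustrative):

import Foundation

// Enumerate the system font directory and read the font files, mirroring the
// codec's opendir/fopen access pattern.
let fontsDir = "/System/Library/Fonts"
if let entries = try? FileManager.default.contentsOfDirectory(atPath: fontsDir) {
    for entry in entries where entry.hasSuffix(".ttf") || entry.hasSuffix(".ttc") {
        let path = (fontsDir as NSString).appendingPathComponent(entry)
        if let data = FileManager.default.contents(atPath: path) {
            print("\(entry): \(data.count) bytes")  // the codec would hand this file to TrueType
        }
    }
}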
Does this type of access to the system directory violate the sandbox rules for App Store distribution, and if so, is there a way to access the stored .ttf files without violating them?
I'm using .glassEffect(.clear) on a transparent circle over a gradient background. While it's closer to clear than the plain glass effect, it's not really clear, like clear glass. Here is the code. Is there a glass effect that really looks like clear glass?
Circle()
    .fill(.clear)
    .frame(width: 180, height: 180)
    .glassEffect(.clear)
There is a display issue when browsing wireless networks in the dropdown menu.
On iOS 18.1.1, with the Wi-Fi switch in the off state:
Step 1: Open the notification dropdown; the bug shown in the first image appears.
It takes some time before it displays normally.
Hey there,
I redesigned my app's icons for Liquid Glass in the Icon Composer app. I have to say it's been a pleasure to use, and my icons look stunning when rendered in Icon Composer, in every rendering mode and context I've tested.
But once in a developer release on my device (iOS 26 beta 3), the rendering is very disappointing. The icons look blurry, very far from what Icon Composer shows.
I would like to know whether I have a design issue, or whether the current beta is known not to render icons properly. I'm kind of panicking :)
I downloaded iOS 26 beta 3. I was very happy with how it turned out, but when I activated Siri, I noticed the rainbow pulsing glow that bordered the phone was missing, and all that was left was the original Siri bubble. I was very disappointed, does anyone know how to get this back? I loved that design feature.
Development environment:
Simulator: iOS 26 beta 3 iPhone 16 (for testing)
Simulator 2: iPadOS 26 beta 3 iPad Air 13 inch (M3) (for testing)
Connected Device: iPadOS 26 beta 3 iPad Pro 11 inch (M4) (for testing)
Dev Device: macOS Tahoe 26 beta 3 MacBook Air
When using the NavigationSplitView element, the sidebar has a built-in panel toggle button.
However, when I tap the toggle button to collapse the sidebar in SwiftUI on iPadOS 26, on both Simulator 2 and the connected device, there is a slight animation glitch before it settles back to normal. What's going on?
This is my code for the specific view that has the NavigationSplitView (and all views are connected through TabViews):
RecordsPage.swift
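In case the attachment doesn't come through, here is a minimal reduction of the structure described above; the names and list content are placeholders, not my actual RecordsPage.swift:

import SwiftUI

// Minimal reduction: a NavigationSplitView living inside one tab of a TabView.
struct RootView: View {
    var body: some View {
        TabView {
            RecordsPage()
                .tabItem { Label("Records", systemImage: "list.bullet") }
        }
    }
}

struct RecordsPage: View {
    @State private var selection: String?

    var body: some View {
        NavigationSplitView {
            // Sidebar; the built-in toggle button appears in its toolbar.
            List(["Record A", "Record B"], id: \.self, selection: $selection) { item in
                Text(item)
            }
            .navigationTitle("Records")
        } detail: {
            Text(selection ?? "Select a record")
        }
    }
}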
Image references (screenshots): one taken right after tapping the toggle, and one after 1–2 seconds; both show the panel in its hidden state.
I am creating a note-generating app, but I don't see the text in the button when I apply the MeshGradient. When I remove the MeshGradient, the text is there. I need some help finding the issue. I am new to app development and still learning. Below is the code:
import SwiftUI
struct ContentView: View {
    @State private var inputText: String = ""
    @State private var isLoading: Bool = false

    var body: some View {
        ZStack {
            Color(.systemGray6).edgesIgnoringSafeArea(.all)
            VStack(spacing: 20) {
                Text("Generate Notes")
                    .font(.title)
                    .fontWeight(.bold)
                    .frame(maxWidth: .infinity, alignment: .leading)
                Text("Transform your thoughts into well-structured notes using artificial intelligence.")
                    .font(.subheadline)
                    .foregroundStyle(.secondary)
                    .frame(maxWidth: .infinity, alignment: .leading)
                TextEditor(text: $inputText)
                    .frame(height: 200)
                    .padding()
                    .background(RoundedRectangle(cornerRadius: 16)
                        .fill(Color(.systemGray6)))
                Button(action: {}) {
                    HStack {
                        if isLoading {
                            ProgressView()
                                .tint(.white)
                        } else {
                            Image(systemName: "sparkles")
                        }
                        Text(isLoading ? "Generating..." : "Generate Notes")
                    }
                    .padding()
                    .frame(maxWidth: .infinity)
                    .background(
                        MeshGradient(width: 3, height: 3, points: [
                            .init(0, 0), .init(0.5, 0), .init(1, 0),
                            .init(0, 0.5), .init(0.5, 0.5), .init(1, 0.5),
                            .init(0, 1), .init(0.5, 1), .init(1, 1)
                        ], colors: [
                            .blue, .purple, .indigo,
                            .orange, .white, .blue,
                            .yellow, .green, .mint
                        ])
                    )
                    .mask(
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(lineWidth: 16)
                            .blur(radius: 8)
                    )
                    .overlay(
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(.white, lineWidth: 1)
                            .blur(radius: 1)
                            .blendMode(.overlay)
                    )
                    .background(.black)
                    .foregroundColor(.white)
                    .cornerRadius(16)
                    .background(
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(.black.opacity(0.5), lineWidth: 1)
                    )
                    .shadow(color: .black.opacity(0.15), radius: 20, x: 0, y: 20)
                    .shadow(color: .black.opacity(0.1), radius: 15, x: 0, y: 15)
                }
                .disabled(isLoading || inputText.isEmpty)
                Spacer()
            }
            .padding(32)
            .background(Color(.systemBackground))
            .cornerRadius(44)
            .shadow(color: .black.opacity(0.1), radius: 20, x: 0, y: 10)
        }
    }
}

#Preview {
    ContentView()
}
The scenario: the keyboard is open in the app I'm developing, then I switch to another app (for instance, Notes) and create a note so the keyboard comes up there. While the Notes keyboard is active, I switch back to my app and its keyboard has been dismissed. Any thoughts? Thanks.
In the past, we had a red badge without a number on the app icon. We want to bring it back. Please provide instructions.
I noticed a discrepancy between the Material specifications for tvOS on the Developer page and the naming in the Design Resources (Sketch files). Which one should we consider authoritative?
https://developer.apple.com/design/human-interface-guidelines/materials
Hi everyone,
I’m testing our SwiftUI app on both Xcode simulator and a real iPhone. On the simulator, everything looks clean and aligned. But when I run it on an actual iPhone (same build, iOS 18), the layout looks broken—fonts overlap, spacing is off, and elements are misaligned.
Both screenshots are from the exact same screen and time. First is simulator, second is iPhone.
Any idea why this difference happens? Is there something I should check in terms of rendering or layout settings?
Thanks in advance!
[iPhone 11]
Hi everyone 👋
I’ve been using the Apple Pencil Pro with my iPad Pro M4 and absolutely love it — the squeeze gesture, rotation, and haptics are amazing for creative work. But I’ve run into a little roadblock…
Right now, only one Pencil Pro can be paired at a time. So while one is charging, you can’t use another as a backup without unpairing and re-pairing, which interrupts the workflow.
I’d really love to see one of two things:
The ability to use one Pencil while another charges
or
An official external charger (or support for third-party ones)
Personally, I’d happily buy both a second Pencil and a charger if this became possible. I’ve even chatted with other creatives who feel the same — it would make a huge difference for long projects or working on the go.
Just wanted to share this idea and see if anyone else here would like this too. Thanks for reading!