Hey, I have been trying out the Xcode 16 beta's code completion for the last couple of days. I went to disable it in the settings, under the Components section, because I was following a tutorial and didn't want it to help me out.
But now that I want it back, I can't find a way to enable it again.
I tried reinstalling both the beta and the regular Xcode, but the option didn't show up again.
Wanted to ask if someone knows how to get this back. Thanks!
Posts under Beta tag
Hi, I am currently updating my SwiftUI app from iOS 16.4 to 18.0, and I am getting this new warning in Xcode:
'@State' used inline will not work unless tagged with '@Previewable'
So I fixed the problem by following the suggestion: I added the new @Previewable tag and put the variable on the first line of the preview.
#Preview {
    @Previewable @State var enabled: Bool = true
    return NotificationLabel(enabled: $enabled, type: "test", icon: "star", title: "test", description: "test")
}
The problem is that my app supports iOS 16.4 through iOS 18.0, and the preview gives me this error:
'Previewable()' is only available in iOS 18.0 or newer
But if I put an availability check at the top, I get another error:
if #available(iOS 18.0, *) {
    @Previewable @State var enabled: Bool = true
'@Previewable' items must be at root scope in the preview block
Is there a way to silence the warning by making a two-version preview, one for iOS 18 and one for the previous versions?
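For what it's worth, one alternative I could try that avoids @Previewable entirely (just a sketch, reusing the names from my snippet above) is a small host view that owns the state itself:
import SwiftUI

// Sketch of a preview host that owns the binding, so no @Previewable
// (and therefore no iOS 18 availability requirement) is involved.
private struct NotificationLabelPreviewHost: View {
    @State private var enabled = true

    var body: some View {
        NotificationLabel(enabled: $enabled, type: "test", icon: "star",
                          title: "test", description: "test")
    }
}

#Preview {
    NotificationLabelPreviewHost()
}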
Since the iOS 18 and Xcode 16 betas, I've been getting some really strange SwiftData errors when passing @Model classes around.
The error I'm seeing is the following:
SwiftData/BackingData.swift:409: Fatal error: This model instance was destroyed by calling ModelContext.reset and is no longer usable.
PersistentIdentifier(id: SwiftData.PersistentIdentifier.ID(url: x-coredata://34EE9059-A7B5-4484-96A0-D10786AC9FB0/TestApp/p2), implementation: SwiftData.PersistentIdentifierImplementation)
The same issue also happens when I try to retrieve a model from the ModelContext using its PersistentIdentifier and try to do anything with it. I have no idea what could be causing this.
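For reference, here is a minimal sketch of the kind of retrieval that triggers it (Item is just a placeholder model, not my actual type):
import SwiftData

// Placeholder model used only for this sketch.
@Model
final class Item {
    var name: String
    init(name: String) { self.name = name }
}

// Retrieve a model by its PersistentIdentifier and touch a property.
func fetchItem(with id: PersistentIdentifier, in context: ModelContext) -> Item? {
    guard let item = context.model(for: id) as? Item else { return nil }
    // Accessing a property here is where the "model instance was destroyed"
    // fatal error fires on the iOS 18 beta.
    _ = item.name
    return item
}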
I'm guessing this is just a bug in the iOS 18 beta; since I couldn't find a single discussion about it on Google, I figured I'd mention it.
If someone has a workaround or something, that would be much appreciated.
Related to https://developer.apple.com/forums/thread/756767 and https://feedbackassistant.apple.com/feedback/13893850 which was dismissed.
Hi,
I'm trying to install multiple Xcode versions with simulators on a single Mac.
The problem: xcrun simctl, or any other simulator-related process, hangs for several minutes after installing the Xcode 16 beta's additional components.
If I delete the /Library/PrivateFrameworks directory and reinstall the additional components from Xcode 15.4, xcrun works fine without any hang.
While the CLI hangs, the simdiskimaged process's CPU usage skyrockets, which is why I assume the CoreSimulator framework is the main source of the hang.
Is there any way to use the additional components (technically, the CoreSimulator framework) from Xcode 15.4 with the Xcode 16 beta?
Also, as I described in the related thread, the hang disappears if I run Simulator.app from the GUI. I already tried xattr -r -d com.apple.quarantine on /Library/PrivateFrameworks and on the Xcode 16 beta, with no luck.
Is there any command to open Simulator.app headlessly from the CLI?
I get an error when trying to generate a Core ML performance report; the message says:
The data couldn't be written because it isn't in the correct format.
Here is the code to replicate the issue:
import numpy as np
import coremltools as ct
from coremltools.converters.mil import Builder as mb
import coremltools.converters.mil as mil

w = np.random.normal(size=(256, 128, 1))
wemb = np.random.normal(size=(1, 32000, 128))  # .astype(np.float16)
rope_emb = np.random.normal(size=(1, 2048, 128))

shapes = [(1, seqlen) for seqlen in (32, 64)]
enum_shape = mil.input_types.EnumeratedShapes(shapes=shapes)
fixed_shape = (1, 128)
max_length = 2048
dtype = np.float32

@mb.program(
    input_specs=[
        # mb.TensorSpec(enum_shape.symbolic_shape, dtype=mil.input_types.types.int32),
        mb.TensorSpec(enum_shape.symbolic_shape, dtype=mil.input_types.types.int32),
    ],
    opset_version=mil.builder.AvailableTarget.iOS17,
)
def flex_like(input_ids):
    indices = mb.fill_like(ref_tensor=input_ids, value=np.array(1, dtype=np.int32))
    causal_mask = np.expand_dims(
        np.triu(np.full((max_length, max_length), -np.inf, dtype=dtype), 1),
        axis=0,
    )
    mask = mb.gather(
        x=causal_mask,
        indices=indices,
        axis=2,
        batch_dims=1,
        name="mask_gather_0",
    )
    # mask = mb.gather(
    #     x=mask, indices=indices, axis=1, batch_dims=1, name="mask_gather_1"
    # )
    rope = mb.gather(x=rope_emb.astype(dtype), indices=indices, axis=1, batch_dims=1, name="rope")
    hidden_states = mb.gather(x=wemb.astype(dtype), indices=input_ids, axis=1, batch_dims=1, name="embedding")
    return (
        hidden_states,
        mask,
        rope,
    )

cml_flex_like = ct.convert(
    flex_like,
    compute_units=ct.ComputeUnit.ALL,
    compute_precision=ct.precision.FLOAT32,
    minimum_deployment_target=ct.target.iOS17,
    inputs=[
        ct.TensorType(name="input_ids", shape=enum_shape),
    ],
)

cml_flex_like.save("flex_like_32")
If I remove hidden_states from the return, it does work. It also works if I keep hidden_states but remove both mask and rope; that is, the report is generated for both programs with either of these returns:
    return (
        # hidden_states,
        mask,
        rope,
    )
and
    return (
        hidden_states,
        # mask,
        # rope,
    )
It also works if I use a static shape instead of an enumerated shape.
I'm using macOS 15.0 and Xcode 16.0.
Edit 1:
Forgot to mention that although the performance report fails, the model is still able to make predictions.
Potential Apple Pay Notification Delay in iOS 18 Developer Beta
Users are reporting a delay in receiving purchase notifications in Apple Wallet after using Apple Pay. Transactions are confirmed within the Wallet app, but the notification arrives hours later.
As this issue is occurring on the iOS 18 developer beta, it's likely a software bug related to Apple Pay integration with Wallet.
I have reported this via Feedback Assistant.
Is anyone else having the same problem?
I was able to AirDrop files to my other Mac and connect to the Vision Pro before updating the system to the macOS 15.0 beta.
Currently, here is what I have:
macOS 15.0 beta (24A5264n) fails to AirDrop files to the other Mac, while it can still drop files to my iPhone, and that Mac can receive files from the iPhone without any issue.
Regarding the Vision Pro, it takes forever for this Mac to connect after pairing.
Does anyone have any idea on what might be the cause?
As per subject. Is this possible at all?
The WWDC video only talks about communication and push notifications. There doesn't seem to be any update to UNNotificationContent to allow setting of attributed strings.
If your app makes use of communication notifications you can even include Genmoji and other image glyphs in your notifications with the new "UNNotificationAttributedMessageContext" API. For push notifications, the payload just needs to contain a rich text representation that may contain image glyphs.
We recommend that you use a Notification Service Extension to parse the rich text, download assets, create the attributed body and update the notification content.
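As far as I can tell, that suggests something like the following Notification Service Extension flow; this is only a sketch of my understanding, and the "richBody" payload key plus the plain-text fallback are my own placeholders, not the actual attributed-message API:
import UserNotifications

// Minimal Notification Service Extension sketch. The "richBody" key is a
// hypothetical payload field; building the real attributed body would use
// the new attributed-message API mentioned above.
class NotificationService: UNNotificationServiceExtension {
    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        guard let content = request.content.mutableCopy() as? UNMutableNotificationContent else {
            contentHandler(request.content)
            return
        }
        // Parse the rich text from the payload, download any referenced assets,
        // then hand the updated content back to the system for delivery.
        if let richBody = content.userInfo["richBody"] as? String {
            content.body = richBody // placeholder for the attributed body
        }
        contentHandler(content)
    }
}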
I am very new to App Intents and I am trying to add them to my On Device LLM ChatBot app so my users can get answers to any questions anywhere in iOS.
I have the following code and it is working wonderfully in the Shortcuts app.
import AppIntents

struct AskAi: AppIntent {
    static var openAppWhenRun: Bool = false
    static let title: LocalizedStringResource = "Ask Ai About"
    static let description = "Gets an answer from Ai for your question."

    @Parameter(title: "Question")
    var question: String

    static var parameterSummary: some ParameterSummary {
        Summary("Ask Ai About \(\.$question)")
    }

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let bot: Bot = Bot()
        await bot.respond(to: self.question)
        return .result(
            value: bot.output
        )
    }
}

class AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskAi(),
            phrases: [
                "Ask \(.applicationName) \(\.$question)",
                "Get \(.applicationName) answer for \(\.$question)",
                "Open \(\.$question) using \(.applicationName) ",
                "Using \(.applicationName) get help with \(\.$question)"
            ],
            shortTitle: "Ask Ai",
            systemImageName: "sparkles"
        )
    }
}
I can create a shortcut for this AppIntent, and that allows me to, say, speak the response.
On iOS 18 beta 1, I can invoke my shortcut by the name I set in the Shortcuts app, and that works.
But it does not work at all when just asking Siri any of the phrases I have defined.
The Info.plist has an app name alias defined, just to be sure.
I even added the Siri capability in Xcode beta.
I also tried using the ProvidesDialog return type.
Whatever I do, the AppIntent is invisible to Siri.
Siri tries to search the web, looks for my app name in Contacts, or throws an error about Apple Cash, which has nothing to do with what I was asking.
Is there anything else I am missing for setting up iOS AppIntents to work with Siri?
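One thing I haven't ruled out (purely an assumption on my part, nothing above confirms it's the missing piece) is re-registering the parameterized phrases at launch via the AppShortcutsProvider API; a minimal sketch, with a placeholder app struct and root view:
import AppIntents
import SwiftUI

@main
struct ChatBotApp: App {
    init() {
        // Ask the system to refresh the phrases donated by the AppShortcuts provider.
        AppShortcuts.updateAppShortcutParameters()
    }

    var body: some Scene {
        WindowGroup {
            Text("ChatBot") // placeholder root view
        }
    }
}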
After I installed the profile certificate via VPN & Device Management, I can't see the installed certificate in the Certificate Trust Settings.
See FB13917278 (App Becomes Unresponsive for iOS/iPadOS 18 Regular Size Class Interactions When Selecting One Particular View in Sidebar).
The widget I have created for iOS 17 uses containerBackground to display an image in the background. This works fine. But when I set the Home Screen to the iOS 18 tinted option, the background disappears. I want the background to stay because it contains an image that is meaningful to the user.
I use the following code:
@ViewBuilder
var body: some View {
    if let memory = entry.memory, let uiImage = memory.image {
        Group {
            if entry.showCaption {
                memoryBody(with: memory)
            } else {
                Color.white.opacity(0.0000000001)
            }
        }
        .foregroundStyle(.white)
        .widgetBackground(
            Image(uiImage: uiImage)
                .resizable()
                .scaledToFill()
        )
    } else if let memory = entry.memory {
        memoryBody(with: memory)
            .widgetBackground(Color.gray)
    } else {
        noMemoryBody()
    }
}

extension View {
    func widgetBackground(_ backgroundView: some View) -> some View {
        if #available(iOSApplicationExtension 17.0, *) {
            return containerBackground(for: .widget) {
                backgroundView
            }
        } else {
            return background(backgroundView)
        }
    }
}
Hello, yesterday I was testing the new iOS 18 on my iPhone 13. After finishing the update, the device stopped receiving charge from the original charger with the original cable. I rebooted it normally from iOS, and now it doesn't respond to anything on the screen. How can I fix this? I connected it to iTunes, which doesn't recognize the iPhone, and I cannot shut it down...
Can somebody help with this serious issue?
I would like to install the beta of macOS 15 on an empty volume, rather than on top of an existing version of macOS. Is it possible? I see that I can download an .ipsw file, but I don't understand what can be done with it.
While compiling with Xcode 16 beta 1, the 'module' identifier we previously used as a method parameter is now treated as a reserved keyword, causing compilation errors and preventing our project from building. Please fix this issue as soon as possible.
Additionally, I would like to vent: it's quite uncomfortable that the macOS beta cannot be downgraded directly. I am now stuck being unable to open Xcode 15 while Xcode 16 fails to compile.
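For illustration only, and purely a guess at where the error appears (this is not our actual code): if the failing declaration is a Swift method that uses module as a parameter name, backticks are Swift's general escape hatch for using a reserved word as an identifier:
// Hypothetical example, not the project's real code.
func register(`module`: String) {
    print(`module`)
}

register(`module`: "Networking")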
Here is an app using the Core ML API with the ML package format. It works fine on iOS 17, but it crashes when calling [MLModel modelWithContentsOfURL:] to load the model on iOS 18. An exception is raised: "Failed to set compute_device_types_mask E5RT: Cannot provide zero compute device types. (1)". Is this a bug in the iOS 18 beta, and will it be fixed in the future?
The stack is as below:
Exception Codes: #0 at 0x1e9280254
Crashed Thread: 49
Application Specific Information:
*** Terminating app due to uncaught exception 'NSGenericException', reason: 'Failed to set compute_device_types_mask E5RT: Cannot provide zero compute device types. (1)'
Last Exception Backtrace:
0 CoreFoundation 0x0000000199466418 __exceptionPreprocess + 164
1 libobjc.A.dylib 0x00000001967cde88 objc_exception_throw + 76
2 CoreFoundation 0x0000000199560794 -[NSException initWithCoder:]
3 CoreML 0x00000001b4fcfa8c -[MLE5ProgramLibraryOnDeviceAOTCompilationImpl createProgramLibraryHandleWithRespecialization:error:] + 1584
4 CoreML 0x00000001b4fcf3cc -[MLE5ProgramLibrary _programLibraryHandleWithForceRespecialization:error:] + 96
5 CoreML 0x00000001b4fc23d8 __44-[MLE5ProgramLibrary prepareAndReturnError:]_block_invoke + 60
6 libdispatch.dylib 0x00000001a12e1160 _dispatch_client_callout + 20
7 libdispatch.dylib 0x00000001a12f07b8 _dispatch_lane_barrier_sync_invoke_and_complete + 56
8 CoreML 0x00000001b4fc3e98 -[MLE5ProgramLibrary prepareAndReturnError:] + 220
9 CoreML 0x00000001b4fc3bc0 -[MLE5Engine initWithContainer:configuration:error:] + 220
10 CoreML 0x00000001b4fc3888 +[MLE5Engine loadModelFromCompiledArchive:modelVersionInfo:compilerVersionInfo:configuration:error:] + 344
11 CoreML 0x00000001b4faf53c +[MLLoader _loadModelWithClass:fromArchive:modelVersionInfo:compilerVersionInfo:configuration:error:] + 364
12 CoreML 0x00000001b4faedd4 +[MLLoader _loadModelFromArchive:configuration:modelVersion:compilerVersion:loaderEvent:useUpdatableModelLoaders:loadingClasses:error:] + 540
13 CoreML 0x00000001b4f9b900 +[MLLoader _loadWithModelLoaderFromArchive:configuration:loaderEvent:useUpdatableModelLoaders:error:] + 424
14 CoreML 0x00000001b4faaeac +[MLLoader _loadModelFromArchive:configuration:loaderEvent:useUpdatableModelLoaders:error:] + 460
15 CoreML 0x00000001b4fb0428 +[MLLoader _loadModelFromAssetAtURL:configuration:loaderEvent:error:] + 240
16 CoreML 0x00000001b4fb00c4 +[MLLoader loadModelFromAssetAtURL:configuration:error:] + 104
17 CoreML 0x00000001b5314118 -[MLModelAssetResourceFactoryOnDiskImpl modelWithConfiguration:error:] + 116
18 CoreML 0x00000001b5418cc0 __60-[MLModelAssetResourceFactory modelWithConfiguration:error:]_block_invoke + 72
19 libdispatch.dylib 0x00000001a12e1160 _dispatch_client_callout + 20
20 libdispatch.dylib 0x00000001a12f07b8 _dispatch_lane_barrier_sync_invoke_and_complete + 56
21 CoreML 0x00000001b5418b94 -[MLModelAssetResourceFactory modelWithConfiguration:error:] + 276
22 CoreML 0x00000001b542919c -[MLModelAssetModelVendor modelWithConfiguration:error:] + 152
23 CoreML 0x00000001b5380ce4 -[MLModelAsset modelWithConfiguration:error:] + 112
24 CoreML 0x00000001b4fb0b3c +[MLModel modelWithContentsOfURL:configuration:error:] + 168
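For reference, the equivalent loading call in Swift (the URL and configuration here are placeholders, not the app's real setup):
import CoreML

// Load a compiled Core ML model; this mirrors the Objective-C
// +[MLModel modelWithContentsOfURL:configuration:error:] call in the backtrace.
func loadModel(at url: URL) throws -> MLModel {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .all   // same compute units as the crashing app
    return try MLModel(contentsOf: url, configuration: configuration)
}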
When creating a plain window in SwiftUI on macOS 15.0 beta, the TextEditor within the window does not accept keyboard input. I suspect it's because plain windows cannot become key windows.
Here is a minimal example:
import SwiftUI

struct ContentView: View {
    @State var text: String = "Hello World"

    var body: some View {
        VStack {
            TextEditor(text: $text)
        }
    }
}

@main
struct BigTextApp: App {
    var body: some Scene {
        Window("Text Window", id: "TextWindow") {
            ContentView()
        }
        .windowStyle(.plain)
    }
}
Attempting to type gives the following warning in Xcode: Warning: -[NSWindow makeKeyWindow] called on SwiftUI.AppKitWindow which returned NO from -[NSWindow canBecomeKeyWindow].
Is there a way to make the window capable of becoming a key window without dropping down to AppKit? Or is there another way to allow the TextEditor to accept keyboard input?
I have tried focusing the TextEditor, but that didn't work.
Hello everyone. Is iPadOS 18's eye-tracking feature available on the 2018 12.9" iPad Pro? If not, where can I find a list of the compatible devices on which eye tracking is available?
In the Xcode 16 beta, it isn't possible to create a new StoreKit configuration file because the templates list doesn't contain that option.
Adding the .storekit extension to a file either crashes Xcode or has no effect at all. Loading an existing file still works, though.
I'm trying out the new Control Widget.
I check API availability like this, but WidgetBundleBuilder.buildBlock() doesn't work.
So I used ControlWidgetConfigurationBuilder.buildBlock instead, but that didn't work either.
What should I do?