Post not yet marked as solved
I am working on a project that involves tabular classification using Create ML Components and I'm having trouble getting my model to work correctly.
I've been following the official documentation here:
https://developer.apple.com/documentation/createmlcomponents/
I have also followed the WWDC talks; this one seems to be the most relevant:
https://developer.apple.com/wwdc22/10019
That talk explains how to construct a tabular regressor (20:23, Tabular regressor). I tried using BoostedTreeClassifier in a similar way, but I'm having trouble getting it to work. Training with five feature columns and one annotation column appears to have worked, but I am unsure how to get the label (the categorical value the model is supposed to return) after calling .predict.
This is the last bit of the example from Apple (tabular regression):
static func predict(
    type: String,
    region: String,
    volume: Double
) async throws -> Double {
    let model = try task.read(from: parametersURL)
    let dataFrame: DataFrame = [
        "type": [type],
        "region": [region],
        "volume": [volume]
    ]
    let result = try await model(dataFrame)
    return result[priceColumnID][0]!
}
The last line is what I am wondering how to write for a tabular classifier; in my case it should return a category.
Is there a tutorial or example for a tabular classifier code somewhere?
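For what it's worth, a classifier's output should be readable the same way as the regressor's, just through a String-typed column instead of a Double-typed one. Here is a minimal sketch under that assumption; `categoryColumnID` is a hypothetical column ID that would need to match the name and type of the annotation column the classifier was trained with, and `task` / `parametersURL` are assumed to be set up as in the regression example:

```swift
import CreateMLComponents
import TabularData

// Hypothetical column ID: must match the classifier's annotation column.
let categoryColumnID = ColumnID("category", String.self)

static func predict(
    type: String,
    region: String,
    volume: Double
) async throws -> String {
    let model = try task.read(from: parametersURL)
    let dataFrame: DataFrame = [
        "type": [type],
        "region": [region],
        "volume": [volume]
    ]
    let result = try await model(dataFrame)
    // The prediction should land in the annotation column, now String-typed.
    return result[categoryColumnID][0]!
}
```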
Hello,
I am currently stuck on a problem involving the newly released Create ML Components feature.
I followed Get to know Create ML Components; however, it's not clear to me how a model can be trained over multiple iterations. I got a model through one iteration with the code shown in the video (it wasn't released as a project file), but there doesn't seem to be a way to increase the number of iterations. I also looked through the only associated sample project for Create ML Components, but its code differed from what was described in the video and lacked the audio classifier example, so I couldn't see how it ticked.
The video also mentioned that there might be issues saving the model in the Core ML file format because the model is custom, which leaves the question of how one is supposed to save a trained model once training is done. Being able to save a model seems essential for machine learning tasks, right?
Here is the code I am using in Swift Playgrounds:
struct ImageRegressor {
    static let trainingDataURL = URL(fileURLWithPath: "Project/regression_label")
    static let parameterURL = URL(fileURLWithPath: "Project/parameters")

    static func train() async throws -> some Transformer<CIImage, Float> {
        let estimator = ImageFeaturePrint()
            .appending(LinearRegressor())
        let data = try AnnotatedFiles(labeledByNamesAt: trainingDataURL, separator: "-", type: .image)
            .mapFeatures(ImageReader.read)
            .mapAnnotations { _ in Float() }
        let (training, validation) = data.randomSplit(by: 0.8)
        let transformer = try await estimator.fitted(to: training, validateOn: validation) { event in
            guard let trainingMaxError = event.metrics[.trainingMaximumError] else {
                return
            }
            guard let validationMaxError = event.metrics[.validationMaximumError] else {
                return
            }
            print("Training max error: \(trainingMaxError), Validation max error: \(validationMaxError)")
        }
        let validationError = try await meanAbsoluteError(
            transformer.applied(to: validation.map(\.feature)),
            validation.map(\.annotation))
        print("Mean absolute error: \(validationError)")
        try estimator.write(transformer, to: parameterURL)
        return transformer
    }
}
func doSomething() {
    Task {
        let transformer = try await ImageRegressor.train()
    }
}
doSomething()
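On the iterations question: as far as I can tell, a single fitted(to:) call is one shot, and per-iteration control comes instead from the updatable-estimator API, where you create an initial transformer and call update once per pass over the data. The sketch below is a rough outline under those assumptions; the exact method names, and whether LinearRegressor is even updatable, should be checked against the CreateMLComponents documentation, and `epochCount` and `trainingBatch` are hypothetical names. Saving, on the other hand, already happens in the code above: estimator.write(transformer, to:) persists the trained parameters, and estimator.read(from:) restores them, no Core ML file needed.

```swift
// Rough sketch, not verified: assumes `estimator` conforms to the
// updatable-estimator protocol and `trainingBatch` is a collection of
// AnnotatedFeature values prepared as in train() above.
var model = try estimator.makeTransformer()
let epochCount = 10  // hypothetical number of iterations

for epoch in 0..<epochCount {
    try await estimator.update(&model, with: trainingBatch)
    print("Finished iteration \(epoch + 1) of \(epochCount)")
}

// Parameters round-trip through the estimator itself:
try estimator.write(model, to: parameterURL)
let restored = try estimator.read(from: parameterURL)
```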
I encountered an error while experimenting with the new CreateMLComponents in a playground with the following code:
import CreateMLComponents
import CoreML

var fullyConnected = FullyConnectedNetworkRegressor<Float>()
fullyConnected.hiddenUnitCounts = [2]

let feature: AnnotatedFeature<MLShapedArray<Float>, Float> = .init(
    feature: .init(scalars: [2, 3], shape: [2]),
    annotation: 5)
let fitted = try? await fullyConnected.fitted(to: [feature, feature])
print(fitted)
The generated error message is included (partially) at the end of this post. I later found out that this same code works fine in an actual app. Any insights?
The error message:
Playground execution failed:
error: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0).
The process has been left at the point where it was interrupted, use "thread return -x" to return to the state before expression evaluation.
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)
* frame #0: 0x00007ffb2093728d SwiftNN`SwiftNN.Tensor.scalar<τ_0_0 where τ_0_0: SwiftNN.TensorScalar>(as: τ_0_0.Type) -> τ_0_0 + 157
frame #1: 0x00007ffb20937cbb SwiftNN`SwiftNN.Tensor.playgroundDescription.getter : Any + 91
frame #2: 0x000000010da43ac4 PlaygroundLogger`___lldb_unnamed_symbol491 + 820
frame #3: 0x000000010da45dbd PlaygroundLogger`___lldb_unnamed_symbol505 + 189
(some more lines ...)
PlaygroundLogger`___lldb_unnamed_symbol428 + 124
frame #65: 0x000000010da41dad PlaygroundLogger`playground_log_hidden + 269
frame #66: 0x000000010ca59aba $__lldb_expr14`async_MainTY1_ at CreateMLComp.xcplaygroundpage:12:5
frame #67: 0x000000010ca59fb0 $__lldb_expr14`thunk for @escaping @convention(thin) @async () -> () at <compiler-generated>:0
frame #68: 0x000000010ca5a0c0 $__lldb_expr14`partial apply for thunk for @escaping @convention(thin) @async () -> () at <compiler-generated>:0
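Judging from the stack trace, the crash happens in PlaygroundLogger while it renders the fitted model's playgroundDescription (frame #0 is in SwiftNN's Tensor.scalar), not in the training itself, which would explain why the same code runs fine in an app. A possible workaround, under that assumption, is to keep the fitted model out of the playground's value logging entirely, for example:

```swift
// The crash appears to come from the playground logger trying to render
// the fitted model, so avoid printing or displaying that value directly.
if (try? await fullyConnected.fitted(to: [feature, feature])) != nil {
    print("fitting succeeded")
} else {
    print("fitting failed")
}
```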
How can I add images to my assets and load them into AnnotatedFiles in an iOS app?
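AnnotatedFiles reads labeled files from a directory URL, and asset catalogs don't expose individual image files, so one option is to ship the images as a folder reference in the app bundle and point AnnotatedFiles at that directory. A sketch under those assumptions; "TrainingImages" is a hypothetical folder-reference name, with filenames like "cat-1.jpg" so that the part before the separator becomes the annotation:

```swift
import CreateMLComponents
import CoreImage

// Hypothetical folder reference "TrainingImages" copied into the app bundle.
guard let trainingURL = Bundle.main.url(forResource: "TrainingImages", withExtension: nil) else {
    fatalError("TrainingImages folder not found in bundle")
}

// With separator "-", a file named "cat-1.jpg" yields the annotation "cat".
let files = try AnnotatedFiles(labeledByNamesAt: trainingURL, separator: "-", type: .image)
    .mapFeatures(ImageReader.read)
```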